Archive | Human Exploration Telerobotics (HET) RSS

Robotic “Bees” Are About to Join Astronauts in Space

There are some things only humans can do in space. The rest can be left to robots. To free up valuable time for astronauts living and working aboard the International Space Station, we’re sending three robotic helpers to the orbiting outpost. Developed and built at our Ames Research Center in California’s Silicon Valley, the cube-shaped Astrobee robots will each stay as busy as a bee flying around the space station and assisting crew with routine tasks like maintenance and tracking inventory. The robots will also help researchers on the ground carry out experiments, test new technologies and study human-robot interaction in space. Learning how robots can best work with humans in close proximity will be key for exploring the Moon and other destinations. Get to know more about our new robots headed to space:

The Astrobee robots were tested inside a special lab at our Ames Research Center where researchers created a mockup of the space station’s interior.

The flying robots are propelled by fans. They can move in any direction and turn on any axis in space.
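
To make “move in any direction and turn on any axis” concrete, here is a minimal thrust-allocation sketch showing how a free-flyer can convert a desired push and twist into individual fan thrusts. The six-fan layout, lever arms and numbers are invented for illustration and are not Astrobee’s actual propulsion design:

    import numpy as np

    # Toy thrust allocation: convert a desired body "wrench" (3 force + 3
    # torque components) into per-fan thrusts. The six-fan geometry below is
    # invented for illustration, not Astrobee's real impeller-and-nozzle design.
    directions = np.vstack([np.eye(3), -np.eye(3)])          # unit thrust directions
    offsets = 0.1 * np.array([[0, 1, 0], [0, 0, 1], [1, 0, 0],
                              [0, -1, 0], [0, 0, -1], [-1, 0, 0]])  # lever arms (m)

    A = np.zeros((6, 6))
    for i, (d, r) in enumerate(zip(directions, offsets)):
        A[:3, i] = d                 # force contributed per unit thrust
        A[3:, i] = np.cross(r, d)    # torque contributed per unit thrust (r x F)

    wrench = np.array([0.1, 0, 0, 0, 0, 0.02])  # 0.1 N along x plus 0.02 N*m about z
    thrusts = np.linalg.lstsq(A, wrench, rcond=None)[0]
    print(np.round(thrusts, 3))      # a real allocator would also enforce thrust >= 0

Because the allocation matrix spans all six force and torque axes, any combination of translation and rotation can be commanded, which is what makes the robot fully holonomic.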

Each robot is equipped with cameras and sensors for navigating inside the space station and avoiding obstacles.

Claw power! Astrobees have a robotic arm that can be attached for handling cargo or running experiments.

Astrobee is battery powered. When its battery runs low, the robot will autonomously navigate and dock to a power station to recharge.
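
A toy sketch of the supervisory logic behind that recharge behavior appears below; the class, thresholds and drain/charge rates are hypothetical stand-ins, not Astrobee’s flight software:

    # Toy recharge policy for a free-flying robot. Every name, threshold and
    # rate here is hypothetical, not Astrobee's actual flight software.
    class FreeFlyer:
        LOW, FULL = 0.25, 0.95                    # assumed battery thresholds

        def __init__(self):
            self.battery, self.state = 1.0, "WORKING"

        def step(self):
            if self.state == "WORKING":
                self.battery -= 0.15              # doing tasks drains the battery
                if self.battery < self.LOW:
                    # a real robot would plan a path to the dock and latch on
                    print("low battery -> navigating to dock")
                    self.state = "CHARGING"
            else:                                 # state == "CHARGING"
                self.battery = min(1.0, self.battery + 0.30)
                if self.battery >= self.FULL:
                    print("charged -> undocking, resuming tasks")
                    self.state = "WORKING"

    flyer = FreeFlyer()
    for _ in range(12):
        flyer.step()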

The robots can operate in either fully automated mode or under remote control by astronauts or researchers on Earth.

Astrobee builds on the success of SPHERES, our first-generation robotic assistant that arrived at the space station in 2006.

Two of the three Astrobee robots are scheduled to launch to space this month from our Wallops Flight Facility in Virginia! Tune in to the launch at www.nasa.gov/live.

Make sure to follow us on Tumblr for your regular dose of space: http://nasa.tumblr.com

Read Full Article
*Source: NASA Tumblr

Continue Reading

The Dark Side of the Crater: How Light Looks Different on the Moon

Things look different on the Moon. Literally.

Because the Moon isn’t big enough to hold a significant atmosphere, there is no air, and thus no airborne particles, to reflect and scatter sunlight. On Earth, shadows in otherwise bright environments are dimly lit by indirect light from these tiny reflections. That lighting provides enough detail to reveal the shapes, holes and other features that could be obstacles to someone – or some robot – trying to maneuver in shadow.

“What you get on the Moon are dark shadows and very bright regions that are directly illuminated by the Sun – the Italian painters in the Baroque period called it chiaroscuro – alternating light and dark,” said Uland Wong, a computer scientist at NASA’s Ames Research Center in Silicon Valley. “It’s very difficult to be able to perceive anything for a robot or even a human that needs to analyze these visuals, because cameras don’t have the sensitivity to be able to see the details that you need to detect a rock or a crater.”
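
One common response to the sensitivity problem Wong describes is to photograph the same scene at several exposures and fuse them, trading capture time for dynamic range. The sketch below uses OpenCV’s Mertens exposure fusion purely as a generic illustration; the file names are placeholders, and this is not the Ames team’s pipeline:

    import cv2
    import numpy as np

    # Illustrative only: fuse a bracketed exposure stack so that both sunlit
    # and shadowed regions keep detail. File names are placeholders.
    exposures = [cv2.imread(name) for name in ("dark.jpg", "mid.jpg", "bright.jpg")]

    merge = cv2.createMergeMertens()     # exposure fusion; needs no HDR calibration
    fused = merge.process(exposures)     # float result, roughly in [0, 1]

    cv2.imwrite("fused.png", np.clip(fused * 255, 0, 255).astype(np.uint8))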

In addition, the dust covering the Moon is itself otherworldly. The way light reflects off the jagged shapes of individual grains, combined with the uniformity of the surface’s color, means the terrain looks different depending on the direction it is lit from, losing texture at certain lighting angles.

Some of these visual challenges are evident in Apollo mission surface images, but the early lunar missions mostly waited until lunar “afternoon” so astronauts could safely explore the surface in well-lit conditions.

Future lunar rovers may target the unexplored polar regions of the Moon to drill for water ice and other volatiles, resources that are essential for human exploration missions but heavy to bring from Earth. At the Moon’s poles, the Sun is always near the horizon, and long shadows hide many potential dangers in the terrain, such as rocks and craters. Pure darkness is a challenge for robots that need visual sensors to safely explore the surface.

Wong and his team in Ames’ Intelligent Robotics Group are tackling this by gathering real data from simulated lunar soil and lighting.

“We’re building these analog environments here and lighting them like they would look on the Moon with solar simulators, in order to create these sorts of appearance conditions,” said Wong. “We use a lot of 3-dimensional imaging techniques, and use sensors to create algorithms, which will both help the robot safeguard itself in these environments, and let us train people to interpret it correctly and command a robot where to go.”

Above is a set from over 2,500 pairs of stereo camera images taken from at least 12 scenarios of recreated craters and rock formations that Wong and his team collected to accurately simulate the lighting conditions at the Moon’s poles. The goal is to improve the stereo viewing capabilities of robotic systems to effectively navigate unknown terrain and avoid hazards at the Moon’s poles. Credit: NASA/Uland Wong

The team uses a ‘Lunar Lab’ testbed at Ames – a 12-foot-square sandbox containing eight tons of JSC-1A, a human-made lunar soil simulant. Craters, surface ripples and obstacles are shaped with hand tools, and rocks are added to the terrain to simulate boulder fields or specific obstacles. The team then dusts the terrain and rocks with an added layer of simulant to produce the “fluffy” top layer of lunar soil, erasing shovel and brush marks and spreading a thin coating on the faces of rocks. Each terrain design in the testbed is generated from statistics of common features observed by spacecraft orbiting the Moon.
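
As a rough illustration of what statistically generated terrain can mean, lunar crater size-frequency counts are commonly modeled with a power law, and a testbed layout could be drawn from one. The exponent, size bounds and crater count below are placeholders, not the team’s actual statistics:

    import numpy as np

    # Hypothetical terrain statistics: draw crater diameters from a truncated
    # power law p(d) ~ d**(-alpha) by inverse-CDF sampling, then scatter them
    # across a sandbox-sized scene. All constants are placeholder values.
    rng = np.random.default_rng(42)

    def sample_diameters(n, d_min=0.05, d_max=0.5, alpha=2.0):
        u = rng.uniform(size=n)
        a = 1.0 - alpha
        return (u * (d_max**a - d_min**a) + d_min**a) ** (1.0 / a)

    diameters = sample_diameters(10)                   # meters
    positions = rng.uniform(0.0, 3.6, size=(10, 2))    # 12-foot box is ~3.6 m across
    for d, (x, y) in zip(diameters, positions):
        print(f"crater d={d:.2f} m at ({x:.2f}, {y:.2f})")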

Solar simulator lights are set up around the terrain to create Moon-accurate low-angle, high-contrast illumination. Two cameras, called a stereo imaging pair, mimic how human eyes are set apart to help us perceive depth. The team captured photographs of multiple testbed setups and lighting angles to create a dataset to inform future rover navigation.
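
A minimal sketch of what that stereo pair provides: the pixel disparity between the left and right views converts to metric depth as depth = focal length × baseline / disparity. The camera parameters and file names below are assumed values, not the testbed’s:

    import cv2
    import numpy as np

    # Disparity-to-depth sketch. Camera parameters and file names are assumed
    # placeholder values, not the Lunar Lab's actual rig.
    FOCAL_PX = 700.0      # focal length in pixels (assumed)
    BASELINE_M = 0.06     # distance between the two cameras in meters (assumed)

    left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # to pixels

    valid = disparity > 0                        # unmatched pixels come back <= 0
    depth = np.zeros_like(disparity)
    depth[valid] = FOCAL_PX * BASELINE_M / disparity[valid]   # depth = f * B / d
    print("median scene depth (m):", np.median(depth[valid]))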

“But you can only shovel so much dirt; we are also using physics-based rendering, and are trying to photo-realistically recreate the illumination in these environments,” said Wong. “This allows us to use a supercomputer to render a bunch of images using models that we have decent confidence in, and this gets us a lot more information than we would taking pictures in a lab with three people, for example.”

The result, the Polar Optical Lunar Analog Reconstruction, or POLAR, dataset, provides standard data that rover designers and programmers can use to develop algorithms and set up sensors for safe navigation. The POLAR dataset applies not only to our Moon but to the surfaces of many airless bodies, including Mercury, asteroids, and regolith-covered moons like Mars’ Phobos.

So far, early results show that stereo imaging is promising for use on rovers that will explore the lunar poles.

“One of the mission concepts that’s in development right now, Resource Prospector, that I have the privilege of working on, might be the first mission to land a robot and navigate in the polar regions of the Moon,” said Wong. “And in order to do that, we have to figure out how to navigate where nobody’s ever been.”

This research is funded by the agency’s Advanced Exploration Systems and Game Changing Development programs. NASA’s Solar System Exploration Research Virtual Institute provides the laboratory facilities and operational support.

For more information about NASA technology for future exploration missions, visit:

Read Full Article
*Source: SSERVI.NASA.gov

Continue Reading

Smartphone Advances Drive Smallsats


Terrestrial smartphone technology, based in part on government space research, is finding its way back into space as low-cost, rapidly evolving processors, cameras, GPS receivers and other gear used in bulk by the burgeoning smallsat movement.

In California’s Silicon Valley, where the lifetime of a state-of-the-art smartphone is about one year, engineers at NASA’s Ames Research Center have literally been plugging smartphones into spacecraft to get the most capable hardware into space quickly.


Read Full Story

*Source: AviationWeek.com

Continue Reading

Smart SPHERES Are About to Get A Whole Lot Smarter


Smart devices – such as tablets and phones – increasingly are an essential part of everyday life on Earth. The same can be said for life off-planet aboard the International Space Station. From astronaut tweets to Google+ Hangouts, our reliance on these mobile and social technologies means equipment and software upgrades are an everyday occurrence – like buying a new pair of shoes to replace a pair of well-worn ones.

That’s why the Intelligent Robotics Group at NASA’s Ames Research Center in Moffett Field, Calif., with funding from the Technology Demonstration Missions Program in the Space Technology Mission Directorate, is working to upgrade the smartphones installed on a trio of volleyball-sized free-flying satellites aboard the space station, called Synchronized Position Hold, Engage, Reorient, Experimental Satellites (SPHERES). In 2011, on the final flight of space shuttle Atlantis, NASA sent the first smartphone to the station and mounted it on SPHERES.


Read Full Story

*Source: NASA.gov

Continue Reading

Smartphone powers Star Wars-inspired NASA robot


Miniature satellites resembling the flying robot that helped Luke Skywalker with his light saber training are now serving as mission control’s eyes and ears aboard the International Space Station.

It’s hard not to get freakishly excited when science fiction becomes scientific fact — especially when the origins of that science are rooted in Star Wars.

Think back, young Jedis, to the scene where a fresh-off-Tatooine Luke Skywalker is honing his light saber skills under the tutelage of Obi-Wan Kenobi. A round, floating robot called a remote helps Luke practice his Force-finding mojo. Now, NASA is running experiments with miniature satellites, or nanosatellites, that were inspired by that fictional robot.

Roughly the size of a soccer ball, these robots that fly freely in space are called Spheres (which is short for Synchronized Position Hold Engage Reorient Experimental Satellites). Star Wars connection aside, there’s another remarkable detail about Spheres: they’re powered by smartphones, specifically a Google Nexus S.


Read Full Story

*Source: CNET.com

Continue Reading

NASA’s next big thing is very small


We often think of NASA in grandiose terms — tackling the biggest problems with the biggest thinking, applying the grandest ideas that mankind can conceive. But now, NASA is thinking small in a big way, applying a DIY ethos to spaceflight, and using commercially available tools and technologies to get the job done.

Instead of gigantic systems costing millions of dollars and thousands of man-hours to produce and launch, the next great idea is to focus on the small things: using off-the-shelf products and small-scale design to take an approach to space systems research that is quicker, cheaper and more efficient. Aboard the International Space Station, the SPHERES (Synchronized Position Hold, Engage, Reorient Experimental Satellites) are already doing just that.


Read Full Story

*Source: CNET.com

Continue Reading

NASA and TopCoder to issue Robonaut 2 ‘sight’ challenge


NASA Tournament Lab is launching two new competitions, this time to give Robonaut 2, the humanoid robot aboard the International Space Station, the gift of improved “sight.” The challenges are the latest offered by the Tournament Lab in conjunction with the open innovation platform TopCoder.

The first competition calls on participants to figure out how to enable Robonaut 2, or R2, to identify buttons and switches on a console fitted with LED lights. The winning entry will take the form of an algorithm that works reliably with imagery from R2’s cameras under different lighting conditions. The second competition will build on the first, calling on competitors to write an algorithm that controls the robot’s motions based on the new “sight” capability.
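
For a sense of the vision task the first challenge poses, here is a toy sketch that flags small, bright, saturated blobs in a panel image as candidate lit LEDs. The thresholds and file name are invented for illustration; winning entries had to be far more robust across lighting conditions:

    import cv2
    import numpy as np

    # Toy LED finder: treat small, saturated, very bright blobs as lit
    # indicators. Thresholds and file name are invented for illustration.
    img = cv2.imread("panel.png")
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

    mask = cv2.inRange(hsv, (0, 80, 200), (179, 255, 255))   # bright + saturated
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if 4 <= w <= 40 and 4 <= h <= 40:                    # reject specks and glare
            print(f"candidate LED at ({x + w // 2}, {y + h // 2})")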

Read Full Story

*Source: WashingtonPost.com

Continue Reading

NASA – ‘Smart SPHERES’ Fly High Aboard the International Space Station

Steve Ormsby monitors the SPHERES experiment. Credit: NASA/ARC

On Dec. 12 engineers at NASA’s Ames Research Center, Moffett Field, Calif., and Johnson Space Center in Houston conducted an experiment using small, free-flying robotic satellites called “Smart SPHERES” aboard the International Space Station.

The Smart SPHERES, located in the Kibo laboratory module, were remotely operated from the International Space Station’s Mission Control Center at Johnson to demonstrate how a free-flying robot can perform surveys for environmental monitoring, inspection and other routine housekeeping tasks.

In the future, small robots could regularly perform routine maintenance tasks allowing astronauts to spend more time working on science experiments. In the long run, free-flying robots like Smart SPHERES also could be used to inspect the exterior of the space station or future deep-space vehicles.


Read Full Story

*Source: NASA.gov

Continue Reading


NASA Telerobotics Team to Demonstrate K10 Rover Sept. 22


The K10 robot now undergoing testing at NASA’s Ames Research Center in Moffett Field, Calif. Credit: NASA/ARC

Skywatchers and space enthusiasts across the globe will gather Sept. 22 to celebrate “International Observe the Moon Night” – but one group in Moffett Field, Calif., will get an added thrill: the chance to watch a next-generation NASA robot being put through its paces.

NASA’s Surface Telerobotics team, part of the Human Exploration Telerobotics (HET) project, will help make a night under the lunar limb memorable by demonstrating how its K10 rover deploys a telescope antenna – one of a variety of tasks such sophisticated, articulate “handybots” will conduct in the future to support their human counterparts living and working in space and, someday, on other worlds.


Read Full Story

*Source: NASA.gov

Continue Reading

NASA Channels “The Force” With Smart SPHERES

Three satellites fly in formation as part of the Synchronized Position Hold, Engage, Reorient, Experimental Satellites (SPHERES) investigation. Image Credit: NASA

In an interesting case of science fiction becoming reality, NASA has been testing its SPHERES project over the past few years. The SPHERES project (Synchronized Position Hold, Engage, Reorient, Experimental Satellites) involves spherical satellites about the size of a bowling ball. Flown inside the International Space Station, the satellites are used to test autonomous rendezvous and docking maneuvers. Each satellite features its own power, propulsion, computing and navigation systems.

Read Full Story

*Source: UniverseToday.com

Continue Reading