Curiosity Rover To Get On-The-Go Photo Capability

Software's see-and-react code, which lets rover snap pictures of rocks without stopping, could be adapted for other uses, such as "calling home" when the rover sees something unusual.

Dan Taylor, Contributor

September 21, 2012

3 Min Read

NASA Curiosity Visual Tour: Mars, Revealed

As NASA's Curiosity rover crawls along the surface of Mars at up to 40 yards per day, its mast-mounted camera scans the terrain for rocks of interest. As a next step, the space agency plans to deploy software that makes it easier to target and capture images without stopping along the way.

The software, called Autonomous Exploration for Gathering Increased Science (AEGIS), will let the rover look for rocks of a certain size, shape, or brightness. "The idea is, when it's doing a long drive, a scientist can say, 'Oh, if you see this type of rock in the middle of this drive or at the end of the drive, go ahead and take some high-quality, high-resolution images of it before the rover moves past,'" said Tara Estlin, a senior member of NASA Jet Propulsion Laboratory's Artificial Intelligence Group and AEGIS project leader. Earlier this week, Curiosity parked in front of a football-size rock, which it photographed. NASA scientists plan to use the rover's spectrometer to determine the rock's composition.

AEGIS, which has been used on the Mars Exploration Rover Opportunity since 2009, will be installed on Curiosity in the next nine to 12 months, Estlin said in an interview with InformationWeek. The AEGIS software, developed by JPL, was named NASA's "software of the year" in 2011.

[ Related: NASA Makes Most Of Curiosity Rover Data. ]

AEGIS will let the rover automatically key in on geological features of interest to the Curiosity project team as the rover moves along on its research mission. An automated image-capture process is important because "we can't stop at every point and wait a day or two for ground to be in the loop," Estlin said.

JPL developed AEGIS on Linux-based systems, then tested the software on research rovers. Opportunity uses the software to take a wide-angle image with a low-resolution camera, then picks out rocks in the image to see if there's something of interest. If so, it takes a high-resolution image using an on-board science camera that's capable of zooming in on the subject. The software has potential beyond picture taking. Its see-and-react code could be adapted to other instruments. "We have versions of the software that could stop the rover and call home if it sees something really interesting," Estlin said. "You could [then] drive closer to the object of interest."
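The two-stage process described above can be sketched in a few lines of Python. This is an illustration of the general see-and-react pattern, not JPL's actual AEGIS code; all function names, attributes, and thresholds here are hypothetical.

```python
# Illustrative sketch of the see-and-react loop: survey with a
# low-resolution wide-angle frame, filter detected rocks against
# scientist-specified criteria, and spend the expensive high-resolution
# shot only when a candidate qualifies. Names and thresholds are made up.
from dataclasses import dataclass

@dataclass
class Rock:
    size: float        # apparent diameter in pixels
    roundness: float   # 0.0 (angular) .. 1.0 (round)
    brightness: float  # mean albedo, 0.0 .. 1.0

def select_target(detections, min_size=20, min_brightness=0.6):
    """Return the brightest rock meeting the criteria, or None."""
    candidates = [r for r in detections
                  if r.size >= min_size and r.brightness >= min_brightness]
    return max(candidates, key=lambda r: r.brightness) if candidates else None

def drive_step(detections):
    """One step of the drive loop: image a target without stopping, if any."""
    target = select_target(detections)
    if target is None:
        return "continue driving"  # nothing notable in this frame
    return f"high-res image of rock (size={target.size:.0f}px)"

# Simulated low-resolution survey frame with three detected rocks
frame = [Rock(12, 0.4, 0.8), Rock(35, 0.7, 0.9), Rock(50, 0.2, 0.3)]
print(drive_step(frame))  # only the 35px rock passes both filters
```

The key design point the article highlights is that the cheap low-resolution pass runs continuously, while the costly action (a zoomed science-camera shot, or in other versions stopping and calling home) is triggered only when a detection clears the criteria.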

In fact, the software could be used in future space missions, including fly-bys of asteroids or visits to other planets. "There are all sorts of surface missions that often don't have a lot of time to determine a response," Estlin said. The AEGIS capability was developed as part of a framework called the Onboard Autonomous Science Investigation System (OASIS), which was designed to let a rover identify and react to "serendipitous science opportunities," according to NASA.

About the Author(s)

Dan Taylor


Contributing writer Dan Taylor is managing editor of Inside the Navy.

