The software's see-and-react code, which lets the rover snap pictures of rocks without stopping, could be adapted for other uses, such as "calling home" when the rover sees something unusual.
As NASA's Curiosity rover crawls along the surface of Mars, covering up to 40 yards per day, its mast-mounted camera scans the terrain for rocks of interest. As a next step, the space agency plans to deploy software that makes it easier to target and capture images without stopping along the way.
The software, called Autonomous Exploration for Gathering Increased Science (AEGIS), will let the rover look for rocks of a certain size, shape, or brightness. "The idea is, when it's doing a long drive, a scientist can say, 'Oh, if you see this type of rock in the middle of this drive or at the end of the drive, go ahead and take some high-quality, high-resolution images of it before the rover moves past,'" said Tara Estlin, a senior member of NASA Jet Propulsion Laboratory's Artificial Intelligence Group and AEGIS project leader.
Earlier this week, Curiosity parked in front of a football-size rock, which it photographed. NASA scientists plan to use the rover's spectrometer to determine the rock's composition.
AEGIS, which has been used on the Mars Exploration Rover Opportunity since 2009, will be installed on Curiosity in the next nine to 12 months, Estlin said in an interview with InformationWeek. The AEGIS software, developed by JPL, was named NASA's "software of the year" in 2011.
AEGIS will let the rover automatically key in on geological features of interest to the Curiosity project team as the rover moves along on its research mission. An automated image-capture process is important because "we can't stop at every point and wait a day or two for ground to be in the loop," Estlin said.
JPL developed AEGIS on Linux-based systems, then tested the software on research rovers. Opportunity uses the software to take a wide-angle image with a low-resolution camera, then picks out rocks in the image to see if there's something of interest. If so, it takes a high-resolution image using an on-board science camera that's capable of zooming in on the subject.
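The two-stage workflow Estlin describes, scanning a wide-angle frame for candidate rocks and then filtering them against scientist-set criteria before committing to a high-resolution shot, could be sketched roughly as follows. All class names, fields, and thresholds here are illustrative assumptions; the real onboard AEGIS code and its API are not public in this form.

```python
# Illustrative sketch of an AEGIS-style detect-then-filter pipeline.
# Every name and threshold below is hypothetical, not the actual AEGIS API.
from dataclasses import dataclass

@dataclass
class Rock:
    size: float        # apparent size in pixels (stand-in measure)
    brightness: float  # mean pixel intensity, 0-255

def matches_science_criteria(rock, min_size=20.0, min_brightness=180.0):
    """Filter step: scientists pre-set size/brightness criteria for a drive."""
    return rock.size >= min_size and rock.brightness >= min_brightness

def select_targets(detections):
    """Stage 2: pick rocks worth a high-resolution follow-up image.

    `detections` stands in for rocks already extracted from the
    low-resolution, wide-angle frame (stage 1); a real system would
    segment the image itself.
    """
    return [r for r in detections if matches_science_criteria(r)]

# Example drive: two detections pass the filter, one is too small.
frame = [Rock(size=35, brightness=200),
         Rock(size=10, brightness=220),
         Rock(size=50, brightness=190)]
targets = select_targets(frame)
print(len(targets))  # 2 rocks selected for high-resolution imaging
```

The point of the split is economy: the cheap wide-angle pass runs continuously during a drive, and the expensive zoomed science camera fires only when a detection clears the pre-set criteria.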
The software has potential beyond picture taking. Its see-and-react code could be adapted to other instruments. "We have versions of the software that could stop the rover and call home if it sees something really interesting," Estlin said. "You could [then] drive closer to the object of interest."
In fact, the software could be used in future space missions, including fly-bys of asteroids or visits to other planets. "There are all sorts of surface missions that often don't have a lot of time to determine a response," Estlin said.
The AEGIS capability was developed as part of a framework called the Onboard Autonomous Science Investigation System (OASIS), which was designed to let a rover identify and react to "serendipitous science opportunities," according to NASA.