Robot history was made on October 8th in the second annual DARPA Grand Challenge, a desert race held near Las Vegas featuring autonomous unmanned vehicles. Sponsored by the Defense Advanced Research Projects Agency, the event was created as a way to spur progress toward developing unmanned battlefield vehicles. Among the technologies guiding the bots were artificial intelligence and the same rules engine technology used in credit checks and business process management deployments.
In last year's inaugural DARPA Grand Challenge, $1 million was offered to the first vehicle to cross the finish line in less than 10 hours, but none of the competitors finished the race. In fact, the top robot traveled less than eight miles. This year the prize was doubled to $2 million, and of the 195 applicants, 43 made the cut as semi-finalists and 23 as finalists.
Five vehicles completed the 132-mile obstacle course. The winner was the Stanford University Racing Team's "Stanley," a modified Volkswagen Touareg outfitted with GPS, laser range finders, radar, vision systems and seven Pentium M computers to analyze sensor data and geospatial mapping information (www.stanfordracing.org). Stanley finished the course in 6 hours and 54 minutes, averaging 19 miles per hour.
Second place went to the Red Team (www.redteamracing.org) from Carnegie Mellon University. "Sandstorm," a modified first-generation Hummer, finished the course in 7 hours and 4 minutes. Carnegie Mellon also took third place with "H1ghlander," a modified Hummer H1 that finished in 7 hours and 14 minutes.
What made the difference in this year's performance? The teams "have great software and great smarts," said DARPA director Tony Tether. "The Stanford machine uses a learning algorithm and it actually learns the more it's used. I'm told it has hundreds and perhaps more than 1,000 miles of experience in desert terrain."
Race officials withheld final course information until two hours before the race, forcing contestants to rely on more than preprogrammed GPS waypoints. Tank traps, parked vehicles, cones and tunnels forced vehicles to deal with unmapped obstacles in real time. That's where the combination of vision systems, radar and range-finding sensors came in.
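The real-time layer described above can be caricatured as fusing independent range reports: each sensor (lidar, radar, vision) reports the distance to the nearest obstacle ahead, and the planner reacts to the closest confirmed return. The following Python sketch is purely illustrative — the sensor names, thresholds and decision labels are hypothetical, not any team's actual software:

```python
# Hypothetical sketch of fusing obstacle-range reports from several sensors.
# Each sensor reports the distance (meters) to the nearest obstacle ahead,
# or None when it sees nothing.

def fuse_ranges(readings):
    """Return the nearest obstacle distance reported by any sensor, or None."""
    hits = [r for r in readings.values() if r is not None]
    return min(hits) if hits else None

def decide(readings, stop_dist=10.0, slow_dist=30.0):
    """Pick a driving action based on the closest fused obstacle return."""
    nearest = fuse_ranges(readings)
    if nearest is None or nearest > slow_dist:
        return "cruise"   # nothing close enough to matter
    if nearest > stop_dist:
        return "slow"     # obstacle ahead, reduce speed
    return "swerve"       # too close, evade

print(decide({"lidar": 25.0, "radar": 28.5, "vision": None}))  # -> slow
```

Real systems weight each sensor's reliability and track obstacles over time; taking the minimum over raw returns is the simplest possible fusion rule.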
In contrast to the large, university-led teams, semi-finalist Team Jefferson, from startup company Perrone Robotics (www.perronerobotics.com/dgc), competed on a bare-bones budget using open-source software and donated off-the-shelf commercial software, including Fair Isaac's Blaze Advisor rules engine. The team's customized dune buggy, "Tommy," relied on Perrone's Java-based Mobile Autonomous X-bot software platform. Fair Isaac's rules engine was used offline to compare the final course data with available geospatial mapping data and plot the safest possible route within the two-hour window.
"Last year the teams were poring over the course data manually within that two-hour window," Perrone explained. Determining routes is "an example of something that can be easily automated with a rules engine. We're going to let the rules engine pore over the data and plot those points, and it will do it accurately within 20 minutes."
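The offline route-vetting step Perrone describes can be sketched as a set of declarative rules applied to each candidate waypoint against geospatial terrain data, keeping only the waypoints that pass every rule. This Python sketch is a hypothetical illustration — it is not the Blaze Advisor API or Team Jefferson's code, and the rule names, thresholds and course data are invented:

```python
# Hypothetical sketch of rules-based waypoint vetting (not Blaze Advisor's API).
# Each rule is a predicate over a waypoint and a terrain lookup; a waypoint
# survives only if every rule passes.

def max_slope_rule(wp, terrain):
    """Reject waypoints on terrain steeper than 15 degrees (assumed limit)."""
    return terrain.get(wp["id"], {}).get("slope_deg", 0) <= 15

def known_hazard_rule(wp, terrain):
    """Reject waypoints flagged as hazards in the geospatial data."""
    return not terrain.get(wp["id"], {}).get("hazard", False)

RULES = [max_slope_rule, known_hazard_rule]

def vet_route(waypoints, terrain):
    """Return the waypoints that satisfy every rule, in course order."""
    return [wp for wp in waypoints if all(rule(wp, terrain) for rule in RULES)]

# Invented course data: waypoint IDs with terrain attributes.
waypoints = [{"id": "wp1"}, {"id": "wp2"}, {"id": "wp3"}]
terrain = {
    "wp1": {"slope_deg": 5, "hazard": False},
    "wp2": {"slope_deg": 22, "hazard": False},  # too steep -> rejected
    "wp3": {"slope_deg": 8, "hazard": True},    # known hazard -> rejected
}

print([wp["id"] for wp in vet_route(waypoints, terrain)])  # -> ['wp1']
```

The appeal of the rules-engine approach is that the vetting criteria stay declarative and editable, so the same rule set can be rerun mechanically against new course data in minutes rather than combed through by hand.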
Unfortunately, Tommy wasn't a finalist due to an obstacle avoidance mishap during a qualifying trial: The vehicle swerved to avoid a barrier and crashed into a concrete wall. Thus, the offline rules-based system was never put to the test of plotting the Grand Challenge course.
The winning Stanford team had nearly 70 faculty and student contributors, drawn largely from the School of Engineering, and it spent 15 months developing and testing software and systems.
DARPA officials said the agency's research goals had been met, so it has no plans to hold future challenges. The military must now improve the durability and reliability of the technologies: 18 of the 23 robot vehicles failed to complete the course.
— Doug Henschen