
Ford's Autonomous Car: Under The Hood

Ford is building an autonomous car and the technology for the test vehicle is more familiar than you might think.

Ford has developed an autonomous car to the point that it is testing it in the snow. For some of us, the question isn't whether the company is testing its car in extreme weather, but how it built it in the first place. On the show floor at CES 2016, InformationWeek had a chance to talk with a Ford software engineer to get a look inside the autonomous test vehicle on display.

One thing you should know is that this was a tough interview to get. Not because Ford PR or the engineer was difficult to work with -- quite the contrary. The problems came because people kept walking up to ask questions during the interview. We were interrupted at least half a dozen times by people who wanted to know more about the project -- or who had ideas to offer for making the car better. Whether they intend to buy or not, people are curious about autonomous vehicles and are eager to talk with engineers who can fill them in on details.

Wayne Williams, the Ford software engineer we spoke with, said that the complexity of an autonomous vehicle starts with the need for the system to create an image of the world around it from a constantly changing stream of data. This image is built from sensor data that informs the virtual driver's ability to detect, identify, and react to everything around it.
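The detect-identify-react pipeline Williams describes can be sketched in C (the language the article later notes Ford's applications use). Every type, name, and threshold below is an invented illustration of the idea, not anything from Ford's code:

```c
#include <assert.h>

/* Hypothetical sketch of a detect / identify / react step. A real
 * virtual driver fuses many sensors; this reduces it to one detection. */

typedef enum { OBJ_UNKNOWN, OBJ_VEHICLE, OBJ_PEDESTRIAN } object_class;
typedef enum { ACT_CRUISE, ACT_SLOW, ACT_BRAKE } action;

typedef struct {
    double distance_m;   /* range reported by the sensors (detect) */
    double speed_mps;    /* closing speed toward the car */
    object_class kind;   /* result of the identify step */
} detection;

/* React: map an identified object to a driving action. */
static action react(const detection *d) {
    if (d->kind == OBJ_PEDESTRIAN && d->distance_m < 20.0)
        return ACT_BRAKE;
    /* Brake if time-to-contact drops under two seconds. */
    if (d->distance_m / (d->speed_mps > 0 ? d->speed_mps : 1e-9) < 2.0)
        return ACT_BRAKE;
    if (d->distance_m < 50.0)
        return ACT_SLOW;
    return ACT_CRUISE;
}
```

In a real system this loop would run many times per second against a full model of the surroundings; the point here is only the shape of the pipeline.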

In Ford's autonomous vehicle test platform, the data from an array of sensors flows through a network to a processing unit. While these systems probably bear little resemblance to those that would be in place in a production vehicle, they are built around technology that will seem familiar to enterprise IT professionals.

[ Read Obama Proposes $4 Billion Budget For Self-Driving Cars. ]

To begin with, Williams said that the network itself is standard TCP/IP Ethernet. We talked about other automotive networks and the vulnerability that was demonstrated at Black Hat 2015. He said that security is now top-of-mind for anyone working in automotive electronics, and that the network supporting the virtual driver in Ford's autonomous vehicle is separate both from all other networks on the vehicle and from any wireless communications. Williams said that even engineers who want to update the system have to do so through a physical connection with the vehicle.
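Because TCP delivers a byte stream rather than discrete packets, sensor traffic on such a network needs some framing convention so the receiver can find message boundaries. The wire format below is purely an assumed illustration of that standard technique, not Ford's actual protocol:

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Hypothetical sensor message for an isolated TCP/IP sensor network. */
typedef struct {
    uint32_t sensor_id;
    uint32_t timestamp_ms;
    float    value;
} sensor_msg;

/* Encode with a 4-byte big-endian length prefix so the receiver can
 * split the TCP byte stream back into messages. Returns bytes written. */
static size_t frame_msg(const sensor_msg *m, uint8_t *out) {
    uint32_t len = (uint32_t)sizeof(*m);
    out[0] = (uint8_t)(len >> 24);
    out[1] = (uint8_t)(len >> 16);
    out[2] = (uint8_t)(len >> 8);
    out[3] = (uint8_t)(len);
    memcpy(out + 4, m, sizeof(*m));
    return 4 + sizeof(*m);
}

/* Decode one framed message; returns total bytes consumed. */
static size_t unframe_msg(const uint8_t *in, sensor_msg *m) {
    uint32_t len = ((uint32_t)in[0] << 24) | ((uint32_t)in[1] << 16) |
                   ((uint32_t)in[2] << 8)  |  (uint32_t)in[3];
    memcpy(m, in + 4, len);
    return 4 + len;
}
```

Keeping this traffic on a physically separate network, as Williams describes, means an attacker would need hands-on access to the vehicle before any of these bytes could be tampered with.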


An array of sensors provides the data for the car's virtual driver.

The sensors that make up the bulk of the nodes on the network are a collection of video, ultrasound, and other input devices that combine to create a complete multidimensional image of the world around the vehicle. Williams said that the sensors are both critical to the success of the project and one of the key technology challenges for engineers. He indicated that stitching the pieces of the world-defining image together is a problem that is well-defined and solvable. The harder issue is capturing sensor data of sufficient quality and resolution for that electronic image to be complete.
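One standard, textbook way to stitch overlapping sensor readings together is inverse-variance weighting, where noisier sensors count for less and the fused estimate is more certain than either input. This is offered only as an illustration of the general technique, not as Ford's algorithm:

```c
#include <assert.h>
#include <math.h>

/* One sensor's estimate of the distance to an obstacle, with a
 * variance describing how noisy that sensor is. */
typedef struct {
    double range_m;
    double variance;
} estimate;

/* Inverse-variance fusion: weight each reading by 1/variance, so the
 * more trustworthy sensor dominates and uncertainty shrinks. */
static estimate fuse(estimate a, estimate b) {
    double wa = 1.0 / a.variance, wb = 1.0 / b.variance;
    estimate out;
    out.range_m  = (wa * a.range_m + wb * b.range_m) / (wa + wb);
    out.variance = 1.0 / (wa + wb);
    return out;
}
```

The fused variance is always smaller than either input's, which is the mathematical version of Williams's point: combining sensors is well understood, so the quality of each individual reading is what limits the final image.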

The virtual driver itself lives in a Linux cluster that sits in the trunk of the test vehicle. Williams said that the cluster is five nodes running Ubuntu Linux. Multiple nodes are required to handle all the sensor input and process it quickly enough to make driving decisions. Asked why there are five nodes in the cluster, Williams was succinct. "That's all that would fit in the trunk," he said.
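Splitting the sensor load across a cluster requires some rule for deciding which node handles which stream. The five-node count comes from the article; the pin-by-sensor-id scheme below is an invented illustration, not Ford's actual design:

```c
#include <assert.h>

enum { CLUSTER_NODES = 5 };   /* "That's all that would fit in the trunk" */

/* Pin each sensor stream to one node by hashing its id, so a given
 * sensor's readings are always processed in order on the same machine. */
static unsigned node_for_sensor(unsigned sensor_id) {
    return sensor_id % CLUSTER_NODES;
}
```

Any scheme that spreads streams evenly would do; the point is that the throughput problem Williams describes is solved with ordinary cluster techniques, not exotic hardware.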


The virtual driver lives in a five-node Linux cluster in the trunk.

As for the software running on the cluster, that's created using standard technology, as well. Williams said that C has been used for the programming in the various applications that make up the virtual driver. When I asked why a more modern or more specialized language wasn't used, Williams smiled. "They weren't part of the specification," he explained.

During the conversation, one CES attendee walked up and began asking why particular bits of functionality and code weren't lifted from driver assist programs to advance the autonomous vehicle program. Williams was quick to explain that assisting a human driver and creating a virtual driver are two distinct problems that share far less, electronically or conceptually, than it might seem at the outset. He said that the two programs are distinct, with separate management and development teams.


Constantly spinning sensors help create a complete image of a virtual world for the control system in the vehicle's trunk.

When asked about any sort of timeline to market, Williams demurred. He is an engineer working on solving a problem. Decisions about turning a solution into a product involve many factors that have nothing to do with engineering.

When asked about the most significant problems that must be solved before the autonomous car can be considered ready to be a product, he was quick to answer. "The remaining issues are societal. Sensors are low-hanging and gainable fruit. Algorithms are going to evolve to be road-ready over the next four to five years," he said. "Government, industry, and consumer groups must be in a discussion over who takes responsibility if things go wrong. For them, the question is 'What's the best, worst answer I can find?'" It's the sort of question that no number of sensors can inform.