AI, Machine Learning Drive Autonomous Vehicle Development - InformationWeek



The applications for artificial intelligence and machine learning in self-driving vehicles range from infotainment systems to advanced driver assistance systems, according to a new report from IHS.


Artificial intelligence and machine learning are burgeoning areas of the tech industry, and a new report from analyst firm IHS indicates AI developers are increasingly focused on autonomous vehicles.

AI systems, which continuously learn from experience and can discern and recognize their surroundings, have the potential to be highly beneficial when integrated into an autonomous vehicle's software architecture.

Autonomous vehicles already rely on sophisticated radar systems and multiple cameras and sensors to analyze and adapt to a rapidly changing environment. AI could help these self-driving vehicles recognize patterns and learn from the behavior of other vehicles on the road, according to IHS.

The IHS report estimates that unit shipments of AI systems used in infotainment and advanced driver assistance systems (ADAS) will rise from just 7 million in 2015 to 122 million by 2025.

(Image: Henrik5000/iStockphoto)


AI applications in infotainment systems could include directing the vehicle to a gas station when fuel runs low, or recognizing a faster route based on traffic conditions.

The report also noted that AI applications in the infotainment category could incorporate speech recognition, gesture recognition (including handwriting recognition), eye tracking and driver monitoring, virtual assistance, and natural language interfaces.

Applications for ADAS and autonomous vehicles involve camera-based machine vision systems, radar-based detection units, driver condition evaluation, and sensor fusion electronic control units (ECUs).
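To make the sensor fusion idea concrete, here is a minimal illustrative sketch, not drawn from the IHS report: it combines a radar distance estimate with a camera-based one using inverse-variance weighting, a common fusion technique. The sensor names, values, and variances are invented for this example.

```python
# Hypothetical illustration of sensor fusion: combining two noisy
# distance measurements (meters) into a single, more reliable estimate.

def fuse_estimates(radar_dist, radar_var, cam_dist, cam_var):
    """Fuse two measurements by inverse-variance weighting.

    Each measurement is weighted by the inverse of its variance, so the
    more reliable sensor contributes more to the fused result.
    """
    w_radar = 1.0 / radar_var
    w_cam = 1.0 / cam_var
    fused = (w_radar * radar_dist + w_cam * cam_dist) / (w_radar + w_cam)
    fused_var = 1.0 / (w_radar + w_cam)  # fused estimate is less uncertain
    return fused, fused_var

# Radar is typically more accurate at range than a camera, so its
# measurement dominates the fused estimate.
dist, var = fuse_estimates(radar_dist=50.0, radar_var=0.25,
                           cam_dist=53.0, cam_var=4.0)
```

The fused distance lands close to the radar's reading, and the fused variance is smaller than either sensor's alone, which is why an ECU that fuses sensors can outperform any single input.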

"It learns, as human beings do, from real sounds, images, and other sensory inputs," Luca De Ambroggi, principal analyst for automotive semiconductors at IHS Technology, wrote in the June 13 report summary. "The system recognizes the car's environment and evaluates the contextual implications for the moving car."

As automakers continue to pour money into the research and production of ADAS systems and autonomous vehicles, the IHS report highlighted two companies in particular -- BMW and electric car specialist Tesla Motors -- as having made notable strides in AI.

The analyst firm noted the 2015 BMW 7 Series was the first car to use a hybrid approach to infotainment human-machine interfaces, offering embedded hardware able to perform voice recognition in the absence of wireless connectivity.


In ADAS applications, Tesla claims to implement neural network functionality based on the Mobileye EyeQ3 processor in its autonomous driving control unit.

Mobileye's system-on-a-chip (SoC) handles computationally intensive, real-time tasks such as visual recognition, scene interpretation, traffic sign recognition, and lane-departure warnings.

While the IHS report notes that the hardware required to embed AI and deep learning at mass-production volume is not yet available, due to the cost and size of the computers needed, companies such as Mobileye are already partnering with global automakers like Nissan and Volkswagen.

Google has also been researching the AI capabilities of its self-driving cars, albeit in a more audible way.

As revealed in the company's monthly report, the software is designed to recognize when honking may help alert other drivers to the vehicle's presence -- for instance, when a driver begins swerving into the car's lane or when another car is backing out of a blind driveway.
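A rule of the kind the article describes might be sketched as follows. This is purely illustrative: the situation encoding, function name, and thresholds are invented here, and Google's actual software is far more sophisticated.

```python
# Hypothetical sketch: deciding when a honk might help alert another driver.
# All parameter names and threshold values are invented for illustration.

def should_honk(other_lateral_speed, gap_m, other_reversing_blind):
    """Return True when honking may help alert the other driver.

    other_lateral_speed   -- m/s the other car is drifting toward our lane
    gap_m                 -- current gap to the other car, in meters
    other_reversing_blind -- True if a car is backing out of a blind driveway
    """
    # Honk only when the other driver likely hasn't seen us:
    # a fast drift into our lane at close range, or a blind reversal.
    drifting_into_lane = other_lateral_speed > 0.5 and gap_m < 10.0
    return drifting_into_lane or other_reversing_blind

# A car swerving toward our lane at close range warrants a honk.
alert = should_honk(other_lateral_speed=1.2, gap_m=6.0,
                    other_reversing_blind=False)
```

The interesting design problem, which the self-driving report alludes to, is tuning such rules so the car honks only when it genuinely helps, rather than becoming a nuisance.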

Nathan Eddy is a freelance writer for InformationWeek. He has written for Popular Mechanics, Sales & Marketing Management Magazine, FierceMarkets, and CRN, among others. In 2012 he made his first documentary film, The Absent Column. He currently lives in Berlin.
