Artificial intelligence and machine learning technology are burgeoning spaces in the tech industry, and a new report from analytics firm IHS indicates AI developers are increasingly focused on the autonomous vehicle space.
AI systems, which learn continuously from experience as they discern and recognize their surroundings, have the potential to be highly beneficial when integrated into an autonomous vehicle's software architecture.
Autonomous vehicles already rely on sophisticated radar systems and multiple cameras and sensors to analyze and adapt to a rapidly changing environment. AI could help these self-driving vehicles recognize patterns and learn from the behavior of other vehicles on the road, according to IHS.
The IHS report estimated unit shipments of AI systems used in infotainment and advanced driver assistance systems (ADAS) will rise from just 7 million in 2015 to 122 million by 2025.
AI applications in infotainment systems could include the ability to direct the vehicle to a gas station when it's running low on fuel, or to recognize a faster route based on traffic conditions.
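The decision logic described above can be pictured as a simple set of rules. The following toy sketch is purely illustrative; the function name, thresholds, and inputs are hypothetical and do not come from the IHS report or any production system:

```python
# Illustrative sketch only: a toy rule for the low-fuel rerouting behavior
# described above. Names and thresholds are hypothetical, not from the
# IHS report or any shipping infotainment system.

def choose_destination(fuel_fraction, planned_route_min, alternate_route_min):
    """Return a simple driving decision based on fuel level and route times."""
    LOW_FUEL_THRESHOLD = 0.15  # assumed cutoff: below 15% of tank, refuel first

    if fuel_fraction < LOW_FUEL_THRESHOLD:
        return "detour_to_gas_station"
    # Otherwise prefer whichever route is currently faster given traffic.
    if alternate_route_min < planned_route_min:
        return "take_alternate_route"
    return "stay_on_planned_route"

print(choose_destination(0.10, 30, 35))  # low fuel takes priority
print(choose_destination(0.50, 30, 22))  # faster alternate route wins
```

A real system would of course replace these fixed thresholds with models learned from driver behavior and live traffic data, which is the shift the IHS report is describing.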
The report noted that AI applications in the infotainment category could also incorporate speech recognition, gesture recognition (including handwriting recognition), eye tracking and driver monitoring, virtual assistance, and natural language interfaces.
Applications for ADAS and autonomous vehicles involve camera-based machine vision systems, radar-based detection units, driver condition evaluation, and sensor fusion engine control units (ECUs).
"It learns, as human beings do, from real sounds, images, and other sensory inputs," Luca De Ambroggi, principal analyst for automotive semiconductors at IHS Technology, wrote in the June 13 report summary. "The system recognizes the car's environment and evaluates the contextual implications for the moving car."
As automakers continue to pour money into the research and production of ADAS systems and autonomous vehicles, the IHS report highlighted two companies in particular -- BMW and electric car specialist Tesla Motors -- as having made notable strides in AI.
The analyst firm noted the 2015 BMW 7 Series was the first car to use a hybrid approach to infotainment human-machine interfaces, offering embedded hardware able to perform voice recognition in the absence of wireless connectivity.
In ADAS applications, Tesla claims to implement neural network functionality based on the Mobileye EYEQ3 processor in its autonomous driving control unit.
Mobileye's system-on-a-chip (SoC) handles computationally intensive, real-time visual recognition tasks, including scene interpretation, traffic sign recognition, and lane-departure warnings.
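To give a sense of what a lane-departure warning amounts to at its simplest, here is a minimal heuristic sketch. The lane geometry, threshold values, and function name are hypothetical simplifications; a vision SoC like Mobileye's derives the lane position from camera imagery and runs far more sophisticated logic in hardware:

```python
# Illustrative sketch only: a minimal lane-departure warning check.
# Parameters are hypothetical; real systems estimate lane position from
# camera imagery and account for speed, turn signals, and road curvature.

def lane_departure_warning(lateral_offset_m, lane_half_width_m=1.8,
                           margin_m=0.3):
    """Warn when the vehicle's center drifts too close to a lane boundary.

    lateral_offset_m: signed distance (meters) of the vehicle center from
    the lane center, as a vision system might estimate it.
    """
    # Trigger once the vehicle is within `margin_m` of either lane edge.
    return abs(lateral_offset_m) > (lane_half_width_m - margin_m)

print(lane_departure_warning(0.2))  # well centered: no warning
print(lane_departure_warning(1.6))  # drifting toward the boundary: warning
```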
While the IHS report notes the hardware required to embed AI and deep learning at mass-production volume is not currently available due to the high cost and size of the computers needed, companies like Mobileye are already starting to partner with global automakers like Nissan and Volkswagen.
Google has also been researching the AI capabilities of its self-driving cars, albeit in a more audible way.
As revealed in the company's monthly report, the software is designed to recognize when honking may help alert other drivers to the vehicle's presence -- for instance, when a driver begins swerving into the car's lane or when another car is backing out of a blind driveway.

Nathan Eddy is a freelance writer for InformationWeek. He has written for Popular Mechanics, Sales & Marketing Management Magazine, FierceMarkets, and CRN, among others. In 2012 he made his first documentary film, The Absent Column. He currently lives in Berlin.