Commentary
7/2/2016 12:06 PM
Thomas Claburn

Tesla Autopilot Crash Under NHTSA Investigation

The National Highway Traffic Safety Administration is looking into the circumstances surrounding a fatal accident involving a Tesla being driven with Autopilot engaged.


The National Highway Traffic Safety Administration has opened an inquiry into the Autopilot system in Tesla's Model S following the death of a driver who was using the system.

In a statement posted on the Tesla Motors website on June 30, the company acknowledged the inquiry and characterized the incident as "the first known fatality in just over 130 million miles where Autopilot was activated."
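
For scale, Tesla's figure works out to fewer than one known fatality per 100 million miles driven with Autopilot engaged. A back-of-envelope calculation in Python, using only the single number in Tesla's statement (which says nothing about how Autopilot miles compare with typical driving conditions):

# Rate implied by Tesla's statement: one known fatality in just over
# 130 million miles driven with Autopilot activated.
autopilot_miles = 130_000_000
known_fatalities = 1

rate_per_100m = known_fatalities / autopilot_miles * 100_000_000
print(f"{rate_per_100m:.2f} fatalities per 100 million Autopilot miles")
# Output: 0.77 fatalities per 100 million Autopilot miles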

The NHTSA said in a statement Tesla had alerted the agency to the crash, which occurred on May 7 in Williston, Fla.

The Levy Journal Online, which covers Levy County, Fla., where the crash occurred, described the accident based on an account provided by the Florida Highway Patrol. A tractor-trailer traveling west on US 27A made a left turn onto NE 140 Court as the Tesla approached from the opposite direction. The Tesla passed underneath the 18-wheeler, its roof striking the trailer, and continued along the road before hitting two fences and a utility pole.

(Image: Google)

"Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied," Tesla said in its statement. "The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S."

The failure of Tesla's computer vision system to distinguish the truck from the similarly colored sky appears to have been compounded by radar code designed to reduce false positives during automated braking. Asked on Twitter why the Tesla's radar didn't detect what its cameras missed, CEO Elon Musk responded, "Radar tunes out what looks like an overhead road sign to avoid false braking events."
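
Musk's explanation points to a classic tradeoff in sensor filtering: suppressing returns that look like fixed overhead structures avoids phantom braking, but it can also suppress a real obstacle that resembles one. Below is a minimal, hypothetical Python sketch of such a filter. It is not Tesla's code; the data structure, field names, and 2-meter threshold are invented for illustration.

from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float           # distance to the return, in meters
    height_m: float          # estimated height of the return above the road
    radial_speed_mps: float  # closing speed along the radar beam (~0 if static)

# Hypothetical clearance threshold: a return that sits well above bumper
# height and is not closing gets treated as an overhead structure (a sign
# or a bridge) and ignored, to avoid false-positive braking events.
OVERHEAD_HEIGHT_M = 2.0

def should_brake(ret: RadarReturn) -> bool:
    """Decide whether a radar return should trigger automatic braking."""
    looks_overhead = (ret.height_m >= OVERHEAD_HEIGHT_M
                      and ret.radial_speed_mps <= 0.0)
    return not looks_overhead

# The failure mode described in the article: the flat, high-riding side of
# a trailer crossing the road presents a return this rule cannot tell apart
# from an overhead sign, so braking is suppressed.
crossing_trailer = RadarReturn(range_m=60.0, height_m=2.3, radial_speed_mps=0.0)
print(should_brake(crossing_trailer))  # False: braking suppressed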

The driver of the Model S, identified in media reports as 40-year-old Joshua D. Brown of Canton, Ohio, died at the scene.

The driver of the truck, 62-year-old Frank Baressi, told the Associated Press that Brown was "playing Harry Potter on the TV screen" at the time of the crash.

A spokesperson for the Florida Highway Patrol did not immediately respond to a request to confirm details about the accident.

In its June 30 statement, Tesla said drivers who engage Autopilot are warned to keep both hands on the wheel at all times. Autopilot, despite its name, is intended as an assistive feature rather than an alternative to manual control.

The incident has stoked doubts about the viability of self-driving cars and the maturity of Tesla's technology. Clearly, a computer vision system that cannot distinguish a truck from the sky in certain light conditions needs further improvement. It was unclear at press time whether Tesla will face any liability claims related to its code or sensing hardware.

However, Tesla insisted in its statement that, when Autopilot is used under human supervision, "the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving."

(Image: Tesla)

In April, at an event in Norway, Musk said, "The probability of having an accident is 50% lower if you have Autopilot on," according to Electrek.

That may be, but data isn't the only consideration. When human lives are at stake, perception and emotion come into play. Automated driving systems will have to be demonstrably better than human drivers before people trust them with their lives.

Yet perfection is too much to expect from autopilot systems. Machines fail, and fallible people are likely to remain in the loop. In aviation, where automation is common, there are concerns that it degrades the skills pilots need when manual intervention is called for. If the same holds true for cars with autopilot systems, we can expect to become worse drivers, less able to respond to emergencies, even as those systems reduce fatalities overall.

There may be no getting around the fact that, given current vehicle designs, driving down a highway at high speed entails some degree of risk, whether a person or a computer is at the wheel.

Thomas Claburn has been writing about business and technology since 1996, for publications such as New Architect, PC Computing, InformationWeek, Salon, Wired, and Ziff Davis Smart Business. Before that, he worked in film and television, having earned a not particularly useful ... View Full Bio
Comments
Charlie Babcock, Author
7/7/2016 | 2:27:50 PM
Auto-pilot: before and after accident
In a June 30 blog post, Tesla wrote that Autopilot, "with lane keeping and automatic braking capabilities – among others – is a driving-assist feature and is not intended to be used as a fully autonomous vehicle technology." That's after the May 7 accident. How did Tesla present Autopilot to customers before the accident?
Charlie Babcock, Author
7/6/2016 | 7:37:08 PM
Ah, auto-pilot is not equivalent to 'self-driving car'
In the July 2 New York Times, the headline read "A Fatality In A Self-Driving Car Forces Tesla To Confront Its Limits." The car was not a self-driving car. It was a software- and sensor-enhanced form of cruise control, and Tesla urges drivers using it not to take their hands off the wheel or their attention from the road. I would not prejudge the outcome of the investigation by pillorying the driver. But I certainly urge auto-pilot users to put some limits on total trust in auto-pilot.