Commentary
Thomas Claburn
7/2/2016 12:06 PM

Tesla Autopilot Crash Under NHTSA Investigation

The National Highway Traffic Safety Administration is looking into the circumstances surrounding a fatal accident involving a Tesla that was being driven with Autopilot engaged.

The National Highway Traffic Safety Administration has opened an inquiry into the autopilot system in Tesla's Model S following the death of a driver who was using the system.

In a statement posted on the Tesla Motors website on June 30, the company acknowledged the inquiry and characterized the incident as "the first known fatality in just over 130 million miles where Autopilot was activated."

The NHTSA said in a statement that Tesla had alerted the agency to the crash, which occurred on May 7 in Williston, Fla.

The Levy Journal Online, which covers Levy County, Fla., where the crash occurred, described the accident based on an account provided by the Florida Highway Patrol. A tractor-trailer was traveling west on US 27A and made a left turn onto NE 140 Court as the Tesla driver was heading in the opposite direction. The Tesla passed underneath the 18-wheeler, its roof colliding with the underside of the trailer, and then continued along the road before striking two fences and a utility pole.

(Image: Google)

"Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied," Tesla said in its statement. "The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S."

The failure of Tesla's computer vision system to distinguish the truck from the similarly colored sky appears to have been compounded by radar code designed to reduce false positives during automated braking. Asked on Twitter why the Tesla's radar didn't detect what its cameras missed, CEO Elon Musk responded, "Radar tunes out what looks like an overhead road sign to avoid false braking events."
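
That tradeoff is easy to picture in code. What follows is a minimal, purely hypothetical sketch in Python of the kind of gating Musk's tweet describes; the RadarReturn fields, the thresholds, and the should_auto_brake logic are assumptions invented for illustration, not Tesla's implementation.

from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float            # distance to the detected object, in meters
    elevation_deg: float      # angle of the return above the sensor's horizon
    closing_speed_mps: float  # rate at which the gap is shrinking

# Illustrative thresholds only; a production system would tune these
# against large volumes of recorded fleet driving.
OVERHEAD_ELEVATION_DEG = 8.0  # above this, a return "looks like" a sign or bridge
BRAKE_RANGE_M = 60.0          # only brake for obstacles closer than this

def should_auto_brake(ret: RadarReturn) -> bool:
    """Decide whether a single radar return should trigger automatic braking.

    Returns that sit high above the road are treated as overhead
    structures and ignored, to avoid false braking events.
    """
    if ret.elevation_deg > OVERHEAD_ELEVATION_DEG:
        return False  # tuned out as an overhead road sign
    return ret.range_m < BRAKE_RANGE_M and ret.closing_speed_mps > 0

# An apparent overhead structure at 50 m is ignored; a low obstacle is not.
print(should_auto_brake(RadarReturn(50.0, 12.0, 25.0)))  # False
print(should_auto_brake(RadarReturn(50.0, 1.5, 25.0)))   # True

The failure mode follows directly: any genuine obstacle whose return is misclassified as overhead falls through the same gate. Filters like this trade missed detections against false alarms, and both errors carry a cost at highway speed.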

The driver of the Model S, identified in media reports as 40-year-old Joshua D. Brown from Canton, Ohio, died at the scene.

The driver of the truck, 62-year-old Frank Baressi, told the Associated Press that Brown was "playing Harry Potter on the TV screen" at the time of the crash.

A spokesperson for the Florida Highway Patrol did not immediately respond to a request to confirm details about the accident.

In its June 30 statement, Tesla said drivers who engage Autopilot are warned to keep both hands on the wheel at all times. Autopilot, despite its name, is intended as an assistive feature rather than an alternative to manual control.

The incident has stoked doubts about the viability of self-driving cars and the maturity of Tesla's technology. Clearly, a computer vision system that cannot distinguish a truck from the sky in certain light conditions needs further improvement. It was unclear at press time whether Tesla would face any liability claims related to its code or sensing hardware.

However, Tesla insisted in its statement that, when Autopilot is used under human supervision, "the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving."
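
The arithmetic behind that claim is worth making explicit. Tesla's statement also noted that among all vehicles in the US there is a fatality every 94 million miles. Here is a minimal back-of-envelope sketch in Python that takes both of Tesla's figures at face value; the variable names are mine, and a sample of one fatality cannot establish statistical significance either way.

# Back-of-envelope comparison of Tesla's own figures. One fatality is
# far too small a sample to support strong statistical claims; this
# only shows the arithmetic behind the argument.

autopilot_miles = 130e6       # miles driven with Autopilot active (Tesla)
autopilot_fatalities = 1      # the Williston crash
us_miles_per_fatality = 94e6  # US average across all vehicles (Tesla's statement)

autopilot_miles_per_fatality = autopilot_miles / autopilot_fatalities
ratio = autopilot_miles_per_fatality / us_miles_per_fatality

print(f"Autopilot:  one fatality per {autopilot_miles_per_fatality / 1e6:.0f}M miles")
print(f"US average: one fatality per {us_miles_per_fatality / 1e6:.0f}M miles")
print(f"Autopilot miles per fatality vs. US average: {ratio:.2f}x")

At face value, Autopilot's 130 million miles per fatality is about 1.4 times the US average, but with a sample of one, a single additional crash would roughly halve that figure.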

(Image: Tesla)

In April, at an event in Norway, Musk said, "The probability of having an accident is 50% lower if you have Autopilot on," according to Electrek.

That may be, but data isn't the only consideration. When human lives are at stake, perception and emotion come into play. Automated driving systems will have to be demonstrably better than human drivers before people trust them with their lives.

Yet perfection is too much to expect from autopilot systems. Machines fail, and fallible people are likely to remain in the loop. In aviation, automation is common, and it has prompted concerns that it degrades the skills pilots need when manual intervention is called for. If the same holds true for cars with autopilot systems, we can expect to become worse drivers, less able to respond to emergencies, even as autopilot systems reduce fatalities overall.

There may be no getting around the fact that, given current vehicle designs, driving down a highway at high speed entails some degree of risk, whether a person or a computer is at the wheel.

Thomas Claburn has been writing about business and technology since 1996, for publications such as New Architect, PC Computing, InformationWeek, Salon, Wired, and Ziff Davis Smart Business. Before that, he worked in film and television, having earned a not particularly useful ...
Comments
vnewman2, User Rank: Ninja
7/8/2016 | 1:13:26 PM
Re: Tesla: Not Autopilot? Then Don't Call It That
I agree @Technocrati - make up a new name or just be boring and call it Driving Assist.  Everyone has a preconceived notion for the term "autopilot" and even if Tesla tries to reframe it for their own purposes, not everyone is going to "get" it.
BobbyB269, User Rank: Apprentice
7/7/2016 | 3:41:07 PM
Tesla is not a self-driving car
Tesla's "self-driving" car uses a vigilant-human approach: 

"Tesla notes that Autopilot is meant only to assist drivers, not to replace them. And its onscreen warnings and owner's manual emphasize that drivers should remain vigilant and keep their hands on or near the wheel at all times."  --nytimes.com

Google, OTOH, has monitored drivers (employees) while they were driving "vigilant-human approach automobiles" and found the drivers were often profoundly distracted and even napping. They realized that the vigilant-human approach was scary because most humans were lulled into totally trusting the car after hundreds of miles without incident. As a result, Google is approaching it from the perspective that the car must be reliably self-driving, with no steering wheel, no brake pedal, and no accelerator pedal. In other words, until their cars can drive WITHOUT ANY driver assistance, they are not good enough.

I figure that in the long run, the only approach that will survive the market and the regulators will be Google's approach to self-driving cars. Tesla will either figure this out or lose the market.
Charlie Babcock, User Rank: Author
7/7/2016 | 2:27:50 PM
Auto-pilot: before and after accident
In a June 30 blog, Tesla said auto-pilot with "lane keeping and automatic braking capabilities – among others – is a driving-assist feature and is not intended to be used as a fully autonomous vehicle technology." That's after the May 7 accident. How did Tesla present auto-pilot to customers before the accident?
Technocrati, User Rank: Ninja
7/7/2016 | 2:20:49 PM
Tesla: Not Autopilot? Then Don't Call It That

"...Autopilot, despite its name, is intended as an assistive feature rather than an alternative to manual control."

 

 

Tesla should change the naming.  People associate the term in the traditional sense and since anyone who can afford a Tesla thinks these are wonder machines.  It is easy to see how this function could be misconstrued.

Technocrati, User Rank: Ninja
7/7/2016 | 2:01:44 PM
Re: Ah, auto-pilot is not equivalent to 'self-driving car'

@Charlie  Thanks for the clarification, and to everyone along this thread who understands the issue better than I do.

If nothing else, this might make people think before they entrust their lives to a vehicle with Tesla stamped on it.

jastroff, User Rank: Ninja
7/7/2016 | 10:56:10 AM
Re: Just some thoughts
Agree. I've always thought cars, etc. were weapons -- they can kill easily. Keep control, always.
Charlie Babcock, User Rank: Author
7/6/2016 | 7:37:08 PM
Ah, auto-pilot is not equivalent to 'self-driving car'
In the July 2 New York Times, the headline referred to "A Fatality In A Self-Driving Car Forces Tesla To Confront Its Limits." The car was not a self-driving car. It was a software- and sensor-enhanced form of cruise control, with Tesla urging drivers using it not to take their hands off the wheel or their attention from the road. I would not prejudge the outcome of the investigation by pillorying the driver. But I certainly urge auto-pilot users to put some limits on total trust in auto-pilot.
TerryB, User Rank: Ninja
7/6/2016 | 1:16:13 PM
Re: Just some thoughts
We'll know this technology has made it when it is capable of keeping an impaired driver from getting a DWI. And even if the technology is sound, is society ever going to let the primary "operator" off the hook for staying below the tested level anyway?

To me (not because I'm a big drinker!), this is the killer app of a driverless car. Just replacing the driving work you do only goes toward the convenience/laziness factor human drivers have. But a car in the Midwest that can get you home from a bar birthday party without a $40 cab trip (or $200 hotel) or the risk of blowing a .085 at a checkpoint has a huge financial and safety benefit. I'd argue the technology is already better than the repeat DWI offenders who blow .24 when pulled over. I'm sure the tech already knows not to end up on the wrong side of a divided highway.
vnewman2, User Rank: Ninja
7/5/2016 | 4:37:08 PM
Re: Just some thoughts
Agree - what Tesla has implemented is like an "advanced cruise control" - a driver-assist technology much like assisted parking. But people will treat it like autopilot, as people do under a false sense of security - and unfortunate accidents like this will become more common.