Tesla Stands By Autopilot, Plans More Educational Outreach

Tesla CEO Elon Musk vows to continue offering the carmaker's Autopilot feature in the wake of a recent fatal crash of a Model S using the technology, and says he plans to step up efforts to educate customers about the system.

Dawn Kawamoto, Associate Editor, Dark Reading

July 13, 2016

(Image: AdrianHancu/iStockphoto)


Tesla Motors CEO Elon Musk says he is standing firm on the use of Tesla's self-driving Autopilot feature, despite a recent fatal crash of a Model S using the technology. He also noted that a fair number of the company's customers do not know how the system works.

Musk, in an interview with The Wall Street Journal, said the company does not plan to disable the feature. The National Highway Traffic Safety Administration is currently investigating the May 7 fatal accident, which killed a 40-year-old man while he drove his Model S with Autopilot activated. The accident marks the first known fatality to occur while Autopilot was in use.

The Autopilot software feature, launched in October 2015 as a beta, requires driver activation to kick in, but has left a fair number of Tesla owners baffled about how to use it, according to Musk.

"A lot of people don't understand what it is and how you turn it on," Musk told the Journal.  As a result, Musk said Tesla is planning to post a blog that delves into how Autopilot works and the responsibilities of the driver once this self-driving feature is turned on.

Tesla has previously addressed concerns about the way drivers behave when Autopilot is activated. Musk tackled the topic during the company's Q3 2015 earnings conference call.

"There's been some fairly crazy videos on YouTube ... this is not good," Musk said, during the conference call. "And we will be putting some additional constraints on when Autopilot can be activated to minimize the possibility of people doing crazy things with it."

Tesla advises drivers that they need to remain "engaged" while they are behind the wheel. A Tesla spokeswoman told the Journal, "Failing to periodically place hands on the steering wheel violates terms drivers agree to when enabling the feature."

In the fatal May 7 accident, Tesla found that Autopilot did not activate the emergency braking system because it failed to distinguish the truck's white trailer from the bright sky, according to the Journal.

Meanwhile, the NHTSA has its investigation underway.

"[The] NHTSA submitted its Information Request to Tesla as a standard step in its Preliminary Evaluation of the design and performance of Tesla's automated driving systems in use at the time of the May 7 crash in Florida. NHTSA has not made any determination about the presence or absence of a defect in the subject vehicles," according to an NHTSA statement provided to InformationWeek.

In its request for information about the 2015 Model S involved in the May 7 crash, the NHTSA sent a nine-page letter to Matthew Schwall, director of field performance engineering for Tesla, stating:

This letter is to inform you that the Office of Defects Investigation (ODI) of the National Highway Traffic Safety Administration (NHTSA) has opened Preliminary Evaluation PE16-007 to examine the performance of the Automatic Emergency Braking (AEB) system and any other forward crash mitigation or forward crash avoidance systems enabled and in use at the time of the fatal crash involving a model year (MY) 2015 Tesla Model S that was reported to ODI by Tesla, and to request information to assist us in our investigation.

The letter requests a laundry list of information regarding not only the Model S in the May 7 accident, but also other information supplied to or gathered by Tesla relating to Autopilot crashes, complaints, or incidents involving other Tesla cars.

Since the NHTSA began its investigation in late June, two other crashes have occurred in which the drivers say Autopilot was turned on.

[Read AI, Machine Learning Drive Autonomous Vehicle Development.]

Earlier this month, in a non-injury accident, a Model X veered off the road and hit a guardrail, destroying its passenger side and tearing off a front wheel, according to a report on electrek.co.

Tesla, according to Electrek, stated that its data logs indicated the driver did not take action after an alert was issued to take hold of the steering wheel prior to the accident. But the driver, in a CNN report, says he speaks Mandarin and his car was set to English, suggesting he may not have understood the warning.

Another accident occurred this month, this one involving a 2016 Model X. The driver claimed that Autopilot was activated when the car hit a guard rail on the Pennsylvania Turnpike and then crossed over some lanes before hitting a concrete median, according to a Detroit Free Press report. No injuries were reported.

Tesla, according to the Free Press, stated it could find no evidence that Autopilot was activated at the time of the crash.

About the Author(s)

Dawn Kawamoto

Associate Editor, Dark Reading

Dawn Kawamoto is an Associate Editor for Dark Reading, where she covers cybersecurity news and trends. She is an award-winning journalist who has written and edited technology, management, leadership, career, finance, and innovation stories for such publications as CNET's News.com, TheStreet.com, AOL's DailyFinance, and The Motley Fool. More recently, she served as associate editor for technology careers site Dice.com.

