Tesla Stands By Autopilot, Plans More Educational Outreach - InformationWeek

Tesla CEO Elon Musk vows to continue use of the carmaker's Autopilot feature in the wake of a recent fatal crash of a Model S using the technology, and says he plans to step up efforts to educate customers about the system.


Tesla Motors CEO Elon Musk says he is standing firm on the use of Tesla's self-driving Autopilot feature, despite a recent fatal crash of a Model S using the technology. He also noted that a fair number of the company's customers do not know how the system works.

Musk, in an interview with The Wall Street Journal, said the company does not plan to disable the feature. The National Highway Traffic Safety Administration (NHTSA) is currently investigating the May 7 fatal accident that killed a 40-year-old man while he drove his Model S with Autopilot activated. The accident marks the first known fatality to occur while Autopilot was in use.

The Autopilot software feature, launched in October as a beta, requires driver activation to kick in, but has a fair number of Tesla owners baffled about how to use it, according to Musk.

"A lot of people don't understand what it is and how you turn it on," Musk told the Journal. As a result, Musk said Tesla plans to publish a blog post explaining how Autopilot works and the driver's responsibilities once the feature is turned on.

Tesla has previously addressed concerns about the way drivers behave when Autopilot is activated. Musk tackled the topic during the company's Q3 2015 earnings conference call.

"There's been some fairly crazy videos on YouTube ... this is not good," Musk said, during the conference call. "And we will be putting some additional constraints on when Autopilot can be activated to minimize the possibility of people doing crazy things with it."

(Image: AdrianHancu/iStockphoto)


Tesla advises drivers to remain "engaged" while they are behind the wheel. A Tesla spokeswoman told the Journal, "Failing to periodically place hands on the steering wheel violates terms drivers agree to when enabling the feature."

In the fatal May 7 accident, Tesla, according to the Journal, found that Autopilot did not activate the emergency braking system because it could not distinguish the truck's white trailer from the bright sky behind it.
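The failure mode described above can be illustrated with a deliberately simplified sketch. This is not Tesla's actual perception code, and the brightness values and threshold are invented for illustration; it only shows how a detector that relies on contrast with the background can miss a bright object against a bright sky:

```python
# Hypothetical pixel intensities on a 0-255 scale (illustrative values only).
SKY_BRIGHTNESS = 240      # bright, washed-out sky
TRAILER_BRIGHTNESS = 235  # white side of a truck trailer

def looks_like_obstacle(region_brightness, background_brightness, min_contrast=30):
    """Flag an obstacle only if it contrasts enough with the background.

    A contrast-based check like this scores a white trailer and a bright
    sky almost identically, so nothing is flagged and emergency braking
    is never triggered.
    """
    return abs(region_brightness - background_brightness) >= min_contrast

# The trailer's contrast with the sky is only 5, well below the threshold.
print(looks_like_obstacle(TRAILER_BRIGHTNESS, SKY_BRIGHTNESS))  # False

# A dark object against the same sky contrasts strongly and is flagged.
print(looks_like_obstacle(50, SKY_BRIGHTNESS))  # True
```

Real perception stacks fuse radar, camera, and other inputs and are far more sophisticated than a single threshold, but the underlying problem is the same: when the object and the background look alike to the sensors, the system has nothing to react to.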

Meanwhile, the NHTSA has its investigation underway.

"[The] NHTSA submitted its Information Request to Tesla as a standard step in its Preliminary Evaluation of the design and performance of Tesla's automated driving systems in use at the time of the May 7 crash in Florida. NHTSA has not made any determination about the presence or absence of a defect in the subject vehicles," according to an NHTSA statement provided to InformationWeek.

In its request for information for the 2015 Model S involved in the May 7 crash, the NHTSA sent a nine-page letter to Matthew Schwall, director of field performance engineering for Tesla, stating:

This letter is to inform you that the Office of Defects Investigation (ODI) of the National Highway Traffic Safety Administration (NHTSA) has opened Preliminary Evaluation PE16-007 to examine the performance of the Automatic Emergency Braking (AEB) system and any other forward crash mitigation or forward crash avoidance systems enabled and in use at the time of the fatal crash involving a model year (MY) 2015 Tesla Model S that was reported to ODI by Tesla, and to request information to assist us in our investigation.

The letter requests a laundry list of information regarding not only the Model S in the May 7 accident, but also other information supplied to or gathered by Tesla relating to Autopilot crashes, complaints, or incidents involving other Tesla cars.

Since NHTSA began the investigation in late June, two other crashes have occurred in which the drivers say Autopilot was engaged.


Earlier this month, in a non-injury accident, a Model X veered off the road and crashed into a guard rail, destroying its passenger side and tearing off the front wheel, according to a report on electrek.co.

Tesla, according to electrek, stated that its data logs indicated the driver did not respond to an alert, issued prior to the accident, to take hold of the steering wheel. The driver, however, told CNN that he speaks Mandarin and his car's alerts were set to English.

Another accident occurred this month, this one involving a 2016 Model X. The driver claimed that Autopilot was activated when the car hit a guard rail on the Pennsylvania Turnpike and then crossed over some lanes before hitting a concrete median, according to a Detroit Free Press report. No injuries were reported.

Tesla, according to the Free Press, stated it could find no evidence that the Autopilot was activated at the time of the crash.

Dawn Kawamoto is an Associate Editor for Dark Reading, where she covers cybersecurity news and trends. She is an award-winning journalist who has written and edited technology, management, leadership, career, finance, and innovation stories for such publications as CNET's ... View Full Bio

We welcome your comments on this topic on our social media channels, or [contact us directly] with questions about the site.
Comments
Technocrati, User Rank: Ninja
7/13/2016 | 8:06:00 PM
Re: Bad Technology, not Uneducated Users
"...but in the end, the technologist has to make it right for the end user."

@jastroff I agree with your point and the point of your former boss. It is the responsibility of the software maker to explain functions in a way a non-techie will understand.

I find it odd and a bit arrogant that Mr. Musk seems to be implying Tesla drivers don't understand the feature.

Whose fault is that? Come on, Mr. Musk, this is CS 101.

Technocrati, User Rank: Ninja
7/13/2016 | 8:00:14 PM
Auto Pilot, Tesla and Fine Print

 

"...Failing to periodically place hands on the steering wheel violates terms drivers agree to when enabling the feature."

Talk about a catch-22. The user doesn't understand what the feature truly is, and then if they perish because of it, Tesla holds no liability?

I think Mr. Musk should think about allocating more resources to the company's legal defense fund. It looks like he will need it.

jastroff, User Rank: Ninja
7/13/2016 | 6:35:27 PM
Bad Technology, not Uneducated Users
When I was learning how to design consumer technologies in financial services, the mantra of our founder was that there are no bad customers, only badly designed technology -- or words to that effect. We had to make it easier and easier to use.

When I taught human factors, I would tell my students: if you arrive at a door and you don't know whether to push it or pull the handle, it's a badly designed door. You are not at fault. We had several discussions about where hinges go and what makes them a visible clue.

That's what made the GUI a triumph for Apple, while MS-DOS was still delivering ASCII interfaces until Windows 95. Another thing the first Mac did: the 3.5-inch floppy could only go into the slot one way -- the right way. Unlike the 5.25-inch floppy, which could (and did) go in at least two wrong ways.

So it is disheartening (to say the least)  for a technology leader to state:

>> The Autopilot software feature, launched in October as a beta, requires driver activation to kick in, but has a fair number of Tesla owners baffled about how to use it, according to Musk.

So he thinks they need more education.

Users are, well, not that smart, right?

You can argue degrees of difficulty, from driving a car to getting an online loan, but in the end, the technologist has to make it right for the end user.