Federal regulator finds Tesla Autopilot has ‘critical safety gap’ linked to hundreds of collisions


A Tesla Model X burns after crashing on U.S. Highway 101 in Mountain View, California, U.S. on March 23, 2018. 
S. Engleman | Via Reuters

Federal authorities say a “critical safety gap” in Tesla’s Autopilot system contributed to at least 467 collisions, 13 resulting in fatalities and “many others” resulting in serious injuries.

The findings come from a National Highway Traffic Safety Administration analysis of 956 crashes in which Tesla Autopilot was thought to have been in use. The results of the nearly three-year investigation were published Friday.

Tesla’s Autopilot design has “led to foreseeable misuse and avoidable crashes,” the NHTSA report said. The system did not “sufficiently ensure driver attention and appropriate use.”

The agency also said it was opening a new probe into the effectiveness of a software update Tesla previously issued as part of a recall in December. That update was meant to fix Autopilot defects that NHTSA identified as part of this same investigation.

The voluntary recall via an over-the-air software update covered 2 million Tesla vehicles in the U.S., and was supposed to specifically improve driver monitoring systems in Teslas equipped with Autopilot.

NHTSA suggested in its report Friday that the software update was probably inadequate, since more crashes linked to Autopilot continue to be reported.

In one recent example, a Tesla driver in Snohomish County, Washington, struck and killed a motorcyclist on April 19, according to records obtained by CNBC and NBC News. The driver told police he was using Autopilot at the time of the collision.

The NHTSA findings are the most recent in a series of regulator and watchdog reports that have questioned the safety of Tesla’s Autopilot technology, which the company has promoted as a key differentiator from other car companies.

On its website, Tesla says Autopilot is designed to reduce driver “workload” through advanced cruise control and automatic steering technology.

Tesla has not issued a response to Friday’s NHTSA report and did not respond to a request for comment sent to Tesla’s press inbox, investor relations team and to the company’s vice president of vehicle engineering, Lars Moravy.

Earlier this month, Tesla settled a lawsuit from the family of Walter Huang, an Apple engineer and father of two, who died in a crash when his Tesla Model X with Autopilot features switched on hit a highway barrier. Tesla has sought to seal from public view the terms of the settlement.

In the face of these events, Tesla and CEO Elon Musk signaled this week that they are betting the company’s future on autonomous driving.

“If somebody doesn’t believe Tesla’s going to solve autonomy, I think they should not be an investor in the company,” Musk said on Tesla’s earnings call Tuesday. He added, “We will, and we are.”

Musk has for years promised customers and shareholders that Tesla would be able to turn its existing cars into self-driving vehicles with a software update. However, the company only offers driver assistance systems and has not produced self-driving vehicles to date.

He has also made safety claims about Tesla’s driver assistance systems without allowing third-party review of the company’s data.

For example, in 2021, Musk claimed in a post on social media, “Tesla with Autopilot engaged now approaching 10 times lower chance of accident than average vehicle.”

Philip Koopman, an automotive safety researcher and Carnegie Mellon University associate professor of computer engineering, said he views Tesla’s marketing and claims as “autonowashing.” He also said in response to NHTSA’s report that he hopes Tesla will take the agency’s concerns seriously moving forward.

“People are dying due to misplaced confidence in Tesla Autopilot capabilities. Even simple steps could improve safety,” Koopman said. “Tesla could automatically restrict Autopilot use to intended roads based on map data already in the vehicle. Tesla could improve monitoring so drivers can’t routinely become absorbed in their cellphones while Autopilot is in use.”

NBC’s Robert Wile contributed to this report.
