Tesla recalls 2 million cars with ‘insufficient’ Autopilot safety controls

Tesla is recalling more than 2 million vehicles to fix Autopilot systems that U.S. safety regulators determined did not have enough controls to prevent misuse, the largest recall of Tesla’s driver-assistance software to date.

The National Highway Traffic Safety Administration said Tesla’s method of ensuring drivers are still paying attention while the driver-assistance system is activated is “insufficient.”

“There may be an increased risk of a crash,” the agency wrote, in some situations when the system is engaged “and the driver does not maintain responsibility for vehicle operation and is unprepared to intervene as necessary or fails to recognize when Autosteer is canceled or not engaged.”

The recall comes days after The Washington Post published an investigation that found Teslas in Autopilot had repeatedly been involved in deadly crashes on roads where the software was not intended to be used.

NHTSA said Tesla will send out a software update to fix the problems affecting its 2012-2023 Model S, 2016-2023 Model X, 2017-2023 Model 3, and 2020-2023 Model Y vehicles, effectively encompassing all Tesla vehicles equipped with Autopilot on U.S. roads. Autopilot is a standard feature on Tesla’s vehicles; only some early Tesla models are not equipped with the software.

“Automated technology holds great promise for improving safety but only when it is deployed responsibly; today’s action is an example of improving automated systems by prioritizing safety,” NHTSA said in a statement.

Tesla did not immediately respond to requests for comment early Wednesday.

In a statement this week responding to the Washington Post report, Tesla said it has a “moral obligation” to continue improving its safety systems, while adding that it’s “morally indefensible” to not make these features available to a wider set of consumers. The company argues that vehicles in Autopilot perform more safely than those in normal driving, citing the lower frequency of crashes when the software is enabled.

“The Tesla team looks forward to continuing our work with them towards our common goal of eliminating as many deaths and injuries as possible on our roadways,” reads the company’s post on X, the platform formerly known as Twitter.

Federal regulators with NHTSA have been investigating the software for more than two years in a probe examining more than a dozen crashes involving Teslas in Autopilot and parked emergency vehicles. The agency also started requiring in 2021 that automakers deploying driver-assistance software report crashes involving the technology to the agency.

In all, NHTSA said it reviewed 956 crashes allegedly involving Autopilot before zeroing in on 322 software-related crashes that involved “frontal impacts and impacts from potential inadvertent disengagement of the system.”

The Post story reported Tesla’s acknowledgments, based on user manuals, legal documents and statements to regulators, that the key Autopilot feature called Autosteer is “intended for use on controlled-access highways” with “a center divider, clear lane markings, and no cross traffic.” Despite that, drivers managed to activate Autopilot in locations other than those intended for the software — at times with deadly consequences.

In its recall notice, NHTSA said: “Autosteer is designed and intended for use on controlled-access highways when the feature is not operating in conjunction with the Autosteer on City Streets feature,” a more advanced version known as Full Self-Driving.

“In certain circumstances when Autosteer is engaged, the prominence and scope of the feature’s controls may not be sufficient to prevent driver misuse of the SAE Level 2 advanced driver-assistance feature,” the recall notice said.

Tesla typically addresses NHTSA software recalls through remote updates, meaning the vehicles do not have to be returned to service centers to meet the agency’s requirements. Tesla has remedied multiple software flaws with remote updates at NHTSA’s behest, including issuing a fix to Full Self-Driving software in 2021 after cars started sharply activating their brakes at highway speeds.

Tesla chief executive Elon Musk has decried NHTSA as the “fun police” and has taken issue with regulators’ terminology, posting on X that the use of the word “‘recall’ for an over-the-air software update is anachronistic and just flat wrong!”

Tesla’s policy chief Rohan Patel hailed the work of both Tesla and its regulators in a post on X.

“The regulatory system is working about as well as it can given the lack of clear regulations in this field,” he said, adding that those who had “demonized” the company and NHTSA were “on the wrong side of history.”

The investigation will remain open “to support an evaluation of the effectiveness of the remedies deployed by Tesla,” NHTSA said.

The Post report revealed at least eight fatal or serious wrecks involving Tesla Autopilot on roads where the driver-assistance software could not reliably operate. The report was based on an analysis of two federal databases, legal records and other public documents.

The recall comes after a years-long investigation into crashes while the Autopilot system was activated. According to a timeline released by NHTSA, Tesla cooperated with repeated inquiries starting in August 2021, concluding in a series of meetings in early October 2023. In those meetings, Tesla “did not concur” with the agency’s safety analysis but proposed several “over-the-air” software updates to address the issue.

When Autopilot is activated, the driver is still considered the “operator” of the vehicle. That means the person remains responsible for the vehicle’s movement, keeping hands on the steering wheel and attention on the surroundings at all times, ready to brake.

In a related safety recall report, NHTSA said the risk of collision can increase if the driver fails to “maintain continuous and sustained responsibility for the vehicle” or fails to recognize when Autopilot turns off.

The software update, which was to be deployed on “certain affected vehicles” starting Dec. 12, will add extra controls and alerts to “encourage the driver to adhere to their continuous driving responsibility,” the recall report said. The update also will include controls that prevent Autosteer from engaging outside of areas where it is supposed to work, as well as a feature that can suspend a driver’s Autosteer privileges if the person repeatedly fails to stay engaged at the wheel.

The company’s stock fell around 2.7 percent in trading Wednesday, even as broader stock market indexes were flat.


Credit
1: https://www.washingtonpost.com/technology/2023/12/13/tesla-autopilot-recall/
