Tesla self-driving crash reports prompt NHTSA investigation

Some 2.4 million Tesla vehicles will be evaluated.

October 18, 2024, 1:52 PM

The National Highway Traffic Safety Administration (NHTSA) has opened an investigation into Tesla's Full Self-Driving (FSD) feature after receiving reports of four crashes, one of which caused a pedestrian fatality.

According to NHTSA, the crashes occurred while FSD was engaged in "reduced roadway visibility" that "arose from conditions such as sun glare, fog, or airborne dust," and the system's engineering controls failed to "react appropriately" to those conditions.

"In one of the crashes, the Tesla vehicle fatally struck a pedestrian," according to the report, while "one additional crash in these conditions involved a reported injury."

NHTSA will examine an estimated 2.4 million Teslas equipped with the FSD system, including 2016-2024 Model S and Model X vehicles, 2017-2024 Model 3 vehicles, 2020-2024 Model Y vehicles, and 2023-2024 Cybertruck vehicles.

Close-up of the Tesla logo at a showroom in Santana Row, San Jose, California, August 3, 2024. (Smith Collection/Gado via Getty Images)

The NHTSA preliminary examination of the FSD system will assess its ability to "detect and respond appropriately to reduced roadway visibility conditions." The agency will also investigate "whether any other similar FSD crashes have occurred in reduced roadway visibility conditions and, if so, the contributing circumstances for those crashes," as well as whether any updates to Tesla's FSD system "may affect the performance of FSD in reduced roadway visibility conditions."

According to Tesla's website, "The currently enabled Autopilot and Full Self-Driving (Supervised) features require active driver supervision and do not make the vehicle autonomous."

Tesla did not immediately respond to an ABC News request for comment.

The NHTSA investigation follows Tesla's recall of around two million vehicles last December over issues with its Autopilot system. The company addressed the issue with a software update for the affected models.

"We at Tesla believe that we have a moral obligation to continue improving our already best-in-class safety systems," the company said at the time. "At the same time, we also believe it is morally indefensible not to make these systems available to a wider set of consumers, given the incontrovertible data that shows it is saving lives and preventing injury."
