
Tesla Autopilot Tied to 467 Crashes, Faces Safety Scrutiny

NHTSA links Tesla Autopilot to 467 crashes and 13 deaths, questioning the system's safety and effectiveness.

By Jack Wilson

4/26, 13:37 EDT
Tesla, Inc.

Key Takeaways

  • NHTSA links Tesla's Autopilot to 467 collisions, including 13 fatalities, highlighting critical safety gaps and design flaws.
  • Despite a recall covering 2 million vehicles for software updates, concerns persist over Autopilot's safety and effectiveness.
  • Tesla faces regulatory scrutiny and legal challenges over the marketing and safety claims of its driver assistance systems.

Autopilot Safety Concerns

Federal authorities have identified a "critical safety gap" in Tesla's Autopilot system, linking it to at least 467 collisions, including 13 fatalities and many serious injuries. The findings come from a National Highway Traffic Safety Administration (NHTSA) analysis of 956 crashes believed to involve Autopilot. Published on Friday, the conclusions of the nearly three-year investigation point to design flaws that led to "foreseeable misuse and avoidable crashes" because the system did too little to ensure driver attention and appropriate use.

Recall and Investigation

In response to the identified defects, Tesla initiated a voluntary recall in December covering 2 million U.S. vehicles, delivered as an over-the-air software update intended to strengthen driver monitoring. NHTSA, however, has questioned the update's effectiveness, citing continued reports of Autopilot-related crashes, and has opened a new probe to assess the recall's adequacy. The probe follows a recent incident in Snohomish County, Washington, where a Tesla driver, reportedly using Autopilot, fatally struck a motorcyclist on April 19.

Regulatory Scrutiny and Legal Challenges

Tesla's Autopilot and its more advanced Full Self-Driving system have drawn sustained scrutiny from regulators and safety watchdogs. While Tesla says the features reduce driver "workload" through advanced cruise control and automatic steering, the company has faced legal action, including a settlement with the family of Walter Huang, an Apple engineer who died in a crash while using Autopilot. Critics, including Philip Koopman, an associate professor at Carnegie Mellon University, have described Tesla's marketing and safety claims as "autonowashing," arguing that the company overstates the capabilities of its driver assistance systems.

Street Views

  • Philip Koopman, Carnegie Mellon University (Critical of Tesla's Autopilot safeguards):

    "People are dying due to misplaced confidence in Tesla Autopilot capabilities. Even simple steps could improve safety... Tesla could automatically restrict Autopilot use to intended roads based on map data already in the vehicle. Tesla could improve monitoring so drivers can’t routinely become absorbed in their cellphones while Autopilot is in use."