August 10th, 2024

Tesla and NHTSA withhold data on Autopilot crashes, report suggests

An investigation alleges that Tesla and NHTSA have concealed details of more than 200 crashes involving Autopilot, raising safety concerns and highlighting potential consumer misperceptions about the system's capabilities and limitations.


An investigation by the Wall Street Journal has revealed that Tesla and the National Highway Traffic Safety Administration (NHTSA) are allegedly withholding information on more than 200 crashes involving Tesla's Autopilot system. According to the report, both parties have classified critical details about these incidents as confidential business information. The investigation found that Autopilot has struggled to navigate obstacles, with some vehicles veering off the road or colliding with emergency vehicles. Tesla says it can share crash data only when legally required, while NHTSA cites privacy laws as its reason for not disclosing specific details. Autopilot, marketed as an advanced driver-assistance tool, does not provide full autonomy and requires drivers to remain attentive. Critics argue that Tesla's marketing may mislead consumers about the system's capabilities, and note that it relies solely on cameras rather than radar or LiDAR. The report raises concerns both about the safety of Autopilot and about the transparency of data on its real-world performance.

- Tesla and NHTSA are accused of concealing crash data related to Autopilot.

- The investigation covered over 200 incidents, revealing issues with the system's obstacle navigation.

- Tesla claims data sharing is limited to legal requirements, while NHTSA cites privacy laws.

- Autopilot is marketed as an advanced driver assistance system, not a fully autonomous solution.

- Critics highlight potential consumer misperceptions regarding Autopilot's capabilities.

Related

Tesla Autopilot leads driver onto active train tracks, mistaking it for road

A Tesla driver in Northern California drove onto active train tracks after Autopilot mistook them for a road. No injuries were reported. Police stressed the importance of understanding the technology's limitations; an investigation is ongoing. A reminder to stay vigilant while using automated features.

Tesla prioritizes Musk's and other 'VIP' drivers' data to train FSD

Tesla gives priority to data from VIPs like Elon Musk and select high-profile drivers to train its self-driving AI, raising concerns about resource distribution. An army of annotators reviews footage to improve driving behaviors, despite some discomfort among workers.

Tesla prioritizes Musk's and VIP drivers' data to train self-driving software

Tesla gives special attention to VIPs like Elon Musk and select drivers to enhance its self-driving AI, focusing on their driving data to improve Autopilot and Full Self-Driving software. Concerns arise over resource distribution and distractions from achieving true autonomy.

Tesla in Seattle-area crash that killed motorcyclist was using FSD system

A Tesla using "Full Self-Driving" was involved in a fatal crash near Seattle, leading to the driver's arrest on suspicion of vehicular homicide. The investigation into potential charges continues.

Hacked Tesla FSD computers disclose alarming raw data on deadly accidents

An investigation into data recovered from Tesla's Full Self-Driving computers linked the system's decisions to accidents. Issues include sudden veering and failure to recognize obstacles, raising concerns about safety and reliability.
