Tesla FSD crashes in fog, sun glare: Feds open new safety investigation
Federal safety investigators are examining Tesla's "full self-driving" feature after four crashes, including one fatality. The investigation will assess FSD's performance in low-visibility conditions and could carry significant financial consequences for Tesla.
Federal safety investigators have launched their 14th investigation into Tesla, this one focused on the automaker's "full self-driving" (FSD) feature. The National Highway Traffic Safety Administration (NHTSA) is examining four reported crashes involving Teslas using FSD, all of which occurred in conditions of reduced visibility such as fog, sun glare, or airborne dust. One of these incidents killed a pedestrian in Rimrock, Arizona, in November 2023. The investigation will assess FSD's ability to detect and respond to low-visibility conditions, a particular concern because Tesla relies solely on cameras rather than a sensor suite that adds radar or lidar. NHTSA will also look for additional similar crashes and evaluate any updates Tesla has made to the FSD system. The outcome could have significant financial implications for Tesla: a costly recall, or the disabling of FSD altogether, would hurt both the company's revenue and its market perception.
- NHTSA has opened its 14th investigation into Tesla, focusing on the FSD feature.
- Four crashes, including one fatality, prompted the investigation.
- The inquiry will assess FSD's performance in low-visibility conditions.
- Tesla's reliance on a camera-only system is under scrutiny.
- Potential outcomes include a recall or disabling FSD, either of which would affect Tesla's revenue.
Related
Hacked Tesla FSD computers disclose alarming raw data on deadly accidents
An investigation into Tesla's Full Self-Driving system revealed data linking its decisions to accidents. Issues include sudden veering and failure to recognize obstacles, raising concerns about safety and reliability.
Tesla Full Self Driving requires human intervention every 13 miles
An evaluation of Tesla's Full Self Driving system found it requires human intervention every 13 miles, exhibiting both advanced capabilities and dangerous behaviors, raising significant safety concerns for users.
Tesla's Full Self-Driving software under investigation by NHTSA
The NHTSA is investigating Tesla's Full Self-Driving software after crashes in low visibility, including a fatality. Tesla also faces legal challenges over driver-assistance claims and scrutiny of its Autopilot system.
US to probe Tesla's 'Full Self-Driving' system after pedestrian killed
The U.S. government is investigating Tesla's Full Self-Driving system after a fatal pedestrian incident, focusing on its performance in low visibility, which may affect future autonomous vehicle regulations.