Tesla drivers say new FSD update is repeatedly running red lights
Tesla's latest FSD software update is reportedly causing vehicles to run red lights, raising safety concerns. The company faces federal investigations and criticism for overstating its self-driving technology's capabilities.
Tesla drivers are reporting that the latest update of the company's "Full Self-Driving" (FSD) software is causing vehicles to run red lights, raising safety concerns. Users on platforms like Reddit have shared experiences where their cars attempted to proceed through red lights, prompting fears about the software's reliability. One driver noted they managed to stop the vehicle before it ran a red light, while another mentioned experiencing similar issues at lower speeds. A YouTuber also documented a situation where their Tesla tried to drive through a red light in Newark, New Jersey. These incidents come amid ongoing federal investigations into Tesla's driver assist software, which has been linked to at least 13 fatal accidents. The company has faced criticism for allegedly overstating the capabilities of its self-driving technology. As Tesla aims to transition towards robotaxis, it faces challenges related to crashes and public perception. Experts suggest that a software patch may address the red light issue, but there are calls for Tesla to reconsider its branding of the FSD software until it meets safety standards.
- Tesla's latest FSD update is reportedly causing cars to run red lights.
- Drivers express concerns about the reliability and safety of the software.
- The company is under federal investigation for its driver assist technology.
- Tesla has faced criticism for exaggerating the capabilities of its self-driving cars.
- There are calls for Tesla to change the branding of its FSD software until it is safe.
Related
Elon Musk signals reaching limit of Tesla's HW3 despite self-driving promise
Elon Musk acknowledged Tesla's Hardware 3 (HW3) self-driving computer is nearing its limits, prompting a shift to the more powerful HW4 for new releases, raising concerns about achieving full autonomy for HW3 vehicles.
Tesla in Seattle-area crash that killed motorcyclist was using FSD system
A Tesla using "Full Self Driving" was involved in a fatal crash near Seattle, leading to the driver's arrest for vehicular homicide. The investigation continues regarding potential charges.
Hacked Tesla FSD computers disclose alarming raw data on deadly accidents
An investigation into Tesla's Full Self-Driving system revealed data linking its decisions to accidents. Issues include sudden veering and failure to recognize obstacles, raising concerns about safety and reliability.
Tesla and NHTSA withhold data on Autopilot crashes, report suggests
An investigation reveals Tesla and NHTSA allegedly conceal details of over 200 crashes involving Autopilot, raising safety concerns and highlighting potential consumer misperceptions about the system's capabilities and limitations.
Tesla deletes its blog post stating all cars have self-driving hardware
Tesla removed all blog posts before 2019, including one promising Full Self-Driving hardware. Current FSD requires human oversight, leading to customer disputes over upgrades and uncertainty about future commitments.
I own a Tesla, and I chose not to buy FSD because, based on what I had read when I bought the car in 2019, it simply didn't work. Earlier this year, when Tesla gave a trial to all Tesla owners, I tried it out and was surprised at how good it was, but it still made bad mistakes, one time even curbing the rear wheel while navigating a curve to the right [0], despite it knowing exactly where the curb was.
The problem is that you still have to babysit it. And if you have to babysit a self-driving feature, then it's entirely useless. It becomes nothing more than a party trick to show friends how neat it is.
Autopilot is great. Love it, especially on road trips. But until FSD is good enough to drive my drunk ass home from a party and even park in my driveway, I'm staying away (and staying sober!).
[0] Here's the curb it hit: https://maps.app.goo.gl/BLeqSywhXHRyf7a39. With how wide that lane is, there's no reason it should have cut it so close that it hit the curb. There wasn't even anybody next to me.
https://x.com/bradsferguson/status/1828031824158683439
As it has been for literal years, and it has resulted in at least one recorded instance of the system running down a child exiting a school bus:
https://x.com/tesla2moon/status/1770599114494898310
https://www.youtube.com/watch?v=_ZiSZbWIrzA
https://www.youtube.com/watch?v=Ly6Juveo-7Y
Tesla reported it, confirming ADAS usage, as incident 13781-5100 in the NHTSA SGO database [1].
https://www.wnct.com/on-your-side/crime-tracker/tesla-driver...
[1] https://static.nhtsa.gov/odi/ffdd/sgo-2021-01/SGO-2021-01_In...
Still, it's worrisome to judge based on anecdotes. I'd really like to see some sort of overall statistics that compare Tesla FSD to other cars. Tesla claims their FSD system, overall, saves lives. Does it?
The Waymo statistics, I find pretty convincing that Waymo is safer than human-driven taxis. There are still bad Waymo anecdotes, and they should continue to improve, but overall it seems like a good thing for safety.
Tesla FSD, I just don't know.
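Roughly, the kind of comparison I'd want to see is crashes per million miles under comparable conditions. A minimal sketch of that arithmetic, using made-up placeholder numbers (not real Tesla, Waymo, or human-driver figures):

    # Illustrative only: every number below is a hypothetical placeholder,
    # not real crash or mileage data.
    fsd_crashes, fsd_miles = 50, 1_000_000_000          # hypothetical
    human_crashes, human_miles = 2_000, 30_000_000_000  # hypothetical

    # Normalize to crashes per million miles so the two fleets are comparable.
    fsd_rate = fsd_crashes / (fsd_miles / 1e6)
    human_rate = human_crashes / (human_miles / 1e6)

    print(f"FSD (hypothetical):   {fsd_rate:.3f} crashes per million miles")
    print(f"Human (hypothetical): {human_rate:.3f} crashes per million miles")

Even then, a fair comparison would have to control for road type, weather, and cases where the system disengages moments before a crash.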
Comma's 3X seems to increase driving convenience without over-promising and under-delivering on its capabilities. Should Level 2 be what manufacturers shoot for when they ship their 2024/5 models?
(Though they should certainly strive for higher levels in the future.)