Tesla Full Self Driving requires human intervention every 13 miles
An evaluation of Tesla's Full Self Driving system found that it requires human intervention every 13 miles; the system exhibited both advanced capabilities and dangerous behaviors, raising significant safety concerns for users.
An independent evaluation of Tesla's Full Self Driving (FSD) system revealed significant safety concerns, with human intervention required every 13 miles during testing. The study, conducted by AMCI Testing over 1,000 miles in Southern California, highlighted both the advanced capabilities and the dangerous behaviors of the FSD system. While FSD demonstrated impressive maneuvers, such as yielding to pedestrians and navigating blind curves, it also exhibited alarming actions, including running red lights and veering into oncoming traffic. The director of AMCI Testing, Guy Mangiamele, noted that the system's initially impressive performance could lead to dangerous complacency among drivers, who may become overly reliant on the technology. The unpredictability of FSD's behavior raises concerns about its programming and the underlying machine learning algorithms. Issues such as delayed lane changes and miscalculations during critical moments were also reported, suggesting that the system's reliability is still in question. Overall, while FSD shows potential, the findings underscore the necessity for drivers to remain vigilant and ready to intervene.
- Tesla's Full Self Driving requires human intervention approximately every 13 miles.
- The system demonstrated both advanced driving capabilities and dangerous behaviors.
- Drivers may become complacent due to FSD's initial impressive performance.
- Unpredictable behavior and programming inadequacies raise safety concerns.
- Continuous monitoring and readiness to intervene are essential for users of FSD.
Related
Hacked Tesla FSD computers disclose alarming raw data on deadly accidents
An investigation into Tesla's Full Self-Driving system revealed data linking its decisions to accidents. Issues include sudden veering and failure to recognize obstacles, raising concerns about safety and reliability.
Tesla drivers say new FSD update is repeatedly running red lights
Tesla's latest FSD software update is causing vehicles to run red lights, raising safety concerns. The company faces federal investigations and criticism for overstating its self-driving technology's capabilities.
Tesla FSD no longer offered for purchase
Tesla has updated its Full Self-Driving package to "Full Self-Driving (Supervised)," removing previous autonomy promises. Current owners face uncertainty about future features, while software updates enhance capabilities.
Tesla self-driving promises are getting weaker on new cars
Tesla has downgraded its self-driving promises, now emphasizing "supervised" driving. The price of the Full Self-Driving package has decreased, impacting used Tesla values and raising concerns about future commitments.
I drove it from Houston to Amarillo (600 miles) and had to touch the wheel only a couple times. That includes pulling me off the freeway, into the freaking parking spot next to the Supercharger, and finally through my neighborhood to the front of my house.
For the price, I don't think the MY or M3 can be beat, and they will surely be very high on the list for my family's next vehicle.
I do take issue with the claim that "its seeming infallibility in anyone's first five minutes of FSD operation breeds a sense of awe that unavoidably leads to dangerous complacency". Waymo claims this as well, citing it as the reason they never released a driver assistance feature and went for full autonomy instead. However, this is essentially speculation that has not been borne out in practice. There has been no epidemic of crashes from people abusing Autopilot or FSD. It's been on the roads for years. If "dangerous complacency" were a real problem, it would be blindingly obvious from the statistics by now. There have been just a few well-publicized cases, but statistically there is no evidence that this "dangerous complacency" problem is worse than normal driving.
I’m going to get rid of the car soon. This feature cost 7,500 euros but its resale value is essentially zero because everyone knows it’s a complete joke.
Obviously my next car won’t be from this scam company. Worst purchase I ever made.
It's not perfect and people shouldn't expect that. But I don't understand how anyone experiences FSD and isn't amazed. It's not unsafe -- if anything, my interventions are because it's being _too_ safe/timid.
Weather forecasting isn't perfect. But it's pretty good! And it's getting better! Just because weather forecasting isn't perfect doesn't mean I won't use it and it doesn't mean we should stop improving it.
What if the police officer gives a verbal command?
FSD, as Tesla markets it, is hype. They are nowhere close to a marketable solution for the use cases they advertise - Robotaxis, the ability to step out of your car and have it park somewhere else by itself, etc.
Yes, they will get it to 99% at some point - but 99% is not good enough for legal liability.
FSD is an ambitious goal and I don’t criticize Musk for pursuing it. But I will criticize him for using it as a distraction from the fact that Tesla has a stale and/or uncompetitive product line and is rapidly losing their lead to both conventional OEMs and Chinese EV makers.
Anyone trying FSD in a crowded city environment would shit their pants. Unprotected lefts are very often a mess, and interventions are legion. It is really a breath of fresh air to hear news outlets report finally about the actual state of the technology.
With that said, it works MUCH better for me than it did a couple of years ago, and I find that most of the time I disengage, it's not because the car actually needed human intervention but because it wasn't aware of the social norms at certain intersections.
For example, there's an intersection near my house that requires you to inch out and essentially floor it the first chance you get if you want any hope whatsoever of making a left-hand turn, but FSD will (understandably, I think) not do that.
The car ahead of me decides to clean their windshield and 3 droplets get on my windshield? Tesla: WIPER LUDICROUS SPEED ENGAGED!
I just drove next to a semi and got blasted with a tsunami of water in a rain-storm? Tesla: ...
If you want any evidence that the market isn't rigged, but just full of gullible idiots, Tesla is it.