September 26th, 2024

Tesla Full Self Driving requires human intervention every 13 miles

An evaluation of Tesla's Full Self Driving system found it requires human intervention every 13 miles, exhibiting both advanced capabilities and dangerous behaviors, raising significant safety concerns for users.


An independent evaluation of Tesla's Full Self Driving (FSD) system revealed significant safety concerns, with human intervention required every 13 miles during testing. The study, conducted by AMCI Testing over 1,000 miles in Southern California, highlighted both the advanced capabilities and dangerous behaviors of the FSD system. While FSD demonstrated impressive maneuvers, such as yielding to pedestrians and navigating blind curves, it also exhibited alarming actions, including running red lights and veering into oncoming traffic. The director of AMCI Testing, Guy Mangiamele, noted that the system's initial impressive performance could lead to dangerous complacency among drivers, who may become overly reliant on the technology. The unpredictability of FSD's behavior raises concerns about its programming and the underlying machine learning algorithms. Issues such as delayed lane changes and miscalculations during critical moments were also reported, suggesting that the system's reliability is still in question. Overall, while FSD shows potential, the findings underscore the necessity for drivers to remain vigilant and ready to intervene.

- Tesla's Full Self Driving requires human intervention approximately every 13 miles.

- The system demonstrated both advanced driving capabilities and dangerous behaviors.

- Drivers may become complacent due to FSD's initial impressive performance.

- Unpredictable behavior and programming inadequacies raise safety concerns.

- Continuous monitoring and readiness to intervene are essential for users of FSD.

17 comments
By @ado__dev - 7 months
I've had FSD since the very first beta and honestly even the 13 mile number is generous. Maybe on freeway only driving it's every 13 miles. On city streets, it's more like every 1-2 miles requires manual intervention unless you want to be the biggest nuisance on the road and a total jerk to everyone around you.
By @carlgreene - 7 months
I had a really negative view on FSD from just reading and seeing stuff online until I finally decided to rent a Model Y on Turo with FSD...I was absolutely blown away.

I drove it from Houston to Amarillo (600 miles) and had to touch the wheel only a couple times. That includes pulling me off the freeway, into the freaking parking spot next to the Supercharger, and finally through my neighborhood to the front of my house.

For the price I don't think the MY or M3 can be beat, and they will surely be very high on the list for my family's next vehicle.

By @modeless - 7 months
This is true, however it has been improving quickly. Releases are coming once every couple of months. There is already a newer release than the ones they tested, and each release is noticeably better. There is no indication of a ceiling yet.

I do take issue with the claim that "its seeming infallibility in anyone's first five minutes of FSD operation breeds a sense of awe that unavoidably leads to dangerous complacency". Waymo claims this as well, citing it as the reason they never released a driver assistance feature and went for full autonomy instead. However, this is essentially speculation that has not been borne out in practice. There has been no epidemic of crashes from people abusing Autopilot or FSD. It's been on the roads for years. If "dangerous complacency" were a real problem, it would be blindingly obvious from the statistics by now. There have been just a few well-publicized cases, but statistically there is no evidence that this "dangerous complacency" problem is worse than normal driving.

By @pavlov - 7 months
Like a fool I purchased the FSD feature on a new Tesla in March 2019. All this time later, it still does absolutely nothing in my country. It’s actively dangerous to use because it can’t even recognize speed limits and will happily drive at 120 km/h in a 100 km/h zone.

I’m going to get rid of the car soon. This feature cost 7,500 euros but its resale value is essentially zero because everyone knows it’s a complete joke.

Obviously my next car won’t be from this scam company. Worst purchase I ever made.

By @adamwong246 - 7 months
Why should I trust Tesla's AI with my life, much less everyone else's? They couldn't even get the CyberTruck's trim right! It's wild that we have not demanded greater governmental oversight over consumer AI products but in time it will become inevitable.
By @bdjsiqoocwk - 7 months
"requires human intervention every 13 miles" is a horrible metric, because it makes it sound like it's not so bad until you remember that the moments for intervention are unpredictable and are also when you're just about to die.
By @nkrebs13 - 7 months
FSD is getting so good so fast. The difference between 1 year ago and now is night and day. It's a godsend for road trips and it amazes me with each passing month's improvements.

It's not perfect and people shouldn't expect that. But I don't understand how anyone experiences FSD and isn't amazed. It's not unsafe -- if anything my interventions are because it's being _too_ safe/timid.

Weather forecasting isn't perfect. But it's pretty good! And it's getting better! Just because weather forecasting isn't perfect doesn't mean I won't use it and it doesn't mean we should stop improving it.

By @avalys - 7 months
How does it respond when there is a police officer directing traffic?

What if the police officer gives a verbal command?

FSD, as Tesla markets it, is hype. They are nowhere close to a marketable solution for the use cases they advertise - robotaxis, the ability to step out of your car and have it park somewhere else by itself, etc.

Yes, they will get it to 99% at some point - but 99% is not good enough for legal liability.

FSD is an ambitious goal and I don’t criticize Musk for pursuing it. But I will criticize him for using it as a distraction from the fact that Tesla has a stale and/or uncompetitive product line and is rapidly losing their lead to both conventional OEMs and Chinese EV makers.

By @jphalimi - 7 months
I am so, so tired of Tesla's claims that FSD is "multiple times safer than humans" when the data they base these claims on are basically people using FSD in a totally safe environment which made them use FSD in the first place (mostly long straight highways).

Anyone trying FSD in a crowded city environment would shit their pants. Unprotected lefts are very often a mess, and interventions are legion. It is really a breath of fresh air to hear news outlets report finally about the actual state of the technology.

By @throwaway2016a - 7 months
Like others here, I think 13 miles may be generous for city driving but pretty reasonable for highway.

With that said, it works MUCH better for me than it did a couple years ago and I find most of the time I disengage it is not because it actually needed human intervention but because it wasn't aware of the social norms at certain intersections.

For example, I have an intersection near my house that requires you to inch out and essentially floor it first chance you get if you want any hope whatsoever of taking a left hand turn, but FSD will (understandably, I think) not do that.

By @shellfishgene - 7 months
In Germany someone is suing Tesla over "phantom braking", and the judge ordered an independent court-appointed expert to check. After 600 km of driving, the car braked without reason, and the expert wrote in his report that the situation was dangerous enough that he had to stop any further testing. This is now official record and can be referred to in other lawsuits. We'll see what happens...
By @seshagiric - 7 months
I use the monthly subscription every couple of months on my Model Y, and FSD has become quite good. For me the two recurring problems are that it does not follow traffic rules when merging, and that navigating to an exit in back-to-back highway traffic simply does not work. In the rest of the cases it's pretty good, or perhaps a better way to say it: the best on the market right now.
By @bigtones - 7 months
Waymo requires interventions about that often driving in San Francisco as well from my experience over many trips. Their interventions are automatic when the car calls back to home base to make a determination as to what to do next and the operator makes a choice on how to proceed. Happens about once every half an hour travelling on Waymo in SF for me.
By @y-c-o-m-b - 7 months
FSD?! They can't even fix the damn windshield wipers!

The car ahead of me decides to clean their windshield and 3 droplets get on my windshield? Tesla: WIPER LUDICROUS SPEED ENGAGED!

I just drove next to a semi and got blasted with a tsunami of water in a rain-storm? Tesla: ...

By @oxqbldpxo - 7 months
I enjoy driving
By @MisterTea - 7 months
Lucky 13.
By @Workaccount2 - 7 months
It's incredible that Tesla is nearly a $1T corporation because it is about to announce robotaxi. Meanwhile its actual car sales are shrinking quarter by quarter. And its CEO supports the presidential candidate who wants to do away with carbon credits (~40% of Tesla's net income).

If you want any evidence that the market isn't rigged, but just full of gullible idiots, Tesla is it.