October 18th, 2024

US to probe Tesla's 'Full Self-Driving' system after pedestrian killed

The U.S. government is investigating Tesla's Full Self-Driving system after a fatal pedestrian incident, focusing on its performance in low visibility, which may affect future autonomous vehicle regulations.

The U.S. government has initiated an investigation into Tesla's "Full Self-Driving" (FSD) system following a fatal incident involving a pedestrian in low visibility conditions. The National Highway Traffic Safety Administration (NHTSA) is examining whether the FSD system contributed to the accident, which raises concerns about the safety and reliability of autonomous driving technologies. This inquiry is part of a broader scrutiny of Tesla's self-driving features, which have faced criticism and regulatory challenges in the past. The investigation aims to assess the system's performance in adverse weather conditions and its ability to detect and respond to pedestrians effectively. Tesla has previously stated that its FSD system is designed to improve safety, but incidents like this highlight the ongoing debate over the readiness of autonomous vehicles for public roads. The outcome of this investigation could have significant implications for Tesla and the future of self-driving technology in the automotive industry.

- The U.S. is investigating Tesla's Full Self-Driving system after a pedestrian was killed.

- The National Highway Traffic Safety Administration is leading the inquiry.

- The investigation focuses on the system's performance in low visibility conditions.

- This incident adds to ongoing scrutiny of Tesla's autonomous driving features.

- The outcome may impact the future of self-driving technology and regulations.

68 comments
By @rootusrootus - 6 months
I'm on my second free FSD trial, which just started for me today. I gave it another shot, and it seems largely similar to the last free trial they gave. Fun party trick, surprisingly good, right up until it's not. A hallmark of AI everywhere is how great it is and just how abruptly and catastrophically it occasionally fails.

Please, if you're going to try it, keep both hands on the wheel and your foot ready for the brake. When it goes off the rails, it usually does so in surprising ways with little warning and little time to correct. And since it's so good much of the time, you can get lulled into complacence.

I never really understand the comments from people who think it's the greatest thing ever and makes their drive less stressful. Does the opposite for me. Entertaining but exhausting to supervise.

By @TheAlchemist - 6 months
Tesla released a promotional video in 2016 saying that with FSD a human driver is not necessary and that "The person in the driver's seat is only there for legal reasons". The video was staged, as we learned in 2022.

2016, folks... Even with today's FSD, which is several orders of magnitude better than the one in the video, you would still probably have a serious accident within a week (and I'm being generous here) if you didn't sit in the driver's seat.

How Trevor Milton got sentenced for fraud while the people responsible for this were not is a mystery to me.

By @bastawhiz - 6 months
Lots of people are asking how good the self driving has to be before we tolerate it. I got a one month free trial of FSD and turned it off after two weeks. Quite simply: it's dangerous.

- It failed with a cryptic system error while driving

- It started making a left turn far too early that would have scraped the left side of the car on a sign. I had to manually intervene.

- In my opinion, the default setting accelerates way too aggressively. I'd call myself a fairly aggressive driver and it is too aggressive for my taste.

- It tried to make way too many right turns on red when it wasn't safe to. It would creep into the road, almost into the path of oncoming vehicles.

- It didn't merge left to make room for vehicles merging onto the highway. The vehicles then tried to cut in. The system should have avoided an unsafe situation like this in the first place.

- It would switch lanes to go faster on the highway, but then missed an exit on at least one occasion because it couldn't make it back into the right lane in time. Stupid.

After the system error, I lost all trust in FSD from Tesla. Until I ride in one and feel safe, I can't have any faith that this is a reasonable system. Hell, even autopilot does dumb shit on a regular basis. I'm grateful to be getting a car from another manufacturer this year.

By @massysett - 6 months
"Tesla says on its website its FSD software in on-road vehicles requires active driver supervision and does not make vehicles autonomous."

Despite it being called "Full Self-Driving."

Tesla should be sued out of existence.

By @Fomite - 6 months
"Driver is mostly disengaged, but then must intervene in a sudden fail state" is also one of the most dangerous types of automation due to how long it takes the driver to reach full control as well.
By @deergomoo - 6 months
This is an opinion almost certainly based more in emotion than logic, but I don't think I could trust any sort of fully autonomous driving system that didn't involve communication with transmitters along the road itself (like a glideslope and localiser for aircraft approaches) and with other cars on the road.

Motorway driving sure, there it's closer to fancy cruise control. But around town, no thank you. I regularly drive through some really crappily designed bits of road, like unlabelled approaches to multi-lane roundabouts where the lane you need to be in for a particular exit sorta just depends on what the people in front and to the side of you happen to have chosen. If it's difficult as a human to work out what the intent is, I don't trust a largely computer vision-based system to work it out.

The roads here are also in a terrible state, and the lines on them even more so. There's one particular patch of road where the lane keep assist in my car regularly tries to steer me into the central reservation, because repair work has left what looks a bit like lane markings running diagonally across the lane.

By @AlchemistCamp - 6 months
The interesting question is how good self-driving has to be before people tolerate it.

It's clear that having half the casualty rate per distance traveled of the median human driver isn't acceptable. How about a quarter? Or a tenth? Accidents caused by human drivers are one of the largest causes of injury and death, but they're not newsworthy the way an accident involving automated driving is. It's all too easy to see a potential future where many people die needlessly because technology that could save lives is regulated into a greatly reduced role.

By @alexjplant - 6 months
> The collision happened because the sun was in the Tesla driver's eyes, so the Tesla driver was not charged, said Raul Garcia, public information officer for the department.

Am I missing something or is this the gross miscarriage of justice that it sounds like? The driver could afford a $40k vehicle but not $20 polarized shades from Amazon? Negligence is negligence.

By @UltraSane - 6 months
I'm astonished at how long Musk has been able to keep his autonomous driving con going. He has been lying about it to inflate Tesla shares for 10 years now.
By @daghamm - 6 months
While at it, please also investigate why it is sometimes impossible to leave a damaged vehicle. This has resulted in people dying more than once:

https://apnews.com/article/car-crash-tesla-france-fire-be8ec...

By @InsomniacL - 6 months
As I came over the top of a crest, there was suddenly a lot of sun glare and my Model Y violently swerved to the left. Fortunately I had just overtaken a car on a two-lane dual carriageway and hadn't moved back to the left-hand lane yet.

The driver I had just overtaken, although he wasn't very close anymore, slowed right down to get away from me, and I didn't blame him.

That manoeuvre in another car likely would have put it on two wheels.

They say FSD crashes less often than a human per mile driven, but I can only use FSD on roads like motorways, so I don't think it's a fair comparison.

I don't trust FSD, I still use it occasionally but never in less than ideal conditions. Typically when doing something like changing the music on a motorway.

It probably is safer than just me driving alone when it's in good conditions on a straight road with light traffic and an alert driver.
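
To make the "not a fair comparison" point above concrete, here is a minimal sketch with entirely made-up rates (not Tesla or NHTSA figures): if FSD miles are concentrated on motorways, which are much safer per mile than city streets, FSD can post a better overall crash rate even while being worse than a human on every individual road type.

```python
# Toy illustration of the road-mix effect described above.
# All rates are made up for illustration; they are not Tesla or NHTSA figures.

# Hypothetical crashes per million miles, by road type
human_rate = {"motorway": 0.5, "city": 3.0}
fsd_rate   = {"motorway": 0.6, "city": 4.0}   # worse than humans on both road types

# Humans drive a mix of roads; FSD (in this scenario) is used mostly on motorways
human_mix = {"motorway": 0.4, "city": 0.6}
fsd_mix   = {"motorway": 0.9, "city": 0.1}

def overall(rate, mix):
    """Mileage-weighted crashes per million miles for a given road mix."""
    return sum(rate[road] * share for road, share in mix.items())

print(f"Human overall: {overall(human_rate, human_mix):.2f} crashes per million miles")  # 2.00
print(f"FSD overall:   {overall(fsd_rate, fsd_mix):.2f} crashes per million miles")      # 0.94
# FSD looks better overall despite being worse on every road type, purely because
# its miles are concentrated on the safer road type.
```

Any aggregate "safer than a human" claim has to control for where and when the system is actually engaged.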

By @rKarpinski - 6 months
'Pedestrian' in this context seems pretty misleading

"Two vehicles collided on the freeway, blocking the left lane. A Toyota 4Runner stopped, and two people got out to help with traffic control. A red Tesla Model Y then hit the 4Runner and one of the people who exited from it. "

edit: Parent article was changed... I was referring to the title of the NPR article.

By @testfrequency - 6 months
I was in a Model 3 Uber yesterday and my driver had to swerve onto and up a curb to avoid an idiot who was trying to turn into traffic going in the other direction.

The Model 3 had every opportunity in the world to brake and it didn’t, we were probably only going 25mph. I know this is about FSD here, but that moment 100% made me realize Tesla has awful obstacle avoidance.

I just happened to be looking forward, and it was a very plain and clear T-bone avoidance, yet at no point did the car handle it or trigger anything.

Thankfully everyone was ok, but the front lip got pretty beat up from driving up the curb. Of course the driver at fault that caused the whole incident drove off.

By @gitaarik - 6 months
It concerns me that these Teslas can suddenly start acting differently after a software update. Seems like a great target for a cyber attack. Or just a failure by the company: a little bug accidentally spread to millions of cars all over the world.

And how is this regulated? Say the software gets to a point that we deem it safe for full self driving, then it gets approved on the road, and then Tesla adds a new fancy feature to their software and rolls out an update. How are we to be confident that it's safe?

By @botanical - 6 months
Only the US government can allow corporations to beta test unproven technology on the public.

Governments should carry out comprehensive tests of a self-driving car's claimed capabilities, the same way cars without proven passenger safety (Euro NCAP) aren't allowed on roads carrying passengers.

By @dietsche - 6 months
I would like more details. There are definitely situations where neither a car nor a human could respond quickly enough to a situation on the road.

For example, I recently hit a deer. The dashcam shows that, driving at 60 mph, I had less than 100 feet between the deer becoming visible (due to terrain) and impact. Keep in mind that stopping a car in 100 feet at 60 mph is impossible. Most vehicles need more than triple that, without accounting for human reaction time.
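
For what it's worth, a back-of-envelope stopping-distance calculation supports this. The sketch below uses textbook assumptions (roughly 0.8 g of braking and a 1.5 s perception-reaction time), not measurements from the incident:

```python
# Rough stopping distance at 60 mph using typical textbook values.
# Assumptions: ~0.8 g peak braking deceleration, ~1.5 s perception-reaction time.

MPH_TO_FTPS = 5280 / 3600   # 1 mph = 1.4667 ft/s
G = 32.2                    # gravitational acceleration, ft/s^2

def stopping_distance_ft(speed_mph, decel_g=0.8, reaction_s=1.5):
    v = speed_mph * MPH_TO_FTPS
    reaction = v * reaction_s              # distance covered before braking begins
    braking = v ** 2 / (2 * decel_g * G)   # from v^2 = 2 * a * d
    return reaction, braking

reaction, braking = stopping_distance_ft(60)
print(f"reaction: {reaction:.0f} ft, braking: {braking:.0f} ft, total: {reaction + braking:.0f} ft")
# -> roughly 132 ft reacting + 150 ft braking, or about 280 ft in total,
#    far more than the 100 ft of visibility described above.
```

The exact multiplier depends on the assumed deceleration and reaction time, but the conclusion stands: 100 feet at 60 mph is not enough distance to stop, for a human or for a computer.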

By @jqpabc123 - 6 months
By now, most people have probably heard that Tesla's attempt at "Full Self Driving" is really anything but --- after a decade of promises. The vehicle owners manual spells this out.

As I understand it, the contentious issue is the fact that unlike most others, their attempt works mostly from visual feedback.

In low visibility situations, their FSD has limited feedback and is essentially driving blind.

It appears that Musk may be seeking a political solution to this technical problem.

By @drodio - 6 months
I drive a 2024 Tesla Model Y and another person in my family drives a 2021 Model Y. Both cars are substantially similar (the 2021 actually has more sensors than the 2024, which is strictly cameras-only).

Both cars are running 12.5 -- and I agree that it's dramatically improved over 12.3.

I really enjoy driving. I've got a #vanlife Sprinter that I'll do 14-hour roadtrips in with my kids. For me, the Tesla's self-driving capability is a "nice to have" -- it sometimes drives like a 16-year-old who just got their license, especially around braking. Somehow it's really hard to nail the soft brake at a stop sign, which seems like it should be easy. I find that passengers in the car are most uncomfortable when the car brakes like this -- and I'm the most embarrassed, because they all look at me like I completely forgot how to do a smooth stop at a stop sign.

Other times, the Tesla's self-driving is magical and nearly flawless -- especially on long highway road trips, like up to Tahoe. Even someone like me who loves doing road trips really appreciates the ability to relax and not have to be driving.

But here's one observation I've had that I don't see quite sufficiently represented in the comments:

The other person in my family with the 2021 Model Y does not like to drive like I do, and they really appreciate that the Tesla is a better driver than they feel themselves to be. And as a passenger in their car, I also really appreciate that when the Tesla is driving, I generally feel much more comfortable in the car. Not always, but often.

There's so much variance in us as humans around driving skills and enjoyment. It's easy to lump us together and say "the car isn't as good as the human." And I know there's conflicting data from Tesla and NHTSA about whether in aggregate, Teslas are safer than human drivers or not.

But what I definitely know from my experience is that the Tesla is already a better driver than many humans are -- especially those that don't enjoy driving. And as @modeless points out, the rate of improvement is now vastly accelerating.

By @23B1 - 6 months
"Move fast and kill people"

Look, I don't know who needs to hear this, but just stop supporting this asshole's companies. You don't need internet when you're camping, you don't need a robot to do your laundry, you don't need twitter, you can find more profitable and reliable places to invest.

By @Aeolun - 6 months
I love how the caption on the image in the article says the car tells you to pay attention to the road, but I had to zoom in all the way to figure out where that message actually was.

I’d expect something big and red with a warning triangle or something, but it’s a tiny white message in the center of the screen.

By @frabjoused - 6 months
I don't understand why this debate/probing is not just data driven. Driving is all big data.

https://www.tesla.com/VehicleSafetyReport

This report does not include fatalities, which seems to be the key point in question. Unless the report has some bias or is false, Teslas on Autopilot appear 10 times safer than the US average.

Is there public data on deaths reported by Tesla?

And otherwise, if the stats say it is safer, why is there any debate at all?

By @xvector - 6 months
My Tesla routinely tries to kill me on absolutely normal California roads in normal sunny conditions, especially when there are cars parked on the side of the road (it often brakes thinking I'm about to crash into them, or even swerves into them thinking that's the "real" lane).

Elon's Unsupervised FSD dreams are a good bit off. I do hope they happen though.

By @graeme - 6 months
Will the review assess overall mortality of the vehicles compared to similar cars, and overall mortality while FSD is in use?
By @aanet - 6 months
About damn time NHTSA opened this full scale investigation. Tesla's "autonowashing" has gone on for far too long.

Per Reuters [1] "The probe covers 2016-2024 Model S and X vehicles with the optional system as well as 2017-2024 Model 3, 2020-2024 Model Y, and 2023-2024 Cybertruck vehicles. The preliminary evaluation is the first step before the agency could seek to demand a recall of the vehicles if it believes they pose an unreasonable risk to safety."

Roughly 2.4 million Teslas with "Full Self Driving" software are in question, after 4 reported collisions and one fatality.

NHTSA is reviewing the ability of FSD’s engineering controls to "detect and respond appropriately to reduced roadway visibility conditions."

Tesla has, of course, rather two-facedly classified its FSD as SAE Level 2 for regulatory purposes, while selling it as "full self driving" that nonetheless requires supervision. ¯\_(ツ)_/¯

No other company has been so irresponsible to its users, and without a care for any negative externalities imposed on non-consenting road users.

I treat every Tesla driver as a drunk driver, steering away whenever I see them on highways.

[FWIW, yes, I work in automated driving and know a thing or two about automotive safety.]

[1] https://archive.is/20241018151106/https://www.reuters.com/bu...

By @siliconc0w - 6 months
Traffic jams and long monotonous roads are really where these features shine; getting to Level 3 on those should be the focus, rather than trying to maintain the fiction of Level 5 everywhere. (And as other comments note, anything above Level 2 should automatically mean manufacturer liability.)
By @kjkjadksj - 6 months
One thing that's a little weird about the constant Tesla framing of FSD being better than the average driver is the assumption that a Tesla owner is an average driver. The "average" driver includes people who total their cars, who kill pedestrians, who drive drunk, who go 40 over. Meanwhile, I've never been in an accident. For me, and probably for many other drivers, my own individual average performance is much better than the average of all drivers. Given that, it's entirely possible that relying on FSD is much worse for you than not, in terms of rate of risk.
By @metabagel - 6 months
Cruise control with automatic following distance and lane-keeping are such game changers that autonomous driving isn't necessary for able drivers.

OK, the lane-keeping isn’t quite there, but I feel like that’s solvable.

By @mcintyre1994 - 6 months
Something I find weird about riding in a Tesla is that they have a mode that a bunch of Uber drivers seem to use where it shows a sort of diagram on the screen of what the car perceives to be its surroundings. This seems to be mostly bad - cars jump in and out, things appear from nowhere right next to the car, things randomly disappear. If that's produced using the same inputs the car uses for self driving then I'm not surprised it has all these issues.
By @wg0 - 6 months
In all the hype about AI, if you think about it, the foundational problem is that even computer vision is not a solved problem at human levels of accuracy, and that's at the heart of the issue with both Tesla and the Amazon checkout.

Otherwise, as a thought experiment, imagine a tiny one-inch-tall person glued to the grocery trolley and another sitting on each shelf - just those two alone are all you need for "automated checkout".

By @JumpinJack_Cash - 6 months
Unpopular take: Even with perfect FSD which is much better than the average human driver (say having the robotic equivalent of a Lewis Hamilton in every car) the productivity and health gains won't be as great as people anticipate.

Sure, far fewer traffic deaths, but the spike in depression, especially among males, would be something very big. Life events are largely outside of our control; having a 5000 lb thing that can reach 150 mph if needed and responds exactly to accelerator, brake, and steering wheel input... well, that makes people feel in control and very powerful while behind the aforementioned steering wheel.

Also productivity... I don't know... people think a whole lot and do a whole lot of self-reflection while they are driving, and when they arrive at their destination they just implement the thoughts they had while driving. The ability to talk on the phone has been there for quite some time now too, so thinking and communicating can already be done while driving; what would FSD add?

By @lowbloodsugar - 6 months
I'm not turning FSD on until it is a genuine autonomous vehicle that requires no input from me and never disengages. Until Tesla is, under the law, the legal driver of the vehicle, and suffers all the legal impact, you'd have to be mental to let it drive for you. It's like asking, "Hey, here's a chauffeur who has killed several people so far, all over the world. You want him to drive?" Or "Hey, here's a chauffeur. You're fine, you can read a book. But at some point, right when something super dangerous is about to happen, he's going to just panic and stop driving, and then you have to stop whatever you're doing and take over." That's fucking mental.
By @DoesntMatter22 - 6 months
Each version has improved. FSD is realistically the hardest thing humanity has ever tried to do. It involves an enormous amount of manpower, compute power, and human discoveries, and it has to work right in billions of scenarios.

Building a self flying plane is comically easy by comparison. Building Starship is easier by comparison.

By @gnuser - 6 months
I worked at an 18-wheeler automation unicorn.

Never rode in one once for a reason.

By @ivewonyoung - 6 months
> NHTSA said it was opening the inquiry after four reports of crashes where FSD was engaged during reduced roadway visibility like sun glare, fog, or airborne dust. A pedestrian was killed in Rimrock, Arizona, in November 2023 after being struck by a 2021 Tesla Model Y, NHTSA said. Another crash under investigation involved a reported injury

> The probe covers 2016-2024 Model S and X vehicles with the optional system as well as 2017-2024 Model 3, 2020-2024 Model Y, and 2023-2024 Cybertruck vehicles.

This is good, but also for context: 45 thousand people are killed in auto accidents in just the US every year, making 4 reported crashes and 1 reported fatality across 2.4 million vehicles over 8 years look minuscule by comparison, or even better than many human drivers.
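
A rough per-vehicle-year comparison makes the scale difference explicit. This is only a back-of-envelope sketch: it uses the figures quoted above plus an assumed US fleet of roughly 280 million registered vehicles, and it ignores that FSD is engaged for only a fraction of each Tesla's miles and that only crashes reported to NHTSA are counted:

```python
# Back-of-envelope fatality rates per million vehicle-years.
# The US fleet size (~280 million registered vehicles) is an assumption,
# not a figure from the article.

us_deaths_per_year = 45_000
us_vehicles_millions = 280

probe_fatalities = 1           # fatality cited in the NHTSA probe
probe_vehicles_millions = 2.4  # vehicles covered by the probe
probe_years = 8                # rough span of the covered model years

us_rate = us_deaths_per_year / us_vehicles_millions
probe_rate = probe_fatalities / (probe_vehicles_millions * probe_years)

print(f"US overall:        {us_rate:.0f} deaths per million vehicle-years")     # ~161
print(f"FSD probe figures: {probe_rate:.3f} deaths per million vehicle-years")  # ~0.05
# The two rates are not directly comparable: the probe counts only FSD-engaged
# crashes reported to NHTSA, and FSD is active for a small share of those
# vehicles' total driving.
```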

By @TeslaCoils - 6 months
Works most of the time, Fails at the worst time - Supervision absolutely necessary...
By @dzhiurgis - 6 months
What is the FSD uptake rate? I bet it's less than 1%, since in most countries it's not even available…
By @lrvick - 6 months
All these self driving car companies are competing to see whose proprietary firmware and sensors kill the fewest people. This is insane.

I will -never- own a self driving car unless the firmware is open source, reproducible, remotely attestable, and built/audited by several security research firms and any interested security researchers from the public before all new updates ship.

It is the only way to keep greedy execs from cutting corners to pad profit margins, like VW did when faking emissions tests.

Proprietary safety tech is evil and must be made illegal. Compete with nicer-looking, more comfortable cars with better miles-to-charge, not people's lives.

By @amelius - 6 months
By @nemo44x - 6 months
I'm a Tesla fan, but I have to say, anecdotally, that Teslas seem to represent an outsized share of the bad drivers I observe. Is it the FSD that's a bit too aggressive and sporadic? Lots of lane changing, etc.?

They’re up there with Dodge Ram drivers.

By @Teknomancer - 6 months
This is just an opinion. The only way forward with automated and autonomous vehicles is through industry cooperation and standardization. The Tesla approach to the problem is inadequate, lacking means for interoperability, and relying on inferior detection mechanisms. Somebody who solves these problems and does it by offering interoperability and standards applied to all automakers wins.

Sold Tesla investments. The company is on an unprofitable downward spiral. The CEO is a total clown. Reinvested, on advice, in Daimler, after Mercedes-Benz and Daimler Trucks North America demonstrated their research and work toward creating true autonomous technology and safe global industry standards.

By @JTbane - 6 months
How can you possibly have a reliable self-driving car without LIDAR?
By @roydivision - 6 months
I'm still baffled as to how this is allowed anywhere near public roads. The only reason I can think of is that the US values trade above risk of killing people.
By @quitit - 6 months
"Full Self-Driving" but it's not "full" self-driving, as it requires active supervision.

So it's marketed with a nod and wink, as if the supervision requirement is just a peel away disclaimer to satisfy old and stuffy laws that are out of step with the latest technology. When in reality it really does need active supervision.

But the nature of the technology is that this approach invites driver distraction, because what's the use of "full self driving" if one needs to keep their hands on the wheel and feet near the pedals, ready to take control at a moment's notice? Worsening this problem, Teslas have shown themselves to drive erratically at unexpected times, such as phantom braking or mistaking natural phenomena for traffic lights.

One day people will look back on letting FSD exist in the market and roll their eyes in disbelief of the recklessness.

By @Animats - 6 months
If Trump is elected, this probe will be stopped.
By @FergusArgyll - 6 months
Are the insurance prices different if you own a Tesla with FSD? if not, why not?
By @kvgr - 6 months
I don't understand how it is possible to be used on public roads...
By @knob - 6 months
Didn't Uber have something similar happen? Ran over a woman in Phoenix?
By @whiplash451 - 6 months
Asking genuinely: is FSD enabled/accessible in EU?
By @jgalt212 - 6 months
The SEC is clearly afraid of Musk. I wonder what the intimidation factor is at NHTSA.
By @bastloing - 6 months
It was way safer to ride a horse and buggy
By @Rebuff5007 - 6 months
Tesla testing and developing FSD with ordinary consumer drivers frankly seems criminal. Test drivers for AV companies get advanced driver training, need to file detailed reports about the car's response to various driving scenarios, and generally are paid to be as attentive as possible. The fact that any old tech bro or unassuming old lady can buy this thing and be on their phone when the car could potentially turn into oncoming traffic is mind boggling.
By @masto - 6 months
I have such a love-hate relationship with this thing. I don't think Tesla's approach will ever be truly autonomous, and they do a lot of things to push it into unsafe territory (thanks to you know who at the helm). I am a tech enthusiast and part of the reason I bought this car (before you know who revealed himself to be you know what) is that they were the furthest ahead and I wanted to experience it. If they had continued on the path I'd hoped, they'd have put in more sensors, not taken them out for cost-cutting and then tried to gaslight people about it. And all this hype about turning your car into a robotaxi while you're not using it is just stupid.

On the other hand, I'd hate for the result of all this to be to throw the ADAS out with the bathwater. The first thing I noticed even with the early "autopilot" is that it made long road trips much more bearable. I would arrive at my destination without feeling exhausted, and I attribute a lot of that to not having to spend hours actively making micro adjustments to speed and steering. I know everyone thinks they're a better driver than they are, and it's those other people who can't be trusted, but I do feel that when I have autopilot/FSD engaged, I am paying attention, less fatigued, and actually have more cognitive capacity freed up to watch for dangerous situations.

I had to pick someone up at LaGuardia Airport yesterday, a long annoying drive in heavy NYC-area traffic. I engaged autosteer for most of the trip both ways (and disengaged it when I didn't feel it was appropriate), and it made it much more bearable.

I'm neither fanboying nor apologizing for Tesla's despicable behavior. But I would be sad if, in the process of regulating this tech, it got pushed back too far.

By @fortran77 - 6 months
I have FSD in my Plaid. I don't use it. Too scary.
By @sanp - 6 months
This will go away once Trump wins
By @Yeul - 6 months
Now we know why Musk wants Trump to win: to completely subjugate the state to the whims of its billionaire class. Going back to the 19th century's Gilded Age.

https://www.bbc.com/news/articles/cg78ljxn8g7o

By @yieldcrv - 6 months
Come on US, regulate interstate commerce and tell them to delete these cameras

Lidar is goated, and if Tesla didn't want that, they could pursue a different perception solution, allowing for innovation.

But just visual cameras aiming to replicate us? Ban that.

By @lopkeny12ko - 6 months
> NHTSA said it was opening the inquiry after four reports of crashes where FSD was engaged during reduced roadway visibility like sun glare, fog, or airborne dust. A pedestrian was killed in Rimrock, Arizona, in November 2023 after being struck by a 2021 Tesla Model Y, NHTSA said.

This is going to be another extremely biased investigation.

1. A 2021 Model Y is not on HW4.

2. FSD in November 2023 is not FSD 12.5, the current version. Any assessment of FSD on such outdated software is not going to be representative of the current experience.

By @soerxpso - 6 months
For whatever it's worth, Teslas with Autopilot enabled crash about once every 4.5M miles driven, whereas the overall rate in the US is roughly one crash every 70K miles driven. Of course, the selection effects around that stat can be debated (people probably enable autopilot in situations that are safer than average, the average tesla owner might be driving more carefully or in safer areas than the average driver, etc), but it is a pretty significant difference. (Those numbers are what I could find at a glance; DYOR if you'd like more rigor).
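
Taking the two quoted figures at face value, the implied ratio is straightforward to compute (a sketch using only the numbers above, with the same selection caveats already mentioned):

```python
# Ratio implied by the figures quoted above (not independently verified).
miles_per_crash_autopilot = 4_500_000
miles_per_crash_us_overall = 70_000

ratio = miles_per_crash_autopilot / miles_per_crash_us_overall
print(f"Autopilot logs roughly {ratio:.0f}x more miles between crashes")  # ~64x
# As noted above, Autopilot is mostly engaged on highways in good conditions,
# so this is not a like-for-like comparison of driving ability.
```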

We have a lot of traffic fatalities in the US (in some states, an entire order of magnitude worse than in some EU countries), but it's generally not considered an issue. Nobody asks, "These agents are crashing a lot; are they really competent to drive?" when the agent is human, but when the agent is digital it becomes a popular question even with a much lower crash rate.