US to probe Tesla's 'Full Self-Driving' system after pedestrian killed
The U.S. government is investigating Tesla's Full Self-Driving system after a fatal pedestrian incident, focusing on its performance in low visibility, which may affect future autonomous vehicle regulations.
The U.S. government has initiated an investigation into Tesla's "Full Self-Driving" (FSD) system following a fatal incident involving a pedestrian in low visibility conditions. The National Highway Traffic Safety Administration (NHTSA) is examining whether the FSD system contributed to the accident, which raises concerns about the safety and reliability of autonomous driving technologies. This inquiry is part of a broader scrutiny of Tesla's self-driving features, which have faced criticism and regulatory challenges in the past. The investigation aims to assess the system's performance in adverse weather conditions and its ability to detect and respond to pedestrians effectively. Tesla has previously stated that its FSD system is designed to improve safety, but incidents like this highlight the ongoing debate over the readiness of autonomous vehicles for public roads. The outcome of this investigation could have significant implications for Tesla and the future of self-driving technology in the automotive industry.
- The U.S. is investigating Tesla's Full Self-Driving system after a pedestrian was killed.
- The National Highway Traffic Safety Administration is leading the inquiry.
- The investigation focuses on the system's performance in low visibility conditions.
- This incident adds to ongoing scrutiny of Tesla's autonomous driving features.
- The outcome may impact the future of self-driving technology and regulations.
Related
Tesla in Seattle-area crash that killed motorcyclist was using FSD system
A Tesla using "Full Self Driving" was involved in a fatal crash near Seattle, leading to the driver's arrest for vehicular homicide. The investigation continues regarding potential charges.
Hacked Tesla FSD computers disclose alarming raw data on deadly accidents
An investigation into Tesla's Full Self-Driving system revealed data linking its decisions to accidents. Issues include sudden veering and failure to recognize obstacles, raising concerns about safety and reliability.
Tesla and NHTSA withhold data on Autopilot crashes, report suggests
An investigation reveals Tesla and NHTSA allegedly conceal details of over 200 crashes involving Autopilot, raising safety concerns and highlighting potential consumer misperceptions about the system's capabilities and limitations.
Tesla Full Self Driving requires human intervention every 13 miles
An evaluation of Tesla's Full Self Driving system found it requires human intervention every 13 miles, exhibiting both advanced capabilities and dangerous behaviors, raising significant safety concerns for users.
Tesla's Full Self-Driving software under investigation by NHTSA
The NHTSA is investigating Tesla's Full Self-Driving software after crashes in low visibility, including a fatality. Tesla also faces legal challenges over driver-assistance claims and scrutiny of its Autopilot system.
Please, if you're going to try it, keep both hands on the wheel and your foot ready for the brake. When it goes off the rails, it usually does so in surprising ways with little warning and little time to correct. And since it's so good much of the time, you can get lulled into complacence.
I never really understand the comments from people who think it's the greatest thing ever and makes their drive less stressful. Does the opposite for me. Entertaining but exhausting to supervise.
2016, folks... Even with today's FSD, which is several orders of magnitude better than the one in the video, you would still probably have a serious accident within a week (and I'm being generous here) if you didn't sit in the driver's seat.
How Trevor Milton got sentenced for fraud and the people responsible for this were not is a mystery to me.
- It failed with a cryptic system error while driving
- It started making a left turn far too early that would have scraped the left side of the car on a sign. I had to manually intervene.
- In my opinion, the default setting accelerates way too aggressively. I'd call myself a fairly aggressive driver and it is too aggressive for my taste.
- It tried to make way too many right turns on red when it wasn't safe to. It would creep into the road, almost into the path of oncoming vehicles.
- It didn't merge left to make room for vehicles merging onto the highway. The vehicles then tried to cut in. The system should have avoided an unsafe situation like this in the first place.
- It would switch lanes to go faster on the highway, but then missed an exit on at least one occasion because it couldn't make it back into the right lane in time. Stupid.
After the system error, I lost all trust in FSD from Tesla. Until I ride in one and feel safe, I can't have any faith that this is a reasonable system. Hell, even autopilot does dumb shit on a regular basis. I'm grateful to be getting a car from another manufacturer this year.
Despite it being called "Full Self-Driving."
Tesla should be sued out of existence.
Motorway driving, sure; there it's closer to fancy cruise control. But around town, no thank you. I regularly drive through some really crappily designed bits of road, like unlabelled approaches to multi-lane roundabouts where the lane you need to be in for a particular exit sorta just depends on what the people in front and to the side of you happen to have chosen. If it's difficult as a human to work out what the intent is, I don't trust a largely computer-vision-based system to work it out.
The roads here are also in a terrible state, and the lines on them even more so. There's one particular patch of road where the lane keep assist in my car regularly tries to steer me into the central reservation, because repair work has left what looks a bit like lane markings diagonally across the lane.
It's clear that having half the casualty rate per distance traveled of the median human driver isn't acceptable. How about a quarter? Or a tenth? Accidents caused by human drivers are one of the largest causes of injury and death, but they're not newsworthy the way an accident involving automated driving is. It's all too easy to see a potential future where many people die needlessly because technology that could save lives is regulated into a greatly reduced role.
Am I missing something or is this the gross miscarriage of justice that it sounds like? The driver could afford a $40k vehicle but not $20 polarized shades from Amazon? Negligence is negligence.
https://apnews.com/article/car-crash-tesla-france-fire-be8ec...
The driver I had just overtaken, although he wasn't very close anymore, slowed right down to get away from me, and I didn't blame him.
That manoeuvre in another car likely would have put it on two wheels.
They say FSD crashes less often than a human per mile driven, but I can only use FSD on roads like motorways, so I don't think it's a fair comparison.
I don't trust FSD, I still use it occasionally but never in less than ideal conditions. Typically when doing something like changing the music on a motorway.
It probably is safer than just me driving alone, when it's in good conditions on a straight road with light traffic and an alert driver.
"Two vehicles collided on the freeway, blocking the left lane. A Toyota 4Runner stopped, and two people got out to help with traffic control. A red Tesla Model Y then hit the 4Runner and one of the people who exited from it. "
edit: Parent article was changed... I was referring to the title of the NPR article.
The Model 3 had every opportunity in the world to brake and it didn't; we were probably only going 25 mph. I know this is about FSD here, but that moment 100% made me realize Tesla has awful obstacle avoidance.
I just happened to be looking forward, and it was a very plain and clear T-bone avoidance situation, yet at no point did the car react or trigger anything.
Thankfully everyone was ok, but the front lip got pretty beat up from driving up the curb. Of course the driver at fault that caused the whole incident drove off.
And how is this regulated? Say the software gets to a point that we deem it safe for full self driving, then it gets approved on the road, and then Tesla adds a new fancy feature to their software and rolls out an update. How are we to be confident that it's safe?
Governments should carry out comprehensive tests of a self-driving car's claimed capabilities, just as cars without proven passenger safety (Euro NCAP) aren't allowed on roads carrying passengers.
For example, I recently hit a deer. The dashcam shows that, from when the deer became visible due to terrain to the point of impact, I had less than 100 feet while driving at 60 mph. Keep in mind that stopping a car in 100 feet from 60 mph is essentially impossible: braking alone typically takes around 150 feet, and total stopping distance is roughly triple 100 feet once human reaction time is accounted for.
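A quick back-of-the-envelope check of those numbers, as a minimal Python sketch; the 0.8 g braking deceleration and 1.5 s reaction time are assumed typical values, not figures from the comment:

```python
# Rough stopping-distance estimate from 60 mph (assumed values, illustrative only)
V_MPH = 60.0
V_FPS = V_MPH * 5280 / 3600        # 60 mph = 88 ft/s
G = 32.2                           # gravitational acceleration, ft/s^2
DECEL = 0.8 * G                    # assumed hard braking at ~0.8 g
REACTION_S = 1.5                   # assumed driver reaction time, seconds

braking_ft = V_FPS ** 2 / (2 * DECEL)   # v^2 / (2a), about 150 ft
reaction_ft = V_FPS * REACTION_S        # distance covered before braking, about 132 ft
total_ft = braking_ft + reaction_ft     # about 280 ft, roughly triple the 100 ft available

print(f"braking only: {braking_ft:.0f} ft, with reaction time: {total_ft:.0f} ft")
```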
As I understand it, the contentious issue is the fact that unlike most others, their attempt works mostly from visual feedback.
In low visibility situations, their FSD has limited feedback and is essentially driving blind.
It appears that Musk may be seeking a political solution to this technical problem.
Both cars are running 12.5 -- and I agree that it's dramatically improved over 12.3.
I really enjoy driving. I've got a #vanlife Sprinter that I'll do 14-hour road trips in with my kids. For me, the Tesla's self-driving capability is a "nice to have" -- it sometimes drives like a 16-year-old who just got their license (especially around braking; somehow it's really hard to nail the "soft brake at a stop sign," which seems like it should be easy. I find that passengers in the car are most uncomfortable when the car brakes like this -- and I'm the most embarrassed, because they all look at me like I completely forgot how to do a smooth stop at a stop sign).
Other times, the Tesla's self-driving is magical and nearly flawless -- especially on long highway road trips, like up to Tahoe. Even someone like me who loves doing road trips really appreciates the ability to relax and not have to be driving.
But here's one observation I've had that I don't see quite sufficiently represented in the comments:
The other person in my family with the 2021 Model Y does not like to drive like I do, and they really appreciate that the Tesla is a better driver than they feel themselves to be. And as a passenger in their car, I also really appreciate that when the Tesla is driving, I generally feel much more comfortable in the car. Not always, but often.
There's so much variance in us as humans around driving skills and enjoyment. It's easy to lump us together and say "the car isn't as good as the human." And I know there's conflicting data from Tesla and NHTSA about whether in aggregate, Teslas are safer than human drivers or not.
But what I definitely know from my experience is that the Tesla is already a better driver than many humans are -- especially those that don't enjoy driving. And as @modeless points out, the rate of improvement is now vastly accelerating.
Look, I don't know who needs to hear this, but just stop supporting this asshole's companies. You don't need internet when you're camping, you don't need a robot to do your laundry, you don't need twitter, you can find more profitable and reliable places to invest.
I’d expect something big and red with a warning triangle or something, but it’s a tiny white message in the center of the screen.
https://www.tesla.com/VehicleSafetyReport
This report does not include fatalities, which seems to be the key point in question. Unless the above report has some bias or is false, Teslas on Autopilot appear 10 times safer than the US average.
Is there public data on deaths reported by Tesla?
And otherwise, if the stats say it is safer, why is there any debate at all?
Elon's Unsupervised FSD dreams are a good bit off. I do hope they happen though.
Per Reuters [1] "The probe covers 2016-2024 Model S and X vehicles with the optional system as well as 2017-2024 Model 3, 2020-2024 Model Y, and 2023-2024 Cybertruck vehicles. The preliminary evaluation is the first step before the agency could seek to demand a recall of the vehicles if it believes they pose an unreasonable risk to safety."
Roughly 2.4 million Teslas with "Full Self Driving" software are in question, after four reported collisions and one fatality.
NHTSA is reviewing the ability of FSD’s engineering controls to "detect and respond appropriately to reduced roadway visibility conditions."
Tesla has, of course, rather two-facedly classified its FSD as SAE Level 2 for regulatory purposes, while selling it as "full self driving" that nonetheless requires supervision. ¯\_(ツ)_/¯
No other company has been so irresponsible to its users, and without a care for any negative externalities imposed on non-consenting road users.
I treat every Tesla driver as a drunk driver, steering away whenever I see them on highways.
[FWIW, yes, I work in automated driving and know a thing or two about automotive safety.]
[1] https://archive.is/20241018151106/https://www.reuters.com/bu...
OK, the lane-keeping isn’t quite there, but I feel like that’s solvable.
Otherwise, as a thought experiment, imagine just a tiny one-inch-tall person glued to the grocery trolley and another sitting on each shelf - just these two alone are all you need for "automated checkout".
Sure, way fewer traffic deaths, but the spike in depression, especially among males, would be something very big. Life events are largely outside of our control; having a 5,000 lb thing that can get to 150 mph if needed and responds exactly to accelerator, brake, and steering wheel input... well, that makes people feel in control and very powerful while behind the aforementioned steering wheel.
Also productivity... I don't know. People think a whole lot and do a whole lot of self-reflection while they are driving, and when they arrive at their destination they just implement the thoughts they had while driving. The ability to talk on the phone has been there for quite some time now too, so thinking and communicating can be done while driving already; what would FSD add?
Building a self-flying plane is comically easy by comparison. Building Starship is easier by comparison.
Never once ridden in one, for a reason.
> The probe covers 2016-2024 Model S and X vehicles with the optional system as well as 2017-2024 Model 3, 2020-2024 Model Y, and 2023-2024 Cybertruck vehicles.
This is good, but also for context: 45 thousand people are killed in auto accidents in just the US every year, making 4 reported crashes and 1 reported fatality for 2.4 million vehicles over 8 years look minuscule by comparison, or even better than many human drivers.
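As a rough illustration of that comparison, here's a minimal Python sketch; the ~280 million registered US vehicles figure is an assumption, and the probe numbers count only the incidents NHTSA cited, not every FSD incident:

```python
# Rough per-vehicle fatality-rate comparison (illustrative only)
US_FATALITIES_PER_YEAR = 45_000        # figure cited in the comment above
US_REGISTERED_VEHICLES = 280_000_000   # assumed ballpark, not from the article
PROBE_VEHICLES = 2_400_000             # Teslas covered by the NHTSA probe
PROBE_YEARS = 8                        # 2016-2024 model years
PROBE_FATALITIES = 1                   # only the incidents cited in the probe

us_rate = US_FATALITIES_PER_YEAR / US_REGISTERED_VEHICLES
probe_rate = PROBE_FATALITIES / (PROBE_VEHICLES * PROBE_YEARS)

print(f"US fleet:        {us_rate:.1e} fatalities per vehicle-year")
print(f"Cited incidents: {probe_rate:.1e} fatalities per vehicle-year")
```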
I will -never- own a self-driving car unless the firmware is open source, reproducible, remotely attestable, and built/audited by several security research firms and any interested security researchers from the public before all new updates ship.
It is the only way to keep greedy execs from cutting corners to pad profit margins, like VW did by faking emissions tests.
Proprietary safety tech is evil and must be made illegal. Compete with nicer-looking, more comfortable cars with better miles-to-charge, not people's lives.
https://en.wikipedia.org/wiki/List_of_Tesla_Autopilot_crashe...
They’re up there with Dodge Ram drivers.
Sold Tesla investments. The company is on an unprofitable downward spiral. The CEO is a total clown. Reinvested, on advice, in Daimler, after Mercedes-Benz and Daimler Trucks North America demonstrated their research and work on creating true autonomous technology and safe global industry standardization.
So it's marketed with a nod and a wink, as if the supervision requirement is just a peel-away disclaimer to satisfy old and stuffy laws that are out of step with the latest technology, when in reality it really does need active supervision.
But the nature of the technology is that this approach invites driver distraction, because what's the use of "full self driving" if one needs to have their hands on the wheel and feet near the pedals, ready to take control at a moment's notice? Worsening this problem, Teslas have shown themselves to drive erratically at unexpected times, such as phantom braking or misidentifying natural phenomena as traffic lights.
One day people will look back on letting FSD exist in the market and roll their eyes in disbelief of the recklessness.
On the other hand, I'd hate for the result of all this to be to throw the ADAS out with the bathwater. The first thing I noticed even with the early "autopilot" is that it made long road trips much more bearable. I would arrive at my destination without feeling exhausted, and I attribute a lot of that to not having to spend hours actively making micro adjustments to speed and steering. I know everyone thinks they're a better driver than they are, and it's those other people who can't be trusted, but I do feel that when I have autopilot/FSD engaged, I am paying attention, less fatigued, and actually have more cognitive capacity freed up to watch for dangerous situations.
I had to pick someone up at LaGuardia Airport yesterday, a long annoying drive in heavy NYC-area traffic. I engaged autosteer for most of the trip both ways (and disengaged it when I didn't feel it was appropriate), and it made it much more bearable.
I'm neither fanboying nor apologizing for Tesla's despicable behavior. But I would be sad if, in the process of regulating this tech, it got pushed back too far.
Lidar is goated, and if Tesla didn't want that, they could pursue a different perception solution, allowing for innovation.
But just visual cameras aiming to replicate us? Ban that.
This is going to be another extremely biased investigation.
1. A 2021 Model Y is not on HW4.
2. FSD in November 2023 is not FSD 12.5, the current version. Any assessment of FSD on such outdated software is not going to be representative of the current experience.
We have a lot of traffic fatalities in the US (in some states, an entire order of magnitude worse than in some EU countries), but it's generally not considered an issue. Nobody asks, "These agents are crashing a lot; are they really competent to drive?" when the agent is human, but when the agent is digital it becomes a popular question even with a much lower crash rate.