It is important to note that this is with safety drivers. A professional driver plus their most advanced "Robotaxi" FSD build, under test with careful scrutiny, is 4x worse than the average non-professional driver alone, averaging 57,000 miles per minor collision.
Yet it is quite odd that Tesla also reports untrained customers, using old versions of FSD on outdated hardware, averaging 1,500,000 miles per minor collision [1], roughly a 26x difference, when there are no penalties for incorrect reporting.
[1] https://www.tesla.com/fsd/safety
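For what it's worth, the arithmetic on those two figures (both numbers as quoted above; this is just a sanity check, not new data):

```python
# Rough sanity check of the miles-per-collision figures quoted above.
robotaxi_miles_per_collision = 57_000          # supervised Robotaxi, per SGO filings
customer_fsd_miles_per_collision = 1_500_000   # Tesla's self-reported FSD figure

ratio = customer_fsd_miles_per_collision / robotaxi_miles_per_collision
print(f"Tesla's self-reported FSD figure is {ratio:.1f}x the Robotaxi figure")

# "4x worse than the average human" implies a human baseline of roughly:
human_baseline = robotaxi_miles_per_collision * 4
print(f"implied human baseline: ~{human_baseline:,} miles per minor collision")
```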
The comparison to human crash rates needs more context. These low-speed incidents (1-4 mph backing into a fixed object) rarely get reported in human driver statistics because they usually do not involve police reports or injuries. The NHTSA SGO database counts all ADS incidents regardless of severity, while human driver baselines come from reported incidents.
That said, the redaction issue is the real story. Waymo publishes detailed narratives. Zoox publishes detailed narratives. Tesla marks everything confidential. When every other company is transparent and one is not, that tells you something about what they are finding in the data. You cannot independently assess fault or system failure, which makes any comparison meaningless.
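The reporting-threshold mismatch described above can be sketched with a toy filter. The speeds come from the article's crash list; the 5 mph cutoff is purely an illustrative stand-in for a "would a human driver ever report this?" threshold, not any official definition:

```python
# Toy illustration: the same five incidents counted under two definitions.
incidents_mph = [17, 0, 4, 2, 1]  # fixed object, hit-while-stopped, truck, two backing events

sgo_count = len(incidents_mph)                            # SGO-style: every contact counts
threshold_count = sum(1 for v in incidents_mph if v > 5)  # stricter, human-report-style cutoff

print(sgo_count, threshold_count)  # → 5 1
```

Same events, same fleet, and the headline rate differs 5x depending purely on the counting rule.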
WarmWash
The problem Tesla faces, which their investors are unaware of, is that just because a Model Y has driven you around for thousands of miles without incident does not mean Tesla has autonomous driving solved.
Tesla needs their FSD system to be driving hundreds of thousands of miles without incident. Not the 5,000 miles Michael FSD-is-awesome-I-use-it-daily Smith posts incessantly on X about.
There is this mismatch where overly represented people who champion FSD say it's great and has no issues, and the reality is none of them are remotely close to putting in enough miles to cross the "it's safe to deploy" threshold.
A fleet of robotaxis will do more FSD miles in an afternoon than your average Tesla fanatic will do in a decade. I can promise you that Elon was sweating hard during each of the few unsupervised rides they have offered.
lateforwork
Tesla's Robotaxis are bringing a bad name to the entire field of autonomous driving. The average consumer isn't going to make a distinction between Tesla vs. Waymo. When they hear about these Robotaxi crashes, they will assume all robotic driving is crash prone, dangerous and irresponsible.
Traster
As I said on earlier reports about this, it's difficult to draw statistical comparisons with humans because there's so little data. Having said that, it is clear that this system just isn't ready, and it's kind of wild that a couple of those crashes would've been easily preventable with parking sensors that come as standard equipment on almost every other car.
In some spaces we still have the rule of law - when xAI started doing the deepfake nude thing, we kind of knew no one in the US would do anything, but jurisdictions like the EU would. And they are now. It's happening slowly, but it is happening. Here, though, I just don't know if there's any institution in the US that is going to look at this for what it is - an unsafe system not ready for the road - and take action.
jackp96
I'm not an Elon fan at all, and I'm highly skeptical of Tesla's robotaxi efforts in general, but the context here is that only one of these seems like a true crash?
I'm curious how crashes are reported for humans, because it sounds like 3 of the 5 examples listed happened at like 1-4 mph, and the fourth probably wasn't Tesla's fault (it was stationary at the time). The most damning one was a collision with a fixed object at a whopping 17 mph.
Tesla sucks, but this feels like clickbait.
vessenes
Interesting crash list. A bunch of low speed crashes, one bus hit the Tesla while the Tesla was stationary, and one 17mph into static object (ouch).
For those complaining about Tesla's redactions - fair and good. That said, Tesla formed its media strategy at a time when gas-car companies and short sellers bought ENTIRE MEDIA ORGs just to trash them and back their shorts. Their hopefulness about a fair showing in the media died with Clarkson and co. faking dead batteries in a Roadster test -- so yes, they're paranoid, but they also spent years with everyone out to get them.
pnw
Electrek is a highly biased source, the editor has a grudge against Elon and Tesla. It's really unfortunate since it used to be one of the best EV sites.
maxdo
Electrek, as always.
```
The incidents included a collision with a fixed object at 17 miles per hour, a crash with a bus while the Tesla vehicle was stopped, a crash with a truck at four miles per hour, and two cases where Tesla vehicles backed into fixed objects at low speeds.
```
So in reality it's one crash with a fixed object; the rest are... questionable, and not "crashes" as you portray them. Such incidents would not even show up in human crash statistics, since they get filed under non-driving incidents, parking-lot bumps, etc.
whimsicalism
I'm no Tesla lover, but I doubt that backing into an object at 4 mph is something that would get reported in a human context, so I'm not sure the '4x' number is really comparative rather than sensationalized.
nelsonic
Did anyone actually read the article before commenting? The crashes were all minor. No injuries. If anything, this shows Tesla making an effort to report everything. A 2 mph bump isn't a "crash"; it's barely anything. The 17 mph collision may have caused some minor damage to the "fixed object", but that's not clear from the article.
iknowstuff
Such slop. First, they take NHTSA SGO "crashes" which explicitly includes basically any physical impact with property damage e.g. 1–2 mph “backed into a pole/tree”.
Then they compare that numerator to Tesla’s own “minor collision” benchmark — which is not police-reported fender benders; it’s a telemetry-triggered “collision event” keyed to airbag deployment or delta-V ≥ 8 km/h. Different definitions. Completely bogus ratio.
Any comparison to police-reported crashes is hilariously stupid for obvious reasons.
On top of that, the denominator is hand-waved ("~800k paid miles extrapolated"), which is extra sketchy because SGO crashes can happen during non-paid repositioning/parking while "paid miles" excludes those segments. And we’re talking 14 events in one geofenced, early rollout in Austin so your confidence interval is doing backflips. If you want a real claim vs humans, do matched Austin exposure, same reportable-crash criteria, severity stratification, and show uncertainty bands.
But what you get instead is clickbait so stop falling for this shit please HN.
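The "confidence interval doing backflips" point can be made concrete. Taking the comment's own numbers (14 events over an assumed ~800k miles, the mileage itself being an extrapolation), a crude normal-approximation band on the event rate looks like this; an exact Poisson interval would be wider still at this sample size:

```python
import math

# Crude uncertainty band for 14 events over ~800k miles,
# using the normal approximation to the Poisson: k +/- 1.96*sqrt(k).
events = 14
miles = 800_000

lo = (events - 1.96 * math.sqrt(events)) / miles
hi = (events + 1.96 * math.sqrt(events)) / miles

print(f"point estimate: {events / miles * 1e6:.1f} events per million miles")
print(f"~95% band: {lo * 1e6:.1f} to {hi * 1e6:.1f} per million miles")
```

The upper bound is roughly 3x the lower bound, so any "Nx worse than humans" headline built on this sample is mostly noise.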
fabian2k
It's impressive how bad they are at hiring safety drivers. This is not even measuring how good the Robotaxi itself is; right now it's only measuring how good Tesla is at running this kind of test. This does not inspire any confidence.
Though maybe the safety drivers are good enough for the major stuff, and the software is just bad enough at low-speed, short-distance collisions that the drivers don't notice as easily that the car is doing something wrong before it happens.
ProfessorZoom
Is there anywhere online to read the incident reports? For Waymo in CA, for example, there's a government page where you can read them. I read 9 of them and none were Waymo's fault, so I'm wondering how many of these crashes are similar (i.e., stopped at a red light and someone rear-ends them).
legitster
Also keep in mind that all of the training, data, and advanced image processing has only ever been trained on cities with basically perfect weather conditions for driving (maybe with the exception of fog in San Francisco).
We are still a long, long, long way off from someone feeling comfortable jumping into an FSD cab on a rainy night in New York.
leesec
Funny to see the comments here vs the thread the other day where a Waymo hit a child.
There's no real discussion to be had on any of this. Just people coming in to confirm their biases.
As for me, I'm happy to make and take bets on Tesla beating Waymo. I've heard all these arguments a million times. Bet some money
guywithahat
This is something Electrek does regularly and isn't unique to this article, but I don't like how they suggest the Tesla crash reports are doing something shady by following the reporting guidelines. Tesla is reporting things by the book, and when Electrek doesn't like how the laws are laid out, they blame Tesla. Electrek wants Tesla to publish separate press notes, and since they don't, they take their frustration out on the integrity of the article, which is worse for everyone.
ggm
Given how minor these are, you'd think they'd get ahead of the conspiracy theories with full disclosure.
smileson2
ill stick to the bus
nova22033
He's going to fix this by having Grok redefine "widespread".
https://www.cnbc.com/2026/01/22/musk-tesla-robotaxis-us-expa...
Tesla CEO Elon Musk said at the World Economic Forum in Davos that the company's robotaxis will be "widespread" in the U.S. by the end of 2026.
jeffbee
Their service is way worse than you think, in every way. The actual unsupervised Robotaxi service doesn't cover a geofenced area of Austin, like Waymo does. It traverses a fixed route along South Congress Avenue, like a damned bus.
Grimblewald
I spew elon hate every chance I get and I maintain I am being too kind on him.
lbrito
Now imagine if all those billions in taxes had been used to build real transit infrastructure instead of subsidizing Tesla.
yieldcrv
Waymo is licensing out their "Driver" software to cars that fit the specification
if Tesla drops the ego, they could obtain Waymo's software, and its track record, on future Tesla hardware
chinathrow
Well, how about time to take them off the roads then?
pengaru
It's a fusion of jazz and funk!
hermitcrab
"Tesla remains the only ADS operator to systematically hide crash details from the public through NHTSA’s confidentiality provisions."
Given the way Musk has lied and lied about Tesla's autonomous driving capabilities, that can't be much of a surprise to anyone.
ModernMech
Honestly I thought everyone was clear how this was going to go after the initial decapitation from 2016, but it seems like everyone's gonna allow these science experiments to keep causing damage until someone actually regulates them with teeth.
anonym29
This data seems very incomplete and potentially misleading.
>The new crashes include [...] a crash with a bus while the Tesla was stationary
Doesn't this imply that the bus driver hit the stationary Tesla, which would make the human bus driver at fault and the party responsible for causing the accident? Why should a human driver hitting a Tesla be counted against Tesla's safety record?
It's possible that the Tesla was stopped somewhere it shouldn't have been, like in the middle of an intersection (as all the Waymos were during the SF power outage), but Electrek isn't sharing details about each of these incidents.
>The new crashes include [...] a collision with a heavy truck at 4 mph
The chart shows only that the Tesla was driving straight at 4mph when this happened, not whether the Tesla hit the truck or the truck hit the Tesla.
Again, it's entirely possible that the Tesla hit the truck, but why aren't these details being shared? This seems like important data to consider when evaluating the safety of autonomous systems - whether the autonomous system or human error was to blame for the accident.
I appreciate that Electrek at least gives a mention of this dynamic:
>Tesla fans and shareholders hold on to the thought that the company’s robotaxis are not responsible for some of these crashes, which is true, even though that’s much harder to determine with Tesla redacting the crash narrative on all crashes, but the problem is that even Tesla’s own benchmark shows humans have fewer crashes.
Aren't these crash details / "crash narratives" a matter of public record and investigation, e.g. by NHTSA or local law enforcement? If not, shouldn't they be? Why should we, as a society, rely on the automaker as the sole source of information about what caused accidents with experimental new driverless vehicles? That seems like a poor public policy choice.
outside1234
Just imagine how bad it is going to be when they take the human driver out of the car.
No idea how these things are being allowed on the road. Oh wait, yes I do. $$$$
LightBug1
Move fast and hospitalise people.
arein3
A minor fender-bender is not a crash.
"4x worse than humans" is misleading; I bet it's better than humans, by a good margin.
small_model
The source is a well-known anti-Tesla, anti-Musk site; the owner has a psychotic hatred of Tesla and Elon after being a balanced clickbait site for years. Ignore.
ArchieScrivener
Good, who cares. Autonomous driving is an absolute waste of time. We need autodrone transport for civilian traffic. The skies have been waiting.
In before, 'but it is a regulation nightmare...'