this post was submitted on 02 Apr 2025
1123 points (98.8% liked)

Technology

TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5

Brevity is the soul of wit, and I am just not that witty. This is a long article; here is the gist of it:

  • The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
  • This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
  • The crashes are overwhelmingly Teslas rear-ending motorcyclists.

Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.

Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.

[–] jonne@infosec.pub 37 points 2 days ago* (last edited 2 days ago) (1 children)

Yep, that one was purely about hitting a certain KPI of 'miles driven on autopilot without incident'. If it turns off before the accident, technically the driver was in control and to blame, so it won't show up in the stats and probably also won't be investigated by the NTSB.

[–] NeoNachtwaechter@lemmy.world 14 points 2 days ago (2 children)

so it won't show up in the stats

Hopefully they've wised up by now and record these stats properly...?

[–] KayLeadfoot@fedia.io 23 points 2 days ago (2 children)

NHTSA collects data if self-driving tech was active within 30 seconds of the impact.
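That 30-second window closes the turn-it-off-before-impact loophole from the comment above. A minimal sketch of that attribution logic (hypothetical function and values, not NHTSA's actual tooling):

```python
# Hedged sketch: per the reporting rule described above, a crash counts if the
# self-driving system was engaged at any point within 30 seconds before impact,
# so switching off just before the crash does not keep it out of the data.
from typing import Optional

REPORTING_WINDOW_S = 30.0

def is_reportable(disengage_time_s: Optional[float], impact_time_s: float) -> bool:
    """True if the system was active within the window before impact.

    disengage_time_s -- when self-driving switched off (None = still engaged
    at impact); impact_time_s -- time of the crash, on the same clock.
    """
    if disengage_time_s is None:
        return True  # still engaged at the moment of impact
    return impact_time_s - disengage_time_s <= REPORTING_WINDOW_S

# Autopilot shutting off 2 seconds before impact still counts:
print(is_reportable(disengage_time_s=98.0, impact_time_s=100.0))  # True
# Disengaged a full minute earlier: outside the window.
print(is_reportable(disengage_time_s=40.0, impact_time_s=100.0))  # False
```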

The companies themselves do all sorts of wildcat shit with their numbers. Tesla's claimed safety factor right now is 8x human: driving with FSD is supposedly 8x safer than the average human driver, which is what they say on their stock earnings calls. Of course, that's not supported by any data I've seen, and they haven't published data that makes it externally verifiable (unlike Waymo, which has excellent academic articles and insurance papers written about its 12x-safer-than-human system).

[–] NotMyOldRedditName@lemmy.world 2 points 1 day ago* (last edited 1 day ago)

So to drive with FSD is 8x safer than your average human driver.

WITH a supervising human.

Once it reaches a certain quality, it should be safer when a human is properly supervising it, because if the car tries to do something really stupid, the human takes over. The vast, vast majority of crashes come from inattentive drivers, which is obviously a problem, and they need to keep improving the attentiveness monitoring. But supervised FSD should be safer than a human alone, because it can also catch things the human would ultimately miss.

Now, if you take the human entirely out of the equation, I very much doubt that FSD is safer than a human in its current state.

[–] b3an@lemmy.world 2 points 1 day ago (1 children)

Fascinating! I didn't know all this. Thanks!

[–] jonne@infosec.pub 8 points 2 days ago

If they ever fixed it, I'm sure Musk fired whoever was keeping score. He's going to launch the robotaxi stuff soon, and it's going to kill a bunch of people.