this post was submitted on 02 Apr 2025
1123 points (98.8% liked)


TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5

Brevity is the soul of wit, and I am just not that witty. This is a long article; here is the gist of it:

  • The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
  • This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
  • The crashes are overwhelmingly Teslas rear-ending motorcyclists.

Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.

Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.

[–] KayLeadfoot@fedia.io 86 points 2 days ago (2 children)

Accurate.

Each fatality I found where a Tesla killed a motorcyclist is a cascade of three failures.

  1. The car's cameras don't detect the biker, or it just doesn't stop for some reason.
  2. The driver isn't paying attention to detect the system failure.
  3. The Tesla's driver alertness tech fails to detect that the driver isn't paying attention.

Taking out the driver will make this already-unacceptably-lethal system even more lethal.

[–] jonne@infosec.pub 66 points 2 days ago (3 children)
  4. Self-driving turns itself off seconds before a crash, giving the driver an impossibly short timespan to rectify the situation.
[–] KayLeadfoot@fedia.io 64 points 2 days ago (1 children)

... Also accurate.

God, it really is a nut punch. The system detects that the crash is imminent.

Rather than automatically trying to evade... the self-driving tech turns off. I assume it's to reduce liability or make the stats look better. God.

[–] jonne@infosec.pub 37 points 2 days ago* (last edited 2 days ago) (1 children)

Yep, that one was purely about hitting a certain KPI of 'miles driven on autopilot without incident'. If it turns off before the accident, technically the driver was in control and to blame, so it won't show up in the stats and probably also won't be investigated by the NTSB.

[–] NeoNachtwaechter@lemmy.world 14 points 2 days ago (2 children)

> so it won't show up in the stats

Hopefully they've wised up by now and record these stats properly...?

[–] KayLeadfoot@fedia.io 23 points 2 days ago (2 children)

NHTSA collects data if self-driving tech was active within 30 seconds of the impact.

The companies themselves do all sorts of wildcat shit with their numbers. Tesla's claimed safety factor right now is 8x human: driving with FSD is supposedly 8x safer than your average human driver, and that's what they say on their stock earnings calls. Of course, that's not true based on any data I've seen, and they haven't published data that makes it externally verifiable (unlike Waymo, which has excellent academic articles and insurance papers written about its 12x-safer-than-human system).
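
A minimal sketch of the two counting rules being contrasted in this thread, in hypothetical Python (the field names are made up for illustration; this is not any real NHTSA or Tesla schema):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Crash:
    # Seconds between self-driving disengagement and impact;
    # None means the system was still engaged at the moment of impact.
    seconds_from_disengage_to_impact: Optional[float]


def reportable_to_nhtsa(crash: Crash) -> bool:
    """The NHTSA rule described above: the crash counts if the system
    was active within 30 seconds of impact, even if it shut off first."""
    s = crash.seconds_from_disengage_to_impact
    return s is None or s <= 30


def counts_against_kpi(crash: Crash) -> bool:
    """A 'miles on Autopilot without incident' style KPI: only crashes
    with the system still engaged at impact would count against it."""
    return crash.seconds_from_disengage_to_impact is None


# Self-driving hands back control two seconds before impact:
crash = Crash(seconds_from_disengage_to_impact=2)
print(reportable_to_nhtsa(crash))   # True  - still a reportable ADAS crash
print(counts_against_kpi(crash))    # False - doesn't ding the marketing number
```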

[–] NotMyOldRedditName@lemmy.world 2 points 1 day ago* (last edited 1 day ago)

> So to drive with FSD is 8x safer than your average human driver.

WITH a supervising human.

Once it reaches a certain quality, it should be safer with a human properly supervising it, because if the car tries to do something really stupid, the human takes over. The vast, vast majority of crashes come from inattentive drivers, which is obviously a problem, and they need to keep improving the attentiveness monitoring. But supervised FSD should still be safer than a human alone, because the system can also catch things the human would ultimately miss.

Now, if you take the human entirely out of the equation, I very much doubt that FSD is safer than a human in its current state.

[–] b3an@lemmy.world 2 points 1 day ago (1 children)

Fascinating! I didn't know all this. Thanks.

[–] KayLeadfoot@fedia.io 1 points 1 day ago
[–] jonne@infosec.pub 8 points 2 days ago

If they ever fixed it, I'm sure Musk fired whoever was keeping score. He's going to launch the robotaxi stuff soon, and it's going to kill a bunch of people.

[–] NeoNachtwaechter@lemmy.world 18 points 2 days ago

Even when it is just milliseconds before the crash, the computer turns itself off.

Later, Tesla brags that Autopilot was not in use during this (terribly, overwhelmingly) unfortunate accident.

[–] br3d@lemmy.world 8 points 2 days ago* (last edited 2 days ago) (1 children)

There are at least two steps before those three:

-1. Society has been built around the needs of the auto industry, locking people into car dependency

  0. A legal system exists in which the people who build, sell and drive cars are not meaningfully liable when the car hurts somebody
[–] grue@lemmy.world 3 points 2 days ago (1 children)
> 0. A legal system exists in which the people who build, sell and drive cars are not meaningfully liable when the car hurts somebody

That's a good thing, because the alternative would be flipping the notion of property rights on its head. Making the owner not responsible for his property would be used to justify stripping him of his right to modify it.

You're absolutely right about point -1 though.

[–] explodicle@sh.itjust.works 3 points 2 days ago (1 children)

> build, sell and drive

You two don't seem to strongly disagree. The driver is liable but should then sue the builder/seller for "self driving" fraud.

[–] grue@lemmy.world 2 points 2 days ago (1 children)

Maybe, if that two-step determination of liability is really what the parent commenter had in mind.

I'm not so sure he'd agree with my proposed way of resolving the dispute over liability, which would be to legally require that all self-driving systems (and software running on the car in general) be Free Software, putting them squarely and completely within the control of the vehicle owner.

[–] explodicle@sh.itjust.works 2 points 1 day ago (1 children)

I would assume everyone here would agree with that 😘

[–] grue@lemmy.world 2 points 1 day ago (1 children)

I mean, maybe, but when I've said that before, it's typically gone over like a lead balloon. Even in tech forums, a lot of people have drunk the Kool-Aid that it's somehow suddenly too dangerous to allow owners to control their property just because software is involved.

[–] monarch@lemm.ee 1 points 1 day ago

Lemmy is super pro-FOSS.