this post was submitted on 23 May 2025
126 points (98.5% liked)

Technology

A 2025 Tesla Model 3 in Full Self-Driving mode drives off a rural road, clips a tree, loses a tire, flips over, and comes to rest on its roof. Luckily, the driver is alive and well, and able to post about it on social media.

I just don't see how this technology could possibly be ready to power an autonomous taxi service by the end of next week.

[–] sidtirouluca@lemm.ee 1 points 6 months ago (3 children)

Self-driving is the future, but I'm glad I'm not a beta tester.

[–] KayLeadfoot@fedia.io 1 points 6 months ago (1 children)

You're probably right about the future, but like damn, I wish they would slow their roll and use LiDAR

[–] FaceDeer@fedia.io 0 points 6 months ago (1 children)

Elon Musk decided years ago that they absolutely would not use lidar, back when lidar was expensive enough that a decision like that made economic sense to at least try to make work. Nowadays lidar is a lot cheaper, but for whatever reason Musk has drawn a line in the sand and refuses to back down on it.

Unlike many people online these days, I don't believe that Musk is some kind of sheer-luck, bought-his-way-into-success grifter; he has been genuinely involved in many of the decisions that made his companies grow. But this is one of the downsides of that (the Cybertruck is another). He's forced through ideas that turned out to be amazing, but he's also forced through ideas that sucked. He seems to be increasingly having trouble distinguishing them.

[–] Buffalox@lemmy.world 1 points 6 months ago* (last edited 6 months ago)

Musk has drawn a line in the sand and refuses to back down on it.

From what I heard, the upcoming Tesla robotaxi test cars, based on the Model Y, are supposed to have LIDAR. But it's ONLY the robotaxi version that has it.

He seems to be increasingly having trouble distinguishing them.

Absolutely. It seems to me he has been delusional for years, and it's getting worse.

[–] DarrinBrunner@lemmy.world 1 points 6 months ago

It got the most recent update, and thought a tunnel was a wall.

[–] vegeta@lemmy.world 1 points 6 months ago
[–] RaptorBenn@lemmy.world 0 points 6 months ago (1 children)

Not really worth talking about unless the crash rate is higher than the human average.

[–] KayLeadfoot@fedia.io 1 points 6 months ago

Imagine if people treated airbags that way XD

If Ford airbags just plain worked, and Tesla airbags worked 999 times out of 1,000, would the correct answer be to say, "Well, them's the breaks, there is no room for improvement, because dangerously flawed airbags are way safer than no airbags at all"?

Like, no. No, no, no. Cars get recalled for flaws that are SO MUCH less dangerous.

[–] itisileclerk@lemmy.world 0 points 6 months ago (1 children)

Why would someone be a passenger in a self-driving vehicle? Do they know they are test subjects, part of a "car trial" (or whatever it should be called)? Self-driving is not reliable and not necessary. Too much money is invested in something that is low priority to have. There are perfectly fast and safe self-driving solutions already, like high-speed trains.

[–] melsaskca@lemmy.ca 0 points 6 months ago (1 children)

I have visions of Elon sitting in his lair, stroking his cat, and using his laptop to cause this crash. /s

[–] RandomStickman@fedia.io 0 points 6 months ago (1 children)

Anything outside of freshly painted and paved LA roads at high noon on a sunny day isn't ready for self-driving, it seems.

[–] Bonesince1997@lemmy.world 0 points 6 months ago (1 children)

Or silly tunnels you can't get out of.

[–] Zwuzelmaus@feddit.org 1 points 6 months ago

Tunnels are extra dangerous. Not because of the likelihood of an accident, but because of the situation if an accident does happen: it easily blocks the tunnel, fills it with smoke, and kills hundreds.

Except in newly built tunnels in rich countries.

[–] Skyrmir@lemmy.world 0 points 6 months ago (3 children)

I use autopilot all the time on my boat. No way in hell I'd trust it in a car. They all occasionally get suicidal. Mine likes to lull you into a false sense of security, then take a sharp turn into a channel marker or cargo ship at the last second.

[–] dependencyinjection@discuss.tchncs.de 1 points 6 months ago (1 children)

Exactly. My car doesn’t have AP, but it does have a shed load of sensors, and sometimes it just freaks out about stuff being too close to the car for no discernible reason. Really freaks me out, as I’m like, what do you see, bro? We’re just driving down the motorway.

[–] ayyy@sh.itjust.works 1 points 6 months ago

For mine, it’s the radar seeing the retro-reflective stripes on utility poles being brighter than it expects.

[–] echodot@feddit.uk 1 points 6 months ago* (last edited 6 months ago) (1 children)

Isn't there a plane whose autopilot famously keeps trying to crash into the ground? The general advice is to just not let it do that: whenever it looks like it's about to crash into the ground, pull up instead.

[–] GamingChairModel@lemmy.world 1 points 6 months ago

All the other answers here are wrong. It was the Boeing 737 MAX.

They fit bigger, more fuel-efficient engines on it, which changed the flight characteristics compared to previous 737s. And so, rather than have pilots recertify on it as a new model (lots of flight hours, can't switch back), they designed software to basically make the aircraft seem to behave like the old model.

And so a bug in the cheaper version of the software, combined with a faulty sensor, would cause the software to take over, override the pilots, and push the nose down instead of pulling up. Two crashes happened within five months, to aircraft that were pretty much brand new.

It was grounded for a while as Boeing fixed the software and hardware issues, and, more importantly, updated all the training and reference materials for pilots so that they were aware of this basically secret setting that could kill everyone.
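A toy sketch of the single-sensor failure mode described above (thresholds and function names are made up for illustration; this is not the actual MCAS logic):

```python
# Hypothetical illustration: automated trim logic that trusts a single
# angle-of-attack (AoA) sensor vs. logic that cross-checks two sensors.
# Thresholds and names are invented for this sketch; NOT real MCAS code.

AOA_STALL_THRESHOLD_DEG = 15.0   # made-up "approaching stall" threshold
DISAGREE_LIMIT_DEG = 5.0         # made-up allowed sensor disagreement

def single_sensor_trim(aoa_left_deg: float) -> str:
    """Acts on one sensor: a single faulty reading can command nose-down trim."""
    if aoa_left_deg > AOA_STALL_THRESHOLD_DEG:
        return "trim nose down"
    return "no action"

def cross_checked_trim(aoa_left_deg: float, aoa_right_deg: float) -> str:
    """Disengages when the two sensors disagree instead of acting on bad data."""
    if abs(aoa_left_deg - aoa_right_deg) > DISAGREE_LIMIT_DEG:
        return "disengage and alert pilots"
    if min(aoa_left_deg, aoa_right_deg) > AOA_STALL_THRESHOLD_DEG:
        return "trim nose down"
    return "no action"

# A stuck sensor reports 22 degrees while the other reads a normal 3 degrees:
print(single_sensor_trim(22.0))       # -> "trim nose down" (erroneous)
print(cross_checked_trim(22.0, 3.0))  # -> "disengage and alert pilots"
```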

[–] SynopsisTantilize@lemm.ee 0 points 6 months ago (5 children)

They have autopilot on boats? I never even thought about that existing. Makes sense, just never heard of it until just now!

[–] JohnEdwa@sopuli.xyz 1 points 6 months ago

They've technically had autopilots for over a century; the first one was installed on the oil tanker J.A. Moffett in 1920. The main purpose is to keep the vessel going dead straight, since wind and currents would otherwise turn it, so in modern car terms I think it would be more accurate to say they have lane assist? Commercial ones can often do waypoint navigation, following a set route on a map, but I don't think that's very common on personal vessels.
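A minimal sketch of the heading-hold idea described above: a proportional controller that steers the rudder against the heading error caused by wind and current. Gains, limits, and names are illustrative, not from any particular marine autopilot.

```python
# Minimal heading-hold sketch: steer the rudder in proportion to the
# difference between the set course and the actual compass heading.
# Gain, rudder limit, and sign convention (positive = starboard) are illustrative.

def heading_error_deg(target: float, actual: float) -> float:
    """Smallest signed difference between two compass headings, in degrees."""
    return (target - actual + 180.0) % 360.0 - 180.0

def heading_hold_step(target_hdg: float, actual_hdg: float,
                      gain: float = 0.8, max_rudder: float = 30.0) -> float:
    """Proportional rudder command that counters wind/current drift."""
    error = heading_error_deg(target_hdg, actual_hdg)
    return max(-max_rudder, min(max_rudder, gain * error))

# Wind and current have pushed the bow 12 degrees to port of the set 90-degree course:
print(heading_hold_step(90.0, 78.0))  # -> 9.6 (rudder toward starboard)
```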

[–] gamermanh@lemmy.dbzer0.com -1 points 6 months ago (1 children)

No serious injuries

How unfortunate

[–] TeddE@lemmy.world 0 points 6 months ago (2 children)

Look, I respect where you're coming from. May I presume your line of reasoning is in the vein of "Elon Musk sucks, and thus anyone who buys his stuff is a Nazi and should die"? That is far, far too loose a chain of logic on its own to justify sentencing a man to death. Perhaps if you said that they should be held accountable, with the death penalty on the table? But c'mon - are you really the callous monster your comment paints you as?

[–] ayyy@sh.itjust.works -1 points 6 months ago (2 children)

These aren’t passive victims; they are operating dangerously flawed machines at high speed on roads shared with the rest of us.

[–] gamermanh@lemmy.dbzer0.com -2 points 6 months ago

I give 0 ducks about Nazis who drive the Nazi car. The more of them that oven themselves in them the better

[–] AA5B@lemmy.world -5 points 6 months ago* (last edited 6 months ago)

It’s ready, but you’re assuming an entirely general taxi service. It will be carefully constrained, like Waymo was: limited to easy streets and times, probably lower speeds, where there is less chance of problems. It’s ready for that.

There’s always a reason. I agree with the author: most likely it misinterpreted a shadow as a solid obstacle. I’m not excusing it, but humans do that too, and Tesla will likely ensure it doesn’t come up in their taxi service.

Remember that the robotaxi doesn’t actually exist yet. I’m pretty sure the plan is to start with Model Ys with human safety drivers. It’s ready for that.

I did a trial to find out for myself, and my reason for it not being ready yet is a bit different. Full Self-Driving did perfectly under “normal” conditions, and every time it made me nervous was an edge case. However, it made me realize that driving is all edge cases. It’s not ready and may never be.
