this post was submitted on 16 Mar 2025
1548 points (99.2% liked)

Not The Onion


Mark Rober just set up one of the most interesting self-driving tests of 2025, and he did it by imitating Looney Tunes. The former NASA engineer and current YouTube mad scientist recreated the classic gag where Wile E. Coyote paints a tunnel onto a wall to fool the Road Runner.

Only this time, the test subject wasn’t a cartoon bird… it was a self-driving Tesla Model Y.

The result? A full-speed, 40 MPH impact straight into the wall. Watch the video and tell us what you think!

[–] MidsizedSedan@lemmy.world 32 points 11 hours ago (6 children)

All these years, I always thought all self-driving cars used LiDAR or something to see in 3D/through fog. How was this allowed on the roads for so long?

[–] Feersummendjinn@feddit.uk 36 points 9 hours ago (2 children)

Originally, the Model S had front-facing radar and ultrasonic sensors all round; the car combined that information to corroborate its visual interpretation.
According to reports from years ago, the radar saved Teslas from multiple pileups when it detected crashes several cars ahead (which the driver couldn't see).
Elmo, in his infinite ego, demanded that both the radar and the ultrasonics be removed: since he could drive without that input, the car should be able to as well... also, it's cheaper.
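
Roughly the kind of fusion being described, as a toy sketch (made-up names and thresholds, obviously not Tesla's actual code): the camera's "road looks clear" verdict gets cross-checked against radar, so a solid echo at braking distance wins even when vision sees nothing.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RadarReturn:
    distance_m: float          # range to the strongest return ahead
    closing_speed_mps: float   # positive when we are approaching the object

def should_brake(camera_flags_obstacle: bool,
                 radar: Optional[RadarReturn],
                 ego_speed_mps: float,
                 max_decel_mps2: float = 6.0) -> bool:
    """Toy fusion rule: let radar corroborate or override the camera.

    A painted wall looks like open road to a camera, but it still returns
    a strong radar echo; so does a pileup hidden behind the car ahead.
    """
    if camera_flags_obstacle:
        return True   # vision already sees something, act on it
    if radar is None:
        return False  # nothing to cross-check against
    stopping_distance_m = ego_speed_mps ** 2 / (2 * max_decel_mps2)
    # Closing at roughly our own speed means the object is standing still.
    roughly_stationary = radar.closing_speed_mps > 0.9 * ego_speed_mps
    return roughly_stationary and radar.distance_m < 1.5 * stopping_distance_m
```

At 40 mph (~18 m/s) with those toy numbers, the braking window works out to roughly 27 m, so a wall echoed anywhere inside ~40 m would already trip the override.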

[–] Ronno@feddit.nl 14 points 8 hours ago (1 children)

Exactly, my previous car (BMW) once saved me in the fog by emergency braking for something I wasn't able to see yet. My current car (Tesla) shuts down almost all safety features when the cameras can't see anything, so I doubt it will help me in such situations. The only time my Tesla works well is in perfect conditions, but I don't live in California.

[–] raspberriesareyummy@lemmy.world 0 points 1 hour ago (1 children)

Exactly, my previous car (BMW) once saved me in the fog by emergency braking for something I wasn’t able to see yet.

If you were driving at a speed at which the low visibility would have gotten you into an accident due to some obstacle you weren't able to see yet, you were driving too fast. Simple, isn't it?

[–] jj4211@lemmy.world 2 points 1 hour ago

While true, it's still nice that super-human senses are looking out for the driver on their behalf. It's also nice if those senses allow for braking earlier and more gracefully, rather than standing hard on the brakes because of late notice.

Fog is one example, but sudden blinding glare is another situation that could be mitigated by things like radar and lidar. A human driver may unexpectedly be blinded while operating at an unsafe speed, with no way of knowing in advance that the glare was coming.
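
For a rough sense of scale (all numbers here are assumptions for illustration, not measurements): at 40 mph a gentle stop needs far more room than heavy fog typically gives a camera or a human.

```python
# Back-of-the-envelope stopping distances at 40 mph (illustrative numbers only).
speed_mps = 40 * 0.44704            # ~17.9 m/s

reaction_time_s = 1.5               # typical human perception-reaction time
gentle_decel_mps2 = 3.0             # comfortable braking
hard_decel_mps2 = 8.0               # near the limit on dry pavement

def stopping_distance_m(decel_mps2: float) -> float:
    # distance covered during reaction + kinematic braking distance v^2 / (2a)
    return speed_mps * reaction_time_s + speed_mps ** 2 / (2 * decel_mps2)

print(f"gentle stop: ~{stopping_distance_m(gentle_decel_mps2):.0f} m")  # ~80 m
print(f"hard stop:   ~{stopping_distance_m(hard_decel_mps2):.0f} m")    # ~47 m
```

So if fog (or glare) cuts the usable camera/eyeball range to a few dozen metres, only a sensor that reports the obstacle earlier leaves room for the graceful option.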

[–] cley_faye@lemmy.world 1 points 1 hour ago (1 children)

I'd be very curious to know how much cheaper it is. Sure, there's R&D to integrate that with everything, but that cost is split across all units sold. It feels like the actual sensors, at this scale, can't add a significant amount to the final price.

[–] KayLeadfoot@fedia.io 1 points 49 minutes ago

Back when Elon made avoiding LiDAR a core part of his professional personality, it was fairly expensive. But as any tech genius can tell ya, component prices drop rapidly for electronics.

Now, radar is dirt cheap. Everything has radar. Radar was removed from Teslas. A radar sensor for my truck is $75, probably much less at scale.

LiDAR sensors cost anywhere from $500-$1,500 for a vehicle of this type, near as I can tell (this type being Level 2 autonomy, rather than something like a Waymo; a well-kitted-out self-driving vehicle has 4 LiDAR sensors).

Here is the LiDAR module currently used on the Mercedes S-Class; it's $400 used: https://www.ebay.com/itm/285816360464

It's a hideously small cost saving in 2025 for a luxury vehicle like a Tesla. Any rational company would've reversed course after the first stationary-object-strike fatality. Tesla is not a rational company.

[–] Ledericas@lemm.ee 14 points 11 hours ago (1 children)

Tesla uses cameras only; I think Waymo uses lidar.

[–] dan@upvote.au 7 points 10 hours ago* (last edited 10 hours ago) (1 children)

Most non-Tesla brands that have some sort of self-driving functionality use lidar and/or radar. I've got a BMW iX, and as far as I know it uses cameras, radar, lidar, and ultrasonic sensors.

[–] Natanox@discuss.tchncs.de 3 points 2 hours ago

It's the only sensible approach. Not only is the notion that "humans use just their eyes too" completely wrong (otherwise how would we be able to tell that something is off with the car "with our butt"?), but computers are also nowhere near our understanding and rapid interpretation of the world around us, or our ability to cooperate beyond what's pre-programmed, which is necessary to deal with unforeseen circumstances. Cars must offset this somehow, and the simplest way to do so is with vast sensor suites that give them as much information as possible. Of course, many humans also utterly fail at cooperation and defensive driving, but that's another problem.

[–] Breadhax0r@lemmy.world 14 points 11 hours ago (2 children)

I remember reading that Tesla only uses cameras for its self-driving. My 2018 Honda uses radar for the adaptive cruise, so the technology exists; Musk is just an idiot.

[–] SkyezOpen@lemmy.world 3 points 11 hours ago (1 children)

Does it? My 2023 model throws a shit fit if it's cold and I assume the camera covers are iced over.

[–] Ilovethebomb@lemm.ee 3 points 10 hours ago

It probably has cameras as well, for lane guidance etc.

My Mazda complains if the windscreen is dirty for the same reason.

[–] NotMyOldRedditName@lemmy.world -1 points 10 hours ago* (last edited 10 hours ago) (1 children)

Radar doesn't detect stopped objects at high speed. It'd hit the wall too on radar alone.

This has to be solved by vision and/or lidar.

[–] Breadhax0r@lemmy.world 4 points 7 hours ago (1 children)

Unless your car is traveling faster than the speed of light, radar will detect objects in front of it. But yeah, I was trying to imply that for a complex system like self-driving, Musk is a buffoon for relying on a single sensor type instead of creating a more robust package of sensors.

[–] NotMyOldRedditName@lemmy.world 2 points 1 hour ago* (last edited 15 minutes ago)

They get filtered out, and the car will not act on them, because there is so much noise from stationary objects all around you. The car essentially wouldn't drive at all if it didn't filter them out.

At high speeds, the radar in today's cars is used to detect moving objects and changes in their velocity.

Radar will not prevent running into this wall at 40mph.

People can downvote me all they want, but that doesn't change anything.

Only vision and/or lidar would stop for that wall at 40mph.

Edit: aside from clarifying the above, these are the expected outcomes:

Radar in cars today: hits the wall.

Vision: probably all hit the wall, but could be trained not to if that scenario were covered in training.

Lidar: would not hit the wall.
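
For anyone curious, here's roughly what that filtering looks like as a toy sketch (made-up names and thresholds; real automotive radar stacks are far more involved):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class RadarTarget:
    range_m: float
    relative_speed_mps: float   # negative when the target is closing on us

def moving_targets_only(targets: List[RadarTarget],
                        ego_speed_mps: float,
                        stationary_margin_mps: float = 2.0) -> List[RadarTarget]:
    """Toy version of the classic stationary-target filter.

    A return closing on us at roughly our own speed is standing still
    relative to the ground: signs, bridges, guardrails, parked cars, and,
    unfortunately, a painted wall. Legacy cruise-control radar drops these
    at speed to avoid constant false alarms, which is why radar alone
    wouldn't necessarily have saved the car here.
    """
    kept = []
    for t in targets:
        ground_speed_mps = ego_speed_mps + t.relative_speed_mps  # ~0 if stationary
        if abs(ground_speed_mps) > stationary_margin_mps:
            kept.append(t)
    return kept
```

The filter itself is trivial; the hard part is deciding which stationary return actually matters, and that's exactly the judgment that gets punted to vision or lidar.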

[–] TheYang@lemmy.world 6 points 8 hours ago (2 children)

They do.

But "all self driving cars" are practically only from waymo.
Level 4 Autonomy is the point at which it's not required that a human can intercede at any moment, and as such has to be actively paying attention and be sober.
Tesla is not there yet.

On the other hand, this is an active attack against the technology.
Mirrors or any super-absorber (possibly Vantablack or similar) would fuck up lidar, which is a good reason for diversifying the sensors.

On the other hand I can understand Tesla going "Humans use visible light only, in principle that has to be sufficient for a self-driving car as well", because in principle I agree. In practice... well, while this seems much more like click-bait than an actual issue for a self-driving taxi, diversifying your input chain makes a lot of sense in my book. On the other hand, if it would cost me 20k more down the road and cameras would reach the same safety, I'd be a bit pissed.

[–] octopus_ink@slrpnk.net 6 points 3 hours ago* (last edited 3 hours ago) (1 children)

On the other hand I can understand Tesla going “Humans use visible light only, in principle that has to be sufficient for a self-driving car as well”, because in principle I agree.

The whole idea is they should be safer than us at driving. It only takes fog (or a painted wall) to conclude that won't be achieved with cameras only.

On the other hand, if it would cost me 20k more down the road and cameras would reach the same safety,

You had a lot of hands in this paragraph. 😀

I'm exceptionally doubtful that the related costs were anywhere near this number, and it's inconceivable to me that cameras only could ever be as safe as having a variety of inputs.

Musk's ethos is clear, both in business and government. He will make whatever short term decisions his greed and the ketamine tell him to make, and fuck whatever happens down the road. Let's not work so hard to sanewash him like the media has Trump.

[–] TheYang@lemmy.world 0 points 2 hours ago* (last edited 2 hours ago)

The whole idea is they should be safer than us at driving. It only takes fog (or a painted wall) to conclude that won’t be achieved with cameras only.

Well, I do still think that cameras could reach "superhuman" levels of safety.
(Very dense) fog makes the cameras useless; a self-driving car would have to slow way down or shut itself off. If the cameras are part of a variety of inputs, they drop out as well, reducing the available information. How would you handle that then? If it would have to drop out/slow down just as much, ~~you gain nothing again~~ /e: my original interpretation is obviously wrong, you get the additional information whenever the environment permits.
And for the painted wall: cameras should be able to detect that. It's just that Tesla presumably hasn't implemented defenses against active attacks yet.

You had a lot of hands in this paragraph. 😀

I like to keep spares on me.

I’m exceptionally doubtful that the related costs were anywhere near this number.

Costs have been dropping rapidly. Pretty sure several years ago (about when Tesla first started announcing it would be ready in a year or two) it was in the tens of thousands. But you're right, more current estimates seem to be more in the range of $500-2000 per unit, and 0-4 units per car.

it’s inconceivable to me that cameras only could ever be as safe as having a variety of inputs.

Well, diverse sensors always reduce the chance of confident misinterpretation.
But they also mean you can't "do one thing, and do it well", as now you have to do 2-4 things (camera, lidar, radar, sonar) well. If one were to get to the point where you have either one really good data-source, or four really shitty ones, it becomes conceivable to me.
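
As a toy illustration of both points (the error rates are invented, and the independence assumption is generous):

```python
# Invented, illustrative error rates: probability a modality confidently
# misreads a given scene, assuming the failures are independent.

p_camera, p_radar, p_lidar = 0.01, 0.05, 0.02

# Chance all three are wrong at once (an upper bound on them agreeing
# on the same wrong call when agreement is required):
print(p_camera * p_radar * p_lidar)   # ~1e-05, i.e. about 1 scene in 100,000

# The counterpoint: one excellent source vs. three shoddy ones.
p_single_excellent = 0.001
p_each_shoddy = 0.1
print(p_single_excellent)             # 0.001
print(p_each_shoddy ** 3)             # ~0.001 -- roughly a wash
```

With invented numbers you can make either side win, which is kind of the point: the answer depends on how good the single source actually is.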

From what I remember, there is distressingly little oversight for allowing self-driving cars on the road, as long as the company is willing to be on the hook for accidents.

[–] atempuser23@lemmy.world 1 points 10 minutes ago

Vantablack

Lidar won't be affected by Vantablack outside of a lab experiment. It picks up contamination very quickly and can't be effectively cleaned.

[–] FiskFisk33@startrek.website 1 points 10 hours ago

They generally do.