This represents the danger of expecting driver override to avoid accidents. If the driver has to be prepared to take control in an accident like this AT ALL TIMES, then the driver is required to be more engaged than they would be if they were just driving manually: they have to constantly anticipate not just what other hazards (drivers, pedestrians, …) might be doing, but also the ways their own vehicle may be trying to kill them.
Absolutely.
I've got a car with level 2 automation, and after using it for a few months, I can say that it works really well, but you still need to be engaged to drive the car.
What it is good at... Maintaining lanes, even in tricky situations with poor paint/markings. Maintaining speed and distance from the car in front of you.
What it is not good at... Tricky traffic, congestion, or sudden stops. Lane changes. Accounting for cars coming up behind you. Avoiding road hazards.
I use it mostly like an autopilot. The car takes some of the monotonous workload out of driving, which allows me to move my focus from driving the car to observing traffic, other drivers, and road conditions.
The car made a fatal decision faster than any human could possibly correct it. Tesla’s idea that drivers can “supervise” these systems is, at this point, nothing more than a legal loophole.
What I don't get is how this false advertising, going on for years, hasn't bankrupted Tesla already.
Because the US is an insane country where you can straight up just break the law and as long as you're rich enough you don't even get a slap on the wrist. If some small startup had done the same thing they'd have been shut down.
What I don't get is why teslas aren't banned all over the world for being so fundamentally unsafe.
I've argued this point for the past year: there are obvious safety problems with Teslas, even without considering FSD.
Like the blinker buttons on the steering wheel, manual door handles that are hard to find in an emergency, and common operations buried behind menus on the screen instead of having directly accessible buttons. With Autopilot they also tend to brake for no reason, even on the autobahn with a clear road ahead! Which can also create dangerous situations.
Well, because 99% of the time, it's fairly decent. That 1%'ll getchya tho.
To put your number into perspective: if it failed only once in every hundred miles, it would kill you multiple times a week at the average commute distance.
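To make that concrete, here's a back-of-the-envelope sketch. Both numbers are assumptions for illustration: the 1%-per-mile failure rate comes from the comment above, and the ~40-mile round-trip commute is a rough stand-in for a typical US commute.

```python
# Back-of-the-envelope sketch; these are illustrative assumptions,
# not measured Tesla statistics.
failure_rate_per_mile = 0.01   # "1 time in every hundred miles", from the comment above
commute_miles_per_day = 40     # assumed round-trip commute distance
workdays_per_week = 5

failures_per_week = failure_rate_per_mile * commute_miles_per_day * workdays_per_week
print(round(failures_per_week, 2))  # 2.0 expected failures per week
```

Even if only a fraction of those failures were serious, a rate anywhere near this would be unsurvivable as a daily driver, which is the point of the comparison.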
Someone who doesn't understand math downvoted you. This is the right framework to understand autonomy, the failure rate needs to be astonishingly low for the product to have any non-negative value. So far, Tesla has not demonstrated non-negative value in a credible way.
Many Tesla owners are definitely dead many times, on the inside.
It absolutely fails miserably fairly often, though, and would likely crash that frequently without human intervention. Not to the extent seen here, where there isn't even time for human intervention, but I frequently had to take over back when I used it (post v13).
That's probably not the actual failure rate, but a 1% failure rate is several thousand times higher than what NASA would consider an abort-risk condition.
Let's say the risk is only 0.01%: that's still several thousand crashes per year. Even if we could guarantee that all of them would be non-fatal and would not involve any bystanders such as pedestrians, the cost of replacing all of those vehicles every time they crashed, plus fixing the damage to the things they crashed into (lamp posts, shop windows, etc.), would be so high as to exceed any benefit of the technology.
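The fleet-scale arithmetic behind "several thousand crashes per year" can be sketched like this. The 0.01% per-trip risk is from the comment above; the fleet size and trips-per-day figures are purely assumed for illustration.

```python
# Hypothetical fleet-scale arithmetic; fleet size and trip counts are
# assumptions, not real deployment numbers.
crash_risk_per_trip = 0.0001    # the 0.01% figure from the comment
fleet_size = 100_000            # assumed number of vehicles using the system
trips_per_vehicle_per_day = 2   # assumed commute pattern (out and back)

crashes_per_year = crash_risk_per_trip * fleet_size * trips_per_vehicle_per_day * 365
print(round(crashes_per_year))  # 7300
```

The takeaway is that at fleet scale, even a risk that sounds tiny per trip multiplies into thousands of incidents per year.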
It wouldn't be as bad if this were prototype technology that was constantly improving, but Tesla has made it very clear they're never going to add lidar scanners, so it's literally never going to get any better; it's always going to be this bad.
...it's literally never going to get any better; it's always going to be this bad.
Hey now! That's unfair. It is constantly changing. Software updates introduce new regressions all the time. So it will be this bad, or significantly worse, and you won't know which until it tries to kill you in new and unexpected ways :j
The worst part is that this problem has already been solved by using LIDAR. Vegas had fully self-driving cars that I saw perform flawlessly, because they were manufactured by a company that doesn’t skimp on tech and rip people off.
I wouldn't really call it a solved problem when Waymo, with lidar, is crashing into physical objects.
NHTSA stated that the crashes “involved collisions with clearly visible objects that a competent driver would be expected to avoid.” The agency is continuing its investigation.
It'd probably be better to say that Lidar is the path to solving these problems, or a tool that can help solve them. But not solved.
Just because you see a car working perfectly, doesn't mean it always is working perfectly.
For no reason?
They are running proprietary software in the car, and people don't even know what's happening in the background of it. Every electric car needs to be turned into an open-source car so that the car can't be tampered with, there's no surveillance, etc.
Everyone should advocate for that, because the alternative is this situation with Tesla. And I know nobody wants this happening to other car manufacturers' cars either.
I just don't see how this technology could possibly be ready to power an autonomous taxi service by the end of next week
That's because it won't, because Elmo Musk is, gasp, a liar. Always has been. That robotaxi is actually an older lie he used a couple of years prior, but he dusted it off and re-used it.
Anytime Elmo says he's confident they can do it now, he means they're nowhere near a real product. Anytime he says "next year", it means it won't ever happen. Anytime he says they already have a product that just needs to be produced, it means it'll never happen.
He is a vaporware con man who has been cheating people (and mostly the US government) out of billions
Literally look at all of his promises over the last decade, you start seeing patterns. It's always almost there.
SpaceX, arguably his most successful company, and the one he actually built with his own leadership, is a shit show of lies. According to him we'd have colonies on Mars by now; that's what he took 3 billion dollars in funding for, and he literally isn't at 1% of that. Yet he keeps claiming: within a few years now! Three billion dollars, and he managed to blow up a banana over the Indian Ocean and obliterate a launch pad.
If I commit fraud in the thousands, take thousands and then don't deliver, I go to jail. He does it with countless billions and he's still out there. But alas, his behavior is finally catching up with him; Tesla is going off a cliff now that nobody wants to drive a Nazi brick anymore.
Literally look at all of his promises over the last decade, you start seeing patterns. It’s always almost there.
Cheers for the guy/gal that maintains an updated list of all his bullshit, check it out sometime
Holy shit that is a treasure trove! Thanks kind stranger!
If it were open-source tech, people could check it and see for themselves whether it really is capable. But because it's not, we don't know what it's missing to be way, way better.
Nah, on the 5 levels of autonomous driving, Teslas are at level 2.
Elmo isn't even close, but that won't stop him from just lying about it, because that is what Elmo does best.
Elon took the wheel because that person made a mean tweet about him
"It crashed!"
"Yes but it did it all by itself!"
Except for the last 0.05 seconds before the crash where the human was put in control. Therefore, the human caused the crash.
The problem with automation is complacency. Especially in something that people already have a very hard time taking seriously like driving where cell phone distraction, conversations, or just zoning out is super common.
“Kill me” it said in a robotic voice that got slower, glitchier, and deeper as it drove off the road.
EXTERMINAAAAAATE!!!
I mean, if Elon was my dad, I'd probably have some suicidal tendencies too.
Don't drive Tesla
I am never getting into a self driving car. I don't understand why we are investing money into this technology when people can already drive cars on their own, and we should be moving towards robust public transportation systems anyway. A waste of time and resources to... what exactly? Stare at your phone for a few extra minutes a day? Work from home and every city having robust electric transit systems is what the future is supposed to be.
To be fair, that grey tree trunk looked a lot like a road
It's fine, nothing at all wrong with using just camera vision for autonomous driving. Nothing wrong at all. So a few cars run off roads or don't stop for pedestrians or drive off a cliff. So freaking what, that's the price for progress my friend!
I'd like to think this is unnecessary but just in case here's a /s for y'all.
GPS data predicted the road would go straight as far as the horizon. Camera said the tree or shadow was an unexpected 90 degree bend in the road. So the only rational move was to turn 90 degrees, obviously! No notes, no whammies, flawless