this post was submitted on 02 May 2025
1053 points (98.6% liked)
memes
Unironically, this is a perfect example of why AI is being used to choose targets to murder in the Palestinian genocide, in cases like DOGE attacking the functioning of the U.S. government, in US healthcare companies' claim denials, and in landlord software colluding to raise rents.
The economic function of AI is to let you abdicate responsibility for your actions so you can make a bit more money while hurting people, and until the public becomes crystal clear on that, we are in a wild amount of danger.
Just substitute for Elon the vague idea of a company that will become a legal and ethical escape goat for brutal choices made by individual humans.
Which is why we need laws about human responsibility for decisions made by AI (or software in general).
I did an internship at a bank way back, and my role involved a lot of processing of spreadsheets from different departments. I automated a heckton of that with Visual Basic, which my boss was okay with, but I was dismayed to learn that I wasn't saving anyone's time except my own, because after the internship was finished, all of the automation stuff would have to be deleted. The reason was a rule (I think a company policy rather than a law) that required any code to be in the custody of someone, for accountability purposes — "accountability" in this case meaning "if we take unmaintained code for granted, then we may find an entire department's workflow crippled at some point in the future, with no-one knowing how it's meant to work".
It's quite a different thing than what you're talking about, but in terms of the implementation, it doesn't seem too far off.
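For anyone curious, the kind of spreadsheet consolidation described above looks roughly like this. This is a minimal Python sketch (the original was Visual Basic), with invented file names and an invented "amount" column purely for illustration:

```python
# Hypothetical sketch of automating per-department spreadsheet processing:
# summing an "amount" column across several departments' CSV exports.
# Department names and the column layout are invented for this example.
import csv
import io

def consolidate(department_files):
    """Return each department's total from its CSV export text."""
    totals = {}
    for name, contents in department_files.items():
        reader = csv.DictReader(io.StringIO(contents))
        total = 0.0
        for row in reader:
            total += float(row["amount"])
        totals[name] = total
    return totals

# Two toy "spreadsheets" as CSV text standing in for real exports.
files = {
    "loans": "amount\n100.0\n250.5\n",
    "deposits": "amount\n75.25\n",
}
print(consolidate(files))  # {'loans': 350.5, 'deposits': 75.25}
```

The point of the custody rule in the anecdote is that a script like this, once a department depends on it, needs a named maintainer who understands it.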
Cars already do that without AI. If someone driving a car kills you and they aren't drunk, they probably won't get in any trouble, and car manufacturers never face any penalty for the roughly 40,000 deaths and 2,000,000 injuries per year that cars cause in the USA alone.
Yes, good point, there is a clear through line here
FYI: Scapegoat.
Lol well, Escape Goat 1 & 2 were just too damn good at being tough-as-nails indie platformers, and now the word is hopelessly "Escape Goat", not "Scapegoat", in my head, I'm afraid.
Which is honestly just the end game of a practice that's been getting worse for decades. It's partly why stuff was outsourced. The more layers between us and the atrocities, the less humanity can focus on reacting to them.
There's been a concerted effort to introduce as many layers as possible to divide people and break up communities, in order to break humans' ability to empathize (and then use that empathy to effect change).
It reminds me of how firing squad executions apparently used to have only some of the guns loaded with live rounds, and the rest with blanks. That way, the executioners could do some moral gymnastics to convince themselves that they hadn't just killed a person.