this post was submitted on 26 Nov 2025
361 points (96.9% liked)
Technology
It never fails to amaze me how disconnected C-level people are from reality.
Abstraction layers. They are so detached from everyone else through abstraction layers that we're nothing more than D2 NPC character sheets to them. That's why when a Luigi, allegedly, breaks through all of the abstraction layers and brings a leaded reality check to these fucking parasites, they double down on Palantir-like projects to keep themselves safe, while making the state even more oppressive and invasive of everyone's privacy.
Right?! At the end of the day, they're still just people. Gotta eat, gotta sleep, gotta shit. I will never understand how any individual gets so much money/power/attention, because they're all just goddamn people, and in the event of a catastrophe I imagine they would be about as helpful as any other random human. They aren't gods, and they certainly don't deserve the stratification. It's not like they're enlightened or something; most of the time they're just sociopaths who are rich, clever, and/or connected. When you get a glimpse under the hood at moments like this, it really is kinda jarring. Helps to dispel those silly presumptions about them, at least.
What you're describing is a general experience with LLMs, not limited to the C-level.
If an LLM spouts rubbish, you detect it because you have external knowledge; in other words, you're the subject matter expert.
What makes you think that those same errors are not happening at the same rate outside your direct personal sphere of knowledge?
Now consider what this means for the people around you, including the C-level.
Repeat after me: AI is Assumed Intelligence and should not be considered anything more than autocorrect on steroids.
AI is a classic case of Gell-Mann amnesia
I'll add it to the list:
Thank you for giving me the name for this. I had no idea someone had properly defined it.
Managers love these AI tools because that's what they're already doing and familiar with; talking an AI into doing something for you is not very different from the experience of instructing a mediocre worker.