this post was submitted on 02 Dec 2025
522 points (99.4% liked)
PC Gaming
12798 readers
862 users here now
For PC gaming news and discussion. PCGamingWiki
Rules:
- Be Respectful.
- No Spam or Porn.
- No Advertising.
- No Memes.
- No Tech Support.
- No questions about buying/building computers.
- No game suggestions, friend requests, surveys, or begging.
- No Let's Plays, streams, highlight reels/montages, random videos or shorts.
- No off-topic posts/comments, within reason.
- Use the original source, no clickbait titles, no duplicates. (Submissions should be from the original source if possible, unless from paywalled or non-English sources. If the title is clickbait or lacks context, you may lightly edit the title.)
founded 2 years ago
you are viewing a single comment's thread
view the rest of the comments
But this is the only thing that matters. Who cares how people who don't know about, don't care about, or aren't capable of using some technology are actually using it? Ammonium nitrate is poisonous if you try to salt a soup with it, but it works miracles in the hands of those who know how to use it (if you like analogies).
I am not a lawyer, but I don't think that even in the USA you're required to buy everything that corporations sell.
Modern LLMs are very useful tools. It's just that some people try to make small talk with them for some reason. That's their right. Salting soup with nitrates is also their right.
Oh goddammit, you're here. Move on to Lembot_0006 already, ya contrarian doofus.
Is that a user that is "unblockable" by dodging username blocks by iterating the name?
Yep, someone already joked that they picked a 4-digit number for their username so they'd have a failsafe for getting banned 9,998 more times. That was at 0004.
If you look at their comment history, it really explains it all, tbh. I wonder if we could get this instance to defederate their account haven.
Ugh yet another Lemmy user desperate to make the LLMs like them. Go chat with your chatbot friends and leave humans alone mate.