this post was submitted on 26 Nov 2025
325 points (96.3% liked)
PC Gaming
you are viewing a single comment's thread
Except every tool used for development is going to have some level of AI in it, and unless you're also building your own AI-free tools, you won't know what's truly AI-free. AI is here and the cat is out of the bag; there's no putting it back at this point. We as a society need to figure out how to use it ethically.
We can't even restrain ourselves when it comes to weapons, the extraction of natural resources, energy use, consumerism, or cars. The way our societies work won't give ethics much of a chance.
LLMs are actually just massively improved spell checkers. If you've used an IDE with inline error detection, technically that's AI now.
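To make the "improved autocomplete" comparison concrete, here's a toy next-word predictor built from nothing but bigram counts. It only illustrates the "predict the likely next token" idea; real LLMs do this with neural networks over enormous corpora rather than lookup tables, and everything below is made up for the sketch, not taken from any real library.

```python
# Toy next-word prediction from bigram counts -- the same "guess the most
# likely next token" principle that LLMs scale up enormously.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows which.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def suggest(word):
    """Return the word most often seen after `word`, like a crude autocomplete."""
    following = bigrams.get(word)
    return following.most_common(1)[0][0] if following else None

print(suggest("the"))  # -> "cat" (it follows "the" twice in this tiny corpus)
```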
I do wish we'd drawn the line more clearly on what "AI" usage means in the context of "this game was made with AI."
Every single time I think about what LLMs are, I think of this quote from the game Night in the Woods:
“We're good at drawing lines through the spaces between stars like we're pattern-finders, and we'll find patterns and we like really put our hearts and minds into it and even if we don't mean to.”
LLMs are based on neural networks: little brains with nothing to do but find patterns in our own logic, which can make them seem smarter than they really are. Evolution hasn't burdened them with an ego or a sense of urgency, but because they've been trained on ours, they can sometimes emulate both. They still fundamentally lack the complexity of our brains, at least for now. It's amazing what they can do with so little, and it's equally amazing how convincing they can be when they're completely wrong. It is a viral form of intelligence.
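For what it's worth, the "pattern-finder" framing can be shown with something absurdly small. The sketch below isn't an LLM and isn't anyone's real code, just a single weight and bias nudged by gradient descent until they match a pattern hidden in some made-up data; a neural network is, very roughly, billions of these units doing the same thing at once.

```python
# A deliberately tiny "pattern-finder": one weight and one bias adjusted by
# gradient descent until they recover the rule hidden in the data (y = 3x + 1).
data = [(x, 3 * x + 1) for x in range(-5, 6)]  # the pattern to be discovered

w, b = 0.0, 0.0   # start knowing nothing
lr = 0.01         # learning rate

for _ in range(5000):
    for x, target in data:
        pred = w * x + b
        err = pred - target
        # Gradient of the squared error with respect to w and b.
        w -= lr * err * x
        b -= lr * err

print(round(w, 2), round(b, 2))  # settles around 3.0 and 1.0
```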
What I find sad is that we're really given no option to run it locally. It's an excuse to turn everything into a live service where not even a subscription saves you, because now you can run out of "tokens." I have absolutely no issue with OSS tools incorporating local-LLM aids; if people have modern GPUs, they can run local LLMs in some form or another.
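For anyone curious, running a model locally is already fairly approachable. A minimal sketch, assuming Ollama is installed and serving on its default local port with a model already pulled (the model name and prompt here are just placeholders); llama.cpp and other OSS runners expose similar local interfaces. No cloud, no metered tokens.

```python
# Query an LLM running entirely on your own machine via Ollama's local HTTP API.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",   # any locally pulled model works here
        "prompt": "Summarise this commit message in one line: ...",
        "stream": False,     # return one JSON object instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```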