Define "know".
An LLM can be trained on text describing how it works and then produce answers that incorporate that text.
LLMs have no intrinsic ability to "sense" what's going on inside them, nor even a sense of time; neither is an input to their state. You can build neural-net-based systems that do have such inputs, but ChatGPT or whatever isn't that.
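To make that concrete, here's a deliberately minimal sketch (the `model` callable is hypothetical, standing in for any decoder-only transformer): the only thing the model conditions on is the token sequence itself. There's no timestamp argument and no channel through which the network can read its own activations:

```python
# Hypothetical sketch: the entire "state" a decoder-only LLM sees at each
# step is the token sequence. Note what's *absent* from the signature:
# no wall-clock time, no readout of the model's own internal activations.

def generate_next_token(model, token_ids: list[int]) -> int:
    # `model` (hypothetical) maps token IDs to logits over the vocabulary;
    # its weights are frozen at inference time.
    logits = model(token_ids)
    # Greedy decoding, kept simple on purpose: pick the highest-scoring token.
    return max(range(len(logits)), key=lambda i: logits[i])
```

You could of course pass extra inputs (a timestamp, sensor readings) to a neural net that was designed and trained to use them; the point is that off-the-shelf chatbot LLMs weren't.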
LLMs lack a lot of the mechanisms that I would call essential to being able to solve problems in a generalized way. While I think Dijkstra had a valid point:

> The question of whether machines can think is about as relevant as the question of whether submarines can swim.
...and we shouldn't let our prejudices about how a mind "should" function internally cloud how we treat artificial intelligence... it's also true that we can look at an LLM and say that it fundamentally lacks the ability to do a lot of things that a human-like mind can. An LLM is, at best, something like a small part of our mind. While extracting it and playing with it in isolation can produce some interesting results, there's a lot it can't do on its own: it won't, say, engage in goal-oriented behavior. Asking a chatbot questions that require introspection and insight on its part won't yield interesting results, because it can't really engage in introspection or insight to any meaningful degree. It has very little mutable state, unlike your mind.
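For contrast, here's a hedged sketch of what "mutable state" typically amounts to in a deployed chatbot (the `complete` function is hypothetical, wrapping some frozen model): everything the system "remembers" within a conversation lives in one growing transcript, and the weights themselves never change:

```python
# Hypothetical chat loop. The only mutable state across turns is the
# growing `transcript` string; the model's weights are frozen, and nothing
# "learned" in one conversation persists outside this variable.

def chat_loop(complete) -> None:
    transcript = ""                      # the sum total of mutable state
    while True:
        user_msg = input("you> ")
        transcript += f"User: {user_msg}\nAssistant: "
        reply = complete(transcript)     # stateless call: prompt in, text out
        transcript += reply + "\n"
        print(reply)
```

Compare that single string to a human mind, which is continuously rewriting itself as it goes.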