Important correction: hallucinations are when the next most likely words don't happen to have a correct meaning. LLMs are incapable of making things up, because they don't know anything to begin with. They are just fancy autocorrect.
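To make the "fancy autocorrect" point concrete, here's a minimal toy sketch of next-token sampling (the vocabulary and probabilities below are invented for illustration, not taken from any real model): the procedure only ever picks a statistically plausible continuation, and nothing in it checks whether that continuation is true.

```python
import random

# Toy "language model": for a given context, a made-up distribution over
# possible next words. These probabilities are invented for illustration.
NEXT_WORD_PROBS = {
    ("the", "capital", "of", "australia", "is"): {
        "canberra": 0.55,   # happens to be correct, and also likely
        "sydney": 0.40,     # plausible-sounding but wrong -> a "hallucination"
        "melbourne": 0.05,
    },
}

def sample_next_word(context, rng=random.random):
    """Pick the next word purely by probability; truth never enters into it."""
    dist = NEXT_WORD_PROBS[tuple(context)]
    roll = rng()
    cumulative = 0.0
    for word, prob in dist.items():
        cumulative += prob
        if roll < cumulative:
            return word
    return word  # fall back to the last word on floating-point rounding

context = ["the", "capital", "of", "australia", "is"]
print(" ".join(context), sample_next_word(context))
```

Roughly 40% of the time this toy model confidently completes the sentence with "sydney". It isn't lying and it isn't mistaken, because it never knew the answer in the first place; it just sampled a likely-looking word.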
Traister101
Yes, and yet this misunderstanding is still extremely common.
People like to anthropomorphize things, so obviously people are going to anthropomorphize LLMs, but as things stand people actually believe that LLMs are capable of thinking, of making real decisions the way a thinking being does. Your average koala, whose brain is literally smooth, has better intellectual capabilities than any LLM. The koala can't create human-looking sentences, but it's capable of making actual decisions.