When an LLM fabricates a falsehood, that is not a malfunction at all. The machine is doing exactly what it has been designed to do: guess, and sound confident while doing it.
When LLMs get things wrong, they aren't hallucinating. They are bullshitting.
source: https://thebullshitmachines.com/lesson-2-the-nature-of-bullshit/index.html