[–] belit_deg@lemmy.world 12 points 1 day ago (4 children)

When an LLM fabricates a falsehood, that is not a malfunction at all. The machine is doing exactly what it has been designed to do: guess, and sound confident while doing it.

When LLMs get things wrong, they aren't hallucinating. They are bullshitting.

source: https://thebullshitmachines.com/lesson-2-the-nature-of-bullshit/index.html