this post was submitted on 09 Aug 2025
84 points (97.7% liked)
Hacker News
Posts from the RSS Feed of HackerNews.
The feed sometimes contains ads and posts that have been removed by the mod team at HN.
I like how confident it is. Now imagine that this is a topic you know nothing about and are relying on it to get information.
I really wish people understood how it works, so that they wouldn't rely on it for literally anything.
I tried putting together a research plan using an LLM. Nothing crazy: I just wanted it to help me structure my thoughts and write LaTeX for me. It was a horrible experience.
I gave it a reference paper, said "copy that methodology exactly," and then spelled out exactly which steps I wanted included.
It kept making bold claims, suggesting irrelevant methods, and proposing plainly wrong approaches. If I had no idea about the topic I might have believed it, because the thing is so confident. But if you know what you're doing, you can see they're bullshit machines.
It only seems confident if you treat it like a person. If you recognize it's a flawed machine, the language it uses shouldn't matter. The problem is that people treat it like a person, i.e., they assume its confident-sounding responses mean something.