For one month beginning on October 5, I ran an experiment: Every day, I asked ChatGPT 5 (more precisely, its "Extended Thinking" version) to find an error in "Today's featured article". In 28 of these 31 featured articles (90%), ChatGPT identified what I considered a valid error, often several. I have so far corrected 35 such errors.
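(As a concrete illustration of the daily loop described above, here is a minimal sketch using the Wikimedia featured-content feed and the OpenAI Python client. The model name, the prompt wording, and the use of the short "extract" field instead of the full article text are assumptions for illustration, not the author's actual setup.)

```python
import datetime
import requests
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def todays_featured_article(day: datetime.date) -> str:
    # The Wikimedia feed API returns the "Today's featured article" (tfa)
    # entry for a given date; "extract" is a plain-text summary, not the
    # full article the original experiment presumably checked.
    url = f"https://api.wikimedia.org/feed/v1/wikipedia/en/featured/{day:%Y/%m/%d}"
    resp = requests.get(url, headers={"User-Agent": "tfa-error-check/0.1"}, timeout=30)
    resp.raise_for_status()
    return resp.json()["tfa"]["extract"]

def find_error(article_text: str) -> str:
    # Model name and prompt are illustrative assumptions.
    response = client.chat.completions.create(
        model="gpt-5",
        messages=[{
            "role": "user",
            "content": "Find a factual error in the following Wikipedia featured "
                       "article text, and quote the sentence you believe is wrong:\n\n"
                       + article_text,
        }],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    text = todays_featured_article(datetime.date.today())
    print(find_error(text))
```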

[–] kalkulat@lemmy.world 11 points 1 day ago (1 children)

Finding inconsistencies is not so hard. Pointing them out might be a -little- useful. But resolving them against trustworthy sources can be a -lot- harder. Most science papers require privileged access. Many news stories may have been grounded in old, mistaken histories ... if not in outright guesses, distortions or even lies. (The older the history, the worse.)

And, since LLMs are usually incapable of citing any sources for their own (often batshit) claims -- where will 'the right answers' come from? I've seen LLMs, when questioned again, apologize that their previous answers were wrong.

[–] architect@thelemmy.club -1 points 1 day ago (1 children)

Which LLMs are incapable of citing sources?

[–] jacksilver@lemmy.world 7 points 1 day ago

All of them. If you're seeing sources cited, it means it's RAG (retrieval-augmented generation -- an LLM with extra retrieval machinery bolted on). Those extra bits make a big difference: the response is grounded in a select few retrieved references rather than drawing on everything the model picked up about the subject in training. A rough sketch of the pattern is below.
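(For anyone curious what that pipeline looks like in practice, here is a minimal sketch of retrieval-augmented generation. The toy document store, the keyword-overlap retrieval, the model name, and the prompt wording are all illustrative assumptions, not any particular product's implementation.)

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# A toy "document store": in a real RAG system this would be a search index
# or vector database holding the texts the model is allowed to cite.
documents = {
    "doc1": "The Eiffel Tower was completed in 1889.",
    "doc2": "The Great Wall of China is over 13,000 miles long.",
}

def retrieve(question: str, k: int = 1) -> list[tuple[str, str]]:
    # Illustrative retrieval: naive keyword overlap. Real systems use BM25 or
    # embedding similarity, but the shape of the pipeline is the same.
    scored = sorted(
        documents.items(),
        key=lambda kv: -len(set(question.lower().split()) & set(kv[1].lower().split())),
    )
    return scored[:k]

def answer(question: str) -> str:
    hits = retrieve(question)
    context = "\n".join(f"[{doc_id}] {text}" for doc_id, text in hits)
    # The model is told to answer only from the retrieved passages and to cite
    # them by id -- that is where the "sources" in a RAG response come from.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption; any chat model works here
        messages=[
            {"role": "system",
             "content": "Answer using only the passages below and cite them by id.\n" + context},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(answer("When was the Eiffel Tower completed?"))
```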