this post was submitted on 01 Dec 2025
70 points (72.4% liked)

Technology


For one month beginning on October 5, I ran an experiment: Every day, I asked ChatGPT 5 (more precisely, its "Extended Thinking" version) to find an error in "Today's featured article". In 28 of these 31 featured articles (90%), ChatGPT identified what I considered a valid error, often several. I have so far corrected 35 such errors.

[–] anamethatisnt@sopuli.xyz 7 points 2 days ago

Yeah, my morning brain was trying to say that when it's used as a tool by someone who can validate the output and act on it, it's often good. When it's used by someone who can't, or won't, validate the output and simply treats it as the finished product, it usually isn't.

Regarding your friend learning to use the terminal, I'd still recommend validating the output before using it. If they're asking genAI about flags for ls, then sure, no big deal. But if a genAI ends up switching sda and sdb in your dd command and wipes the wrong drive, you've only got yourself to blame for not checking the manual.
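A minimal sketch of that kind of pre-flight check before running a destructive dd. The device names are illustrative, and the actual dd here deliberately writes to a throwaway image file rather than a real disk:

```shell
# Before pointing dd at a real device, confirm the target's identity first:
#   lsblk -o NAME,SIZE,MODEL    # does /dev/sdb really have the size/model you expect?
#   blkid /dev/sdb              # does its label/UUID match the drive you mean to wipe?
# Only then run dd against the device you just verified.

# Safe demonstration: same dd invocation, but writing 1 MiB of zeros to a file.
dd if=/dev/zero of=demo.img bs=1M count=1 status=none
ls -l demo.img
```

The point isn't memorizing lsblk flags; it's that the one command the AI got wrong (the of= target) is also the one thing you can cheaply verify yourself before pressing Enter.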