this post was submitted on 18 Jul 2025
246 points (95.2% liked)

[–] match@pawb.social 9 points 2 days ago (4 children)

isn't this just paranoid schizophrenia? i don't think chatgpt can cause that

[–] Alphane_Moon@lemmy.world 17 points 2 days ago (2 children)

I have no professional expertise in this area, but I would speculate that the fellow was already predisposed to schizophrenia and the LLM just triggered it (that can happen with other things too, like psychedelic drugs).

[–] SkaveRat@discuss.tchncs.de 5 points 2 days ago

I'd say it either triggered on its own, or drugs potentially triggered it, and then he started using an LLM and found all the patterns to feed that schizophrenic paranoia. It's a very self-reinforcing loop.

[–] zzx@lemmy.world 2 points 2 days ago

Yup. LLMs aren't making people crazy, but they are making crazy people worse.

[–] nimble@lemmy.blahaj.zone 7 points 2 days ago

LLMs hallucinate and are generally willing to go down rabbit holes. So if you have some crazy theory, you're more likely to get a false positive from ChatGPT.

So I think it just exacerbates things more than the alternatives would.

[–] leftzero@lemmy.dbzer0.com 4 points 1 day ago

LLMs are obligate yes-men.

They'll support and reinforce whatever rambling or delusion you talk to them about, and provide “evidence” to support it (made up evidence, of course, but if you're already down the rabbit hole you'll buy it).

And they'll keep doing that as long as you let them, since they're designed to keep you engaged (and paying).

They're extremely dangerous for anyone with the slightest addictive, delusional, suggestible, or paranoid tendencies, and should be regulated as such (but won't).

[–] Skydancer@pawb.social 2 points 1 day ago (1 children)

Could be. I've also seen similar delusions in people with syphilis that went un- or under-treated.

[–] ScoffingLizard@lemmy.dbzer0.com 1 points 3 hours ago

Where tf are people not treated for syphilis?