this post was submitted on 25 Mar 2025
108 points (95.8% liked)

New research from OpenAI shows that heavy chatbot usage is correlated with loneliness and reduced socialization. Will AI companies learn from social networks' mistakes?

[–] MagicShel@lemmy.zip 32 points 2 days ago (1 children)

Note that these studies aren’t suggesting that heavy ChatGPT usage directly causes loneliness. Rather, it suggests that lonely people are more likely to seek emotional bonds with bots

The important question here is: do lonely people seek out interaction with AI, or does AI create lonely people? The article clearly acknowledges this, and then treats the latter as the likely conclusion. It definitely merits further study.

[–] taladar@sh.itjust.works 12 points 2 days ago (2 children)

Or does AI prey on lonely people much like other types of scams do?

[–] MagicShel@lemmy.zip 6 points 2 days ago (2 children)

It's not sentient and has no agenda. It's fair to suggest that services that advertise themselves as "AI companions" appeal to / prey on lonely people.

It's not a scam unless it purports to be a real person.

[–] taladar@sh.itjust.works 8 points 2 days ago

Well, I was using the term more in reference to the industry than to the actual software. The thought of the kind of AI we currently have having intentions of its own didn't even occur to me.

[–] Arkouda@lemmy.ca 5 points 2 days ago

It’s not sentient and has no agenda.

The humans who program them are and do.

[–] Enkers@sh.itjust.works 4 points 2 days ago

The AI industry certainly does.

If you're going to use an LLM, it's pretty straightforward to roll your own with something like LM Studio, though.
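
For anyone curious what "rolling your own" looks like: LM Studio (and similar local runners) can expose an OpenAI-compatible HTTP server on your machine, so a locally downloaded model can be queried with a few lines of Python. This is just a sketch; the port (1234) is LM Studio's usual default, and the model name is a placeholder you'd swap for whatever identifier your local setup shows.

```python
# Minimal sketch: chat with a locally hosted model through an
# OpenAI-compatible local server (e.g. LM Studio's). Assumes the default
# local address/port and that a model is already loaded.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # local server; port may differ on your setup
    api_key="lm-studio",                  # placeholder; no real key is needed locally
)

response = client.chat.completions.create(
    model="local-model",  # hypothetical name; use the identifier your runner reports
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Why might someone run an LLM locally?"},
    ],
)

print(response.choices[0].message.content)
```

Nothing leaves your machine with a setup like this, which is part of the appeal over hosted "AI companion" services.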