this post was submitted on 29 Jun 2025
481 points (95.8% liked)

A profound relational revolution is underway, not orchestrated by tech developers but driven by users themselves. Many of the 400 million weekly users of ChatGPT are seeking more than just assistance with emails or information on food safety; they are looking for emotional support.

“Therapy and companionship” have emerged as two of the most frequent applications for generative AI globally, according to the Harvard Business Review. This trend marks a significant, unplanned pivot in how people interact with technology.

[–] gerryflap@feddit.nl 7 points 1 day ago (2 children)

Better than nothing, I guess. Obviously it's a privacy nightmare. But therapy is hard to access nowadays, and I've noticed that many men are reluctant to take that step. It'd be preferable if they did, but if ChatGPT can at least provide an outlet for those emotions, it might just save a few people. Seeing men demolish themselves because they're too ashamed to seek help is something I've unfortunately seen quite often. Even though I'm aware of this, I still waited until it was way too late because I subconsciously didn't want to give in to the "weakness". I hate that men are conditioned this way; it costs lives.

[–] dsilverz@friendica.world 17 points 1 day ago

@gerryflap @bytesonbike

many men are reluctant to make that step

Sometimes it's not the patient who's to blame. I made that step, countless times since my childhood... I sought help... The result? Several diverging diagnoses and several medications that didn't work, until the most recent psychiatrist and psychologist a few months ago: the psychiatrist said there was "nothing" wrong with me (even though I had a fresh cut on my wrist), and the psychologist "struggled to find any complaints from me". So I simply gave up on seeking medical care (and "care" in general, human or otherwise). I don't use AI for therapy because, as a former programmer, I'm deeply aware of the Markov-chain and neural-network algorithms underneath, but sometimes their probabilistic outputs lead me to insights I couldn't get from any living Homo sapiens (such as the possibility that I have "Geschwind syndrome", a condition that will probably stay undiagnosed).

[–] Iceblade02@lemmy.world 3 points 1 day ago* (last edited 1 day ago)

It's possible to reduce the privacy issues by using the APIs with a local frontend. Since the APIs usually cater to companies rather than end consumers, they generally offer simple opt-outs for data logging.

It requires a bit of know-how, and you'll be paying for your LLM per use (not that bad actually; I've personally averaged <$10/yr in API costs), but at least you get to keep all your personal issues on your local device instead.

For a ChatGPT-like experience you probably want oobabooga's text-generation-webui, but there are others too; a minimal sketch of the DIY route is below.
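As a rough sketch of what a "local frontend" can mean at its simplest, here's a terminal chat loop against an OpenAI-compatible chat completions API. The endpoint URL, model name, and environment variable are just placeholders; swap them for whatever provider you actually use, and note that the logging opt-out itself is an account/provider setting, not something you set in the request.

```python
# Minimal local "frontend": a terminal chat loop against an OpenAI-compatible
# chat completions endpoint. URL, model, and env var are placeholders.
import os
import requests

API_URL = "https://api.openai.com/v1/chat/completions"  # or any compatible endpoint
MODEL = "gpt-4o-mini"                                    # example model name
API_KEY = os.environ["OPENAI_API_KEY"]                   # keep the key out of the code

def chat() -> None:
    # Conversation history lives only in this process's memory;
    # nothing is written to disk on your side.
    messages = [{"role": "system", "content": "You are a supportive listener."}]
    while True:
        user_input = input("you> ").strip()
        if user_input in {"", "quit", "exit"}:
            break
        messages.append({"role": "user", "content": user_input})
        resp = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            json={"model": MODEL, "messages": messages},
            timeout=60,
        )
        resp.raise_for_status()
        reply = resp.json()["choices"][0]["message"]["content"]
        messages.append({"role": "assistant", "content": reply})
        print(f"bot> {reply}\n")

if __name__ == "__main__":
    chat()
```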