that's easy to say, but when someone is in a crisis, I would be wrong to judge them for talking to an AI (a shitty, terrible solution) instead of a therapist, who can be unaffordable and also comes with the risk of being terrible.
a terrible therapist at least has an ethics board
a terrible therapist at least has evidence-based interventions on their side
a terrible therapist at least has the fact that ~80% of positive outcomes have nothing to do with the interventions or anything the therapist does besides show up and be cool (a statistic I remember quite well from grad school)
AI has none of these things
therapy isn't fucking magic. it's a relationship. you can't have a relationship with an LLM. there's no such thing as AI therapy, you're just training it to tell you about CBT worksheets while you bitch about your problems like you're in a nail salon
The best therapist in the world can still end your career by causing your clearance to be revoked or rendering you unqualified for your unit’s mission.
(Suicide is a big problem in the military, I lost a buddy to it.)
The cheapest therapist in the world may still not be covered by your insurance. (And nothing you write in reply will alter that.)
They should work to make AI therapy better while keeping it totally anonymous. If it were really good, it would be the number one use case for running a local, disconnected, air-gapped LLM: perfectly private therapy with no “we just use telemetry to improve our product” bullshit.
Then maybe a lot more men would seek help/talk about their thoughts and feelings.
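A minimal sketch of what that kind of fully offline setup could look like, assuming the llama-cpp-python bindings and a GGUF model file already downloaded to disk (the model path, prompt, and settings here are placeholders, not a recommendation):

```python
# Minimal sketch: a fully local, offline chat loop with llama-cpp-python.
# Assumes a GGUF model file already on disk (the path below is hypothetical);
# nothing here makes a network call, so the conversation never leaves the machine.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/local-model.gguf",  # hypothetical local model file
    n_ctx=4096,                              # context window for the conversation
    verbose=False,
)

# Running conversation history, kept only in memory.
history = [
    {"role": "system", "content": "You are a supportive listener."},
]

while True:
    user_msg = input("> ")
    if not user_msg:
        break
    history.append({"role": "user", "content": user_msg})
    reply = llm.create_chat_completion(messages=history)
    text = reply["choices"][0]["message"]["content"]
    history.append({"role": "assistant", "content": text})
    print(text)
```

The point of the sketch is only that inference and the transcript both stay on the user's own hardware; there is no telemetry unless you add it.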
I'm not in the military, but I've worked with TS/SCI-cleared folks at a tech company, and this sounds odd to me. Can you explain a little more here? What's an example of a problem that, if discussed in therapy, could result in revocation of a security clearance?
They can’t make it better… you can’t have a relationship with an autocorrect
ok.
but the problem is that real therapy is expensive and inaccessible, while AI is freely accessible, even though it's shit.
and OpenAI is profiting from that.
I'm just saying the blame should be aimed at the corporations and the healthcare system, rather than at someone who is desperate for help.
a terrible therapist can lock you in a room; some people don't want that risk