this post was submitted on 29 Jun 2025
Technology
The best therapist in the world can still end your career by causing your clearance to be revoked or rendering you unqualified for your unit’s mission.
(Suicide is a big problem in the military, I lost a buddy to it.)
The cheapest therapist in the world may still not be covered by your insurance. (And nothing you write in reply will alter that.)
They should work on making AI therapy better while keeping it totally anonymous. If it were really good, it would be the number one use case for running a local, disconnected, air-gapped LLM: perfectly private therapy with no “we just use telemetry to improve our product” bullshit.
Then maybe a lot more men would seek help/talk about their thoughts and feelings.
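The fully local setup that comment imagines is mostly ordinary plumbing: keep the whole conversation in memory on your own machine and feed it to a locally running model, so nothing ever crosses the network. Here's a minimal sketch of that loop; the `generate` function is a stub I made up to stand in for a local runtime such as llama.cpp or Ollama (my examples, not anything named in the thread):

```python
# Sketch of a fully local chat loop: all state lives in this process,
# nothing is sent anywhere. `generate` is a placeholder for a call into
# a local inference runtime (e.g. llama-cpp-python) -- a hypothetical
# choice for illustration, not the thread's recommendation.

def generate(prompt: str) -> str:
    """Stub for a local model call; swap in a real local runtime here."""
    return "(model reply)"

def build_prompt(history: list[tuple[str, str]], user_msg: str) -> str:
    """Flatten the running conversation into one prompt string."""
    lines = ["You are a supportive listener. Nothing leaves this machine."]
    for role, text in history:
        lines.append(f"{role}: {text}")
    lines.append(f"user: {user_msg}")
    lines.append("assistant:")
    return "\n".join(lines)

def chat_turn(history: list[tuple[str, str]], user_msg: str) -> str:
    """One round trip: build prompt, call the local model, record both sides."""
    reply = generate(build_prompt(history, user_msg))
    history.append(("user", user_msg))
    history.append(("assistant", reply))
    return reply
```

Because the history list is the only record, deleting it (or just not persisting it) is the whole privacy story; there's no telemetry layer to opt out of.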
I'm not in the military, but I've worked with TS/SCI-cleared folks at a tech company, and this sounds odd to me. Can you explain a little more here? What's an example of a problem that, if discussed in therapy, could result in revocation of a security clearance?
They can’t make it better… you can’t have a relationship with an autocorrect