this post was submitted on 05 Jun 2025
934 points (98.6% liked)

Not The Onion


Welcome

We're not The Onion! Not affiliated with them in any way! Not operated by them in any way! All the news here is real!

The Rules

Posts must be:

  1. Links to news stories from...
  2. ...credible sources, with...
  3. ...their original headlines, that...
  4. ...would make people who see the headline think, “That has got to be a story from The Onion, America’s Finest News Source.”

Please also avoid duplicates.

Comments and post content must abide by the server rules for Lemmy.world and generally abstain from trollish, bigoted, or otherwise disruptive behavior that makes this community less fun for everyone.

And that’s basically it!

[–] deathbird@mander.xyz 6 points 1 day ago (1 children)

Sue that therapist for malpractice! Wait... oh.

[–] jagged_circle@feddit.nl 3 points 1 day ago (2 children)

Pretty sure you can sue the AI company.

[–] Case@lemmynsfw.com 3 points 1 day ago (1 children)

I mean, in theory... isn't that a company practicing medicine without the proper credentials?

I've worked in IT for medical companies throughout my life, and my wife is a clinical tech.

There is shit we just CANNOT say due to legal liability.

Like, my wife can generally tell what's going on with a patient; however, she does not have the credentials or authority to diagnose.

That includes telling the patient or their family what is going on. That is the doctor's job. That is the doctor's responsibility. That is the doctor's liability.

[–] webghost0101@sopuli.xyz 3 points 1 day ago (1 children)

Pretty sure it's in the ToS that it can't be used for therapy.

It used to be even worse. Older versions of ChatGPT would simply refuse to continue the conversation at any mention of suicide.

[–] ivanafterall@lemmy.world 6 points 1 day ago

The article doesn't seem to specify whether Pedro had earned the treat for himself? I don't see the harm in a little self-care/occasional treat?

[–] Cattail@lemmy.world 5 points 22 hours ago (1 children)

Sometimes I have a hard time waking up, so a little meth helps.

[–] TheDeadlySquid@lemm.ee 5 points 1 day ago

And thus the flaw in AI is revealed.

[–] LovableSidekick@lemmy.world 5 points 2 days ago* (last edited 2 days ago) (6 children)

But meth is only for Saturdays. Or Tuesdays. Or days with "y" in them.

[–] GreenKnight23@lemmy.world 3 points 2 days ago

every day is methday if you're spun out enough.

[–] pixxelkick@lemmy.world 4 points 2 days ago* (last edited 2 days ago) (4 children)

Any time an article posts shit like this but neglects to include the full context, it reminds me how bad journalism is today, if you can even call it that.

If I try, not even that hard, I can get GPT to state that Hitler was a cool guy and was doing the right thing.

ChatGPT isn't anything specific other than a token predictor; you can literally make it say anything you want if you know how. It's not hard.
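
For anyone wondering, "token predictor" is meant literally. Below is a minimal sketch of what that looks like, assuming the Hugging Face transformers library and using the openly available GPT-2 model as a stand-in (ChatGPT's own weights aren't public; the prompt is arbitrary). The model does nothing but assign a probability to every possible next token; sampling from that distribution, steered by whatever context you feed it, is all the "saying" it ever does.

```python
# A minimal sketch of "it's just a token predictor," assuming the Hugging Face
# transformers library and GPT-2 as a stand-in for ChatGPT-class models.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    # One forward pass yields a score for every vocabulary token at each position.
    logits = model(**inputs).logits

# The model's entire job: a probability distribution over the *next* token.
next_token_probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(next_token_probs, 5)
for prob, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode(int(token_id))!r:>10}  {prob.item():.3f}")
```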

So if you write an article about how "GPT said this" or "GPT said that," you'd better include the full context, or I'll assume you're 100% bullshit.

[–] thirdBreakfast@lemmy.world 3 points 1 day ago

> afterallwhynot.jpg
