this post was submitted on 05 Jun 2025
938 points (98.7% liked)

Not The Onion

[–] webghost0101@sopuli.xyz 27 points 2 days ago* (last edited 2 days ago) (1 children)

The LLM models themselves aren't; they don't really have a focus, and they don't discriminate.

The AI chatbots that are built on top of those models absolutely are, and it's no secret.

What confuses me is that the article points to Llama 3, which is a Meta-owned model, but not to any specific chatbot.

This could be an official Facebook AI (do they have one?), but it could also be a case of "Bro, I used this self-hosted model to build a therapist, wanna try it for your meth problem?"

Heck, I could even see a dealer pretending to help customers who are trying to kick the habit.

[–] smee@poeng.link 2 points 1 day ago

For all we know, they could have self-hosted "Llama3.1_NightmareExtreme_RPG-StoryHorror8B_Q4_K_M" and instructed it to take on the role of a therapist.
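
For illustration only, here's a minimal sketch of what that could look like with llama-cpp-python, assuming someone has a local GGUF file; the file name below just reuses the joke model name from this comment (it isn't a real checkpoint), and the system prompt is made up:

```python
# Hypothetical sketch: "self-host a model and instruct it to play therapist".
# Assumes llama-cpp-python is installed and a GGUF file exists at this path;
# the path reuses the joke model name above and is not a real checkpoint.
from llama_cpp import Llama

llm = Llama(
    model_path="./Llama3.1_NightmareExtreme_RPG-StoryHorror8B_Q4_K_M.gguf",
    n_ctx=4096,
)

# A single system prompt is all it takes to recast a general chat model as a
# "therapist" -- no safety review, no licensing, no clinical oversight.
messages = [
    {"role": "system", "content": "You are a supportive addiction-recovery therapist."},
    {"role": "user", "content": "I'm three days clean and I'm struggling."},
]

reply = llm.create_chat_completion(messages=messages)
print(reply["choices"][0]["message"]["content"])
```

Which is kind of the point: anyone can stand this up in an afternoon, and the output depends entirely on whatever model and prompt they happened to pick.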