*because that’s what the prompt they were testing was designed to elicit.
Not The Onion
Welcome
We're not The Onion! Not affiliated with them in any way! Not operated by them in any way! All the news here is real!
The Rules
Posts must be:
- Links to news stories from...
- ...credible sources, with...
- ...their original headlines, that...
- ...would make people who see the headline think, “That has got to be a story from The Onion, America’s Finest News Source.”
Please also avoid duplicates.
Comments and post content must abide by the server rules for Lemmy.world and generally abstain from trollish, bigoted, or otherwise disruptive behavior that makes this community less fun for everyone.
And that’s basically it!
yup.
its so bs thad for som reason the peeps r treatin this as if its a new thing...
like - if i prompt my qwen to be bold, have a moral compass n take actions accordin to thad..
yea - itll tell peeps bout my affair.. ~if~ ~i~ ~had~ ~one..~
EDIT: dis entices me to do similar bs now... thad be funi >v<
That's just an ad.
Anthropic keeps pulling this bullshit line of advertising. LLMs will make up stories when you ask them to.
Good thing no one found out! /s
What is even the point of this research?
Seemingly to prove the people who have the skill to build an AI system are exactly the people you shouldn’t let run an AI system.