this post was submitted on 07 Jul 2025
1251 points (99.5% liked)

People Twitter

7654 readers

People tweeting stuff. We allow tweets from anyone.

RULES:

  1. Mark NSFW content.
  2. No doxxing people.
  3. Must be a pic of the tweet or similar. No direct links to the tweet.
  4. No bullying or international politics.
  5. Be excellent to each other.
  6. Provide an archived link to the tweet (or similar) being shown if it's a major figure or a politician.

founded 2 years ago
[–] FelixCress@lemmy.world 73 points 3 days ago (7 children)

Fascists don't like truth. Facts don't fit their narrative.

[–] EmptySlime@lemmy.blahaj.zone 30 points 3 days ago (4 children)

The thing that gets me is they've apparently done multiple rounds of "correcting" Grok for being too "Woke" and it just keeps happening!

Reality has a well-known liberal bias headass.

[–] kautau@lemmy.world 12 points 3 days ago (3 children)

Ironically, I think to truly train an LLM the way fascists would want, they'd need more content, but there's not enough original fascist revisionist content, so they'd need an LLM to generate all or most of the training data, which would lead to https://en.wikipedia.org/wiki/Model_collapse
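To make that feedback loop concrete, here's a deliberately oversimplified sketch (a Gaussian stands in for the generative model, and the sample size and seed are arbitrary): each generation is fit only to data sampled from the previous one, so estimation error compounds.

```python
# Toy illustration of model collapse: each "generation" is fit only to
# samples drawn from the previous generation's model. With finite samples,
# estimation error accumulates and the fitted distribution drifts away
# from the real one.
import numpy as np

rng = np.random.default_rng(0)
n_samples = 200            # training-set size per generation (arbitrary)
mu, sigma = 0.0, 1.0       # parameters of the "real" data distribution

for generation in range(1, 11):
    synthetic = rng.normal(mu, sigma, n_samples)   # data produced by the previous model
    mu, sigma = synthetic.mean(), synthetic.std()  # fit the next model to that data only
    print(f"generation {generation:2d}: mean={mu:+.3f}, std={sigma:.3f}")
```

Run it with fewer samples or more generations and the drift gets worse; the linked article describes the same dynamic for actual generative models.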

[–] ChicoSuave@lemmy.world 25 points 3 days ago (1 children)

"Fascist burn too many books to train a fascist LLM" is a great joke.

[–] Tar_alcaran@sh.itjust.works 4 points 3 days ago

The big problem with training LLMs is that you need good data, but there's so much of it that you can't realistically separate all the "good" from all the "bad" by hand. You end up using the full set of data plus a much, much smaller set of hand-tagged "good" data.
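One common way that split gets used in practice (just a sketch, with made-up documents and an arbitrary 0.5 threshold): train a small quality classifier on the hand-tagged set, then let it score and filter the huge untagged pile.

```python
# Sketch: a tiny hand-tagged "good"/"bad" set trains a quality classifier,
# which then filters the much larger untagged corpus. Documents, labels,
# and the 0.5 cutoff are all invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

tagged_docs = [
    "well sourced article with careful analysis and citations",  # good
    "clear explanation backed by reliable sources",              # good
    "buy now click here free money spam",                        # bad
    "spam spam click click free free",                           # bad
]
tagged_labels = [1, 1, 0, 0]

untagged_corpus = [
    "careful analysis with reliable sources and citations",
    "click here for free money buy now",
    "short post about dinner",
]

vectorizer = TfidfVectorizer()
quality_clf = LogisticRegression().fit(vectorizer.fit_transform(tagged_docs), tagged_labels)

scores = quality_clf.predict_proba(vectorizer.transform(untagged_corpus))[:, 1]
kept = [doc for doc, score in zip(untagged_corpus, scores) if score > 0.5]
print(kept)  # only documents the classifier scores as "good" survive the filter
```

Real pipelines are bigger and messier, but the shape is the same: a small curated set does the judging, and the firehose gets filtered rather than read.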
