this post was submitted on 13 Mar 2025
547 points (98.8% liked)

Technology



cross-posted from: https://infosec.pub/post/24994013

CJR study shows AI search services misinform users and ignore publisher exclusion requests.

[–] TheGoldenGod@lemmy.world 44 points 17 hours ago (5 children)

Training AI with internet content was always going to fail, as at least 60% of users online are trolls. It's even dumber than expecting you can have a child from anal sex.

[–] Rivalarrival@lemmy.today 18 points 17 hours ago (1 children)

It's even dumber than expecting you can have a child from anal sex.

I'm not nearly as sure of this today as I was before the election.

[–] musubibreakfast@lemm.ee 10 points 16 hours ago (2 children)

Because of what you just wrote, some dumbass is going to try to have a child through anal sex after doing a Google search.

[–] T156@lemmy.world 13 points 16 hours ago (2 children)
[–] pyre@lemmy.world 3 points 2 hours ago

they've been having sex the wrong way

that's subjective

[–] NikkiDimes@lemmy.world 2 points 2 hours ago

There's no way this isn't bullshit. Please let this be bullshit...

[–] Imgonnatrythis@sh.itjust.works 7 points 16 hours ago (1 children)

I'm gonna go ahead and try without a Google search.

[–] musubibreakfast@lemm.ee 5 points 16 hours ago

I believe in you, please name your child after me if it works out.

There was that one time an AI gave a pizza recipe that included gluing the cheese down with Elmer's glue, because someone had once suggested it as a joke on Reddit.

There will never be such a thing as a useful LLM.

[–] desktop_user@lemmy.blahaj.zone 2 points 17 hours ago

but you can; it's about as likely as having one from a thigh job, but it's technically not impossible.

[–] noodlejetski@lemm.ee 1 points 17 hours ago

where do you think lawyers come from?