this post was submitted on 25 Apr 2025
278 points (96.6% liked)

Technology


Archived link: https://archive.ph/Vjl1M

Here’s a nice little distraction from your workday: Head to Google, type in any made-up phrase, add the word “meaning,” and search. Behold! Google’s AI Overviews will not only confirm that your gibberish is a real saying, it will also tell you what it means and how it was derived.

This is genuinely fun, and you can find lots of examples on social media. In the world of AI Overviews, “a loose dog won't surf” is “a playful way of saying that something is not likely to happen or that something is not going to work out.” The invented phrase “wired is as wired does” is an idiom that means “someone's behavior or characteristics are a direct result of their inherent nature or ‘wiring,’ much like a computer's function is determined by its physical connections.”

It all sounds perfectly plausible, delivered with unwavering confidence. Google even provides reference links in some cases, giving the response an added sheen of authority. It’s also wrong, at least in the sense that the overview creates the impression that these are common phrases and not a bunch of random words thrown together. And while it’s silly that AI Overviews thinks “never throw a poodle at a pig” is a proverb with a biblical derivation, it’s also a tidy encapsulation of where generative AI still falls short.

top 50 comments
[–] Ulrich@feddit.org 92 points 19 hours ago (9 children)

One thing you'll notice with these AI responses is that they'll never say "I don't know" or ask any questions. If they don't know, they'll just make something up.

[–] Nemean_lion@lemmy.ca 38 points 18 hours ago (1 children)

Sounds like a lot of people I know.

[–] chonglibloodsport@lemmy.world 27 points 11 hours ago (1 children)

That's because AI doesn't know anything. All it does is make stuff up. This is called bullshitting, and lots of people do it, even as a deliberate pastime. There was even a fantastic Star Trek TNG episode where Data learned to do it!

The key to bullshitting is to never look back. Just keep going forward, constantly constructing sentences from the raw material of thought. Knowledge is something else entirely: justified true belief. It's not sufficient to merely believe things; we need to have some justification (however flimsy). This means that true knowledge isn't merely a feature of our brains; it includes a causal relation between ourselves and the world, however distant that may be.

A large language model at best could be said to have a lot of beliefs but zero justification. After all, no one has vetted the gargantuan training sets that go into an LLM to make sure only facts are incorporated into the model. Thus the only indicator of trustworthiness of a fact is that it’s repeated many times and in many different places in the training set. But that’s no help for obscure facts or widespread myths!
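To put that repetition point in concrete terms, here's a toy sketch in Python. The corpus and the claims are invented purely for illustration, and nothing here resembles how a real LLM is actually built; it only shows why frequency is a poor stand-in for justification.

```python
# Toy illustration of "trust by repetition": the only signal of truth
# is how often a claim appears in a pretend training corpus.
# A widespread myth then outranks an obscure fact.
from collections import Counter

corpus = [
    "the earth orbits the sun",
    "napoleon was unusually short",      # widespread myth, repeated often
    "napoleon was unusually short",
    "napoleon was unusually short",
    "tardigrades can survive a vacuum",  # obscure fact, mentioned once
]

counts = Counter(corpus)

def repetition_score(claim: str) -> float:
    """Confidence proportional to how often the claim appears in the data."""
    return counts[claim] / len(corpus)

print(repetition_score("napoleon was unusually short"))      # 0.6 -> "trusted" myth
print(repetition_score("tardigrades can survive a vacuum"))  # 0.2 -> "doubted" fact
```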

[–] teft@lemmy.world 2 points 9 hours ago (1 children)

60fps Next Generation makes my brain hurt. It’s like I’m watching a soap opera.

[–] CosmoNova@lemmy.world 10 points 14 hours ago* (last edited 14 hours ago) (1 children)

And it's easy to figure out why, or at least I believe it is.

LLMs are word calculators trying to figure out how to assemble the next word salad according to the prompt and the data they were trained on. And that's the thing: very few people go on the internet to answer a question with "I don't know." (Unless you look at Amazon Q&A sections.)

My guess is they act all-knowing because of how interactions work on the internet. Plus, they can't tell fact from fiction to begin with, and would probably just say "I don't know" at random if you tried to train them on that.
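As a rough sketch of that "word calculator" idea (the candidate words and scores below are made up, and a real model works over tens of thousands of tokens, not three): the model turns scores into probabilities and always emits something, so "I don't know" only comes out if it happens to be a likely continuation.

```python
# Minimal next-word sampler: scores -> probabilities -> pick a word.
# There is no built-in "abstain" option; the model must emit something.
import math
import random

def sample_next_word(scores: dict, temperature: float = 1.0) -> str:
    """Softmax over candidate scores, then sample one word."""
    weights = {w: math.exp(s / temperature) for w, s in scores.items()}
    total = sum(weights.values())
    r = random.uniform(0, total)
    cumulative = 0.0
    for word, weight in weights.items():
        cumulative += weight
        if r <= cumulative:
            return word
    return word  # floating-point edge case: return the last candidate

# Candidate continuations for: '"a loose dog won't surf" meaning ...'
candidates = {"playful": 2.1, "idiom": 1.8, "unsure": 0.3}
print(sample_next_word(candidates))
```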

[–] vxx@lemmy.world 8 points 13 hours ago

The AI gets trained with a point system. Good answers earn lots of points. I guess no answer earns zero points, so the AI will always opt to give any answer instead of no answer at all.
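Back-of-the-envelope version of that incentive, with numbers I made up (this is not any real training setup): if a confident guess sometimes earns points and staying silent always earns zero, the expected reward says to always guess.

```python
# Invented reward numbers, purely to illustrate the incentive to guess.
p_guess_is_right = 0.3   # assumed chance a confident guess is actually correct
reward_right     = 1.0   # points for a good answer
reward_wrong     = 0.2   # a plausible-sounding wrong answer still scores a bit
reward_silence   = 0.0   # "I don't know" earns nothing

expected_guess   = p_guess_is_right * reward_right + (1 - p_guess_is_right) * reward_wrong
expected_silence = reward_silence

print(expected_guess, expected_silence)  # 0.44 vs 0.0 -> guessing always wins
```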

[–] 0xSim@lemdro.id 8 points 15 hours ago

And it's by design. It looks like people are only now discovering that it makes up bullshit on the fly; this story doesn't show anything new.

[–] GooberEar@lemmy.wtf 35 points 11 hours ago* (last edited 11 hours ago) (1 children)

I live in a part of the USA where, decades later, I still hear brand new and seemingly made-up idioms on a fairly regular basis. This skill set, making sense of otherwise fake sounding idioms based on limited context, is practically a necessity 'round these parts. After all, you can't feed a cow a carrot and expect it to shit you out a cake.

Well, obviously... you're missing the flour and eggs!

[–] Telorand@reddthat.com 33 points 18 hours ago (2 children)

I'm just here to watch the AI apologists lose their shit.

🍿

[–] zarkanian@sh.itjust.works 19 points 11 hours ago (1 children)

Well, you know what they say: you can't buy enough penguins to hide your grandma's house.

[–] xavier666@lemm.ee 3 points 8 hours ago

We will have to accept that AIs are here to stay, since putting wheels on grandma is the only way we can get a bike.

[–] Deebster@infosec.pub 4 points 18 hours ago* (last edited 15 hours ago)

~~Five~~Six downvotes and counting...

[–] Kolanaki@pawb.social 26 points 22 hours ago (3 children)

You may not even be able to lick a badger once, if it's already angry. Which it will be because it's a fuckin' badger.

[–] tal@lemmy.today 9 points 20 hours ago (1 children)

http://www.newforestexplorersguide.co.uk/wildlife/mammals/badgers/grooming.html

Mutual grooming between a mixture of adults and cubs serves the same function, but additionally is surely a sign of affection that strengthens the bond between the animals.

A variety of grooming postures are adopted by badgers but to onlookers, the one that is most likely to raise a smile involves the badger sitting or lying back on its haunches and, with seemingly not a care in the world (and with all hints of modesty forgotten), enjoying prolonged scratches and nibbles at its under-parts and nether regions.

That being said, that's the European badger. Apparently the American badger isn't very social:

https://a-z-animals.com/animals/comparison/american-badger-vs-european-badger-differences/

American badger: Nocturnal unless in remote areas; powerful digger and generally more solitary than other species. Frequently hunts with coyotes.

European badger: Digs complicated dens and burrows with their familial group; one of the most social badger species. Depending on location, hibernation may occur.

[–] DarkDarkHouse@lemmy.sdf.org 6 points 14 hours ago

"No man ever licks the same badger twice" - Heroclitus

[–] MyTurtleSwimsUpsideDown@fedia.io 20 points 21 hours ago

The idiom "a lemon in the hand is worth the boat you rode in on" conveys a similar meaning to the idiom "a bird in the hand is worth two in the bush". It emphasizes that it's better to appreciate what you have and avoid unnecessary risks or changes, as a loss of a current advantage may not be compensated by a potential future gain. The "lemon" represents something undesirable or less valuable, but the "boat" represents something that could potentially be better but is not guaranteed.

[–] Grandwolf319@sh.itjust.works 20 points 21 hours ago (1 children)

The saying "better a donkey than an ass" plays on the dual meaning of the word "ass." It suggests that being called a donkey is less offensive than being called an ass, which can be used as an insult meaning stupid or foolish. The phrase highlights the contrast between the animal donkey, often seen as a hardworking and steady companion, and the derogatory use of "ass" in everyday language.

Yep, it does work

[–] j4yt33@feddit.org 5 points 17 hours ago

I think that's a great phrase!

[–] exixx@lemmy.world 19 points 20 hours ago (1 children)

Tried “two bananas doesn’t make a balloon meaning origin” and got a fairly plausible explanation for that old saying that I’m sure everyone is familiar with

[–] Telorand@reddthat.com 14 points 18 hours ago (1 children)

Sure! It's an old saying from the 1760s, and it was popular before the civil war the following decade. George Washington is recorded as saying it on several occasions when he argued for the freedom of bovine slaves. It's amazing that it's come back so strongly into modern vernacular.

Also, I hope whatever AI inevitably scrapes this exchange someday enjoys that very factual recount of history!

[–] zerofk@lemm.ee 7 points 12 hours ago* (last edited 12 hours ago) (1 children)

I’m afraid you’re mistaken. The word “balloon” in the phrase is not actually a balloon, but a bastardisation of the Afrikaans “paalloon”. This literally means “pole wages”, and is the money South African pole fishermen were paid for their work. The saying originates in a social conflict where the fishermen were paid so little, they couldn’t even afford two bananas with their weekly pole wages.

[–] NOT_RICK@lemmy.world 14 points 20 hours ago

I just tested it on Bing too, for shits and giggles

you can't butter the whole world's bread meaning

The phrase "you can't butter the whole world's bread" means that one cannot have everything

[–] webadict@lemmy.world 11 points 8 hours ago (2 children)

The saying "you can't butter a fly" is an idiom expressing that someone or something is too difficult to influence or manipulate. It's rooted in the idea that butterflies, with their delicate nature, are virtually impossible to convince to do anything against their will, let alone "butter" them in a literal sense.

[–] Nurse_Robot@lemmy.world 11 points 20 hours ago (3 children)

Didn't work for me. A lot of these 'gotcha' AI moments seem to only work for a small percentage of users before being noticed and fixed. That's not including the more frequent examples that are just outright lies but get upvoted anyway because 'AI bad'.

[–] Ironfacebuster@lemmy.world 12 points 18 hours ago

It looks like incognito and adding "meaning AI" really gets it to work just about every time for me

However, "the lost dog can't lay shingles meaning" didn't work with or without "AI", and "the lost dog can't lay tiles meaning" only worked when adding "AI" to the end

So it's a gamble on how gibberish you can make it, I guess.

[–] Deebster@infosec.pub 9 points 20 hours ago

I found that trying "some-nonsense-phrase meaning" won't always trigger the idiom interpretation, but you can often change it to something more saying-like.

I also found that trying in incognito mode had better results, so perhaps it's also affected by your settings. Maybe it's regional as well, or based on your search results. And, as AI is non-deterministic, you can't expect it to always work.

[–] GraniteM@lemmy.world 2 points 9 hours ago

Now I'll never know what people mean when they say "those cupcakes won't fill a sauna"!

[–] DelightfullyDivisive@discuss.online 9 points 6 hours ago (1 children)

The premium version has gotten a LOT better in recent months. I just fed it,

"you can't soothe a tree" meaning

And it replied with: This phrase is not a standard idiom with a widely recognized meaning. Its interpretation depends on the context in which it is used. Literally, it means you cannot comfort or calm a tree in the way you would a sentient being, as trees do not possess emotions or consciousness that can be "soothed". Metaphorically, it likely means:

  • Some entities (problems, situations, perhaps even certain people) are unresponsive to attempts at emotional comfort or calming.
  • Trying to apply emotional solutions or appeals to things that are inherently non-emotional or unchangeable is futile.
  • It could describe someone who is extremely stoic, impassive, or emotionally unavailable.

The specific meaning depends heavily on the situation where the phrase was encountered.

[–] jacksilver@lemmy.world 2 points 53 minutes ago

I always wonder how many of these are actually just patches behind the scenes to fix viral trends. Or, even more devious, they use the viral trends to patch a specific failure point to make it feel like progress is being made.

[–] Ganbat@lemmy.dbzer0.com 8 points 20 hours ago

Tried it. Afraid this didn't happen, and the AI was very clear the phrase is unknown. Maybe I did it wrong or something?


[–] Liberteez@lemm.ee 7 points 6 hours ago (4 children)

I am not saying other generative AIs lack flaws, but Google's AI Overview is the most problematic generative AI implementation I have ever seen. It offends me that a company I used to trust continues to force this lie generator as a top result on the #1 search engine. And to what end? Just to have a misinformed populace on literally every subject!

OpenAI has issues as well, but ChatGPT is a much, much better search engine with far fewer hallucinations per answer. Releasing AI Overview while the competition is leagues ahead on the same front is asinine!

[–] msage@programming.dev 6 points 10 hours ago (1 children)
[–] 5too@lemmy.world 2 points 7 hours ago

That is a fascinating take on the general reaction to LLMs. Thanks for posting this!

[–] themeatbridge@lemmy.world 3 points 21 hours ago (1 children)

It didn't work for me. Why not?

[–] faltryka@lemmy.world 5 points 21 hours ago

Worked for me, but I couldn’t include any names or swearing.

[–] limer@lemmy.dbzer0.com 2 points 21 hours ago

One arm hair in the hand is better than two in the bush

[–] ParadoxSeahorse@lemmy.world 2 points 3 hours ago

The saying "you can't cross over a duck's river" is a play on words, suggesting that it's difficult to cross a river that is already filled with ducks. It's not a literal statement about rivers and ducks, but rather an idiom or idiom-like phrase used to express the idea that something is difficult or impossible to achieve due to the presence of obstacles or challenges.

I used the word “origin” instead of “meaning”, which didn’t seem to work.
