this post was submitted on 26 Jul 2025
340 points (98.6% liked)

Technology

[–] spankmonkey@lemmy.world 69 points 23 hours ago (5 children)

Or if you are set on using AI Overviews to research products, then be intentional about asking for negatives and always fact-check the output as it is likely to include hallucinations.

If it is necessary to fact check something every single time you use it, what benefit does it give?

[–] A_norny_mousse@feddit.org 31 points 22 hours ago

None. None at all.

[–] brsrklf@jlai.lu 19 points 22 hours ago* (last edited 22 hours ago)

None. It's made with the clear intention of substituting itself for actual search results.

If you don't fact-check it, it's dangerous and/or a thinly disguised ad. If you do fact-check it, it brings absolutely nothing that you couldn't find on your own.

Well, except hallucinations, of course.

[–] Feyd@programming.dev 8 points 13 hours ago (1 children)

That is my entire problem with LLMs and LLM-based tools. I get especially salty when someone sends me output from one and I confirm it's lying in 2 minutes.

[–] spankmonkey@lemmy.world 5 points 13 hours ago

"Thank you for wasting my time."

[–] artyom@piefed.social -2 points 18 hours ago (1 children)

It hasn't stopped anyone from using ChatGPT, which has become Google's biggest competitor since the inception of web search.

So yes, it's dumb, but they kind of have to do it at this point. And they need everyone to know it's available from the site they're already using, so they push it on everyone.

[–] spankmonkey@lemmy.world 7 points 16 hours ago (1 children)

No, they don't have to use defective technology just because everyone else is.

[–] XTL@sopuli.xyz -5 points 19 hours ago (3 children)

It might be able to give you tables or otherwise collated sets of information about multiple products etc.

I don't know if Google does, but LLMs can. Also do unit conversions. You probably still want to check the critical ones. It's a bit like using an encyclopedia or a catalog except more convenient and even less reliable.

[–] spankmonkey@lemmy.world 10 points 19 hours ago* (last edited 19 hours ago) (1 children)

Google had a feature for converting units way before the AI boom and there are multiple websites that do conversions and calculations with real logic instead of LLM approximation.

It is more like asking a random person who will answer whether they know the right answer or not. An encyclopedia or catalog at least has some time-frame context of when it was published.

Putting the data into tables and other formats isn't helpful if the data is wrong!
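The "real logic instead of LLM approximation" point is easy to illustrate: a deterministic converter is just a lookup table of exact factors plus arithmetic, so the same input always gives the same correct output. A minimal sketch (the factor table and function names are illustrative, not from any particular site):

```python
# Deterministic unit conversion: exact defined factors, no statistical guessing.
CONVERSIONS = {
    ("km", "mi"): 1 / 1.609344,    # international mile is defined as 1.609344 km
    ("kg", "lb"): 1 / 0.45359237,  # avoirdupois pound is defined as 0.45359237 kg
}

def convert(value: float, src: str, dst: str) -> float:
    """Convert value from src to dst using exact conversion factors."""
    if (src, dst) == ("c", "f"):   # temperature is affine, not a pure ratio
        return value * 9 / 5 + 32
    return value * CONVERSIONS[(src, dst)]

print(convert(10, "km", "mi"))   # ~6.2137
print(convert(100, "c", "f"))    # 212.0
```

An LLM, by contrast, produces these digits token by token from learned statistics, which is exactly why the critical ones need checking.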

[–] A_norny_mousse@feddit.org 3 points 15 hours ago

a feature for converting units

So does DDG

[–] Feyd@programming.dev 2 points 13 hours ago* (last edited 13 hours ago) (2 children)

You can do unit conversions with PowerToys on Windows, Spotlight on Mac, and whatever they call the nifty search bar on various Linux desktop environments, without even hitting the internet, with exactly the same convenience as an LLM. Doing discrete things like that with LLM inference is the most inefficient and stupid way to do them.

[–] alsimoneau@lemmy.ca 1 points 1 hour ago

On Linux there's also 'units' which is amazing for this.

[–] XTL@sopuli.xyz 0 points 3 hours ago* (last edited 3 hours ago) (1 children)

All things were doable before. The point is that they were manual extra steps.

[–] Feyd@programming.dev 2 points 2 hours ago* (last edited 59 minutes ago)

They weren't though. You put stuff in the search bar and it detected you were asking about unit conversion and gave you an answer, without ever involving an LLM. Are you being dense on purpose?

[–] alsimoneau@lemmy.ca 1 points 1 hour ago

Or go to Wolfram Alpha and get actual computations done instead of ramblings?