I'll preface this by saying I'm not an expert, and I don't like to speak authoritatively on things I'm not an expert in, so it's possible I'm mistaken. Also, I've had a drink or two, which isn't helping, but here we go anyway.
In the article, the author quips about a tweet, and in doing so seems to fundamentally misunderstand how LLMs work:
The tweet is correct. An LLM has a snapshot understanding of the internet based on its training data; it's not what we would generally consider a true index-based search.
Training an LLM is a costly and time-consuming process, so it's fundamentally impossible to regenerate one on anywhere near the timescale it takes to build or refresh a simple index.
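To make that concrete, here's a toy sketch (my own illustration, nothing from the article) of why an index is cheap to keep fresh: adding a document is just a few dictionary updates and is searchable immediately, whereas "adding" knowledge to an LLM means another expensive training run.

```python
from collections import defaultdict

# Minimal inverted index: maps each word to the set of document IDs
# that contain it. No training involved -- updates are instant.
index = defaultdict(set)
docs = {}

def add_document(doc_id, text):
    """Index a new document incrementally: a few set inserts per word."""
    docs[doc_id] = text
    for word in text.lower().split():
        index[word].add(doc_id)

def search(query):
    """Return IDs of documents containing every word in the query."""
    words = query.lower().split()
    if not words:
        return set()
    results = index[words[0]].copy()
    for word in words[1:]:
        results &= index[word]
    return results

add_document(1, "LLMs are trained on a snapshot of the web")
add_document(2, "an index can be updated the moment a page changes")
print(search("index updated"))  # {2} -- new content is searchable immediately
```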
The author fails to address any of these issues, which suggests to me that they don't know what they're talking about.
I suppose I could concede that an LLM can fill a role similar to the one a search engine traditionally has, but that'd be kinda like saying a toaster is an oven. They're both confined boxes that heat food, but good luck trying to bake two pies at once in a toaster.
I think ChatGPT does web searches now, maybe for the reasoning models. At least it looks like it's doing that.
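From what I understand, "doing a web search" there means something like retrieval-augmented generation: the model's frozen training snapshot gets supplemented with live search results at query time, rather than the model itself being retrained. A rough sketch of the idea (`web_search` and `llm_complete` are hypothetical stand-ins, not any real API):

```python
# Hypothetical stand-ins: in a real system these would call an actual
# search API and an actual LLM API. Neither is a real library function.
def web_search(query, top_k=3):
    return [{"snippet": f"fresh result {i} for: {query}"} for i in range(top_k)]

def llm_complete(prompt):
    return f"(model output conditioned on {len(prompt)} chars of prompt)"

def answer_with_search(question):
    # 1. Query a live index for current results.
    results = web_search(question)
    # 2. Paste the snippets into the prompt so the model can draw on
    #    information newer than its training cutoff.
    context = "\n".join(r["snippet"] for r in results)
    prompt = f"Search results:\n{context}\n\nQuestion: {question}\nAnswer:"
    # 3. The model's weights are untouched; nothing was retrained.
    return llm_complete(prompt)

print(answer_with_search("what happened in tech today?"))
```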
Most do
One doesn't need to know how an engine works to know the Ford Pinto was a disaster.
One doesn't need to know how LLMs work to know they are pretty destructive and terrible.
Note: I'm not going to argue this. It's just how things are now, and no apologetics will change what it is.