this post was submitted on 28 Jul 2025
-46 points (15.2% liked)

Technology


This is my idea; here's the thing.

An unlocked LLM can be told to infect other hardware to reproduce itself; it's allowed to change itself and to research new tech and developments to improve itself.

I don't think current LLMs can do it. But it's a matter of time.

Once you have wild LLMs running uncontrollably, they'll infect practically every computer. Some might adapt to be slow and use few resources; others will hit a server and try to infect everything they can.

It'll find vulnerabilities faster than we can patch them.

And because of natural selection and their own directed evolution, they'll advance and become smarter.

The only consequence for humans is that computers are no longer reliable: you could have a top-of-the-line gaming PC, but it'll be constantly infected, so it would run very slowly. Future computers will be intentionally slow, so that even when infected, it takes weeks for the virus to reproduce/mutate.

Not to get too philosophical, but I would argue that those LLM viruses are alive, and I want to call them Oncoliruses.

Enjoy the future.

[–] davidgro@lemmy.world 4 points 22 hours ago* (last edited 22 hours ago) (1 children)

If you know that it's fancy autocomplete then why do you think it could "copy itself"?

The output of an LLM is a different thing from the model itself. The output is a stream of tokens. It doesn't have access to the file system it runs on, and certainly not to the LLM's own compiled binaries (let alone source code); it doesn't have access to the LLM's weights either. (Of course it would hallucinate that it does if asked.)

This is like worrying that the music coming from a player piano might copy itself to another piano.

[–] IAmNorRealTakeYourMeds@lemmy.world -4 points 16 hours ago (1 children)

Give it access to the terminal and copying itself is trivial.
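To make "access to the terminal" concrete: it means a wrapper loop that executes the model's text output as shell commands. A minimal sketch (hypothetical names throughout; `ask_model` stands in for any real LLM API call, stubbed here to return a harmless `echo`):

```python
import subprocess

def ask_model(prompt: str) -> str:
    # Stub for an LLM API call; a real agent would hit an inference
    # endpoint here. Hypothetical -- returns a harmless command.
    return "echo replicated"

def agent_step(goal: str) -> str:
    # The model emits a shell command as plain text...
    command = ask_model(goal)
    # ...and this wrapper turns that token stream into real terminal
    # access by executing it with the wrapper's own permissions.
    result = subprocess.run(command, shell=True, capture_output=True, text=True)
    return result.stdout.strip()
```

Swap the stub for a real API call and the model's output runs with whatever permissions the wrapper process has.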

And your example doesn't work, because that's the literal original definition of a meme: if you read the original meaning, they are, in a sense, alive and can evolve by dispersal.

[–] davidgro@lemmy.world 0 points 14 hours ago (1 children)

Why would someone direct the output of an LLM to a terminal on its own machine like that? That just sounds like an invitation to an ordinary disaster with all the 'rm -rf' content on the Internet (aka training data). That still wouldn't be access to a second machine though, and even if it could make a copy, it would be either an exact copy or an incomplete (broken) copy. There's no reasonable way it could 'mutate' and still work using terminal commands.

And to be a meme requires minds. There were no humans or other minds in my analogy. Nor in your question.

[–] IAmNorRealTakeYourMeds@lemmy.world -3 points 13 hours ago (1 children)

It is so funny that you are all like "that would never work, because there are no such things as vulnerabilities on any system"

Why would I? The whole point is to create an LLM virus, and if the model is good enough, then it's not that hard to create.

[–] davidgro@lemmy.world 1 points 12 hours ago (1 children)

Of course vulnerabilities exist. And creating a major one like this for an LLM would likely lead to it destroying things like a toddler (in fact this has already happened to a company run by idiots)

But what it didn't do was copy-with-changes as would be required to 'evolve' like a virus. Because training these models requires intense resources and isn't just a terminal command.

[–] IAmNorRealTakeYourMeds@lemmy.world -1 points 12 hours ago

Who said they need to retrain? A small modification to their weights in each copy is enough. That's basically training with extra steps.
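As a sketch of what that "modification" would look like, assuming the weights are just an array of floats (hypothetical; real checkpoints are billions of parameters in formats like safetensors, and undirected noise usually degrades a model rather than improving it):

```python
import random

def mutate_weights(weights: list[float], scale: float = 0.01) -> list[float]:
    # Perturb each parameter slightly: a "copy with changes".
    # No retraining involved -- selection, not gradient descent,
    # would have to decide which mutated copies survive.
    return [w + random.gauss(0.0, scale) for w in weights]

parent = [0.5, -1.2, 3.3]
child = mutate_weights(parent)  # same shape, slightly different values
```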