this post was submitted on 02 Apr 2025

Not The Onion

superb@lemmy.blahaj.zone · 4 points · 1 day ago

And what of it? They can’t learn. Whatever they do “learn” quickly slides out of the context window during the next lecture.

ShellMonkey@lemmy.socdojo.com · 1 point · 8 hours ago (last edited 8 hours ago)

I get the anger people have and all that, but as a general technology they're still impressive all the same.

Running on a consumer piece of hardware, I can take a model maybe 10–20 GB in size and tell it a few things I want. Then, based on what it knows those things look like (or sound like, or read like; I'm just picking on images), it creates something from the void. Sometimes stupid things, but something.

The model was created from images that already exist, but it doesn't store them whole cloth and just paste them together, so even if one were to think of it as producing derivations of existing images, at the very least it's an impressive feat of compression to hold so much in a relatively small space.
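A quick back-of-envelope check of that compression point. The figures below are illustrative assumptions (a roughly 4 GB checkpoint and a LAION-scale training set of around 2 billion images), not exact numbers for any particular model:

```python
# Back-of-envelope: if a generative model literally "stored" its training
# images, how many bytes of weights would each image get?
model_size_bytes = 4e9        # assumed ~4 GB checkpoint (illustrative)
num_training_images = 2e9     # assumed ~2 billion images (illustrative)
typical_jpeg_bytes = 100e3    # ~100 KB for even a modest compressed photo

bytes_per_image = model_size_bytes / num_training_images
print(f"{bytes_per_image:.1f} bytes of weights per training image")
print(f"vs ~{typical_jpeg_bytes:,.0f} bytes for a small JPEG")
```

At around 2 bytes of weights per image versus ~100,000 bytes for even a heavily compressed JPEG, there simply isn't room to store the images themselves; what the weights hold is something far more distilled than a pasted-together archive.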