this post was submitted on 14 Mar 2025
841 points (99.1% liked)

Technology
[–] Grimy@lemmy.world 2 points 10 hours ago (1 children)

I'm not sure how that applies in the current context, where it would be used as training data.

[–] FarceOfWill@infosec.pub 0 points 7 hours ago (1 children)

Because once you can generate the GPL code from the lossy AI database trained on it, the GPL protection is meaningless.

[–] Grimy@lemmy.world 1 points 3 hours ago

In such a scenario, it will be worth it. LLMs aren't databases that just hold copy-pasted information. If we get to a point where they can spit out whole functional repos replicating complex software, they will be able to do so with most software, regardless of whether they were trained on similar data or not.

All software will be a prompt away, including the closed-source ones. I don't think you can get more open source than that. But that's only if stringent laws aren't put in place to ban open-source AI models, since Google will put that one prompt behind a paycheck's worth of money if they can.