this post was submitted on 25 Jun 2025
152 points (97.5% liked)
Technology
If you aren't allowed to freely use data for training without a license, then the fear is that only large companies will own enough works or be able to afford licenses to train models.
If they can just steal a creator's work, how do they suppose creators will be able to afford to keep being creators?
Right. They think we already have enough original works that the machines can just produce any new creations from here on.
It is entirely possible that the whole construct of copyright just isn't fit to regulate this, and that a "right to train", or a right to opt out of training, needs to be formulated separately.
The maximalist, knee-jerk assumption that all AI training is copying is feeding into the interests of, ironically, a bunch of AI companies. That doesn't mean that actual authors and artists don't have an interest in regulating this space.
The big takeaway, in my book, is that copyright is finally broken beyond all usability. Let's scrap it and start over with the media landscape we actually have, not the eighteenth-century version of it.
I'm fairly certain this is the correct answer here. Also, there is a separation between the judiciary and the legislature. It's the former that is involved here, but it's the latter we really need to bother. That's the only way, unless we want to apply 18th-century tools to the current situation.
Yeah, I guess the debate is which is the lesser evil. I didn't make the original comment but I think this is what they were getting at.
Absolutely. The current copyright system is terrible, but an AI replacement of creators is worse.
Yes precisely.
I don't see a situation where the actual content creators get paid.
We either get open-source AI, or we get closed AI where the big AI companies and the copyright-holding companies make bank.
I think people are having huge knee-jerk reactions and end up supporting companies like Disney, Universal Music, and Google.
Companies like the record labels, which already own all the copyrights, aren't going to pay creators for something they already own.
All the data has already been signed away. People are really optimistic about an industry that has consistently fucked over everyone it interacts with for money.
Yes. But then do something about it. Regulate the market, or pass laws that address this. I don't really see why we should do something like this, then; it still kind of contributes to the problem, since free rein still advantages the big companies.
(And we can write whatever we like into law. It doesn't need to be a stupid, simplistic solution. If you're concerned about big companies, just write that they have to pay a lot and small companies don't. Or force everyone to open their models. Those are all options that can be formulated as new rules, and they would address the issue at hand.)