this post was submitted on 26 Aug 2025
1226 points (96.9% liked)

interestingasfuck

[–] De_Narm@lemmy.world 77 points 2 weeks ago* (last edited 2 weeks ago) (6 children)

Sooo... is this image copyright infringement?

There are just so many weird cases, depending on the exact wording. Would YouTube need to scan every upload for Danes to check for copyright violations? That's obviously impossible.

[–] SoupBrick@pawb.social 74 points 2 weeks ago* (last edited 2 weeks ago) (3 children)

IMO, better to get consumer protection laws in place early and refine them over time, than not at all.

The longer these things wait, the more time corpos have to get their influence in and either stop the efforts or water them down until they're entirely ineffective.

Edit: Don't forget to read about it. https://www.globallawtoday.com/law/legal-news/2025/06/denmarks-groundbreaking-move-copyright-for-faces-and-voices/

[–] criticon@lemmy.ca 20 points 2 weeks ago* (last edited 2 weeks ago)

But rushed and incomplete bills can come with bad implementations that make them useless.

-this post is known to the state of California to cause cancer

[–] finitebanjo@lemmy.world 1 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

I can imagine situations where this is a bad idea, such as making almost all journalism illegal because you don't have the legal right to cover news about an individual.

Hopefully they plan for that.

[–] Tja@programming.dev 1 points 2 weeks ago (1 children)

There are long established exceptions for use of copyrighted materials, and journalism is one of them.

[–] finitebanjo@lemmy.world 1 points 2 weeks ago

Good for Denmark.

> IMO, better to get laws in place early and refine them over time, than not at all.

So... Move fast and break things?

[–] Stillwater@sh.itjust.works 35 points 2 weeks ago (2 children)

I'd figure the scenario would be that YouTube would need to respect takedown requests from people whose likeness has been appropriated, which isn't that absurd.

[–] De_Narm@lemmy.world 6 points 2 weeks ago

That's likely, but it would only help with the most viral cases. Otherwise, what are the chances of coming across AI-generated content violating your copyright in an exponentially growing ocean of slop?

On the flipside, individuals could probably maliciously claim ad revenue. That's already a thing with music.

[–] Stovetop@lemmy.world 4 points 2 weeks ago (1 children)

Does have me wondering how YouTube would verify likeness, though. I could just find a video I don't like and claim to be a person in it. If all they need is a photo, I feel like that'd be easy to mock up. If they require government ID, that's getting into uncomfortable UK-esque ID verification territory.

[–] CileTheSane@lemmy.ca 2 points 2 weeks ago

Requiring proof of identification when you are taking legal action is significantly different from requiring proof of ID at all times.

Considering how lazy YouTube is about such things, they'd probably just take your word for it and force the video creator to prove it isn't you in order to get their ad revenue back.

[–] Xaphanos@lemmy.world 16 points 2 weeks ago

Also, how many times have you seen a photo of someone who looks just like someone else who is entirely unrelated? Old photos in particular.

[–] pelespirit@sh.itjust.works 7 points 2 weeks ago (1 children)

I'm sure YouTube could figure it out; they do with music.

[–] dohpaz42@lemmy.world 1 points 2 weeks ago (1 children)

Automating anything on such a grand scale is always a Bad Idea (tm). It's better to just let copyright holders flag videos manually. Less likely to get weaponized that way. Of course, that's anecdotal and purely my opinion.

[–] corsicanguppy@lemmy.ca 4 points 2 weeks ago

Automatic protection for people without them having to chase it in the courts is, somehow, a bad thing?

[–] zarkanian@sh.itjust.works 1 points 2 weeks ago

It can't be copyright. They just used the wrong term.