After reading about this on Hacker News, I get why they do it. It's to make people upload identification documents, to get them prepped to authenticate for using the internet. Now the world makes sense again. I was wondering why they would do something positive. But now I get it.

Zagorath@aussie.zone 31 points 1 day ago

What do you guys think about this? I think it's a big positive.

It's not. But not for the reason you say.

I get why they do it. It's to make people upload identification documents

This is just some conspiracy theory nonsense. The law specifically says that photo ID cannot be the only way users can verify themselves. And it also says that any uploaded documents must not be used for any other purpose. No, the reason behind the law is exactly what they say it is: to protect kids. They're just really bad at their job and don't understand the ways this law will not accomplish that goal.

I'll repost some of my comments from elsewhere:

The ultimate goal is a good one: keep kids safe from dangerous social media algorithms. But as for the method used to arrive at it, the Government did the wrong thing at pretty much every opportunity it possibly could.

Step 1: the government should have considered regulating the actual algorithms. We know that Facebook has commissioned internal studies that told them certain features of their algorithm were harmful, and they decided to keep it that way because it increased stickiness a little bit. Regulate the use of harmful algorithms and you fix this not just for children, but for everyone.

Step 2: if we've decided age verification must be done, it should be done in a way that preserves as much privacy and exposes people to as little risk as possible. The best method would be laws around parental controls. Require operating systems to support robust parental controls, including an API that web browsers and applications can access. Require social media sites and apps to use that API. Require parents to set up their children's devices correctly.
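
To make that concrete, here's a rough sketch of what such a device-level age query could look like. It's purely hypothetical: no operating system ships exactly this API, and the profile fields and function names are made up. The point is the shape of it: the app asks the OS a yes/no question, and no ID documents ever leave the device.

```python
# Purely hypothetical sketch: no OS exposes exactly this API. The idea is that
# a parent configures the device once, and apps/browsers ask the OS a yes/no
# age question instead of collecting ID documents themselves.

from dataclasses import dataclass
from datetime import date


@dataclass
class ParentalControlsProfile:
    """Device-level profile a parent sets up (assumed API, not a real one)."""
    birth_year: int | None = None       # parent may choose not to set this
    social_media_allowed: bool = True   # blanket on/off switch


def os_is_user_at_least(profile: ParentalControlsProfile, age: int) -> bool:
    """Answer "is the user at least this old?" without exposing the birth date.

    Year-only arithmetic keeps the sketch short; a real API would be exact.
    """
    if profile.birth_year is None:
        return False  # unknown age: treat as not verified
    return date.today().year - profile.birth_year >= age


# What a social media app or browser might do before loading a feed:
profile = ParentalControlsProfile(birth_year=2012, social_media_allowed=False)
if profile.social_media_allowed and os_is_user_at_least(profile, 16):
    print("load the feed")
else:
    print("show an age-gate / parental-consent screen instead")
```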

Step 3: if we really, really do insist on doing it by requiring each and every site to do its own age verification, require it to be done in privacy-preserving ways. There are secure methods called "zero-knowledge proofs" that could be used, if the government supported them. Or they could mandate that age verification be done using blinded digital signatures. This way, at least when you upload your photo or ID to get your age verified, the site doesn't actually get to know who you are, and the people doing the age verification don't get to know which sites you're accessing.
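
For the curious, here's a toy sketch of the blinded-signature idea using RSA blind signatures. This is one way it could be done, not something the Act actually specifies, and the numbers are deliberately tiny and insecure. The provider signs a blinded value, so it never learns which site the resulting "over 18" token is for, and the site only ever sees a signature that checks out, never your ID.

```python
# Toy sketch of an RSA blind signature for an "over 18" token. NOT a scheme
# the Act mandates (it doesn't mandate one); it only illustrates the privacy
# property: the verifier never sees which site the token is for, and the site
# never sees your ID.

import hashlib
import math
import secrets

# --- Toy RSA keypair held by the age-verification provider ---
p, q = 1_000_000_007, 998_244_353      # well-known primes, far too small for real use
n = p * q
e = 65537
d = pow(e, -1, (p - 1) * (q - 1))      # private exponent


def h(msg: bytes) -> int:
    """Hash the token message into the RSA group (toy full-domain hash)."""
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n


# --- User's device: blind the token before sending it to the provider ---
token = b"over-18"
while True:
    r = secrets.randbelow(n - 2) + 2       # blinding factor, coprime to n
    if math.gcd(r, n) == 1:
        break
blinded = (h(token) * pow(r, e, n)) % n    # provider can't recover h(token)

# --- Provider: checks the user's age however it likes (ID, etc.), then signs
#     the blinded value without learning the token or the destination site ---
blind_sig = pow(blinded, d, n)

# --- User's device: unblind; the result is an ordinary RSA signature ---
sig = (blind_sig * pow(r, -1, n)) % n

# --- Website: verify the signature; it learns "over 18" and nothing else ---
assert pow(sig, e, n) == h(token)
print("age token verified without tying an identity to the site")
```

A real deployment would use a vetted, standardised blind-signature or zero-knowledge scheme from an audited library, not hand-rolled maths like this, but the privacy property is the same.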

Step 4: make it apply to actually harmful uses of social media, not a blanket ban on literally everything. Pixelfed is not harmful in the way Instagram is. It just isn't. It doesn't have the same insidious algorithms. Likewise Mastodon compared to Xitter. And why does Roblox, the site that has been the subject of multiple reports into how it facilitates child abuse, get a pass, while Aussie.Zone has to do some ridiculous stuff to verify people's age? Not to mention Discord, which is clearly social media, and 4chan, which is...4chan.

Step 5: consider the positive things social media can do. Especially for neurodiverse and LGBTQ+ kids, finding supportive communities can be a literal life-saver, and social media is great at that.

Step 3.5: look at the UK. Their age restriction has been an absolute failure. People using footage from video games to prove they're old enough. Other people having their documents leaked because of insecure age verification processes and companies keeping data they absolutely should not be holding on to.

And perhaps most importantly:

Step 0: Transparent democratic processes

Don't put up legislation and pass it within 1 week. Don't restrict public submissions to a mere 24 hours. Don't spend just 4 hours pretending to consider those public submissions that did manage to squeeze into your tight timeframe. There is literally no excuse for a Government to ever act that fast (with a possible exception for quick responses to sudden, unexpected, acute crises, which this definitely is not). Good legislation takes time. Good democratic processes require listening to and considering a broad range of opinions. Even if everything the legislation delivered had ended up perfect, this would be an absolutely shameful piece of legislation for the opaque, anti-democratic way in which it was passed into law.

And that's not to mention the fact that in some ways, not having an account makes things more dangerous. Porn bans in other countries have basically just amounted to PornHub bans, with people able to get around them by going to shadier sites with far worse content and less moderation. And I've seen a number of parents point to YouTube in particular, saying that when their kids had an account, they could see the kids' watch history and tell the YouTube algorithm to stop recommending specific channels or types of content. Without an account, you can't do that.

And, naturally, we're already seeing cases of kids passing the checks despite being under age. 11-year-olds who get told they look 18. A 13-year-old whose parent said he could pass for 10, and who got the facial recognition to say he's 30+ just by scrunching up his face a bit. Shock horror: facial recognition is not a reliable determiner of age. It never should have been allowed.