this post was submitted on 07 Dec 2025
The plaintiffs’ brief alleges that Meta was aware that its platforms were endangering young users, including by exacerbating adolescents’ mental health issues. According to the plaintiffs, Meta frequently detected content related to eating disorders, child sexual abuse, and suicide but refused to remove it. For example, one 2021 internal company survey found that more than 8 percent of respondents aged 13 to 15 had seen someone harm themself or threaten to harm themself on Instagram during the past week. The brief also makes clear that Meta fully understood the addictive nature of its products, with plaintiffs citing a message by one user-experience researcher at the company that Instagram “is a drug” and, “We’re basically pushers.”

Perhaps most relevant to state child endangerment laws, the plaintiffs have alleged that Meta knew that millions of adults were using its platforms to inappropriately contact minors. According to their filing, an internal company audit found that Instagram had recommended 1.4 million potentially inappropriate adults to teenagers in a single day in 2022. The brief also details how Instagram’s policy was to not take action against sexual solicitation until a user had been caught engaging in the “trafficking of humans for sex” a whopping 17 times. As Instagram’s former head of safety and well-being, Vaishnavi Jayakumar, reportedly testified, “You could incur 16 violations for prostitution and sexual solicitation, and upon the seventeenth violation, your account would be suspended.”

top 9 comments
[–] FriendOfDeSoto@startrek.website 38 points 12 hours ago (1 children)

If these things were clean cut, they would have been dragged to court already many times over. For messing with teenage girls for a laugh 10 years ago. For tacitly approving genocide in Myanmar. For cheating on their video views during the highly successful pivot to video. A good lawyer will get them out of this one too with but a slap on the wrist. They exist in a gray zone where they can fuck up as much as they want to without having to fear great consequences. Vote for politicians who want to regulate these companies more.

[–] marx@piefed.social 16 points 11 hours ago* (last edited 11 hours ago) (4 children)

Americans, as a general population, don't give a shit about Myanmar, may not know it even exists. They don't really care or know about video view controversies and the like.

One thing they do care A LOT about, is their kids. And the evidence is strong that Mark Zuckerberg and Meta executives knew children, on a mass scale, were being endangered by their products and deliberately, purposely allowed it to continue. They need to be prosecuted. If nobody even tries, then we've already lost.

[–] fluffykittycat@slrpnk.net 13 points 10 hours ago (1 children)

America doesn't care about kids, only exploiting them

[–] architect@thelemmy.club 8 points 9 hours ago

America isn’t unique. Look at the British pedophiles, the Japanese pedophiles, and what they do to little boys in half the world, like the Middle East. America? LOL. Meanwhile they cut the clits off little girls in Africa.

No one cares about kids. Save your America shame.

[–] FriendOfDeSoto@startrek.website 7 points 8 hours ago

Americans, as a general population, don't give a shit about Myanmar, may not know it even exists.

I would say that's irrelevant to the crimes committed. And it's not just Americans who would struggle to find Myanmar on a map, or really care what's going on there unless it's rooting out phishing farms using abducted foreigners.

I commend your view on the matter, that when it comes to their children they will do something. That may turn out to be true. However, that's not going to be enough to get anyone at Meta convicted under the current laws. They are running under a cover of diffuse authority and supervision internally and Section 230 externally. Abhorrent drug-pusher comments are not admissions of guilt. They have good lawyers. We need new laws, more regulation, and fines that make Wall Street worried.

[–] architect@thelemmy.club 4 points 9 hours ago

I promise you people don’t give a shit about kids or their kids. I was once one and not a single fucking adult gave a shit about the sexual abuse. They get mad at the kids instead. I’ve been there. Actions speak louder than words. Kids are getting shot. Adults do not care about kids as a whole.

[–] fishos@lemmy.world 2 points 9 hours ago

Honestly, I care a dick load more about Myanmar and enabling genocide than "but think of the children!". That's one of the laziest and most misused calls to action, and at this point I honestly dgaf when I hear it. It's just propaganda.

Don't be so quick to stereotype us. You're insulting those of us who do pay attention.

[–] orbituary@lemmy.dbzer0.com 5 points 11 hours ago

Articles like this are exhausting. Yes. The answer is yes. Will it happen? Drum roll... No. It won't happen. Need evidence? Look at the United States government.

[–] BigMacHole@sopuli.xyz 2 points 11 hours ago

These people are STUPID! You CANT arrest RICH people!