this post was submitted on 28 Oct 2025
Technology
    you are viewing a single comment's thread
Paying yourself to promote your own product. Promising to fix vague "risks" that make the product sound more powerful than it is, with "fixes" that won't be measurable.
In other words, Sam is cutting a $25 billion check to himself.
So they're already aware of the risks; AI companies are being run with the same big oil/big tobacco playbook, lol. You can have all the fancy new technology, but if the money is still coming from the same group of rich inbred douchebags, it doesn't matter, because it will all turn to shit.
AI companies are definitely aware of the real risks. It's the imaginary ones ("what happens if AI becomes sentient and takes over the world?") that I imagine they'll put that money towards.
Meanwhile, they (intentionally) fail to implement even a simple cutoff switch for a child who's expressing suicidal ideation. Most people with any programming knowledge could build a decent interception tool. All this talk about guardrails seems almost as fanciful.