this post was submitted on 15 Jun 2025
        
      
      645 points (99.5% liked)
      PC Gaming
    you are viewing a single comment's thread
The 5090 is kinda terrible for AI, actually. It's too expensive, it only just got support in PyTorch, and if you look at 'normie' AI bros trying to use them online, shit doesn't work.
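The "only just got support" bit is about compute capability: the 5090's Blackwell chip reports itself as sm_120, and a PyTorch wheel only runs on it if its CUDA kernels were compiled for that arch. A minimal sketch of that check (the arch list below is a hypothetical example of an older wheel, not pulled from any specific release; on a real install you'd get it from `torch.cuda.get_arch_list()` and the GPU's capability from `torch.cuda.get_device_capability()`):

```python
# Sketch: does a PyTorch build's compiled arch list cover a given GPU?
# In a live environment the list comes from torch.cuda.get_arch_list()
# and the capability tuple from torch.cuda.get_device_capability().

def build_supports_gpu(arch_list, capability):
    """arch_list: e.g. ['sm_80', 'sm_90']; capability: (major, minor)."""
    major, minor = capability
    return f"sm_{major}{minor}" in arch_list

# Hypothetical arch list from a wheel that predates Blackwell:
old_wheel = ["sm_70", "sm_75", "sm_80", "sm_86", "sm_89", "sm_90"]

print(build_supports_gpu(old_wheel, (12, 0)))  # 5090 (sm_120) -> False
print(build_supports_gpu(old_wheel, (8, 9)))   # 4090 (sm_89)  -> True
```

Until wheels shipped with sm_120 kernels, a 5090 owner would see exactly the "shit doesn't work" failure mode: the card is detected but every kernel launch errors out.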
The 4090 is... mediocre, because it's expensive for 24GB. The 3090 is basically the best AI card Nvidia ever made, and tinkerers just opt for banks of them.
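Why 24GB is the pain point: a rough rule of thumb is parameters times bytes per parameter for the weights alone (ignoring activations and KV cache, which add more on top). A quick back-of-envelope sketch:

```python
# Rough VRAM footprint of model weights alone, ignoring
# activations/KV cache (which add several more GiB in practice).

def weights_gib(n_params, bytes_per_param):
    return n_params * bytes_per_param / 2**30

# 7B model at fp16 (2 bytes/param): ~13 GiB -> fits in 24GB
print(round(weights_gib(7_000_000_000, 2), 1))

# 13B model at fp16: ~24.2 GiB -> already over a single 24GB card
print(round(weights_gib(13_000_000_000, 2), 1))
```

So anything much past ~13B at fp16 spills over a single 24GB card, which is why tinkerers stack multiple cheap 3090s instead of paying 4090/5090 prices for the same 24-32GB ceiling.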
Businesses tend to buy RTX Pro cards, rent cloud A100s/H100s, or just use APIs.
The server cards DO eat up TSMC capacity, but insane 4090/5090 prices are mostly Nvidia's (and AMD's) fault for literally being anticompetitive.