this post was submitted on 22 Dec 2025
951 points (98.5% liked)
People Twitter
8774 readers
1534 users here now
People tweeting stuff. We allow tweets from anyone.
RULES:
- Mark NSFW content.
- No doxxing people.
- Must be a pic of the tweet or similar. No direct links to the tweet.
- No bullying or international politics
- Be excellent to each other.
- Provide an archived link to the tweet (or similar) being shown if it's a major figure or a politician. Archive.is is the best way.
founded 2 years ago
you are viewing a single comment's thread
view the rest of the comments
I'm guessing you dropped a zero or two on the user count and added an extra zero to the wattage (most traditional colocation datacenters max out at around 2,000 watts of continuous draw per 48U rack, so each server is going to target around 50-75 W per rack unit of average load).
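A quick sketch of where a per-rack-unit target like that comes from. The 2,000 W / 48U figure is from above; the "~30U of actual servers per rack" number is my own illustrative assumption, not anything from the thread:

```python
# Rough rack power budget (round numbers, not anyone's real datacenter specs).
RACK_POWER_W = 2_000   # typical colo power budget per rack (from the comment)
RACK_UNITS = 48

print(RACK_POWER_W / RACK_UNITS)   # ~41.7 W/U if every single U held a server

# Racks are rarely 100% servers (switches, PDUs, blanking panels take space),
# so spreading the same budget over, say, ~30U of actual servers gives:
print(RACK_POWER_W / 30)           # ~66.7 W/U, i.e. squarely in a 50-75 W/U range
```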
Netflix is going to be direct-playing pre-transcoded streams, so the main constraint would be bandwidth. If we average all streams out to 5 Mb/s, that's about 200 streams per gigabit of network bandwidth. Chances are each server has at least 10 gigabit networking, probably more like 50 gigabit if SSDs are storing the data (especially with modern memory caching). That's between 2,000 and 10,000 active clients per server.
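The same arithmetic as a minimal sketch, assuming the 5 Mb/s average bitrate and the 10/50 gigabit NIC figures above (all of which are round guesses, not measured Netflix numbers):

```python
# Streams per server from the bandwidth constraint.
STREAM_BITRATE_MBPS = 5        # assumed average per-stream bitrate
NIC_SPEEDS_GBPS = (10, 50)     # assumed per-server network capacity, low and high case

for nic_gbps in NIC_SPEEDS_GBPS:
    streams = nic_gbps * 1_000 / STREAM_BITRATE_MBPS
    print(f"{nic_gbps} Gb/s -> ~{streams:,.0f} concurrent streams")
# 10 Gb/s -> ~2,000 concurrent streams
# 50 Gb/s -> ~10,000 concurrent streams
```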
Back-of-the-envelope math says that's around 0.075 watts per individual stream for a 150 W 2U server serving 2,000 clients, which looks pretty realistic to my eyes as a sysadmin.
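And the per-stream power figure, again assuming the 150 W average draw and the conservative 2,000-stream (10 Gb/s) case from above:

```python
# Power per active stream for the conservative case.
SERVER_POWER_W = 150           # assumed average draw of the 2U server
CONCURRENT_STREAMS = 2_000     # the 10 Gb/s case

print(SERVER_POWER_W / CONCURRENT_STREAMS)   # 0.075 W per active stream
```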
Granted, for a service the size of Netflix we aren't talking about individual servers; we're looking at a big orchestrated cluster. But most of that is handling basic web-server tasks that are a completely solved problem, and each individual server is probably serving a few million clients thanks to modern caching and acceleration features. The real cost and energy hit is in the content distribution, which I covered above.