Am I the only one who sees this as an endorsement for self hosting?
I would love to see the numbers for 5 people watching an already downloaded movie off a hard drive.
If the house has solar panels and is net zero, it would make the emissions 0 right?
Edit: while I do agree about economies of scale, what I am doubting is streaming every single time vs playing locally or streaming in a local network. Local play is always more efficient
self hosting is wildly less efficient… one of the biggest costs in data centres is electricity, and one of the biggest constraints is electrical infrastructure… you have pretty intense power budgets in data centres and DC equipment is pretty well optimised to be efficient
meanwhile a home server doesn’t likely use server hardware (server hardware is far more efficient), is probably about 5-10y or more out of date, and isn’t likely particularly dense: a single 1500w server can probably service ~20 people in a DC… meanwhile an 800w home server could probably handle ~5 people
add the fact that netflix pre-transcodes their vids in many different qualities and formats, whilst home streaming - unless streaming original quality - mostly re-transcodes which is a very energy-hungry process
heck even just the hard drives: if everyone ran their own servers and stored their content that’s thousands if not hundreds of thousands more copies of the data, and all that data is probably on spinning disks
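a rough per-viewer sketch of those estimates (the wattages and user counts are the guesses above, not measurements):

```python
# Rough per-viewer power comparison using the estimates from the comment above.
# All inputs are guesses from the thread, not measured values.

dc_server_watts = 1500      # single data-centre server (estimate)
dc_users_served = 20        # concurrent viewers it can serve (estimate)

home_server_watts = 800     # beefy home server (estimate)
home_users_served = 5       # concurrent viewers it can serve (estimate)

print(f"DC:   {dc_server_watts / dc_users_served:.0f} W per viewer")      # 75 W
print(f"Home: {home_server_watts / home_users_served:.0f} W per viewer")  # 160 W
```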
I'm guessing you dropped a zero or two on the user count and added an extra zero to the wattage (most traditional colocation datacenters max out at around 2,000 watts of continuous draw per 48U rack, so each server is going to target around 50-75w per rack unit of average load)
Netflix is going to be direct-playing pre-transcoded streams, so the main constraint would be bandwidth. If we average out all streams to 5 Mb/s, that's about 200 streams per gigabit of network bandwidth. Chances are that server has at least 10 gigabit networking, probably more like 50 gigabit if they have SSDs storing the data (especially with modern memory caching). That's between 2,000 and 10,000 active clients per server.
Back of the envelope math says that's around 0.075 watts per individual stream for a 150w 2U server serving 2000 clients, which looks pretty realistic to my eyes as a Sysadmin.
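A minimal sketch of that back-of-the-envelope math, using the assumptions above (5 Mb/s average bitrate, 10-50 Gb/s links, a 150w 2U server serving 2000 clients):

```python
# Streams per server and watts per stream, using the assumptions stated above.

stream_mbps = 5                        # assumed average bitrate per stream
streams_per_gbit = 1000 / stream_mbps  # ~200 streams per gigabit of bandwidth

for link_gbit in (10, 50):
    print(f"{link_gbit} Gb/s link: ~{int(link_gbit * streams_per_gbit)} concurrent streams")

server_watts = 150                     # assumed 2U server power draw
clients = 2000
print(f"Per-stream power: {server_watts / clients} W")  # 0.075 W
```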
Granted, for a service the size of Netflix we aren't talking about individual servers but a big orchestrated cluster of them. Most of that is handling basic web server tasks that are a completely solved problem, and each individual server is probably serving a few million clients thanks to modern caching and acceleration features. The real cost and energy hit is going to be in the content distribution, which I covered above.
My home server uses 5w at idle and 9w while streaming. Add another 10w for the hard drive.
According to your example, a single user on Netflix would use 75w.
That doesn’t include the internet cost which I bet is significant as well.
There is a reason paying for Netflix is like $20 a month and internet is like $50-100 a month, whereas self hosting costs close to $1/month in electricity, with no internet cost during usage.
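For that ~$1/month figure, a minimal sketch (the 9w streaming draw is the figure quoted above; adding the 10w drive roughly doubles it, and the electricity rate is an assumption):

```python
# Rough monthly electricity cost for an always-on home server.
# Wattage from the comment above; the $/kWh rate is an assumption.

watts = 9                 # streaming draw (quoted above); add ~10w if the HDD stays spun up
hours = 24 * 30           # hours in a month
rate = 0.15               # assumed USD per kWh

kwh = watts * hours / 1000
print(f"{kwh:.2f} kWh/month -> ~${kwh * rate:.2f}/month")  # ~6.5 kWh, ~$1
```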
an n150 mini pc - largely considered a very efficient package for home servers - consumes ~15w max without the gpu, and ~9w idle
a raspberry pi consumes 3-4w idle
none of that is supporting more than a couple of people streaming 4k like we’re talking about in the case of netflix
and a single hard drive isn’t even close to what we’re talking about… you’re looking at ~30w at least for the disks alone
as for internet cost, it’s likely tiny… my 24 port gigabit switch from 15 years ago sips < 6w… i can only imagine that’s pretty inefficient compared to today’s standards (and 24 port is pretty tiny for a DC, and port power consumption doesn’t scale linearly)
data centres are just straight up way more efficient per unit of processing than your home anything; it pretty much doesn’t matter how efficient your home gear is, or what the workload is unless you switch it off most of the time - which doesn’t happen in a DC
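a rough sketch of that per-viewer comparison, using estimates from this thread (the ~45w home setup, the viewer count, and the 0.075w Netflix figure are all assumptions/estimates from the comments above):

```python
# Per-viewer power for an efficient home setup vs the estimated Netflix figure.
# All inputs are estimates from this thread, not measurements.

home_watts = 15 + 30        # N150 mini PC at full load plus ~30w of disks (estimates above)
home_viewers = 2            # "a couple of people" streaming at once (estimate)

netflix_watts_per_stream = 0.075  # back-of-envelope figure from the comment above

print(f"Home: {home_watts / home_viewers:.1f} W per viewer")              # ~22.5 W
print(f"Netflix (serving side): ~{netflix_watts_per_stream} W per stream")
```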
Idk where you're getting your numbers from.
Here is an article that talks about HDD read power usage being less than 10w:
https://www.solved.scality.com/high-density-power-consumption-hdd-vs-qlc-flash/
Even with 30w, it’s still lower than the 75w you mentioned.
Also, that hard drive can serve multiple purposes, whereas Netflix is only for streaming movies and TV shows (not music, so you'd have to add Spotify usage to be fully fair).