this post was submitted on 22 Dec 2025
847 points (98.4% liked)
People Twitter
The numbers are also clearly fictive. Driving a car for 4 miles uses about half a liter of fuel. A liter of gasoline contains about 9 kWh of energy, meaning you would use about 4.5 kWh per half hour of streaming. So the servers would have to draw about 9 kW to serve a single person? That would be like 10 gaming PCs running at full power to serve one person. Are they animating the shows in real time? No compression algorithm is that inefficient, and no hard drive uses that much energy.
edit: also, they could never be profitable like that. Let's say you watch three hours per day. That would be 9 kW × 3 h × 30 days = 810 kWh per month. Even if they only paid 5 cents per kWh, that would still be over $40 per month in electricity cost alone for one user.
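A quick sketch of that arithmetic, for anyone who wants to poke at it (all figures are the rough round numbers from the comment, not measured values):

```python
# Back-of-envelope check of the comment's numbers.
FUEL_PER_4_MILES_L = 0.5      # litres of gasoline for a 4-mile drive (rough)
ENERGY_PER_LITRE_KWH = 9.0    # approximate energy content of gasoline

# If half an hour of streaming really equalled a 4-mile drive:
energy_per_half_hour_kwh = FUEL_PER_4_MILES_L * ENERGY_PER_LITRE_KWH  # 4.5 kWh
implied_server_draw_kw = energy_per_half_hour_kwh / 0.5               # 9 kW

# Monthly cost at 3 hours of streaming per day and a generous $0.05/kWh:
monthly_kwh = implied_server_draw_kw * 3 * 30                         # 810 kWh
monthly_cost_usd = monthly_kwh * 0.05                                 # ≈ $40.50

print(energy_per_half_hour_kwh, implied_server_draw_kw, monthly_kwh)
```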
Thanks for doing the math. I'm not gonna check it, you seem trustworthy enough.
I’m not gonna check the numbers either. Because I have no idea how. And I don’t even understand them.
So obviously he’s right!
The numbers aren't too difficult to verify.
I found a Canadian government web page that puts the energy content of gasoline at roughly 8.9 kWh per litre, so that checks out.
Looking at the fuel efficiency table on that same website, it looks like OP used a reasonable average fuel efficiency of 30 mpg, or slightly under 8 L/100 km: 4 miles / 30 mpg = 0.13 gallons, or 0.492 litres, so their claim of half a liter of gas also checks out.
The cheapest commercial energy in the US appears to be in North Dakota at $0.0741/kWh, so using $0.05/kWh was very generous.
The average Netflix user watches about 2 hours per day, or 60 hours per month.
Just in an attempt to be a bit more accurate, let's assume the individual user's television and internet router use about 900 W; subtracting that from the 9 kW leaves a final number of 8 kW for Netflix's power use per user.
8 kW * 60 hours= 480 kWh
And the cost of all of those kWh at $0.05: 480 kWh * $0.05 = $24.00
Or, the cost in the least expensive state in the US: 480 kWh * $0.0741 = $35.57
The national average is $0.14/kWh, so unless Netflix was serving everyone out of North Dakota and Texas, their average cost would be much closer to $70 per user.
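The same cost arithmetic, collected in one place (the 8 kW draw and 60 hours per month are the assumptions from this comment):

```python
# Monthly cost per user at the implied power draw, at three electricity rates.
draw_kw = 8.0          # implied Netflix-side draw after subtracting ~0.9 kW at home
hours_per_month = 60   # ~2 hours of streaming per day

kwh = draw_kw * hours_per_month          # 480 kWh per month
print(round(kwh * 0.05, 2))    # generous $0.05/kWh  -> 24.0
print(round(kwh * 0.0741, 2))  # cheapest US state    -> 35.57
print(round(kwh * 0.14, 2))    # ~national average    -> 67.2
```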
OP's numbers were definitely already accurate enough for the point. Basically, there's no possible way Netflix needs that much electricity to serve their users.
I checked them, Adolf. The numbers are accurate.
o7 doing the lord's work in the comments
I prefer to think that this post is unrealistically optimistic. If you drive an electric car and live in Quebec, this could very well be true. For reference, Quebec's electric grid is just about 100% hydroelectric power, so your driving emissions would be close to 0.
I only looked at power consumption, not emissions. If the electricity produced is emissions-free, then the emissions for both driving and streaming would be zero, so the original statement would be true, but meaningless. But let's compare the energy consumption with an EV: at 15 kWh/100 km (4.14 mi/kWh), the EV would need 15 kWh/100 km × 6.44 km = 0.966 kWh for 4 miles. That still leaves us with a power draw of 1.932 kW. That is closer to realistic, but I still don't think the power consumption of streaming is that high.
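The EV version of the calculation, using the same assumed figures (15 kWh/100 km consumption, 6.44 km for the 4-mile trip):

```python
# Energy for a 4-mile EV trip, and the server draw it would imply
# if half an hour of streaming really matched that trip.
ev_consumption_kwh_per_100km = 15.0
trip_km = 6.44  # 4 miles

trip_energy_kwh = ev_consumption_kwh_per_100km / 100 * trip_km  # ≈ 0.966 kWh
implied_draw_kw = trip_energy_kwh / 0.5                         # ≈ 1.93 kW
print(round(trip_energy_kwh, 3), round(implied_draw_kw, 3))
```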
Streaming also doesn't emit microplastics all over the road via tyre wear. Streaming doesn't emit brake dust. Streaming doesn't require paving vast quantities of land in tarmac.
"closer to realistic" - technically, but 1 kW is just so much power, I find it hard to imagine.
Say I was streaming from my own home server instead (about 20 W, and it could serve more than just one user), over a gigabit Ethernet switch (also about 20 W, which could serve 4K streams to 50 users, but let's say it's just me). Then I would use 0.04 kW of electricity for streaming. Maybe I'm streaming from my gaming PC instead (0.1 kW idle) and have a large inefficient monitor (another 0.1 kW). Then it sums up to 0.24 kW. We're still not close to 1 kW, and I'm out of ideas.
Granted, you'll have many more switches because this is the internet. But those won't serve just a single user so the power per user is much smaller too. And netflix servers will use more power, but they are also much better optimized for streaming than my home server, and not 90% idle, shared by many users.
And what would you do if you weren't streaming? Would you turn off your gaming PC and monitor? If not, we can't really fully count their consumption. Maybe... ah, I've got it! You're boiling water for coffee at the same time. Yes, that would be 1kW. All the time, while streaming, one cup of water after the other non-stop.
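Tallying the hypothetical home setup from the comment above (all wattages are that comment's guesses, not measurements):

```python
# Worst-case home streaming setup: every device counted at once.
setup_w = {
    "home server": 20,
    "gigabit switch": 20,
    "gaming PC (idle)": 100,
    "large monitor": 100,
}
total_kw = sum(setup_w.values()) / 1000
print(total_kw)  # 0.24 kW — still nowhere near 1 kW
```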
I'm not saying their numbers are correct, but you are missing: routers (four minimum: Netflix's data center, the backbone ISP, your local ISP, and your house), the TV, for many a streaming device (which can range from the TV itself to a PS5 or gaming PC), and for many a soundbar or an amp and speakers.
They probably took the max load of all those devices and lumped it together. Yeah, max load isn't right, and the routers should actually be split among many, many houses, but it is all part of streaming.
Reminds me of when they used to calculate financial losses from a hack: they would add in the full cost of any hardware touched, and the full price to develop any of the software touched...
ending up at dozens of millions of dollars, just because someone looked at a thing.
Like if you spray-painted a wall on a building and they charged you with the entire cost of constructing the whole structure.
I'd also say manufacturing the devices probably roughly doubles the carbon footprint (same with the car, but we're trying every trick in the book to figure out where the figure came from).
Don't forget that the grids that power these servers are mixed too, not 100% fossil fuels. And even if they were coal-fired, power generation is more efficient than internal combustion engines.
Likely it'd have to be at LEAST 30-40 kW to serve a single person for it to be equivalent, but I can't be arsed to do the math.
This person maths.
total mathhead
probably has a math lab
Trying the closest-to-best scenario I can think of for the tweet to be correct:
4 miles is about 6.5 km (rounded up).
A Ford Fiesta uses about 6 litres per 100 km (tiny car, also rounded down), so the trip takes about 0.39 L.
0.39 L of gasoline is about 3.5 kWh (rounded down).
The next step is to apply a beloved trick: an engine only passes around 1/5 of the gasoline's energy on as useful work, so that number can be used to make the tweet more plausible. We get 0.7 kWh.
Half an hour would give us 0.35 kWh.
A beefy gaming PC uses around 400 W (my gaming PC uses less) when doing light tasks, so that gives around 0.2 kWh.
Since I love drinking tea, that leaves me 0.15 kWh for a whole litre of tea to chug down every 30 minutes.
So over my average binge session I would have to consume around 12 litres of tea for the tweet to be viable.
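A rough check on whether the tea actually fits the leftover budget (assuming tap water at about 20 °C heated to boiling; real kettles lose some heat, which eats most of the remaining margin):

```python
# Ideal energy to boil one litre of water from 20 °C to 100 °C.
SPECIFIC_HEAT_KJ_PER_KG_K = 4.186           # specific heat of water
kwh_per_litre = SPECIFIC_HEAT_KJ_PER_KG_K * 80 / 3600  # ≈ 0.093 kWh

budget_kwh = 0.15                            # leftover per 30-minute interval
print(round(kwh_per_litre, 3))               # 0.093
print(kwh_per_litre <= budget_kwh)           # True: one litre per half hour fits
# 12 litres works out to one litre every 30 minutes over a ~6-hour binge.
```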
No gaming PC should use 400W unless it's under heavy load.
Yeah, you're right. I tried my best to make the numbers work and still couldn't reach the bullshit tweet
Heh, just did the same but with CO2 emissions. And even considering those, the numbers were wildly off: about 2 days of constant streaming (nearly 48 hours!) equates to a standard gas car's 4-mile drive in emissions.
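A sketch of that comparison; the ~25 g CO2 per hour streaming figure below is an illustrative assumption back-solved to roughly match this comment's result, not a sourced number:

```python
# How long you'd have to stream to match one short drive's CO2.
GAS_CO2_KG_PER_L = 2.3                  # rough CO2 from burning 1 L of gasoline
drive_co2_kg = 0.5 * GAS_CO2_KG_PER_L   # 4-mile drive ≈ 1.15 kg CO2

streaming_co2_kg_per_h = 0.025          # assumed ~25 g CO2 per streaming hour
hours_to_match = drive_co2_kg / streaming_co2_kg_per_h
print(round(hours_to_match))            # ≈ 46 hours, i.e. nearly 2 days
```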