OK. Science time. Somewhat arbitrary values are used; the point is that there's an amortization calculation here, and you'll need to run it yourself with accurate input values.
A PC drawing 100 W 24/7 uses ~877 kWh/year; at $0.15/kWh that's $131.49 per year.
A NAS drawing 25 W 24/7 uses ~219 kWh/year; at $0.15/kWh that's $32.87 per year.
So, in this hypothetical case you "save" about $100/year on power costs running the NAS.
Assuming a capacity-equivalent NAS costs $1,200, you're better off using the PC you already have rather than buying a NAS for about the first 12 years; after that, the NAS has paid for itself.
This ignores that the heat generated by the devices is desirable in winter, so the higher-heat-output option has additional utility.
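To make the arithmetic above reproducible, here's a minimal Python sketch of the payback calculation. All inputs (100 W PC, 25 W NAS, $0.15/kWh, $1,200 NAS price) are the hypothetical values from the comment, not measurements.

```python
# Back-of-the-envelope amortization using the comment's hypothetical values.
HOURS_PER_YEAR = 24 * 365.25  # 8766 h

def annual_cost(watts: float, price_per_kwh: float) -> float:
    """Yearly electricity cost of a device drawing `watts` continuously."""
    return watts / 1000 * HOURS_PER_YEAR * price_per_kwh

pc_cost = annual_cost(100, 0.15)   # ~$131.49/year
nas_cost = annual_cost(25, 0.15)   # ~$32.87/year
savings = pc_cost - nas_cost       # ~$98.62/year saved by the NAS

payback_years = 1200 / savings     # ~12.2 years until the NAS breaks even
print(f"PC: ${pc_cost:.2f}/yr, NAS: ${nas_cost:.2f}/yr, "
      f"payback: {payback_years:.1f} years")
```

Swap in your own wattages and local tariff to get a payback period that actually applies to you.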
Either you already have drives and could use them in a new NAS or you would have to buy them regardless and shouldn’t include them in the NAS price.
Eight drives could go into most computers, I think. Even a 6-bay NAS can be quite expensive.
https://a.co/d/jcUR3yV
That's not a NAS, that's a whole-ass PC, though with only 8 GB of RAM. And way overpriced for the spec.
I bought a two-bay Synology for $270 and a 20 TB HDD for $260. I did this for multiple reasons: the HDD was on sale, so I bought it and kept buying things. Also, I couldn't be buggered to learn everything necessary to set up a homemade NAS. Also also, I didn't have an old PC; my current PC is a Ship of Theseus that I originally bought in 2006.
You're not wrong that a NAS equivalent to my current PC's specs/capacity would be more expensive. And yes, I did spend $500+ on my NAS. And yet I also saved several days' worth of study, research, and trial and error by not building my own.
That being said, reducing e-waste by converting old PCs into Jellyfin/Plex streaming machines, NAS devices, or personal servers is a really good idea.
In the UK the calculus is quite different: electricity runs about £0.25/kWh, over double the $0.15 used above.
Also, an empty Synology 4-bay NAS can be had second-hand for around £200. Good enough if you only need file hosting. Mine draws about 10 W, compared to an old Optiplex that draws around 60 W.
With that math, using the NAS saves you 1.25 pence per hour (a 50 W difference at £0.25/kWh), so the NAS pays for itself in roughly two years.
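Checking the UK numbers the same way; the 60 W Optiplex, 10 W NAS, £0.25/kWh, and £200 price all come from the comment above:

```python
# UK payback check: 60 W old Optiplex vs 10 W second-hand Synology.
saved_watts = 60 - 10                                        # 50 W difference
pence_per_hour = saved_watts / 1000 * 25                     # 1.25 p/h saved
savings_per_year = saved_watts / 1000 * 24 * 365.25 * 0.25   # ~£109.58/year
payback_years = 200 / savings_per_year                       # ~1.8 years
print(f"{pence_per_hour:.2f} p/h, payback {payback_years:.1f} years")
```

So the quoted "1.25 pence per hour" and "around 2 years" both check out.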
My gaming PC runs at around 50 W idle and only draws a lot of power when it's actually being used for something. It would be more accurate to treat a PC as drawing roughly 1.75x the power of a NAS, but then also account for the cost of buying the NAS. I'd say a NAS would probably take 2-4 years to pay off, depending on regional power prices.
But the heat is a negative in the summer. So local climate might tip the scales one way or the other.
... 100 W? Isn't that from a really bygone era? CPUs of the past decade can idle at next to nothing (there isn't much difference between an idling i7/i9 and a Pentium from the same era/family).
Or are we talking about ARM? (Sorry, I don't know much about those.)
All devices in the computer consume power, with the CPU being the largest consumer in this context. Older processors usually don't throttle as aggressively as modern ones in low-power scenarios.
Similarly, the performance per watt of newer processors is far higher in comparison, meaning they can run the same workload at much lower power levels.
I got a Kill-A-Watt-like device and measured my old PC at around 110 W. PC specs: i5-6600, 16 GB DDR4 RAM, GTX 1060 3 GB, 1x 2 TB HDD, 1x 250 GB SATA SSD, 1x 1 TB M.2 SSD.
I think we need to qualify "idling": my NAS runs BitTorrent with thousands of torrents, so it's never really idle; it just isn't always doing intensive processing such as transcoding.
In the fall/winter in northern areas it's free! (That's money that would already be spent on heating.)
Summer is a negative, though, as the air conditioning needs to keep up. But the additional cost is roughly a third of the heat output for most ACs (moving 100 W of heat takes under ~35 W of compressor power, i.e. a COP around 3).
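The seasonal effect can be folded into the energy figure: waste heat is roughly free in heating season (exactly free if you heat with resistance heating) and costs an extra ~1/3 in AC power during cooling season. A sketch under those assumptions; the 4/4/4-month split and COP of 3 are illustrative, not from the thread:

```python
# Seasonal adjustment sketch. Assumptions: resistive heating in winter
# (device heat fully offsets the heating bill), AC with COP ~3 in summer,
# and an illustrative 4 heating / 4 cooling / 4 neutral month split.
def effective_annual_kwh(watts: float, heating_months: int = 4,
                         cooling_months: int = 4, cop: float = 3.0) -> float:
    """Net effective energy use after crediting winter heat and
    charging the extra AC work needed to remove summer heat."""
    monthly_kwh = watts / 1000 * 24 * 30.44   # avg hours per month
    neutral_months = 12 - heating_months - cooling_months
    total = 0.0
    total += 0.0 * heating_months * monthly_kwh              # heat offsets heating
    total += (1 + 1 / cop) * cooling_months * monthly_kwh    # pay to remove heat
    total += neutral_months * monthly_kwh                    # plain consumption
    return total

# A 100 W device nominally uses ~877 kWh/year; with this split the
# effective figure drops to ~682 kWh/year.
print(effective_annual_kwh(100))
```

With no heating or cooling months the function reduces to the plain 24/7 figure, so it degenerates cleanly for temperate climates.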