this post was submitted on 30 Nov 2025
307 points (97.2% liked)

Technology

[–] deadbeef79000@lemmy.nz 52 points 14 hours ago (7 children)

OK. Science time. Somewhat arbitrary values used; the point is that there's an amortization calculation, and you'll need to run your own with accurate input values.

A PC drawing 100 W 24/7 uses about 877 kWh/year; at $0.15/kWh that's $131.49 per year.

A NAS drawing 25 W 24/7 uses about 219 kWh/year; at $0.15/kWh that's $32.87 per year.

So, in this hypothetical case you "save" about $100/year on power costs running the NAS.

Assuming a capacity equivalent NAS might cost $1200, you're better off using the PC you already have rather than buying a NAS for roughly 12 years, which is the payback period.


This ignores that the heat generated by the devices is desirable in winter so the higher heat output option has additional utility.
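The amortization math above fits in a few lines of Python. A sketch using the comment's hypothetical values (100 W PC, 25 W NAS, $0.15/kWh, $1200 NAS price); swap in your own numbers:

```python
# Back-of-envelope power-cost amortization. All input values below are
# the hypothetical ones from the comment, not measurements.
HOURS_PER_YEAR = 24 * 365.25  # 8766 h, averaging over leap years

def annual_cost(watts: float, price_per_kwh: float) -> float:
    """Yearly electricity cost of a device drawing `watts` 24/7."""
    return watts / 1000 * HOURS_PER_YEAR * price_per_kwh

def payback_years(device_price: float, old_watts: float,
                  new_watts: float, price_per_kwh: float) -> float:
    """Years until the power savings cover the new device's price."""
    savings = (annual_cost(old_watts, price_per_kwh)
               - annual_cost(new_watts, price_per_kwh))
    return device_price / savings

pc = annual_cost(100, 0.15)                  # ~$131.49/yr
nas = annual_cost(25, 0.15)                  # ~$32.87/yr
years = payback_years(1200, 100, 25, 0.15)   # ~12.2 years
print(f"PC ${pc:.2f}/yr, NAS ${nas:.2f}/yr, payback {years:.1f} yr")
```

The same function reproduces the UK numbers further down the thread if you feed it £0.25/kWh and a £200 second-hand NAS.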

[–] brrt@sh.itjust.works 20 points 13 hours ago (1 children)

> Assuming a capacity equivalent NAS might cost $1200

Either you already have drives and could use them in a new NAS or you would have to buy them regardless and shouldn’t include them in the NAS price.

[–] kbobabob@lemmy.dbzer0.com 1 points 9 hours ago (1 children)

8 drives could fit into most desktop PCs, I think. Even a 6-bay NAS can be quite expensive.

https://a.co/d/jcUR3yV

[–] frongt@lemmy.zip 2 points 8 hours ago* (last edited 8 hours ago)

That's not a NAS, that's a whole-ass PC, though with only 8 GB of RAM. And way overpriced for the spec.

[–] SirSamuel@lemmy.world 13 points 11 hours ago

I bought a two-bay Synology for $270 and a 20 TB HDD for $260. I did this for multiple reasons: the HDD was on sale, so I bought it and kept buying things. Also, I couldn't be buggered to learn everything necessary to set up a homemade NAS. Also also, I didn't have an old PC; my current PC is a Ship of Theseus that I originally bought in 2006.

You're not wrong about an equivalent NAS to my current PC's specs/capacity being more expensive. And yes, I did spend $500+ on my NAS. And yet I also saved several days' worth of study, research, and trial and error by not building my own.

That being said, reducing e-waste by converting old PCs into Jellyfin/Plex streaming machines, NAS devices, or personal servers is a really good idea.

[–] Armand1@lemmy.world 9 points 11 hours ago* (last edited 11 hours ago)

In the UK the calculus is quite different, as electricity runs about £0.25/kWh, over double the rate used above.

Also, an empty Synology 4-bay NAS can be had for around £200 second-hand. Good enough if you only need file hosting. Mine draws about 10 W, compared to an old Optiplex that draws around 60 W.

With that math, using the NAS saves about 1.25 pence per hour (50 W saved at £0.25/kWh), so it pays for itself in roughly 2 years.

[–] Auth@lemmy.world 7 points 12 hours ago

My gaming PC idles at around 50 W and only draws a ton of power when it's actually being used for something. It would be more accurate to say a PC draws about 1.75× the power of a NAS, but then you have to account for the cost of buying the NAS. I'd say a NAS would probably take 2-4 years to pay off, depending on regional power prices.

[–] EndlessNightmare@reddthat.com 4 points 4 hours ago

> This ignores that the heat generated by the devices is desirable in winter so the higher heat output option has additional utility.

But the heat is a negative in the summer. So local climate might tip the scales one way or the other.

[–] Evil_Shrubbery@thelemmy.club 3 points 10 hours ago (3 children)

... 100 W? Isn't that from a really bygone era? CPUs of the past decade can idle at next to nothing (there isn't much difference between an idling i7/i9 and a Pentium from the same era/family).

Or are we talking about ARM? (Sorry, I don't know much about them.)

[–] douglasg14b@lemmy.world 1 points 3 hours ago

All the devices in a computer consume power, the CPU being the largest in this context. Older processors usually don't throttle as aggressively as modern ones in low-power scenarios.

Similarly, the performance per watt of newer processors is incredibly high in comparison, meaning they can operate at much lower power levels while running the same workload.

[–] imetators@lemmy.dbzer0.com 1 points 4 hours ago

I've got a Kill-A-Watt-like device and measured my old PC at around 110 W. PC specs: i5-6600, 16 GB DDR4 RAM, GTX 1060 3 GB, 1× 2 TB HDD, 1× 250 GB SATA SSD, 1× 1 TB M.2 SSD.

[–] Damage@feddit.it 1 points 3 hours ago

I think we need to qualify "idling": my NAS runs BitTorrent with thousands of torrents, so it's never really idle; it just isn't always doing intensive processing such as transcoding.

[–] douglasg14b@lemmy.world 1 points 3 hours ago

In the fall/winter in northern areas it's free! (Money that would already be spent on heating).

Summer is a negative though, as the air conditioning has to keep up. But the additional cost is only about a third of the heat output for most ACs (moving 100 W of heat takes under ~30 W of compressor power).
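That seasonal adjustment can be sketched too. A hedged example, assuming resistive electric heating in winter (so the computer's heat displaces heater watts one-for-one) and a summer AC with a coefficient of performance of 3; both assumptions are mine, not from the thread:

```python
# Hypothetical sketch: how the season changes a device's effective
# electrical load. Assumes resistive electric heating (winter heat is
# "free") and an AC with COP 3 (pumping out 100 W of heat costs ~33 W).
AC_COP = 3.0  # typical window/split unit; an assumption, not measured

def effective_watts(device_watts: float, season: str) -> float:
    """Net extra load the device adds at the meter in a given season."""
    if season == "winter":
        # Heat offsets resistive heating one-for-one: net cost ~0.
        return 0.0
    if season == "summer":
        # Device draw plus the AC power needed to pump its heat back out.
        return device_watts * (1 + 1 / AC_COP)
    return device_watts  # spring/fall: no heating or cooling interaction

print(effective_watts(100, "winter"))  # 0.0
print(effective_watts(100, "summer"))  # ~133.3
print(effective_watts(100, "fall"))    # 100
```

With a heat pump instead of resistive heating, winter heat from the PC would no longer be fully free, since the heat pump could have delivered the same warmth for fewer watts.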