this post was submitted on 30 Nov 2025
305 points (97.2% liked)

[–] Sv443@sh.itjust.works 75 points 16 hours ago (6 children)

Highly doubt it's worth it in the long run due to electricity costs alone

[–] brucethemoose@lemmy.world 66 points 16 hours ago (1 children)

Depends.

Toss the GPU/wifi, disable audio, throttle the processor a ton, and set the OS to power saving, and old PCs can be shockingly efficient.
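
On Linux, most of the "set the OS to power saving" half of that is just the cpufreq governor. A minimal sketch, assuming a Linux install with the sysfs cpufreq interface and root access; the frequency cap is a made-up example, pick whatever your CPU actually supports:

```python
# Minimal sketch (assumption: Linux host with cpufreq sysfs, run as root).
# Switches every core to the "powersave" governor and caps the max frequency,
# which is the software side of the throttling described above.
import glob

CAP_KHZ = 1_600_000  # hypothetical 1.6 GHz ceiling (values are in kHz)

for policy in glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq"):
    # switch this core to the power-saving governor
    with open(f"{policy}/scaling_governor", "w") as f:
        f.write("powersave")
    # clamp the core's maximum frequency
    with open(f"{policy}/scaling_max_freq", "w") as f:
        f.write(str(CAP_KHZ))
```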

[–] cmnybo@discuss.tchncs.de 19 points 16 hours ago (1 children)

You can slow the RAM down too. You don't need XMP enabled if you're just using the PC as a NAS, and XMP can be quite power hungry.

[–] brucethemoose@lemmy.world 10 points 15 hours ago (1 children)

Eh, older RAM doesn't use much. If the XMP profile runs close to stock voltage, maybe just set it to stock voltage and bump the speed down a notch rather than all the way down; the performance you keep gives you a nice task-energy win.
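
The "task energy" argument is just power × time: a config that pulls a couple more watts but finishes the job sooner can come out ahead over a fixed window, because the box spends more of that window at idle. A rough sketch with made-up numbers:

```python
# Illustration of the task-energy trade-off; all wattages and times are invented.
def window_energy_wh(active_w, idle_w, task_s, window_s=3600):
    """Energy (Wh) over one window: active power while working, idle power the rest."""
    return (active_w * task_s + idle_w * (window_s - task_s)) / 3600

slow = window_energy_wh(active_w=30, idle_w=12, task_s=1800)  # slowest JEDEC speed
fast = window_energy_wh(active_w=33, idle_w=12, task_s=1200)  # near-XMP speed

print(f"slow RAM: {slow:.1f} Wh/hour, fast RAM: {fast:.1f} Wh/hour")
# the slightly hungrier but faster config uses less energy per hour overall
```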

[–] fuckwit_mcbumcrumble@lemmy.dbzer0.com 8 points 14 hours ago (1 children)

There was a post a while back of someone trying to eke every single watt out of their computer. Disabling XMP and running the RAM at the slowest speed possible saved like 3 watts, I think. An impressive savings, but at the cost of HORRIBLE CPU performance. And you do actually need at least a little bit of grunt for a NAS.

At work we have some of those Atom-based NASes, and the combination of a weak CPU and horrendous single-channel RAM speeds makes them absolutely crawl. A single HDD on its own performs the same as the whole RAID 10 array.

[–] brucethemoose@lemmy.world 1 points 14 hours ago (1 children)

Yeah.

In general, 'big' CPUs have an advantage because they can run at much, much lower clock speeds than Atoms, yet still be way faster. There are a few exceptions, like Ryzen 3000+ (excluding APUs), which idle notoriously hot thanks to the multi-die setup.

[–] ag10n@lemmy.world 1 points 11 hours ago (1 children)

Peripherals and I/O will do that: the cores pull 5-6W while the I/O die pulls 6-10W.

https://www.techpowerup.com/review/amd-ryzen-7-5700x/18.html

[–] brucethemoose@lemmy.world 1 points 9 hours ago* (last edited 9 hours ago)

Same with auto overclocking mobos.

My ASRock sets VSoC to a silly-high voltage with EXPO. Set that back down (and fiddle with some other settings/disable the IGP if you can), and it does help a ton.

...But I think AMD's MCM chips just idle hotter. My older 4800HS draws dramatically less, even with the IGP on.

[–] Damage@feddit.it 16 points 16 hours ago (2 children)

So I did this, using a Ryzen 3600. With some light tweaking the base system burns about 40-50W idle. The drives add a lot, 5-10W each, but they would go into any NAS system, so that's irrelevant. I had to add a GPU because the motherboard I had wouldn't POST without one, so that increases the power draw a little, but it's also necessary for proper Jellyfin transcoding. I recently swapped the GPU for an Intel Arc A310.

By comparison, the previous system I used for this had a low-power, fanless Intel Celeron; with a single drive and two SSDs it drew about 30W.

[–] lectricleopard@lemmy.world 9 points 14 hours ago (1 children)

OK, I'm glad I'm not the only one who wants a responsive machine for video streaming.

I ran a Pi 400 with Plex for a while. I don't care to save 20W if it means waiting for the machine to respond after every little scrub of the timeline. I want to have a better experience than Netflix. That's the point.

[–] Damage@feddit.it 4 points 14 hours ago

Eh, TBH I'd like to consume less power, but a 30-40W difference isn't going to ruin me or the planet; I've got a rather efficient home all in all.

[–] YerbaYerba@lemmy.zip 3 points 12 hours ago

I have a 3600 in a NAS and it idles at 25W. My mobo luckily runs fine without a GPU; I pulled it out after the initial install.

[–] leftascenter@jlai.lu 13 points 16 hours ago (1 children)

A desktop under low usage wouldn't consume much more than a NAS, as long as you drop the video card (which wouldn't be doing anything anyway).

Take only that extra and you probably have a few years of usage before the additional electricity costs overrun the cost of a NAS. Where I live that's around 5 years for an estimated extra 10W.
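
The break-even arithmetic is easy to rerun with your own numbers: extra watts → kWh per year → cost per year versus the price of the appliance. A sketch where the electricity rate, extra wattage, and NAS price are all placeholders:

```python
# Back-of-the-envelope break-even: how long until the reused desktop's extra
# electricity outruns the price of an appliance NAS. All numbers are placeholders.
EXTRA_WATTS = 10       # estimated extra draw of the desktop vs. a purpose-built NAS
PRICE_PER_KWH = 0.35   # local electricity rate
NAS_COST = 150         # price of the entry-level appliance you'd otherwise buy

extra_kwh_per_year = EXTRA_WATTS * 24 * 365 / 1000        # ~87.6 kWh/year
extra_cost_per_year = extra_kwh_per_year * PRICE_PER_KWH  # ~31/year at these rates
break_even_years = NAS_COST / extra_cost_per_year

print(f"~{break_even_years:.1f} years before the extra power costs more than the NAS")
```

With these placeholder figures it lands around 5 years, in line with the estimate above.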

[–] Damage@feddit.it -1 points 14 hours ago (1 children)

as long as you drop the video card

As I wrote below, some motherboards won't POST without a GPU.

Take only that extra and you probably have a few years of usage before the additional electricity costs overrun the cost of a NAS. Where I live that's around 5 years for an estimated extra 10W.

Yeah, and what's more, if one of those appliance-like NASes breaks down, how do you fix it? With a normal PC you just swap out the defective part.

Most modern boards will POST without one, though. Also, there's integrated graphics on basically every single current CPU; only AMD on AM4 held out on iGPUs for so damn long.

[–] ImgurRefugee114@reddthat.com 4 points 16 hours ago* (last edited 16 hours ago) (2 children)

I'm still running a 480 that doubles as a space heater (I'm not even joking; I increase the load based on ambient temps during winter)

[–] thatKamGuy@sh.itjust.works 3 points 15 hours ago (1 children)

I am assuming that’s a GTX 480 and not an RX 480; if so - kudos for not having that thing melt the solder off the heatsink by now! 😅

The GTX 480's power draw is actually tame by modern standards. If Nvidia could have made a cooler that handled 600 watts in 2010, you can bet your sweet ass that GPU would have used a lot more power.

Well, that, and if 1000-watt power supplies had been common back then.

[–] 9point6@lemmy.world 3 points 15 hours ago

If they're gonna buy a NAS anyway, how many years to break even?

[–] EncryptKeeper@lemmy.world 1 points 9 hours ago

I have an old Intel 1440 desktop that runs 24/7 hooked up to a UPS, along with a Beelink mini PC, my router, and a PoE switch, and the UPS is reporting a combined 100W.