[–] brucethemoose@lemmy.world 1 points 3 days ago* (last edited 3 days ago) (1 children)

> Thanks for the ideas! Hopefully I can push the graphics up without turning into a pile of lava. I need to figure out how to record graphics power consumption so I can reference it when evaluating changes.

It's far more efficient to just TDP-limit your GPU than to lower graphics settings to try to get power consumption (and laptop fan speed) down. The GPU will settle at slightly lower clocks, which is disproportionately effective, since lower clocks also allow lower voltage, and dynamic power scales roughly with the square of voltage.
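The rough math (a back-of-envelope sketch using the standard dynamic-power approximation; real chips also leak static power, so treat the numbers as illustrative):

```latex
P_{dyn} \approx C \, V^2 \, f
% Example: a TDP cap that drops clocks ~10% and lets voltage fall ~10%:
% P_{new}/P_{old} \approx 0.9^2 \times 0.9 \approx 0.73
% i.e. roughly 27% less power for only ~10% less clock speed.
```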

Otherwise it will always try to boost to 100W anyway.

You can do this easily with MSI Afterburner, or in Windows with just the command line. For example, `nvidia-smi -pl 80` sets the power limit to 80 W (until you restart your PC), and `nvidia-smi` on its own shows the current status, including power draw and the active limit.
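For instance, from an elevated PowerShell window (a minimal sketch; changing the limit needs administrator rights, and the driver enforces the card's own min/max range):

```
# show current power draw and the allowed power-limit range
nvidia-smi -q -d POWER

# cap board power at 80 W (reverts on reboot)
nvidia-smi -pl 80
```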

I do this with my 3090, and dropping from the default 420W to 300W hardly drops performance at all without changing a single graphics setting.

Alternatively, you can hard-cap the clocks to your GPU's "efficient" range. For my 3090 that's somewhere around 1500-1700 MHz. TBH I do this more often, since it wastes less power on the GPU clocking up to uselessly inefficient voltages, but still lets it "power up" for really intense workloads.
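This also works from the command line on recent NVIDIA cards (a sketch using the clock-lock flags; the 1500-1700 MHz band is just my 3090's sweet spot, so substitute your own card's efficient range):

```
# lock the core clock into an efficient band (values in MHz; reverts on reboot)
nvidia-smi -lgc 1500,1700

# remove the lock
nvidia-smi -rgc
```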

FYI you can do something similar with the CPU too, though it depends on the model and platform.
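On the Windows side, one example is the stock power-plan knob (a sketch from PowerShell; some laptop firmware and OEM tools override it, and the percentage maps to clocks differently on every CPU):

```
# cap the CPU at ~80% of its maximum state in the active power plan (on AC power)
powercfg /setacvalueindex SCHEME_CURRENT SUB_PROCESSOR PROCTHROTTLEMAX 80
powercfg /setactive SCHEME_CURRENT
```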

[–] GreenCrunch@lemmy.today 2 points 3 days ago

Thank you very much, kind graphics wizard. I will put this knowledge to good use saving my ears from that fan. This is exactly what I was looking for!