this post was submitted on 15 Jun 2025
640 points (99.5% liked)

PC Gaming


Well I am shocked, SHOCKED I say! Well, not that shocked.

[–] coacoamelky@lemm.ee 143 points 2 days ago (4 children)

The good games don't need a high end GPU.

[–] overload@sopuli.xyz 25 points 2 days ago

Absolutely. Truly creative games are made by smaller dev teams that aren't forcing ray tracing and lifelike graphics. The new Indiana Jones game isn't a GPU seller, and it's the only game where I've personally had poor performance with my 3070 Ti at 1440p.

[–] Railcar8095@lemm.ee 24 points 2 days ago

Terraria minimum specs: "don't worry bro"

[–] GrindingGears@lemmy.ca 9 points 2 days ago (1 children)

Problem is preordering has been normalized, as has releasing games in pre-alpha state.

[–] JordanZ@lemmy.world 104 points 2 days ago (11 children)

When did it become expected that everybody would upgrade GPUs every year, and that this is supposed to be normal? I don't understand people upgrading phones every year either. Both are high cost for minimal gains between generations. You really need 3+ years for any meaningful gains, especially over the last few years.

[–] vividspecter@lemm.ee 55 points 2 days ago (5 children)

It doesn't help that the gains have been smaller, and the prices higher.

I've got a RX 6800 I bought in 2020, and nothing but the 5090 is a significant upgrade, and I'm sure as fuck not paying that kind of money for a video card.

[–] ByteJunk@lemmy.world 11 points 2 days ago

I'm in the same boat.

In general, there's just no way I could ever justify buying an Nvidia card in terms of bang for the buck; it's absolutely ridiculous.

I'll fork over four digits for a graphics card when salaries go up by a digit as well.

[–] AndyMFK@lemmy.dbzer0.com 9 points 2 days ago

I just picked up a used RX 6800 XT after doing some research and comparing prices.

The fact that a gpu this old can outperform or match most newer cards at a fraction of the price is insane, but I'm very happy with my purchase. Solid upgrade from my 1070 Ti

[–] GrindingGears@lemmy.ca 8 points 2 days ago

Not to mention the cards have gotten huge and you just about need a nuclear reactor to power them. Melting cables and all.

[–] missingno@fedia.io 15 points 2 days ago (1 children)

I don't think they're actually expecting anyone to upgrade annually. But there's always someone due for an upgrade, however long it's been for them. You can compare what percentage of users upgraded this year to previous years.

[–] LostXOR@fedia.io 56 points 2 days ago (4 children)

For the price of one 5090 you could build 2-3 midrange gaming PCs lol. It's crazy that anyone would even consider buying it unless they're rich or actually need it for something important.

[–] endeavor@sopuli.xyz 24 points 2 days ago (12 children)

And still have your house burn down due to it just being a 2080 that has 9.8 jiggawats pushed into it.

There isn't a single reason to get any of the 5 series, IMO; they don't offer anything. And I say that as a 3D artist for games.

Edit: never mind, I remember some idiots got roped into 4K for gaming and are now paying the price, like marketing wanted them to.

[–] LordWiggle@lemmy.world 10 points 2 days ago (2 children)

unless they're rich or actually need it for something important

Fucking youtubers and crypto miners.

[–] bluesheep@lemm.ee 50 points 2 days ago (1 children)

Paying Bills Takes Priority Over Chasing NVIDIA’s RTX 5090

Yeah no shit, what a weird fucking take

[–] SharkAttak@kbin.melroy.org 16 points 2 days ago

But why spend to ""eat food"" when you can have RAYTRACING!!!2

[–] simple@piefed.social 47 points 2 days ago (2 children)

Unfortunately gamers aren't the real target audience for new GPUs, it's AI bros. Even if nobody buys a 4090/5090 for gaming, they're always out of stock as LLM enthusiasts and small companies use them for AI.

[–] imetators@lemmy.dbzer0.com 8 points 1 day ago

Ex-fucking-actly!

Hahaha, gamers are skipping. Yeah, they are. And yet the 5090 is still somehow out of stock, no matter the price or the state of gaming. We all know major tech went the AI direction, disregarding whether the average Joe wants AI or not. The prices are not for gamers. The prices are for whales, AI companies, and enthusiasts.

[–] sp3ctr4l@lemmy.dbzer0.com 45 points 2 days ago* (last edited 2 days ago) (5 children)

In the US, a new RTX 5090 currently costs $2899 at NewEgg, and has a max power draw of 575 watts.

(Lowest price I can find)

... That is a GPU, with roughly the cost and power usage of an entire, quite high end, gaming PC from 5 years ago... or even just a reasonably high end PC from right now.

...

The entire move to the realtime raytracing paradigm, which has enabled AAA game devs to get very sloppy with development by not really bothering to optimize lighting or textures, and which in turn necessitated the invention of intelligent temporal frame upscaling and frame generation... the whole, originally advertised point of all this was to make high-fidelity 4K gaming an affordable reality.

This reality is a farce.

...

Meanwhile, if you jump down to 1440p, well, I've got a future build plan sitting in a NewEgg wishlist right now.

RX 9070 (220 W) + Minisforum BD795i SE (mobo + non-removable, high-end AMD laptop CPU with performance comparable to a 9900X at about half the wattage draw)... so far my pretax total for the whole build is under $1500, and, while I need to double- and triple-check this, I think the math on the power draw works out to a 650 W power supply being all you'd need... potentially with enough wattage headroom left over to also add some extra internal HDD storage drives.

If you want to go a bit over the $1500 mark, you could fit this all in a console sized ITX case.

That is almost half the cost of the RTX 5090 alone, and it will get you over 90 fps in almost all modern games at ultra settings at 1440p, though you will have to futz around with intelligent upscaling and frame gen if you want realtime raytracing at similar framerates, and realistically you'll probably wait another quarter or two for AMD driver support and FSR 4 to mature and be properly implemented in said games.

Or you could swap in maybe a 5070 (non-Ti; the Ti is $1000 more) Nvidia card, but seeing as I'm building a Linux gaming PC, you know, for the performance boost from not running Windows, AMD's Mesa drivers are where you want to be.
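For what it's worth, the 650 W sanity check in that build plan can be sketched out like this; every wattage figure below is a ballpark assumption (the mobile CPU draw especially), not a measured spec:

```python
# Back-of-envelope PSU check for the 1440p build above.
# All wattages are rough assumptions, not measured figures.
components = {
    "RX 9070 (board power)": 220,
    "mobile Ryzen CPU on BD795i SE": 120,  # assumed: ~half a desktop 9900X
    "motherboard + RAM": 50,
    "NVMe SSD": 10,
    "fans, USB, misc": 20,
}

total = sum(components.values())  # estimated steady-state draw in watts
psu_watts = 650
margin = 1.2                      # ~20% cushion for transient power spikes
headroom = psu_watts - total * margin

print(f"estimated draw: {total} W")
print(f"{psu_watts} W PSU leaves {headroom:.0f} W after a 20% spike cushion")
```

With these guesses the 650 W unit comes out comfortably ahead, matching the comment's conclusion, but real transient behavior varies by card, so treat this as a sanity check only.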

[–] CheeseNoodle@lemmy.world 20 points 2 days ago (5 children)

Saved up for a couple of years and built the best (consumer grade) non nvidia PC I could, 9070XT, 9950X3D, 64gig of RAM. Pretty much top end everything that isn't Nvidia or just spamming redundant RAM for no reason. The whole thing still costs less than a single RTX 5090 and on average draws less power too.

[–] sp3ctr4l@lemmy.dbzer0.com 9 points 2 days ago* (last edited 2 days ago) (1 children)

Yep, that's gonna be significantly more powerful than my planned build... and likely somewhere between $500 and $1000 more expensive... but yep, that's how absurd this is: all of that is still less expensive than an RTX 5090.

I'm guessing you could get all of that to work with a 750 W PSU, 850 W if you also want to have a bunch of storage drives or a lot of cooling, but yeah, you'd only need that full wattage for running raytracing in 4k.

Does that sound about right?

Either way... yeah... imagine an alternate timeline where marketing and industry direction aren't bullshit, where people actually admit things like:

Consoles cannot really do what they claim to do at 4K... at actual 4K.

They use checkerboard upscaling, so they're really running at 2K and scaling up, and it's less than 2K in demanding raytraced games because they're using FSR or DLSS on top of that. Oh, and the base graphics settings are a mix of what PC gamers would call medium and high, but consoles don't show real graphics settings menus, so console gamers don't know that.

Maybe, maybe we could have focused on perfecting frames-per-watt and frames-per-dollar efficiency at 2K, instead of being baffled with marketing BS claiming we can just leapfrog to 4K, and, more recently, being told that 8K displays make any goddamned sense at all, when in 95% of home setups of any kind they offer no physically perceptible gains.

[–] chunes@lemmy.world 30 points 1 day ago (5 children)

I stopped maintaining a AAA-capable rig in 2016. I've been playing indies since and haven't felt left out whatsoever.

[–] Bakkoda@sh.itjust.works 28 points 2 days ago (2 children)

My new gpu was a steam deck.

[–] candyman337@lemmy.world 27 points 1 day ago* (last edited 1 day ago) (2 children)

It's just that I'm not impressed; the raster performance bump at 1440p was just not worth the price jump at all. On top of that they have manufacturing issues and issues with their stupid 12-pin connector? And all the shit on the business side, like not providing drivers to reviewers. Fuuucccckk all that, man. I'm waiting until AMD gets a little better with ray tracing and then switching to team red.

[–] Ulrich@feddit.org 25 points 2 days ago* (last edited 2 days ago)

I think the Steam Deck can offer some perspective. If you look at the top games on SD, it's Baldur's Gate, Elden Ring, Cyberpunk, etc., all games that run REALLY poorly. Gamers don't care that much.

[–] paultimate14@lemmy.world 24 points 2 days ago

I've been waiting for a product that makes sense.

I'm still waiting. I can keep waiting.

[–] graham1@lemmy.world 24 points 2 days ago (2 children)

Colour me surprised

Resumes gaming with a 1000-series card

[–] frenchfryenjoyer@lemmings.world 22 points 1 day ago (5 children)

I have a 3080 and am surviving, lol. Never had an issue.

[–] nik282000@lemmy.ca 9 points 1 day ago

Still running a 1080; between Nvidia and Windows 11, I think I'll stay where I am.

[–] MyOpinion@lemmy.today 19 points 2 days ago

I am tired of being treated like a fool. No more money for them.

[–] Xenny@lemmy.world 18 points 2 days ago (2 children)

Rimworld doesn't need a new gpu

[–] knightly@pawb.social 12 points 2 days ago* (last edited 2 days ago) (1 children)

What it needs is a new multi-threaded engine so I can actually use all these extra cores. XD

[–] Sabata11792@ani.social 9 points 2 days ago (1 children)

Sounds like version 1.6 is supposed to get multithreading.

[–] Blackmist@feddit.uk 16 points 2 days ago (5 children)

Still on a 1060 over here.

Sure, I may have to limit FFXIV to 30fps in summer to stop it crashing, but it still runs.

[–] JackbyDev@programming.dev 16 points 1 day ago (3 children)

Uhhh, I went from a Radeon 1090 (or whatever they're called; it's an older numbering scheme from ~2010) to an Nvidia 780 to an Nvidia 3070 Ti. Skipping upgrades is normal. Console gamers effectively do that as well. It's normal to not buy a GPU every year.

[–] smeg@infosec.pub 15 points 2 days ago

Increasingly, across many markets, companies are not targeting average or median consumers. They're only chasing whales, the people who can pay the premium. They've decided mid-tier customers aren't worth it; just chase the top. It also means less need for customer support.

[–] Alphane_Moon@lemmy.world 14 points 2 days ago (6 children)

It seems like gamers have finally realized that the newest GPUs by NVIDIA and AMD are getting out of reach, as a new survey shows that many of them are skipping upgrades this year.

Data on GPU shipments and/or POS sales showing a decline would be much more reliable than a survey.

Surveys can suffer from respondents reporting what they want to say rather than what they actually do.

[–] MudMan@fedia.io 8 points 2 days ago (4 children)

I mean, as written the headline statement is always true.

I am horrified by some of the other takeaways, though:

Nearly 3 in 4 gamers (73%) would choose NVIDIA if all GPU brands performed equally.

57% of gamers have been blocked from buying a GPU due to price hikes or scalping, and 43% have delayed or canceled purchases due to other life expenses like rent and bills.

Over 1 in 4 gamers (25%) say $500 is their maximum budget for a GPU today.

Nearly 2 in 3 gamers (62%) would switch to cloud gaming full-time if latency were eliminated, and 42% would skip future GPU upgrades entirely if AI upscaling or cloud services met their performance needs.
[–] xep@fedia.io 10 points 2 days ago (1 children)

if latency were eliminated

I'm sure we'd all switch to room temperature fusion for power if we could, too, or use superconductors in our electronics.

[–] gravitas_deficiency@sh.itjust.works 13 points 1 day ago (2 children)

Nvidia doesn't really care about the high-end gamer demographic nearly as much as it used to, because that's no longer its bread and butter. Nvidia's cash cow at this point is supplying hardware for ML data centers. It's an order of magnitude more lucrative than serving the consumer and enthusiast market.

So my next card is probably gonna be an RX 9070XT.

[–] Goodtoknow@lemmy.ca 13 points 2 days ago (5 children)

GTX 1060 6GB still going strong!

[–] t_berium@lemmy.world 12 points 1 day ago

I remember when high-end GPUs were around €500.

[–] Phoenicianpirate@lemm.ee 9 points 1 day ago (2 children)

I bought my most expensive dream machine last year (when the RTX 4090 was still the best) and I am proud of it. I hope it'll be my rig for at least 10 years.

But it was expensive.

[–] LordWiggle@lemmy.world 9 points 2 days ago (9 children)

Fuck Nvidia anyways. #teamred

[–] kevinsbacon@lemmy.today 9 points 1 day ago (4 children)

All I want is more VRAM, it can already play all the games I want.

[–] GaMEChld@lemmy.world 8 points 1 day ago

Don't think I'll be moving on from my 7900XTX for a long while. Quite pleased with it.
