I would help out their GPU sales if they were ever in stock.
I had luck with Microcenter last week (if you have one near you); I checked their website at my preferred location, saw they had 9070 XTs in stock, went after work, and got one.
I have one about an hour away and no luck so far at that location.
Edit: oh damn, they are in stock today!
Edit2: it was just the one, and now it's gone :(
I got mine on a Thursday, FWIW. The employees didn't even know they had them in stock; they had to pull it out of a case in the back. Must have been an afternoon truck delivery.
Also, it is goddamned wonderful as a GPU 😍 I've been replaying Far Cry 6 with maxed out... everything, at 4K, and I've got roughly 100 FPS stable. It's absolutely gorgeous.
If you really wanna get one and it's in stock, reserve it for pickup; they'll hold it for about three days, and you don't have to pay until you actually go get it, in case you change your mind or can't make it in that time.
It's a huge gamble for manufacturers to order a large allocation of wafers a year in advance of actual retail sales. The market can shift considerably in that time. They probably didn't expect Nvidia to shit the bed so badly.
Wait, are you saying if they had more product, they could sell more product?
Sounds like voodoo economics to me!
The last time they had plenty of stock and cards people wanted to buy at the same time was the RX 200 series. They sold lots of cards, but part of the reason people wanted them was that they were priced fairly low, on thin margins, so they didn't make a huge amount of money; enough to help subsidise their CPU division while it was making a loss, but not much more.
Shortly after that generation launched, Litecoin ASIC mining hardware became available, so the used market was suddenly flooded with current-generation cards, and it made little sense to buy a new one at RRP. Towards the end of the generation the cards were being sold new at a loss just to make space. That meant AMD needed the next generation to convince people to buy again, but since it was just a refresh (basically the same GPUs clocked higher and with lower model numbers, with only the top-end Fury being new silicon), it was hard to sell 300-series cards when they cost more than the equivalent 200-series ones.
That meant they had less money than they wanted to develop Polaris and Vega, so both ended up delayed. Polaris sold okay, but it was only available as low-margin, low-end cards, so it didn't make a huge amount of money. Vega was delayed so long that Nvidia got an entire extra generation out, so AMD's GTX 980 competitor ended up being an ineffective GTX 1070 competitor and had to be sold for much less than planned; again, not much money.
That problem compounded for years until Nvidia ran into their own problems recently.
It's not unreasonable to claim that AMD graphics cards being in stock at the wrong time caused them a decade of problems.
I've always been an AMD CPU guy; my first PC had an AMD GPU, then I moved away to Nvidia, but due to costs I've now moved back to AMD and I've got zero complaints.
Hell, they finally got me too... They can thank Intel for royally fucking up 13th/14th gen and then releasing a new chip that wasn't enough of an improvement over the previous ones to warrant the price.
I always built Intel PCs out of habit mostly, but I just got a 9800x3d last week for my rebuild.
Been AMD for years, but I went Intel for a media server due to the encoder and better idle power. I wish AMD would improve their video encoding.
My Ryzen 7 3700X is several years old now. It was a little finicky about what memory it liked, but since working that out it's been great. No complaints. I expect this system to last me at least another 5 years.
I upgraded to a 5700X from a 3600 this year to take advantage of some sales, no regrets. Wish I had the spare cash for a 9070XT, maybe next gen.
3800x here, I've been very happy with it. I don't see a need to upgrade. My 2070S, however... struggles to keep up with my ultrawide monitor when playing AAA (or even AAAA!..) games lol.
Wish my 5800x had lasted :( It seems to have died after only 4 years.
The only hardware issues I've ever had were due to poor thermal management.
If you want hardware longevity: use a high-quality PSU, don't overclock, and provide excessive cooling (so that several years from now, when you've neglected your system and it's full of dust, you'll still be OK).
I had all of that. I ran into intermittent random crashes about a year ago, and after a year of not being able to find a cause, I found a thread of other 5800x users running into the same problem. (For the record, this was with a high-quality PSU, a very, very light overclock, and fine temps throughout. And while I'm not a true IT professional, I do know my way around a computer, and the most in-depth error logs I could find, of which there were very few, pointed to really low-level calculation errors.)
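For anyone chasing similar gremlins: a crude home-brew check for silent calculation errors is to run a deterministic workload in a loop and verify the answer never changes, since on a healthy CPU every run should match bit-for-bit. A rough Python sketch of the idea (a real stress tester like Prime95 or y-cruncher is far more thorough):

```python
import math

# Recompute a deterministic result over and over; on a healthy machine every
# run must produce the exact same float, so any mismatch points at flaky
# hardware. (Toy illustration only -- real stress testers hit much harder.)
def workload():
    total = 0.0
    for i in range(1, 200_000):
        total += math.sin(i) * math.sqrt(i)
    return total

reference = workload()
for run in range(1_000):
    if workload() != reference:
        print(f"run {run}: result diverged from reference -- suspect CPU/RAM")
        break
else:
    print("no mismatches detected")
```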
After finally giving up and just buying a 9800x3d, I sold the system to my friend at a huge discount, but after reinstalling everything, the CPU never booted again.
While what you say is generally true, it's also sometimes the case that some parts are just slightly defective, and those defects only show with age. It's the first CPU that has ever died on me (other than a 9800x3d, but an MSI mobo killed that one), so I don't really hold it against them. And I'm very happy with the 9800x3d; it's amazing the difference it's made in games.
That's a bummer that it failed on you.
I've been wondering if it would be worth it to replace my 3700x with a 5800x3d but I'm not sure the modest performance improvement would be worth the price.
Thermal problems are much less likely to kill hardware than they used to be. CPU manufacturers have gotten much better at avoiding microfractures caused by thermal stress (e.g. by making sure everything in the CPU expands at the same rate when heated) and failures from electromigration (where atoms in the CPU migrate under applied voltage and stop being parts of transistors and traces, which happens faster at higher temperatures).

Ten or twenty years ago it was really bad for chips to swing between low and high temperatures a lot (thermal stress), and bad for them to sit above 90°C for long periods (electromigration). Now heat makes so little difference that modern CPUs, by default, dynamically adjust their frequency to sit right at their temperature limit under load (around 95°C on recent AMD parts, 100°C on recent Intel).

The main benefit of extra cooling these days is that you can hold a higher frequency for longer without hitting that limit, so you get better average performance; but unless your cooling solution is seriously overspecced, the CPU will be pinned at its limit under load a lot of the time either way, and the motherboard just won't ramp the fans up to maximum.
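If you want to watch that behaviour yourself on Linux, here's a rough Python sketch that polls the kernel's hwmon and cpufreq sysfs interfaces while you run something heavy. Which hwmon node is actually the CPU sensor varies by board and chip, so grabbing the first `temp1_input` below is an assumption:

```python
import glob
import time

def read_int(path):
    with open(path) as f:
        return int(f.read().strip())

# hwmon reports temperatures in millidegrees Celsius; which hwmon node is
# the CPU varies by platform, so taking the first temp1_input is a guess.
temp_paths = sorted(glob.glob("/sys/class/hwmon/hwmon*/temp1_input"))
# cpufreq reports the current core frequency in kHz.
freq_path = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq"

while True:
    temp_c = read_int(temp_paths[0]) / 1000 if temp_paths else float("nan")
    freq_mhz = read_int(freq_path) / 1000
    # Under sustained load the frequency steps down until the temperature
    # settles near the CPU's limit, instead of the chip cooking itself.
    print(f"{temp_c:6.1f} °C   {freq_mhz:8.1f} MHz")
    time.sleep(1)
```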
Glad I already built a system and don't have to worry about upgrades for a few years, unless something breaks. Also, I'm Canadian, so our tariff situation may be different.
Once again gamers are oppressed
Strong CPU ~~sales~~, but GPUs ... trail behind