this post was submitted on 25 Nov 2025
661 points (98.8% liked)


Generative “AI” data centers are gobbling up trillions of dollars in capital, not to mention heating up the planet like a microwave. As a result there's a capacity crunch in memory production, sending RAM prices sky high: up over 100 percent in the last few months alone. Some stores are so tired of adjusting prices day to day that they won't even display them; you find out how much it costs at checkout.

[–] markz@suppo.fi 292 points 1 day ago (6 children)

I swear there's a new gold rush every time I want to upgrade my PC.

[–] notabot@piefed.social 158 points 1 day ago (4 children)

It wouldn't be quite so bad if the previous gold rush ended first, but they seem to just be stacking up.

[–] Truscape@lemmy.blahaj.zone 43 points 1 day ago (2 children)

Speak for yourself - I scored a nice GPU upgrade during the crypto crash. Maybe something similar will be achievable after this insanity hits the brakes.

[–] artyom@piefed.social 7 points 1 day ago

Until the next crisis...

[–] anomnom@sh.itjust.works 4 points 1 day ago (1 children)

I thought they weren't using gaming GPUs this time though?

[–] Trainguyrom@reddthat.com 5 points 1 day ago* (last edited 22 hours ago) (1 children)

Gaming GPUs in normal crypto markets don't compute fast enough to mine crypto profitably, but if crypto prices climb high enough, such as during a boom cycle, mining on gaming GPUs can become profitable.

Edit to add: For crypto there's basically a set speed that any given GPU mines at: the hash rate. It doesn't change noticeably over time through software updates, and neither does the GPU's power consumption. So there's essentially a fixed cost per unit of cryptocurrency mined with any given hardware, and if the value earned by mining exceeds the cost to run the GPU, GPU mining can quickly start making sense again.
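
As a rough illustration, that break-even logic boils down to comparing daily mining revenue against daily electricity cost. A minimal Python sketch, with every number invented for the example rather than taken from real market data:

```python
# Back-of-the-envelope GPU mining profitability check.
# Every number here is a made-up placeholder, not real market data.

hash_rate_mh = 60.0             # the GPU's (roughly fixed) hash rate, MH/s
power_draw_w = 220.0            # its (roughly fixed) power draw while mining, W
electricity_usd_per_kwh = 0.15  # what you pay per kilowatt-hour

# Revenue per MH/s per day depends on coin price and network difficulty,
# and rises sharply during a boom cycle. Placeholder value:
revenue_usd_per_mh_per_day = 0.02

daily_revenue = hash_rate_mh * revenue_usd_per_mh_per_day
daily_power_cost = (power_draw_w / 1000.0) * 24 * electricity_usd_per_kwh

print(f"revenue/day: ${daily_revenue:.2f}, power/day: ${daily_power_cost:.2f}")
print("profitable" if daily_revenue > daily_power_cost else "not profitable")
```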

[–] anomnom@sh.itjust.works 2 points 1 day ago (1 children)

Yeah, even less so with gen AI though I think.

[–] Trainguyrom@reddthat.com 1 points 22 hours ago

Machine learning models have much different needs than crypto. Both run well on gaming GPUs and both run even better on much higher end GPUs, but ultimately machine learning models really, really need fast memory, because the entire set of weights gets loaded into graphics memory for processing. There are some tools that will push it to system memory, but these models are latency sensitive, so crossing the CPU bus to pass tens of gigabytes of data between the GPU and system memory adds too much latency.
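
To give a feel for the scale, the memory needed just for the weights is roughly parameter count times bytes per parameter. A quick Python sketch with illustrative model sizes (real usage is higher once the KV cache and activations are added):

```python
# Rough estimate of the VRAM needed just to hold model weights:
# parameter count x bytes per parameter. Real usage is higher once
# the KV cache and activations are added; sizes are illustrative.

def weights_gb(params_billion: float, bytes_per_param: float) -> float:
    """Gigabytes needed for the weights alone."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

for params_b in (7, 13, 70):
    fp16 = weights_gb(params_b, 2.0)  # 16-bit weights
    q4 = weights_gb(params_b, 0.5)    # 4-bit quantized weights
    print(f"{params_b}B params: ~{fp16:.0f} GB at fp16, ~{q4:.0f} GB at 4-bit")
```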

Machine learning also has the aspect of training vs inference. The training portion takes a long time, takes less time with more/faster compute, and you simply can't do anything with the model while it's training. Inference is still compute heavy, but it doesn't require anywhere near as much as the training phase. So organizations will typically rent as much hardware as possible for the training phase, to get the model running, and start making money, as quickly as possible.
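
Under the idealized assumption that training scales linearly with GPU count, the total rental bill stays roughly flat while the wall-clock time shrinks, which is why renting as much hardware as possible makes sense. A toy Python sketch with made-up numbers:

```python
# Toy model of the "rent as much hardware as possible" logic: assuming
# ideal linear scaling, total rental cost stays flat while wall-clock
# training time shrinks, so the model starts earning sooner.
# Both inputs below are invented for illustration.

total_gpu_hours = 100_000  # hypothetical compute budget to train the model
usd_per_gpu_hour = 2.00    # hypothetical rental rate

for n_gpus in (8, 64, 512):
    wall_clock_days = total_gpu_hours / n_gpus / 24
    total_cost = total_gpu_hours * usd_per_gpu_hour
    print(f"{n_gpus:3d} GPUs: {wall_clock_days:6.1f} days, ${total_cost:,.0f}")
```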

In terms of GPU availability this means they're going to target high end GPUs, such as packing AI developer stations full of 4090s and whatever the heck Nvidia replaced the Tesla series with. Some of the new SoCs with shared system/video memory, such as AMD's and Apple's, also fill a niche for AI developers and AI enthusiasts, since they offer large amounts of high speed video memory at relatively low cost. Realistically the biggest impact AI is having on the gaming GPU space is that it's changing the calculation AMD, Nvidia and Intel make when planning out their SKUs: they're likely being stingy with memory on lower end GPUs to push anyone looking to run specific AI models toward much more expensive cards.

[–] bobs_monkey@lemmy.zip 23 points 1 day ago

This AI bubble needs to explode yesterday, Wall Street be damned.

[–] Trainguyrom@reddthat.com 5 points 1 day ago* (last edited 1 day ago)

There was a nice window, from about a year or two ago until about 3 months ago, where no individual components were noticeably inflated. Monitors took the longest to recover from the pandemic shortages; it was arguably only around the beginning of this year that they seemed to fully normalize.

It's funny because at work we've been pushing hard on Windows 11 refreshes all year and warning that there would likely be a rush of folks refreshing at the last possible minute at the end of the year, inflating prices. We ended up being correct about the inflated prices, but it was actually the AI bubble that did it.

[–] Bronzebeard@lemmy.zip 1 points 1 day ago

That's the tariffs, now. GPUs had come down a bit before the dumbass

[–] Assassassin@lemmy.dbzer0.com 54 points 1 day ago (4 children)

This is why I'm still running DDR4. Every time I think about upgrading a generation, there's a run on some integral component.

[–] Truscape@lemmy.blahaj.zone 35 points 1 day ago (2 children)

AM4 is gonna last until the 2030s at this rate...

[–] Assassassin@lemmy.dbzer0.com 16 points 1 day ago (2 children)

With how well my 5600X still performs, I could very well see it lasting that long. Assuming it doesn't randomly kill itself after a few years like my previous Ryzen 5.

[–] gravitas_deficiency@sh.itjust.works 9 points 1 day ago (1 children)

I was silly and got myself a 5950X. But I feel less silly about it now tbh. It’s gonna become my new homelab core whenever I get the chance to do a new gaming build again that’s not a high 4-figure investment.

[–] Assassassin@lemmy.dbzer0.com 8 points 1 day ago (1 children)

Totally worth it with how well Ryzens have held up performance-wise. Unless you're doing some really CPU heavy stuff or have a beast of a GPU, you probably won't be bottlenecked by the CPU for at least 5 more years.

Unless you're using Windows in your homelab. I assume you're not, since you have a homelab.

Nope - Proxmox is the way

[–] Truscape@lemmy.blahaj.zone 5 points 1 day ago

The 5800X3D was probably my best CPU purchase of all time, damn

[–] theunknownmuncher@lemmy.world 2 points 1 day ago (1 children)
[–] Truscape@lemmy.blahaj.zone 7 points 1 day ago* (last edited 1 day ago) (1 children)

In a sane world, the limitations of a CPU socket would be reached, and then newer SKUs would no longer be released, and all stock for prospective builders would be second hand.

That's clearly not the case here. AM4 continues to get new CPU releases, and parts are still available new from retail, years after support officially ended. That's a good thing for variety and entry level machines, but such dependency means a future CPU could be limited in feature set or performance if there's enough demand to force designers to downgrade chips for AM4 compatibility instead of releasing them on AM5.

[–] Trainguyrom@reddthat.com 3 points 1 day ago

The good thing about new AM4 boards being available at this point in time is that you have options to keep older hardware running. Usually the CPU and memory will out-survive the motherboard. Much like those new Chinese motherboards supporting 4th and 6th gen Intel CPUs, this is great for longevity and reduces how much production is needed.

> In a sane world, the limitations of a CPU socket would be reached, and then newer SKUs would no longer be released

I'd argue that it would be best if computers were more like cars: a new platform gets released each decade or so, and small improvements are made to individual parts, but the parts are largely interchangeable within the platform and produced for a decade or two before production is retired. More interchangeable parts, a slower release cycle, and more opportunities for repair instead of replacement.

[–] Dultas@lemmy.world 2 points 14 hours ago

DDR4 is expensive as shit too now. I was trying to build out a new rack for my homelab, and 256GB of RAM went from like $300 six months ago to $1500.

[–] tomiant@piefed.social 1 points 1 day ago

Did I tell you about using Arch?

[–] Wildmimic@anarchist.nexus 1 points 1 day ago

I did so too - just upgraded my 2600X to a shiny 5950X, the nicest CPU my aging mainboard can run. With 16 cores and 64 gigs of RAM, I see a future where I simply replace the entire machine for daily use and make this one a very nice server.

[–] adespoton@lemmy.ca 14 points 1 day ago (1 children)

It’s why I started treating computers as commodities — I rarely upgrade anymore; just wait the 5 years and by an entirely new system.

[–] Imgonnatrythis@sh.itjust.works 27 points 1 day ago (1 children)

Same except for me it's 10 years.

[–] popekingjoe@lemmy.world 7 points 1 day ago

This is about my upgrade cadence, except for storage. I ran my Ryzen 1600 until the 7000 series dropped and upgraded mobo+RAM at once for about $600.

I then moved the old parts to another case to use as a low load server, only for both the motherboard and CPU to die within a few weeks. 🫡

[–] mack@lemmy.sdf.org 8 points 1 day ago* (last edited 1 day ago)

Because we're in an era where there will always be a gold rush for some specific component. Upgrades have slowed down considerably in the past 10 years. My laptop is 4 years old and still kicks like it did the first day, and I still game on my 8 year old laptop, which is permanently attached to the TV and running as a Steam machine with more than decent performance.

This wasn't even thinkable in the '00s.

I'm pretty sure that after hard disks, GPUs, and RAM, the next shortage will be either ARM CPUs or some specific future type of PSU.

[–] the_q@lemmy.zip 2 points 1 day ago

So it's your fault...

[–] Hubi@feddit.org 2 points 1 day ago

I feel like the luckiest person because I built my last PC right before the crypto hype and my current one right before the AI bubble.