this post was submitted on 25 Nov 2025
672 points (98.8% liked)


Generative “AI” data centers are gobbling up trillions of dollars in capital, not to mention heating up the planet like a microwave. The result is a capacity crunch in memory production that has sent RAM prices sky high, up more than 100 percent in the last few months alone. Multiple stores are tired of adjusting prices day to day and won’t even display them; you find out how much it costs at checkout.

[–] scintilla@crust.piefed.social 11 points 1 day ago (2 children)

Hi, I can explain the difference. The three other things you listed are necessary for a multitude of reasons. The current boom in data centers is a solution in search of a problem, wasting resources for no gain to humanity as a whole.

Hope that helps :3

[–] Trainguyrom@reddthat.com 5 points 1 day ago (2 children)

Also, the scariest part of this datacenter inflation is how many of these new data centers are going to be abandoned within the next 5 years, when the AI bubble pops and the companies spending like crazy on datacenter growth suddenly need to cut back. There'll be lots of big empty buildings outside of small towns costing taxpayers a ton of money, much like when any big box store closes up shop. You can either spend a ton of money tearing it down, a ton of money rebuilding it into something useful, a ton of money attracting another business which may or may not front the cost of remodeling the space, or a ton of money maintaining the empty property so it doesn't fall over and become even more of a blight. There's no winning for these small municipalities that just get used and abused by large businesses.

[–] Artisian@lemmy.world 1 points 17 hours ago* (last edited 17 hours ago) (1 children)

? Massive GPU server racks are relatively easy to repurpose for several things. The most likely (if sad) is crypto mining, but there are also expensive weather simulations, cloud gaming, video hosting, etc.

Requesting a source that these centers are hard to repurpose. I find myself pretty skeptical. Computers are generally multipurpose and easy to swap tasks on.

[–] Trainguyrom@reddthat.com 1 points 16 hours ago (1 children)

Is there enough demand for thousands of servers with purpose-built ARM processors (which may or may not have any publicly available kernel support) driving 4-8 Nvidia datacenter chips at 600 W a pop, though? Yes, some will be repurposed, but there simply won't be enough demand to absorb them immediately. Realistically, the companies operating these datacenters will liquidate the racks, probably liquidate some of the datacenters entirely, and thousands of servers will hit the secondhand market for next to nothing. Meanwhile some of the datacenter structures will sit empty and unmaintained until they're either bought up to be repurposed, bought up to be refurbished and brought back into datacenter use, or torn down, just like an empty Super Walmart location.
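
For scale, here's a minimal sketch of what those per-GPU power figures add up to per node (the 600 W figure is from the comment above; the host overhead and nodes-per-rack numbers are assumptions for illustration):

```python
# Back-of-the-envelope power draw for one of these AI server nodes.
# Assumptions (illustrative, not measured): 4-8 accelerators at ~600 W each,
# plus ~1 kW of host overhead (CPUs, fans, NICs, power-supply losses).

GPU_WATTS = 600             # per-accelerator draw, as cited above
HOST_OVERHEAD_WATTS = 1000  # assumed overhead per server

for gpu_count in (4, 8):
    node_kw = (gpu_count * GPU_WATTS + HOST_OVERHEAD_WATTS) / 1000
    rack_kw = node_kw * 8   # assuming 8 such nodes per rack, also illustrative
    print(f"{gpu_count} GPUs: ~{node_kw:.1f} kW per node, ~{rack_kw:.0f} kW per rack")
```

Even with those rough numbers you're looking at several kilowatts per server, which is why there's no obvious secondhand buyer lined up for this hardware.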

Some of the datacenters will be reworked for general compute, and maybe a couple will maintain some AI capacity, but given the sheer quantity of compute being stood up for the AI bubble and the sheer scale of the bubble, basically every major tech company is likely to shrink significantly when the bubble pops. We're talking about companies that currently have market caps measured in trillions and literally make up a full quarter of the entire value of the New York Stock Exchange; it's going to be a bloodbath.

Remember how small the AI field was 6 years ago? It was purely the domain of academic research, fighting for scraps, outside of a handful of companies big enough to invest in an AI engineer or two on the off chance they could make something useful. We're probably looking at a correction back down to nearly that scale. People who have drunk the Kool-Aid will wake up one day and realize how shit the output of generative AI is compared to the average professional's work.

[–] Artisian@lemmy.world 1 points 13 hours ago (1 children)

Thank you for fleshing out your world model and theory. I think this model falls short of a source (and contradicts some other AI-pessimistic economic predictions, namely a crash in computing cost and in crypto), but it could be developed into something I'd find compelling.

Let me brainstorm aloud about what I think this world model predicts that we might have data on...

Did we see a crash in ISP prices, home and industry internet use, domain hosting, or other computing services in the dotcom bubble? That situation seems extremely analogous, but my vibe is that several of these did not drop (ISP prices, I suspect, were stable), and some saw a dip but stayed well above early-internet rates (domain hosting)? I feel like there's a good analogy here, but I'm struggling with a way to operationalize it.

I mentioned a use for compute that your reply didn't cover: crypto mining. Do we have evidence that the floor on crypto is well below datacenter operating costs (across exploitative coins as well)? I vaguely remember a headline in this direction. Another use case I don't see drying up: cheating on essay assignments.

More broadly, this model predicts that all compute avenues have much lower payoff than datacenter operating costs. I think I'd need to see this checked against an exhaustive HPC application list. I know that weather forecasting uses about as much compute as AI on some supercomputing clusters.

Governments have already issued rather large grants to AI-driven academic projects. I suspect many of these are orders of magnitude larger than the size of academic AI 6 years ago. (I'll also quickly note that libraries are better than Google search has ever been for finding true facts; yet Google search has remained above library use throughout its existence.)

[–] Trainguyrom@reddthat.com 2 points 5 hours ago

Honestly, the questions you're posing require a level of market analysis that could fill an entire white paper and be sold for way more money than I want to think about. It's a level of market analysis I don't want to dive into. My gut instinct, from having worked in the tech industry with datacenters and datacenter hardware at large companies, is that the AI industry will contract significantly when the bubble pops. I'm sure I could find real data to support this prediction, but the level of analysis and the hours of work that would require are simply more than it's worth for an internet comment.

You have factors including what hardware is being deployed to meet AI bubble demand, how the networking might be set up differently for AI compared to general GPU compute, who is deploying what hardware, what the baseline demand for GPU compute would be if you simulate no present AI bubble, etc. It's super neat data analysis, but I ain't got the time nor appetite for that right now.

[–] MalMen@masto.pt 0 points 16 hours ago (1 children)

@Trainguyrom @scintilla I really don't understand the AI doom theory. There are a lot of shitty projects going nowhere? Sure. There's a bubble? Likely. But AI has value and is not going anywhere.

[–] Trainguyrom@reddthat.com 1 points 6 hours ago

Oh yeah, machine learning as a technology will survive, and eventually it will be implemented where it can do what it's really good at. But right now it's being shoved into everything to do things it isn't good at, so you end up with a super expensive to run, energy-inefficient tool that performs worse than traditional algorithms which could run client side or on a single, much cheaper server (I'm oversimplifying the server architecture for brevity).

Think customer service chatbots on every car dealership's website. Traditionally these were extremely simplistic: canned responses based on keywords in the customer's written message, with a quick handoff to a real customer service rep as soon as things got out of scope. Now companies are running LLMs as those customer service chatbots, and the LLM can do anything from agreeing to sell a new car for a dollar to providing scam or invalid contact info to referring the customer to a competitor. There's no knowing what the AI will do because it's non-deterministic, and you don't want that in customer service!
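
For contrast, here's a minimal sketch of that keyword-and-canned-response approach (the keywords and replies are made up; the point is that the behavior is fixed and anything out of scope goes straight to a human):

```python
# Minimal sketch of the "traditional" keyword chatbot described above:
# canned responses keyed on words in the customer's message, with an
# immediate handoff to a human rep when nothing matches. The keywords
# and replies here are made up for illustration.

CANNED_RESPONSES = {
    "hours": "We're open Monday through Saturday, 9am to 7pm.",
    "test drive": "You can schedule a test drive at the front desk or by phone.",
    "financing": "Our financing team can walk you through loan options in person.",
}

def reply(message: str) -> str:
    text = message.lower()
    for keyword, response in CANNED_RESPONSES.items():
        if keyword in text:
            return response
    # Anything out of scope escalates instead of improvising an answer
    return "Let me connect you with a customer service representative."

print(reply("What are your hours on Saturday?"))    # deterministic canned answer
print(reply("Will you sell me a new car for $1?"))  # escalates; never "agrees"
```

The same message always gets the same reply, which is exactly the predictability an LLM front end gives up.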

Right now we're in the bubble phase where every single company is finding some way to shoehorn AI into its business model so they can brag about it. Fucking Logitech added a remappable AI button that brings up a ChatGPT interface and just spends Logitech's money on ChatGPT tokens. That's pure bubble behavior. Once the bubble pops, you won't have every car dealership page burning an LLM token or five every time it's opened, you won't have Amazon running AI chatbots on every product page just for asking about that product, and you won't have every website giving away free, unrestricted access to LLMs. That's what I'm talking about.

AI demand will drop when the bubble pops, and while it will be higher than it was 8 years ago, everyone is going to be very skeptical of anything AI, just like folks are still skeptical of mortgage-backed securities over 15 years later, or just like people are skeptical of commercial websites without a clear method of financing 25 years after the dotcom bubble. People remember these things and will take a while to warm up to the idea again.

[–] Artisian@lemmy.world 4 points 1 day ago* (last edited 18 hours ago)

Growing inefficient cattle crops in a desert to preserve water rights: not necessary.

Flying Coast to coast for a business meeting that could be an email: not necessary.

Manufacturing those cheap scissors that break after 2 uses: should be a crime (not necessary).

All of these subcases have comparable emissions and externalities to the data centers (at least by my Fermi estimates).