this post was submitted on 15 Sep 2025
360 points (89.0% liked)
Technology
you are viewing a single comment's thread
Makes me wonder what they're doing to reach these figures.
Because I can run many models at home, and it doesn't require me to pour bottles of water on my PC, nor does it show up on my electricity bill.
Well, most of the carbon footprint of these models is in training, which you probably don't need to do at home.
That said, even counting training, they are nowhere near our leading cause of pollution.
The article says that training o4 required an amount of energy equivalent to powering San Francisco for three days.
Basically every tech company is using it... It's millions of people, not just us...
Billions. Practically every Google search runs through Gemini now, and Google handles more search queries per day than there are humans on Earth.
Most of these figures are guesses of varying levels of "educated", since many models, like ChatGPT, are effectively opaque to everyone outside the company, and we have no idea what the current iteration's architecture actually looks like. But MIT did do a very solid study not too long ago that measured the energy cost of various queries across various architectures. Text queries to very large GPT models actually had a higher energy cost than image generation with a typical number of iterations on Stable Diffusion models, which is pretty crazy. Anyhow, you're looking at per-query energy usage somewhere between 15 seconds of microwaving at full power and riding a bike a few blocks. When tallied over the immense number of queries being serviced, it does add up.
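To see how "15 seconds of microwaving" scales up, here's a back-of-envelope sketch. All the inputs are illustrative assumptions (a typical microwave wattage and a hypothetical fleet-wide query volume), not measured values from the MIT study or any provider:

```python
# Back-of-envelope: per-query energy scaled to fleet volume.
# All numbers are illustrative assumptions, not measured values.

MICROWAVE_POWER_W = 1100   # typical full-power household microwave (assumption)
SECONDS_PER_QUERY = 15     # "15 seconds of microwaving" per query
QUERIES_PER_DAY = 1e9      # hypothetical fleet-wide daily query volume

# Energy per query: watts * seconds gives joules; divide by 3600 for Wh.
wh_per_query = MICROWAVE_POWER_W * SECONDS_PER_QUERY / 3600

# Scale to the whole fleet and convert Wh -> MWh.
daily_mwh = wh_per_query * QUERIES_PER_DAY / 1e6

print(f"{wh_per_query:.2f} Wh per query")          # 4.58 Wh per query
print(f"{daily_mwh:,.0f} MWh per day fleet-wide")  # 4,583 MWh per day fleet-wide
```

So a few watt-hours per query, which is trivial on its own, lands in the thousands of megawatt-hours per day once you multiply by a billion queries.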
That all said, I think energy consumption is a silly thing to attack AI over. Modernize, modularize, and decentralize the grids, convert to non-GHG sources, and it doesn't matter; there are other, far more pressing concerns with AI (like deskilling effects and the inability to control mis- and disinformation).