this post was submitted on 15 Sep 2025
360 points (89.0% liked)

Technology


archive.is link to the article from allaboutai.com at https://www.allaboutai.com/resources/ai-statistics/ai-environment/

[–] blaue_Fledermaus@mstdn.io 11 points 18 hours ago (3 children)

Makes me wonder what they are doing to reach these figures.
Because I can run many models at home, and it doesn't require me to pour bottles of water on my PC, nor does it show up on my electricity bill.
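
For scale, here's a rough back-of-envelope sketch of what local inference might add to that bill; the GPU wattage, daily usage, and electricity price are assumed illustrative figures, not measurements:

```python
# Back-of-envelope: electricity use of running a local model.
# All inputs are assumed illustrative figures, not measurements.
gpu_draw_watts = 350      # assumed draw of a consumer GPU under inference load
hours_per_day = 2         # assumed daily usage
price_per_kwh = 0.30      # assumed electricity price, USD/kWh

kwh_per_day = gpu_draw_watts / 1000 * hours_per_day
cost_per_month = kwh_per_day * price_per_kwh * 30

print(f"{kwh_per_day:.2f} kWh/day, roughly ${cost_per_month:.2f}/month")
# ~0.70 kWh/day, roughly $6.30/month -- real, but easy to overlook on a bill
```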

[–] Artisian@lemmy.world 7 points 17 hours ago (1 children)

Well, most of the carbon footprint for models is in training, which you probably don't need to do at home.

That said, even counting training, they're nowhere near our leading cause of pollution.

[–] REDACTED@infosec.pub 2 points 12 hours ago

The article says that training o4 required an amount of energy equivalent to powering San Francisco for three days.
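
As a rough sanity check on that comparison (the city consumption and training-energy figures below are assumed ballpark estimates, not numbers from the article):

```python
# Sanity check on "training energy ~= powering San Francisco for 3 days".
# Both inputs are assumed ballpark figures, not numbers taken from the article.
sf_annual_gwh = 5_500          # assumed annual electricity consumption of San Francisco (GWh)
training_estimate_gwh = 50     # assumed training energy for a frontier model (GWh)

sf_three_days_gwh = sf_annual_gwh / 365 * 3
print(f"San Francisco, 3 days: ~{sf_three_days_gwh:.0f} GWh")
print(f"Assumed training estimate: ~{training_estimate_gwh} GWh")
# ~45 GWh vs ~50 GWh: the comparison is at least in the right order of magnitude
```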

[–] Flagstaff@programming.dev 4 points 18 hours ago (1 children)

Basically every tech company is using it... It's millions of people, not just us...

[–] very_well_lost@lemmy.world 2 points 11 hours ago

Billions. Practically every Google search runs through Gemini now, and Google handles more search queries per day than there are humans on Earth.

[–] FatCrab@slrpnk.net 1 points 6 hours ago

Most of these figures are guesses somewhere along a spectrum of "educated," since many models, like ChatGPT, are effectively opaque to everyone outside the company and we have no idea what the current iteration's architecture actually looks like. But MIT did do a very solid study not too long ago that looked at the energy cost of various queries across various architectures. Text queries to very large GPT-style models actually had a higher energy cost than image generation with a typical number of iterations on Stable Diffusion models, which is pretty crazy. Anyhow, you're looking at per-query energy usage ranging from roughly 15 seconds of microwaving at full power to riding a bike a few blocks. Tallied over the immense number of queries being served, it does add up.
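
To put "it does add up" in numbers, here's a minimal sketch assuming a mid-range per-query cost and a hypothetical daily query volume (neither figure comes from the MIT study):

```python
# Scale a per-query energy estimate up to fleet level.
# The microwave wattage and query volume are assumptions for illustration only.
microwave_watts = 1_100                  # assumed microwave power draw
seconds_at_full_power = 15               # "15 seconds microwaving" from the comment
wh_per_query = microwave_watts * seconds_at_full_power / 3600  # ~4.6 Wh per query

queries_per_day = 1_000_000_000          # hypothetical daily query volume
gwh_per_day = wh_per_query * queries_per_day / 1e9

print(f"~{wh_per_query:.1f} Wh per query -> ~{gwh_per_day:.1f} GWh/day at 1B queries/day")
# tiny per query, but several GWh/day in aggregate
```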

That all said, I think energy consumption is a silly thing to attack AI over. Modernize, modularize, and decentralize the grids and convert to non-GHG sources and it doesn't matter--there are other concerns with AI that are far more pressing (like deskilling effects and the inability to control mis- and disinformation).