this post was submitted on 15 Sep 2025
355 points (89.0% liked)

Technology


archive.is link to the article from allaboutai.com at https://www.allaboutai.com/resources/ai-statistics/ai-environment/

top 50 comments
[–] lime@feddit.nu 75 points 17 hours ago* (last edited 17 hours ago) (4 children)

idk if that's the intended takeaway from those numbers.

According to AllAboutAI analysis, global AI processing generates over 260,930 kilograms of CO₂ monthly from ChatGPT alone, equivalent to 260 transatlantic flights, with 1 billion daily queries consuming 300 MWh of electricity.

according to the faa there are on average around 5,500 planes in the air at any given time, and while i couldn't find an exact number, there seem to be between 350 and 1,200 transatlantic flights every day, depending on season.

260 tons is still massive, but let's not kid ourselves. it's about equivalent to producing 12 new american-size cars.
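
rough back-of-envelope in python if anyone wants to check. all inputs are ballpark assumptions on my part, not from the article:

    # sanity check of the article's figure (all inputs are rough assumptions)
    monthly_co2_kg = 260_930                 # article's figure for chatgpt

    co2_per_pax_transatlantic_kg = 1_000     # ~1 tonne per passenger, common estimate
    co2_per_whole_flight_kg = 50_000         # entire aircraft, very rough
    co2_per_new_large_car_kg = 20_000        # manufacturing one big car, rough

    print(monthly_co2_kg / co2_per_pax_transatlantic_kg)   # ~261 passenger shares
    print(monthly_co2_kg / co2_per_whole_flight_kg)        # ~5 whole flights
    print(monthly_co2_kg / co2_per_new_large_car_kg)       # ~13 cars

so "260 transatlantic flights" only works if you count one passenger's share as a "flight".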

[–] Artisian@lemmy.world 23 points 16 hours ago

Thank you.

Idk if LLMs can tell which number is bigger. But we already knew humans can't.

[–] otp@sh.itjust.works 16 points 13 hours ago

Just goes to show that you don't even need AI to spread misinformation! Haha

[–] leftthegroup@lemmings.world 7 points 16 hours ago (3 children)

Yes, but there's zero fucking actual benefit.

Seeing memes posted here that use AI, while shitting on it, is the most confusing thing to me.

Just... don't use it, people. The hole burning in AI bros' pockets will close up if you just stop making it profitable. Even the free ones are making money with ads. Don't use it, even for a joke.

[–] SkyeStarfall@lemmy.blahaj.zone 7 points 7 hours ago

Frankly, focusing on the carbon output of AI models is a red herring. It's not a significant part of the problem, and it just makes people complacent: if that push succeeds, people feel like they've achieved something. It's not worse than stuff like video games.

Focus on the actual negative effects of AI; carbon intensity isn't a major one.

[–] lime@feddit.nu 3 points 6 hours ago (1 children)

we do a lot of things for no benefit. video games, golf, horse racing, grilling... all those have far larger carbon footprints. as someone else said, focus on the actual negatives of generative ai, like the proven cognitive decline and loneliness.

[–] very_well_lost@lemmy.world 2 points 11 hours ago (1 children)

260,930 kilograms of CO₂ monthly from ChatGPT alone

ChatGPT has the most marketing, but it's only part of the AI ecosystem... and honestly, I wouldn't be surprised if other AI products are bigger now. Practically every time someone does a Google search, Gemini AI spits out a summary whether you wanted it or not — and Google processes more than 8 billion search queries per day. That's a lot of slop.

There are also more bespoke tools being pushed aggressively in enterprise. Microsoft's Copilot is used extensively in tech for code generation and code reviews. Ditto for Claude Code. And believe me, tech companies are pushing this shit hard. I write code for a living, and the company I work for is so bullish on AI that they've mandated that we devs use it every day if we want to stay employed. They're even tracking our usage to make sure we comply... and I know I'm not alone in my experience.

All of that combined probably still doesn't reach the same level of CO₂ emissions as global air travel, but there are a lot more fish in this proverbial pond than just OpenAI, and when you add them all up, the numbers get big. AI usage is also rising much, much faster than air travel, so it's really only a matter of time before it does cross that threshold.

[–] lime@feddit.nu 3 points 9 hours ago

they list the others in the article.

[–] Semi_Hemi_Demigod@lemmy.world 54 points 18 hours ago (5 children)

Which is why I threw up in my mouth a little this morning when my boss said we all need to be more bullish on AI.

[–] Reygle@lemmy.world 32 points 18 hours ago

My boss is also a fuckwit

[–] BrianTheeBiscuiteer@lemmy.world 32 points 18 hours ago (1 children)

Same. And they basically jizz their pants when they see a practical use for AI, but 9 out of 10 times there's already a cheaper and more reliable solution they won't even entertain.

[–] magikmw@piefed.social 2 points 7 hours ago

There's practical use for AI?

[–] GhostlyPixel@lemmy.world 9 points 17 hours ago* (last edited 17 hours ago) (1 children)

I’ve mentioned it before but my boss’s boss said only 86% of employees in his department use AI daily and it’s one of his annual goals to get that to 100%. He is obsessed.

[–] ramble81@lemmy.zip 9 points 14 hours ago

They’re salivating at the chance to reduce head count and still make money. Employees are by far the largest cost for any company. They hate paying that out when the money could be going to them instead.

[–] SpaceNoodle@lemmy.world 5 points 17 hours ago

You should correct their spelling of "bullshit"

[–] HugeNerd@lemmy.ca 5 points 11 hours ago

Replace your boss with it.

[–] ayyy@sh.itjust.works 28 points 10 hours ago (1 children)

Your article doesn’t even claim that. Do you have any idea just how carbon intensive a flight is?

[–] MonkderVierte@lemmy.zip 1 points 6 hours ago

Or a LLM query?

[–] Wildmimic@anarchist.nexus 18 points 4 hours ago (4 children)

OP, this statement is bullshit. you can do about 5 million requests for ONE flight.

i'm gonna quote my old post:

I had a discussion about generated CO2 a while ago here, and with the numbers my discussion partner gave me, the calculation said that the yearly usage of ChatGPT is approx. 0.0017% of our CO2 reduction during the covid lockdowns - chatbots are not what is killing the climate. What IS killing the climate has not changed since the green movement started: cars, planes, construction (mainly concrete production) and meat.

The exact energy costs are not published, but 3 Wh/request for ChatGPT-4 is the upper limit from what we know (and that's in line with the approximate power consumption of my graphics card when running an LLM). Since Google uses it for every search, they have probably optimized for their use case, and some sources cite 0.3 Wh/request for chatbots - it depends on what model you use. The training is a one-time cost, and for ChatGPT-4 it raises the maximum cost per request to 4 Wh. That's nothing. The combined worldwide energy usage of ChatGPT is equivalent to about 20k American households. This is for one of the most downloaded apps on iPhone and Android - set against that massive usage, it's clear that saving energy here is not effective for anyone interested in reducing climate impact, or you'd have to start scolding everyone who runs their microwave 10 seconds too long.

Even compared to other online activities that use data centers, ChatGPT's power usage is small change. If you use ChatGPT instead of watching Netflix, you actually save energy!
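
If you want to play with the numbers yourself, here's a tiny python sketch. The per-request figures are the estimates quoted above and the household average is an assumption, not a measurement:

    # how the "households" comparison depends on the per-request energy assumption
    requests_per_day = 1_000_000_000        # ~1 billion daily queries
    us_household_kwh_per_year = 10_500      # rough US average, assumed

    for wh_per_request in (0.3, 3.0):       # optimistic vs. upper-limit estimates
        kwh_per_year = wh_per_request * requests_per_day * 365 / 1000
        households = kwh_per_year / us_household_kwh_per_year
        print(f"{wh_per_request} Wh/request -> {kwh_per_year / 1e6:.0f} GWh/year "
              f"~ {households:,.0f} US households")

Depending on which per-request estimate you take, you land somewhere between roughly 10,000 and 100,000 households - either way a rounding error next to cars, planes, concrete and meat.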


Water is about the same story, although the positioning of data centers in the US sucks. The used water doesn't disappear though - it's mostly returned to the rivers or evaporated. Total water usage in the US is 58,000,000,000,000 gallons (220 trillion liters) per year. A ChatGPT request uses between 10-25 ml of water for cooling. A hamburger uses about 600 gallons of water. 2 trillion liters are lost due to aging infrastructure. If you want to reduce water usage, go vegan or fix water pipes.
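
Same kind of sketch for water (the per-request range is from above, the hamburger figure is the commonly cited ~600 gallons; both are assumptions):

    # requests per hamburger's worth of water (rough assumed figures)
    ml_per_request = 20          # midpoint of the 10-25 ml range above
    hamburger_liters = 2_270     # ~600 gallons for one beef hamburger
    print(f"{hamburger_liters * 1000 / ml_per_request:,.0f} requests ~ one hamburger")  # ~113,500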


Read up here!

[–] brucethemoose@lemmy.world 1 points 44 minutes ago* (last edited 42 minutes ago)

To look at it another way: if you assume every single square inch of silicon from TSMC is Nvidia server accelerators/AMD EPYCs, every single one running AI at full tilt 24/7/365...

Added up, it's not that much power, or water.

That's unrealistic, of course, but that's literally the physical cap of what humanity can produce at the moment.

[–] AndiHutch@lemmy.zip 15 points 14 hours ago

It also pollutes the minds of ignorant people with misinformation. Not that that's anything new, but I do think objective truth is very important in a democratic society. It reminds me of that video that used to go around showing Sinclair Broadcasting anchors on 20-some different 'local' news broadcasts all repeating the same words verbatim. It ended with 'This is extremely dangerous to our democracy.' With AI being added to all the search engines, it's really easy to look something up and unknowingly get bombarded with false info pulled out of the dregs of the internet. 90% of people don't verify the answer to see if it's based in reality.

[–] blaue_Fledermaus@mstdn.io 11 points 17 hours ago (3 children)

Makes me wonder what they are doing to reach these figures.
Because I can run many models at home, and it wouldn't require me to pour bottles of water on my PC, nor would it show up on my electricity bill.

[–] Artisian@lemmy.world 8 points 16 hours ago (1 children)

Well, most of the carbon footprint for models is in training, which you probably don't need to do at home.

That said, even with training they are not nearly our leading cause of pollution.

[–] REDACTED@infosec.pub 2 points 11 hours ago

The article says that training o4 required an amount of energy equivalent to powering San Francisco for 3 days.

[–] Flagstaff@programming.dev 3 points 17 hours ago (1 children)

Basically every tech company is using it... It's millions of people, not just us...

[–] very_well_lost@lemmy.world 1 points 11 hours ago

Billions. Practically every Google search runs through Gemini now, and Google handles more search queries per day than there are humans on Earth.

[–] FatCrab@slrpnk.net 1 points 5 hours ago

Most of these figures are guesses along a spectrum of "educated", since many models, like ChatGPT, are effectively opaque to everyone and we have no idea what the current iteration's architecture actually looks like. But MIT did a very solid study not too long ago that looked at the energy cost of various queries for various architectures. Text queries to very large GPT models actually had a higher energy cost than image gen with a normal number of iterations on Stable Diffusion models, which is pretty crazy. Anyhow, you're looking at per-query energy usage ranging from about 15 seconds of microwaving at full power to riding a bike a few blocks. When tallied over the immense number of queries being serviced, it does add up.
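
To put a rough number on that comparison (the microwave wattage is an assumption of mine, not from the MIT study):

    # converting "15 seconds of microwaving at full power" into watt-hours
    microwave_watts = 1_100            # typical full-power household microwave, assumed
    seconds = 15
    wh_per_query = microwave_watts * seconds / 3600
    print(f"~{wh_per_query:.1f} Wh per query at the high end")   # ~4.6 Wh

That's in the same ballpark as the upper-limit per-request estimates quoted elsewhere in this thread.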

That all said, I think energy consumption is a silly thing to attack AI over. Modernize, modularize, and decentralize the grids and convert to non-GHG sources and it doesn't matter--there are other concerns with AI that are far more pressing (like deskilling effects and inability to control mis- and disinformation).

[–] maccam912@programming.dev 8 points 17 hours ago (4 children)

What does it mean to consume water? Like it's used to cool something and then put back in a river? Or it evaporates? It's not like it can be used in some irrecoverable way right?

[–] lime@feddit.nu 5 points 17 hours ago

"using" water tends to mean that it needs to be processed to be usable again. you "use" water by drinking it, or showering, or boiling pasta too.

[–] morto@piefed.social 5 points 15 hours ago* (last edited 14 hours ago)

if they take the water and don't return it to the source, there will be less available water in the water body, which can lead to scarcity. If they take it and return it, but at a higher temperature or along with pollutants, it can impact the life in the water body. If they treat the water before returning it, so it's as close as possible to its original properties, there will be little impact, but that means using more energy and resources for the treatment.

[–] Flagstaff@programming.dev 3 points 17 hours ago* (last edited 17 hours ago)

I think the point is that it evaporates and may return as rain, which is overwhelmingly acid rain or filled with microplastics or otherwise just gets dirty and needs to be cleaned or purified again.

[–] kibiz0r@midwest.social 2 points 17 hours ago (1 children)

They need to use very pure water, and it evaporates completely, so it must be continually replenished.

[–] Hackworth@sh.itjust.works 11 points 17 hours ago

Need is a strong word. There are much more efficient ways to cool data centers. They've just chosen the most wasteful way because it's the cheapest (for them).

[–] rimu@piefed.social 8 points 16 hours ago

The emoji usage, heading & bold text pattern makes me certain the article was written using AI.

[–] REDACTED@infosec.pub 6 points 11 hours ago

Bitcoin or crypto?

[–] melsaskca@lemmy.ca 5 points 3 hours ago (1 children)

I did some research, and according to some AIs this is true. According to some other AIs, this is false.

[–] Reygle@lemmy.world 2 points 2 hours ago

"Dear expensive thing: Are you wasteful?"

[–] jordanlund@lemmy.world 4 points 16 hours ago (1 children)

Generating bullshit that isn't really that useful.

Remember when the Apple Newton "revolutionized" computing with handwriting recognition?

No, of course not, because the whole thing sucked and vanished outside of old Doonesbury cartoons. LOL


[–] corsicanguppy@lemmy.ca 3 points 15 hours ago* (last edited 15 hours ago)

My peer used the newton for comp sci class notes. Daily. Exclusively.

Then she went on to mastermind the behaviour and tactics of Myth: The Fallen Lords.

It's tenuous, but I say that's causal.

[–] HubertManne@piefed.social 4 points 16 hours ago

This is my main issue with it. I think it's useful enough, but only if it uses about the same energy as you would use doing whatever without it. Most conversations I've had with someone trying to convince me it doesn't use too much power end up being very much like the crypto ones, where it keeps being apples to oranges and the energy consumption seems too much. I'm hoping hardware can be made to get the power use lower the way graphics cards did. I want to see querying an LLM use about the same energy as searching for the answer, or less.

[–] ReCursing@feddit.uk 3 points 17 hours ago

It's using energy, we need more renewables. That's not a problem with AI. Direct your opprobrium where it belongs

[–] fittedsyllabi@lemmy.world 3 points 11 hours ago
[–] Bebopalouie@lemmy.ca 2 points 2 hours ago

I stopped, not that I used it that much, about 5 months ago.

[–] melfie@lemy.lol 0 points 4 hours ago (1 children)

I have started using Copilot more lately, but I’ve also switched from plastic straws to paper, so I’m good, right?

[–] jokersteve@lemmy.world 1 points 12 minutes ago

You can drink one less coffee per week and save more in carbon emissions and water usage than by not using LLMs.