this post was submitted on 06 Aug 2025
693 points (98.1% liked)

Showerthoughts

 

In the late 2000s there was a push by a lot of businesses to not print emails, and people used to add a 'Please consider the environment before printing this email.' footer.


Considering how bad LLMs/'AI' are for power consumption and water usage, a new useless email footer tag should be made.

top 50 comments
[–] IcedRaktajino@startrek.website 96 points 5 days ago* (last edited 5 days ago) (2 children)

Ooooh, I'm totally adding that to my email signature.

Aaaaaaand done.


[–] scytale@piefed.zip 28 points 5 days ago* (last edited 5 days ago) (1 children)

If you don't mind the additional real estate, it would also be great to have a version where "printing" is still there but struck out, for people who aren't aware of the original.

[–] IcedRaktajino@startrek.website 39 points 5 days ago (1 children)

Had to shrink the font a bit; looks better all on one line.


[–] Catoblepas@piefed.blahaj.zone 11 points 5 days ago (1 children)

o7
[–] IcedRaktajino@startrek.website 15 points 5 days ago (2 children)

Is that good? I'm old and have no idea what that means.😆

[–] TranquilTurbulence@lemmy.zip 11 points 5 days ago* (last edited 5 days ago) (1 children)
[–] Catoblepas@piefed.blahaj.zone 8 points 5 days ago

Haha it’s good! It’s supposed to be a little guy saluting. The o is the head and the 7 is the arm and hand doing a salute pose.

[–] Cevilia@lemmy.blahaj.zone 73 points 5 days ago (2 children)

🏞 Please consider the environment before issuing a return-to-office mandate

[–] a_wild_mimic_appears@lemmy.dbzer0.com 15 points 5 days ago (1 children)

This is the right response - RTO is much worse for the climate than GenAI.

[–] Trihilis@ani.social 7 points 5 days ago* (last edited 4 days ago) (4 children)

So I'm not saying RTO is worse than AI or vice versa, but do you have any data to back up that statement? I've been seeing nothing but news about AI data centers being an absolute nightmare for the planet, and even more so when corrupt politicians let them be built in places that already have trouble maintaining normal water levels.

I get both are bad for the environment.

Edit: thanks for all the sources, everyone. TIL

[–] Bytemeister@lemmy.world 10 points 4 days ago* (last edited 4 days ago)

Well, real quick, my drive to the office is ~10 miles. My car gets ~3.1 miles/kWh, so let's say I use 3 kWh per trip; two trips a day makes it 6 kWh. A typical LLM request uses 0.0015 kWh of electricity, so my single-day commute in my car uses ~4,000 LLM queries worth of electricity.

Yeah, RTO is way worse, even for an EV that gets 91 MPGe.
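A minimal sketch of that back-of-envelope arithmetic, assuming the same figures the comment uses (10-mile commute, 3.1 miles/kWh, 0.0015 kWh per request); none of these are verified values.

```python
# Back-of-envelope: daily EV commute energy vs. LLM query energy.
# All figures are the comment's own estimates, not verified values.
commute_miles_one_way = 10
ev_efficiency_miles_per_kwh = 3.1
kwh_per_llm_request = 0.0015  # the "typical request" figure cited above

daily_commute_kwh = 2 * commute_miles_one_way / ev_efficiency_miles_per_kwh
equivalent_requests = daily_commute_kwh / kwh_per_llm_request

print(f"Daily commute: {daily_commute_kwh:.1f} kWh")           # ~6.5 kWh
print(f"Equivalent LLM requests: {equivalent_requests:,.0f}")   # ~4,300
```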

The thing is: those AI datacenters are used for a lot of things. LLM usage amounts to about 3% of it; the rest is stuff like image analysis, facial recognition, market analysis, recommendation services for streaming platforms, and so on. And even the water usage is not really the big-ticket item.


The issue of placement of data centers is another discussion, and I agree with you that placing data centers in locations that can't support them is bullshit. But people seem to simply not realize that everything we do has a cost. The US energy system uses 58 trillion gallons of water in withdrawals each year; ChatGPT uses about 360 million liters/year, which comes down to about 0.006% of America's water usage per year. An average American household uses about 160 gallons of water per day; a ChatGPT request uses about 20-50 ml. If you want to save water, go vegan or fix water pipes.
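As with the commute numbers, here is a minimal sketch of the household-vs-request water comparison, using only the figures quoted above (160 gallons per household per day, 20-50 ml per request); these are the comment's estimates, not audited data.

```python
# How many ChatGPT requests equal one US household's daily water use,
# using the figures quoted in the comment above.
LITERS_PER_GALLON = 3.785

household_liters_per_day = 160 * LITERS_PER_GALLON      # ~606 L
ml_per_request_low, ml_per_request_high = 20, 50

requests_high_estimate = household_liters_per_day * 1000 / ml_per_request_low
requests_low_estimate = household_liters_per_day * 1000 / ml_per_request_high

print(f"One household-day of water ~ {requests_low_estimate:,.0f}"
      f" to {requests_high_estimate:,.0f} requests")    # ~12,000 to ~30,000
```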

[–] Showroom7561@lemmy.ca 3 points 4 days ago* (last edited 4 days ago)

Proof was during COVID:

In many megacities of the world, the concentration of PM and NO2 declined by > 60% during the lockdown period. The air quality index (AQI) also improved substantially throughout the world during the lockdown. SOURCE

[–] inb4_FoundTheVegan@lemmy.world 3 points 4 days ago* (last edited 4 days ago) (3 children)

I'm on my phone so I can't fully crunch the numbers, but I took a few minutes to poke around and I think I found the stats to put both of these in perspective.

https://www.arbor.eco/blog/ai-environmental-impact

Each query sends out roughly 4.32 grams of CO₂e (MLCO2), which may seem trivial on its own, but add up millions of queries a day and you're looking at a staggering daily output. 1 million messages sent to ChatGPT is equivalent to 11,001 miles driven by an average gasoline-powered passenger vehicle.

https://www.epa.gov/greenvehicles/greenhouse-gas-emissions-typical-passenger-vehicle

The average passenger vehicle emits about 400 grams of CO2 per mile.

So, yikes: without a doubt unsustainable energy usage. But compare this to Wikipedia's article on the environmental impact of COVID:

https://en.wikipedia.org/wiki/Impact_of_the_COVID-19_pandemic_on_the_environment?wprov=sfla1

In 2020, carbon dioxide emissions fell by 6.4% or 2.3 billion tonnes globally.

My napkin math says that we would need ~532,407,407 AI queries to match the 2020 work-from-home drop, but unfortunately, ChatGPT alone is estimating 2.5 billion prompts daily.

I started writing this assuming the opposite was true, but unfortunately AI is a bigger environmental impact than an RTO. Which is honestly shocking. I hope someone corrects my math and tells me it isn't this dire. Work from home should be the norm, but AI is truly just a massive environmental burden.
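For anyone who wants to redo the napkin math with explicit units, here is a rough sketch using only the figures quoted in this thread (4.32 g CO₂e per query, ~400 g CO₂ per mile, the 2.3-billion-tonne drop in 2020, and 2.5 billion prompts per day); all inputs are the thread's own estimates, not verified values.

```python
# Unit-explicit recomputation of the napkin math, using the rough figures
# quoted in the comment above.
G_PER_TONNE = 1_000_000

g_co2e_per_query = 4.32
g_co2_per_mile = 400
covid_drop_tonnes = 2.3e9          # global CO2 drop in 2020
prompts_per_day = 2.5e9            # ChatGPT's claimed daily prompt volume

# Sanity check of the "1 million messages ~ 11,001 miles" claim:
miles_per_million_queries = 1e6 * g_co2e_per_query / g_co2_per_mile    # ~10,800

# Daily and yearly emissions at the cited prompt volume:
daily_tonnes = prompts_per_day * g_co2e_per_query / G_PER_TONNE        # ~10,800 t/day
yearly_tonnes = daily_tonnes * 365                                     # ~3.9 million t/yr

# Queries needed to emit as much as the whole 2020 drop:
queries_to_match_drop = covid_drop_tonnes * G_PER_TONNE / g_co2e_per_query

print(f"{miles_per_million_queries:,.0f} miles per million queries")
print(f"{daily_tonnes:,.0f} tonnes/day, {yearly_tonnes:,.0f} tonnes/year")
print(f"{queries_to_match_drop:.2e} queries to match the 2020 drop")
```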

[–] oplkill@lemmy.world 7 points 4 days ago

This! There's no reason to go back to the office for some professions, like programmers, managers, etc.

[–] SolidShake@lemmy.world 33 points 5 days ago (3 children)

I don't think regular people really understand the power needed for AI. We're taught that we just have it, but not where it comes from.

[–] slazer2au@lemmy.world 11 points 5 days ago (1 children)

True, but most people don't realise how little not printing an email 'helped' the environment.

[–] SolidShake@lemmy.world 9 points 5 days ago (1 children)

It would have been significant if a lot of people did it.

[–] IndiBrony@lemmy.world 5 points 5 days ago (1 children)

I'm doing my part. Can't remember the last time I had to print anything.


I don't think regular people really understand how little 3 Wh per request is. It's the energy you take in by eating 3 kcal. Or what your WiFi router uses in half an hour. Or your clothes dryer in 5 seconds.
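To make the '3 Wh per request' figure concrete, a small sketch of the equivalences above; the router and dryer power ratings are assumed typical values, not measurements.

```python
# Rough everyday equivalents of ~3 Wh, the per-request figure cited above.
# The appliance power ratings are assumed typical values.
WH_PER_KCAL = 1.163             # 1 kcal = 1.163 Wh

request_wh = 3.0
router_watts = 6                # assumed typical WiFi router draw
dryer_watts = 2200              # assumed typical clothes dryer draw

print(f"{request_wh / WH_PER_KCAL:.1f} kcal of food energy")        # ~2.6 kcal
print(f"{request_wh / router_watts * 60:.0f} min of WiFi router")   # ~30 min
print(f"{request_wh / dryer_watts * 3600:.0f} s of clothes dryer")  # ~5 s
```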

[–] fading_person@lemmy.zip 3 points 4 days ago (1 children)

People keep telling us that AI energy use is very low, but at the same time, companies keep building more and more giant, power-hungry datacenters. Something simply doesn't add up.

Sure, a small local model can generate text at low power usage, but how useful will that text be, and how many people will actually use it? What I see is people constantly moving to the newest, greatest model, and using it for more and more things, processing more and more tokens. Always more and more.

[–] jj4211@lemmy.world 3 points 4 days ago

Each datacenter is set to handle millions of users, so it concentrates all the little requests into very few physical locations.

The tech industry further amplifies things with ambient LLM invocation. You do a random Google search, and it implicitly runs an LLM query unasked. When a user is using an LLM-enabled code editor, it's making LLM requests every few seconds of typing to drive the autocomplete suggestions. Often it has to submit a new LLM request before the old one has even completed, because the user typed more while the LLM was chewing on the previous input.

So each LLM invocation may be reasonable, but they are concentrated, impact-wise, into very few places, and invocations are amplified by the tech industry being overly aggressive about overuse for the sake of 'ambient magic'.
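As a rough, hypothetical illustration of that request amplification, here is a sketch of an editor autocomplete loop that fires a new LLM request on each brief pause in typing and cancels whatever is still in flight; fetch_completion is a stand-in for whatever backend call a real editor would make.

```python
import asyncio

# Hypothetical sketch: an editor autocomplete loop that issues an LLM
# request after each keystroke, cancelling any request that is still
# in flight when the user types more.

async def fetch_completion(prefix: str) -> str:
    """Stand-in for a real LLM backend call; just simulates latency."""
    await asyncio.sleep(2.0)               # pretend the model takes 2 s
    return f"<completion for {prefix!r}>"

async def autocomplete_session(keystrokes: list[str]) -> None:
    buffer = ""
    in_flight: asyncio.Task | None = None
    for key in keystrokes:
        buffer += key
        if in_flight and not in_flight.done():
            in_flight.cancel()             # supersede the unfinished request
        in_flight = asyncio.create_task(fetch_completion(buffer))
        await asyncio.sleep(0.5)           # user keeps typing every 0.5 s
    if in_flight:
        print(await in_flight)             # only the last request finishes

asyncio.run(autocomplete_session(list("def main")))
```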

[–] daniskarma@lemmy.dbzer0.com 21 points 5 days ago

To be fair, they never cared about the environment. Paper is easy to recycle and certainly not the most polluting material to produce.

It was more about saving money, greenwashing, and pushing a conversion towards digital archiving (which is much more efficient than paper).

[–] HubertManne@piefed.social 14 points 5 days ago (4 children)

This is the main reason I am reticent about using AI. I can get around its functional limitations, but I need to know they have brought the energy usage down.

[–] taiyang@lemmy.world 11 points 5 days ago (1 children)

It's not that bad when it's just you fucking around having it write fanfics instead of doing something more taxing, like playing an AAA video game or, idk, running a microwave or whatever it is normies do. Training a model is very taxing, but running one isn't, and the opportunity cost might even be net positive if you tend to use your GPU a lot.

It becomes more of a problem when everyone is doing it when it's not needed, like reading and writing emails. There's no net positive, it's very large-scale usage, and brains are a hell of a lot more efficient at it. This use case has gotta be one of the dumbest imaginable, all while making people legitimately dumber the more they use it.

[–] HubertManne@piefed.social 3 points 5 days ago (3 children)

Oh, you're talking about running it locally, I think. I play games on my Steam Deck, as my laptop couldn't handle it at all.

Your Steam Deck at full power (15 W TDP by default) equals 5 ChatGPT requests per hour. Do you feel guilty yet? No? And you shouldn't!

[–] slazer2au@lemmy.world 9 points 5 days ago (3 children)

You can run one on your PC locally, so you know how much power it's consuming.

[–] Blue_Morpho@lemmy.world 8 points 5 days ago (1 children)

It's the same as playing a 3D game: it's a GPU (or GPU equivalent) doing the work. It doesn't matter if you're asking it to summarize an email or to play Red Dead Redemption.

[–] HubertManne@piefed.social 5 points 5 days ago

I mean, if every web search I do is like playing a 3D game, then I'll stick with web searches. 3D gaming is the most energy-intensive thing I do on a computer.

[–] a_wild_mimic_appears@lemmy.dbzer0.com 3 points 5 days ago (13 children)

How much further down than 3 Wh per request can you go? I hope you don't let your microwave run 10 seconds longer than optimal, because that's exactly the amount of energy we are talking about. Or run a 5 W nightlight for a bit over half an hour.

LLM and image generation are not what kills the climate. What does: flights, cars, meat, and badly insulated houses leading to high energy usage in winter. Even if we turned off all GenAI, it wouldn't even leave a dent compared to those behemoths.

[–] Chozo@fedia.io 12 points 5 days ago

Text generation uses hardly any energy at all, though. Most phones do it locally these days. In fact, it likely takes less energy to generate an email in 5 seconds than it would take for you to type it out manually in 5 minutes with the screen on the whole time.

[–] stevedice@sh.itjust.works 12 points 5 days ago (1 children)
[–] cdf12345@lemmy.zip 16 points 5 days ago (3 children)

“If everyone is littering, it’s not a big deal if I throw the occasional can on the ground”

[–] Artisian@lemmy.world 8 points 5 days ago

I mean, it depends on the email. If you spend more time answering it yourself than the AI would, you almost certainly emit more greenhouse gases, use more fresh water and electricity, and burn more calories. Depending on the email, you might have also decreased net happiness generally.

Do we care about the environment or not? Please, oppose datacenters in deserts and stop farming alfalfa where water supplies are low. But your friend using AI to answer an email that could have been a Google search is not the problem.

[–] a_wild_mimic_appears@lemmy.dbzer0.com 7 points 5 days ago (3 children)

I miss the days when climate activists didn't get distracted by small change like GenAI. The big-ticket issues haven't changed since the beginning of the climate movement: cars, flights, industry (mainly concrete), meat, and heating/AC are what drive climate change. Any movement that polices individual usage with negligible CO2 emissions will fail, because no one likes to be preached at.

[–] _AutumnMoon_@lemmy.blahaj.zone 11 points 5 days ago

Please consider the environment before sending me an email, seriously, I won't read it.

[–] TheReturnOfPEB@reddthat.com 10 points 5 days ago

I want to see the 'cease and desist, you may not use my Facebook posts without my express permission' type footers, but aimed at AI, start showing up.

[–] aarRJaay@lemmy.world 8 points 5 days ago (3 children)

But it's everywhere now, and it's almost impossible to use mainstream services without it being involved. I can't just go to Google anymore, type a search query, and get a reply without AI BS being used. How long before it's baked into the Gmail compose window and it does it without me wanting it to?

[–] IndiBrony@lemmy.world 10 points 5 days ago

Then we stop using it.

I think we need a Rule 34 of open-source programs:

Rule 34: If it exists, there is an open-source version of it

i) If no open-source version exists, it is currently being created

ii) If no open-source version is being created, you must create it yourself

[–] chicken@lemmy.dbzer0.com 4 points 5 days ago* (last edited 5 days ago)

Doesn't Gmail already do this? I seem to remember there being 'suggested response' options, which were definitely AI-generated, before I turned them off in the settings. That option being presented to me creeps me out, because you can't know if what you're receiving was actually written by the person sending it.

[–] shalafi@lemmy.world 3 points 5 days ago* (last edited 5 days ago)

Thanks. You reminded me to turn Gemini off. Did that once and it came back on.
