this post was submitted on 24 May 2025
1166 points (99.0% liked)

Science Memes

[–] vivendi@programming.dev 2 points 10 hours ago* (last edited 10 hours ago) (3 children)

I will cite the scientific article later when I find it, but essentially you're wrong.

[–] lipilee@feddit.nl 7 points 8 hours ago (2 children)

Water != energy, but I'm actually here for the science if you happen to find it.

[–] EldritchFeminity@lemmy.blahaj.zone 1 points 48 minutes ago

It can be, in the sense that many forms of power generation are just some kind of water or steam turbine, but that's neither here nor there.

IMO, the graph is misleading anyway, because the criticism of AI from that perspective was about data centers and companies using water for cooling and energy, not about individuals using water on a single prompt. I mean, Microsoft has entered a deal with a power company to restart one of the nuclear reactors at Three Mile Island to cover the expected energy cost of its AI. Using their service is bad because it incentivizes that scale of energy and resource use.

It's like how during COVID the world massively reduced individual car usage for a year and emissions barely budged, because a single one of the largest freight ships puts out more emissions annually than every personal car combined.

[–] vivendi@programming.dev 1 points 8 hours ago

This particular graph exists because a lot of people freaked out over "AI draining the oceans"; that's why the original paper (I'll look for it when I have time, I have an exam tomorrow. Fucking higher ed, man) made it.

[–] xthexder@l.sw0.com 6 points 8 hours ago (1 children)

Asking ChatGPT a question doesn't take an hour like most of these... this is a very misleading graph.

[–] vivendi@programming.dev 3 points 8 hours ago* (last edited 8 hours ago) (1 children)

This is actually misleading in the other direction: ChatGPT is a particularly intensive model. You can run a GPT-4o-class model on a consumer mid- to high-end GPU, which puts its environmental impact in the same ballpark as gaming.
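
A rough sketch of that ballpark (these numbers are my own assumptions for illustration, not from any paper):

```python
# Back-of-envelope: energy for one locally-run LLM response vs. an hour
# of gaming on the same GPU. All figures are assumptions, not measurements.

GPU_POWER_W = 350          # assumed draw of a mid/high-end consumer GPU
SECONDS_PER_RESPONSE = 20  # assumed GPU-busy time for a single response

wh_per_response = GPU_POWER_W * SECONDS_PER_RESPONSE / 3600
wh_per_gaming_hour = GPU_POWER_W * 1.0  # same GPU pegged for one hour

print(f"One response:   ~{wh_per_response:.1f} Wh")
print(f"Hour of gaming: ~{wh_per_gaming_hour:.0f} Wh")
print(f"Responses per gaming-hour: ~{wh_per_gaming_hour / wh_per_response:.0f}")
```

Under those assumptions, a single local response costs about 2 Wh, so one hour of gaming buys you on the order of a couple hundred responses. Change the assumptions and the ratio moves, but the order of magnitude is the point.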

You can also train a model on a cluster of 3090s or 4090s, which is what people actually do, and that's still in the same range as gaming. (And more productive than 8 hours of WoW grinding while chugging a warmed-up Nutella glass as a drink.)

Models like Google's Gemma (NOT Gemini; these are two completely different things) are insanely power-efficient.

[–] xthexder@l.sw0.com 3 points 8 hours ago* (last edited 8 hours ago) (1 children)

I didn't even say which direction it was misleading; it's just not valid to compare a single invocation of an LLM with an unrelated continuous task.

You're comparing Volume of Water with Flow Rate. Or, if this were power, you'd be comparing Energy (joules or kWh) with Power (watts).
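
To make the units mismatch concrete (both figures below are invented for illustration):

```python
# A prompt is an amount of energy; gaming is a rate of energy use.
# Comparing them forces you to pick a duration, and the chosen duration
# decides the result. Both figures are made up for illustration.

PROMPT_ENERGY_KWH = 0.003  # assumed one-off cost of a single query
GAMING_POWER_KW = 0.35     # assumed continuous draw while gaming

for hours in (0.1, 1.0, 8.0):
    gaming_kwh = GAMING_POWER_KW * hours  # energy = power x time
    print(f"{hours:>4} h of gaming = {gaming_kwh:.3f} kWh "
          f"= {gaming_kwh / PROMPT_ENERGY_KWH:.0f} prompts")
```

Whoever picks the duration picks the conclusion, which is exactly why the graph's one-hour baseline is doing all the work.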

Maybe comparing asking ChatGPT a question to doing a Google search (before their AI results) would actually make sense. I'd also dispute the "downloading a file" and other bandwidth-related numbers; network transfers are insanely optimized at this point.
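
On the bandwidth point, here's why those numbers are so easy to dispute (the intensity values below are assumptions spanning the rough range quoted across studies, not figures from the graph's source):

```python
# File-download energy estimates are dominated by the assumed network
# energy intensity (kWh per GB), and published estimates for that figure
# span orders of magnitude. Values below are assumptions.

FILE_GB = 10  # hypothetical large download

for kwh_per_gb in (0.01, 0.1, 1.0):  # rough spread across studies
    print(f"At {kwh_per_gb:>4} kWh/GB: {FILE_GB * kwh_per_gb:5.2f} kWh "
          f"for a {FILE_GB} GB file")
```

A 100x spread in the input is a 100x spread in the output, so any single bar on that graph reflects a methodology choice as much as a measurement.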

[–] vivendi@programming.dev 1 points 8 hours ago

I can't really provide any further insight without finding the damn paper again (academia is cooked), but inference is famously low-cost. This is basically an "average user damage to the environment" comparison, so, for example, a user chatting with ChatGPT gobbles less water than one downloading 4K porn (at least according to this particular paper).

As with any science, the statistics vary, and to actually analyze this with rigor we'd need to sit down and really go deep on the data. Which is more than I intended when I made a passing comment lol

[–] Sorse@discuss.tchncs.de 1 points 8 hours ago (1 children)
[–] vivendi@programming.dev 3 points 8 hours ago* (last edited 8 hours ago)

According to https://arxiv.org/abs/2405.21015

The absolute most monstrous, energy-guzzling model tested drew about 10 MW of power during training.

Most models need less than that, and non-frontier models can even be trained on gaming hardware with comparatively little energy consumption.

That paper, by the way, reports a 2.4x year-over-year increase in model-training compute. BUT it doesn't mention DeepSeek, which rocked the Western AI world with its comparatively small training cost (2.7M GPU-hours in total).
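
To turn that GPU-hours figure into energy (the per-GPU power draw is my assumption, not a number from the paper):

```python
# Rough conversion of DeepSeek's reported ~2.7M GPU-hours into energy.
# The per-GPU draw (including some datacenter overhead) is an assumption.

GPU_HOURS = 2.7e6

for kw_per_gpu in (0.4, 0.7):
    mwh = GPU_HOURS * kw_per_gpu / 1000  # kWh -> MWh
    print(f"At {kw_per_gpu} kW/GPU: ~{mwh:,.0f} MWh of training energy")
```

So even under the heavier assumption, the whole training run lands somewhere around 1-2 GWh, a one-time cost.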

Some companies offset the environmental damage of model training with renewables and whatever bullshit, so the actual day-to-day usage cost matters more than the huge one-time cost at the start. (Drop by drop, an ocean is formed - Persian proverb.)