Is AI really burning a forest every time it’s used?

Either that, or drinking an entire lake’s worth of water per generated response. I see this a lot on hexbear and I’m genuinely curious whether I’ve been misinformed. How true are the claims about the climate impacts of LLMs? Is the “burning a forest, drinking a lake’s worth of water” stuff true, or is it just hyperbole? Because looking into this somewhat, I see contradictions. Alex Avila pointed out a lot of these contradictions, and I’m going to draw on what he said in his video AI Wars: How Corporations Hijacked Anti-AI Backlash, at around 2:40:42.

Anyways, for example, if I go into a random article, like this NPR one, the writer cites Goldman Sachs for these claims:

According to a report by Goldman Sachs, a ChatGPT query needs nearly 10 times as much electricity as a Google search query.

Goldman Sachs has researched the expected growth of data centers in the U.S. and estimates they’ll be using 8% of total power in the country by 2030, up from 3% in 2022. Company analysts say “the proliferation of AI technology, and the data centers necessary to feed it” will drive a surge in power demand “the likes of which hasn’t been seen in a generation.”^[NPR: AI brings soaring emissions for Google and Microsoft, a major contributor to climate change]

First off, I didn’t know Goldman Sachs was a research institution? I thought they were a financial and banking institution, so it’s strange that NPR is citing them. But NPR is one of many, yes? So on to a different article, from UNEP:

Third, data centres use water during construction and, once operational, to cool electrical components. Globally, AI-related infrastructure may soon consume six times more water than Denmark, a country of 6 million, according to one estimate. That is a problem when a quarter of humanity already lacks access to clean water and sanitation.

Finally, to power their complex electronics, data centres that host AI technology need a lot of energy, which in most places still comes from the burning of fossil fuels, producing planet-warming greenhouse gases. A request made through ChatGPT, an AI-based virtual assistant, consumes 10 times the electricity of a Google Search, reported the International Energy Agency. While global data is sparse, the agency estimates that in the tech hub of Ireland, the rise of AI could see data centres account for nearly 35 per cent of the country’s energy use by 2026^[UNEP: AI has an environmental problem. Here’s what the world can do about that. ]

or one from The Commons.

According to the IEA, while a single Google search takes 0.3 watt-hours of electricity, a ChatGPT request takes 2.9 watt-hours.^[The Commons: Understanding AI's environmental footprint]

or Axios.

One oft-cited rule of thumb suggested that querying ChatGPT used roughly 10 times more energy than a Google search — 0.3 watt-hours for a traditional Google search compared with 2.9 watt-hours for a ChatGPT query.^[Axios: AI's climate impact is still a black box]

So, looking at the Goldman Sachs report,^[Goldman Sachs: AI is poised to drive 160% increase in data center power demand] you find this claim:

A single ChatGPT query requires 2.9 watt-hours of electricity, compared with 0.3 watt-hours for a Google search, according to the International Energy Agency.

Which in turn cites the International Energy Agency, much like The Commons. In particular, they cite this report,^[IEA: Electricity 2024 Analysis and forecast to 2026] which mentions this:

Market trends, including the fast incorporation of AI into software programming across a variety of sectors, increase the overall electricity demand of data centres. Search tools like Google could see a tenfold increase of their electricity demand in the case of fully implementing AI in it. When comparing the average electricity demand of a typical Google search (0.3 Wh of electricity) to OpenAI’s ChatGPT (2.9 Wh per request), and considering 9 billion searches daily, this would require almost 10 TWh of additional electricity in a year.
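For what it’s worth, the IEA’s own arithmetic here is internally consistent; a quick sanity check of the “almost 10 TWh” figure (assuming the quoted per-query numbers and 9 billion searches a day):

```python
# Sanity-checking the IEA's "almost 10 TWh" figure, using the
# per-query numbers quoted above and 9 billion searches per day.
google_wh = 0.3          # Wh per standard Google search (IEA figure)
chatgpt_wh = 2.9         # Wh per ChatGPT request (IEA figure)
searches_per_day = 9e9
days = 365

# Extra energy if every search became a ChatGPT-style request.
extra_wh_per_year = (chatgpt_wh - google_wh) * searches_per_day * days
extra_twh = extra_wh_per_year / 1e12  # 1 TWh = 1e12 Wh
print(f"{extra_twh:.1f} TWh")  # → 8.5 TWh, i.e. "almost 10 TWh"
```

Of course, the arithmetic being internally consistent says nothing about whether the 2.9 Wh input figure is sound, which is the real issue below.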

And for that figure, the IEA cites this paper by De Vries.^[The growing energy footprint of artificial intelligence]

The Axios article links to a different study, but that study links back to the De Vries paper. So it’s interesting how a lot of these lead back to the De Vries paper. To quote the relevant portion from De Vries:

Alphabet’s chairman indicated in February 2023 that interacting with an LLM could “likely cost 10 times more than a standard keyword search.6" As a standard Google search reportedly uses 0.3 Wh of electricity,9 this suggests an electricity consumption of approximately 3 Wh per LLM interaction.

Alex Avila points out how nonsensical this is in his video, at around 2:46:45. He also covers the Goldman Sachs connection and other finance-capital connections to this later in that video.

Mainly he points out that the “10 times more than a standard keyword search” is a financial cost, not an energy one, while the 0.3 Wh figure is an energy cost. It’s nonsense to take 0.3 Wh and multiply it by 10, a guessed cost ratio, to get an entirely new energy figure. I agree, and that 3 Wh per LLM interaction makes no sense, especially since it’s based on a Google keyword search, not LLM usage. Alex also points out that the Google keyword search figure is from 2009, and a lot has changed since then.
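To make the unit mismatch concrete, here’s a minimal sketch of the De Vries chain of reasoning (the numbers are the ones quoted above; the point is that the 10x is a cost ratio, not an energy ratio):

```python
# The De Vries chain of reasoning, with units made explicit.
# The "10x" was a statement about COST; 0.3 is a figure in WATT-HOURS.
google_search_energy_wh = 0.3     # energy per search (2009-era figure)
llm_vs_search_cost_ratio = 10     # a cost ratio, per Alphabet's chairman

# Multiplying an energy by a cost ratio only yields an energy if you
# assume cost scales 1:1 with electricity - which was never shown.
claimed_llm_energy_wh = google_search_energy_wh * llm_vs_search_cost_ratio
print(claimed_llm_energy_wh)  # → 3.0, the "~3 Wh per LLM interaction" figure
```

The multiplication reproduces the headline number, but the operands live in different units, so the result is not actually an energy estimate.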

In Alex’s video he goes on to talk about more issues with the De Vries paper, but even based on this alone, those articles above are getting something wrong. I know the articles mention more, so I’ll just bring up Alex’s other points, since I think he did a good investigation on this.

One of the things Alex points out is that a lot of the actual energy use by AI companies and data center operators isn’t transparent, so it’s hard for us to know. How can we really know how much is used?

Another thing, to go back to Alex: in his video he mentions this study, The carbon emissions of writing and illustrating are lower for AI than for humans,

which argues that AI costs are very low, like in this graph.

Which goes against what I hear others say, or those articles above. Is there something wrong with that study?

Also, another thing worth mentioning (again, I’m referring to a lot of what Alex said) is that data centers only take up 1–2% of electricity use. Another article says this too:

Around the globe, data centers currently account for about 1 to 1.5 percent of global electricity use, according to the International Energy Agency.^[Scientific American: The AI Boom Could Use a Shocking Amount of Electricity]

and the IEA link in that article is different, since they linked to Data Centres and Data Transmission Networks.

What’s interesting about that IEA page they linked is what it says here:

Data centres and data transmission networks are responsible for 1% of energy-related GHG emissions

Anyways, Alex also mentions that data centers and AI overall make up only a very small fraction, which I think is a fair point. How has AI really changed anything in regards to climate change when the same issues are still at hand? Last I checked, the U.S. military is still one of the largest polluters. Vehicles produce more pollution. Agriculture too. Yet somehow AI is out-contributing all of these? I know energy also takes up a portion, but as has been pointed out, data centers take up less than 2% of total energy use, and AI is only a fraction of that 2% (expected to grow), since data centers are used for other things besides LLMs.

Also worth mentioning is the idea that the increase in energy use might be overestimated as well. Going back to that video at 2:57:00, I think it’s worth watching for the issues with the IEA report on this stuff.

Besides that, another thing mentioned by Alex and others is water usage, but it’s been pointed out that a lot of water is recycled in data centers? Along with some data centers using things like wastewater. Also, one thing Alex points out is that there is less water use by things like ChatGPT than claimed, referring to this paper, Making AI Less “Thirsty”: Uncovering and Addressing the Secret Water Footprint of AI Models, which says:

Additionally, GPT-3 needs to “drink” (i.e., consume) a 500ml bottle of water for roughly 10 – 50 medium-length responses, depending on when and where it is deployed.

Also, a really good point Alex made is how a single hamburger has a bigger water footprint, at 1,695 liters of water for one hamburger. Another thing: from what I understand, training an AI does use a lot of energy, but when someone is interacting with an AI (inference, I think it’s called), the energy cost is way less.
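Putting the paper’s figure next to the hamburger number (a rough sketch, taking the paper’s worst case of 10 responses per 500 ml bottle):

```python
# Comparing the GPT-3 water figure to a hamburger's water footprint,
# using the paper's WORST case: 500 ml per 10 medium-length responses.
water_per_response_l = 0.5 / 10   # 0.05 L per response, worst case
hamburger_l = 1695                # liters of water per hamburger (figure quoted above)

# How many ChatGPT-style responses equal one hamburger's water use?
responses_per_burger = hamburger_l / water_per_response_l
print(f"{responses_per_burger:,.0f}")  # → 33,900 responses per hamburger
```

With the paper’s best case (50 responses per bottle), the number would be five times larger again.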

Anyways, I’m just wondering if I’m being misled or getting some things wrong? Since it just seems like the climate change effects from AI are rather overblown. I think Alex did a really good investigation into this. I am being genuine here, since to me a lot of AI stuff is overhyped, and I feel there’s a bit of reactionary sentiment to it, treating it as a scapegoat for everything wrong, which just feeds more into that hype.

There are some valid issues with AI, and besides that, China is showing proper uses for it. It also helps that for the last few years they’ve been using more renewable energy, which cuts out a lot of the emissions from AI stuff, no? In regards to the electricity use.

31 comments
  • I’m gonna hit this thread one more time because while cooking I realized another thing: I spent a little under an hour reading the source material on just one element of the OP’s post and typing out a few summaries of that. What isn’t included in that is the apropos point that my hour of human writer labor, to the extent that a person can mischaracterize tippa-tapping out forum posts as such, produced two hundred or so words whose value can’t be quantified by their number because they are actually correct.

    The difference between my response and a Claude summary of the sources for the graph in the op is the difference between a deck built by four dudes from the Lowe’s parking lot and my dumbass cousin and his buddies. The same amount of lumber, fasteners, joist hangers, concrete and possibly even the same tools and lunches went into it, but one will collapse in a week and the other won’t.

    That’s what I really can’t emphasize enough, ai can look good in comparison to people’s work if the externalities are ignored and the analysis is quantitative but it falls apart under scrutiny or qualitative perspectives.

    This is a microcosm of ai use in reality where it’s only worthwhile when you need to produce something but the specific thing or its level of quality, however that’s measured, doesn’t matter.

  • Pretty much all popular writing about AI power usage is dogshit. Half of it confuses Watts and Watt-hours, meaning no qualified person actually critically examined the claims. Water usage is compared to "swimming pools"(no mention of what size) and electricity use is compared to old corded phones, which could mean extreme incompetence, but more likely they are being intentionally misleading.

  • Oh, yeah, the "it burns a forest" or "drinks the ocean" is hyperbole.

    However, I'm fine with it as:

    1. the USA seems to be dragging on renewable energy/cleaner energy sources
    2. Rainfall patterns are all over the place now. So it can actively be more difficult to have an inexpensive and reliable source of water to feed the power/cooling consumption of the datacenters.
    3. It isn't just ChatGPT on a site somewhere that you have to specifically go to. It's being shoved in everywhere. So it's not [1 site x 10 med-length replies], it's [100's of sites x 1000's of short to med-length replies]. And that's just text. Replying to an email? AI gives you an automated response that you ignore to type up your email. Searching for video game information, "Hey, here's some AI generated content" you didn't want. Trying to look up some essays about something Mao said? First result may/may not be an AI abstract that you'll ignore because it's mostly just a copy/paste of something found in the first few pages of search results. Shit, just keeping the datacenters idling has a "fixed" cost in energy/water that is used even if absolutely NOBODY asked an AI for anything.
  • honestly, the criticism of AI i often see is wrong/exaggerated. i think there is a deep, existential horror to the usage of ai that people do not like, but have a hard time actually describing why it makes them so uncomfortable, so they go for talking points they see in the discourse. that's why a lot of ai bros dissuade all criticism with "they just think ai is evil lmao"

    it isn't good for the environment, and regardless of how not good it is for the environment, it's being used to disadvantage the working class further for profit. that's bad! so i wish it was less "ai burns an entire forest with each use" and more "no amount of energy usage is an acceptable amount to generate dogshit"

    or something idk i spent like all day reading ai discourse and i think my brain is fried rn

  • Additionally, GPT-3 needs to “drink” (i.e., consume) a 500ml bottle of water for roughly 10 – 50 medium-length responses, depending on when and where it is deployed.

    Worth noting that this water usage is mostly from the electricity, since generating electricity uses water. You are pretty accurate in your assessment and I'm surprised this is an askchapo post rather than an effort post. It is overblown. It honestly strikes me as a distraction from the actual climate polluters. A small reduction of the meat industry would be massively more impactful than stopping all AI models. There's other criticisms of AI that work better imo.

    it also helps how for the last few years they been using more renewable energies, which cut out a lot of the emissions in regards for AI stuff no?

    I mean

    not really imo. If that electricity could have been redirected to turning off a fossil fuel plant then it's still the same "pool" being used. If a datacenter uses 90% renewable but that means the city uses more fossil fuels instead of that renewable, it's a wash.

    edit: Here's another link I found when searching around; his math works out to 0.3 Wh for small queries. Obviously consider the source etc etc, but I personally don't see an issue with his calculations. He also covers training and GPU production costs if you are interested (though briefly).

    • Worth noting that this water usage is mostly from the electricity, since generating electricity uses water. You are pretty accurate in your assessment and I'm surprised this is an askchapo post rather than an effort post.

      That is worth noting, and one of the papers linked does bring it up within one of its scopes. I didn't make it an effort post since I'm borrowing heavily from Alex on this, but also, just from looking around and checking stuff, a lot of his investigation holds up.

      I mean, not really imo. If that electricity could have been redirected to turning off a fossil fuel plant then it's still the same "pool" being used. If a datacenter uses 90% renewable but that means the city uses more fossil fuels instead of that renewable, it's a wash.

      I think that is a fair point. My mindset was thinking of how China is approaching AI, especially in regards to how they're also switching over to renewables over time. It just seems like the impacts from AI would be even less the more renewable energy is used, depending on the context and how it's used. Since outside of LLMs they've been using it for lots of stuff; I recall reading about China automating a factory line to make one type of missile and its variants 24 hours a day.

  • Good analysis and it's good to do some self-crit to find if our arguments are accurate or just reflexive and reactionary. Just 2 counterpoints though:

    1. You're only accounting for the marginal cost of making new queries to a pre-trained model. The whole essence of capital is that it constantly grows and makes new forms of itself in pursuit of profit, which is evidently true of LLMs. OpenAI et al are training huge new models every year, and the training is the part that consumes ridiculous amounts of resources. I don't have an analysis of the amortized cost of the training over the lifecycle of a new model, but I think that will probably change the calculation here quite a bit.
    2. Even when only accounting for marginal cost and not capital cost, new Chain-of-thought models like DeepSeek's R1 have massively shifted the balance of compute time to the side of query-time compute. The way they work is that instead of generating a response directly by predicting the next tokens after their query, they do that process many times and have a system that allows them to iteratively improve the response. That means that the cost of a query is now multiple times larger for models that use this technology. If big tech continues with the AI push I think a pessimistic prediction of the energy cost of AI would look worse than you suggest.
  • Alphabet’s chairman indicated in February 2023 that interacting with an LLM could “likely cost 10 times more than a standard keyword search.6" As a standard Google search reportedly uses 0.3 Wh of electricity,9 this suggests an electricity consumption of approximately 3 Wh per LLM interaction.

    I think the Google answer for how much energy a typical LLM interaction takes is going to be very different from an OpenAI answer for how much a typical LLM interaction takes. The reason for this is that the attention algorithm in LLMs is quadratic, meaning the amount of compute that has to happen scales with the square of the number of input tokens. So as the input context grows longer, the amount of compute increases quadratically (well - in the attention layer - the rest of the network is affected differently). Google seems to be using LLM responses in search where people have very short inputs (granted there's probably a bunch of other context fed in, but regardless)... whereas ChatGPT has a context window that grows with each message. So I think that saying a ChatGPT query is probably 3 Wh because Google estimates LLM interactions are 10x more energy intensive is likely wrong.
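    A toy sketch of the quadratic point (treating attention compute as simply proportional to the square of context length, and ignoring everything else, which is a big simplification):

```python
# Toy illustration of quadratic attention cost: growing the context
# 100x grows the attention compute 10,000x, so a long ChatGPT thread
# costs far more per reply than a short search-style prompt.
def attention_cost(context_tokens: int) -> int:
    # Attention compares every token with every other token: O(n^2).
    return context_tokens ** 2

short_prompt = attention_cost(100)     # search-style query
long_thread = attention_cost(10_000)   # a long multi-turn conversation

print(long_thread // short_prompt)  # → 10000: 100x the tokens, 10,000x the compute
```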

    Just to add: I think the pertinent information we would want, to say anything about the power consumption, is how many operations a typical interaction takes (and, even better, a distribution, because who knows if it's actually a situation where 10% of interactions take 99% of the energy), and how many operations per watt they achieve (and I think there are also different answers here based on batching and other optimizations that involve actually running many inferences at once, so maybe peak performance is what we'd be interested in).

  • ai won't kill the planet, but also ai eats a lot of shit at the production stage: making silicon with caps (think they dropped tantalum, but still), and everything taken together consumes a lot of minerals from the production side, fabs use lots of water/chemicals which are disposed of in funny ways (see apple shenanigans in cali), etc.

    main waste as is, is in developing competing models

    and goldman kinda is a research outfit, cause they predict some lateral demand to buy shares of companies involved (say on first blush you say buy nvidia, on second - buy tsmc, on third - buy energy suppliers and water rights in taiwan as well as copper production). also i don't think 3wh is particularly egregious: if it resolves in 2s, that corresponds to like a 5400w gpu, which is like fine, it's somewhat on the scale of 120kw per 70 blackwells (not that they would use them for inference). i think you can take advertised tokens per second and get to energy consumption using a whole rack (probably of tpus or whatever google's inference thingy is called) to get a somewhat sane answer
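    That back-of-envelope checks out, assuming the whole 3 Wh is burned during a 2-second response:

```python
# Back-of-envelope from the comment above: if a 3 Wh query
# resolves in 2 seconds, what sustained power does that imply?
energy_wh = 3
seconds = 2
energy_j = energy_wh * 3600   # 1 Wh = 3600 J, so 10,800 J total
power_w = energy_j / seconds
print(power_w)  # → 5400.0 W, the "5400w gpu" figure
```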

    tl;dr they won't burn a forest to work a datacenter, they will burn a forest to mine copper.
