August 22nd, 2024

Is AI eating all the energy?

The article examines the significant energy consumption of AI technologies, highlighting environmental concerns, the complexities of energy use narratives, and the distinction between energy-intensive training and less demanding inference processes.

The article discusses the energy consumption associated with artificial intelligence (AI) technologies, focusing on the efficiency and proportional costs of running AI models. It highlights a growing concern about the environmental impact of AI, with claims that AI systems consume significant amounts of energy, often compared unfavorably to traditional technologies. The author notes that while resentment toward big tech companies is often justified, the narrative surrounding AI's energy use may not fully capture the complexities involved; energy consumption is just one aspect of a broader conversation about AI's societal and economic implications.

The piece outlines the correlation between power consumption, heat emission, carbon output, and water usage: as demand for AI services increases, so does energy consumption. At the same time, advances in energy efficiency complicate the overall picture.

The article categorizes AI power requirements into training and inference, explaining that training a model is significantly more energy-intensive than using it for inference. It provides examples of energy consumption for different AI models, illustrating the range of energy costs associated with training and deploying them. Ultimately, the author suggests that while AI's energy use is a valid concern, it should be weighed alongside other factors in evaluating its overall impact.

- AI technologies are associated with significant energy consumption, raising environmental concerns.

- The narrative around AI's energy use may oversimplify the complexities involved.

- Energy consumption trends are influenced by both increasing demand and improvements in energy efficiency.

- Training AI models is much more energy-intensive than using them for inference (see the rough amortization sketch after this list).

- The discussion of AI's impact should encompass broader societal and economic implications beyond energy use.
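
To make the training-versus-inference split concrete, here is a minimal back-of-the-envelope sketch of how a one-time training cost amortizes across inference queries. Every number in it is an illustrative placeholder, not a figure from the article.

```python
# Back-of-the-envelope amortization of a one-time training cost across
# inference queries. ALL numbers are illustrative placeholders.

TRAINING_KWH = 1_300_000          # assumed one-time training energy (kWh)
INFERENCE_KWH_PER_QUERY = 0.003   # assumed marginal energy per query (kWh)

def energy_per_query(total_queries: int) -> float:
    """Amortized energy per query: a share of training plus marginal inference."""
    return TRAINING_KWH / total_queries + INFERENCE_KWH_PER_QUERY

for n in (10**6, 10**8, 10**10):
    print(f"{n:>14,} queries -> {energy_per_query(n):.4f} kWh/query")
```

With made-up numbers like these, the training share shrinks below the marginal inference cost once queries reach the hundreds of millions, which is why the training/inference split matters when judging totals.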

Related

Taking a closer look at AI's supposed energy apocalypse

Artificial intelligence's impact on energy consumption in data centers is debated. Current data shows AI's energy use is a fraction of overall consumption, with potential growth by 2027. Efforts to enhance efficiency are crucial.

Taking a closer look at AI's supposed energy apocalypse

Artificial intelligence's energy impact, particularly in data centers, is debated. AI's energy demands are significant but only a fraction of overall data center consumption. Efforts to enhance AI cost efficiency could mitigate energy use concerns.

Can the climate survive the insatiable energy demands of the AI arms race?

Google's emissions spike 50% in 5 years due to AI energy demands, posing climate challenges. Datacentres crucial for AI consume much electricity, with predictions of doubling consumption by 2026. Tech firms invest in renewables, but scaling challenges persist. AI's pursuit of advanced models intensifies energy use, raising sustainability concerns.

The Uneven Distribution of AI's Environmental Impacts

The article discusses AI's environmental impacts, emphasizing high energy and water use during model training. It highlights freshwater evaporation, uneven impact distribution, and suggests managing data centers responsibly to address environmental inequality.

Google reported a 13% increase in its emissions footprint in 2023

The environmental impact of AI is concerning, with emissions rising due to increased energy consumption in data centers. Efficient practices are needed to balance AI's benefits and its environmental costs.

11 comments
By @readyplayernull - 6 months
Isn't it disconcerting that brains consume so little energy doing continuous, high-complexity tasks like visual processing, and even at rest produce vivid simulations, while numeric calculations require so much effort for advanced brains? It's like the energy usage pattern is completely opposite between brains and computers. Shouldn't that tell us something about our approach to intelligence, namely that we may be using the wrong tool?
By @zekrioca - 6 months
I think the comparisons to images are a bit misleading: the author implies that generating images is somehow more energy-efficient than taking the photos yourself.

It basically ignores the fact that the very data these large models use came from real photos taken with real cameras, and that this used real energy, which should be included in the total training cost, i.e., in addition to the electricity consumed by the GPUs. This would affect how the total cost can be amortised with usage. The effect might just be to slow the amortisation a little, but it should still be included for a fair comparison.

Same applies to some of the other examples they give.
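
If one wanted to test the commenter's point numerically, a toy extension of the earlier amortization sketch could fold a hypothetical energy figure for gathering the training data into the fixed cost. All figures below are made-up placeholders, not measurements.

```python
# Toy extension of the earlier amortization sketch: fold a hypothetical
# energy cost for collecting the training data (photos, text, ...) into
# the fixed cost and compare. All figures are made-up placeholders.

TRAINING_KWH = 1_300_000
DATA_COLLECTION_KWH = 400_000     # hypothetical upstream cost of the dataset
INFERENCE_KWH_PER_QUERY = 0.003

def amortized_kwh(total_queries: int, include_data: bool) -> float:
    fixed = TRAINING_KWH + (DATA_COLLECTION_KWH if include_data else 0)
    return fixed / total_queries + INFERENCE_KWH_PER_QUERY

n = 10**8
print(f"GPU-only fixed cost: {amortized_kwh(n, False):.5f} kWh/query")
print(f"including dataset:   {amortized_kwh(n, True):.5f} kWh/query")
```

As the commenter suggests, the extra fixed cost only slows the amortisation; at large query volumes the per-query difference shrinks.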

By @lokimedes - 6 months
Progress (empowerment) always costs more energy, that is thermodynamics. We may be surprised in what form the expended energy takes, or skeptical about the benefits (think whale oil lamps), but that is the general pattern.

The idea that we can save our way out of climate change, which we (and especially eco-dreamers in Europe) read into the stagnating per-capita energy consumption of the last three decades, is a fiction that can only be held by people who haven't witnessed true progress (or cared for it).

By @synicalx - 6 months
I've often wondered if we're focusing on the wrong things with lines of thinking like "does X use too much power?". It's not like our demand for electricity is ever going to be lower than it is right now, so trying to use less of it seems futile. We can generate immense quantities of electricity given enough time and money, we've obviously fallen well behind if electricity is considered a limiting factor.

Disclaimer: I'm not talking specifically about AI here, just the production and consumption of electricity in all its forms.

By @jillesvangurp - 6 months
We live in interesting times. Data centers are starting to use a lot of energy. But in the grand scheme of the energy market rapidly trending towards using renewable energy, that's actually OK.

Energy usage is bad for our planet when you use things like coal and gas plants to generate the energy. But the vast majority of newly installed capacity is actually a mix of wind and solar supplemented with battery storage. That mix is growing at a rate measured in TW/year of added capacity. It's well over 80% now.

A lot of new gas and coal capacity is increasingly reserved for use rather than actually used. Shiny new gas plants designed to run 24x7 are actually now being pushed into a role as peaker plants. Which means they are rapidly becoming financial basket cases. Some relatively new plants have actually been closed already for this reason.

More coal gets decommissioned than built at this point (China and India are still adding capacity, the rest of the world mostly isn't). For the world, peak coal was last decade. And even in China, peak coal seems near.

Overall fossil fuel usage is about to peak and then decline. What that means is that from now on, the amount of fossil fuels (coal, gas, oil) used for electricity production should start to decline not just in relative terms but in absolute terms.

Don't get me wrong, it's still pretty bad, and in the short term the decline may not quite be happening yet, or may not be easy to observe. But in a few years there should be a measurable, steady decline. Experts seem to disagree on whether that starts now or by something like 2028. But the point is that we're at or close to the peak here.

Cloud computing and AI are one of the big drivers for all this. They create demand for more energy. This causes the energy market to grow. But as it grows, fossil fuel usage is starting to shrink. That's a trend that accelerates as renewable energy and batteries get cheaper (learning effects, technology improvements, etc.).

There are some pretty interesting economics at play here that boil down to this: doing more of a thing causes that thing to become cheaper. Not just a little bit, but by quite a lot. So there's value in creating new demand: it speeds all this up.

Cost per MWh is an interesting number to track. It has been dropping steadily over the last decade and a half. Anyone using a lot of energy, e.g. data centers, is going to look for the most cost-effective way to source that energy. And renewables are at this point the cheapest source by far.
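
The "doing more of a thing makes it cheaper" dynamic is usually modeled as an experience curve (Wright's law). A minimal sketch, with an assumed starting cost and learning rate rather than real market data:

```python
# Minimal experience-curve ("Wright's law") sketch: unit cost falls by a
# fixed fraction (the learning rate) each time cumulative production doubles.
# The parameters below are assumptions for illustration, not real market data.

def wrights_law_cost(initial_cost: float, cumulative_doublings: int,
                     learning_rate: float) -> float:
    """Unit cost after a number of doublings of cumulative production."""
    return initial_cost * (1 - learning_rate) ** cumulative_doublings

cost0 = 100.0          # assumed starting cost, $/MWh
learning_rate = 0.20   # assumed 20% cost drop per doubling

for d in range(6):
    print(f"after {d} doublings: ${wrights_law_cost(cost0, d, learning_rate):6.2f}/MWh")
```

A 20% learning rate halves the unit cost in roughly three doublings, which is why growing demand can accelerate the cost decline the commenter describes.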

The main premise of this article seems to be that we should slow down things like AI because they use too much energy. But by creating a need to grow the energy market, we're actually speeding up the demise of fossil fuel generation, not slowing it down. So I'm not convinced that putting a lot of energy (pun intended) into attempting to slow this down is productive. It's probably futile, and the effects are debatable (I would argue even negative).

Besides, computing resources are enabling a few things that actually cause behavioral changes. For example, we travel a lot less for work, and we can work remotely. A lot of work is also being automated entirely, removing the need to move humans around to do it. AI is projected to be a big part of that. It has an energy cost, but it also has upsides, and that cost can be met sustainably. So it's not all that bad.

By @1vuio0pswjnm7 - 6 months
"Recent tech trends have followed a pattern of being huge society-disrupting systems that people don't actually want."

"While planned obsolescence means this applies to consumer products in general, the recent major tech fad hypes - cryptocurrency, "the metaverse", artificial intelligence... - all seem to be comically expensive boondoggles that only really benefit the salesmen."

But according to sentence #1, some people do want these things, e.g., salespeople, investors, etc. But as sentence #2 implies, this is not a majority of people. Over the years, when people complain on HN about software and the web, other commenters, presumably software/web developers, respond that these complainers do not matter because they are only a minority: the majority of people do not complain. Failure to complain apparently means the silent majority "wants" what they get.

What rule can we derive from this developer "reasoning/argument"? That sometimes it makes sense to cater to a minority, and other times it doesn't?

"If you're not familiar, data centers are dedicated facilities for running servers. Data centers are "the cloud": instead of running your own servers, you can rent computer power from experts who are very good at keeping computers from turning off."

"The problem we immediately run into if we try to think about the proportional cost of AI is that there is no consensus on whether it's ultimately useful."

I think it is safe to say that only a minority of people are speculating that "AI" is useful. The majority are silent. Does that mean the majority want it?

We know what salespeople, investors and developers will say.

"As I mentioned earlier, the AI boom feels a lot like the blockchain cryptocurrency push of a few years ago. Like cryptocurrency, AI is a tech fad, it requires data centers, it consumes more energy than a webserver... the comparison is extremely natural."

If these things were truly useful, then it stands to reason the majority of people would be asking for them. Instead we see indifference. To developers, this indifference equates to "Yes, we want it".

A fundamental problem with paying attention to the speculation of developers and other pundits who advocate expensive, unnecessary uses of computers, e.g., "AI", besides the obvious conflict of interest, is that these folks do not have a good track record of being honest. To start Microsoft in the 1970s, Bill Gates had to lie that he had software that did not yet exist. This "vapourware" tactic might seem quaint, but that culture of dishonesty now involves much higher stakes.

Developers are still faking demos. Putting Elizabeth Holmes in prison has not stopped the Silicon Valley culture of fraud.

By @samstave - 6 months
I wrote up a thing on this:

HNer @externedguy "..built interactive map of active & decommissioned nuclear stations/reactors"

https://news.ycombinator.com/item?id=41189056

---

So I wrote the following in response:

(I correlated the nuclear reactor locations with datacenters and undersea cable endpoints, which will be near both nukes and datacenters.)

These could be overlaid as layers; then, by tracking shipments, we can see where AI consumes power:

---

...if we add the layers of the SubmarineCableMap [0] and DataCenterMap [1], and we begin to track shipments

And

https://i.imgur.com/zO0yz6J.png -- Left is nuke, top = cables, bottom = datacenters. I went to ImportYeti to look into the NVIDIA shipments: https://i.imgur.com/k9018EC.png

And you look at the suppliers coming from Taiwan, such as the water coolers and power cables, to sus out where they may be shipping to: https://i.imgur.com/B5iWFQ1.png -- but instead, it would be better to find shipping labels for datacenters that are receiving containers from Taiwan, from the same suppliers as NVIDIA, for things such as power cables. While the free data on ImportYeti is out of date, it gives a good picture of NVIDIA's supply line... With the goal of finding out which datacenters are getting such shipments, you can begin to measure the footprint of AI as it grows, and which nuke plants they are likely powered from.

Then, looking into whatever reporting one can access for the consumption/utilisation of the nukes' capacity in various regions, we can estimate the power footprint of growing global compute.

DataCenterNews and all sorts of other datasets are available, and creating this crawler/tracker is now likely fully implementable.

https://i.imgur.com/gsM75dz.png https://i.imgur.com/a7nGGKh.png

[0] https://www.submarinecablemap.com/

[1] https://www.datacentermap.com/
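
As a sketch of the cross-referencing step described above, one could compute, for each datacenter coordinate (e.g. from datacentermap.com), the nearest nuclear plant by great-circle distance. The coordinates below are hypothetical placeholders, not real facilities:

```python
import math

# Minimal sketch of the cross-referencing step: given (lat, lon) points for
# datacenters and nuclear plants, find each datacenter's nearest plant.
# Sample coordinates are hypothetical placeholders, not real sites.

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

plants = {"Plant A": (39.2, -76.5), "Plant B": (35.4, -80.9)}   # hypothetical
datacenters = {"DC-1": (38.9, -77.0), "DC-2": (35.2, -80.8)}    # hypothetical

for dc, (lat, lon) in datacenters.items():
    name, dist = min(((p, haversine_km(lat, lon, *xy)) for p, xy in plants.items()),
                     key=lambda t: t[1])
    print(f"{dc}: nearest plant is {name}, ~{dist:.0f} km away")
```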

----

And a while back I posted:

In the increasingly interconnected global economy, the reliance on cloud services raises questions about the national-security implications of data centers. These critical economic infrastructure sites, often strategically located underground, underwater, or in remote, cold locales, play a pivotal role, which raises the question of what part military forces should play in safeguarding them. Physical security measures and location obscurity provide some protection, but the integration of AI into daily life and the pervasive influence of cloud-based technologies on devices, evident in CES GPT-enabled products, further accentuate the importance of these sites.

Notably, instances such as the seizure of a college thesis mapping communication lines in the U.S. underscore the sensitivity of disclosing key communications infrastructure.

Companies like AWS, running data centers for the Department of Defense (DoD) and Intelligence Community (IC), demonstrate close collaboration between private entities and defense agencies. The question remains: are major cloud service providers actively involved in a national security strategy to protect the private internet infrastructure that underpins the global economy, or does the responsibility solely rest with individual companies?

By @monero-xmr - 6 months
Energy use is only bad if the user is doing something you disagree with. When crypto miners were in the news, it was terrible because crypto mining "provides no value" (to the people doing the criticizing).

Now that AI is using massive amounts of energy and the usual nags are coming out to criticize it, suddenly HN is promulgating the glorious benefits of such excess energy use. Pot, meet kettle. Human nature never changes.

By @mjfl - 6 months
AI is not useful enough to use all this energy. Sounds like capital destruction.
By @pants2 - 6 months
Bitcoin only uses huge amounts of energy because the bitcoin community likes it that way. Nearly all other blockchains use some form of Proof of Stake model and their energy use isn't much more than a typical cloud service.