June 26th, 2024

Taking a closer look at AI's supposed energy apocalypse

Artificial intelligence's energy impact, particularly in data centers, is debated. AI's energy demands are significant but only a fraction of overall data center consumption. Efforts to enhance AI cost efficiency could mitigate energy use concerns.

The article discusses the perceived impact of artificial intelligence (AI) on energy consumption, particularly in data centers. While recent reports suggest AI's energy demands could strain power grids, a closer examination reveals that the majority of energy usage in data centers predates the AI boom. Although AI models require significant energy, they represent only a fraction of overall data center energy consumption. Estimates indicate that AI could consume 85 to 134 TWh of electricity annually by 2027, a substantial amount but still a small portion of global electricity demand; for comparison, PC gaming alone consumes about 75 TWh annually. Efforts to improve the cost efficiency of AI deployment may further reduce energy consumption. The focus on AI's energy impact also overlooks the broader energy usage of data centers supporting all kinds of online services. While concerns exist, the narrative of AI causing an energy apocalypse may be overstated given this larger context.
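As a quick sanity check on those figures, the sketch below puts the projected 85 to 134 TWh next to the 75 TWh PC-gaming figure and an assumed global electricity demand of roughly 27,000 TWh per year; the global-demand number is a round illustrative assumption, not taken from the article.

```python
# Back-of-envelope check: what share of global electricity would the projected
# AI consumption represent by 2027? The 85-134 TWh range and the 75 TWh PC
# gaming figure are from the article; the ~27,000 TWh global demand number is
# an assumed round figure for illustration.
ai_low_twh, ai_high_twh = 85, 134   # projected AI consumption by 2027 (TWh/year)
pc_gaming_twh = 75                  # PC gaming consumption cited above (TWh/year)
global_demand_twh = 27_000          # assumed global electricity demand (TWh/year)

for label, twh in [("AI (low)", ai_low_twh),
                   ("AI (high)", ai_high_twh),
                   ("PC gaming", pc_gaming_twh)]:
    share = 100 * twh / global_demand_twh
    print(f"{label:10s}: {twh:4d} TWh ~ {share:.2f}% of assumed global demand")
```

Even the high end of the projection works out to well under one percent of the assumed global figure, which is the article's core point.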

10 comments
By @cjk2 - 5 months
It's not about an energy apocalypse at all. It's about cost versus benefit when the principal cost of large-scale deployments is energy. AI stuff takes a lot of energy to run, therefore costs a lot of money to run, and therefore requires revenue that matches the energy used.

Now the problem is that if someone puts a $10/month price on the AI functionality, users will stop paying for it because it doesn't really deliver $10/month of ROI.

Literally every AI thing I've seen people use has had interest taper off way before the free trial even ends.

The constraining factor that will prevent an energy apocalypse: it doesn't materially improve most people's lives at all.
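A hedged back-of-envelope version of this cost-versus-benefit point: with an assumed per-query inference cost and assumed usage levels (both invented for illustration, not measured figures), the margin on a $10/month feature erodes quickly as usage grows.

```python
# Illustrative breakeven math for a hypothetical $10/month AI feature.
# Every number here is an assumption chosen to show the shape of the
# argument (energy-dominated cost per query vs. flat subscription revenue),
# not real pricing or measured usage.
price_per_month = 10.00   # hypothetical subscription price ($/month)
cost_per_query = 0.01     # assumed inference cost per query ($), largely energy

for queries_per_day in (5, 30, 100):
    monthly_cost = cost_per_query * queries_per_day * 30
    margin = price_per_month - monthly_cost
    print(f"{queries_per_day:3d} queries/day -> cost ${monthly_cost:6.2f}/mo, "
          f"margin ${margin:6.2f}/mo")
```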

By @robnado - 5 months
This is an economics problem: if the cost of energy is lower than or equal to the benefit of consuming it, the energy will be consumed. As more and more energy is consumed to run these datacenters, the price of energy will go up, and datacenters will either learn to be more efficient or pass these costs on to their users, who will learn to be more efficient. Of course, there might be inefficiencies like subsidies and monopolies that distort the market for energy for a while, but in the long run, this is a self-fixing problem.
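The self-correcting dynamic described in this comment can be sketched as a toy feedback loop, with demand pushing up the energy price and the higher price throttling further growth; every constant below is an arbitrary illustration, not a calibrated economic model.

```python
# Toy feedback loop: datacenter energy demand pushes the energy price up,
# and a higher price throttles further demand growth. All constants are
# arbitrary illustrations, not a calibrated economic model.
demand = 100.0            # datacenter energy demand (arbitrary units)
base_price = 1.0          # energy price at the starting demand level
price_sensitivity = 0.02  # assumed price increase per unit of extra demand
growth_appetite = 0.20    # assumed per-step demand growth if energy were free

for step in range(10):
    price = base_price * (1 + price_sensitivity * (demand - 100.0))
    growth = growth_appetite / price   # expensive energy curbs expansion
    demand *= 1 + growth
    print(f"step {step}: price {price:.2f}, demand {demand:.1f}")
```

Under these made-up constants, demand keeps rising but its growth rate slows as the price climbs, which is the "self-fixing" shape the comment describes.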
By @cubefox - 5 months
> The important thing to remember, though, is that there are economic limits involved in the total energy use for this kind of technology. With bitcoin mining, for instance, the total energy usage has jumped up and down over time, tracking pretty closely with the price of bitcoin. When bitcoin is less valuable, miners are less willing to spend a lot of electricity chasing lower potential profits. That's why alarmist headlines about bitcoin using all the world's energy by 2020 never came to pass (to say nothing of efficiency gains in mining hardware).

> A similar trend will likely guide the use of generative AI as a whole, with the energy invested in AI servers tracking the economic utility society as a whole sees from the technology.

This comparison is inappropriate. Cryptocurrency was and is mainly an object of speculation, not of intrinsic economic value. In contrast, AI clearly has a lot of economic value, and that value increases over time, since the technology keeps getting better.

Moreover, the rate of improvement over the last few years has been staggering. Things that seemed like far-out-of-reach science fiction ten years ago are now unremarkable reality. So there are solid empirical grounds to expect that progress in the coming 10 years will be transformative as well. Nothing like that was or is the case for cryptocurrency.

By @mordymoop - 5 months
What some people seem to be describing or expecting is that everything comes to a screeching years-long halt. What will actually happen is we will be on a slightly slower exponential growth curve than we counterfactually would have been in a world with much more energy infrastructure. The subjective experience will still be one of being on an exponential trend. The constraints are just different.
By @verisimi - 5 months
This AI energy stuff makes zero sense... unless you have an existing metric for the energy cost of fiat currency. So what is the cost of fiat? And I don't mean merely the production of paper, but also the cost of moving it, transferring it online, etc. The whole idea of an energy cost for currency is nonsense imo. But it does make sense if you want to usher in the idea of a carbon tax.
By @whoistraitor - 5 months
What I’d love to see is a rundown of new efficiencies in generative AI. With things like quantization and specialized transformer hardware, the costs will hopefully be less shocking in the future. And fwiw, I don’t find the article’s stats very shocking anyway. At least it’s a net win type of game, whereas the previous environmental bugbear, PoW crypto, was entirely premised on net-energy loss.
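On the quantization point, here is a minimal sketch of why lower-precision weights shrink a model's memory footprint (and, roughly, the data movement that accounts for much of inference energy); the 7B parameter count is a hypothetical model size and the energy-tracks-bytes framing is an assumption, not a measurement.

```python
# Rough memory-footprint comparison for weight quantization. The 7B parameter
# count is a hypothetical model size, and "energy roughly tracks bytes moved"
# is an assumption used to illustrate the direction of the effect, not a
# measured figure.
params = 7_000_000_000  # hypothetical 7B-parameter model
bytes_per_weight = {"fp32": 4.0, "fp16": 2.0, "int8": 1.0, "int4": 0.5}

baseline_bytes = params * bytes_per_weight["fp16"]
for fmt, nbytes in bytes_per_weight.items():
    total_gb = params * nbytes / 1e9
    ratio = (params * nbytes) / baseline_bytes
    print(f"{fmt:5s}: {total_gb:6.1f} GB of weights ({ratio:.2f}x fp16 memory traffic)")
```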
By @antisthenes - 5 months
Wasn't crypto supposed to cause this same energy apocalypse?

Does AI even use more energy than crypto ever did? Doubtful.

Besides, we're projected to ingest all of humanity's written knowledge by 2026, and after that there will be a cliff in power consumption as we mostly switch to inference on purpose-built hardware, which is vastly more efficient.

By @rldjbpin - 5 months
> And for those opposed to generative AI on principled or functional grounds, putting similar energy into millions of Nvidia AI servers probably seems like just as big of an energy waste.

while there is value and money-grifting involved in both crypto and genAI, there is a major differentiating factor in the wastage argument: artificially-adjusted proof of work, which is/was used for major cryptocurrencies.

people spend time and energy (figuratively and literally) on the internet in many ways one could define as wasteful (take scrolling down the "for you" feed, for example). but in the majority of cases, the inefficiencies are not intentional and are often improved on (e.g. new codecs like AV1 needing less bandwidth).

but to make crypto tokens work, the original decision to use PoW has led to a massive surge in energy consumption to achieve the same results. while there has been some work on countering its negatives (e.g. with the eth network), it is still a major driving force, despite an arms race to make more efficient hardware.

while a lot of current AI has major inefficiencies (too many to list here!), the root cause is often fast-moving innovation and everyone building on someone else's imperfect work.

coming back to the original argument, you should expect more dedicated hardware to do the underlying matrix multiplications that make AI work. for most people, who would only care about inference, it is already becoming a reality at the consumer level. but expect people to push the new stuff to its limits to stay competitive, whether you find all of this ridiculous or not.
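To make the matrix-multiplication point concrete, a small sketch counting approximate matmul FLOPs per token in one transformer layer; the hidden size and the 4x feed-forward expansion are assumed round numbers rather than any specific model.

```python
# Approximate matmul FLOPs per token for a single transformer layer.
# d_model and the 4x feed-forward expansion are assumed round numbers, not
# any specific model; the point is that nearly all inference work is dense
# matrix multiplication, which is what dedicated accelerators are built for.
d_model = 4096        # hypothetical hidden size
d_ff = 4 * d_model    # common feed-forward expansion factor (assumed)

attn_proj_flops = 4 * 2 * d_model * d_model  # Q, K, V and output projections
ffn_flops = 2 * 2 * d_model * d_ff           # two feed-forward matmuls
total_flops = attn_proj_flops + ffn_flops

print(f"attention projections: {attn_proj_flops / 1e6:.0f} MFLOPs/token")
print(f"feed-forward network : {ffn_flops / 1e6:.0f} MFLOPs/token")
print(f"total (one layer)    : {total_flops / 1e6:.0f} MFLOPs/token")
```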

By @advael - 5 months
This article is a great example of a really frustrating thing people are doing, which is pretending that all this AI business started two years ago.

I get it, GenAI became popular as a consumer-facing product and tech industry PR blitz in that year. But people in the tech world should know better. AI as a massive industry-wide GPGPU workload took off in the 2010s, including widespread usage of smaller models like CNNs throughout both FAANG's stack and scattered startups, as well as supermodels used as recommendation engines by everyone and their grandma. Arguably the entire business model of every irresponsibly large tech company ran on this shit. All the telemetry. Ad targeting. Social media feeds. Hell, GPT-3 came out before 2020. It was a scandal in this world when OpenAI exclusively licensed it to Microsoft, which was already probably using it in search at that point. None of this was actually that long ago; this is way too soon for cultural amnesia. I get that I'm in somewhat of a bubble as an AI researcher, but surely tech publications should at least know these basic facts, right? Is this yet another reason to be annoyed at the 2010s lingo for calling all these pervasive neural networks "the algorithms"?

From the perspective of energy expenditure from AI workloads, the statement that it's a major driving force of the rising energy demands of datacenters is a perfectly reasonable conclusion given a graph where the TWh more than triples between 2012 and 2024. The article sometimes specifies "generative AI" (which did exist in 2012, but was in a way less interesting state for most people and businesses until 2022), but often just says "AI", which is a big umbrella term people have consistently been using for most neural networks over that entire span of time (and longer, and for lots of other things; it's hopelessly overloaded to the point of being nonsensical sometimes, but regardless this is an incredibly uncontroversial usage). So someone at a data center, holding a graph that basically tracks the rise of GPGPU neural networks and shows a big jump in energy expenditure over that period, attributing this to "AI" is very reasonable!

By @renewiltord - 5 months
I don’t take environmentalists seriously because they primarily act to accelerate climate change. So I’m just going to dismiss this out of hand. They’re a poor source of information.