Taking a closer look at AI's supposed energy apocalypse
Artificial intelligence's energy impact, particularly in data centers, is debated. AI's energy demands are significant but only a fraction of overall data center consumption. Efforts to enhance AI cost efficiency could mitigate energy use concerns.
The article discusses the perceived impact of artificial intelligence (AI) on energy consumption, particularly in data centers. While recent reports suggest AI's energy demands could strain power grids, a closer examination reveals that the majority of energy usage in data centers predates the AI boom. Despite AI models requiring significant energy, they represent only a fraction of overall data center energy consumption. Estimates indicate that AI could consume 85 to 134 TWh of power by 2027, a substantial amount but still a small portion of global electricity demand. Comparatively, PC gaming alone consumes 75 TWh annually. Efforts to improve cost efficiency in AI deployment may help reduce energy consumption. The focus on AI's energy impact overlooks the broader energy usage by data centers supporting various online services. While concerns exist, the narrative of AI causing an energy apocalypse may be overstated given the larger context of data center energy consumption.
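For a sense of scale, a quick back-of-envelope comparison. The 85 to 134 TWh and 75 TWh figures come from the article; the roughly 27,000 TWh of global annual electricity consumption is an assumed outside figure, not from the article.

    # rough scale check: projected AI consumption vs. global demand and PC gaming
    ai_low_twh, ai_high_twh = 85, 134   # projected AI consumption by 2027 (from the article)
    pc_gaming_twh = 75                  # PC gaming consumption per year (from the article)
    global_demand_twh = 27_000          # assumed global electricity demand per year

    print(f"AI share of global demand: {ai_low_twh / global_demand_twh:.1%} to {ai_high_twh / global_demand_twh:.1%}")
    print(f"AI relative to PC gaming:  {ai_low_twh / pc_gaming_twh:.1f}x to {ai_high_twh / pc_gaming_twh:.1f}x")

Under those assumptions, the projected AI load works out to roughly 0.3 to 0.5 percent of global electricity demand, or about one to two times PC gaming.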
Related
AI is exhausting the power grid
Tech firms, including Microsoft, face a power crisis due to AI's energy demands straining the grid and increasing emissions. Fusion power exploration aims to combat fossil fuel reliance, but current operations heavily impact the environment.
AI's $600B Question
The AI industry's revenue growth and market dynamics are evolving, with a notable increase in the revenue gap, now dubbed AI's $600B question. Nvidia's dominance and GPU data centers play crucial roles. Challenges like pricing power and investment risks persist, emphasizing the importance of long-term innovation and realistic perspectives.
The Encyclopedia Project, or How to Know in the Age of AI
Artificial intelligence challenges information reliability online, blurring real and fake content. An anecdote underscores the necessity of trustworthy sources like encyclopedias. The piece advocates for critical thinking amid AI-driven misinformation.
AI Is Already Wreaking Havoc on Global Power Systems
AI's rapid growth strains global power grids as data centers expand to meet energy demands. Major tech firms aim for green energy, but challenges persist in balancing AI's energy hunger with sustainability goals.
Now the problem is that if someone puts a $10/month price on the AI functionality, users will stop paying for it because it doesn't really have a $10/month ROI.
Literally every AI feature I've seen people use has had interest taper off well before the free trial even ends.
That constraining factor is what will prevent an energy apocalypse: it doesn't materially improve most people's lives at all.
> A similar trend will likely guide the use of generative AI as a whole, with the energy invested in AI servers tracking the economic utility society as a whole sees from the technology.
This comparison is inappropriate. Cryptocurrency was and is mainly an object of speculation, not of intrinsic economic value. In contrast, AI clearly has a lot of economic value, and that value increases with time, since the technology gets better with time.
Moreover, the rate of improvement over the last few years has been staggering. Things that seemed like far-out-of-reach science fiction ten years ago are now unremarkable reality. So there are solid empirical grounds to expect that progress over the coming ten years will be transformative as well. Nothing like that was or is the case for cryptocurrency.
Does AI even use more energy than crypto ever did? Doubtful.
Besides, we're projected to have ingested all of humanity's written knowledge by 2026, and after that there will be a cliff in power consumption: we will mostly switch to inference on purpose-built hardware, which is vastly more efficient.
while there is value and money-grifting involved in both crypto and genAI, there is a major differentiating factor in the wastage argument: artificially-adjusted proof of work, which is/was used by the major cryptocurrencies.
people spend time and energy (figuratively and literally) on the internet in plenty of ways one could call wasteful (take scrolling down the "for you" feed, for example). but in the majority of cases, the inefficiencies are not intentional and are often improved on over time (e.g. new codecs like AV1 needing less bandwidth).
but to make crypto tokens work, the original decision to use PoW has led to a massive surge in energy consumption to achieve the same result. while there has been some work on countering its negatives (e.g. on the eth network), it is still a major driving force, despite an arms race to build more efficient hardware.
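to make the "artificially-adjusted" part concrete, here is a minimal sketch of hash-based proof of work (simplified, not any particular coin's real implementation): each extra difficulty bit doubles the expected number of hashes, and therefore the energy burned, while the useful output stays exactly one valid block.

    import hashlib
    import itertools

    def mine(block_header: bytes, difficulty_bits: int) -> int:
        """find a nonce whose SHA-256 digest falls below a difficulty target."""
        target = 1 << (256 - difficulty_bits)
        for nonce in itertools.count():
            digest = hashlib.sha256(block_header + nonce.to_bytes(8, "big")).digest()
            if int.from_bytes(digest, "big") < target:
                return nonce

    # raising difficulty_bits by one doubles the expected work (and energy) for the same result
    print(mine(b"example block", difficulty_bits=16))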
while a lot of current AI has major inefficiencies (too many to list here!), the root cause is often fast-moving innovation and everyone building on someone else's imperfect work.
coming back to the original argument, you should expect more dedicated hardware doing the underlying matrix multiplications that make AI work. for most people, who only care about inference, that is already becoming a reality at the consumer level. but expect people to push the new stuff to its limit to stay competitive, whether you find all of this ridiculous or not.
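as a rough illustration of what inference on dedicated hardware costs, a back-of-envelope sketch using the common approximation of ~2 FLOPs per model parameter per generated token. every concrete number below is an assumption for illustration, and it ignores batching, memory-bandwidth limits, and serving overheads.

    # back-of-envelope inference cost; all constants are illustrative assumptions
    params = 70e9                  # assumed model size: 70B parameters
    flops_per_token = 2 * params   # ~2 FLOPs per parameter per token (rule of thumb)
    tokens_per_reply = 500         # assumed length of a typical response
    chip_flops = 300e12            # assumed sustained accelerator throughput, FLOP/s
    chip_watts = 400               # assumed accelerator power draw, W

    seconds = flops_per_token * tokens_per_reply / chip_flops
    watt_hours = chip_watts * seconds / 3600
    print(f"~{seconds:.2f} s of compute, ~{watt_hours * 1000:.0f} mWh per reply")

under those assumptions a single reply costs on the order of tens of milliwatt-hours of accelerator energy; the interesting question is how many billions of such replies get served, and at what real-world utilization.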
I get it, GenAI became popular as a consumer-facing product and tech industry PR blitz in 2022. But people in the tech world should know better. AI as a massive industry-wide GPGPU workload took off in the 2010s, including widespread use of smaller models like CNNs throughout both FAANG's stack and in scattered startups, as well as huge models used as recommendation engines by everyone and their grandma. Arguably the entire business model of every irresponsibly large tech company ran on this shit. All the telemetry. Ad targeting. Social media feeds. Hell, GPT-3 came out before 2020. It was a scandal in this world when OpenAI exclusively licensed it to Microsoft, which was probably already using it in search at that point. None of this was actually that long ago; this is way too soon for cultural amnesia. I get that I'm in somewhat of a bubble as an AI researcher, but surely tech publications should at least know these basic facts, right? Is this yet another reason to be annoyed at the 2010s lingo of calling all these pervasive neural networks "the algorithms"?
From the perspective of energy expenditure from AI workloads, the statement that it's a major driving force of the rising energy demands of data centers is a perfectly reasonable conclusion given a graph where the TWh more than triples between 2012 and 2024. The article sometimes specifies "generative AI" (which did exist in 2012, but was in a way less interesting state for most people and businesses until 2022), but often just says "AI", which is a big umbrella term people have consistently been using for most neural networks over that entire span of time (and longer, and for lots of other things; it's hopelessly overloaded to the point of being nonsensical sometimes, but regardless this is an incredibly uncontroversial usage). So it is very reasonable for someone at a data center, looking at a graph that basically tracks the rise of GPGPU neural networks and shows a big jump in energy expenditure over that period, to attribute it to "AI".
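For scale, "more than triples between 2012 and 2024" implies a compound growth rate of at least roughly 10% per year:

    # implied compound annual growth rate if consumption at least triples over 12 years
    growth_factor = 3.0
    years = 2024 - 2012
    cagr = growth_factor ** (1 / years) - 1
    print(f"at least ~{cagr:.1%} per year")  # roughly 9.6%/year as a lower bound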