Unmasking the Fear of AI’s Energy Demand

How to stop worrying and love our high-energy planet

Among the many energy-hungry technologies supporting modern society, artificial intelligence (AI) is emerging as a major driver of energy demand. Data centers—the physical infrastructure enabling AI—are growing larger, multiplying, and consuming more energy. Environmental organizations such as Greenpeace worry that this will jeopardize decarbonization efforts and halt progress in the fight against climate change: AI may track melting icebergs or map deforestation, but it does so while consuming large amounts of carbon-intensive energy. A closer look at the data, however, shows that fears of AI's insatiable appetite for energy may be unwarranted.

If we take reports at face value, we might conclude that AI-induced climate stress is all but inevitable. Niklas Sundberg, a board member of the nonprofit SustainableIT.org, claims that a single query on ChatGPT generates 100 times as much carbon as a Google search. The International Energy Agency predicts that global energy demand from data centers, cryptocurrency, and AI will double by 2026. Even the U.S. government believes that AI will exert a major influence on society: the Department of Homeland Security announced the first 10 hires for a newly formed AI Corps, which will provide advice on how best to use AI within the federal government. The Department of Energy has created a working group on the energy needs of data center infrastructure and is talking to utilities about how to meet energy demand. AI's energy demands, according to the Bloomberg Energy Daily, are "a source of trepidation."

Climate activists have raised the alarm. Greenpeace is calling for an official emissions tracking system to quantify AI's environmental impacts. Climate researcher Sasha Luccioni proposes that governments restrict AI's energy use, including through AI "sobriety" measures or a carbon tax to deter electricity consumption. Vox warns that the benefits of the modern world—while substantial—come with tradeoffs, and that none of these trade-offs is as important as energy: "As the world heats up toward increasingly dangerous temperatures, we need to conserve as much energy [as possible] to lower the amount of climate-heating gasses we put into the air." AI's energy consumption has become yet another way to push back against a high-energy planet.

But a closer look reveals a more complex relationship between AI use and energy demand, energy efficiency, and decarbonization—and it isn't all bad news. First, there is the question of whether businesses are using AI at all. Using data from the US Census Bureau, Guy Berger at the Burning Glass Institute shows that the two most common applications of AI are marketing automation (2.5% of US businesses) and virtual agents/chatbots (1.9%). Only 1% of businesses have used large language models. Berger's analysis also shows that labor-saving uses of AI remain limited: just 1 in 4 of the businesses using AI report that it performs a few tasks previously carried out by humans. The largest businesses are the most likely to say they are using AI, but also the most likely to say they don't know whether they are. The use of AI will no doubt increase in the future, but for now it seems mostly confined to a few sectors and activities.

But beyond the eye-catching statistics, estimates of energy consumption are difficult to find, in part because industry data are closely guarded and researchers have to rely on overly simplistic extrapolations. The Goldman Sachs Group estimates that AI power demand in the UK will rise 500% over the next decade, and that U.S. data centers could account for 8% of total electricity needs by 2030, up from 3% in 2022. States that house data centers appear to be running out of power. The Boston Consulting Group arrives at similar numbers: electricity consumption by data centers is projected to reach 7.5% of total US electricity consumption by 2030, up from about 2%. Generative AI is expected to contribute at least 1% to this growth. Rystad Energy says that energy use by data centers and AI will increase by 177 TWh, reaching 307 TWh by 2030.

In a detailed thread on X, MIT Innovation Fellow and former National Economic Council director Brian Deese argues that forecasters consistently overestimate electricity demand, in part because they emphasize static load growth over efficiency gains. Deese points out that in the early 2000s, analysts predicted surging electricity demand; instead, U.S. electricity demand has stayed flat for two decades. And although data center energy use is increasing, energy intensity (energy use per computation) has decreased by 20% every year since 2010. Nvidia—one of the largest companies designing graphics processing units (GPUs) for gaming, professional visualization, data centers, and automotive markets—is continuously improving the energy efficiency of its GPUs. Its new AI-training chip, Blackwell, for example, will use 25 times less energy than its predecessor, Hopper. Deese also points out that analysts may be double-counting energy use by data centers, because technology companies submit duplicate power requests in different utility jurisdictions to get the best rates.
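The compounding effect of those efficiency gains is easy to underestimate. A quick back-of-the-envelope calculation using the 20%-per-year figure cited above (the 2010-2024 timespan is an assumption chosen for illustration):

```python
# Back-of-the-envelope: if energy intensity (energy per computation)
# falls 20% per year, how much of the 2010 level remains after 14 years?
years = 14
annual_decline = 0.20
remaining = (1 - annual_decline) ** years
print(f"Energy per computation in 2024: {remaining:.1%} of the 2010 level")
```

A sustained 20%-per-year decline compounds to roughly a 96% reduction, which is why rising computation need not translate one-for-one into rising electricity demand.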

A (carbon-heavy) query to ChatGPT suggests that AI and data service providers have considerable room to improve the energy efficiency of data center infrastructure through measures such as the following:

Virtualization and Consolidation: Virtualization technology can be used to consolidate servers and reduce the number of physical machines running. This can lead to significant energy savings by optimizing server utilization rates.

Efficient Cooling Systems: Cooling accounts for a substantial portion of a data center's energy consumption. Implementing efficient cooling techniques such as hot/cold aisle containment, using free cooling when ambient temperatures allow, and employing modern cooling technologies like liquid cooling can reduce energy usage.

Energy-Efficient Hardware: Energy-efficient servers, storage devices, and networking equipment can be prioritized, along with products carrying high energy efficiency ratings (such as ENERGY STAR certified devices), configured for lower power consumption.

Power Management Software: Power management tools and software can monitor and adjust power usage based on demand. This includes dynamically adjusting server power levels during periods of low activity (e.g., using power capping techniques).

Optimized Data Center Layout: Data center layouts can be designed to minimize energy waste and optimize airflow. This includes proper rack layout, efficient cable management, and ensuring equipment is placed to minimize cooling requirements.

Energy-Efficient Data Storage: Efficient data storage technologies and practices, such as data de-duplication and compression, can be used to reduce the overall storage footprint and associated energy requirements. Continuous monitoring and optimization will also help.
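Several of these measures are straightforward to sketch in code. The power-capping idea from the list above, for instance, might look like the following toy model (the function, thresholds, and wattages are hypothetical illustrations, not any vendor's API):

```python
# Hypothetical sketch of demand-based power capping: lower a server's
# power cap during periods of low utilization, raise it under load.
# Cap values are illustrative, not vendor defaults.

def choose_power_cap(cpu_utilization: float,
                     min_cap_watts: int = 150,
                     max_cap_watts: int = 400) -> int:
    """Map CPU utilization (0.0-1.0) to a power cap in watts."""
    if not 0.0 <= cpu_utilization <= 1.0:
        raise ValueError("utilization must be between 0 and 1")
    # Scale the cap linearly between the floor and the ceiling.
    return int(min_cap_watts + (max_cap_watts - min_cap_watts) * cpu_utilization)

# An idle server is capped near the floor; a busy one gets full headroom.
print(choose_power_cap(0.05))
print(choose_power_cap(0.95))
```

Real deployments expose this kind of control through platform interfaces (for example, running average power limits on server CPUs), but the principle is the same: power drawn should track work performed.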

Electricity demand from electric vehicles (EVs) may prove comparable to, or even higher than, that of AI. The Princeton REPEAT model estimates U.S. electricity demand for EV transportation (light-duty vehicles and other electric transport) at 391 TWh in 2030, similar to BCG's 2030 estimates for data centers (320-390 TWh). Rystad Energy predicts that EV electricity use will grow from 18.3 TWh to 131 TWh over the same period. Despite this additional demand, policymakers strongly encourage the purchase of EVs and the construction of charging infrastructure, and commentators seem relatively unconcerned about EV charging needs. This may be because EVs are seen as filling an existing societal need for transportation as well as offering a solution to climate change. AI, even though it has the potential to raise productivity and improve lives, is a new and energy-intensive technology whose value runs counter to the priorities of the environmental community.

Note: Projections are for the United States.
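The comparison above can be made explicit with simple arithmetic (figures are those quoted in the text, in TWh for the United States in 2030, not independently verified):

```python
# EV vs. data center electricity projections, as cited in the text.
ev_twh = 391                 # Princeton REPEAT: EV transportation, 2030
dc_low, dc_high = 320, 390   # BCG: data centers, 2030

# Princeton's EV figure sits at or above the top of BCG's range,
# supporting the claim that EV demand may rival or exceed AI's.
print(f"EVs: {ev_twh} TWh vs. data centers: {dc_low}-{dc_high} TWh")
print(f"EV demand exceeds the data center range's midpoint by "
      f"{ev_twh - (dc_low + dc_high) / 2:.0f} TWh")
```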

No matter the level of future AI use, AI's energy demand will make it more difficult—if not impossible—to dismiss the intermittency challenges associated with powering commercial and industrial loads with wind and solar energy. Data centers' real-time power demand requires continuous, dispatchable power, which cannot be provided solely by renewables without significant excess generation capacity and large amounts of cheap storage.

Technology companies like Microsoft and Google are taking steps to meet their data center energy needs. Microsoft recently inked an agreement with Constellation Energy to supply its data centers with nuclear power. Other firm clean energy sources may also play crucial roles in decarbonizing AI energy consumption. Last year, Google partnered with Fervo Energy to power its Nevada-based data center with geothermal energy. At least one hydropower developer, Rye Development, is planning to build hydroelectric facilities to match data center electricity use.

The bottom line is that we do not need to fear AI's challenge to the energy grid. Utilities and tech companies will meet increased demand with a mix of energy sources, including clean, firm electricity supplies like nuclear energy, geothermal power, and even hydropower. AI is not the first game changer in society's energy consumption, nor will it be the last. The discourse on AI's energy footprint must therefore shift from apprehension to proactive problem-solving, focused on efficiency gains and the diversification of clean energy sources, and driven by the conviction that a high-energy planet is essential for human progress.