The AI-Energy Collision: How Data Centers Are Driving the Next Energy Crisis

May 17, 2026 / By Peter

Global data centers consumed 415 terawatt-hours of electricity in 2024. By 2030, that figure is set to more than double. Behind every AI query lies a grid under siege, a water table depleted, and an electricity bill quietly climbing.

Research Desk · May 2026 · Sources: IEA, Brookings Institution, Pew Research Center, Carnegie Mellon, Belfer Center

415 TWh: Global data center electricity use, 2024
945 TWh: IEA projected use by 2030 (base case)
$580B: Estimated global AI data center spend, 2025

For two centuries, the rise of a new technology meant finding more coal, more oil, more steam. Today, the fuel driving artificial intelligence is electricity, and the global grid is struggling to keep up. Data centers, the physical backbone of every AI query, every language model response, every image generated on demand, have quietly become one of the world’s fastest-growing electricity consumers.

In 2024, the world’s data centers consumed approximately 415 terawatt-hours (TWh) of electricity, representing about 1.5% of all global electricity use. That number has been growing at a compound annual rate of 12% since 2017 — more than four times faster than global electricity consumption as a whole. The International Energy Agency (IEA) projects that figure will more than double by 2030, reaching 945 TWh in its base case scenario, and climbing to 1,200 TWh by 2035.
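The projection implies a steady compounding rate, which a quick back-of-envelope check confirms. The endpoint figures below (415 TWh in 2024, 945 TWh in 2030) come from the IEA numbers cited above; the formula is the standard compound annual growth rate.

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by two endpoint values."""
    return (end / start) ** (1 / years) - 1

# IEA base case: 415 TWh (2024) -> 945 TWh (2030)
growth = cagr(415, 945, 2030 - 2024)
print(f"Implied annual growth, 2024-2030: {growth:.1%}")
```

The result, roughly 14.7% per year, lines up with the 15% annual growth rate the IEA projects through 2030.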

“If data centers were a country, they would rank by 2026 as the world’s fifth-largest electricity consumer, slotting in between Japan and Russia.”

The United States sits at the epicenter of this explosion. In 2024, the U.S. accounted for 45% of all global data center electricity consumption. American data centers consumed 183 TWh — more than 4% of the country’s total electricity use, roughly equivalent to the annual demand of the entire nation of Pakistan. By 2030, U.S. consumption from data centers alone is projected to surge by 130% to reach 426 TWh.

The AI boom has triggered an unprecedented capital spending wave. In 2024, Amazon, Microsoft, Google, and Meta collectively spent over $200 billion on capital expenditures — a 62% year-over-year increase from 2023. Amazon’s CapEx alone hit $85.8 billion (up 78%), Microsoft’s was $44.5 billion (up 58%), Google’s was $52.5 billion (up 63%), and Meta’s was $39.2 billion (up 40%). In 2025, Amazon is projected to surpass $100 billion in CapEx, while global investment in AI-focused data center infrastructure is estimated at $580 billion for the year.

The Scale Inside the Numbers

Abstract terawatt-hours are difficult to grasp. Consider this: a typical AI-focused hyperscale data center consumes as much electricity annually as 100,000 U.S. households. The next generation of facilities currently under construction is expected to consume 20 times that amount. One large data center can use up to 5 million gallons of water per day — the daily consumption of a town of 10,000 to 50,000 people.

In 2023, U.S. data centers directly consumed roughly 17 billion gallons of water, with hyperscale and co-location facilities accounting for 84% of that total. Hyperscale data centers alone are expected to consume between 16 and 33 billion gallons of water annually by 2028 — before accounting for water used indirectly in electricity generation or semiconductor manufacturing.
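The town comparison above can be checked by simple division: a 5-million-gallon-per-day facility matches a town of 10,000 to 50,000 people only if per-person use falls in a plausible daily range.

```python
# Per-capita check on the water comparison cited above.
FACILITY_GALLONS_PER_DAY = 5_000_000

for population in (10_000, 50_000):
    per_person = FACILITY_GALLONS_PER_DAY / population
    print(f"Town of {population:,}: {per_person:.0f} gallons/person/day")
```

The implied range, 100 to 500 gallons per person per day, brackets typical total per-capita municipal water use in the U.S., which is why both endpoints of the comparison are defensible.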

Training a single large AI model tells the same story in miniature. GPT-4’s training run consumed an estimated 50 gigawatt-hours of electricity. A single text response from a large language model consumes hundreds to thousands of joules, depending on model size. Generating a five-second AI video at high quality can consume more than 3.4 million joules per clip — before accounting for the energy cost of the chips themselves.
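To put those joule figures in the article's usual unit: one kilowatt-hour is 3.6 million joules, so the video clip sits close to a full kilowatt-hour, while a text response (taking 1,000 joules as an assumed mid-range of the "hundreds to thousands" above) is a tiny fraction of one.

```python
# Unit conversion: joules to kilowatt-hours (1 kWh = 3.6 million joules).
JOULES_PER_KWH = 3.6e6

clip_kwh = 3.4e6 / JOULES_PER_KWH   # five-second AI video clip
text_kwh = 1_000 / JOULES_PER_KWH   # single text response (assumed mid-range)

print(f"Video clip:    ~{clip_kwh:.2f} kWh")
print(f"Text response: ~{text_kwh * 1000:.3f} Wh")
```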

Energy Sources: Still Fossil-Dependent

As of 2024, natural gas supplied over 40% of electricity for U.S. data centers, according to the IEA. Renewables such as wind and solar supplied about 24%, nuclear power around 20%, and coal approximately 15%. Natural gas is projected to continue supplying the largest share through 2030, even as tech giants aggressively sign power purchase agreements (PPAs) for clean energy.

Big Tech companies collectively accounted for 43% of all clean energy PPAs signed globally in 2024. PPA prices rose by an average of 35% in 2024, driven largely by this surge in procurement. In 2025, hyperscalers including Google, Meta, and Amazon were estimated to spend $364 billion on data center construction in the U.S. alone. Texas has committed over $1 billion in subsidies for data centers in 2025; Virginia offered $732 million in 2024.

Renewable energy production for data centers is growing at an average rate of 22% per year and is expected to cover nearly half of additional demand by 2030. However, in the U.S., accelerated server electricity consumption — mainly driven by AI — is projected to grow at 30% annually, outpacing clean energy additions in the near term.
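The near-term mismatch described above can be seen by compounding the two growth rates. Only the rates (30% for AI-driven server demand, 22% for renewables serving data centers) come from the figures above; the starting index values are hypothetical.

```python
# Compounding a 30%/yr demand index against a 22%/yr clean-supply index.
demand, clean = 100.0, 50.0   # hypothetical starting index values

for year in range(2025, 2031):
    demand *= 1.30
    clean *= 1.22
    print(f"{year}: demand index {demand:6.1f}, clean-supply index {clean:6.1f}")
```

Even with clean supply more than tripling over six years, the absolute gap between the two lines keeps widening — which is the sense in which clean additions are "outpaced in the near term."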

Grid Reliability: A System Under Strain

In July 2024, a voltage fluctuation in northern Virginia — the world’s densest data center cluster — simultaneously disconnected 60 data centers, creating a 1,500-megawatt surplus that forced emergency grid adjustments to prevent cascading outages. The incident was a warning shot. By July 2025, the Electric Reliability Council of Texas (ERCOT), which serves over 26 million customers, called the “disorganized integration” of large data center loads the biggest growing reliability risk facing the state’s grid.

Nearly half of U.S. data center capacity is concentrated in just five regional clusters — a geographic concentration that makes grid integration far more challenging than diffuse demand growth from electric vehicles or air conditioning. Data center load, unlike EV charging, operates nearly 24 hours a day, 7 days a week, with little flexibility to shift consumption during peak grid stress.

Virginia’s Dominion Energy proposed its first base-rate increase since 1992 in February 2025, adding approximately $8.51 per month per household in 2026. A Carnegie Mellon University study estimates that data centers and cryptocurrency mining could lead to an 8% increase in the average U.S. electricity bill by 2030 — potentially exceeding 25% in the highest-demand markets of northern Virginia. In the PJM electricity market covering the Mid-Atlantic and Midwest, data centers already accounted for a $9.3 billion price increase in the 2025-26 capacity market.

The Public Health Dimension

A December 2024 research paper from Caltech and UC Riverside found that under a medium-growth scenario, U.S. data centers in 2030 could contribute to approximately 600,000 asthma symptom cases and 1,300 premature deaths annually — exceeding one-third of all asthma deaths in the U.S. each year. The researchers estimated the resulting public health burden could surpass $20 billion, potentially exceeding the health costs of on-road emissions from the entire state of California.

The carbon intensity of the electricity powering data centers is already 48% higher than the U.S. national average. AI’s annual carbon footprint is estimated to reach between 32.6 and 79.7 million tons of CO2 in 2025. Training an AI model at the scale of Llama-3.1 can produce air pollutants equivalent to driving a passenger car for more than 10,000 Los Angeles-to-New York round trips — and the associated health cost exceeds 120% of the training electricity cost itself.

The Efficiency Paradox

Google reported that between May 2024 and May 2025, it reduced the median energy consumption per Gemini AI prompt by a factor of 33 and the associated carbon footprint by a factor of 44 through more efficient model architectures, custom tensor processing units, and optimized inference. These gains are real and significant.

But they may be overwhelmed by what economists call the Jevons paradox: as AI becomes cheaper and more efficient to run per query, total usage skyrockets, and the number of queries can grow by orders of magnitude, erasing the per-unit gains. The IEA projects data center electricity consumption growing at 15% per year through 2030, more than four times faster than total electricity consumption from all other sectors combined.
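The arithmetic of the paradox is stark. The 33x per-prompt efficiency gain below is Google's reported Gemini figure from above; the usage-growth multipliers are hypothetical.

```python
# Total energy = (energy per prompt) x (number of prompts).
# A 33x efficiency gain is erased once usage grows ~33x or more.
EFFICIENCY_GAIN = 33   # per-prompt energy falls by this factor

for usage_growth in (10, 33, 100):
    total = usage_growth / EFFICIENCY_GAIN   # total energy vs. baseline
    print(f"{usage_growth:>3}x more prompts -> {total:.2f}x baseline energy")
```

At 33x more prompts the fleet is back to its original energy draw; at 100x it uses three times more than before the efficiency work began.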

Sources: IEA Energy and AI Report 2025; Pew Research Center, October 2025; Brookings Institution, April 2026; Belfer Center for Science and International Affairs, February 2026; Caltech/UC Riverside, December 2024; Carnegie Mellon University; Semi Engineering, August 2025.