Morning Overview

AI data centers are complicating Big Tech’s clean-energy climate pledges

The artificial intelligence boom is pushing electricity demand from data centers far beyond what major technology companies expected when they set their clean-energy targets. Data centers consumed roughly 4.4% of all U.S. electricity in 2023, totaling 176 terawatt-hours, and federal projections now estimate that share could nearly triple by 2028. That rapid growth is forcing a collision between Big Tech’s climate commitments and the physical reality of powering millions of AI servers around the clock.

How Fast Data Center Power Demand Is Growing

A recent U.S. Department of Energy report puts the scale of the problem in stark terms. In 2023, American data centers drew 176 TWh from the grid, accounting for about 4.4% of national electricity consumption. By 2028, the DOE projects that figure will climb to between 325 TWh and 580 TWh, representing 6.7% to 12% of total U.S. electricity, depending on the pace of AI deployment and efficiency gains.

The global picture is equally striking. The International Energy Agency estimated that data centers worldwide consumed roughly 415 TWh in 2024, about 1.5% of global electricity, with the United States alone responsible for approximately 45% of that total. The IEA projects global data center electricity use will reach around 945 TWh by 2030. AI workloads, which require far more computing power per query than traditional cloud tasks, are a primary driver of that acceleration.
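As a rough sanity check, the growth multiples and cross-checks implied by the figures above can be computed directly. This is a minimal sketch using only the DOE and IEA numbers quoted in this article; no other data is assumed:

```python
# Rough arithmetic using the DOE and IEA figures cited above.
us_2023_twh = 176          # U.S. data center consumption, 2023 (DOE)
us_2023_share = 0.044      # share of U.S. electricity, 2023 (DOE)

# DOE's projected 2028 range for U.S. data center consumption
us_2028_low_twh, us_2028_high_twh = 325, 580

growth_low = us_2028_low_twh / us_2023_twh    # ~1.85x at the low end
growth_high = us_2028_high_twh / us_2023_twh  # ~3.3x at the high end

# Total U.S. electricity consumption implied by the 2023 figures
total_us_2023_twh = us_2023_twh / us_2023_share  # ~4,000 TWh

# IEA global figures: 415 TWh worldwide in 2024, ~45% of it in the U.S.
us_2024_implied_twh = 415 * 0.45  # ~187 TWh, consistent with the DOE trend

print(round(growth_low, 2), round(growth_high, 2))
print(round(total_us_2023_twh))
print(round(us_2024_implied_twh))
```

The arithmetic shows why "nearly triple" applies only to the high end of the DOE range: the low-end projection is closer to a doubling of 2023 consumption.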

These numbers matter because they dwarf the assumptions that companies like Google, Microsoft, and Amazon used when they announced net-zero or carbon-neutral targets earlier this decade. The gap between planned clean-energy procurement and actual power consumption is widening, and it is widening fast.

Clean Energy Purchases Cannot Keep Pace

Tech companies have responded by purchasing record amounts of clean energy. Industry groups say corporate clean-energy purchases hit new highs in 2024 and 2025, with data centers cited as a major reason for the surge. But buying renewable energy credits or signing long-term power purchase agreements does not guarantee that a data center actually runs on wind or solar at any given hour. When the sun sets or the wind dies down, the grid fills the gap with natural gas or, in some regions, coal.

That mismatch helps explain why some companies report rising emissions in sustainability disclosures even as clean-energy spending climbs. AI and data center electricity demand is growing faster than companies anticipated when their pledges were set, according to reporting that has documented the tension between corporate targets and on-the-ground energy realities. The result is a credibility challenge: firms tout green energy investments while their reported carbon footprints may still rise as electricity demand grows.

Much of the current discussion treats clean-energy procurement as though it directly offsets emissions, but that framing obscures a critical distinction. A data center in Virginia running 24 hours a day draws from a regional grid where fossil fuels still supply a large share of generation. Modeling studies of U.S. data centers in the AI era have estimated that, under certain conditions, the carbon intensity of electricity in major data center clusters can approach that of coal-heavy grids. Renewable energy certificates purchased in a different state or time zone do not change the emissions profile of the electrons actually consumed.

In practice, that means a company can meet its annual renewable-energy accounting targets while still driving higher hourly emissions in the regions where its AI clusters operate. As more firms race to deploy large language models and generative AI tools, the gulf between paper commitments and physical emissions could grow even wider unless the underlying grid becomes substantially cleaner and more flexible.
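The gap between annual accounting and hourly reality described above can be illustrated with a toy example. All numbers here are hypothetical, invented for illustration rather than drawn from any company's disclosures: a data center with a flat round-the-clock load, matched on an annual basis by a solar contract that generates only during daytime hours.

```python
# Toy illustration of annual vs. hourly clean-energy matching.
# All figures are hypothetical and chosen for simplicity.

HOURS = 24
load_mwh = [100] * HOURS  # flat 100 MWh data center load, every hour

# Solar contract sized so total generation equals total load over the day:
# 300 MWh in each of 8 daytime hours (hours 8-15), zero at night.
solar_mwh = [300 if 8 <= h < 16 else 0 for h in range(HOURS)]

annual_load = sum(load_mwh)    # 2,400 MWh
annual_solar = sum(solar_mwh)  # 2,400 MWh -> "100% renewable" on paper

# Hourly matching: clean supply only covers load in the hour it is generated.
matched = sum(min(l, s) for l, s in zip(load_mwh, solar_mwh))
hourly_match_pct = 100 * matched / annual_load

# Residual load is served by whatever the grid provides at night,
# which in many regions means natural gas or coal.
residual = annual_load - matched

print(annual_solar >= annual_load)    # True: the annual target is met
print(round(hourly_match_pct, 1))     # 33.3: only a third matched hourly
print(residual)                       # 1600 MWh drawn from the mixed grid
```

Even with annual generation exactly equal to annual consumption, only a third of the load in this sketch is actually served by clean power hour-by-hour, which is the gap that hourly-matching proposals aim to close.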

Regulators Step In on Grid Reliability

The strain is not just environmental. It threatens the reliability of the power grid itself. The Federal Energy Regulatory Commission took direct action by ordering a review of co-location arrangements, in which AI-enabled data centers connect directly to power plants in the PJM regional grid, the largest wholesale electricity market in the country. FERC raised concerns about grid reliability and costs when large loads bypass the normal transmission system.

The co-location model is attractive to data center operators because it guarantees a dedicated power supply without waiting years for new grid connections. But when large loads bypass parts of the normal transmission and interconnection process, regulators worry about reliability and about how infrastructure costs get allocated to other customers. FERC's action signals that integrating large data center loads into the grid is now a matter of regulatory scrutiny, not merely a commercial negotiation between power producers and tech firms.

For ordinary electricity customers, the stakes are direct. If data centers lock up generation capacity through co-location deals, remaining grid users may face higher prices and reduced reliability during peak demand. That dynamic turns what sounds like a corporate energy strategy into a consumer pocketbook issue and raises questions about who ultimately pays for the infrastructure needed to support AI growth.

Can Federal Programs Close the Gap?

The Department of Energy has identified several pathways that could help reconcile AI growth with clean-energy goals. Among them: repurposing retired coal plant sites for data centers, which already have grid connections and transmission infrastructure, and accelerating development of advanced nuclear and geothermal generation through innovation programs. The DOE’s GENESIS initiative and related efforts through the Infrastructure Exchange highlight approaches to siting and infrastructure planning intended to better align large new loads with available grid capacity and potential clean-power development.

The logic is straightforward. Instead of scattering AI facilities wherever land is cheap or fiber is available, federal planners want to steer them toward locations where there is both existing grid capacity and realistic prospects for adding low-carbon generation. That could mean clustering data centers near hydropower resources, in regions suitable for enhanced geothermal systems, or at brownfield sites where coal plants have recently shut down and transmission lines already exist.

At the same time, DOE-supported research programs are trying to make the underlying technologies more efficient. Through platforms such as the Office of Scientific and Technical Information, the department disseminates studies on energy-efficient chips, advanced cooling systems, and software optimizations that reduce the number of computations needed for AI training and inference. Incremental gains in efficiency at the chip and algorithm level, multiplied across millions of servers, could significantly blunt the trajectory of electricity demand.

Yet even optimistic scenarios in federal analyses suggest that efficiency alone cannot fully offset the surge in AI-driven workloads. That leaves policymakers with a more difficult task: aligning corporate climate pledges, grid planning, and community concerns about land use and pollution with the reality that AI is becoming a major industrial load, comparable to heavy manufacturing or petrochemicals in some regions.

What Comes Next for AI, Power, and Climate Goals

In the near term, the collision between AI expansion and clean-energy ambitions is likely to intensify. Data center developers are already competing with factories, electric vehicle charging networks, and new housing for scarce grid capacity. Utilities, which traditionally planned for gradual load growth, must now contend with clusters of facilities that can each demand hundreds of megawatts, often on compressed timelines.

For climate advocates, the challenge is to ensure that AI does not become an excuse for backsliding on emissions. That could mean pushing for hourly matching of clean-energy purchases to data center consumption, stricter disclosure of regional emissions impacts, and conditions on federal support that tie new AI infrastructure to verifiable zero-carbon power. For regulators, it will involve balancing innovation with fairness, making sure that the benefits of AI do not come at the expense of higher bills and greater outage risks for everyone else.

The AI era is reshaping the electricity system faster than many anticipated. Whether it ultimately accelerates or undermines the transition to clean energy will depend less on corporate press releases and more on the concrete choices governments, utilities, and technology companies make about where and how these energy-hungry data centers are built and powered.

*This article was researched with the help of AI, with human editors creating the final content.