The rapid expansion of artificial intelligence is forcing a collision between two of the technology industry’s biggest priorities: staying competitive in the AI race and honoring climate commitments made earlier this decade. U.S. data centers consumed roughly 4.4% of the nation’s electricity in 2023, and federal projections show that share could rise sharply by 2028. For companies that pledged net-zero emissions by 2030 or sooner, the math is getting harder to reconcile with the reality of powering millions of new GPUs around the clock.
Data Center Power Use Has Tripled in a Decade
A federal assessment of data center electricity demand puts hard numbers on what grid operators have been warning about for months. U.S. data centers drew 176 terawatt-hours in 2023, more than tripling from 58 TWh in 2014. Under the Department of Energy’s projections, that figure could reach 325 to 580 TWh by 2028, pushing data centers from 4.4% of national electricity consumption to somewhere between 6.7% and 12%.
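The arithmetic behind those shares can be checked directly. A minimal back-of-envelope sketch, using only the figures stated above (the implied national totals are derived here, not stated in the federal assessment):

```python
# Back-of-envelope check of the data center electricity figures.
# Stated: 176 TWh in 2023 = 4.4% of U.S. consumption, up from 58 TWh in 2014.
dc_2023_twh = 176
share_2023 = 0.044
implied_total_2023 = dc_2023_twh / share_2023  # ~4,000 TWh (derived, not stated)

growth_since_2014 = dc_2023_twh / 58  # ~3.0x, matching "more than tripling"

# 2028 projections: 325-580 TWh, stated as 6.7%-12% of national consumption.
# Dividing back out shows the total grid size each bound implicitly assumes.
implied_total_low = 325 / 0.067   # ~4,851 TWh
implied_total_high = 580 / 0.12   # ~4,833 TWh

print(round(implied_total_2023), round(growth_since_2014, 2))
```

Notably, both 2028 bounds imply a national total near 4,850 TWh, meaning the projections assume modest overall grid growth; the wide range comes almost entirely from uncertainty about data center demand itself.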
Those ranges are wide for a reason. The lower bound assumes current AI deployment trends continue at a steady pace, while the upper bound reflects a scenario in which generative AI adoption accelerates and new hyperscale campuses come online faster than expected. Either outcome represents a step change in how much electricity a single industry segment demands from the U.S. grid, and it arrives as utilities plan for broader electrification in areas such as transportation and manufacturing.
A Congressional Research Service analysis of the same data cautions that projections carry significant uncertainty because grid interconnection timelines, chip efficiency gains, and corporate build-out schedules are all moving targets. That uncertainty, though, cuts both ways: if AI workloads grow faster than modeled, the upper end of the DOE range could prove too low.
Climate Pledges Meet Rising Emissions
The tension between AI growth and decarbonization is not theoretical. Company sustainability reports cited by the Associated Press show that emissions have risen across parts of the tech sector in the roughly five years since major climate commitments were made. That pattern is the opposite of what corporate roadmaps anticipated when firms set aggressive net-zero targets earlier this decade.
The core problem is timing. Building renewable generation and storage at the scale needed to offset hundreds of new megawatts of data center load takes years of permitting, procurement, and construction. AI demand, by contrast, is scaling on a quarterly cadence driven by competitive pressure. When a company signs a power purchase agreement for a wind farm that will not deliver electricity for years, but breaks ground on a data center that will draw power sooner, the gap gets filled by whatever generation is already on the grid, which can include fossil fuels.
Critics and some energy analysts warn this dynamic could lock in additional fossil fuel infrastructure that persists beyond original target dates for emissions reductions. The race to deploy artificial intelligence is complicating tech companies’ commitments to reduce greenhouse gas emissions, according to AP reporting.
Georgia Offers a Ground-Level View
National statistics tell part of the story. Regional utility planning tells the rest. In Georgia, the state’s dominant electric utility has told regulators it needs a massive increase in power capacity specifically to serve incoming data center customers. In testimony, the utility pointed to multi-billion-dollar investment plans for capacity expansion, a scale of spending that reflects how concentrated AI-driven load growth has become in certain parts of the country.
Georgia is not unique. Regions with cheap land, favorable tax structures, and existing transmission infrastructure have attracted clusters of hyperscale facilities. But when a single utility territory absorbs large blocks of new demand in a short window, it can strain generation planning, transmission upgrades, and rate structures in ways that affect other customers on the system, not just the tech companies requesting service. Residential and commercial customers in these areas could face higher electricity costs or reliability concerns, depending on how utilities and regulators manage the surge in demand.
The Efficiency Argument and Its Limits
Industry leaders counter that AI will eventually reduce overall energy consumption by making systems smarter and processes leaner. Josh Parker, sustainability chief for chipmaker Nvidia, has said AI will ultimately cut electricity use because it is more efficient than the processes it replaces.
That claim deserves scrutiny. Efficiency gains from AI are real in specific applications: optimizing logistics routes, reducing energy waste in buildings, and accelerating materials science research. But the history of energy economics offers a consistent lesson known as the rebound effect, sometimes called the Jevons paradox. When a technology makes a resource cheaper or more productive to use, total consumption of that resource tends to rise, not fall, because the technology opens new use cases that did not previously exist. Some analysts argue generative AI could fit this pattern. Every efficiency improvement in chip design or model architecture enables new workloads, from real-time video generation to autonomous agents, that consume the freed-up capacity and then some.
*This article was researched with the help of AI, with human editors creating the final content.