In Northern Virginia, the largest data-center market on the planet, Dominion Energy has warned that proposed facilities could require more electricity than the utility can deliver before the end of the decade. Across the country, from Dallas-Fort Worth to Phoenix to the outskirts of Chicago, a similar collision is playing out: the artificial intelligence boom is demanding enormous volumes of power, and the grid was never built to supply it this fast.
Data centers consumed roughly 4.4% of all U.S. electricity in 2023, according to a federal assessment by Lawrence Berkeley National Laboratory, published through the U.S. Department of Energy in September 2024. Under scenarios shaped largely by AI training and inference workloads, that share could climb to between 6.7% and 12% by 2028. The wide range reflects genuine uncertainty about adoption speed, chip efficiency gains, and whether utilities can build generation and transmission infrastructure on anything close to the timelines the tech industry needs.
As of spring 2026, the gap between those two timelines has only widened.
A surge the grid was not designed for
The scale of AI’s electricity appetite is unlike anything the data-center industry has seen before. Training a single frontier model can require tens of thousands of GPUs running continuously for weeks or months, drawing power loads that rival those of small industrial plants. Inference, the process of serving AI responses to end users, is less concentrated per query but growing rapidly as tools like chatbots, image generators, and coding assistants reach hundreds of millions of people.
The International Energy Agency, in its mid-2025 electricity update, projected strong global electricity demand growth through 2025 and 2026, with U.S. increases tied directly to data-center expansion. In a separate analysis of AI-driven energy demand, the IEA identified the most advanced machine-learning systems as among the most electricity-intensive digital technologies ever deployed and flagged supply-chain constraints, particularly shortages of large power transformers and high-voltage equipment, that are slowing grid buildouts worldwide. These assessments represent the most recent comprehensive international analyses available as of spring 2026.
Those constraints are not abstract. A 2024 investigative feature by Bloomberg News documented companies facing multi-year waits for grid connections and regional restrictions on new data-center power allocations. Bloomberg’s analysis converted data-center capacity figures into electricity consumption estimates using third-party datasets, offering a granular view of where demand was outstripping supply. In several cases, utilities told prospective customers that serving all proposed projects would require new high-voltage lines and substations that simply could not be completed before the end of the decade. While conditions have continued to evolve since that reporting, the structural constraints it identified remain central to the current landscape.
Northern Virginia and the interconnection bottleneck
Nowhere is the crunch more visible than in Northern Virginia’s “Data Center Alley,” centered around Loudoun County. The region hosts more data-center capacity than any other market globally, and Dominion Energy, the area’s primary utility, has faced a flood of interconnection requests that far exceeds its near-term ability to supply power. PJM Interconnection, the regional grid operator covering 13 states and the District of Columbia, has seen its queue of proposed projects, many of them data centers, swell to levels that have forced procedural overhauls to manage the backlog.
The problem is not unique to Virginia. Similar pressures have emerged in Texas, where ERCOT has fielded a surge of large-load interconnection requests, and in markets across the Sun Belt and Midwest where hyperscale operators such as Google, Microsoft, and Amazon have clustered near fiber backbones and favorable tax structures. In each case, the mismatch is the same: tech companies operate on 18- to 24-month construction cycles, while new transmission lines, substations, and generation plants typically require five to ten years from proposal to energization.
The efficiency question
Chipmakers are working to close part of the gap. Nvidia, AMD, and Intel have each released or announced accelerators that deliver more computations per watt than their predecessors, and data-center operators are experimenting with liquid cooling and immersion systems designed to cut energy waste from traditional air-cooled facilities. Google, Microsoft, and Amazon have all made public commitments to power operations with clean energy, and several have signed agreements to purchase output from nuclear plants or invest in small modular reactor development.
Yet the historical pattern in computing offers a cautionary note. Efficiency gains have repeatedly enabled larger and more complex workloads, which in turn consume more total electricity, a dynamic economists call the Jevons paradox. The DOE’s projection range implicitly captures this tension: the low end assumes meaningful efficiency improvements slow aggregate growth, while the high end reflects a scenario where demand simply outpaces those gains. The available evidence does not settle which trajectory is more likely, and the answer will depend on decisions that thousands of companies and dozens of regulators have yet to make.
How power availability is reshaping data-center strategy
For companies planning data-center investments, power availability has overtaken land cost and construction capacity as the binding constraint on new development. The federal data and international analysis both point to sustained tightness through at least 2028, making early engagement with utilities and grid operators a competitive necessity rather than a secondary consideration.
For regulators, the emerging picture demands a rethink of how critical digital infrastructure is planned and approved. Traditional permitting processes that evaluate one project at a time are poorly suited to a wave of clustered developments that collectively require gigawatts of new capacity. The documented delays and restrictions show that, in many regions, policy frameworks have not caught up with the speed and scale of AI-driven demand.
Some states have begun to respond. Virginia’s legislature has debated measures to tie data-center tax incentives to grid-impact assessments, and federal agencies have explored ways to accelerate transmission permitting under existing authority. Whether those efforts move fast enough to prevent today’s localized crunch from hardening into a structural brake on both AI deployment and broader economic growth remains an open question, one that the period ahead will likely begin to answer.
*This article was researched with the help of AI, with human editors creating the final content.