Somewhere in the sprawl of Northern Virginia, a new substation hums to life, feeding electricity to rows of servers training the next generation of AI models. Multiply that scene across hundreds of facilities in Texas, Georgia, Ohio, and beyond, and you arrive at a number that would have seemed absurd five years ago: the power demand from AI data centers in the United States has reached approximately 29.6 gigawatts. That is roughly 94 percent of what New York state needs at the single hottest hour of summer to keep every apartment, office tower, subway car, and factory running.
The comparison is not hypothetical. The New York Department of Public Service’s 2025 summer energy outlook, drawing on projections from the New York Independent System Operator, pegs the state’s expected peak demand at 31,471 megawatts, or about 31.5 GW. A single technology sector is now within striking distance of that benchmark, and the gap is narrowing fast.
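The arithmetic behind that comparison is simple enough to check directly. A back-of-envelope sketch, using only the two figures cited above:

```python
# Back-of-envelope comparison of estimated U.S. AI data center capacity
# with New York's projected summer peak (both figures from this article).

ai_capacity_mw = 29_600   # ~29.6 GW of AI data center demand
ny_peak_mw = 31_471       # NYISO projected 2025 summer peak

share = ai_capacity_mw / ny_peak_mw
print(f"AI capacity is {share:.0%} of New York's projected peak")  # ~94%
```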
Where the numbers come from
New York’s peak demand figure is the more solid of the two data points. It is a regulatory filing built on engineering models, historical load curves, and weather forecasts. If NYISO gets it wrong, the consequences are tangible: rolling blackouts, emergency power purchases, and public accountability. That institutional weight is what makes it a useful yardstick.
The 29.6 GW figure for AI data centers is less tidy. No single government agency tallies every AI facility in the country and publishes a running total. The number is assembled by firms such as DC Byte and Synergy Research Group, which aggregate announced projects, signed power purchase agreements, and facilities already drawing power. Because many projects have not yet reached full operational load, the real-time draw on the grid at any given moment is almost certainly lower. The gap between contracted capacity and actual consumption can run 30 percent or more, depending on how many servers are active and how intensively they are working.
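To see what that gap implies in practice, consider an illustrative calculation. The 30 percent discount below is the rough figure from the paragraph above, not measured data, and the result is a hypothetical lower bound rather than an observed load:

```python
# Illustrative gap between contracted capacity and real-time draw.
# The 30 percent discount is this article's rough figure, not metered data.

contracted_gw = 29.6
utilization_discount = 0.30  # share of contracted capacity not yet drawing power

actual_draw_gw = contracted_gw * (1 - utilization_discount)
print(f"Implied real-time draw: {actual_draw_gw:.1f} GW")  # ~20.7 GW
```

Even under that conservative discount, the implied draw would still rank among the largest load categories on the U.S. grid.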
The International Energy Agency’s special report on energy and AI, published alongside a companion structured dataset, traces the trajectory behind this surge. The IEA documents how large language model training and inference workloads have pushed data center electricity consumption well beyond what traditional cloud computing ever required. The growth is geographically concentrated: the United States absorbs a disproportionate share because of its dominance in AI chip design, hyperscale cloud providers, and the venture capital that funds buildouts. The IEA’s own projections for U.S. data center electricity demand, while using slightly different category boundaries, point in the same direction as the 29.6 GW estimate, though the agency has not published a single figure that directly confirms or contradicts that specific number.
Why the exact number is hard to pin down
Isolating AI-specific power demand from general cloud computing is one of the biggest measurement challenges in the field. Google, Microsoft, and Amazon run mixed workloads across their facilities. The electricity powering a large language model training run flows through the same substation as the power behind a video stream or a financial transaction. These companies rarely disclose internal metering data at that level of detail, and the IEA acknowledges that definitions of “AI-related” demand vary across institutions.
That ambiguity does not erase the trend. Even conservative estimates place AI-driven data center demand in the mid-20-gigawatt range for the U.S. as of mid-2025, and the pipeline of announced projects suggests continued acceleration. Whether the precise figure is 25 GW or 32 GW, the structural reality is the same: utilities and grid operators now treat AI as a load category on par with entire industrial sectors.
Real-world pressure on the grid
The strain is already visible in the regions absorbing the most construction. In Northern Virginia, home to the densest cluster of data centers on Earth, Dominion Energy has filed rate cases citing AI-driven load growth as a primary driver of proposed infrastructure investments. Customers in the region face the prospect of higher electricity bills to fund new substations and transmission lines. In Texas, ERCOT’s interconnection queue is swelling with data center applications, and wholesale electricity prices in key zones have begun reflecting the added demand.
PJM Interconnection, the grid operator covering 13 states from Virginia to Illinois, has flagged data center load as a factor reshaping its long-term resource adequacy planning. New facilities often arrive in clusters, requesting hundreds of megawatts in areas with constrained transmission capacity. Even when enough generation exists on a regional basis, bottlenecks in power lines can limit what actually reaches a given site. Long interconnection queues for new substations are already delaying some projects by years.
Some of the most dramatic responses have come from the nuclear sector. In 2024, Constellation Energy announced a deal to restart a unit at Three Mile Island to supply power to Microsoft’s AI operations. Amazon struck a similar arrangement with Talen Energy’s Susquehanna nuclear plant in Pennsylvania. These agreements signal that AI companies are willing to pay premium prices for firm, around-the-clock power, and that the existing grid cannot always deliver it on its own.
The climate dimension
New York’s grid planning unfolds within state-level emissions limits that mandate a transition away from fossil fuels. Additional load must be balanced against those targets. When AI facilities draw power in regions that still rely heavily on natural gas or coal, their growth can slow or even reverse progress on decarbonization unless new clean generation comes online at a matching pace.
The IEA’s analysis underscores this tension. Without targeted policy and investment, the agency warns, AI could become a significant driver of global electricity demand growth and the emissions that come with it. Efficiency gains in chips, cooling systems, and software offer some relief, but the IEA notes that total data center energy consumption has continued to climb even as power usage effectiveness (PUE) ratios have improved, a pattern consistent with the broader observation that growing demand for new digital services has historically absorbed and exceeded the savings from efficiency gains.
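PUE is the ratio of total facility power to the power consumed by the IT equipment alone, so a facility's draw is its IT load multiplied by its PUE. A minimal sketch, with hypothetical numbers, shows why an improving PUE does not guarantee falling consumption:

```python
# Power Usage Effectiveness (PUE): total facility power / IT equipment power.
# The loads and ratios below are hypothetical, chosen to illustrate how
# workload growth can outpace efficiency gains.

def facility_power_mw(it_load_mw: float, pue: float) -> float:
    """Total facility draw implied by an IT load and its PUE."""
    return it_load_mw * pue

before = facility_power_mw(it_load_mw=100, pue=1.6)  # ~160 MW
after = facility_power_mw(it_load_mw=150, pue=1.2)   # ~180 MW: better PUE, more power
```

In this toy case the PUE improves by a quarter, yet total draw still rises because the IT load grew by half, the same pattern the IEA observes at the aggregate level.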
That dynamic is already reshaping how utilities approach long-term planning. Instead of treating data centers as just another commercial customer, some planners now model them as a distinct category with rapid, lumpy growth and highly concentrated geographic footprints. The shift affects everything from substation design to procurement of battery storage and demand response programs. It also raises a politically charged question: who pays for the upgrades? Shareholders, data center operators through higher connection fees, or residential ratepayers who never asked for an AI boom in their backyard?
What the New York comparison actually tells us
Framing AI data center demand as “almost a New York” is a striking shorthand, but it can mislead if taken too literally. New York’s 31,471 MW peak is a once-per-summer event tied to extreme heat. Data centers, by contrast, run at high utilization around the clock. In terms of total annual energy, the contrast cuts the other way: a gigawatt of AI capacity running near-continuously delivers far more energy over a year than a gigawatt of residential air conditioning that spikes for a few afternoon hours and then recedes.
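The difference between peak power and annual energy comes down to capacity factor, the fraction of the year a load actually runs at its rated level. A rough sketch, where both capacity factors are illustrative assumptions rather than reported figures:

```python
# Annual energy depends on capacity factor, not just peak capacity.
# Both capacity factors below are illustrative assumptions.

HOURS_PER_YEAR = 8_760

def annual_twh(capacity_gw: float, capacity_factor: float) -> float:
    """Energy delivered over a year, in terawatt-hours."""
    return capacity_gw * capacity_factor * HOURS_PER_YEAR / 1_000

steady = annual_twh(29.6, 0.80)  # a near-flat-out load, like a data center
peaky = annual_twh(29.6, 0.10)   # a peaky load of the same size, like AC
print(f"Steady: {steady:.0f} TWh/yr vs peaky: {peaky:.0f} TWh/yr")
```

Under these assumptions the same nameplate capacity yields roughly eight times the annual energy when it runs steadily, which is why a gigawatt of data center load weighs far more heavily on a grid than a gigawatt of air conditioning.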
At the same time, not all 29.6 GW will materialize simultaneously. Some capacity exists only on paper, and operators can adjust deployment schedules in response to chip availability, market conditions, or regulatory hurdles. The precise crossover point between AI demand and a state like New York’s peak may shift. But the underlying trend of convergence is unlikely to reverse.
For grid planners, ratepayers, and policymakers, the most useful way to read this comparison is as an early warning that has already arrived. An industry that barely appeared in utility forecasts a decade ago is now large enough to move state-level demand projections, trigger nuclear plant restarts, and reshape electricity pricing across entire regions. Whether that growth ultimately delivers broad economic benefits or concentrates costs on communities least equipped to absorb them will depend on how quickly power systems adapt, how transparently AI operators disclose their energy use, and whether climate and reliability constraints are treated as hard boundaries rather than obstacles to negotiate around.
This article was researched with the help of AI, with human editors creating the final content.