Artificial intelligence data centers worldwide now command 29.6 gigawatts of power capacity, according to Stanford University’s 2026 AI Index Report. To put that in perspective: New York state’s entire electrical grid peaks at about 31.5 gigawatts on the hottest days of summer. The gap between what AI facilities can draw and what roughly 20 million New Yorkers need at full blast has narrowed to less than 2 gigawatts, about the output of a single large nuclear plant.
How the numbers stack up
The 29.6 GW figure comes from the Stanford Institute for Human-Centered Artificial Intelligence (HAI), which publishes one of the most widely cited annual benchmarks on AI’s global footprint. The report aggregates capacity data from data center operators worldwide and describes the total as “about what it takes to power the entire state of New York at peak demand.”
New York’s grid operator, the New York Independent System Operator (NYISO), projects a statewide summer peak of 31,471 megawatts, or approximately 31.5 GW, according to the most recent summer energy outlook published by the New York State Department of Public Service. A separate adequacy update from the same agency notes that installed generating capacity in the state exceeds 40,000 megawatts, providing a buffer above projected peak demand.
At 29.6 GW, global AI data center capacity equals roughly 94 percent of New York’s projected peak. The remaining difference of about 1.9 GW is thin by any measure. For context, a single hyperscale campus under construction can draw 500 megawatts or more.
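For readers who want to check the arithmetic, here is a minimal back-of-the-envelope sketch in Python, using only the figures already cited above:

```python
# Back-of-the-envelope check of the AI-vs-New-York comparison.
# Inputs are the figures cited above: Stanford HAI (29.6 GW) and the
# NY Department of Public Service (31,471 MW peak, >40,000 MW installed).

AI_CAPACITY_GW = 29.6    # global AI data center power capacity
NY_PEAK_GW = 31.471      # projected NY summer peak (31,471 MW)
NY_INSTALLED_GW = 40.0   # installed generating capacity, lower bound

share_of_peak = AI_CAPACITY_GW / NY_PEAK_GW
gap_gw = NY_PEAK_GW - AI_CAPACITY_GW
reserve_margin = (NY_INSTALLED_GW - NY_PEAK_GW) / NY_PEAK_GW

print(f"AI capacity as share of NY peak: {share_of_peak:.1%}")         # ~94.1%
print(f"Gap between NY peak and AI capacity: {gap_gw:.2f} GW")          # ~1.87 GW
print(f"NY reserve margin above projected peak: {reserve_margin:.1%}")  # ~27%
```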
Why this strains the grid differently than homes and offices
Residential electricity demand follows a predictable curve: it spikes on hot afternoons when air conditioners run full tilt, then drops overnight. Data centers do not behave that way. They pull heavy, sustained loads around the clock, creating a flat but enormous baseline that grid operators must plan around 24 hours a day, 365 days a year.
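One way to make that planning difference concrete is load factor, the ratio of average demand to peak demand. The sketch below uses purely illustrative hourly profiles invented for this example; none of the numbers come from the sources cited in this article.

```python
# Illustrative comparison of load shapes. All numbers are invented
# for this example; none come from the sources cited in the article.

HOURS = range(24)

def residential_mw(hour):
    """Stylized residential demand: low overnight, peaking around 5 p.m."""
    return 400 + 600 * max(0, 1 - abs(hour - 17) / 6)

def data_center_mw(hour):
    """Stylized data center demand: near-constant around the clock."""
    return 950

for name, profile in [("residential", residential_mw), ("data center", data_center_mw)]:
    loads = [profile(h) for h in HOURS]
    load_factor = sum(loads) / len(loads) / max(loads)
    print(f"{name}: peak {max(loads):.0f} MW, load factor {load_factor:.0%}")
# residential: peak 1000 MW, load factor 55%
# data center: peak 950 MW, load factor 100%
```

A 100 percent load factor means the grid gets no overnight relief: every megawatt the facility can draw must be generated and delivered continuously.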
The International Energy Agency examined this distinction in its analysis of energy supply for AI. The report highlights the challenges that arise when data centers cluster in specific regions and draw sustained, round-the-clock loads, noting that such patterns complicate traditional grid planning built around variable residential and commercial demand. Northern Virginia, central Ohio, west Texas, and parts of the Nordic countries have become magnets for these facilities, and local grids in those areas are already feeling the pressure. In some cases, utilities have imposed multi-year wait times for new interconnections simply because transmission infrastructure cannot keep pace with demand.
The IEA’s broader point is that AI’s energy footprint is not just large in aggregate; it is concentrated in ways that create localized bottlenecks even when national grids have spare capacity overall.
What the 29.6 GW figure does and does not tell us
Stanford’s number represents global AI data center power capacity, meaning the maximum electricity these facilities can draw from the grid. It does not represent continuous consumption. In practice, data centers operate below their nameplate capacity depending on server utilization, cooling loads, and demand-response agreements. Without detailed utilization data from operators, translating 29.6 GW directly into annual energy consumption or carbon emissions is not straightforward.
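As a rough guide to why that matters, annual energy can be estimated as capacity multiplied by an assumed utilization factor and the 8,760 hours in a year. The utilization values in the sketch below are illustrative assumptions, not figures from the Stanford report:

```python
# Converting power capacity (GW) to annual energy (TWh) requires a
# utilization assumption the Stanford report does not publish.
# The utilization factors below are illustrative guesses only.

HOURS_PER_YEAR = 8_760
AI_CAPACITY_GW = 29.6

for utilization in (0.4, 0.6, 0.8):
    annual_twh = AI_CAPACITY_GW * utilization * HOURS_PER_YEAR / 1_000
    print(f"At {utilization:.0%} utilization: {annual_twh:.0f} TWh/year")
# At 40%: ~104 TWh; at 60%: ~156 TWh; at 80%: ~207 TWh.
```

The threefold spread between those outputs is exactly why the capacity figure alone cannot be read as a consumption or emissions number.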
The figure also blends different types of facilities. Cutting-edge AI training clusters running thousands of GPUs at near-full load sit alongside more conventional cloud servers that handle AI inference workloads intermittently. The power profiles of these operations differ significantly, and the Stanford report’s publicly available summary does not break down the 29.6 GW by workload type, region, or operator.
That lack of granularity matters. Without knowing how much of the 29.6 GW sits in the United States versus Europe or Asia, or how much is already energized versus contracted but not yet built, it is difficult to assess the near-term impact on any specific grid. Data center power consumption remains notoriously hard to track because facilities are owned by a mix of hyperscale providers, colocation firms, and enterprise operators, many of whom treat energy use as proprietary information.
The New York comparison itself carries a small but meaningful caveat. Stanford HAI describes 29.6 GW as “about what it takes to power the entire state of New York at peak demand,” while the Department of Public Service’s own projection places that peak at 31,471 megawatts. Both framings are defensible as approximations, but they are not identical: the Stanford language implies near-equivalence, while the DPS data shows a gap of nearly 2 GW. Readers should treat the analogy as directionally accurate rather than exact, a way to convey scale rather than a precise engineering calculation.
Voices from the ground
The tension between AI expansion and grid reliability is not abstract for the communities living alongside new data center construction. In Loudoun County, Virginia, which hosts the densest concentration of data centers in the world, county supervisors have debated zoning restrictions after residents raised concerns about noise, water use, and the strain on local electrical infrastructure. “We are not anti-data center,” Loudoun County Board Chair Phyllis Randall told local media in 2024. “We are pro-planning.”
Utility officials have echoed those concerns. Jason Shaw, chairman of the Georgia Public Service Commission, told Reuters in early 2025 that the state’s largest utility, Georgia Power, had received data center interconnection requests totaling more than 10 gigawatts, a figure that would require the equivalent of several new power plants. “We have to make sure that existing ratepayers are not subsidizing the buildout for one industry,” Shaw said.
How to read the evidence going forward
The strongest evidence in this story falls into two categories. First, the Stanford HAI 2026 AI Index Report provides the 29.6 GW figure and the New York comparison. As an institutional research product from a major university, it carries significant credibility, though its methodology for aggregating global data center capacity is not fully transparent in the publicly available summary materials.
Second, the New York State Department of Public Service supplies the 31,471 MW peak demand projection and the installed capacity figure exceeding 40,000 MW. These are government planning documents used for actual grid operations and resource adequacy assessments, making them among the most reliable numbers available for the comparison.
The IEA’s work on AI and electricity adds global context and analytical weight but serves a different function from the Stanford report. Rather than originating the specific 29.6 GW statistic, it synthesizes a broad range of data on digital technologies and power systems to highlight emerging stress points.
Projections for growth through 2030 vary widely depending on assumptions about model efficiency, chip architecture, and the pace of new facility construction. Scenario-based forecasts from agencies and consultancies typically bracket a range of outcomes, and no single consensus number has emerged. Any claim about the rate of future growth should be treated with caution unless it names a specific baseline, time frame, and source.
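To illustrate why the baseline and time frame matter so much, the sketch below computes the compound annual growth rate implied by a purely hypothetical 2030 endpoint. The 90 GW figure is invented for the example and is not a forecast from any source cited here:

```python
# Implied compound annual growth rate (CAGR) for a hypothetical
# forecast. The 2030 endpoint is invented for illustration only;
# no source in this article projects it.

def implied_cagr(start_gw, end_gw, years):
    """Annualized growth rate connecting a baseline to an endpoint."""
    return (end_gw / start_gw) ** (1 / years) - 1

BASELINE_GW = 29.6            # Stanford HAI figure used in this article
HYPOTHETICAL_2030_GW = 90.0   # invented endpoint, for illustration only

for years in (4, 5, 6):       # different assumed baseline years shift the rate
    rate = implied_cagr(BASELINE_GW, HYPOTHETICAL_2030_GW, years)
    print(f"Over {years} years: {rate:.0%} per year")
# Over 4 years: 32% per year; over 5: 25%; over 6: 20%.
```

The same endpoint yields annual growth rates a dozen percentage points apart depending solely on the assumed starting year, which is why unsourced growth claims deserve skepticism.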
The 29.6 GW figure from Stanford’s 2026 AI Index Report is a snapshot of a moving target. What it captures is a moment when AI’s physical footprint became impossible to ignore: not a rounding error on the grid, but a load comparable to the peak demand of one of the largest state economies in the United States. For grid operators, utilities, and the millions of households that share the same wires, the margin between AI’s capacity and a major state’s peak load is now measured in single-digit gigawatts and shrinking.
*This article was researched with the help of AI, with human editors creating the final content.