Morning Overview

AI data centers drive soaring power demand and strain electric grids

The rapid expansion of artificial intelligence is accelerating U.S. electricity demand, with data centers emerging as a major driver. According to a federal assessment by national laboratory researchers, U.S. data centers consumed roughly 176 terawatt-hours (TWh) of electricity in 2023, up from 58 TWh in 2014. Federal forecasters at the U.S. Energy Information Administration say demand growth is strengthening, with data centers among the key contributors to the strongest four-year increase in U.S. electricity demand since 2000. The shift is raising reliability and cost questions for grid operators and regulators as they adapt rules for how massive new loads connect to the power system.

U.S. Data Center Energy Use Has Tripled in Under a Decade

A federal assessment prepared by national laboratory researchers tracked U.S. data center electricity consumption from 2014 through 2023 and modeled several scenarios through 2028. The report found that data centers accounted for roughly 4.4% of total U.S. electricity consumption in 2023. Under the range of growth scenarios, that share could climb to between 6.7% and 12% by 2028, with total consumption reaching 325 to 580 TWh depending on how quickly AI adoption scales and whether new efficiency measures take hold. Even on the low end, the analysis implies that data center demand would roughly double again in just five years.
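The report's headline figures can be sanity-checked with a few lines of arithmetic. This is an illustrative sketch, not part of the federal analysis; it uses only the TWh values quoted above:

```python
# Growth figures quoted in the article (TWh).
use_2014, use_2023 = 58.0, 176.0       # historical consumption
low_2028, high_2028 = 325.0, 580.0     # 2028 scenario range

growth_ratio = use_2023 / use_2014                 # ~3.0x over nine years
cagr_2014_23 = growth_ratio ** (1 / 9) - 1         # implied annual growth rate

low_ratio = low_2028 / use_2023                    # ~1.8x over five years
high_ratio = high_2028 / use_2023                  # ~3.3x over five years

print(f"2014-2023: {growth_ratio:.1f}x total, ~{cagr_2014_23:.0%} per year")
print(f"2023-2028 scenarios: {low_ratio:.1f}x to {high_ratio:.1f}x")
```

The arithmetic bears out the prose: consumption roughly tripled over nine years (about 13% annual growth), and even the low 2028 scenario implies close to a doubling in five years.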

The U.S. Department of Energy released the findings and tied the demand increase to three overlapping forces: AI applications, broader industrial growth, and economy-wide electrification. That framing matters because it signals that data center load is not a temporary spike but part of a structural shift in how the country uses electricity. For grid planners, the challenge is not simply building more generation capacity but doing so fast enough to keep pace with demand that is accelerating on multiple fronts simultaneously, from electric vehicles and heat pumps to factories and server farms.

Federal Forecasters Flag the Strongest Demand Growth in a Generation

The U.S. Energy Information Administration reinforced those findings in its January 2026 Short-Term Energy Outlook, which attributed a return to multi‑year sales growth largely to demand from large computing facilities. For years, total U.S. electricity consumption was essentially flat, held in check by efficiency gains in lighting, appliances, and industrial processes even as the digital economy expanded. That era appears to be over as AI clusters, crypto mining, and other high‑density computing loads arrive faster than new efficiency standards can offset them, reversing a long‑running decoupling between economic growth and electricity use.

The EIA separately forecasts the strongest four-year growth in U.S. electricity demand since 2000, driven largely by data centers, alongside projections for solar capacity additions, coal and natural gas generation, and Henry Hub prices that shape the broader supply picture. Globally, the pattern is similar. The International Energy Agency estimated that data centers consumed approximately 415 TWh worldwide in 2024, equal to about 1.5% of global electricity use. The IEA’s analysis pointed to accelerated servers built for AI workloads as a driver of rising power density inside facilities, meaning each rack now draws significantly more electricity than the servers it replaced, and suggesting that even modest expansions in physical footprint can translate into outsized jumps in load.
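The two IEA figures quoted above can be cross-checked against each other. A rough sketch, using only the numbers in this article:

```python
# If data centers used ~415 TWh in 2024 and that equals ~1.5% of global
# electricity, the implied global total follows directly.
dc_twh = 415.0       # IEA estimate of 2024 data center consumption
dc_share = 0.015     # ~1.5% of global electricity use

implied_global_twh = dc_twh / dc_share
print(f"Implied global electricity use: ~{implied_global_twh:,.0f} TWh")
```

The implied global total of roughly 27,700 TWh is broadly consistent with published estimates of worldwide electricity consumption, so the two figures hang together.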

Grid Regulators Scramble to Write New Rules

The speed of data center expansion has outrun the regulatory frameworks that govern how large electricity consumers connect to the grid. The Federal Energy Regulatory Commission took action on co‑location arrangements in which AI data centers seek to plug directly into nearby power plants, bypassing the broader transmission network. FERC opened a formal investigation under docket EL25‑49‑000 and related proceedings, finding that co‑located facilities in PJM raise unresolved tariff, reliability, and cost‑allocation questions. At issue is whether these bespoke deals comply with open‑access rules that require similarly situated customers to be treated alike.

In a related action, FERC directed PJM, the nation’s largest grid operator, to establish transparent rules for serving AI‑driven data centers and other large loads co‑located with generation, with an emphasis on reliability and consumer protection. The concern is straightforward: if a data center draws power directly from a generator without paying its share of grid maintenance and reliability costs, those costs shift to other ratepayers. Residential and small‑business customers end up subsidizing the infrastructure that keeps the lights on while a tech company’s facility operates on a separate arrangement, and regulators are moving to clarify how interconnection queues, capacity obligations, and emergency curtailments should apply to these new kinds of customers.

Rising Bills and Regional Cost Pressures

The financial consequences are already visible in regions with heavy data center concentration. In the PJM electricity market, which stretches from Illinois to North Carolina and serves as the backbone of the eastern grid, data centers were associated with an estimated $9 billion in costs, according to a Pew Research Center summary of PJM-related estimates. Northern Virginia, which hosts the densest cluster of data centers in the world, has become the focal point for debates about whether local grids can absorb continued growth without degrading service for existing customers, and whether local zoning and tax policies should continue to encourage new construction in already‑strained corridors.

Bloomberg reporting captured the pressure on ordinary ratepayers, quoting one individual saying of electricity prices, “They’re going up and up.” The same person asked, “What is your breaking point?” That question resonates beyond any single utility territory. When data centers bid up the price of electricity in wholesale markets or trigger expensive grid upgrades, the costs ripple outward. Homeowners and small businesses, which lack the bargaining power of large industrial customers, can find themselves facing higher bills even if no new server farm is built in their own community, deepening concerns that the AI boom is socializing its energy costs while privatizing its profits.

Balancing AI Growth With Grid Reliability

The emerging policy challenge is to reconcile the economic promise of AI with the physical limits of the power system. Utilities and developers are racing to site new data centers near abundant renewable resources, such as large wind and solar installations, in hopes of pairing growth in computing with lower‑carbon generation. Yet transmission lines to move that power to where it is needed most often take a decade or more to permit and build, far slower than the typical construction timeline for a data center. Without careful coordination, regions could see bottlenecks where power is plentiful in theory but constrained in practice, forcing grid operators to rely more heavily on existing fossil‑fuel plants during peak demand.

Regulators are also weighing how to harness efficiency as a counterweight to rising load. The same federal researchers who documented the tripling of data center consumption emphasized that next‑generation chips, advanced cooling, and workload management could substantially reduce the electricity required per unit of computation. Policymakers are exploring whether to translate those technical possibilities into standards, incentives, or procurement rules that nudge operators toward more efficient designs. The outcome will help determine whether AI’s energy footprint continues its steep climb or gradually flattens even as the technology becomes more deeply embedded across the economy.

*This article was researched with the help of AI, with human editors creating the final content.