Morning Overview

AI data centers are consuming so much electricity that the existing grid cannot meet demand — and nuclear is the only baseload option scaling fast enough

In northern Virginia, the densest cluster of data centers on the planet, Dominion Energy’s interconnection queue now stretches years into the future. Facilities that house the servers behind ChatGPT, cloud computing, and AI model training are requesting so much new power that the local utility has, at times, simply stopped accepting applications. The bottleneck is not bandwidth or land. It is electricity, and there is not enough of it.

That regional crunch reflects a national and global pattern now backed by hard federal data. A congressionally mandated study from Lawrence Berkeley National Laboratory tracked U.S. data-center electricity consumption from 2014 through 2024 using bottom-up hardware shipment and utilization data. Its finding: consumption has climbed steeply, with AI workloads acting as the primary accelerant in recent years. The U.S. Department of Energy endorsed those findings and framed the surge as an active grid-planning challenge, not a theoretical one.

Globally, the International Energy Agency’s base-case scenario projects data-center electricity demand will roughly double from about 460 TWh in 2022 to approximately 945 TWh by 2030. To put that in perspective, 945 TWh is more than the total annual electricity consumption of France. AI training and inference workloads are the single largest driver of that growth.
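For readers who want to check the arithmetic, a quick sketch confirms both claims. The France figure here is an approximation for illustration, not taken from the IEA report itself:

```python
# Sanity check on the IEA base-case figures (France's consumption is approximate).
dc_2022_twh = 460    # global data-center demand, 2022 (IEA estimate)
dc_2030_twh = 945    # IEA base-case projection for 2030
france_twh = 445     # France's annual electricity consumption, rough figure

growth = dc_2030_twh / dc_2022_twh
print(f"2022 -> 2030 growth: {growth:.2f}x")                 # roughly a doubling
print(f"2030 demand vs France: {dc_2030_twh / france_twh:.1f}x France's usage")
```

The projected 2030 figure works out to just over twice the 2022 baseline and more than double France's annual consumption, consistent with the comparison above.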

As of mid-2026, the central question for utilities, regulators, and the tech industry is straightforward: where will all that power come from? And increasingly, the answer circles back to nuclear energy, the only zero-carbon source that generates electricity around the clock, regardless of weather or season.

Why the grid was not built for this

The U.S. power grid was designed around a basic assumption: electricity demand grows slowly and predictably, roughly 1% per year. For two decades, that assumption held. Energy efficiency gains in lighting, appliances, and industrial processes offset population and economic growth, keeping total demand nearly flat.

AI shattered that pattern. A single large-scale AI training run, stretched over weeks or months, can consume as much electricity as thousands of U.S. households use in an entire year. When dozens of these runs happen simultaneously across a region’s data centers, the load on the grid resembles the sudden arrival of a new city, except this city runs 24 hours a day, 365 days a year, with no seasonal dip.
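A rough estimate shows where a figure like that comes from. Every input below is an assumption for illustration, not a published number for any specific model:

```python
# Hypothetical large training run -- all inputs are illustrative assumptions.
gpus = 25_000          # accelerators in the training cluster
watts_per_gpu = 700    # approximate peak draw of a modern AI GPU
pue = 1.2              # power usage effectiveness (cooling/overhead multiplier)
days = 90              # duration of the run

run_gwh = gpus * watts_per_gpu * pue * days * 24 / 1e9   # watt-hours -> GWh
household_mwh_per_year = 10.8                            # typical U.S. household
households = run_gwh * 1000 / household_mwh_per_year
print(f"~{run_gwh:.0f} GWh, the annual usage of ~{households:,.0f} households")
```

Under these assumptions the run consumes around 45 GWh, the yearly usage of several thousand homes. Published estimates for frontier-scale runs vary widely, but they land in this general range.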

The LBNL data shows this is not hypothetical. U.S. data centers consumed an estimated 4.4% of total national electricity in 2023, up from roughly 1.8% a decade earlier. The DOE’s review noted that some regional grid operators are now revising their demand forecasts upward by double-digit percentages, driven almost entirely by data-center applications.

PJM Interconnection, the grid operator covering 13 states from Virginia to Illinois and the backbone of the nation’s data-center corridor, saw its 2024 capacity auction clear at record-high prices. The reason, according to PJM’s own analysis: surging demand from data centers colliding with the retirement of older fossil-fuel plants. The grid is being squeezed from both sides.

Nuclear’s unique fit for always-on loads

Solar and wind have become the cheapest sources of new electricity generation in most of the country. But they share a fundamental limitation for data-center operators: they are intermittent. Solar panels produce nothing at night. Wind turbines go still on calm days. For a hyperscale data center that must run continuously at near-full capacity, intermittency is not a minor inconvenience. It is a structural mismatch.

Grid-scale batteries can bridge short gaps, storing solar energy generated at midday and releasing it after sunset. But today’s lithium-ion systems typically provide four to six hours of storage, nowhere near enough to cover a multi-day weather event or a prolonged winter lull in solar output. Longer-duration storage technologies exist in pilot form, but none operate at the scale needed to backstop hundreds of megawatts of continuous data-center load.
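The scale of that mismatch is easy to quantify. The load and battery sizes below are illustrative assumptions, not figures for any specific facility:

```python
# Storage needed to ride out a multi-day renewables lull (illustrative numbers).
load_mw = 300        # continuous draw of a hypothetical hyperscale campus
gap_hours = 72       # a three-day wind/solar lull
battery_mwh = 400    # one large 100 MW / 4-hour grid battery

energy_needed_mwh = load_mw * gap_hours
batteries = energy_needed_mwh / battery_mwh
print(f"{energy_needed_mwh:,} MWh needed -> {batteries:.0f} such batteries")
```

Bridging a single three-day gap for one campus would take dozens of today's largest grid batteries, dedicated to that one site.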

Nuclear power sidesteps this problem entirely. A conventional reactor operates at roughly 90% or higher capacity factor, producing steady output for 18 to 24 months between refueling outages. That operating profile matches the demand profile of a large data center almost exactly. It is also why tech companies, not just utilities, have started pursuing nuclear deals directly.
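The match is visible in simple annual-output arithmetic. The campus load here is an assumption for comparison:

```python
# Annual output of one conventional reactor vs. one data-center campus.
reactor_mw = 1000        # typical large U.S. unit
capacity_factor = 0.92   # U.S. fleet has averaged above 90% in recent years
campus_mw = 300          # assumed continuous data-center load
hours_per_year = 8760

reactor_twh = reactor_mw * capacity_factor * hours_per_year / 1e6
campus_twh = campus_mw * hours_per_year / 1e6
print(f"reactor: {reactor_twh:.2f} TWh/yr, campus: {campus_twh:.2f} TWh/yr")
```

One reactor's annual output of roughly 8 TWh could carry about three such campuses running flat out, year-round.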

In 2024, Microsoft signed an agreement with Constellation Energy to restart Three Mile Island Unit 1, a reactor in Pennsylvania that had been shut down for economic reasons in 2019. Amazon Web Services struck a deal with Talen Energy to purchase power from the Susquehanna nuclear plant, also in Pennsylvania. Google announced investments in advanced nuclear development. These are not peripheral experiments. They represent billions of dollars in committed capital from companies whose core business depends on reliable, large-scale electricity.

The small modular reactor question

Much of the long-term optimism around nuclear and AI centers on small modular reactors, or SMRs. These are factory-fabricated reactor units, typically producing 50 to 300 megawatts each, designed to be deployed faster and at lower upfront cost than conventional gigawatt-scale plants. In theory, an SMR could be sited adjacent to a data-center campus, providing dedicated power without relying on long-distance transmission.

The reality, as of mid-2026, is more complicated. The most advanced U.S. SMR project, NuScale Power’s VOYGR design for the Carbon Free Power Project in Idaho, was canceled in November 2023 after cost estimates rose sharply. NuScale’s design remains the first and so far only SMR to receive NRC design certification, but no commercial unit is operating in the United States.

Other projects are moving forward. Kairos Power broke ground on its Hermes demonstration reactor in Oak Ridge, Tennessee, a non-power test unit intended to validate its molten-salt coolant technology. TerraPower, backed by Bill Gates, is building a demonstration sodium-cooled reactor in Kemmerer, Wyoming. Both projects benefit from the ADVANCE Act, signed into law in July 2024, which streamlined NRC licensing timelines and reduced regulatory fees for advanced reactor applicants.

But demonstration reactors are not commercial power plants. The gap between proving a design works and deploying dozens of units to serve data-center loads is measured in years, possibly a decade or more. For the late-2020s demand surge that LBNL and the IEA project, SMRs are unlikely to arrive in time. Their contribution, if it materializes, belongs to the early-to-mid 2030s at the earliest.

What can be done right now

The most immediate nuclear response to AI electricity demand does not involve building anything new. It involves getting more out of what already exists.

The U.S. operates 94 commercial reactors at 54 plant sites. Many of these units were licensed for 40 years and have already received 20-year extensions. The NRC is now evaluating applications for a second round of extensions that would allow some reactors to operate for 80 years. Every reactor that stays online represents roughly 1,000 megawatts of continuous, carbon-free power that does not need to be replaced.

Uprates offer another path. By upgrading turbines, steam generators, or other components, operators can squeeze additional megawatts from existing reactors without building new ones. The NRC has approved hundreds of uprates over the past two decades, collectively adding thousands of megawatts to the nation’s nuclear fleet. These are incremental gains, but in a grid straining under new AI loads, incremental gains matter.
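To see why those increments add up, consider a rough fleet-wide calculation. The uprate percentage and average unit size are illustrative assumptions:

```python
# What incremental uprates add up to across the fleet (illustrative assumptions).
fleet_reactors = 94    # operating U.S. units, approximate
avg_unit_mw = 1000     # rough average unit size
uprate_fraction = 0.02 # a modest 2% power uprate

per_unit_mw = avg_unit_mw * uprate_fraction
fleet_mw = per_unit_mw * fleet_reactors
print(f"{per_unit_mw:.0f} MW per unit; ~{fleet_mw:,.0f} MW if applied fleet-wide")
```

A 2% uprate applied across the fleet would yield nearly 1,900 MW of new continuous capacity, about the output of two large reactors, without pouring a single foundation.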

Restarting shuttered reactors, as Microsoft and Constellation are attempting at Three Mile Island, is a third option. Not every retired plant is a candidate. Some have been partially decommissioned or lack the grid connections to resume service. But for plants that closed for economic rather than safety reasons, restart is technically feasible and far faster than building from scratch.

The rest of the toolkit

Nuclear alone will not close the gap. Even the most aggressive deployment scenarios leave both room and need for other resources.

Utility-scale solar and wind remain essential for adding low-cost energy to the grid, even if they cannot serve as the sole backbone for always-on data-center loads. Pairing renewables with expanding battery storage can cover a growing share of demand, particularly during peak afternoon and early evening hours.

On the demand side, efficiency improvements in AI hardware are real and ongoing. Each new generation of GPUs and custom AI chips delivers more computation per watt than the last. Advanced cooling systems, including liquid cooling and immersion cooling, reduce the overhead electricity that data centers consume beyond their servers. Workload scheduling, shifting non-urgent AI training runs to hours when renewable energy is abundant, can further reduce peak grid stress.

Natural gas plants, for better or worse, remain the grid’s primary flexible resource. They can ramp up in minutes to meet sudden demand spikes, a capability that neither nuclear nor renewables can match. Their carbon emissions conflict with climate commitments, but in regions where new nuclear and storage have not yet arrived, gas plants are filling the gap by default. Pretending otherwise does not help grid planning.

Geothermal energy also deserves mention. Like nuclear, geothermal provides continuous baseload power with minimal carbon emissions. Enhanced geothermal systems, which drill deep into hot rock to create artificial reservoirs, are advancing rapidly. Fervo Energy’s Project Red in Nevada demonstrated commercial-scale output in 2023. But geothermal’s total U.S. capacity remains small, roughly 3.7 gigawatts, and scaling it to meet data-center demand would require a buildout that has not yet begun in earnest.

What the next two years will reveal

The period between now and 2028 will test nearly every assumption in this debate. If AI adoption continues on its current trajectory, the demand numbers in the LBNL and IEA reports will prove conservative, and the pressure on grid operators will intensify. If a major efficiency breakthrough in AI models reduces computational requirements, the crisis may ease before it fully arrives.

On the supply side, the progress of SMR demonstration projects, the outcome of reactor restart efforts, and the pace of NRC licensing decisions will determine whether nuclear can move from promising candidate to actual contributor. The ADVANCE Act removed some regulatory friction, but construction timelines, supply-chain constraints, and financing challenges remain.

What is no longer in doubt is the scale of the problem. Three independent institutional analyses, from LBNL, the DOE, and the IEA, confirm that AI-driven electricity demand is growing faster than the grid was built to handle. Nuclear energy, with its unmatched combination of reliability, density, and zero operational carbon emissions, is the strongest baseload candidate to meet that demand. Whether it can deliver on that potential fast enough is the defining energy question of the next decade.

More from Morning Overview

*This article was researched with the help of AI, with human editors creating the final content.