Morning Overview

By 2035 data centers will need ~3x today’s power

Data centers are on track to become one of the defining energy stories of the next decade, with global facilities projected to require roughly three times as much electricity by 2035 as they do today. That surge is not a distant abstraction; it is a concrete planning problem for utilities, regulators, and technology companies already racing to keep up with artificial intelligence and cloud demand. If the industry gets this wrong, the result will not just be higher power bills: it will be stalled innovation and strained grids in some of the world’s most critical economic regions.

I see the next ten years as a stress test for how quickly infrastructure can adapt to digital growth that refuses to slow down. Forecasts that data center energy demand could soar nearly 300 percent, or that AI-specific facilities might need far more than that, are forcing a rethink of how and where we build the internet’s physical backbone. The question is no longer whether data centers will reshape power systems, but how fast the rest of the economy can adjust.

Why data center power demand is exploding

The core reason data center electricity use is set to multiply is simple: the world is asking servers to do far more work than they were ever designed for. Traditional web hosting and enterprise workloads are now layered with generative AI, real-time analytics, and high-resolution streaming, each of which pushes hardware to run hotter and longer. When I look at projections that data center energy demand will soar nearly 300% through 2035, it is clear that this is not a marginal uptick but a structural shift in how much power digital infrastructure will consume.

Behind those headline figures sit very physical constraints, from the number of substations that can feed a campus to the cooling systems needed to keep racks of GPUs from overheating. The same analysis that points to that nearly 300 percent rise also underscores how much of the growth is tied to AI training and inference, which concentrate enormous compute into dense clusters. That is why reporting from analysts such as Tim De Chant has become shorthand in industry circles for a new era of power-hungry computing, one that is forcing utilities to revisit long-term capacity plans and grid operators to rethink where they can safely accommodate such concentrated demand in the first place.

The AI boom and the 3x baseline

Artificial intelligence is the accelerant that turns a steady climb in data traffic into a near-vertical line on the power demand chart. Training large language models and recommendation engines requires thousands of high-end chips running flat out for weeks, and then serving those models to billions of users keeps the load elevated around the clock. That is why recent projections suggest data center electricity demand will surge 165% by 2035, growth that would leave the sector at roughly 2.65 times today’s baseline and effectively lock in a near-tripling of energy use as AI spreads from chatbots into finance, healthcare, logistics, and consumer apps.
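The arithmetic behind these headline percentages is easy to check. A minimal sketch, using only the figures cited above (the function name is my own):

```python
# Convert a stated percentage increase into an end-state multiplier.
# A "165% surge" means the final value is 1 + 1.65 = 2.65x the baseline,
# which is where the "near-tripling" reading comes from.

def growth_to_multiplier(percent_increase: float) -> float:
    """Return the end/start ratio implied by a percentage increase."""
    return 1.0 + percent_increase / 100.0

print(f"165% surge -> {growth_to_multiplier(165):.2f}x today's demand")
print(f"200% surge -> {growth_to_multiplier(200):.2f}x (an exact tripling)")
```

One caveat the conversion makes visible: a 200 percent increase is an exact tripling, while a 300 percent increase would be a quadrupling, so "nearly 300%" headlines and "3x" headlines are close in spirit but not interchangeable.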

In practice, that means the “3x” headline is a conservative floor rather than a ceiling. AI workloads are not just another application running in the background; they are becoming the dominant driver of new capacity, from hyperscale campuses to specialized inference clusters at the edge. When I talk to operators, they describe a world in which these forecasts are no longer abstract labels but planning inputs, with AI treated as a first-class citizen in power planning: dedicated substations, priority interconnection queues, and long-term contracts that assume a permanently higher draw on the grid.

When 3x is not enough: AI facilities that could hit 30x

Even within this broader tripling, some AI-focused facilities are on a very different trajectory. Analysts tracking the most advanced AI data centers warn that power demand for these sites alone could surge as much as 30 times by 2035, a figure that dwarfs the overall sector average. In other words, while the typical cloud region might “only” need three times today’s power, the most intensive AI campuses could behave more like industrial megaprojects, with their own dedicated generation and transmission. That is the context for those 30x forecasts, a scenario that reflects how quickly generative models are scaling in size and complexity.
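Translating those multipliers into annualized growth shows how different the two trajectories really are. A back-of-envelope sketch, where the ten-year horizon to 2035 is my assumption:

```python
# Compound annual growth rate (CAGR) implied by an overall multiplier
# over a fixed horizon. 3x and 30x over a decade sit in different regimes.

def implied_cagr(multiplier: float, years: int = 10) -> float:
    """Annual growth rate that compounds to `multiplier` after `years`."""
    return multiplier ** (1.0 / years) - 1.0

print(f"3x by 2035  -> ~{implied_cagr(3):.1%} growth per year")
print(f"30x by 2035 -> ~{implied_cagr(30):.1%} growth per year")
```

A 3x decade works out to roughly 12 percent annual growth, a pace utilities have managed before for fast-growing industrial loads; the 30x scenario implies roughly 40 percent a year sustained for ten years, which is why those campuses get compared to industrial megaprojects.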

The language used in those projections is telling. A phrase like “significant growth in AI is spurring unprecedented demand for new AI data centers” is not hyperbole; it is a sober description of what happens when every major platform company races to build its own AI stack. If even a fraction of that 30x scenario materializes, utilities will have to treat AI campuses the way they treat aluminum smelters or petrochemical plants, with bespoke grid upgrades and long-term resource plans that assume a constant, high-intensity load rather than the more variable patterns of traditional IT.

Billions in new builds and the Perplexity factor

To meet this demand, developers are pouring billions of dollars into new data center projects that are explicitly designed around AI. These are not incremental expansions of existing server rooms; they are greenfield campuses with their own substations, high-capacity fiber, and in some cases on-site generation. I see this in the way industry conversations now revolve around multibillion-dollar buildouts that assume at least a tripling of power needs over the next decade, with AI as the anchor tenant from day one rather than an add-on. That is the backdrop for commentary that AI data centers are already planning for 3x demand, with investors treating power availability as a gating factor for whether a project can even move forward.

One illustration of how quickly expectations have shifted comes from a widely discussed response by Perplexity, which framed the rapid growth of AI as a driver of both massive new infrastructure and a push to reduce carbon intensity. That framing became a kind of shorthand for the industry’s own internal debate about how to reconcile 3x demand with climate goals. The fact that such a response could anchor a broader discussion about billion-dollar projects and new energy strategies shows how central AI-specific facilities have become to the overall data center narrative.

Grid bottlenecks and the PJM warning sign

All of this new capacity has to plug into real-world grids that were not built with AI in mind, and the strain is already visible in some regions. In parts of the eastern United States, for example, clusters of proposed data centers are converging on the same transmission corridors, raising questions about whether local infrastructure can safely deliver the required power. When I look at reports describing proposed clusters concentrated within the territory of the PJM Interconnection, it is clear that grid operators are already sounding the alarm about how concentrated data center growth could outpace available capacity.

The PJM Interconnection example is more than a local squabble over permits; it is a preview of the friction that will surface wherever data center clusters collide with finite transmission and generation. As PJM weighs its obligation to ensure reliable service against the risk that rapid buildouts could be “unreasonable,” it is effectively writing the playbook other regions will follow. I see this as a turning point where grid planners, regulators, and developers will have to coordinate far earlier in the process, with transparent assessments of how much additional load a given area can absorb before it triggers costly upgrades or reliability concerns.

New data centers, new security and reliability risks

The next wave of facilities will not just be larger; they will also be more critical to everyday life, which raises the stakes for security and reliability. As more banking, healthcare, and public services move into AI-enhanced cloud platforms, any disruption at a major data center could ripple across entire economies. That is why projections that new data centers will need almost triple the current energy demand by 2035 are often paired with warnings about the need to harden these sites against both cyber and physical threats. In my view, the same planning that goes into securing a nuclear plant or major refinery will increasingly be applied to hyperscale campuses that host essential digital infrastructure.

Security professionals are already treating power availability as part of the threat model. If a facility is drawing three times as much electricity as its predecessors, it is more vulnerable to targeted grid disruptions, whether from extreme weather or malicious actors. The recognition that new data centers will need almost triple today’s energy by 2035 is already prompting operators to invest in redundant feeds, on-site backup generation, and more sophisticated load-shedding strategies. I expect that by the early 2030s, the line between “data center security” and “energy security” will be almost impossible to draw.

How much power are we really talking about?

Tripling sounds dramatic, but it becomes more tangible when translated into gigawatts. Recent projections suggest that data centers could be drawing around 106 GW of power by 2035, a figure that rivals the total electricity consumption of some mid-sized countries. When I map that number against current capacity, it underscores just how much new generation and transmission will be required to keep the digital economy running. It is not simply a matter of building more server halls; it is a question of whether the broader energy system can scale in parallel.
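Those two numbers, roughly 106 GW and a tripling, also pin down the implied starting point. A rough check, treating the 106 GW figure as a sustained draw (an upper-bound simplification, since real facilities do not run at peak year-round):

```python
# Back-of-envelope: implied current baseline and annual energy at 106 GW.
# Both inputs are the article's projections, not measurements.

demand_2035_gw = 106.0   # projected 2035 draw
multiplier = 3.0         # "triple by 2035"

baseline_gw = demand_2035_gw / multiplier          # implied baseline today
annual_twh_2035 = demand_2035_gw * 8760 / 1000     # GW sustained all year -> TWh

print(f"Implied current baseline: ~{baseline_gw:.0f} GW")
print(f"106 GW sustained for a year: ~{annual_twh_2035:.0f} TWh")
```

Roughly 930 TWh a year is an upper bound that already exceeds the annual electricity consumption of most individual countries, which is why the gigawatt framing lands harder than the percentages.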

Those same forecasts frame the trend in stark terms, with headlines like “Data Center Energy Demand Set to Triple by 2035 Amid AI-Driven Expansion” capturing both the magnitude and the cause of the shift. The 106 GW reference is not just a statistic; it is a planning target that utilities and policymakers will have to bake into long-term resource plans. For me, the key takeaway is that data centers are no longer a niche load tucked into the “commercial” category; they are emerging as a standalone pillar of electricity demand that will shape investment decisions across the entire energy sector.

Why utilities and regulators cannot wait

Given these trajectories, waiting for demand to fully materialize before upgrading the grid is no longer a viable strategy. Large power plants, high-voltage lines, and substations take years to permit and build, which means decisions made in the next few years will determine whether the system can handle 3x data center demand by the mid-2030s. I see a growing recognition among utilities that they must treat AI and cloud growth as a central scenario in their planning, not a speculative upside. That includes revisiting interconnection queues, revising load forecasts, and working with developers to align project timelines with realistic infrastructure buildouts.

Regulators, for their part, are being pulled into unfamiliar territory. They are used to balancing residential, commercial, and industrial needs, but now they must weigh the benefits of digital services against the costs of accelerated grid expansion. The prominence of grid operators like PJM in these debates signals that the conversation has moved from technical working groups into the realm of public policy. I expect more commissions to follow PJM Interconnection’s lead in scrutinizing whether proposed clusters are compatible with reliability obligations, and to demand clearer commitments from developers on efficiency, demand response, and local economic benefits.

The efficiency wildcard and what comes next

One open question is how far efficiency gains can bend these curves. Historically, each new generation of chips and cooling systems has delivered more compute per watt, which helped offset rising demand. With AI, however, the appetite for larger models and more complex workloads is outpacing those improvements. Even if servers become significantly more efficient, the sheer volume of new applications suggests that total power use will still climb steeply. In my view, the realistic best case is that efficiency slows the rate of increase rather than reversing it, keeping the sector closer to a 3x trajectory instead of veering toward the 30x extremes seen in some AI-specific forecasts.
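The race between demand and efficiency can be made concrete with a toy model: net power growth is workload growth compounded against efficiency gains. The 30 percent and 15 percent annual rates below are purely illustrative assumptions of mine, not sourced forecasts:

```python
# Toy model of the efficiency wildcard: each year compute demand grows and
# performance-per-watt improves; net power draw compounds the ratio.

def net_power_multiplier(workload_growth: float,
                         efficiency_gain: float,
                         years: int = 10) -> float:
    """Compound ((1 + workload) / (1 + efficiency)) over `years`."""
    return ((1 + workload_growth) / (1 + efficiency_gain)) ** years

# Illustrative only: 30%/yr more compute demanded, 15%/yr better perf-per-watt.
print(f"Net power draw after 10 years: ~{net_power_multiplier(0.30, 0.15):.1f}x")
```

Under those assumed rates, power draw still more than triples over the decade despite steady efficiency gains, which is the sense in which efficiency bends the curve without reversing it.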

That is why I see the coming decade as a race between innovation and inertia. On one side are engineers pushing for more efficient chips, advanced cooling, and smarter workload scheduling that can shave megawatts off peak demand. On the other are the hard limits of permitting, construction, and public tolerance for new energy infrastructure. The names and figures that keep surfacing, from Tim De Chant’s nearly 300 percent projection to the 165 percent surge tied to AI, are not just data points; they are signposts marking how quickly the digital and physical worlds are colliding. By 2035, the success or failure of that collision will be measured not only in teraflops, but in whether the grid kept pace with the data centers that now depend on it.
