
Google is sounding an unusually blunt alarm about the future of artificial intelligence infrastructure in the United States, and the problem is not chips or cooling. The company is warning that the country’s aging power grid, and especially the high-voltage transmission system, has become the single biggest obstacle to building the next wave of data centers. As AI workloads surge and facilities grow more power-hungry, the bottleneck is no longer how fast developers can code, but how quickly utilities can deliver electrons.
Behind the technical jargon is a simple reality: the United States built its grid for a different era, and it is now colliding with a data center boom that expects industrial scale power on internet timelines. That mismatch is forcing some of the world’s largest technology companies to wait years just to plug in new capacity, even as demand from AI models, cloud services, and everyday apps keeps rising.
The grid connection queue that could stall AI
Google has started describing U.S. transmission delays as its top challenge in bringing new data centers online, a remarkable admission from a company that usually frames obstacles in terms of software or hardware. According to the company, utilities are now quoting four- to ten-year waits for large grid connections, with at least one utility telling Google it would need a 12-year study period before interconnection could even be approved. Those wait times are fundamentally at odds with how quickly AI demand is growing.
Google has also highlighted that the U.S. transmission system itself, not local distribution lines, is now the biggest obstacle to connecting hyperscale facilities. The long-distance, high-voltage network that moves bulk power from plants to population centers was never designed to serve clusters of data centers that can each draw hundreds of megawatts. As Google and its peers try to site new campuses, they are running into a system that moves slowly, requires complex studies, and was built around traditional utilities rather than single customers that might consume the entire output of a large generator. That tension is now visible in reports that Google says the U.S. transmission system is its biggest challenge for connecting data centers.
Developers cannot outrun a power shortage
Even if permitting and studies moved faster, the underlying supply of power is tightening. The Uptime Institute has warned that developers will not outrun the power shortage, because data center demand is rising faster than utilities can add generation and wires. In Uptime’s view, the industry is racing toward a power crisis that technology alone will not solve, a point underscored in its research analysts’ commentary on what lies ahead for data centers.
The Uptime Institute has gone further, warning that the data center sector is heading into a period where power constraints will shape everything from site selection to service reliability. It argues that the industry is “racing toward a power crisis that technology alone will not be able to solve,” a stark assessment that shifts the focus from clever efficiency tricks to hard infrastructure. That perspective is embedded in the Uptime Institute’s 2026 data center predictions, which frame power availability as the defining constraint on growth.
AI’s appetite and the scale of the coming crunch
The urgency behind Google’s warnings becomes clearer when set against the scale of AI’s projected power draw. One analysis suggests that AI data centers could consume electricity equivalent to about 100 million U.S. homes, roughly two thirds of all homes in the country, if current build-out plans materialize. That comparison, drawn from an energy thesis on AI infrastructure, captures how the sector is starting to look less like a niche tech load and more like an entire new residential grid layered on top of the existing one.
At the same time, the number of facilities chasing that power is exploding. Recent reporting has highlighted that more than 4,000 data centers are in various stages of planning or construction worldwide, a figure that illustrates how global the race has become. Rising electricity rates are already a political and economic flashpoint, and more operators are being forced to confront the reality that this boom is fundamentally tied to soaring energy demand. Those dynamics are laid out in detail in an analysis of 4,000 planned data centers and the backlash over electricity costs.
Slow moving infrastructure meets internet speed demand
What makes this clash so difficult to resolve is the mismatch between how quickly digital services scale and how slowly physical infrastructure changes. Most people do not think about infrastructure unless it fails, and that sentiment applies directly to the grid. Users expect AI assistants, streaming platforms, and enterprise tools to be available instantly, but the substations, transformers, and transmission lines behind them take years to plan and build. A widely shared explanation of how AI is pushing America’s power grid to its limits captures exactly this disconnect.
Observers have also described how the world is building its future on top of systems that were never designed for this kind of load profile. The U.S. grid was optimized for centralized plants serving broad regions, not for clusters of hyperscale campuses that may each demand as much power as a mid-sized city. As AI workloads grow, that design gap becomes a structural risk, not just an inconvenience. The result is a growing list of projects that are technically ready but stuck in queues, waiting for utilities and regulators to catch up to a digital economy that moves at a very different speed.
How Google and policymakers are trying to respond
Faced with these constraints, Google is not just complaining about the grid, it is trying to reshape how it connects to it. The company has said that U.S. transmission delays are now the new choke point for its AI data centers, and that some utilities are quoting a 12-year wait just to study interconnection. That frustration has pushed Google to call for reforms to permitting and planning, and to explore siting facilities closer to existing power plants where capacity already exists. The scale of the challenge is evident in reports that Google flags 12-year waits as a central risk to its AI roadmap.
Reporting has also detailed how Google faces major challenges in connecting data centers to the U.S. power grid because the transmission system was built for a different mix of customers and generators. The company’s experience reflects a broader pattern in which the world’s largest technology companies are running up against the realities of a slow-moving power system, even as they pour money into AI and cloud expansion. That tension is captured in coverage of the major challenges Google faces in connecting its facilities, which emphasizes that the U.S. transmission system is now the biggest obstacle to those links.