Google, Tesla, and several other major companies have thrown their weight behind Utilize, a coalition campaign aimed at squeezing more performance out of the existing U.S. power grid rather than waiting years for new transmission lines to be built. The effort, which also includes Carrier and other industry players, targets a problem that has quietly worsened as data centers, electric vehicles, and AI workloads drive electricity demand sharply upward. At the core of the coalition’s argument is a striking claim backed by federal data and academic research: American grid infrastructure is dramatically underused, and software-driven upgrades could unlock capacity that is already sitting idle.
A Grid Running Well Below Its Limits
The case for Utilize rests on a simple but counterintuitive fact. Despite warnings about grid strain and rising blackout risks, U.S. electric power infrastructure operates far below its rated capacity. The Federal Reserve Bank of St. Louis publishes this measure in its FRED database as a capacity utilization series for electric power generation, transmission, and distribution (NAICS code 2211). The series is updated monthly; its most recent observation, covering January 2026, provides a standardized baseline for measuring how much of the grid's potential goes untapped.
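For readers who want to inspect that baseline directly, here is a minimal sketch of pulling the series from FRED's public CSV endpoint. The series ID is an assumption worth verifying on fred.stlouisfed.org, and the snippet assumes pandas is available.

```python
# Pull the FRED capacity utilization series for electric power (NAICS 2211).
# SERIES_ID is an assumption; confirm the exact ID on fred.stlouisfed.org.
import pandas as pd

SERIES_ID = "CAPUTLG2211S"  # assumed ID: capacity utilization, NAICS 2211
url = f"https://fred.stlouisfed.org/graph/fredgraph.csv?id={SERIES_ID}"

# FRED serves a two-column CSV: observation date and percent of capacity.
series = pd.read_csv(url, index_col=0, parse_dates=True).squeeze("columns")
print(series.tail())  # most recent monthly observations
print(f"latest reading: {series.iloc[-1]}% of rated capacity in use")
```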
That federal metric matters because it challenges a common assumption in energy policy debates. When utilities and regulators discuss the need for billions of dollars in new infrastructure, the conversation often skips past the question of whether existing assets are being used efficiently. Utilize and its corporate backers are forcing that question into the open, arguing that better software, sensors, and operational practices could extract significantly more value from lines and transformers already in place.
The broader research ecosystem has been circling this issue for years. Researchers in energy-focused academic and postdoctoral programs have examined how headline grid utilization metrics can mask large pockets of unused capacity. Their work points to operational constraints, market rules, and conservative planning assumptions as key reasons why the physical grid is rarely pushed close to its true safe limits.
Stanford Research Exposes Idle Western Transmission
The most detailed evidence supporting Utilize's thesis comes from a Stanford study of the Western Electricity Coordinating Council (WECC) region. That work, led by Liang Min of Stanford's Bits & Watts Initiative, examined how much of the region's transmission infrastructure was actually in use during the roughly 90 hours per year when demand peaks.
The findings were stark. Even during those highest-stress windows, utilities in the American West used only 18% to 52% of their transmission line capacity, with most lines clustering around 30% utilization. Transformer utilization ranged from 28% to 68%. In practical terms, this means that during the very hours when grid operators worry most about reliability, roughly two-thirds or more of available transmission capacity sits unused on many lines.
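To make those percentages concrete, the arithmetic below applies the study's reported utilization range to a hypothetical 1,000 MW line; the rating is an illustrative round number, not a figure from the study.

```python
# Idle headroom implied by the reported peak-hour utilization figures.
# Utilization is peak flow divided by the line's rated capacity.
RATED_MW = 1000.0  # hypothetical line rating, chosen for easy arithmetic

for utilization in (0.18, 0.30, 0.52):
    flow_mw = utilization * RATED_MW
    idle_mw = RATED_MW - flow_mw
    print(f"{utilization:.0%} utilized -> {flow_mw:.0f} MW flowing, "
          f"{idle_mw:.0f} MW idle at peak")
```

At the 30% figure where most lines cluster, 700 of the hypothetical 1,000 MW sit idle even at peak, which is where the "two-thirds or more" framing comes from.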
This gap between rated capacity and actual use is not simply a safety margin. While grid operators do reserve headroom for contingencies, the Stanford data suggests the unused share is far larger than what reliability standards require. That excess represents potential room for new renewable energy connections, EV charging infrastructure, and data center loads, all without building a single new transmission tower.
The study's authors emphasize that these results do not mean every line can safely be pushed to its nameplate rating. Instead, they argue that better monitoring and control could allow operators to use a larger fraction of existing capacity without compromising reliability, particularly during predictable peak windows.
Why Tech Giants Are Betting on Software Over Steel
Google and Tesla did not join Utilize out of abstract concern for grid efficiency. Both companies have direct, growing stakes in how quickly the power system can absorb new demand. Google's expanding fleet of AI-focused data centers requires an enormous and reliable electricity supply. Tesla's vehicle and energy storage businesses depend on a grid that can handle widespread fast charging and distributed battery systems. For both, the traditional timeline for building new transmission, often a decade or more from proposal to energization, represents a bottleneck that threatens their growth plans.
The coalition’s approach centers on what might be called “grid software upgrades” rather than physical construction. Technologies like dynamic line rating, which adjusts a transmission line’s capacity in real time based on weather conditions, and advanced power flow optimization can increase throughput on existing infrastructure without the permitting battles and capital costs of new builds. Carrier, Tesla, and the other coalition members launched their campaign specifically to lower electricity costs by unlocking this underused grid capacity.
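To illustrate the idea behind dynamic line rating, here is a minimal sketch of a simplified conductor heat balance, loosely in the spirit of IEEE Std 738. The coefficients and conductor parameters are illustrative assumptions, not any utility's actual rating method.

```python
import math

def dynamic_line_rating_amps(
    ambient_temp_c: float,
    wind_speed_ms: float,
    solar_wm2: float,
    max_conductor_temp_c: float = 75.0,
    conductor_diameter_m: float = 0.028,   # assumed ACSR-class conductor
    resistance_ohm_per_m: float = 7.3e-5,  # assumed AC resistance at max temp
) -> float:
    """Rough steady-state ampacity from a simplified heat balance.

    Cooling (convection + radiation) must absorb resistive heating plus
    solar gain: I^2 * R = q_convective + q_radiative - q_solar.
    Coefficients are illustrative; real ratings follow IEEE Std 738.
    """
    delta_t = max_conductor_temp_c - ambient_temp_c
    # Forced-convection cooling, crudely proportional to sqrt(wind speed).
    q_convective = 3.0 * math.sqrt(max(wind_speed_ms, 0.1)) * delta_t  # W/m
    # Radiative cooling via Stefan-Boltzmann on the conductor surface.
    emissivity, sigma = 0.8, 5.67e-8
    surface_per_m = math.pi * conductor_diameter_m  # surface area per metre
    t_cond_k = max_conductor_temp_c + 273.15
    t_amb_k = ambient_temp_c + 273.15
    q_radiative = emissivity * sigma * surface_per_m * (t_cond_k**4 - t_amb_k**4)
    # Solar heating on the conductor's projected area.
    absorptivity = 0.8
    q_solar = absorptivity * solar_wm2 * conductor_diameter_m
    net_cooling = q_convective + q_radiative - q_solar  # W/m left for I^2*R
    if net_cooling <= 0:
        return 0.0
    return math.sqrt(net_cooling / resistance_ohm_per_m)

# A cool, windy hour supports far more current than a static summer rating.
print(f"hot, still air: {dynamic_line_rating_amps(40, 0.6, 1000):.0f} A")
print(f"cool, breezy:   {dynamic_line_rating_amps(10, 5.0, 200):.0f} A")
```

The point of the sketch is the direction of the effect: cooler air and stronger wind carry away more heat, so the same wire can safely carry more current than its static, worst-case rating assumes.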
The financial logic is straightforward. New high-voltage transmission lines can cost millions of dollars per mile and take years to permit and construct. If software and operational changes can recover even a fraction of the idle capacity identified in the Stanford study, the savings for ratepayers and developers could be substantial. The exact dollar figures remain uncertain, as no verified industry-wide cost projections have been published, but the direction of the math favors optimization over construction wherever it is technically feasible.
What Existing Coverage Gets Wrong
Much of the public discussion about grid capacity treats the problem as a simple shortage: demand is rising, so the country needs more power plants and more wires. That framing, while not entirely wrong, misses a critical layer. The Stanford and Federal Reserve data together show that the U.S. grid is not running out of physical capacity in most regions. It is running out of the operational flexibility and market structures needed to use what already exists.
This distinction has real consequences for consumers and businesses. Building new transmission triggers rate increases that flow through to electricity bills. Permitting delays for new lines can stall renewable energy projects, keeping dirtier generation online longer than necessary. And the assumption that only new construction can solve the problem tends to favor incumbent utilities, which earn regulated returns on capital investment, over technology companies and startups offering software-based solutions.
Utilize’s corporate backers are, in effect, arguing that the regulatory framework rewards building new assets even when better use of old ones would be cheaper and faster. That is a politically charged claim, and it will face resistance from utilities and construction interests that benefit from the current model. But the underlying data is difficult to dismiss when academic researchers show transmission lines operating at roughly 30% of capacity during peak hours and federal statistics point to a sector-wide pattern of underutilization.
Limits of the Optimization Argument
The Utilize coalition’s pitch has real constraints that its backers have not fully addressed in public materials. Software-based grid optimization works best on lines and transformers that are underused because of conservative operating assumptions, not because of genuine physical limitations. In some corridors, thermal limits, voltage stability concerns, or contractual obligations genuinely prevent higher loading. The Stanford analysis of Western transmission emphasizes that each line has its own engineering constraints, and not all of the observed headroom can be safely converted into usable capacity.
There are also institutional hurdles. Grid operators must trust the new sensors and algorithms that underpin dynamic line ratings and real-time power flow controls. Regulators need to decide how to compensate utilities for investing in software and analytics rather than concrete and steel. And market rules must evolve so that independent power producers, data centers, and large industrial loads can actually access the newly freed capacity without getting bogged down in interconnection queues designed for a slower, more infrastructure-heavy era.
Critics of the optimization-first narrative warn that focusing too heavily on software could delay necessary long-term investments. They argue that some regions, particularly fast-growing metropolitan areas and constrained interregional corridors, will need substantial new transmission regardless of how efficiently current assets are used. From this perspective, grid optimization is a bridge strategy: essential for the next decade, but not a permanent substitute for building more lines.
A Dual Track for the Future Grid
The emerging consensus among many researchers and system planners is that the United States needs a dual-track strategy. On one track, advanced software, better data, and revised operating procedures should push existing infrastructure closer to its safe limits, as highlighted by the Western transmission findings and the national capacity utilization data. On the other, targeted new transmission projects should relieve genuine bottlenecks and connect high-quality renewable resources to demand centers.
Utilize’s corporate members are betting that the first track has been badly neglected, and that there is commercial opportunity in helping utilities, grid operators, and regulators catch up. If they are right, the payoff could be lower costs for consumers, faster integration of clean energy, and a more resilient grid, all achieved years sooner than traditional build-out timelines would allow. If they are wrong, and the headroom identified in studies proves difficult to convert into real-world capacity, the U.S. power system could find itself scrambling to build infrastructure even more urgently later on.
For now, the coalition has succeeded in reframing a central question in energy policy: not just how much new infrastructure the grid needs, but how intelligently it uses what it already has. As electricity demand accelerates with AI, electrified transport, and industrial decarbonization, that question will determine whether the next decade of grid investment is dominated by steel in the ground or intelligence in the wires.
*This article was researched with the help of AI, with human editors creating the final content.