Scholten warns data centers could drain water and crash the power grid

Congresswoman Hillary Scholten, a Democrat representing Michigan’s 3rd District, has warned that the rapid expansion of data centers could strain local water supplies and put added pressure on the electrical grid. Separate actions by federal regulators and findings from university researchers highlight concerns about large new electricity loads and heavy water use, adding context to Scholten’s argument that communities hosting these facilities may face real resource constraints.

Federal Regulators Step In on Grid Reliability

A key sign of regulators’ growing focus on large new electricity loads came when the Federal Energy Regulatory Commission directed PJM, the nation’s largest grid operator, to write new rules for handling massive electricity loads. The directive specifically targets AI-driven data centers and other large facilities that co-locate with power generation sources, a practice that lets these operations draw electricity before it ever reaches the broader transmission network. FERC framed the action around two priorities: reliability and consumer protection, signaling that regulators view unchecked data center expansion as a genuine threat to everyday ratepayers.

The order, filed under Docket No. EL25-49-000, requires PJM to establish transparent procedures for connecting and serving these enormous loads. That matters because PJM coordinates electricity across 13 states and the District of Columbia, serving tens of millions of people. When a single data center campus can consume as much power as a small city, the absence of clear interconnection rules means existing customers could face higher costs or degraded service if planning does not keep pace.

Water Consumption Strains Local Communities

Scholten’s warnings about water depletion find strong support in academic research. A study from the University of Michigan’s Gerald R. Ford School of Public Policy examines what happens when large computing facilities move into smaller towns and suburbs, concluding that data centers consume millions of gallons of water annually for cooling their servers. That volume of withdrawal can compete directly with residential, agricultural, and industrial users, especially in regions already facing drought, shrinking aquifers, or aging water infrastructure that struggles to keep up with new demand.

The study calls for new policies to mitigate these local impacts, a recommendation that aligns with Scholten’s position. Most data center approvals happen at the county or municipal level, where zoning boards often lack the technical expertise or political leverage to impose meaningful water-use restrictions on well-funded tech companies. The result is a pattern in which communities absorb the environmental costs while receiving only modest economic benefits, chiefly property tax revenue and a small number of permanent jobs, with the larger gains accruing to the facility operator. Without binding requirements that tie construction permits to water conservation benchmarks, transparent reporting, or community benefit agreements, the imbalance is likely to widen as AI workloads grow.

The Gap Between Economic Promises and Resource Costs

Much of the public debate around data centers focuses on the jobs and investment they bring. But the assumption that these facilities are net positives for host communities deserves scrutiny. Data centers are highly automated: after construction crews leave, a campus that draws as much power as thousands of homes may employ only a few dozen full-time technicians. Meanwhile, the strain on water and electricity infrastructure affects every resident and business connected to the same utility systems. The economic calculus shifts further when municipalities must upgrade water treatment plants, build new substations, or negotiate emergency power purchases during peak demand periods, costs that tend to land on local taxpayers rather than the data center operator.

Scholten’s critique suggests that the current policy environment effectively subsidizes tech industry growth at the expense of household budgets and natural resources. That framing gains credibility from FERC’s decision to intervene in PJM’s planning process, underscoring that the risks are not hypothetical. When regulators warn that uncoordinated large-load growth could undermine reliability, they are implicitly acknowledging that communities may already be overexposed. The question facing lawmakers is whether federal and state governments will impose conditions before approving new facilities or continue to react after the damage is done, when grid constraints and water shortages are harder and more expensive to fix.

Why Existing Regulations Fall Short

One of the less discussed aspects of this issue is how fragmented the regulatory authority over data centers actually is. FERC can set rules for wholesale electricity markets and interstate transmission, but it has no jurisdiction over local water permits or land use. State public utility commissions regulate retail electricity rates but rarely have a say in whether a data center gets built or how much water it draws from a municipal system. County governments control zoning but often lack the data to model cumulative resource impacts, especially when multiple facilities are proposed across a region. This patchwork means that no single agency is responsible for evaluating the full cost of a new data center to a community, and companies can exploit the gaps by negotiating favorable terms with whichever level of government has the weakest oversight.

The University of Michigan research implicitly acknowledges this structural problem by urging coordinated policy responses rather than piecemeal fixes. Scholars highlight that the environmental and economic impacts of data centers cut across traditional jurisdictional lines, touching everything from groundwater withdrawals to high-voltage transmission planning. Scholten has argued that policymakers should consider stronger standards for issues such as water use and grid impact assessments as data center development accelerates. Without such baseline rules, local officials remain under pressure to approve projects quickly in the hope of landing investment, even when they lack the tools to fully assess long-term resource risks.

What Comes Next for Communities and Regulators

FERC’s directive to PJM is a procedural step, not a final solution. The grid operator must now develop and propose specific rules for large-load interconnection and co-location, a process that will involve public comment periods and likely face pushback from data center operators and utilities that profit from the current arrangement. The timeline for implementation remains uncertain, and in the interim, new data center projects continue to advance through local permitting processes with little federal oversight of their cumulative grid impact. How PJM ultimately defines “large” loads, and whether it requires developers to pay for associated transmission upgrades, will shape whether existing customers are shielded from higher rates and reliability risks.

For communities weighing whether to welcome a data center, the evidence now available suggests caution is warranted. Facilities that consume millions of gallons of water per year and draw power on the scale of a small city can reshape local infrastructure priorities for decades. Scholten’s warnings, echoed by federal regulators and academic researchers, point to a simple conclusion: without clear rules that internalize the true costs of these facilities, the benefits will remain concentrated while the burdens are widely shared. As AI-driven demand accelerates, the choices made by city councils, state regulators, and Congress will determine whether data centers become pillars of sustainable local economies or engines of unchecked resource strain.

This article was researched with the help of AI, with human editors creating the final content.