California’s growing AI-driven data center footprint could intensify pressure on the state’s already-stressed water systems, and the full scale of that impact can be difficult to track across existing reporting and permit databases. As data centers multiply to feed surging demand for artificial intelligence, their cooling systems are drawing from the same strained water supplies that serve farms, cities, and ecosystems across the state. The energy-water link at the heart of this expansion threatens to reshape how California allocates one of its most contested resources.
Energy Surge Tied to AI Drives Hidden Water Costs
The scale of the problem starts with electricity. Data centers consumed roughly 4.4% of all U.S. electricity in 2023, according to a Lawrence Berkeley National Laboratory analysis prepared for the U.S. Department of Energy. That share is projected to reach between 6.7% and 12% by 2028 under scenarios driven largely by AI workloads. Those are national figures, but the consequences can concentrate in water-stressed states such as California, where additional industrial demand can collide with competing needs during dry periods. As utilities scramble to add generation and transmission capacity, every new cluster of server farms also implies a longer-term commitment of local water resources.
What makes this an acute water problem rather than just an energy story is the physics of cooling. Most large data centers rely on evaporative cooling towers or similar systems that consume significant volumes of water to dissipate heat from thousands of servers running around the clock. Research compiled by the Berkeley Lab data center program underscores how water demand often rises in step with energy demand because of these cooling requirements. Each new megawatt of computing capacity installed for AI training or inference adds not only to the electric grid’s burden but also to the draw on local water systems, a connection that rarely appears in public debates about tech expansion. In water-stressed regions, that hidden demand can become significant relative to other local uses.
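The energy-to-water link can be made concrete with a back-of-the-envelope calculation. Every figure below is an illustrative assumption, not reported data: a hypothetical 100 MW facility running around the clock with a water-use effectiveness (WUE) of 1.8 liters per kilowatt-hour, a commonly cited ballpark for evaporative cooling.

```python
# Illustrative only: all inputs are assumptions, not measured values.
IT_LOAD_MW = 100          # hypothetical data center IT load
HOURS_PER_YEAR = 8760     # continuous, round-the-clock operation
WUE_L_PER_KWH = 1.8       # assumed water-use effectiveness (liters/kWh)
LITERS_PER_GALLON = 3.785

energy_kwh = IT_LOAD_MW * 1000 * HOURS_PER_YEAR        # annual energy use
water_liters = energy_kwh * WUE_L_PER_KWH              # annual water consumed
water_mgal = water_liters / LITERS_PER_GALLON / 1e6    # millions of gallons

print(f"Annual energy: {energy_kwh / 1e6:.0f} GWh")
print(f"Annual cooling water: {water_mgal:.0f} million gallons")
```

Under these assumed inputs, a single campus would consume on the order of 400 million gallons a year, and that draw is constant regardless of drought conditions.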
Recycled Water as a Fix Carries Its Own Risks
California regulators have promoted recycled water as a partial solution for industrial users, including data centers. The State Water Resources Control Board oversees a statewide recycled water program that defines permitting pathways under the California Water Code for industrial and commercial reuse, and the state has been advancing direct potable reuse (DPR) rules and guidance. In theory, steering data centers toward treated wastewater instead of freshwater could relieve pressure on rivers and aquifers. In practice, the shift introduces a different set of tensions, because the same reclaimed supplies are also being counted on to bolster urban resilience and environmental flows.
Recycled water is not an unlimited resource. Municipal wastewater treatment plants produce finite volumes, and every gallon diverted to a data center cooling tower is a gallon unavailable for irrigating parks, recharging groundwater basins, or supplementing drinking water supplies through DPR. If tech companies secure long-term recycled water contracts, agricultural users and smaller municipalities competing for the same supply could face tighter availability or reduced flexibility. California’s recycled water framework was designed primarily for landscape irrigation and industrial processes that existed before the AI surge, not for an industry whose water appetite could double or triple within a few years. Without explicit allocation rules that account for this new class of demand, recycled water may simply shift the burden rather than easing it.
Permit Data Remains Fragmented and Hard to Track
One reason the water footprint of data centers stays largely invisible is that the relevant permit and compliance data is scattered across multiple state and federal databases with no single dashboard connecting them. California’s own tracking systems, including the California Integrated Water Quality System (CIWQS) and related databases, contain facility-level discharge permits, monitoring reports, violations, and water-rights filings. But extracting a clear picture of how much water any individual data center withdraws, consumes through evaporation, and discharges requires cross-referencing records that were never designed to be read together. Each system uses its own identifiers and categories, forcing analysts to manually reconcile records across platforms.
At the federal level, the U.S. Environmental Protection Agency maintains the Enforcement and Compliance History Online (ECHO) platform, whose data downloads provide Clean Water Act permit and compliance information nationally, including National Pollutant Discharge Elimination System (NPDES) permits that cover discharges such as cooling tower blowdown and industrial wastewater. Inspections, violations, and enforcement actions are logged there as well. Yet identifying which facilities in the dataset are data centers, rather than other industrial operations, requires manual cross-referencing with corporate filings and local planning records. The broader regulatory framework, described across the agency’s main EPA portal, was never tailored to track a fast-growing digital industry whose primary environmental impacts are mediated through power and water systems. The result is that neither state nor federal regulators can quickly answer a basic question: how much total water is California’s data center sector actually using, and how fast is that number growing?
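The reconciliation problem can be sketched in code. The records below are fabricated stand-ins for the kind of mismatched entries an analyst might pull from federal and state downloads; the field names and matching rule are assumptions for illustration, not the actual schemas of ECHO or CIWQS.

```python
import re

# Hypothetical records: one from a federal permit download, one from a
# state database. All identifiers and fields are invented.
federal_records = [
    {"permit_id": "CA0001234", "facility": "Example Cloud Campus LLC",
     "city": "Tracy"},
]
state_records = [
    {"wdid": "5D123456", "facility": "EXAMPLE CLOUD CAMPUS, L.L.C.",
     "city": "TRACY"},
]

def normalize(name: str) -> str:
    """Crude matching key: lowercase, strip punctuation and suffixes."""
    key = re.sub(r"[^a-z0-9 ]", "", name.lower())
    key = re.sub(r"\b(llc|inc|corp|co)\b", "", key)
    return " ".join(key.split())

# Join on (normalized name, lowercased city) because the two systems
# share no common facility identifier.
state_index = {(normalize(r["facility"]), r["city"].lower()): r
               for r in state_records}
for rec in federal_records:
    match = state_index.get((normalize(rec["facility"]), rec["city"].lower()))
    if match:
        print(rec["permit_id"], "<->", match["wdid"])
```

In practice the variation is far messier, spanning address joins, parcel lookups, and corporate parent filings, which is precisely why no quick statewide total exists.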
Agricultural Users Face a Quiet Squeeze
The most immediate real-world consequence of this data gap falls on California’s agricultural sector. Farmers in the Central Valley and other water-stressed regions already operate under strict allocation rules during drought conditions. When a new data center secures water rights or a recycled water contract in the same basin, it adds a competitor whose demand is constant year-round, unlike seasonal crop irrigation. Data centers do not fallow fields during dry years; they run at full capacity regardless of precipitation, and their cooling needs actually increase during California’s hottest months, precisely when water is scarcest. That temporal mismatch amplifies the pressure on shared aquifers and surface reservoirs.
This dynamic creates what amounts to a hidden equity problem. Large corporate buyers may have more flexibility than some local users when competing for available supply, whether that supply comes from recycled sources, groundwater wells, or municipal connections. Federal regulators monitor pollution and ecosystem impacts through a range of EPA environmental topics, including water quality and industrial discharge, but no existing regulatory framework in California specifically limits cumulative data center water consumption within a given watershed or aquifer. The absence of such a framework means the competitive pressure builds incrementally, facility by facility, without triggering any formal review of whether the total draw is sustainable. For growers already squeezed by climate-driven variability, the arrival of a large data campus can quietly lock in a new baseline of industrial demand that persists for decades.
Closing the Accountability Gap Before Drought Forces It
The conventional wisdom in tech policy circles treats data center water use as a solvable engineering problem, something that better cooling technology or a switch to recycled water can address without broader trade-offs. That framing misses the structural issue. Even the most water-efficient cooling system still consumes water, and the projected growth in AI-driven computing means efficiency gains are likely to be overwhelmed by sheer volume increases. The Berkeley Lab projections of data center electricity reaching as high as 12% of national consumption by 2028 suggest that cooling-related water demand will rise in step unless operators shift toward cooling approaches that minimize evaporation or site the most water-intensive facilities where supplies are more resilient. Waiting for innovation alone to close that gap risks locking in water commitments that will be politically difficult to unwind.
California has an opportunity to move faster than crisis. Regulators could require standardized reporting of water withdrawals and consumption for all large data centers, integrating those disclosures into existing state and federal databases so that planners can see cumulative impacts at the basin scale. Water agencies could condition new permits or recycled water contracts on demonstrating that additional demand will not undermine agricultural viability or urban drought resilience. And local governments weighing new AI campuses could treat water as a central siting criterion, not an afterthought. Without those steps, the state’s next major drought will not just expose the fragility of its water system; it will also reveal how quietly the digital infrastructure behind artificial intelligence has been reshaping who gets to use California’s most limited resource.
*This article was researched with the help of AI, with human editors creating the final content.*