Across the United States, the race to build artificial intelligence infrastructure is colliding with something far less futuristic: fear about power bills, water supplies and neighborhood survival. As AI companies pour money into server farms, residents from California to the Carolinas are asking whether this boom will leave them with blackouts, higher rates and industrial boxes where homes or parks might have stood. Many see a widening gap between the promise of AI and the anxiety of communities that feel they are being asked to carry its costs.
That tension is sharpening as the scale of the buildout becomes clear. The U.S. Energy Information Administration (EIA) reports that data centers already use about 3% of the country’s electricity and could devour 8% by the end of the decade, with AI as the main driver. Tech giants such as Google are planning dozens of new facilities, while local governments experiment with moratoriums and residents mount grassroots campaigns to slow or stop construction. Together, these forces have produced a national argument over who gets to approve AI growth, how much infrastructure it justifies, and what protections neighbors should expect in return.
AI’s power hunger hits the grid
The EIA has quietly become one of the most important sources in this debate. In a recent assessment of digital infrastructure, the agency reported that American data centers consumed roughly 190 billion kilowatt-hours of electricity in 2022, equal to about 3% of national use. In the same analysis, the EIA projects that data center electricity demand will roughly double by 2028, driven by AI systems that require far more computation than traditional cloud services. By 2030, the agency expects these facilities to account for about 8% of all U.S. power demand, a jump that will force utilities to rethink how they plan generation and transmission for years ahead.
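To make the scale of that projection concrete, here is a quick back-of-envelope sketch using only the figures cited above. It assumes, for illustration, that the doubling is measured from the 2022 baseline of about 190 billion kilowatt-hours and that growth is smooth; the EIA does not spell out either assumption in the numbers quoted here.

```python
# Back-of-envelope check of the EIA figures cited above.
# Assumptions (not from the EIA itself): the doubling is measured from the
# 2022 baseline of ~190 billion kWh (190 TWh) and growth compounds smoothly.

baseline_twh = 190        # 2022 U.S. data center use, in terawatt-hours
baseline_year = 2022
target_year = 2028
growth_factor = 2.0       # "roughly double by 2028"

projected_twh = baseline_twh * growth_factor
years = target_year - baseline_year
implied_cagr = growth_factor ** (1 / years) - 1   # compound annual growth rate

print(f"Projected 2028 demand: ~{projected_twh:.0f} TWh")
print(f"Implied growth: ~{implied_cagr:.1%} per year over {years} years")
# Roughly 380 TWh by 2028, i.e. about 12% compound growth each year.
```

Even under those simplified assumptions, the arithmetic points to a sector compounding at roughly 12% a year, which is why the projection reads as an infrastructure problem rather than a rounding error.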
Those figures help explain why local unease is hard to dismiss as simple technophobia. When a single sector is on track to more than double its share of the national grid in less than a decade, residents and regulators are right to ask who will pay for upgrades and how much slack the system really has. That trajectory suggests AI growth is no longer just a corporate strategy story but a basic infrastructure challenge. If utilities must rush new substations and lines into place to serve server farms, the risk is that ordinary households will face higher rates or reliability trade-offs long before they see any direct benefit from machine learning tools.
Google’s expansion and the AI feedback loop
Google’s own disclosures show how quickly this feedback loop between AI demand and physical infrastructure is tightening. In its 2024 Environmental Report, the company stated that its data centers used 18.3 terawatt-hours of electricity in 2023, or about 18,300 gigawatt-hours. The same report notes that AI will drive a further increase in energy use, even as the company pursues efficiency gains and long-term contracts for wind and solar power. That admission matters because it comes not from critics but from one of the firms building the largest AI systems on earth, which has every incentive to present its operations in the best possible light.
Power use is only part of the story; Google is also planning more buildings. The company has outlined plans for 24 new data centers around the world, with a clear focus on the United States as a core market for these facilities. In practical terms, that means more towns will soon be weighing whether to welcome or resist a project backed by one of the world’s wealthiest corporations. When Google discloses that 18.3 terawatt-hours of electricity already flow into its data centers, that AI will push the figure higher, and that 24 largely U.S.-focused sites are on the way, it is also signaling to residents that the AI boom is not a distant future. It is a building permit, a substation and a new high-voltage line coming to a zoning board near them.
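For readers who want to see how the company-level and national figures relate, the sketch below works through the unit conversion and a rough share. It assumes, for comparison only, that Google’s 2023 figure can be set against the EIA’s 2022 national total of about 190 billion kilowatt-hours; the years do not line up exactly, so treat the result as an order-of-magnitude estimate.

```python
# Minimal sketch relating Google's disclosed usage to the national total.
# Assumption: Google's 2023 figure is compared against the EIA's 2022
# national data center total (~190 billion kWh); the years differ slightly.

google_twh = 18.3                 # Google data centers, 2023 (2024 Environmental Report)
google_gwh = google_twh * 1_000   # 1 TWh = 1,000 GWh -> ~18,300 GWh
national_data_center_twh = 190    # U.S. data centers overall, 2022 (EIA figure cited above)

share = google_twh / national_data_center_twh
print(f"Google data centers: {google_gwh:,.0f} GWh")
print(f"Rough share of U.S. data center demand: ~{share:.0%}")
# One company alone accounts for roughly a tenth of the sector's electricity use.
```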
Monterey Park and the rise of local revolts
Nowhere is that tension clearer than in places where residents have decided they would rather fight than adapt. In Monterey Park, California, a proposed data center became a rallying point for neighbors who saw the project as a threat to their quality of life and local resources. Over the past year, homegrown revolts against data centers have emerged in multiple communities, but Monterey Park stands out because activists there organized sustained opposition and ultimately stopped the development. Their campaign turned what might have been a routine planning decision into a referendum on who controls the direction of AI infrastructure in their city.
The Monterey Park story also hints at a broader political shift. Residents who might disagree on national issues found common cause in opposing an industrial-scale facility they feared would bring noise, traffic and heavy energy use without enough local benefit. Reporting in The Guardian describes how this resistance helped unite a fractured community, suggesting that data centers are becoming a new flashpoint where concerns about climate, inequality and corporate power converge. For AI companies, that means public opposition is no longer confined to abstract debates about algorithms; it is showing up in city council chambers with packed audiences and detailed questions about transformers, water lines and tax breaks.
Canton’s moratorium and the language of threat
On the other side of the country, Canton, North Carolina, has taken a more formal step by pressing pause on data center growth. In early 2026, the town board approved a one-year moratorium on new data center construction, effectively freezing projects while leaders study how such facilities might affect resources, jobs and long-term planning. The move did not arise in a vacuum. Residents and officials had grown uneasy about what a cluster of large, power-hungry buildings could mean for their town’s character and costs, especially after the closure of a major paper mill that once employed about 1,100 people.
The language surrounding that decision was striking. In coverage by FOX Carolina reporter Ashley Listrom, Mayor Zeb Smathers called large data facilities a “threat to our community,” and the report noted that one proposed site would have needed up to 706 megawatts of power at full buildout. The same story said the moratorium would give Canton time to study whether data centers could replace even a fraction of the mill’s 698 lost jobs, since such facilities often promise far fewer permanent positions than traditional factories. When a town is willing to put potential investment on hold and label a project a “threat,” as described in the FOX Carolina account, it sends a clear message that the social license for AI infrastructure can no longer be taken for granted.
Fear, fairness and the future of AI buildouts
What connects the EIA’s projections, Google’s expansion plans, Monterey Park’s revolt and Canton’s moratorium is a shared sense that the AI data center boom is outrunning the public’s ability to respond. On one side are companies racing to secure land, power and fiber to train ever-larger models. On the other are residents who hear that data centers already use about 190 billion kilowatt-hours a year, could reach 8% of U.S. electricity demand by 2030, and are being built faster than schools or hospitals. Their questions are less about the fine points of AI and more about fairness: Will these facilities pay enough taxes to offset the roads, lines and substations they require? Will they be required to use clean power, or will local grids lean more on gas and coal to keep servers running?
*This article was researched with the help of AI, with human editors creating the final content.*