Image Credit: Charlie fong - CC BY-SA 4.0/Wiki Commons

Data centers were supposed to be the invisible backbone of the digital economy, humming quietly in the background while the world streamed, traded and increasingly ran artificial intelligence on demand. Instead, a growing body of research now suggests that the industry has made a basic but costly mistake: it has put most of this critical infrastructure in places that are simply too hot and too stressed for efficient cooling. The result is a global fleet of server farms that burn through electricity and water just to stay online, even as climate risks intensify.

New analysis of global facilities finds that the majority of the world’s data centers sit outside the temperature range engineers recommend for economical, resilient operation. That misalignment is already inflating energy bills, straining power grids and magnifying the environmental footprint of AI. It is also forcing governments and companies to rethink where the next wave of digital infrastructure should be built, and how to retrofit what already exists.

The scale of the problem: 7,000 sites in the wrong place

The clearest sign that something is structurally off comes from the numbers. Nearly 7,000 of the world’s 8,808 operational data centers are now estimated to be in climates that fall outside the recommended temperature range for efficient cooling. In other words, roughly four out of five facilities are fighting their surroundings instead of working with them, a design choice that locks in higher operating costs and higher emissions for years to come. The same research notes that these 7,000 sites are not marginal edge nodes but core facilities that underpin cloud computing, streaming and AI workloads across continents, which means the misplacement is systemic rather than a niche outlier.

This geographic mismatch matters because data centers already consume a significant slice of global electricity. According to the International Energy Agency, data centers used about 415 TWh of electricity in 2024, roughly 1.5% of worldwide demand, a share that is rising as AI workloads proliferate. When almost 7,000 out of 8,808 facilities are sited in hotter climates than recommended, every incremental teraflop of AI training or video streaming requires more cooling energy than it would in a more suitable location. That structural inefficiency is now baked into the global digital economy.
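As a rough sanity check on those figures, the short sketch below back-solves the global electricity demand implied by the IEA numbers and recomputes the share of facilities sited outside the recommended band. It is a back-of-envelope illustration, not part of the cited analysis.

```python
# Back-of-envelope check on the figures cited above (illustrative only).
data_center_demand_twh = 415        # reported data center consumption, 2024
share_of_global_demand = 0.015      # roughly 1.5% of worldwide electricity

implied_global_demand_twh = data_center_demand_twh / share_of_global_demand
print(f"Implied global electricity demand: {implied_global_demand_twh:,.0f} TWh")
# ~27,700 TWh, in the right ballpark for total worldwide consumption

# Share of facilities sited outside the recommended climate band
misplaced_sites = 7_000
total_sites = 8_808
print(f"Share of misplaced sites: {misplaced_sites / total_sites:.0%}")  # ~79%
```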

Heat, grids and the physics of bad siting

At the heart of the siting problem is a simple physical reality: servers generate heat, and the hotter the surrounding air, the harder and more expensive it is to remove that heat. Studies of global facilities show that nearly 7,000 data centers operate in regions where ambient temperatures are high enough to require heavy mechanical cooling for much of the year. As a result, operators must run chillers, compressors and fans harder and longer, which not only drives up electricity use but also shortens equipment lifespans and raises the risk of outages when heat waves hit. In many fast-growing AI hubs, the local climate is already close to or above the upper end of what engineers consider an efficient operating band.
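To make that physical reality concrete, the sketch below models a chiller whose efficiency is a fixed fraction of the ideal Carnot limit and estimates how much cooling electricity is needed per kilowatt of IT load at different outdoor temperatures. The supply temperature, Carnot fraction and free-cooling cutoff are assumptions chosen for illustration; real systems have additional temperature lifts plus fan and pump loads, so only the trend, not the absolute numbers, should be read from it.

```python
# Minimal sketch of why ambient temperature drives cooling cost.
# Assumes an idealized chiller achieving a fixed fraction of the Carnot limit;
# the specific numbers are illustrative, not measured data.

def cooling_power_per_kw_it(ambient_c, supply_c=18.0, carnot_fraction=0.5):
    """Estimate chiller electricity (kW) needed per kW of IT heat rejected.

    ambient_c       -- outdoor air temperature the chiller rejects heat into
    supply_c        -- target cold-aisle / chilled-water temperature
    carnot_fraction -- fraction of the ideal Carnot COP a real chiller achieves
    """
    t_cold = supply_c + 273.15
    t_hot = ambient_c + 273.15
    if t_hot <= t_cold:
        return 0.0  # free cooling: outside air is cold enough on its own
    cop = carnot_fraction * t_cold / (t_hot - t_cold)
    return 1.0 / cop

for ambient in (10, 20, 27, 35, 40):
    print(f"{ambient:>2} C ambient -> {cooling_power_per_kw_it(ambient):.2f} kW "
          "of cooling electricity per kW of IT load")
```

Even in this simplified form, the penalty grows quickly: moving from a 20 C to a 40 C ambient roughly decuples the cooling electricity needed per unit of IT load.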

Those higher ambient temperatures have knock-on effects beyond the data hall. Analysts warn that these conditions bring compounding risks, with increased cooling loads putting strain on local power grids in regions such as parts of Asia, the Pacific and the Middle East. When a heat wave pushes both household air conditioning and data center chillers to maximum output, grid operators must scramble to keep voltage stable. In extreme cases, that can mean curtailing industrial users, firing up older fossil fuel plants or accepting a higher risk of blackouts, all of which undercut the promise that digital infrastructure will be both reliable and climate-friendly.

AI acceleration and a $3.3 trillion climate risk

The misalignment between climate and infrastructure would be worrying even if data center demand were flat. It is not. AI is driving a rapid proliferation of new facilities, with hyperscale operators racing to deploy ever-larger clusters of GPUs and specialized accelerators. Analysts who have tried to quantify the financial exposure argue that the sector is now facing a multitrillion-dollar climate risk. One assessment of global infrastructure warns that, taken together, the value at risk from heat, cooling failures, supply chain friction and data recovery issues could reach $3.3 trillion over the coming decades if operators do not adapt.

That same analysis, led by Dominic King, stresses that AI is not just another workload but a structural shift in how data centers are designed and run. Training large models requires dense clusters of high-power chips that can draw tens of kilowatts per rack, far above traditional enterprise norms. As a result, the cost of cooling is rising as a share of total operating expenses, and the penalty for building in the wrong climate is growing. Projections suggest that without smarter siting and cooling strategies, the cost of climate-related disruptions and retrofits could climb to $168 billion by 2065, a figure that reflects both direct damage and the knock-on impact of downtime for AI-dependent services.

Where the heat is worst: 21 countries and 600 hot-zone sites

The climate mismatch is not evenly distributed. Mapping exercises that overlay data center locations with temperature data show that some countries have concentrated their entire digital infrastructure in zones that are already too hot for efficient operation. In 21 countries, all existing data centers are located in climates that exceed recommended thresholds, which means there is no domestic facility that can rely on free cooling for much of the year. In several of these markets, governments are simultaneously pushing for more AI capacity and stricter data localization rules, effectively locking themselves into a high cooling-cost future.

Globally, a growing demand for AI and the desire to store data within national borders have driven at least 600 data centers into regions where average temperatures are above 27°C. In some of these hot zones, operators are already planning 300 more megawatts of capacity, even though the local climate will force them to rely heavily on mechanical cooling. One detailed heat map notes that in 21 countries, all data centers are in climates that are too hot, and that new projects are still being announced in those same areas despite the clear thermal disadvantage, a pattern that reflects regulatory and market pressures more than engineering logic.

Water stress and the AI boom

Temperature is only part of the story. Cooling large AI clusters often requires significant volumes of water, either directly in evaporative systems or indirectly through power plants that supply electricity. A recent environmental roadmap for AI infrastructure warns that many current data clusters are being constructed in water-scarce regions, where every additional megawatt of cooling demand competes with households and agriculture. The report singles out states such as Nevada and Arizona, precisely where AI computing is expanding fastest.

That choice creates a double bind. In arid states like Nevada and Arizona, operators cannot rely on abundant surface water to support evaporative cooling, yet they still face high ambient temperatures that make air cooling alone inefficient. As AI workloads grow, the tension between digital growth and local water security will sharpen, forcing regulators to weigh data center permits against long-term resource planning. The roadmap’s authors argue that without a shift in siting strategy, AI data centers will become significantly worse for the environment than early estimates suggested, a warning echoed in a widely shared explainer that describes how this siting oversight is reshaping the sector’s true footprint.

Why companies keep building in the wrong places

If the physics and the climate data are so clear, why do operators keep putting facilities in hot, water-stressed regions? Part of the answer lies in policy and incentives. Many governments have offered generous tax breaks, cheap land and fast-track permits to lure data center investment, often without fully accounting for long-term climate risks. In some cases, local leaders have prioritized short-term construction jobs and prestige over the question of whether their region’s climate and grid can support decades of high-density computing. The result is a patchwork of incentives that steer projects toward places that are politically eager but thermally suboptimal.

Another driver is the desire to keep data close to users and within national borders. In markets where regulators insist that sensitive information stay onshore, operators have little choice but to build in whatever climate the country offers, even if that means accepting higher cooling costs. Analysts who have mapped global hot spots note that in 21 countries, all data centers are located in climates that are too hot, yet governments in those same countries are still approving new capacity to meet AI and cloud demand. In effect, policy and market pressures are overpowering the engineering case for cooler sites.

The case for a colder data center belt

Researchers who study the sector’s emissions argue that the industry needs a deliberate pivot toward cooler, less water-stressed regions, especially for the most energy-intensive AI workloads. One modeling exercise focused on the United States concludes that the best locations for a data center over the next few years are states that strike a balance between low-carbon electricity, moderate temperatures and robust transmission capacity. The analysis points to a corridor of northern and central states as particularly promising, where average temperatures are lower and renewable energy potential is high, making it easier to run large facilities with both lower cooling loads and lower emissions.

In practical terms, that colder belt includes places like Montana, Nebraska and South Dakota, where land is relatively abundant and the climate allows for more hours of free air cooling each year. Analysts stress that the best US locations over the next few years are the states that strike that balance, a point underscored in a detailed siting study that argues operators should look beyond traditional hubs and consider regions that have not yet become synonymous with cloud infrastructure. That same work observes that some of these states are already turning to targeted incentives to lure development, but with a stronger emphasis on aligning projects with grid and climate realities.
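One way to picture the kind of trade-off these siting studies describe is a simple weighted score over temperature, grid carbon intensity and transmission access. The snippet below is a hypothetical illustration only: the weights, the scoring formula and the per-state figures are placeholders, not values from the cited research.

```python
# Hypothetical siting score combining the three criteria discussed above.
# All weights and per-state figures are illustrative placeholders.

candidates = {
    # state: (avg annual temp C, grid carbon intensity gCO2/kWh, transmission score 0-1)
    "Montana":      (6.0, 450, 0.6),
    "South Dakota": (8.0, 300, 0.7),
    "Nebraska":     (10.0, 500, 0.8),
    "Arizona":      (17.0, 400, 0.7),
}

def siting_score(temp_c, carbon, transmission,
                 w_temp=0.4, w_carbon=0.4, w_grid=0.2):
    """Higher is better: reward transmission, penalize heat and carbon."""
    cooling_penalty = max(temp_c - 13.0, 0) / 15.0   # crude free-cooling proxy
    carbon_penalty = carbon / 1000.0
    return w_grid * transmission - w_temp * cooling_penalty - w_carbon * carbon_penalty

ranked = sorted(candidates.items(),
                key=lambda kv: siting_score(*kv[1]), reverse=True)
for state, features in ranked:
    print(f"{state:<13} score={siting_score(*features):+.2f}")
```

Under these placeholder inputs the cooler, lower-carbon states rank ahead of a hot-climate state like Arizona, which is the qualitative pattern the siting studies describe.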

Legacy hubs and the politics of staying put

Shifting future builds to cooler regions is one thing, but the industry also has to grapple with the legacy of existing hubs. States like Texas have become magnets for data center investment thanks to large power markets, business-friendly regulations and proximity to major population centers. Yet Texas also faces extreme heat, grid stress and, in some regions, water constraints, which makes it a textbook example of the trade-offs that come with building in a hot climate. Operators there are investing heavily in on-site generation, advanced cooling and grid services to manage those risks, but the underlying thermal disadvantage remains.

Globally, similar dynamics play out in established hubs across parts of Asia, the Pacific and the Middle East, where data centers are deeply embedded in local economies and supply chains. Analysts who track climate risk in infrastructure argue that retrofitting these hubs will require a mix of technical upgrades and policy shifts, from mandating higher efficiency standards to encouraging off-peak operation of the most power-hungry AI workloads. A recent synthesis of climate and infrastructure risk notes that nearly 7,000 data centers operate in climates that are already putting strain on local power grids, a figure that underscores how deeply the siting problem is intertwined with national energy policy.

What a smarter build out would look like

Fixing the climate mismatch will not mean abandoning hot regions altogether, but it will require a more nuanced approach to what gets built where. For latency-sensitive services such as gaming or real-time trading, smaller edge facilities may still need to sit close to users in warmer cities, but the most energy-intensive AI training clusters could be shifted to cooler, renewables-rich regions that can support large, efficient campuses. That kind of tiered architecture would align the heaviest cooling loads with the most favorable climates, while still preserving performance for end users.

Policymakers have a central role in steering that transition. Siting studies argue that the best locations for a data center over the next few years in the US are those that combine low-carbon power, cooler temperatures and supportive but climate-aware regulation, and that incentives should be redesigned to reward projects that meet those criteria. Internationally, governments that currently require all data to stay within hot, water-stressed borders may need to revisit those rules, or at least carve out exceptions for anonymized AI training workloads that can safely run in more suitable climates. Without that kind of coordinated shift, the world risks locking another generation of digital infrastructure into the wrong climate, with costs that will only grow as temperatures rise.
