Morning Overview

Study finds human-caused warming has accelerated to about 0.35°C per decade

Researchers Grant Foster and Stefan Rahmstorf have found that the rate of human-caused global warming has roughly doubled over the past decade, reaching about 0.35 degrees Celsius per decade once natural climate influences are filtered out. The finding, published in Geophysical Research Letters, is what the authors describe as the first statistically confirmed acceleration in the warming trend, one they date to approximately 2015. Yet the claim has drawn pushback from other scientists who argue that such a speedup cannot be reliably separated from short-term climate noise, setting up a sharp scientific disagreement with real consequences for how quickly governments need to cut emissions.

What the New Analysis Found

The study by Foster and Rahmstorf isolated the human-driven warming signal by stripping out the effects of El Niño cycles, volcanic eruptions, and other natural drivers of year-to-year temperature swings. After those adjustments, the researchers calculated that the underlying warming rate climbed to approximately 0.35 degrees Celsius per decade in the post-2015 period. That is nearly twice the pace recorded in earlier decades, when the long-term trend hovered closer to 0.18 to 0.20 degrees Celsius per decade.
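In broad strokes, this kind of filtering resembles a multiple regression: fit the temperature record against indices of the known natural drivers, subtract their fitted contributions, and then estimate trends on what remains. The sketch below illustrates the idea with entirely synthetic data; the series, driver indices, and coefficients are invented for illustration and are not the study's actual inputs or method.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1980, 2025)
n = len(years)

# Synthetic stand-ins for natural drivers (NOT real data):
enso = rng.normal(0, 1, n)                    # an ENSO-like index
volcanic = np.zeros(n); volcanic[11] = -1.5   # a Pinatubo-like cooling spike in 1991

# Synthetic "observed" temperature: a steady trend plus natural swings and noise
temp = 0.018 * (years - 1980) + 0.1 * enso + 0.2 * volcanic + rng.normal(0, 0.05, n)

# Regress temperature on intercept, trend, and the natural covariates,
# then subtract only the fitted natural components to get the adjusted series.
X = np.column_stack([np.ones(n), years - years.mean(), enso, volcanic])
coef, *_ = np.linalg.lstsq(X, temp, rcond=None)
adjusted = temp - X[:, 2:] @ coef[2:]   # remove ENSO and volcanic fits

def decadal_trend(y, t):
    """Least-squares slope, converted from deg C per year to per decade."""
    return np.polyfit(t, y, 1)[0] * 10

pre = years < 2015
print("pre-2015 trend:  %.2f C/decade" % decadal_trend(adjusted[pre], years[pre]))
print("post-2015 trend: %.2f C/decade" % decadal_trend(adjusted[~pre], years[~pre]))
```

Because this toy series is built with a constant underlying trend, the two windows should come out similar; the study's claim is precisely that in the real adjusted records, the post-2015 slope does not.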

The press announcement from the American Geophysical Union framed the result as a first: researchers confirming with statistical confidence that global warming has accelerated in the last decade. The study drew on multiple independent temperature records to test whether the signal held up across different measurement approaches, rather than appearing only in a single dataset, and it emphasized that the acceleration appears after known natural fluctuations are removed from the data.

Three Datasets, One Consistent Signal

A key strength of the acceleration claim is that it does not depend on a single thermometer network. The researchers cross-checked their results against NASA’s GISTEMP record, the UK Met Office’s HadCRUT5 dataset, and NOAA’s GlobalTemp series. Each of these primary observational records is maintained independently, uses different spatial interpolation methods, and handles gaps in ocean coverage differently. When all three show a steepening post-2015 slope after natural variability is removed, the case for a real shift in the warming rate becomes harder to dismiss as a measurement artifact.

NASA’s GISTEMP analysis allows researchers to compute decadal trends and compare post-2015 slopes against earlier baselines, while the UK Met Office’s HadCRUT5 dataset adds built-in uncertainty quantification, which helps determine whether an apparent trend change falls outside the range of expected statistical fluctuation. NOAA’s GlobalTemp record provides yet another independent check using its own methodology. The convergence across all three lends weight to the acceleration finding, though it does not settle the debate entirely, because the post-2015 period is still relatively short in climate terms.

The Scientific Counterargument

Not all climate statisticians are convinced. A separate peer-reviewed study published in Communications Earth & Environment argued that a post-1970 acceleration in global mean surface temperature is not reliably detectable once researchers properly account for short-term variability, autocorrelation in the data, and uncertainty about where any true changepoint might fall. The authors of that study warned that common statistical assumptions, such as treating annual temperature errors as independent from one year to the next, can produce spurious acceleration signals that look real but are not.
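The statistical point can be illustrated with a toy Monte Carlo simulation; every parameter here is invented for illustration and not taken from either paper. Generate a temperature-like series with a constant underlying trend plus autocorrelated (AR(1), "red") noise, then count how often the most recent decade's fitted trend looks markedly steeper than the long-run trend even though, by construction, no true acceleration exists.

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1970, 2025)
n = len(years)
true_trend = 0.018          # deg C per year, held constant by construction

def ar1_noise(n, phi=0.6, sigma=0.05):
    """Red noise: each year's error carries over part of the previous year's."""
    e = np.empty(n)
    e[0] = rng.normal(0, sigma)
    for i in range(1, n):
        e[i] = phi * e[i - 1] + rng.normal(0, sigma)
    return e

def decadal_slope(y, t):
    return np.polyfit(t, y, 1)[0] * 10   # deg C per decade

# Monte Carlo: with NO true acceleration, how often does the last decade's
# fitted trend come out at least 1.5x the full-record trend?
trials, hits = 2000, 0
for _ in range(trials):
    temp = true_trend * (years - 1970) + ar1_noise(n)
    recent = years >= 2015
    if decadal_slope(temp[recent], years[recent]) >= 1.5 * decadal_slope(temp, years):
        hits += 1
print("apparent 'acceleration' in %.0f%% of no-acceleration worlds" % (100 * hits / trials))
```

With these illustrative settings, a meaningful fraction of simulated worlds show an apparently doubled recent trend purely by chance, which is the essence of the critics' caution about short windows and year-to-year persistence.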

Additional commentary reported by Nature’s news coverage has underscored this caution, with some experts stressing that the climate system is noisy enough that even a decade of unusually rapid warming may not yet prove that the underlying trend has shifted. From this perspective, the recent cluster of record-hot years could reflect a combination of steady anthropogenic warming, a strong El Niño, and other natural fluctuations, rather than a permanent jump in the background rate.

This disagreement is not trivial. If Foster and Rahmstorf are right, the world is heating up faster than the trajectory embedded in most national climate pledges, and the window for limiting warming to 1.5 degrees Celsius is closing even more rapidly than the latest IPCC assessments suggested. If the counterargument holds, the recent string of record-breaking years may reflect a temporary clustering of natural warmth on top of a steady, already alarming trend, rather than a genuine gear shift in the climate system. Either way, the planet is warming, but the pace at which future thresholds will be crossed remains contested.

Why the Methodological Split Matters

The core tension between the two camps comes down to how researchers handle the messy statistics of a warming planet. Foster and Rahmstorf explicitly adjusted for El Niño and volcanic forcing before testing for a trend change, essentially trying to isolate the purely human-caused component. Their critics counter that even after such adjustments, the remaining data series is short enough and noisy enough that a decade of elevated temperatures does not yet clear the bar for a confirmed acceleration, especially when more conservative statistical models are applied.

Both sides agree on the basic physics: rising greenhouse gas concentrations warm the planet. Where they diverge is on whether the rate of that warming has shifted upward in a statistically meaningful way or whether the apparent speedup could be an artifact of how the question is framed. As the Communications Earth & Environment study noted, model assumptions about error independence can inflate confidence in a changepoint that may not withstand more robust testing. The choice of dataset, baseline period, and statistical framework all influence the answer, which is why the Geophysical Research Letters paper’s multi-record analysis was designed to address exactly that vulnerability.

The debate is unfolding within a broader scientific community anchored by organizations such as the American Geophysical Union, which convenes conferences, publishes journals, and provides forums where competing methods can be scrutinized. Researchers who want to follow the technical arguments more closely can track AGU publications and conference sessions as new work appears. The institutional infrastructure of peer review and professional critique is central to resolving disputes like this one.

Consequences for Climate Policy and Risk

For policymakers, the practical difference between a steady 0.20 degrees Celsius per decade and an accelerating 0.35 degrees Celsius per decade is enormous. At the higher rate, every major temperature threshold arrives years sooner, compressing the timeline for adaptation investments in flood defenses, heat-resilient infrastructure, and agricultural systems. The IPCC’s recent assessments already warned that even the slower trajectory demanded rapid emissions cuts. An acceleration would make those warnings look conservative and could render some long-term planning assumptions obsolete.

For ordinary people, the question translates into how quickly extreme heat events, intensified storms, and shifting growing seasons become the norm rather than the exception. A warming rate that has nearly doubled in a decade, if confirmed, means that the climate conditions of the 2030s could look dramatically different from those of the 2020s, not in the distant, abstract future but within the span of a single mortgage or school career. Insurance markets, housing decisions, and public health systems would all feel the strain of a faster-changing baseline.

Even if the more cautious interpretation proves correct and no statistically robust acceleration is yet detectable, the existing trend is already severe enough to justify urgent action. The disagreement over acceleration does not change the underlying need to cut emissions steeply; instead, it shapes how quickly those cuts must occur to avoid breaching particular thresholds and how much additional risk society is willing to tolerate. In that sense, the argument is less about whether to act and more about how narrow the remaining margin for error has become.

Scientific institutions also face choices about how to communicate such contested findings. Overstating confidence in an acceleration could backfire if later analyses revise the signal downward, while underplaying the possibility of a faster rate might leave communities unprepared for worst-case outcomes. Groups like AGU, which depend in part on philanthropic support, are likely to continue emphasizing transparent methods, open data, and robust debate as they navigate this balance between caution and urgency.

Ultimately, the dispute over whether global warming has measurably accelerated in the last decade illustrates how climate science progresses: through competing analyses, critical review, and gradual convergence. Whether the 0.35 degrees Celsius per decade estimate stands or is later revised, the core message remains stark. Humanity is pushing the climate system into unfamiliar territory, and the exact speed of that journey will shape the risks that governments, businesses, and communities must confront in the years ahead.

More from Morning Overview

*This article was researched with the help of AI, with human editors creating the final content.