Morning Overview

New technique sharpens measurement of universe’s expansion rate

Physicists at the University of Illinois Urbana-Champaign and the University of Chicago have developed a new method for measuring the Hubble constant, the number that defines how fast the universe is expanding. Called the “stochastic siren” technique, the approach extracts cosmological information from gravitational waves without relying on light-based observations, offering a fresh angle on one of modern physics’ most stubborn disagreements.

How the Stochastic Siren Method Works

The technique draws on two distinct data streams from the LIGO-Virgo-KAGRA detector network. First, it uses the observed population of resolved binary black hole mergers, events where two black holes spiral into each other and send detectable ripples through spacetime. Second, and more unusually, it uses the non-detection of the stochastic background, the faint hum that should arise from countless unresolved mergers happening across the cosmos. The absence of that signal, expressed as upper limits, turns out to carry real information about the expansion rate.

The logic works like this: if the Hubble constant were very low, distant mergers would appear closer and louder, producing a stronger collective background hum. Because detectors have not yet picked up that background, low values of the Hubble constant can be ruled out. A team of astrophysicists and cosmologists at the Grainger College of Engineering and the University of Chicago combined this constraint with standard resolved-event measurements to tighten the overall estimate. The result is a gravitational-wave-only measurement that sidesteps the electromagnetic observations traditionally needed to pin down cosmic distances, and it can be updated automatically as more observing runs accumulate.
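The exclusion logic can be sketched numerically. In the toy model below, the predicted background amplitude grows as the assumed Hubble constant shrinks, and any value whose predicted hum exceeds the detectors' upper limit is ruled out. The inverse-square scaling, the reference amplitude, and the limit are all illustrative stand-ins, not quantities from the actual analysis:

```python
# Toy illustration of the stochastic-siren logic: a non-detection
# (an upper limit on the background) excludes low Hubble-constant values.
# The inverse-square scaling and every number here are illustrative only.

def predicted_background(h0, a_ref=1.0e-9, h0_ref=70.0):
    """Toy background amplitude: a stronger hum for lower H0."""
    return a_ref * (h0_ref / h0) ** 2

def excluded(h0_values, upper_limit):
    """H0 values whose predicted hum would already have been detected."""
    return [h0 for h0 in h0_values if predicted_background(h0) > upper_limit]

grid = range(40, 101, 5)   # candidate H0 values, km/s/Mpc
limit = 1.7e-9             # hypothetical observational upper limit

ruled_out = excluded(grid, limit)
print(ruled_out)           # only the low end of the grid is excluded
```

A tighter upper limit (a smaller `limit`) excludes more of the low end of the grid, which mirrors how improved detector sensitivity sharpens the constraint.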

Why the Hubble Tension Demands New Tools

The Hubble constant has been measured in two fundamentally different ways for years, and the answers refuse to agree. Observations of the cosmic microwave background, the afterglow of the Big Bang, point to a value near 67 kilometers per second per megaparsec. Measurements using supernovae and the cosmic distance ladder consistently land closer to 73. That gap, known as the Hubble tension, has survived repeated cross-checks and now sits at a statistical significance that makes it difficult to dismiss as a fluke. If the discrepancy is real rather than a systematic error in one camp, it could signal that the standard model of cosmology is incomplete and that new ingredients such as exotic dark energy or early-universe physics are required.

Gravitational-wave methods matter here precisely because they are independent of both traditional approaches. Standard sirens, the broader family of techniques that treat merging compact objects as distance markers, bypass the chain of calibrations that plagues the supernova distance ladder. The stochastic siren variant goes a step further by extracting distance information even from events too faint to resolve individually, turning a non-detection into a cosmological constraint. Because it depends on the overall population of mergers and on how loudly they would contribute to the background, it is sensitive to the expansion rate in a way that is complementary to electromagnetic probes and can, in principle, be combined with them to test for hidden inconsistencies.
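In the simplest standard-siren case, a nearby merger with an electromagnetic counterpart gives the expansion rate directly: the waveform yields a luminosity distance, the host galaxy yields a recession velocity, and their ratio is the Hubble constant. The sketch below uses round numbers loosely patterned on the 2017 neutron-star merger GW170817 (a distance near 43 megaparsecs and a Hubble-flow velocity near 3,000 km/s); they are illustrative inputs, not the published measurement:

```python
# Minimal standard-siren estimate: H0 = recession velocity / distance.
# Both inputs are illustrative round numbers, loosely patterned on GW170817.

def hubble_constant(velocity_km_s, distance_mpc):
    """H0 in km/s/Mpc from a Hubble-flow velocity and a GW-inferred distance."""
    return velocity_km_s / distance_mpc

v_hubble = 3000.0   # host galaxy's Hubble-flow velocity, km/s (illustrative)
d_lum = 43.0        # luminosity distance from the waveform, Mpc (illustrative)

h0 = hubble_constant(v_hubble, d_lum)
print(f"H0 = {h0:.1f} km/s/Mpc")   # lands between the two contested values
```

The point of the sketch is that no supernova calibration enters anywhere: the distance comes from the gravitational waveform itself, which is why standard sirens carry independent systematics.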

A Growing Toolkit of Independent Approaches

The stochastic siren paper is not arriving in isolation. A separate research effort has demonstrated the first measurement of the Hubble constant from gravitational-wave and galaxy cross-correlations, a technique dubbed “Peak Sirens” that matches gravitational-wave luminosity-distance data against galaxy redshift surveys. Another team has pursued a Gaussian-process reconstruction that combines standard sirens from the GWTC-3 catalog with electromagnetic observations in a model-independent way. Each approach attacks the same question from a different statistical angle, and none depends on the same calibration chain as the supernova distance ladder. That makes any eventual agreement between them especially informative.

Outside the gravitational-wave community entirely, researchers have been refining cosmic chronometers, a method that estimates the expansion rate by measuring the stellar ages of cluster galaxies and projecting to the Hubble constant with cosmological priors. These age-based techniques rely on detailed modeling of galaxy evolution but are subject to very different systematics than either supernovae or gravitational waves. The proliferation of independent methods is significant because each carries its own set of uncertainties. If several unrelated approaches converge on the same value, the case for new physics or for a particular resolution of the tension becomes much stronger, whereas a scatter of answers would point observers back toward hidden biases in the data.

Detector Upgrades Will Sharpen the Constraint

The stochastic siren technique is currently limited by detector sensitivity. LIGO, Virgo, and KAGRA have progressively tightened their upper limits on the gravitational-wave background through successive observing runs, and collaboration-wide syntheses of background searches document how those limits have improved across both astrophysical and early-universe models. Each incremental improvement in sensitivity translates directly into a tighter constraint on the Hubble constant under the stochastic siren framework, because a more sensitive non-detection pushes the excluded range of expansion rates higher. In practice, that means the same hardware and data-analysis upgrades that enable new types of source detections also strengthen the cosmological reach of the method.

Separate forecasting work has explored how gravitational-wave-only methods could reach a few-percent measurement of the Hubble constant with the current detector network and its planned upgrades through upcoming observing runs, all without electromagnetic information. The stochastic siren approach adds a complementary channel to those population-based statistical methods, and as detector noise floors drop and the catalog of resolved mergers grows, the two data streams (resolved events and background limits) will reinforce each other. Institutions such as the University of Illinois Urbana-Champaign are already positioning their physics researchers to take advantage of these improvements, building analysis pipelines that can ingest future observing runs and update cosmological inferences in near real time.

What a Gravitational-Wave Answer Could Change

If gravitational-wave measurements of the Hubble constant land squarely on the value inferred from the cosmic microwave background, the implication would be that late-time distance-ladder measurements are missing something, perhaps subtle calibration drifts, underestimated dust effects, or sample-selection biases in supernova surveys. In that scenario, the standard cosmological model would remain largely intact, and the focus would shift toward improving local measurements and re-examining how nearby galaxies are used as rungs in the ladder. Conversely, if stochastic sirens and other gravitational-wave methods converge on the higher value favored by supernovae, theorists would be pushed to modify early-universe physics in ways that preserve the microwave background data while allowing the present-day expansion rate to be faster.

An intermediate outcome is also possible: gravitational-wave estimates might fall between the two current camps or even disagree with both, suggesting that the Hubble tension is a symptom of a deeper, more intricate problem. Because stochastic sirens exploit a non-detection and population properties rather than individual well-localized events, they are sensitive to assumptions about the merger-rate evolution of black holes and the distribution of their masses. Cross-checking this method against Peak Sirens, cosmic chronometers, and more traditional standard sirens will therefore be essential. Whatever the final answer, the emergence of a gravitational-wave-based measurement marks a shift in how cosmologists approach the expansion rate, transforming it from a quantity inferred almost entirely from light into one that can be triangulated using the fabric of spacetime itself.


*This article was researched with the help of AI, with human editors creating the final content.