
Physicists have spent decades arguing over whether our universe is a fundamental reality or a kind of cosmic software, and the debate has just taken a sharp turn. A group of researchers now say they have used hard mathematics to show that the cosmos cannot be a computer model at all, even as other scientists insist that gravity, information and even deadly diseases point in the opposite direction. I want to trace how these competing claims add up to a striking moment, where some physicists are comfortable using the word “proof” about a question that once belonged mostly to science fiction.

From thought experiment to “proof” claims

The modern simulation debate began as a philosophical puzzle and has steadily migrated into physics labs and math departments. In 2003, philosopher Nick Bostrom argued that if any advanced civilization can create realistic simulated minds, then almost all observers with experiences like ours would be simulated, not real, which is why later discussions often cite him when they claim the odds are stacked against “base reality.” That argument did not prove anything about our specific universe, but it reframed the question as a statistical bet, one that technologists and physicists have been trying to cash out ever since.
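Bostrom’s statistical bet is easy to make concrete. The sketch below uses invented, purely illustrative numbers to show why even one base-reality civilization running many ancestor simulations would make simulated observers the overwhelming majority:

```python
def simulated_fraction(base_civilizations: int,
                       sims_per_civilization: int,
                       observers_per_world: int) -> float:
    """Fraction of all observers who live inside a simulation,
    assuming every world (real or simulated) hosts the same number
    of observers -- a deliberate simplification."""
    real = base_civilizations * observers_per_world
    simulated = base_civilizations * sims_per_civilization * observers_per_world
    return simulated / (real + simulated)

# One real civilization running 1,000 simulations of a billion minds each:
print(simulated_fraction(1, 1000, 10**9))  # -> ~0.999, almost everyone is simulated
```

The force of the argument, and its weakness, both live in the `sims_per_civilization` parameter: nobody knows whether it is a thousand or zero.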

Over the past few years, the tone has shifted from speculation to assertion, with some teams now saying they have mathematically ruled out a simulated cosmos while others argue they have found physical signatures of code. One group of physicists has leaned on Kurt Gödel’s incompleteness theorems to argue that any consistent mathematical system rich enough to describe our universe would contain true statements that cannot be derived within that system, and they claim this means the full structure of reality cannot be simulated on any computer. That is a bold leap from logic to metaphysics, but it shows how far the conversation has moved from late-night dorm-room speculation into formal proofs and counterproofs.

The gravity gambit: Dr Melvin Vopson and the Second Law of Infodynamics

On the other side of the ledger, some physicists argue that the universe looks exactly like a system trying to manage data efficiently. Dr Melvin Vopson, a physicist who has become one of the most visible champions of this view, has proposed a “Second Law of Infodynamics” that treats information as a physical quantity that must be conserved or optimized, in parallel with energy and entropy. In interviews, Vopson has described a cosmos where the laws of physics, symmetries and even the entire universe are shaped by information, and he has outlined how such a framework could make sense if reality were a kind of simulation running on an underlying substrate.

Supporters of this approach point to the way information seems to crop up in fundamental physics, from black hole entropy to quantum bits, and argue that this is exactly what one would expect in a vast computation. Commentary on Vopson’s work holds that, in his estimation, the Second Law of Infodynamics could account for everything we observe, which is why some fans of the idea on forums like SimulationTheory treat it as near proof that we live in a simulation. Even if that is overstated, the fact that a working physicist is trying to write down a new law of nature in these terms shows how seriously the information-first view is being taken.
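Treating information as physical is not, by itself, exotic: Landauer’s principle, a well-established result, already assigns a minimum thermodynamic cost of E = k_B T ln 2 to erasing one bit. A quick back-of-envelope check, with room temperature assumed:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in SI since 2019)
T = 300.0           # assumed room temperature, K

# Landauer limit: minimum energy dissipated when one bit is erased
energy_per_bit = k_B * T * math.log(2)
print(f"{energy_per_bit:.3e} J per bit")  # ~2.87e-21 J
```

Vopson’s Second Law of Infodynamics goes well beyond this established baseline, but it builds on the same premise that bits carry thermodynamic weight.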

When gravity starts to look like code optimization

Vopson has gone further than abstract information talk and has tried to tie one of the most familiar forces in nature directly to data management. In research published in AIP Advances, he argues that gravity is not just a “pull” between masses but an emergent effect of the universe trying to keep its data organized, with additional microstates acting as information-bearing configurations that contribute to entropy. In this picture, the curvature of spacetime is not just geometry, it is an optimizing mechanism that compresses and arranges information-bearing microstates within the system, much like a clever compression algorithm rearranges bits on a hard drive.

Another account of the same work emphasizes that, in the research published in AIP Advances, Vopson proposes that gravity is actually something like a housekeeping function that keeps the universe’s data structures from becoming chaotic. In that description, gravity is not a fundamental interaction but a side effect of the way a simulated universe might minimize storage and processing costs, which is why one report framed it as a possible clue that reality is trying to keep its data organized and asked whether this could be proof we are living in a simulated universe. I see this as less a final answer and more a provocative attempt to reinterpret a familiar force in the language of computation.

Diseases, data compression and a “super complex universe”

Vopson has also tried to ground his ideas in concrete systems, including the behavior of deadly diseases. In one study, he examined how information is distributed in the genomes of viruses and argued that their evolution reflects a drive toward data optimization and compression, similar to what one would expect in a resource-constrained computer program. He has suggested that a “super complex universe” like ours, if it were a simulation, would require built-in data optimization and compression to remain tractable, and that the patterns he sees in biological systems are consistent with that requirement, which is why he has described our cosmos as potentially a simulated construct spanning 93 billion light-years.
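The notion of a genome’s “information content” can be made precise with Shannon entropy, the standard measure that compression arguments rest on. This is a generic sketch of that measure, not Vopson’s actual analysis pipeline:

```python
from collections import Counter
import math

def shannon_entropy_bits(sequence: str) -> float:
    """Average information per symbol, in bits (Shannon entropy)."""
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A uniformly mixed 4-letter sequence carries the maximum 2 bits per
# nucleotide; repetitive, highly compressible sequences carry less.
print(shannon_entropy_bits("ACGT" * 25))  # -> 2.0
print(shannon_entropy_bits("AAAAAACG"))   # well below 2.0
```

A drift toward lower entropy per symbol over evolutionary time is, roughly, the kind of signature the compression argument looks for.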

In that same line of work, he connects information density to the distribution of matter and energy, arguing that there is a maximum amount of information that can be stored in a given space and that the universe appears to operate near such limits. If a simulation had to represent every particle and interaction explicitly, it would quickly become unmanageable, so the argument goes, and the system would have to use shortcuts, compression and emergent laws to keep the computation feasible. I find it telling that this reasoning mirrors how software engineers think about large-scale simulations, even if the leap from viral genomes to the entire cosmos remains speculative and, as critics note, unverified.
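The idea of a maximum information content for a region of space echoes the holographic bound from black hole thermodynamics, where maximum entropy scales with boundary area in Planck units. The bound itself is standard physics; reading it as a simulation’s storage budget is the speculative step:

```python
import math

l_P = 1.616255e-35  # Planck length, m (CODATA value)

def holographic_max_bits(radius_m: float) -> float:
    """Upper bound on bits storable in a sphere of the given radius,
    S_max = A / (4 * l_P**2 * ln 2), with A the sphere's surface area."""
    area = 4.0 * math.pi * radius_m ** 2
    return area / (4.0 * l_P ** 2 * math.log(2))

# Even a 1 cm sphere could hold on the order of 10^66 bits at this limit,
# unimaginably more than any physical hard drive.
print(f"{holographic_max_bits(0.01):.2e} bits")
```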

Designing experiments for a Matrix-style universe

Rather than stopping at theory, Vopson has proposed specific experiments that he believes could reveal whether reality is digital at its core. One idea is to look for evidence that information has mass, by measuring tiny differences in the weight of particles and antiparticles as they annihilate, which he argues could expose a hidden bookkeeping of bits. He has suggested that such a particle-antiparticle experiment could confirm a fifth state of matter and, if it did, change physics as we know it, a proposal he has described in detail while arguing that a new law of physics could prove Elon Musk right about the simulation hypothesis.
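Vopson’s published mass-energy-information conjecture assigns each stored bit a tiny rest mass, m = k_B T ln 2 / c², which is the quantity his proposed annihilation experiment would try to detect. A numerical check at room temperature shows why a direct weighing is hopeless:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
c = 2.99792458e8     # speed of light in vacuum, m/s

def bit_mass_kg(temperature_k: float) -> float:
    """Vopson's conjectured rest mass of one stored bit at temperature T."""
    return k_B * temperature_k * math.log(2) / c ** 2

# At 300 K this is ~3.2e-38 kg, far below anything a balance can weigh,
# which is why the proposal looks for annihilation signatures instead.
print(bit_mass_kg(300.0))
```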

More recently, he has outlined an experiment that would treat the universe as a Matrix-style simulation and look for telltale limits in the way information is stored and processed. He starts by assuming that everything is made up of bits of information and then asks whether there are observable consequences, such as noise floors or discretization effects, that could be measured with high-precision instruments. Reports on this work frame the question directly: could we devise a proper test of whether our universe is a Matrix-style simulation? Vopson’s answer is that if the cosmos is made of bits of information, his experiment could in principle tell us. Whether these tests are feasible with current technology is an open question, but they mark a shift toward treating the simulation idea as experimentally addressable.

The 50–50 crowd and the skeptics

While Vopson and his supporters push for concrete tests, other researchers have tried to quantify how likely a simulated universe is in the first place. Astrophysicist David Kipping, for example, has used Bayesian reasoning to argue that, under certain assumptions, the odds that we live in base reality versus a simulation are roughly even. Kipping showed that even under the simulation hypothesis, most of the simulated realities would be nulliparous, meaning they do not themselves spawn further simulations, which undercuts the naive expectation that simulations vastly outnumber base realities and leads to a roughly 50–50 split. I read this as a reminder that even clever probability arguments depend sensitively on how one models future civilizations and their computing habits.
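Kipping’s bookkeeping can be caricatured in a few lines. Assume, as he roughly does, equal prior weight on “simulations exist” and “they do not”, and note that under the no-simulation hypothesis the chance that we are simulated is exactly zero. The numbers below are illustrative, not his exact model:

```python
def p_we_are_simulated(prior_sim_hypothesis: float,
                       frac_simulated_observers: float) -> float:
    """Posterior probability of being simulated: the physical-only
    hypothesis contributes zero, so only the simulation branch counts."""
    return prior_sim_hypothesis * frac_simulated_observers

# Even if nearly all observers under the simulation hypothesis are
# themselves simulated, an even prior caps the posterior just under 50%.
print(p_we_are_simulated(0.5, 0.99))  # -> 0.495
```

The whole result hinges on the prior and on `frac_simulated_observers`, which is exactly where the nulliparous-simulations argument does its work.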

Public figures have amplified these odds in ways that blur the line between science and speculation. Elon Musk has famously said that the chances we are living in base reality are only a billion to one, while an astrophysicist quoted in the same discussion put the probability that we are living inside someone’s computer closer to 50–50, a disagreement that highlights how subjective these estimates remain even among technically literate people. One analysis bluntly argued that asking whether we live in a simulation is not a scientific question at all, precisely because such probability claims are not grounded in testable evidence, despite the confidence with which Musk and others throw them around.

Mathematicians strike back: “Universe cannot be a simulation”

The most dramatic recent development comes from a team of physicists and mathematicians who say they have finally killed the simulation hypothesis with math. According to their account, they have identified features of our universe that cannot be reproduced by any computation, no matter how powerful, because they rely on structures that are provably non-computable. One summary of the work notes that this is the leap that sets the study apart from earlier critiques: it stops arguing about processing power entirely and instead claims that the universe contains elements computation cannot produce, which is why the authors feel justified in claiming to have proven that the universe cannot be a simulation.

Another account of the same research emphasizes that the team employed Kurt Gödel’s incompleteness theorems, which state that any consistent mathematical system rich enough to describe arithmetic will contain true statements that cannot be proven within that system. The researchers argue that if our universe instantiates such a system in full, then no finite algorithm running on a computer could capture all of its truths, and therefore no computer could host a perfect copy of our cosmos. Reports summarizing the claim say these physicists believe they have settled whether we are living in a simulation by showing that the universe cannot be simulated on any computer, a conclusion widely shared under headlines about physicists settling the question. Whether the logic truly rules out all possible simulation architectures is already being debated, but the assertive language marks a new phase in the argument.

UBC Okanagan’s Platonic realm and the “not a simulation after all” claim

Closely related to the Gödel-based argument is a line of work from researchers at UBC Okanagan, who have framed the issue in terms of a Platonic realm of mathematical objects. In their description, the underlying rules of this Platonic realm may seem similar to those governing a computer simulation, but they insist that the resemblance is superficial and that the universe’s deep structure is not algorithmic in the way a digital program is. One summary of their findings puts it bluntly, saying that new research from UBC Okanagan shows the universe is not a simulation after all, and explaining that even if the rules of the Platonic realm resemble those of a computer simulation, that does not mean the cosmos is literally one.

Popular write-ups of the same work have leaned into the culture war over the simulation idea, with one piece quoting the researchers as saying that the universe being a simulation is not just unlikely but impossible, and that it never was and never will be one. That account notes that if such a simulation were possible, the simulated universe could itself give rise to life that creates further simulations, but that this infinite regress is moot if the starting assumption is wrong and the universe cannot be a simulation in the first place, a line widely shared as evidence that physicists claim to have proof the universe is not a simulation. I see a tension here between the careful mathematical statements in the technical work and the sweeping rhetoric in the popular coverage, which is worth keeping in mind when we hear the word “proof.”

How definitive is “definitive”? Weighing the competing narratives

Even as some researchers declare the simulation hypothesis dead, others caution that the new proofs may not be as final as they sound. One commentary notes that a team of physicists claims to have killed the simulation hypothesis with math, but immediately adds that whether this really settles anything is another question, hinting at the gap between a formal result and its philosophical interpretation. That same account points out that the study has been relayed through outlets that emphasize its dramatic implications while the underlying argument is more modest, leaving open whether the claimed proof settles anything at all.

From my perspective, the clash between Vopson’s information-based arguments and the Gödel-inspired no-simulation proofs highlights a deeper issue about what counts as evidence in this domain. On one side, we have attempts to reinterpret gravity, disease evolution and particle physics as signs of data compression and optimization, framed as hints that we inhabit a carefully coded environment. On the other, we have abstract mathematical theorems about non-computability and Platonic realms, translated into claims that no computer could ever host a perfect copy of our universe. Both camps use the language of proof, but they are operating at very different levels of abstraction, and neither has yet produced the kind of decisive experimental result that would convince a skeptical physicist in the way a new particle detection or a precise measurement of a constant would.

Why the simulation fight still matters

Despite the lack of a universally accepted verdict, the current wave of claims has real consequences for how physicists think about information, computation and the nature of laws. If Vopson is right that gravity and other forces can be reinterpreted as information management, then even if the universe is not literally running on a server somewhere, the tools of computer science may be the best way to describe it, and experiments like those he proposes could reveal new states of matter or hidden conservation laws. If the UBC Okanagan team and the Gödel-based arguments hold up, then the dream of fully simulating the universe, down to every last quantum event, may be fundamentally out of reach, which would reshape expectations in fields from cosmology to quantum computing and temper some of the more exuberant talk about uploading minds or running ancestor simulations.

For now, I think the most honest position is to treat both the “we have proof it is a simulation” and “we have proof it is not” slogans with caution, while paying close attention to the concrete physics and mathematics that underlie them. The fact that serious researchers are willing to tie their reputations to such strong claims, whether by arguing that gravity is an optimizing mechanism, that the Second Law of Infodynamics governs everything, or that Gödel’s theorems rule out any possible cosmic computer, is itself a sign that the simulation question has matured into a legitimate scientific battleground. Whether future experiments and theorems will finally tip the balance one way or the other remains, appropriately enough, an open problem in a universe that may or may not be running on someone else’s machine.
