Study: 10,000 qubits could crack public-key encryption sooner than expected

Researchers affiliated with Caltech and the quantum computing startup Oratomic have published a preprint claiming that Shor’s algorithm, the quantum algorithm capable of breaking widely used public-key encryption, could run on as few as 10,000 reconfigurable atomic qubits. That figure represents a dramatic drop from earlier projections that placed the threshold at millions of qubits, compressing the timeline for when quantum machines might threaten the cryptographic systems that protect banking transactions, government communications, and everyday internet traffic.

From Millions to Thousands of Qubits

For years, the conventional wisdom held that cracking standard encryption with a quantum computer would require hardware far beyond anything on the horizon. A widely cited 2019 resource estimate calculated that factoring a 2048-bit RSA integer would demand 20 million noisy qubits running for eight hours under a surface-code error-correction model. That number was large enough to give governments and enterprises a comfortable buffer. No lab was close to building such a machine, so migration to quantum-resistant cryptography felt like a problem for the next decade.

Subsequent work chipped away at that estimate. A 2025 arXiv preprint by Craig Gidney of Google Quantum AI reduced the requirement to fewer than one million noisy qubits for the same RSA-2048 problem by applying updated surface-code and error-rate assumptions. That alone was a significant step, but the new preprint goes much further. The Caltech and Oratomic team reports that, by switching to a reconfigurable neutral-atom architecture, a cryptographically relevant implementation of Shor’s algorithm becomes feasible with roughly 10,000 physical qubits. The paper also includes runtime estimates for computing a discrete logarithm on the P-256 elliptic curve, a standard widely used in TLS certificates and digital signatures.

Why Neutral Atoms Change the Math

The difference between 20 million qubits and 10,000 is not just a matter of degree. It reflects a shift in the underlying hardware assumptions. Superconducting qubit platforms, the kind operated by several major tech companies, rely on error-correction schemes that demand enormous overhead: at cryptographically relevant error rates, hundreds or even thousands of physical qubits can be needed to produce a single reliable logical qubit. Neutral-atom systems, by contrast, use individually trapped atoms that can be physically rearranged during computation. This reconfigurability allows more flexible connectivity between qubits and, according to the preprint’s resource estimates, slashes the total qubit count needed for a given algorithm.
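
To make the scale of that overhead concrete, a common rule of thumb for surface codes is that a distance-d logical qubit consumes on the order of 2d² physical qubits. The sketch below applies that rule to a few illustrative combinations of logical-qubit counts and code distances; the specific numbers are assumptions chosen for illustration, not figures taken from the preprint or from the earlier resource estimates.

```python
# Back-of-the-envelope surface-code overhead. The logical-qubit counts
# and code distances below are illustrative assumptions, not parameters
# from the preprint or from the 2019/2025 resource estimates.

def physical_qubits(logical_qubits: int, code_distance: int) -> int:
    """Approximate physical-qubit count for a surface-code machine.

    A distance-d surface-code patch uses roughly 2 * d**2 physical
    qubits (data plus measurement qubits) per logical qubit.
    """
    return logical_qubits * 2 * code_distance ** 2


for logical, distance in [(6000, 27), (6000, 15), (1500, 9)]:
    total = physical_qubits(logical, distance)
    print(f"{logical:>5} logical qubits at distance {distance:>2} "
          f"-> ~{total:,} physical qubits")
```

Even modest changes in the assumed code distance or logical register size move the total by an order of magnitude, which is why updated error-rate assumptions and architectural choices can shift headline qubit counts so dramatically.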

In the neutral-atom model described by the authors, qubits are arranged in two-dimensional arrays and shuttled into proximity for entangling operations. Because atoms can be moved rather than relying on fixed wiring, the architecture can implement long-range gates without the overhead of routing information through chains of intermediate qubits. The paper argues that this reduces both the depth of the quantum circuit and the number of ancillary qubits required for arithmetic subroutines inside Shor’s algorithm, leading to the 10,000-qubit headline figure for breaking commonly used cryptographic primitives.
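
A toy gate count illustrates why that matters. On a fixed nearest-neighbor layout, entangling two distant qubits means routing one operand through a chain of SWAPs, at three two-qubit gates each, before the desired gate can be applied; with movable atoms, the traps are shuttled adjacent and a single native gate suffices. The sketch below is a simplified illustration of that routing argument, not the resource model used in the preprint, and it ignores the time cost of the atom movement itself.

```python
# Toy comparison of two-qubit-gate counts for one long-range interaction.
# A simplified illustration of the routing argument, not the preprint's
# resource model; the time spent shuttling atoms is ignored here.

def fixed_grid_gate_count(separation: int) -> int:
    """Gates to entangle qubits `separation` sites apart on a fixed
    nearest-neighbor layout: (separation - 1) SWAPs at three CNOTs
    each to bring the operands together, then the target CNOT."""
    return 3 * (separation - 1) + 1


def movable_atom_gate_count(separation: int) -> int:
    """With reconfigurable traps, the atoms are moved next to each other
    and a single native entangling gate is applied, at any distance."""
    return 1


for separation in (2, 8, 32):
    print(f"separation {separation:>2}: "
          f"fixed grid ~{fixed_grid_gate_count(separation):>2} gates, "
          f"movable atoms {movable_atom_gate_count(separation)} gate")
```

The trade-off is that physically moving atoms costs time rather than gates, one reason runtime estimates matter as much as raw qubit counts when judging such proposals.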

The startup Oratomic has positioned itself as the commercial vehicle for this approach. In its launch announcement, the company describes plans to build utility-scale neutral-atom hardware explicitly aimed at cryptographically relevant workloads. Oratomic frames the 10,000-qubit threshold as a direct challenge to the “millions of qubits” narrative that has long defined the field and ties its founding mission to the Caltech-affiliated research team behind the new preprint.

Even so, the gap between a resource estimate and a functioning machine remains vast. The preprint assumes gate fidelities and error-correction performance that no existing neutral-atom system has yet demonstrated at the required scale. Today’s experimental platforms typically involve hundreds of atoms, not tens of thousands, and maintaining precise control over larger arrays will demand advances in laser stability, vacuum engineering, and classical control electronics. Still, the research reframes the engineering target. Building a 10,000-qubit neutral-atom processor is a qualitatively different challenge than assembling 20 million superconducting qubits, and it is close enough to current experimental regimes that it can plausibly shape roadmaps.

What This Means for Encryption Standards

The practical question is whether the organizations that depend on RSA and elliptic-curve cryptography have enough time to switch. The U.S. National Institute of Standards and Technology has been working on that problem for years. In August 2024, NIST approved three Federal Information Processing Standards for post-quantum cryptography (FIPS 203, FIPS 204, and FIPS 205), formalizing new quantum-resistant algorithms for encryption and digital signatures.

Those standards are the first fruits of a multiyear effort. NIST’s ongoing post-quantum standardization program continues to evaluate additional candidates for both primary and backup roles. A recent status report on the fourth round of that process confirmed the selection of the HQC key-encapsulation scheme as an alternate option, part of a deliberate strategy to maintain algorithmic diversity in case vulnerabilities are discovered in any one design.
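
For readers unfamiliar with what a key-encapsulation mechanism actually does, the sketch below runs a minimal encapsulation round trip using the open-source liboqs Python bindings (liboqs-python), which expose ML-KEM (FIPS 203) alongside alternates such as HQC. The library and the exact algorithm string are assumptions about the reader's environment rather than anything specified in the article or the NIST standards; older liboqs builds expose the pre-standardization scheme under the name "Kyber768" instead.

```python
# Minimal key-encapsulation round trip using the liboqs Python bindings;
# shown only to illustrate what a KEM such as ML-KEM or HQC does.
# The algorithm string depends on the installed liboqs version.
import oqs

KEM_ALG = "ML-KEM-768"  # FIPS 203 parameter set; older builds use "Kyber768"

with oqs.KeyEncapsulation(KEM_ALG) as receiver:
    # Receiver generates a keypair and publishes the public key.
    public_key = receiver.generate_keypair()

    with oqs.KeyEncapsulation(KEM_ALG) as sender:
        # Sender derives a shared secret plus a ciphertext that only
        # the holder of the matching private key can decapsulate.
        ciphertext, secret_at_sender = sender.encap_secret(public_key)

    # Receiver recovers the same shared secret from the ciphertext.
    secret_at_receiver = receiver.decap_secret(ciphertext)

assert secret_at_sender == secret_at_receiver
```

During the transition, hybrid deployments typically pair a post-quantum KEM like this with a classical exchange such as X25519, so the connection stays secure as long as either component holds.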

The new 10,000-qubit estimate adds urgency to this migration. Cryptographic transitions are notoriously slow, often taking a decade or more to fully propagate through legacy systems, embedded devices, and supply-chain software. Many organizations still rely on RSA-2048 or P-256 for core functions such as TLS handshakes, VPN tunnels, and code-signing infrastructures. If a viable quantum attack on those schemes requires thousands rather than millions of qubits, the comfortable assumption that quantum threats lie far in the future becomes harder to defend.

Security planners must also contend with “harvest now, decrypt later” strategies, in which adversaries record encrypted traffic today with the expectation of decrypting it once quantum hardware matures. Long-lived secrets, such as diplomatic cables, health records, or industrial designs, are particularly exposed. Even if a 10,000-qubit neutral-atom machine remains years away, data with a comparable or longer sensitivity window should already be considered at risk under conservative threat models.
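
A standard way to reason about this exposure, often summarized as Mosca's inequality, compares how long data must stay secret, how long migration will take, and how far away a cryptographically relevant quantum computer is believed to be. The sketch below encodes that comparison with purely hypothetical planning inputs; none of the year values come from the preprint, from NIST, or from this article.

```python
# Mosca-style risk check for "harvest now, decrypt later" exposure.
# All year values below are hypothetical planning inputs.

def recorded_traffic_at_risk(shelf_life_years: float,
                             migration_years: float,
                             years_to_quantum_attack: float) -> bool:
    """Data recorded today is exposed if its required secrecy lifetime,
    counted from the end of the migration period, outlasts the time
    remaining before a practical quantum attack."""
    return shelf_life_years + migration_years > years_to_quantum_attack


scenarios = [
    ("TLS session data", 1, 5, 15),
    ("Health records", 25, 5, 15),
    ("Diplomatic cables", 40, 8, 15),
]
for name, shelf, migrate, horizon in scenarios:
    status = "at risk" if recorded_traffic_at_risk(shelf, migrate, horizon) else "ok"
    print(f"{name:<17} shelf={shelf:>2}y migration={migrate}y "
          f"horizon={horizon}y -> {status}")
```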

Caveats and Open Questions

The preprint has not yet undergone formal peer review, and its resource estimates rest on assumptions about gate fidelities, connectivity, and error rates that no existing neutral-atom system has demonstrated at scale. By contrast, the earlier 20-million-qubit and sub-million-qubit projections were derived using surface-code models that have been studied extensively in both theory and experiment, lending them a degree of robustness even if the absolute numbers remain uncertain.

There is also a tension between the two earlier estimates that the new paper positions itself against. The 2019 analysis that yielded the 20-million-qubit figure made conservative assumptions about error rates and code distances, effectively building in a safety margin for real-world imperfections. Gidney’s later work tightened those assumptions based on more optimistic projections for hardware performance and more efficient circuit constructions, which is how it arrived at the sub-million-qubit range for RSA-2048. The neutral-atom proposal pushes that optimism further, arguing that architectural advantages can dramatically shrink overhead, but it does so in a parameter regime that has not yet been empirically validated.

Another open question is how generalizable the 10,000-qubit figure will prove to be. The preprint focuses on specific cryptographic targets (factoring RSA-2048 and solving discrete logarithms on P-256) under particular algorithmic choices. Alternative implementations of Shor’s algorithm, different error-correcting codes, or hardware constraints not fully captured in the model could change the required resources by orders of magnitude in either direction. Independent teams will need to reproduce and stress-test the analysis before policymakers treat the number as a hard benchmark.

Finally, the mere existence of an aggressive resource estimate does not guarantee that attackers will prioritize building such a machine. Quantum hardware is expensive and fragile, and many of the same capabilities required to break encryption could also be used for commercially valuable tasks in chemistry, materials science, or optimization. Whether nation-states or other actors choose to invest in cryptanalytic quantum computers at scale will depend on strategic calculations that extend beyond technical feasibility.

A Compressed but Uncertain Timeline

For now, the Caltech–Oratomic preprint should be read less as a prediction that 10,000-qubit neutral-atom machines are imminent and more as a signal that the upper bound on how many qubits are needed to threaten today’s cryptography is moving downward. Together with NIST’s post-quantum standards and ongoing work on backup algorithms, it underscores a broader message. Waiting for definitive proof that a large-scale quantum computer exists before beginning migration is a risky strategy.

Enterprises and governments already have a roadmap for deploying quantum-resistant schemes, and the latest research suggests that roadmap should be treated as a near-term imperative rather than a distant precaution. Even if the 10,000-qubit threshold ultimately proves too optimistic, planning for that scenario now is likely to be far less costly than scrambling to replace broken cryptography after a practical quantum attack has arrived.
