Researchers from Caltech and Oratomic, a Caltech-linked startup, published findings on March 31, 2026, arguing that a useful quantum computer capable of running Shor’s algorithm on real cryptographic targets could be built with as few as 10,000 to 20,000 physical qubits. That estimate sits two to three orders of magnitude below the millions to tens of millions of qubits that earlier projections demanded, a shift driven largely by the flexibility of reconfigurable neutral-atom hardware and by advances in error-correcting codes.
From Tens of Millions to Tens of Thousands
For years, the standard assumption held that breaking widely used encryption schemes with a quantum computer would require an enormous machine. Earlier resource analyses, including a 2017 study focused on Toffoli networks for elliptic-curve point addition across NIST curves like P-256, established baseline logical qubit and gate counts that, when translated to physical hardware through surface codes, pointed toward machines with millions of qubits. A separate line of research reinforced that framing, with press materials describing the challenge as needing tens of millions of qubits for Shor-based code-breaking.
The new work from Caltech and Oratomic directly challenges that scale. According to the team’s official announcement, the 10,000 to 20,000 qubit figure represents a reduction of up to two orders of magnitude compared to prior estimates. The key enabler is not a single algorithmic trick but the pairing of high-rate quantum low-density parity-check codes, or qLDPC codes, with the physical reconfigurability that neutral-atom arrays offer over rigid two-dimensional nearest-neighbor chip architectures.
Why Neutral Atoms Change the Math
Most quantum computing platforms, including superconducting circuits, arrange qubits on a fixed grid. Error correction on such architectures typically relies on surface codes, which carry heavy overhead: at useful code distances, hundreds to thousands of physical qubits are needed to encode each logical qubit. The Caltech team’s argument hinges on the fact that neutral-atom processors can physically rearrange their qubits mid-computation by shuttling individual atoms with optical tweezers. That flexibility lets the hardware natively support qLDPC codes, which encode information more efficiently than surface codes but require long-range connections that fixed grids struggle to provide.
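To see how stark that overhead gap is, consider a back-of-envelope comparison. The sketch below uses the textbook approximation of roughly 2d² physical qubits per logical qubit for a surface code at distance d, and a constant-rate qLDPC code modeled on the known [[144,12,12]] bivariate bicycle code; the logical-qubit count and code parameters are illustrative assumptions, not figures from the Caltech paper.

```python
import math

# Back-of-envelope overhead comparison (illustrative assumptions only;
# none of these parameters are taken from the Caltech/Oratomic paper).
# Surface code: ~2*d^2 physical qubits per logical qubit at distance d.
# qLDPC: fixed [[n, k, d]] blocks, here modeled on the known
# [[144, 12, 12]] bivariate bicycle code (144 physical, 12 logical).

def surface_code_physical(logical_qubits: int, distance: int) -> int:
    return logical_qubits * 2 * distance ** 2

def qldpc_physical(logical_qubits: int, n: int = 144, k: int = 12) -> int:
    return math.ceil(logical_qubits / k) * n

logical = 1_000  # a round, assumed logical-qubit budget
print(surface_code_physical(logical, distance=15))  # 450,000
print(qldpc_physical(logical))                      # 12,096
```

Even this crude model lands in the right regimes, hundreds of thousands of physical qubits for surface codes versus roughly ten thousand for a high-rate qLDPC code, before accounting for the ancilla and routing overheads that real estimates include.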
Earlier theoretical work proposed hardware-efficient syndrome extraction using atom rearrangement for high-rate qLDPC codes and reported simulated regimes where the overhead becomes effectively constant, beating surface-code performance at realistic physical error rates. Building on that foundation, the technical paper underlying the Caltech announcement claims that cryptographically relevant instances of Shor’s algorithm can run with roughly 10,000 reconfigurable neutral-atom physical qubits, with a P-256 discrete logarithm computation achievable in a few days using approximately 26,000 physical qubits. RSA-2048, a harder target, would take longer but still falls within the same architectural framework.
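The "few days" figure can be sanity-checked with rough arithmetic. Assuming on the order of 10⁸ logical Toffoli-class operations for a 256-bit discrete logarithm (in line with the independent estimate discussed below), executed largely serially, and a logical cycle time of a few milliseconds (neutral-atom error-correction cycles are slow because atoms must physically move), the runtime lands in the single-digit-day range. All numbers below are assumptions for illustration.

```python
# Rough runtime check (all values are illustrative assumptions, not the
# paper's parameters): ~9e7 Toffoli-class logical operations executed
# mostly serially, with a movement-dominated logical cycle of 1-5 ms.
toffolis = 9e7
for cycle_s in (1e-3, 3e-3, 5e-3):
    days = toffolis * cycle_s / 86_400
    print(f"{cycle_s * 1e3:.0f} ms per logical step -> ~{days:.1f} days")
```

That a crude serial model lands near the reported timescale suggests logical cycle time, at least as much as qubit count, sets the runtime on such machines.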
The dramatic reduction comes from the ability to route entangling operations between distant atoms without being locked into a planar grid. By dynamically reshaping the array, the processor can realize the sparse but long-range connectivity patterns that qLDPC codes require. In effect, the hardware supplies the graph structure that the code prescribes, rather than forcing the code to conform to a nearest-neighbor lattice.
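A toy example makes the graph point concrete. The parity-check matrix below is invented purely for illustration (it is not a code from the paper): each row is a stabilizer check, each column a data qubit, and a 1 means the check must interact with that qubit. Some checks span qubits that sit far apart in any fixed linear or planar ordering, which is exactly the connectivity a movable atom array can supply directly.

```python
import numpy as np

# Invented sparse parity-check matrix (illustration only, not a real code).
# Rows = checks, columns = data qubits; H[i, j] = 1 means check i must
# perform an entangling gate with qubit j during syndrome extraction.
H = np.array([
    [1, 0, 0, 1, 0, 0, 1, 0],
    [0, 1, 0, 0, 1, 0, 0, 1],
    [1, 0, 1, 0, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 0, 1, 0],
])

for i, row in enumerate(H):
    qubits = np.flatnonzero(row).tolist()
    span = max(qubits) - min(qubits)
    print(f"check {i}: qubits {qubits}, span {span}")
```

On a fixed grid, the span-6 checks would need long swap chains, multiplying error opportunities; optical tweezers sidestep that by simply bringing the relevant atoms together.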
Parallel Estimates on Superconducting Hardware
The neutral-atom results do not exist in isolation. An independent resource-estimate paper, also released on arXiv, analyzed Shor’s algorithm applied to ECC-256 and arrived at fewer than 1,200 logical qubits and fewer than 90 million Toffoli gates. Mapped onto a superconducting planar architecture with physical error rates of roughly one in a thousand, that translates to fewer than 500,000 physical qubits. While 500,000 is still a large number, it is dramatically smaller than the multi-million-qubit projections that dominated earlier discussions, and it shows that algorithmic improvements alone, independent of hardware platform, are compressing resource requirements.
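Those figures hang together under standard surface-code accounting. Dividing 500,000 physical by 1,200 logical gives roughly 417 physical qubits per logical qubit, which the common 2d² approximation maps to a code distance near 15; the threshold, prefactor, and scaling formula in the sketch below are textbook ballpark values, not parameters from the paper.

```python
import math

# Back out the implied surface-code distance from the quoted totals,
# using the textbook ~2*d^2 physical-per-logical approximation and the
# common heuristic p_L ~ 0.1 * (p / p_th)**((d + 1) / 2). Threshold and
# prefactor are generic ballpark values, not the paper's parameters.
physical, logical = 500_000, 1_200
per_logical = physical / logical            # ~417
d = math.sqrt(per_logical / 2)              # ~14.4 -> distance ~15
print(f"~{per_logical:.0f} physical/logical -> d ~ {d:.1f}")

p, p_th = 1e-3, 1e-2
p_L = 0.1 * (p / p_th) ** ((15 + 1) / 2)    # ~1e-9 per qubit per round
print(f"implied logical error per round ~ {p_L:.0e}")
# This crude model ignores routing, magic-state factories, and lattice
# surgery scheduling, all of which full resource estimates account for.
```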
The gap between 500,000 physical qubits on a superconducting chip and 10,000 on a neutral-atom system reflects the encoding advantage that reconfigurability provides. On a fixed grid, surface codes demand many physical qubits per logical qubit to maintain fault tolerance. On a reconfigurable atom array, qLDPC codes can achieve the same or better error suppression with far less redundancy, because the hardware can directly implement the code’s sparse parity-check graph. That architectural difference is the central reason the Caltech team’s estimates land so much lower.
Hardware Milestones Already in Hand
These projections are not purely theoretical. Experimental work on neutral-atom logical processors has already demonstrated key building blocks. A peer-reviewed study reported an encoded logical processor using up to 280 qubits, with demonstrations that included scaling surface-code distance, break-even color-code qubits, logical GHZ state preparation, teleportation, and sampling circuits. Those results bridge the gap between paper resource estimates and actual hardware capability, showing that the operations required for fault-tolerant quantum computing can be performed on neutral-atom systems at meaningful scale.
Crucially, that experiment also validated fast and high-fidelity rearrangement of atoms within the array, a prerequisite for any architecture that leans on reconfigurability. Logical operations were executed while atoms were moved to new positions, indicating that the physical shuttling needed for qLDPC-based error correction can be integrated into real control stacks rather than remaining a purely theoretical construct.
The commercial ecosystem around neutral-atom quantum computing is also expanding. Atom Computing has partnered with Microsoft to integrate its Phoenix system, which features stable nuclear-spin qubits, into a broader effort to scale toward fault-tolerant systems. Other companies are pursuing similar architectures, betting that neutral atoms’ long coherence times and flexible connectivity will provide a smoother path to large-scale, error-corrected machines than more rigid chip-based platforms.
Timelines, Caveats, and Cryptographic Impact
Even with reduced qubit counts, the road to a cryptographically relevant quantum computer remains steep. The Caltech estimates assume physical error rates low enough for high-rate qLDPC codes to operate effectively, as well as control electronics and laser systems capable of orchestrating tens of thousands of atoms with tight timing precision. Scaling from a few hundred to tens of thousands of qubits will test everything from vacuum engineering to calibration software.
Still, the shift from “tens of millions” to “tens of thousands” has clear implications for cybersecurity planning. If neutral-atom platforms or improved superconducting designs can reach the required scale within one or two decades, then widely deployed schemes based on RSA and elliptic-curve cryptography face a shorter safety window than many roadmaps assumed. Standards bodies have already begun transitioning toward quantum-resistant algorithms, but the new resource estimates add weight to calls for accelerating that migration.
At the same time, uncertainty remains high. Resource numbers depend sensitively on assumptions about gate fidelities, parallelization, and classical pre- and post-processing. Small changes in those parameters can shift qubit counts and runtimes by factors of several. Moreover, engineering realities (such as crosstalk between tightly packed atoms or thermal constraints in large cryogenic systems) may introduce bottlenecks that current models do not fully capture.
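The sensitivity is easy to illustrate with the same textbook heuristic used above: holding a fixed (assumed) logical error budget, halving or doubling the physical error rate moves the required code distance, and with it the per-logical-qubit overhead, by large factors. Every number below is an assumption chosen for illustration, not a parameter from the papers.

```python
# Illustrative sensitivity sweep (assumed parameters, not from the papers):
# required surface-code distance and overhead versus physical error rate p,
# for a fixed per-qubit, per-round logical error budget.
target_pL = 1e-12
p_th = 1e-2  # assumed threshold

for p in (5e-4, 1e-3, 2e-3):
    d = 3
    while 0.1 * (p / p_th) ** ((d + 1) / 2) > target_pL:
        d += 2  # surface-code distances are odd
    print(f"p = {p:.0e}: d = {d}, ~{2 * d * d} physical qubits per logical")
```

In this toy model, a fourfold change in physical error rate roughly triples the per-logical-qubit overhead, echoing the "factors of several" sensitivity noted above.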
Caltech’s Role and Access to the Ecosystem
Caltech’s involvement in both the theoretical and experimental sides of neutral-atom quantum computing underscores its broader institutional push into quantum information science. Researchers, students, and collaborators work within a dense campus network of labs and centers, supported by shared facilities and centralized services; prospective team members can find open research, engineering, and technical staff roles through the institute’s careers portal.
For those already affiliated with the campus, internal tools such as the single sign-on gateway provide entry points to the computational resources, documentation, and collaboration platforms that underpin day-to-day quantum research, and the faculty, staff, and students behind the recent papers can be reached through the public online directory.
Taken together, the new results from Caltech and Oratomic, the parallel superconducting estimates, and the latest neutral-atom hardware demonstrations all point in the same direction: the resource gap between today’s prototypes and tomorrow’s code-breaking machines is shrinking faster than many anticipated. The precise timelines remain uncertain, and the technical barriers are increasingly matters of engineering scale-up rather than fundamental impossibilities. For cryptographers, policymakers, and technologists alike, that is a signal that the quantum era of practical, large-scale computation is no longer a distant abstraction. It is a concrete target defined in tens of thousands of qubits, not tens of millions.
*This article was researched with the help of AI, with human editors creating the final content.