Morning Overview

Quantum computing in 2026: what it can do, what it can’t, and who is actually using it

In December 2024, a team at Google published a result in Nature that physicists had been chasing for nearly three decades: a surface-code quantum error correction experiment that pushed error rates below the critical threshold needed for scalable, fault-tolerant quantum computing. A few months earlier, the National Institute of Standards and Technology had finalized the first cryptographic standards built to withstand quantum attacks. By mid-2026, those two developments have reshaped the conversation around quantum technology. The hardware still cannot solve commercial problems that classical supercomputers cannot. But the scramble to protect data from future quantum machines is no longer theoretical. It is producing real deadlines, real migration costs, and real compliance pressure across governments and industries worldwide.

What the hardware can actually do

Two experimental results define the current frontier of quantum hardware, and both are more modest than the hype surrounding them suggests.

The Google-led study, published in Nature in December 2024, demonstrated that a surface-code error correction scheme could suppress errors below the threshold physicists long considered a prerequisite for building reliable quantum machines. In practical terms, this means that adding more physical qubits to the system reduced logical errors rather than compounding them. Every prior generation of quantum processors had hit the opposite wall: more qubits meant more noise, which capped the complexity of any algorithm the machine could run. Crossing that threshold in a controlled experiment was a genuine milestone.
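
To make "below threshold" concrete, the sketch below encodes the textbook surface-code scaling relation, in which the logical error rate per cycle falls roughly as (p/p_th)^((d+1)/2) once the physical error rate p sits under the threshold p_th. The specific rates and threshold value here are illustrative assumptions, not figures from the Nature paper.

```python
# Toy illustration of the surface-code scaling relation: below threshold,
# the logical error rate per cycle falls roughly as (p / p_th) ** ((d + 1) / 2),
# so growing the code distance d (i.e., adding physical qubits) suppresses
# errors exponentially. All numbers below are illustrative assumptions.

def logical_error_rate(p_physical, p_threshold, distance):
    """Approximate logical error rate per cycle for a distance-d surface code."""
    return (p_physical / p_threshold) ** ((distance + 1) / 2)

def physical_qubits(distance):
    """Rough physical-qubit cost of one logical qubit (data + syndrome qubits)."""
    return 2 * distance ** 2 - 1

p, p_th = 0.003, 0.01  # assumed physical error rate safely under an assumed threshold

for d in (3, 5, 7, 9):
    print(f"d={d}: ~{physical_qubits(d)} physical qubits, "
          f"logical error ~{logical_error_rate(p, p_th, d):.1e} per cycle")
```

Run the same formula with p above p_th and the error rate grows with distance instead of shrinking, which is the wall earlier generations of processors hit.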

A separate experiment, detailed in a preprint posted to arXiv in early 2024, showed logical qubits outperforming their raw physical counterparts across repeated rounds of error correction. That result drew a sharper line between the noisy, limited machines of the past decade and the beginnings of corrected, reliable operation. (The paper has not yet appeared in a peer-reviewed journal as of June 2026, so its findings, while promising, carry the usual caveats of preprint research.)

Neither result translates into a machine that can outperform classical supercomputers on real commercial workloads. Both experiments used modest numbers of qubits under tightly controlled laboratory conditions, with extensive calibration. They validate core ingredients of a future large-scale quantum architecture, but the architecture itself does not yet exist outside roadmap slides.

What it still cannot do

No quantum computer available in 2026 can break modern encryption. That point deserves emphasis because it is the source of the most persistent public confusion about the technology.

Running Shor’s algorithm to factor the large numbers that underpin RSA encryption at cryptographically relevant key sizes would require millions of stable, error-corrected qubits. The most advanced machines today operate with roughly 1,000 to 1,500 physical qubits, and the ratio of physical qubits to usable logical qubits remains steep. Industry roadmaps from IBM, Google, and others project fault-tolerant machines arriving somewhere between the late 2020s and the mid-2030s, but those timelines are aspirational. They lack the experimental grounding of the published error correction results and should be read as goals, not forecasts.
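
The arithmetic behind that steep ratio is easy to sketch. One surface-code logical qubit of distance d consumes roughly 2d² physical qubits, and widely cited resource estimates for factoring RSA-2048 (Gidney and Ekerå) land around 20 million physical qubits in total. The figures below are those ballpark assumptions, not measurements or forecasts.

```python
# Back-of-the-envelope arithmetic behind the "steep ratio" point. The code
# distance (d = 27) and the ~20 million total-qubit figure echo published
# resource estimates for factoring RSA-2048 (Gidney & Ekerå); treat them as
# ballpark assumptions, not a forecast.

SURFACE_CODE_DISTANCE = 27               # assumed distance for cryptographically relevant error rates
PHYSICAL_QUBITS_TODAY = 1_500            # upper end of today's largest processors
RSA_2048_PHYSICAL_ESTIMATE = 20_000_000  # widely cited whole-machine estimate

per_logical = 2 * SURFACE_CODE_DISTANCE ** 2 - 1   # data + syndrome qubits per logical qubit
logical_today = PHYSICAL_QUBITS_TODAY // per_logical

print(f"~{per_logical} physical qubits per logical qubit at d={SURFACE_CODE_DISTANCE}")
print(f"A {PHYSICAL_QUBITS_TODAY}-qubit machine hosts about {logical_today} logical qubit(s) of that quality")
print(f"Factoring RSA-2048 is estimated to need roughly "
      f"{RSA_2048_PHYSICAL_ESTIMATE // PHYSICAL_QUBITS_TODAY:,}x more physical qubits "
      f"than any machine has today")
```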

The commercial applications most often cited for quantum computing, such as drug discovery, materials science, financial optimization, and logistics, remain in an exploratory phase. Pharmaceutical companies, banks, and logistics firms have announced partnerships with quantum hardware providers, but no peer-reviewed study or independent audit has documented a case where a quantum or hybrid quantum-classical workflow outperformed a purely classical approach on a real, end-to-end business problem. IBM’s utility-scale experiments on its Heron processors have shown that quantum circuits can produce results that are difficult to simulate classically, but “difficult to simulate” is not the same as “commercially useful.” Until verified results confirm measurable advantages like reduced costs, higher yields, or faster time-to-market, the commercial case for quantum computing rests on expectation rather than evidence.

Who is actually using quantum technology right now

The most tangible, widespread use of quantum-related technology in 2026 is not quantum computing at all. It is post-quantum cryptography: new encryption algorithms designed to protect data against the quantum machines that do not yet exist but someday might.

NIST finalized its first three post-quantum cryptographic standards in August 2024: FIPS 203 (ML-KEM, a key-encapsulation mechanism derived from CRYSTALS-Kyber), FIPS 204 (the ML-DSA digital signature algorithm, derived from CRYSTALS-Dilithium), and FIPS 205 (the SLH-DSA digital signature algorithm, based on SPHINCS+). These are not drafts or recommendations. They are binding technical specifications, and they have already triggered compliance requirements across the U.S. federal government and its contractor ecosystem. The National Security Agency and the Office of Management and Budget have set migration timelines that treat post-quantum cryptography as a near-term obligation, not a distant aspiration.
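
For a sense of what the key-encapsulation standard looks like in practice, here is a minimal ML-KEM round trip sketched with the Open Quantum Safe project's liboqs-python bindings. The package, the "ML-KEM-768" algorithm string, and its availability in your build are assumptions about the environment, not part of the standard itself.

```python
# Minimal ML-KEM (FIPS 203) key-encapsulation round trip, sketched with the
# Open Quantum Safe liboqs-python bindings. Assumes `pip install liboqs-python`
# and a liboqs build that exposes the "ML-KEM-768" algorithm name (older
# releases use "Kyber768").
import oqs

ALG = "ML-KEM-768"

with oqs.KeyEncapsulation(ALG) as receiver, oqs.KeyEncapsulation(ALG) as sender:
    public_key = receiver.generate_keypair()                       # receiver publishes a public key
    ciphertext, secret_sender = sender.encap_secret(public_key)    # sender derives a shared secret
    secret_receiver = receiver.decap_secret(ciphertext)            # receiver recovers the same secret
    assert secret_sender == secret_receiver
    print(f"Shared {len(secret_sender)}-byte secret established with {ALG}")
```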

Several major technology companies have moved beyond pilots into production deployment. Cloudflare enabled post-quantum key agreement across its network. Google integrated post-quantum algorithms into Chrome’s TLS implementation. Apple rolled out PQ3, a post-quantum protocol, for iMessage. Signal adopted the PQXDH protocol for its encrypted messaging. These are not press releases about future plans. They are shipping products protecting real user traffic today.

The urgency behind these deployments stems from a threat model known as “harvest now, decrypt later.” Adversaries, including nation-state intelligence services, can intercept and store encrypted communications today with the expectation that a future quantum computer will be able to break the encryption and read the data. For information that must remain confidential for a decade or more, such as diplomatic cables, health records, trade secrets, and weapons designs, the threat is not hypothetical. It is a storage problem with a long fuse. That is why governments are not waiting for quantum computers to arrive before mandating the switch to quantum-resistant encryption.
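
A common way to reason about this, often attributed to Michele Mosca, is a simple inequality: if the years the data must stay secret plus the years a migration will take exceed the years until a cryptographically relevant quantum computer arrives, data harvested today is already exposed. The sketch below only encodes that arithmetic; the year figures are placeholders, not predictions.

```python
# "Harvest now, decrypt later" in one inequality (often attributed to Michele Mosca):
# if shelf_life + migration_time > years_until_quantum_attack, data captured today
# is already exposed. All numbers below are placeholders, not predictions.

def at_risk(shelf_life_years, migration_years, years_until_relevant_qc):
    return shelf_life_years + migration_years > years_until_relevant_qc

examples = [
    ("Marketing web traffic",  1, 3, 10),
    ("Health records",        25, 5, 10),
    ("Diplomatic cables",     50, 5, 10),
]

for label, shelf, migrate, horizon in examples:
    status = "at risk today" if at_risk(shelf, migrate, horizon) else "likely fine for now"
    print(f"{label}: {status}")
```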

The geopolitical dimension

Quantum computing is also a national security competition, and the United States is not the only country investing heavily. China has committed billions of dollars to quantum research and has demonstrated notable results in quantum communication, including satellite-based quantum key distribution networks. The Chinese Academy of Sciences has published competitive results in photonic quantum computing, and Beijing has framed quantum technology as a strategic priority in its five-year plans.

The European Union launched its Quantum Technologies Flagship program with over one billion euros in funding. The United Kingdom, Japan, South Korea, and Australia have all established national quantum strategies with dedicated funding. This global race adds a layer of urgency to the U.S. effort: falling behind in quantum hardware could have long-term consequences for cryptographic security, scientific competitiveness, and military capability, even if the machines themselves are years away from practical dominance.

What organizations should do now

For most organizations, the practical priority in 2026 is cryptography, not hardware.

Start by inventorying cryptographic dependencies across systems and supply chains. Identify which protocols rely on algorithms vulnerable to quantum attack, particularly RSA and elliptic-curve schemes used in TLS, VPNs, code signing, and internal authentication. Map those dependencies against the finalized NIST standards. Agencies and contractors with federal compliance obligations should treat this inventory as an active project: the standards are final, migration guidance is published, and deadlines are approaching.
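
As one starting point for such an inventory, a short script can flag certificates whose public keys rest on quantum-vulnerable algorithms. This sketch assumes the `cryptography` package and a local directory of PEM-encoded certificates; a real inventory would also probe live TLS endpoints, VPN configurations, and code-signing keys.

```python
# Rough first pass at a cryptographic inventory: walk a directory of PEM
# certificates and flag public keys that rely on quantum-vulnerable algorithms
# (RSA, elliptic-curve, Ed25519). Assumes the `cryptography` package and .pem
# files under a local certs/ directory; both are environment assumptions.
from pathlib import Path

from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import ec, ed25519, rsa

def classify(cert: x509.Certificate) -> str:
    key = cert.public_key()
    if isinstance(key, rsa.RSAPublicKey):
        return f"quantum-vulnerable (RSA-{key.key_size})"
    if isinstance(key, ec.EllipticCurvePublicKey):
        return f"quantum-vulnerable (ECC {key.curve.name})"
    if isinstance(key, ed25519.Ed25519PublicKey):
        return "quantum-vulnerable (Ed25519)"
    return f"review manually ({type(key).__name__})"

for pem in Path("certs").glob("**/*.pem"):
    cert = x509.load_pem_x509_certificate(pem.read_bytes())
    print(f"{pem}: {cert.subject.rfc4514_string()} -> {classify(cert)}")
```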

Prioritize long-lived assets first: data that must remain confidential for a decade or more, embedded systems that are difficult to update, and public-key infrastructures that anchor trust across multiple business units. Pilot deployments of post-quantum algorithms in non-critical environments can surface performance and interoperability issues before they affect production systems. Governance teams should document every step, both to satisfy regulators and to give internal stakeholders a clear picture of how the organization is managing quantum-related risk.

On the hardware side, most businesses do not need to take operational action yet. The error correction results are real scientific progress, but they describe capabilities that exist inside physics laboratories, not inside data centers. Organizations can track developments, participate in limited research pilots where they align with innovation goals, and build internal literacy about quantum algorithms. None of that requires purchasing quantum computing access or assuming that near-term revenue depends on it.

Where the field goes from here

The honest summary of quantum computing in mid-2026 is a field defined by a sharp split. On one side, post-quantum cryptography has moved from research into regulation and deployment, with real products protecting real data. On the other, quantum hardware has validated essential building blocks for future fault-tolerant machines but has not yet delivered broadly useful computational advantage on any real-world workload.

The most grounded strategy for any organization is to match that split: move decisively on cryptographic migration, move deliberately on hardware engagement, and let the growing body of peer-reviewed evidence, not vendor roadmaps or hype cycles, set expectations for what comes next. The quantum future is coming, but it is arriving unevenly, and the part that demands action right now has less to do with computing power than with protecting the data you already have.


*This article was researched with the help of AI, with human editors creating the final content.