Morning Overview

A fake disease went viral online, exposing how health misinformation spreads

In 1997, Nathan Zohner, a ninth grader in Idaho Falls, presented a science fair project built around a simple question: Should the government ban a chemical called dihydrogen monoxide? He handed out a fact sheet. The substance, he explained, causes severe burns in its gaseous state, accelerates corrosion of metals, has been found inside tumors removed from cancer patients, and kills thousands of Americans each year when accidentally inhaled. Every claim was true. Dihydrogen monoxide is water. Of the 50 people Zohner surveyed, 43 voted to ban it.

The project, titled “How Gullible Are We?,” earned Zohner a top prize at the Greater Idaho Falls Science Fair. Within months, it had jumped from a regional student competition into national media commentary and a mention in the Congressional Record. Nearly three decades later, the hoax keeps resurfacing online, and the psychological trick behind it has never been more relevant. In April 2026, as public health officials contend with misinformation about H5N1 bird flu and lingering distortions about COVID-19 boosters, the DHMO episode remains a case study in how technically accurate information, stripped of context, can push people toward dangerous conclusions.

The anatomy of a “Zohnerism”

Zohner did not lie. That was the point. He selected real chemical properties of water and repackaged them in the language of a public health warning. “Dihydrogen monoxide” is simply the chemical name for H₂O. Steam does cause severe burns. Drowning does kill thousands. Tumors, like every other human tissue, contain water. By isolating those facts from their ordinary context, Zohner built a dossier that sounded like a case for an emergency ban on a toxic substance.

Columnist James K. Glassman, writing in a widely cited October 1997 Washington Post opinion column, coined the term “Zohnerism” to describe the technique: using a cherry-picked selection of true facts to guide an audience toward a false conclusion. The term never entered mainstream dictionaries, but the tactic it describes has become one of the defining features of modern health misinformation.

The hoax did not fade after 1997. In 2004, city officials in Aliso Viejo, California, nearly placed an item on their council agenda to ban foam cups after a resident circulated a DHMO warning claiming the chemical leached from polystyrene. A staffer caught the error before the vote. The incident, reported by the Los Angeles Times and the Associated Press, showed that the same framing trick could fool not just teenagers but elected officials acting in an official capacity.

Why the trick still works in 2026

The mechanics Zohner exploited have not changed. What has changed is the speed and scale at which misleading health claims travel. A 2021 peer-reviewed study published in Proceedings of the Royal Society A by researchers including Brauer, Holder, and Guo modeled how misinformation affects the trajectory of disease outbreaks. The team compared simulated outcomes for influenza, monkeypox, and norovirus under scenarios with and without widespread false health claims. Their findings were stark: misinformation that reduced vaccine uptake or social distancing compliance consistently produced larger and longer outbreaks in every disease model tested.
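The intuition behind that result can be sketched in a few lines of code. The following is a minimal toy simulation, not the paper's actual model: a basic discrete-time SIR (susceptible-infected-recovered) loop in which misinformation is represented only as lower vaccine uptake. All parameter values are hypothetical and chosen for illustration.

```python
def sir_final_size(pop, beta, gamma, vacc_uptake, days=365):
    """Toy discrete-time SIR model; returns the total number ever infected.

    Simplifying assumptions: vaccination is fully protective, the population
    mixes uniformly, and the outbreak starts from a single seed case.
    """
    s = pop * (1 - vacc_uptake) - 1.0  # susceptible
    i = 1.0                            # infected (seed case)
    for _ in range(days):
        new_infections = beta * s * i / pop  # contacts that transmit
        new_recoveries = gamma * i           # infected who recover
        s -= new_infections
        i += new_infections - new_recoveries
    # Everyone unvaccinated who is no longer susceptible was infected.
    return pop * (1 - vacc_uptake) - s

# Hypothetical flu-like parameters: R0 = beta / gamma = 2.5.
baseline = sir_final_size(100_000, beta=0.5, gamma=0.2, vacc_uptake=0.5)
misinfo = sir_final_size(100_000, beta=0.5, gamma=0.2, vacc_uptake=0.3)
print(f"Total infected at 50% vaccine uptake: {baseline:,.0f}")
print(f"Total infected at 30% vaccine uptake: {misinfo:,.0f}")
```

Even in this stripped-down sketch, dropping uptake from 50 percent to 30 percent roughly doubles the outbreak, because the effective reproduction number rises with the size of the susceptible pool. The published models are far richer, but the direction of the effect is the same.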

The study’s models are simulations, not field observations, and the authors are careful to note that real-world conditions introduce variables no model can fully capture. But the core finding aligns with what public health agencies have documented during actual crises. During the COVID-19 pandemic, the World Health Organization described the parallel spread of false claims as an “infodemic.” During the 2022 mpox outbreak, misleading posts about transmission routes discouraged people from seeking vaccination. In early 2026, as H5N1 avian influenza cases in dairy workers prompted new public health guidance, social media platforms saw a fresh wave of posts misrepresenting the risks of both the virus and proposed countermeasures.

In each case, the underlying pattern mirrors what Zohner demonstrated at a science fair: real data points, stripped of context, assembled into a narrative designed to provoke fear rather than inform decisions. Vaccine adverse-event reports, for example, are public records maintained by systems like the CDC’s Vaccine Adverse Event Reporting System (VAERS). The reports are real. But presenting raw VAERS numbers without noting that the system captures unverified reports, not confirmed side effects, is a textbook Zohnerism.
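One way to see how stripping context changes a number's meaning is to restore the missing denominator. The snippet below uses invented figures, not real VAERS data, to show how the same raw count reads very differently once it is expressed as a rate:

```python
# Illustrative, hypothetical numbers only; not actual VAERS figures.
reports = 12_000                 # raw adverse-event reports (unverified)
doses_administered = 200_000_000 # the denominator a Zohnerism omits

rate_per_million = reports / doses_administered * 1_000_000
print(f"{reports:,} reports")  # alarming when presented in isolation
print(f"{rate_per_million:.0f} reports per million doses administered")
```

The arithmetic is trivial, which is the point: the misleading move is not in the number itself but in withholding the context, such as the denominator and the fact that the reports are unverified, that would let a reader interpret it.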

What the evidence does and does not prove

The historical record of the DHMO hoax is solid. The Washington Post column, the Congressional Record entry, and contemporaneous news coverage of the Aliso Viejo incident all confirm that the prank repeatedly fooled educated adults, not just students. These are established sources, including a federal record and major newspaper reporting, that can be treated as authoritative on the facts of the events themselves.

The peer-reviewed modeling research adds a second layer. The Proceedings of the Royal Society A paper provides a structured, citable framework showing that health misinformation can worsen outbreak outcomes under plausible conditions. Its conclusions are strong analytical support for the broader argument, but they describe simulated scenarios, not direct measurements of how many additional infections resulted from any specific false claim.

What no study has yet established is a direct, quantified causal link between DHMO-style hoaxes and specific real-world health outcomes. No platform analytics data in the public record tracks how often DHMO content has been shared or whether exposure to the hoax primes people to distrust legitimate warnings. When commentators say “we keep falling for dihydrogen monoxide hoaxes,” they are using metaphor. The metaphor is useful, but it is not the same as a documented causal chain.

That distinction matters. The DHMO episode is best understood as an early, unusually clean demonstration of a rhetorical technique that peer-reviewed research has since shown to be dangerous at scale. The school project and the outbreak models occupy different evidentiary lanes. Recognizing the shared mechanism they reveal, without conflating a science fair prank with a pandemic, is the most responsible way to use this story.

How to spot a Zohnerism before it spreads

For readers sorting through alarming health claims in their feeds, the DHMO story offers a practical filter. First, check whether a warning is built from isolated facts or from a balanced presentation of both risks and benefits. A claim that lists only dangers without acknowledging what a substance or treatment actually does is using Zohner’s playbook. Second, look at the sources. Primary research, official records, and statements from credentialed experts carry more weight than screenshots, anonymous posts, or opinion columns presented as news. Third, be skeptical of calls to immediate, drastic action, such as bans, boycotts, or abrupt changes in medical treatment, when those calls rest on information that has not been vetted by independent experts or confirmed by multiple reliable sources.

Nathan Zohner titled his project “How Gullible Are We?” The answer, nearly 30 years later, has not changed much. People still respond to scientific-sounding language, still accept alarming facts without asking what has been left out, and still share warnings before checking whether the danger is real. The difference is that in 1997, a misleading fact sheet could reach 50 people in a school cafeteria. In 2026, it can reach 50 million in an afternoon. The technique is the same. The stakes are not.

*This article was researched with the help of AI, with human editors creating the final content.