When “Mathematically Perfect” Becomes Scientifically Hollow
By Brian Simpson
This month, JAMA Health Forum published a paper with a headline claim almost too neat to challenge: COVID-19 vaccines, the authors declared, saved between 1.4 million and 4 million lives worldwide. It was an elegant piece of arithmetic, clear enough to impress, authoritative enough to reassure. But is elegant maths enough to justify sweeping public-health triumphalism?
Denis Rancourt, Ph.D., an independent researcher with Correlation, says no. In a new preprint, he dissects the methods behind the JAMA estimate and concludes that the glossy numbers are built on sand.
The formula itself — infection fatality rate × vaccine efficacy × coverage — is simple. So simple, in fact, that it lulls the reader into believing that precision has been achieved. But as Rancourt notes, "garbage in, garbage out." If the inputs are suspect, then the result, no matter how clean the multiplication, is fiction dressed up as fact.
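To see how sensitive such a product is to its inputs, consider a minimal back-of-the-envelope sketch in Python. The numbers, and the extra infections term needed to turn the product into a count, are illustrative assumptions, not figures from the JAMA paper; the point is only that modest shifts in each uncertain factor multiply through to a very different headline.

```python
# Illustrative only: a toy "lives saved" product of the form
# IFR x vaccine efficacy x coverage x infections, with made-up inputs.
# The infections term is an assumption added so the product yields a
# count; none of these values are taken from the JAMA paper.

def lives_saved(ifr, efficacy, coverage, infections):
    """Toy estimate: deaths averted = IFR * VE * coverage * infections."""
    return ifr * efficacy * coverage * infections

# Hypothetical central values
central = lives_saved(ifr=0.005, efficacy=0.9, coverage=0.7,
                      infections=1_000_000_000)

# The same formula with each input nudged toward a plausible lower bound
low = lives_saved(ifr=0.002, efficacy=0.6, coverage=0.6,
                  infections=600_000_000)

print(f"central: {central:,.0f} deaths averted")  # ~3,150,000
print(f"low:     {low:,.0f} deaths averted")      # ~432,000

# Modest shifts in each uncertain input change the headline figure
# several-fold: the arithmetic is exact, but only as good as its inputs.
```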
And those inputs are suspect.
Infection fatality rates were derived from antibody surveys, despite questions about test validity and calibration. In an emergency environment, manufacturers rushed out kits with little independent verification. If the seroprevalence numbers wobble, the fatality estimates wobble with them.
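To make that dependence explicit, here is the usual back-of-the-envelope relationship between seroprevalence and the infection fatality rate, written with generic symbols rather than the paper's notation; a sketch of the standard derivation, not the authors' exact method:

```latex
\mathrm{IFR} = \frac{\text{deaths attributed to infection}}{\text{estimated infections}} = \frac{D}{s \cdot N}
```

where s is the measured seroprevalence and N the surveyed population. A test that misses half of past infections halves s and so doubles the estimated IFR, and every downstream "lives saved" figure built on that IFR inherits the error.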
Vaccine efficacy was lifted straight from manufacturer-sponsored trials. Yet even John Ioannidis, lead author of the JAMA study, has long warned of the unreliability of industry-funded research. Suddenly, when the product is Pfizer's, scepticism evaporates? That's convenient.
Harms from vaccines were excluded altogether. The JAMA team acknowledged this explicitly, waving away the omission with the assumption that benefits vastly outweigh risks. That isn't science; that's presumption.
This is the "mental game" Rancourt describes: plug uncertain numbers into a tidy equation, generate an impressive headline figure, and let the world believe millions of lives were rescued. It's mathematically perfect, yet scientifically hollow.
Why does this matter? Because these glossy claims shape global health policy, reinforce mandates, and justify unprecedented spending. When the foundation rests on assumptions rather than hard evidence, when real-world all-cause mortality data are ignored, we risk building castles on quicksand.
The tragedy is that public trust in science has already been frayed by years of shifting narratives, rushed approvals, and conflicts of interest. Papers like the one in JAMA don't repair that trust; they erode it further.
Rancourt's critique is blunt: "There is no reason to believe that any lives were saved." Whether one agrees fully or not, his challenge highlights the need for true transparency. The scientific enterprise should welcome scrutiny, not shy from it. The stakes are too high for anything less.
In the end, numbers alone don't tell the story. The assumptions behind them do. And when those assumptions are shaky, no amount of mathematical elegance can cover the cracks.