Friedrich Nietzsche, the German philosopher, famously said, “There are no facts, only interpretations.” Today, this seems truer than ever. Two people can take the same set of data and interpret it in completely different ways. As the economics Nobel laureate Ronald H. Coase said, “If you torture the data long enough, it will confess to anything,” and that is pretty much what is happening, whether the debate is about youth unemployment in India or the temperature rise due to climate change. The issue itself matters less than the competing interpretations, which create a cacophony that drowns out any ability to have a nuanced conversation around it.
Facts. Counter Facts. Alternate Facts. Somewhat Facts. Non-Facts. Most of us grapple with telling the difference between what is true and what is not. For example, is coffee good for you or not? Is a glass of wine a day a great thing, or not? Is breakfast the most important meal of the day or not? And the answer is that we don’t know. The answer will depend on the research that you have read. And that in turn will depend on the interest groups — including funders — associated with the research.
The digital age has intensified the challenge of discerning truth, particularly as digital platforms, chasing eyeballs, become hotbeds for misinformation. A notable instance is Andrew Wakefield's 1998 study, published in The Lancet, which erroneously linked the MMR vaccine to autism. Cited in over 750 other research papers and on countless news platforms, Wakefield's paper became the foundation for those fighting against mass inoculation. Despite swift academic critique, it lingered for over a decade before retraction, meanwhile fuelling global vaccine hesitancy and endangering countless lives. Even after its retraction, the study persists as "evidence" for vaccine conspiracy theorists. Wakefield's case exemplifies how individual agendas can exploit the academic system, manipulating research for personal ends.
But Wakefield was not the first to do something like this, and he is unlikely to be the last. In the 1940s and 1950s, scientists working for tobacco companies routinely put out research to counter claims that smoking could have adverse effects on health. This was before the definitive link between smoking and cancer, among other illnesses, was established. Breakfast cereal manufacturers have sponsored research that strongly suggests breakfast is the most important meal of the day.
In the digital age, algorithmic platforms amplify a cacophony of biased voices and sensationalised narratives. This fuels concerns about fanatics, backed by their followers, weaponising the information system. Key issues like climate change, vaccines, and reproductive health become ideological battlegrounds, attracting motivated groups that blur the lines between religion and science. These factions exploit academic research and publishing to advance their agendas, eroding scientific integrity and public trust. Recent high-profile cases underscore the urgency of addressing this manipulation.
The first involved the pro-life lobby in the USA, which had backed “research” purporting to show negative effects associated with abortion pills. Three separate papers written by researchers with strong links to the anti-abortion lobby were published by Sage. One of the peer reviewers tasked with assessing the papers for scientific accuracy had the same links. A US federal judge cited the papers in a case against the abortion pill mifepristone. Then, as complaints mounted, Sage retracted the papers because of “fundamental problems with the study design and methodology, unjustified or incorrect factual assumptions, material errors in the authors’ analysis of the data, and misleading presentations of the data”. The retraction underscores the complex interplay between ideology and scientific inquiry, revealing how easily the latter can be subverted to serve the former. The incident not only spotlighted the vigilance required of publishers and peer reviewers but also raised questions about the robustness of the filters meant to sift scientific fact from fiction.
Similarly, the controversy surrounding a review article in a journal that claimed the mRNA Covid-19 vaccine was dangerous illustrates the dangers to public health. Despite the overwhelming body of evidence supporting the vaccine’s safety and efficacy, the publication of such a paper sowed doubt and fear. And the media latching on to it has not helped. The motivations behind this article, possibly rooted in vaccine scepticism or conspiracy theories, exploited the academic publishing system to lend credence to unfounded claims.
News media has neither the skills nor the funds to verify each piece of scientific research that is published. For the media, peer-reviewed articles published in journals are all the verification they need. For the general public, trust in science and scientists remains high. The revelations around manipulated research and the propagation of misleading narratives remind us of the fragility of that trust and the ease with which it can be eroded in the digital age.
In grappling with these revelations, we must ask ourselves: How do we safeguard the sanctity of scientific inquiry in an era where information can be weaponised? The answer lies not solely in the hands of publishers, peer reviewers, or even the scientists themselves, but also in the engagement of every individual with the information they encounter. Critical thinking, a healthy scepticism, and a commitment to seeking out multiple sources of information are the tools at our disposal in the fight against misinformation. By cultivating a culture of critical thinking and responsible information consumption, we can ensure that truth continues to be the foundation of our world.
The writer works at the intersection of digital content, technology, and audiences. She is a writer, columnist, visiting faculty, and filmmaker