
Truth under pressure - How fraud, bias, and error feed misinformation

Updated: Jun 27

Some believe the Earth is flat. Others claim climate change is not real. A retracted study linking vaccines to autism continues to influence anti-vaccine movements. During the COVID-19 pandemic, conspiracy theories and false cures spread rapidly. And in one famous case, fabricated results in stem cell research earned global praise before being exposed.

These stories—some based on misinformation, others on scientific misconduct—reflect how quickly distorted or false claims can take hold and shape public opinion.

The shift has even made its way into our language. “Post-truth”, “fake news”, and “misinformation” have all been named word of the year by major dictionaries. This linguistic recognition points to just how widespread and familiar these issues have become, and how often they’re repeated, reposted, and reinforced.

Misinformation is not new. Neither is fraud. But the speed, scale, and visibility of both have changed. And while these issues seem separate, they often overlap. Scientific uncertainty can be misrepresented as failure. Fraud can feed conspiracy. The correction of errors can be misunderstood as contradiction. Honest mistakes can be taken as proof that science can’t be trusted. And behind all of it, emotional bias and information overload can make it hard to know what to believe.

So how did we get here? And what can be done about it?

FRAUD

Fraud, bias, and error are often lumped together when scientific findings are questioned. But they don’t mean the same thing, and they don’t carry the same weight.

Fraud refers to the deliberate manipulation or fabrication of data. These cases exist, but they are relatively rare. Much more often, the problems seen in research stem from error or bias.

Error is an unavoidable part of research. Studies may rely on small samples, imperfect tools, or early-stage models. Science operates under uncertainty, and results often reflect that.

Bias, on the other hand, can be built into research design, data selection, or the types of questions asked, often shaped by what journals, funders, or institutions value.

“Scientific practice has been characterized by problematic rewards, such as prioritizing research quantity over quality and emphasizing statistically significant results.” 1 At the same time, we encounter a publish-or-perish culture, in which “a publication is no longer merely a way of reporting results; it is a coveted prize that can make or break careers.”2

While fraud, bias, and error aren’t interchangeable, they often feed into each other. “Given that top journals often look for exciting results of broad impact, these policies encourage researchers to hype their work. Worse still, they may encourage fraud.” 2

It’s not about excusing flaws, but understanding how scientific culture, incentives, and expectations influence research. For science communicators, the goal is to explain these complexities clearly, helping the public navigate science without fuelling distrust.

MISINFORMATION

Human culture strongly depends on people passing on information. But what we choose to share isn’t always based on accuracy; it’s often shaped by emotion. “People seem to mainly pass on information that will evoke an emotional response in the recipient, irrespective of the information’s truth value.” 3 Rumors are not a flaw in the system. They’re a built-in feature of how humans communicate.

Psychological science has shed much light on the cognitive processes by which individuals acquire, process, and update information. People tend to accept claims that align with their existing beliefs, especially when those claims feel familiar or come from trusted sources. “Messages that are inconsistent with one’s beliefs are also processed less fluently than messages that are consistent with them.” 3 Emotion plays a role too: fear, outrage, or surprise make a message more memorable and more likely to be shared.

Repetition further strengthens belief: “Retractions can backfire, reinforcing false beliefs due to cognitive biases, familiarity, and worldview resistance.” 4 Once a narrative feels stable, contradictory information becomes harder to absorb. “People may be uncomfortable with gaps in their knowledge of an event and hence prefer an incorrect model over an incomplete one.” 3

Some misinformation emerges unintentionally. “The media sometimes unavoidably report incorrect information because of the need for timely news coverage.”3 In fast-moving or complex events, simplification becomes necessary but can easily lead to distortion. Still, not all misinformation is accidental. There are also political agendas, economic incentives, and deliberate strategies of confusion. These are all less benign causes that seek to influence, not inform.

These cognitive tendencies don’t operate in isolation. Online environments amplify them. “The flexibility and fractionation offered by social media has allowed people to choose their favored ‘echo chamber’ in which most available information conforms to pre-existing attitudes and biases.” 3 These spaces reward visibility and emotion, not accuracy or nuance.

For science communicators, this means that facts alone aren’t enough. Understanding how people absorb, process, and hold onto information is essential. Otherwise, even well-intentioned messages risk being ignored, or becoming just another layer of noise.

(c) Creators for Climate

REPLICATION CRISIS

In recent years, science has been said to be facing a replication crisis, with a wave of published results that fail to hold up when retested. In some fields, high-profile replication failures have raised concerns about credibility, quality, and trust.

But these concerns can be misleading if taken out of context. Not all studies are equally likely to replicate—and that doesn’t always mean they were flawed to begin with. “Part of the failure to replicate published results is caused by random variability intrinsic to any sampling procedure.” 5 In many areas of research, especially those testing exploratory or complex ideas, false positives are statistically likely. “Neglecting the low base rate of true hypotheses results in overestimating the reliability of positive results.” 6 This misunderstanding (known as the base rate fallacy) can inflate the perception of crisis.
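The base rate fallacy can be made concrete with a small back-of-the-envelope calculation. The numbers below are illustrative assumptions, not figures from the studies cited here: suppose only 10% of tested hypotheses are true, studies detect true effects 80% of the time (power), and the false-positive rate is the conventional 5%.

```python
# Illustrative base-rate calculation (assumed numbers, for intuition only).
base_rate = 0.10   # share of tested hypotheses that are actually true
power = 0.80       # probability a true effect yields a positive result
alpha = 0.05       # probability a null effect yields a positive result

true_positives = base_rate * power           # 0.10 * 0.80 = 0.08
false_positives = (1 - base_rate) * alpha    # 0.90 * 0.05 = 0.045

# Positive predictive value: the chance a published positive result is true.
ppv = true_positives / (true_positives + false_positives)
print(f"PPV: {ppv:.2f}")  # prints "PPV: 0.64"
```

Under these assumptions, more than a third of positive results would be false even with no fraud, bias, or sloppiness at all, so a substantial rate of failed replications is exactly what honest statistics predicts.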

Of course, not all issues stem from statistics. Ethical and practical constraints often require relatively small sample sizes. Expectations also play a role. “It is unreasonable to expect that results from a single study can give great predictive power over the results of future experiments.” 5 Yet when findings don’t replicate, it’s easy to assign blame. “It is harmful that a failed replication should immediately be regarded as casting aspersions on the competence and probity [honesty] of the original scientists.”6

Despite the growing fear around reproducibility, there’s little evidence that a systemic collapse is underway. “Issues with research integrity and reproducibility are not distorting the majority of the literature.” 7 Nor are they growing.

Failed replications are part of how knowledge improves, not signs of failure. Still, the crisis narrative has taken hold. “The new ‘science is in crisis’ narrative is not only empirically unsupported, but also quite obviously counterproductive. Instead of inspiring younger generations to do more and better science, it might foster in them cynicism and indifference.” 8

For science communicators, a shift in framing can make all the difference: “Contemporary science could be more accurately portrayed as facing ‘new opportunities and challenges’ or even a ‘revolution’.” 8 A failed replication doesn’t mean science is unreliable. It means science is evolving, and understanding its limits is just as important as explaining its findings.


PEOPLE AND PLATFORMS TAKING ACTION

With the issues laid out, one thing is clear: identifying the cracks doesn’t mean everything is falling apart. “Errors in science do not disappear merely with the passage of time. Finding errors requires scientific work.” 9 In fact, many of the most impactful solutions are already in motion, driven by people, communities, and a shift in how science is shared, reviewed, and questioned.

Some of the most visible progress comes from individuals and grassroots platforms holding science accountable—both from within and outside of the scientific community:

  • Elisabeth Bik has reviewed tens of thousands of scientific images to identify duplications and manipulations, helping expose fraud and spark institutional responses.

  • Retraction Watch meticulously documents why studies are pulled from the literature—demonstrating that correction is an integral part of science, not a flaw.

  • Calling Bullshit, a course and public platform, teaches how to detect misleading data, spot faulty reasoning, and sharpen critical thinking.


GLOBAL AND SYSTEMIC INITIATIVES

While individuals raise awareness, broader reforms are reimagining the systems behind science.

  • Open science practices like preregistration, data sharing, and public peer review shift the emphasis from prestige to transparency.

  • Large-scale replication efforts like ManyLabs bring diverse teams together to replicate studies across various contexts, reinforcing the reliability of findings.

  • Community platforms such as PubPeer and Peer Community In open up the review process, encouraging constructive critique and broader participation from experts and peers.




What You Can Do as a Science Communicator

Psychological research shows that facts alone are not enough. Misinformation, misunderstanding, and misrepresentation are “sticky”: they appeal to emotion, identity, and familiarity. Correcting them—and preventing them—takes strategic, transparent communication rooted in how people actually think. Here is a toolkit drawn from empirical research.


PROACTIVE STRATEGIES

BEFORE MISINFORMATION (OR MISUNDERSTANDING) SPREADS

  • Prebunking and preexposure warnings: prepare audiences before exposure happens. “People by default expect the information presented to be valid, but an a priori warning can change that expectation.” → Such warnings increase resistance to both misinformation and uncertainty.

  • Accuracy prompts: quick nudges, such as asking “Is this accurate?”, shift attention from emotion to reflection. → These simple cues have been shown to significantly reduce the spread of falsehoods.

  • Friction: even small barriers, like prompting people to read before reposting, reduce impulsive sharing. → Slowing the process increases deliberation without requiring confrontation.

  • Emphasize the consensus: frame messages around the current body of evidence, not isolated facts. “Communicating in terms of the collective accumulated evidence shifts the message toward what the best available evidence indicates.” 10 This helps people follow the science without expecting it to be final or absolute.

REACTIVE STRATEGIES

WHEN RESPONDING TO MISINFORMATION OR MISUSE


  • Offer clear alternatives: when addressing incorrect information or retracted studies, don’t just refute; explain. People are more receptive when they’re given a plausible narrative that fills the explanatory gap.

  • Repetition without repeating the myth: repeat the accurate version clearly and often, but avoid restating the myth. Familiarity reinforces belief, even if the information is false.

  • Speak to people’s values: if a message feels like an attack on someone’s identity or worldview, they’ll likely reject it. Frame corrections in ways that show shared concerns, not sides.

  • Simplicity and focus: too many counterarguments can backfire. Clarity matters more than volume.

LONG-TERM STRATEGIES

BUILDING PUBLIC TRUST AND RESILIENCE


  • Lateral reading: encourage verifying claims by checking other sources. This is a simple, teachable habit. It boosts independent thinking and challenges the passive acceptance of headlines.

  • Media literacy and self-reflection: help audiences understand how algorithms, repetition, and even personality traits can affect what they believe.

  • Participatory engagement and representation: “Involving more diverse perspectives may help scientists challenge cognitive biases when seeking or interpreting evidence.” 10 Include more voices in the conversation. Representation, citizen input, and inclusive communication increase the legitimacy and relevance of science.

  • Acknowledge uncertainty: be clear about limitations, evolving evidence, and the fact that science is always a work in progress. “Science is tentative because scientific ideas are open to revision when new evidence appears.” 9

  • Talk about error as part of the process: “Portraying error in science reveals its human dimension.” 9 Mistakes, updates, and disagreements are not flaws. They are features of how science works.









REFERENCES

  1. Korbmacher, M. et al. The replication crisis has led to positive structural, procedural, and community changes. Communications Psychology 1, (2023).

  2. West, J. D. & Bergstrom, C. T. Misinformation in and about science. Proc Natl Acad Sci U S A 118, (2021).

  3. Wittenberg, C. & Berinsky, A. J. Misinformation and its correction. in Social Media and Democracy: The State of the Field, Prospects for Reform 163–198 (Cambridge University Press, 2020).

  4. Kozyreva, A. et al. Interventions against misinformation: Toolbox of Interventions Against Online Misinformation and Manipulation.

  5. França, T. F. & Monserrat, J. M. Reproducibility crisis in science or unrealistic expectations? EMBO Rep 19, (2018).

  6. Bird, A. Understanding the Replication Crisis as a Base Rate Fallacy. doi:10.1093/bjps/axy051.

  7. Petersen, O. H. Reproducibility – again. Journal of Physiology vol. 597 657–658 Preprint at https://doi.org/10.1113/JP277486 (2019).

  8. Fanelli, D. Is science really facing a reproducibility crisis, and do we need it to? doi:10.1073/pnas.170827211

  9. Allchin, D. Teaching the nature of science through scientific errors. Sci Educ 96, 904–926 (2012).

  10. Holford, D. et al. Science Communication as a Collective Intelligence Endeavor: A Manifesto and Examples for Implementation. Science Communication vol. 45 539–554 Preprint at https://doi.org/10.1177/10755470231162634 (2023).


 
 
 

© 2026 BE SciComm