KNOWLEDGE, DOUBT, AND THE HUMAN MIND
- Sooike Stoops

- Jun 26
Updated: Aug 21
What happens when facts alone no longer convince? Philosopher Johan Braeckman has spent decades studying how beliefs are formed, how misinformation spreads, and how we might address it. In this conversation, we ask him, among other things, about the responsibilities and blind spots of science communication today.
Johan, you’ve long been active in the public debate about science, especially when it comes to pseudoscience and misinformation. How did that become such a strong focus in your career?
“I’ve always been intrigued by misinformation, even fascinated by it. That fascination led me into debates about evolution, creationism, and intelligent design more than twenty years ago. I began to ask: how can it be that so many people reject scientific findings? What is it that makes them believe these alternative views so strongly?”
“This interest is closely tied to evolutionary theory, but not only in a biological sense. I also approach it through anthropology, psychology, and philosophy. For me, it’s about understanding the broader context: how people think, what makes an idea stick, and why we sometimes cling to views that are demonstrably false.”

Was this already your focus from the very start of your academic career?
“Not quite. My PhD was on changing views of nature in the 19th century—scientific, romantic, philosophical worldviews—and Darwin played a central role in that. So the theory of evolution was already a research interest of mine even then, but in a more historical and conceptual sense.”
“The philosophy department at the university gave me a lot of freedom. I was able to follow my curiosity. Questions like ‘What’s the difference between science and pseudoscience?’ or ‘How do logical fallacies work?’ became increasingly important in my work. The step to actively engaging with pseudoscientific claims was a small one.”
Why did you take on such an active role in that debate?
“Because I still find it genuinely fascinating. People who deny science, like flat-earthers, anti-vaxxers, or climate skeptics, are usually not mentally ill. They believe what they say, and they can clearly reason about it in an intelligent way. And I want to know: what’s going on in their minds? How do these ideas take hold?”
“At the same time, I find it worrying. If these views spread widely, they can have very real consequences: people refusing vaccinations, for example. These beliefs are often deeply personal. It’s intellectually challenging to understand the mechanisms and even more so to counter them.”
Have pseudoscience and science denial become a more widespread problem in recent years?
“It probably has. Fifteen years ago, it was almost a fringe curiosity. But with the COVID-19 pandemic, these ideas surfaced more widely in society. Suddenly, everyone seemed to know someone who believed in conspiracy theories. Families were torn apart over vaccines. The impact became painfully real.”
“Still, we don’t know for sure whether the problem is actually bigger now, or just more visible. Creationism, for instance, has been studied for over a century. In the US, around 40% of people have consistently rejected evolution for decades. That hasn’t changed much.”
“But with social media, conspiracy theories now spread faster and more widely. Add to that the role of big media platforms such as Netflix, YouTube, and podcasts, and it’s clear we’re in a different media landscape. Misinformation is no longer marginal. It’s embedded in our daily lives.”
How important are public personalities or influencers in this landscape?
“Hugely important. Celebrities, from Gwyneth Paltrow to Jenny McCarthy, can promote absurd ideas, and their fans believe them. That’s not trivial. In the US, public health officials are spreading dubious claims. And yes, children have died of measles because of that.”
“It’s not just celebrities. Sometimes, very smart people adopt and defend views that are simply wrong. What happens is that they use their cognitive talents to defend wrong ideas with intelligent-sounding arguments.”
“And of course, some misinformation is spread with malicious intent: disinformation campaigns, troll farms, companies that profit from alternative medicine. But a lot of it is spread in good faith. Think of a young woman about to give birth, hearing from a friend or reading online that vaccines cause autism. She’s not deliberately acting against scientific insights. She’s anxious. That’s a very human starting point.”
How can you reach people who are in that mindset?
“You won’t reach them by bombarding them with statistics. Most people trust stories more than numbers. If your neighbor or your mother tells you something, you’re more likely to believe it than when a stranger does.”
“If we want to counter misinformation, we need to do several things. First, we have to build trust. Second, we need to explain how science works, not just throw results at people, but show them the method behind those results.”
“We also have to ask them certain questions, repeatedly and in a friendly way: Why do you trust one source but not another? Why believe a friend over the broad scientific consensus?”
“Reaching people takes patience, empathy, and humility. Especially when they’re deeply concerned about their own health, or their child’s. And often, there’s a broader mistrust of institutions beneath the surface. Think of those who’ve had bad experiences with the government or healthcare systems. Their rejection of authority, regulations, and science doesn’t come out of nowhere.”
Striving for dialogue, building trust, focusing on methods over outcomes... This all feels very different from how most researchers are trained to communicate.
“Definitely. Scientists are trained to separate arguments from the person who brings them up. If someone disagrees with their work, they generally know not to take it personally, but to see it as part of the process. That’s not how most people experience disagreement in conversations. When someone says, ‘That’s not true, you’re wrong,’ it’s often perceived as a personal attack. You have to turn that around. As a researcher or science communicator, it’s not just about being right. It’s also about making sure the other person feels respected and heard.”
“Not only does the message matter, the history behind it also shapes how it’s received. Many people, let’s be honest, have good reasons to mistrust experts. Scientific authority has at times been used to defend questionable interests. Naomi Oreskes and Erik M. Conway describe this in their book Merchants of Doubt, showing how certain scientists have used their status to cast doubt on climate science or the link between smoking and cancer.”
“If people sense that you’re approaching them from a moral or intellectual high ground, they’ll shut down or take a defensive position. And that happens often; not so much in formal science communication, but in everyday settings: at the dinner table, in cafés, among friends. We need to be aware of how information spreads: not through scientific journals, but in normal interactions. And that’s where condescension or arrogance can be fatal.”
At the same time, you want to avoid the trap of pure relativism by just saying, “Well, it’s all opinions anyway.”
“Exactly. Anyone communicating science has a responsibility to defend the knowledge it produces. Not as the absolute truth, but as the best tool we have to understand the world, producing the most reliable knowledge so far. There’s no need for a sharp divide between ‘experts’ and ‘the uninformed’. We’re all vulnerable to mistakes.”
How do you talk about science without alienating people or making it sound like your knowledge is superior?
“A good way in is to talk about cognitive biases. All of us are prone to them, professors, Nobel Prize winners, and laypeople alike. I like to use examples from Daniel Kahneman’s work, or simple riddles I share in lectures. Even highly educated people get them wrong.”
“It’s about showing that we’re all human. Some of the strangest beliefs are held by very intelligent people, precisely because they’re clever enough to rationalize them. Sharing examples of your own errors helps too. I often say that as a teenager, I believed in telepathy. I don’t anymore. That usually gets a smile, but the point sticks: it’s okay to change your mind. In fact, when the arguments and the facts are there, it’s the smart thing to do.”
“And that’s crucial. People need to feel they can revise their views without losing face. It’s not shameful to say, ‘I used to believe X, but now I think differently.’ In Dutch, we call it voortschrijdend inzicht, a deepening or evolving insight that comes with new information or reflection. During the COVID-19 pandemic, many people learned just how important that capacity really is.”
Do you use that approach in lectures?
“I try to, yes. Though to be honest, lectures often feel like preaching to the choir. You reach people who are already interested. On social media, it’s different. You’ll run into more resistance, but sometimes also more meaningful encounters.”
“I always respond in those debates. I believe you can plant a seed of doubt, even if you don’t see immediate results. Many people who were deep into conspiracy thinking have told me they got out of it because someone gave them a reason to pause and reflect.”
But how do you handle people who are deep in a misinformation spiral?
“Trust and tone are everything. In my two-day critical thinking courses for medical professionals in the Netherlands, I see time and again how easily smart, highly educated people fall for nonsense. On day two, I bring in an illusionist to perform. He plays with expectations and lets the audience draw their own conclusions. At one point, he seems to read a sentence from a book, supposedly through someone else’s eyes. He even gives a bullshit explanation of how the brain enables him to do it. And every time, the majority of the room falls for it. Many are convinced he has telepathic powers. Mind you, these are trained physicians.”
“Afterwards, I explain what just happened. You can feel the discomfort: ‘How could we fall for this?’ But that’s where learning begins. You show how the brain works, how easily we are misled. And most importantly: you don’t ridicule them. You keep the tone right. Of course, some humor helps too.”
What does all this mean for us as science communicators?
“It’s a very specific job: trying to help people let go of beliefs they’re attached to. That requires the right tone, the right personality, and often the right medium. In-person conversations work better than written texts. Even a YouTube video is more effective than an opinion piece, because it carries more human nuance.”
“Not everyone is cut out for the oral side of science communication. You need warmth, a sense of humor, the ability to make a quick self-deprecating remark. If you can’t do that, this kind of work might not be for you. It’s not about coming in with a moral or intellectual high ground. It’s about understanding where people are coming from, including the emotional weight behind their views. Think of a mother concerned about the health of her child; that’s not something you dismiss with a graph.”
“That said, you can use anecdotes in science communication, as long as they’re paired with evidence. In one of my lectures, I show an image of a child infected by measles, adding that the child died of the disease. That makes an impact. But I follow up immediately with data, and I explain the Andrew Wakefield case, demonstrating how misinformation can spread. Anecdotes can open the door, but we need to do more than that. In pseudoscience, anecdotes often lead to the wrong conclusions. That’s unacceptable in science communication.”
“Of course, you won’t find anecdotes in scientific journals, but in science journalism or opinion pieces, they have their place, as long as you don’t mislead.”
What about training scientists to speak in the media? Suppose a researcher is asked to join a televised debate with a politician about a sensitive topic. Who do you send? The professor who knows the research inside out, or the one who’s more at ease in front of a camera?
“That’s a real challenge. The instinct is often to say: ‘Just send the top expert.’ But that’s not always the best idea. You need people who know the science but also have the skill—and ideally the talent—to speak clearly, with confidence and nuance. Timing a joke, telling a story, using the right slide... these things matter. I often use PowerPoint with more cartoons than data. Most academic presentations are far too dense and go way too fast. There’s a fear of oversimplifying. But if your audience can’t follow you, the nuance is useless.”
“Journalists tend to call the same few people, not necessarily because they’re the best experts, but because they always have something to say that everybody understands. They’re media-friendly. So, if we want better public debates, we need to identify talented communicators among scientists and give them proper media training.”
“Also know that the media won’t come knocking unless you put yourself out there. Write an opinion piece, say something clear and memorable. That’s how you get invited. It’s a bit cynical, but if you’re not in the media’s ‘file box,’ they’ll just keep calling the same semi-expert again and again.”
They still find you, despite your recent decision to leave the university. In 2024, you stepped away from your role at Ghent University after 35 years. Why that move?
“I always had creative freedom in my work, but not in how I divided my time. Days would fly by, and by the end of the week, I often wasn’t sure what I’d actually accomplished. Of course, I supervised PhD students, taught classes, wrote papers and books; that leaves a positive mark. But much of my time went into things that didn’t add up to much.”
“Over time, I wanted to feel that my work was building toward something tangible. That’s why I left the university to focus, at this stage of my life, on work I consider more relevant and enduring. I still give lectures, like the critical thinking courses I mentioned, but now I can dedicate myself more to reading, thinking, and shaping new projects.”
“No more meetings, no traffic jams, no lesson prep. It’s liberating. I’ll try to write a book a year, including one on science denial, and I continue recording lectures for Home Academy. This new rhythm gives me the time and space to focus on what matters most: developing ideas that can resonate more deeply, especially in how we talk about science and doubt.”