Toxic Certainty: A Brief Investigation

Jon Ward
12 min read · May 29, 2022


At the request of someone I greatly respect, I watched an episode of Steve Bannon’s podcast, The War Room (do not try this at home). To set the scene: Mr. Bannon sits facing the camera. On one side, images of Jesus and the Virgin Mary to proclaim his piety; in the center, unkempt hair and unshaven jowls to proclaim his fearless authenticity; and on the other side, a written proclamation: “There are no conspiracies, but there are no coincidences.” That both these latter statements are demonstrably false is immaterial. What matters in The War Room is not veracity but conviction, and of that Mr. Bannon and his guests have an inexhaustible supply.

In the episode I watched, much of the tone and content was predictably unpleasant, but a few plausible statements were made: it’s hard to talk for twenty minutes without saying something true. What was most impressive was the complete absence of hesitation, self-questioning, or open-ended curiosity. Everything was spoken in bold italics. Every opinion was laden with the weight of fact. The War Room is a festival of certainty. As we learned on January 6, 2021, this tone of absolute conviction becomes dangerous when coupled with violent and autocratic intent, but it’s by no means the exclusive property of right-wing demagogues. It’s a universal affliction of political discourse. More than that, I suspect that the seductions of certainty are a problem for the entire species: you, me, everyone. A moment of modest introspection may be in order.

The Neuroscience of Knowledge

In a fascinating book on the subject(1), neuroscientist Robert Burton explores the cognitive functions by which we come to know what we know. He makes two radical but immediately plausible claims: first, that the feeling of knowing operates independently of the information known; and second, that this feeling of knowing emerges beyond our conscious control. Like many in his field, Burton draws on cases of pathology to illuminate his point: the sufferer from Cotard’s syndrome who “knows” that she is dead; the paranoid schizophrenic who “knows” that his thoughts are being recorded by aliens; the stroke victim who “knows” that his real antique desk has been switched for a cheap simulation. No surprise: indisputable evidence fails to shift these convictions.

We might be forgiven for believing that imperviousness to evidence is itself a signal of pathology. Not so fast. Research shows that when individuals hold strong convictions, confronting them with corrective evidence has, at best, a limited impact. According to proponents of the so-called “backfire effect,” it may even reinforce them. Once the feeling of knowing attaches to a belief, it becomes extraordinarily difficult to dislodge. All the forces of our intellect are summoned to strengthen the bond. Why is this?

Burton argues that the feeling of knowing is actually a form of pleasure that engages the brain’s dopamine reward system in a manner related to addiction. He suggests (with admirable caution about evolutionary reductionism) that this pleasure confers an adaptive benefit: if we didn’t enjoy the sense of knowing, we’d never learn anything and the species would die out. It takes some effort to tease this out, but if you look closely, you’ll see that 2+2=4 is wrapped in the warm sensation of rightness. Without this wrapping, it’s a meaningless cluster of numbers with no emotional valence at all. Conversely, we’re all familiar with games of trivia and the sweet thrill of knowing the answer to an obscure question: you can almost hear the dopamine rushing through your brain.

Like most neuroscientists, however, Robert Burton has a tendency to observe the individual in isolation. If the social fabric shows up at all, it looks like separate brains strung together, almost as an afterthought. As a result, some important distinctions can be missed. Let’s take a look.

Who Says?

My knowledge can be roughly divided between what I know firsthand, through direct sensory experience, and what I’ve learned from others. Philosophically educated readers will understand why I used the term “roughly”: this dichotomy is fraught with hidden ambiguities, some of which I’ll touch on. But from a practical and (most important) ethical point of view, the distinction is useful. I know that mangoes are sweet. I know that e=mc2. The way I know these two data points is sufficiently different to be worth dwelling on. I have tasted many mangoes. I have never attempted to duplicate Einstein’s equations, and never will. Yet here’s the curious point: the feeling of knowing attaches to both points with equal force, and it’s effectively the same feeling in both cases. A moment’s reflection explains why this might be. From an adaptive point of view, it’s essential that I “know” that rattlesnakes are venomous, even though I’ve only heard this from others. Waiting to acquire such knowledge firsthand would be extremely expensive. That’s good enough reason for my brain to confer the same feeling of knowing on both first- and second-hand knowledge.

Here’s an interesting example. You know that the sun rises in the east: you can see it every morning with your own eyes. You also know that the sun doesn’t rise at all — it’s just the earth turning. What you know from direct experience contradicts what you know from others, yet you attach the same feeling of knowing to both pieces of information. Clearly this dual application of the same pleasurable sensation is fraught with hazard, and sometimes absurd contradiction.

The implications are far from trivial. If you survey your own mental databanks, you’ll quickly surmise that 90% or more of what you know comes from others, not your own sensory experience. That’s what it means to live in an advanced civilization. You have attached the feeling of knowing to a huge volume of information you have never apprehended or tested firsthand, from the second law of thermodynamics to the existence of the United Nations.

Narrating Knowledge

I need to insert a corrective here, though I’m not sure how to integrate it. Up to now I’ve referenced “data points” of knowledge, but when it comes to information acquired from others, this is mostly a misrepresentation. We don’t accumulate collections of facts. We accumulate collections of stories. Some of the stories may be very short (my version of e=mc2), some may be very long (this year I finally read War & Peace) but there’s almost always a narrative, whether of the British monarchy, or the human vascular system, or octopus intelligence. What we know usually takes the form of “It began here, which led to this, which resulted in that…”

There’s no doubt a brain function involved, and I’ve begun exploring the neuroscience of narrative(2). At the same time, the social dimension is obvious: no stories, no culture. How these two aspects intersect is beyond my grasp. However, it seems to me that the feeling of knowing fully takes hold only in the presence of a narrative. This is why conspiracy theories are so compelling. They are never mere assertions of “fact”. They always take the form of stories with beginnings, middles and end results.

Don’t Believe Your Eyes

But what about first-hand knowledge? It turns out, there are serious problems with what we “know” through direct perception. This has been intuited by philosophers for centuries, most eloquently by Kant, who asserted that we never have access to “the thing in itself.” More recently, neuroscience has uncovered the remarkable degree of construction involved in every perception.

The most compelling current theories(3) suggest that the brain, even from infancy, is constantly inventing the world and then testing its inventions against the incoming sensory data, applying a kind of Bayesian logic of prediction and adjustment. What you “see” is your brain’s best anticipation of what the sensory data indicates — “best” being defined more in terms of utility than objective accuracy.
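This predict-and-adjust loop can be made concrete with a toy sketch. The code below is my own illustration, not drawn from the cited theories: a prior belief over two hypotheses (the names and numbers are invented for the example) is revised by Bayes’ rule as each piece of noisy sensory evidence arrives, and the “percept” is simply whichever hypothesis the updated model finds most probable.

```python
# Toy illustration (my own, not from the cited sources) of Bayesian
# prediction and adjustment: a strong prior is revised by evidence.

def bayes_update(prior, likelihoods):
    """Return the posterior P(hypothesis | evidence) for each hypothesis."""
    unnormalized = {h: prior[h] * likelihoods[h] for h in prior}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

# Prior expectation: the brain strongly predicts "snake" in tall grass.
belief = {"snake": 0.8, "stick": 0.2}

# Each observation gives P(this sensory datum | hypothesis).
observations = [
    {"snake": 0.3, "stick": 0.9},   # the shape isn't moving
    {"snake": 0.2, "stick": 0.95},  # it has a bark-like texture
]

for likelihoods in observations:
    belief = bayes_update(belief, likelihoods)

# The percept is the hypothesis the updated model finds most probable.
percept = max(belief, key=belief.get)
```

Note how the prior dominates at first: one disconfirming observation is not enough to overturn the prediction, which is one way of glossing why the feeling of knowing yields so slowly to evidence.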

A Skeptical Turn

When everything we think we know is thrown into so much doubt, what can we do? Enter the skeptics. The Greeks started that ball rolling, but its most salient icon has to be Descartes. These days, everyone takes pleasure in “knowing” that Descartes was wrong (about the mind-body split). In doing so, almost everyone ignores the courage and originality of his experiment in radical skepticism. Descartes set about doubting everything he knew including, most dramatically, what he knew from direct sensory experience. (He imagined an evil genius deceiving his senses, which, in terms of modern brain science, is remarkably plausible.)

Descartes drove his experiment to its indissoluble kernel: the fact that he was thinking at all. The way he formulated this — “cogito ergo sum: I think, therefore I am” — got him in a world of trouble for which he’s never been forgiven. The ergo sum part is an extrapolation open to challenge. But the cogito is not — you can be as skeptical as you like but you’ll never escape the feeling of knowing (that you’re thinking at all).

How far can we take this path? While it’s reasonable to turn a skeptical eye to our direct perceptions, this isn’t useful in most real-world situations. If you constantly doubt the evidence of your own eyes, don’t ask me to get into your car: you’re a menace on the roads. Practically speaking, we have to give credence to our sensory data most of the time. Knowledge acquired from others is another matter, and that’s a big deal.

Who Before What

Suppose (and it’s impossible) you could follow Descartes’ lead and erase all the knowledge you have absorbed from other people. When you came to reconstruct it, on what foundations would you build?

First, we should note that speaking of knowledge acquired from others is slightly unsatisfactory. It’s a bit like the picture of “acquiring language” as if language is something we put inside ourselves like food as we grow up. Things don’t work that way. As the psychoanalyst Jacques Lacan insisted, we are born into language. It predates, surrounds and defines us. We inhabit language before it inhabits us. In the same way, we are born into knowledge, vast amounts of it. It may feel to us as if we own it, but in reality much of it owns us. And as with language, so with knowledge: it is actually the community that holds us in its possession. This social dimension gives the feeling of knowing enormous added strength. “We know” has far greater power than “I know.”

Not only is this true at the macro level of human society in total. It also operates in micro form. People attach to communities of knowing: in politics, especially today, this is palpable. One word for this attachment is ideology. (Of course “ideology” always applies to other people’s positions, not one’s own!)

Looking back, I’ve at times been possessed by convictions that now seem to me preposterous. But in each case, I was bound to a community that in turn bound me to the belief. It’s as if the comfort of relationship can normalize any reality — or unreality.

This leads to a curious perception. We think we believe people because of what they say, but I suspect that it’s more often the other way around. We believe what they say because of who they are. The choice of what is much more discussed than the choice of who — yet the choice of who usually comes first, mostly unconsciously. And it’s decisive. This hidden reversal is widely ignored by the conventions of critical thinking.

Scorched Earth Thinking

We have to believe others, but on what grounds? There’s a tempting way out of the conundrum: what I would call lazy skepticism. Start Descartes’ experiment and abandon ship. Don’t try to reconstruct your certainties on surer ground. Simply apply the same quantum of doubt to everything anyone says. Become a committed skeptic. Not only is this lazy, it’s phony. You can no more live without believing at least some of what you’re told by others than you can live without trusting your own senses. If you really doubted everything anybody says, you couldn’t shop for groceries, let alone get on a plane.

The appeal of lazy skepticism is open to sinister manipulation. A recent study of current Russian propaganda(4) noted that much of the effort appears to be directed at creating mistrust in all narratives, even amongst the home audience. There’s another version of this tack, equally dangerous, which promotes the idea that because scientists disagree with each other (of course they do: that’s their job!), no science is worth believing.

From a strictly rational point of view, the choice of lazy skepticism has some skimpy plausibility. Ethically, it’s a non-starter. The human world is messy and fractious, evidence is always incomplete, knowledge is a work in progress, and people behave badly. That’s life. We’re in the game, whether we like it or not. The question is: how to play well. To be remotely useful, we have to make choices about both who to believe and what to believe.

Then what is to be done?

Provisional Strategies

I refer back to the wisdom of Robert Burton’s thinking: we can’t escape the feeling of knowing, which takes hold of us before we have conscious choice in the matter. Nor can we live without trusting our senses or accepting as true a great deal of what we receive from others. What we can do is heighten our awareness of the forces that shape our knowledge. Here are some very provisional starting points:

First, recognize that the feeling of knowing is quite distinct from the information known: it’s a pleasurable attachment, like mistletoe growing on a tree. Make the (considerable) mental effort to force apart the thing you are sure of and the feeling of being sure.

Secondly, recognize that we mostly choose who to believe before we choose what to believe. Notice that this first step easily escapes attention and its consequences can be massively underestimated.

Third, understand that narratives reinforce the feeling of knowing, and a good story can make almost anything feel true.

On the question of who to believe, which is pivotal to this whole way of thinking, there clearly can be no rational grounds for certainty. Unless you’ve abandoned yourself to a guru or a cult, no human being can provide absolute authority. How then do we hedge against deception? I think the best answer is: diversify your portfolio. The point about science is not that it offers infallible access to a fixed truth. The point about science is that it’s a vast, argumentative community. Is this community susceptible to corruption, fads, financial pressures, and so forth? Of course! It’s human. Nevertheless, it’s the best we have and its greatest strength lies in its multiplicity. I would say the same about the international community of journalists (rapidly becoming an endangered species). I trust the New York Times more than someone’s Telegram channel, not because the Times is immune to falsehood, but because it’s embedded in a large, competitive network of professionals who are quickly on each other’s case.

Returning to Mr. Bannon and his significantly named podcast, I have a hunch — nothing more — that there’s a correlation of toxic certainty with violence. It makes sense that when the feeling of knowing reaches a certain pitch, it can facilitate the demonization and dehumanization of others, and hence the rejection of normal restraints on doing harm. The new American right may be one case in point. Another example is the Taliban’s militant Islam. However, the path that runs from certainty to threatening, or taking, other people’s lives is not wholly clear to me. It could be a question for some curious postdoc to explore.

The Nocturnal Teacher

Meanwhile, there’s simple research we can all conduct, every time we get up. On his voyage into radical skepticism, Descartes noticed something obvious and useful. Dreams. He asked himself (somewhat in the mode of the famous Chinese sage) how he could know whether he was dreaming or not. To me, this suggests a daily exercise in epistemological humility.

Dreams may be filled with anxiety but they’re rarely if ever troubled by doubt. What makes them fascinating is how firmly we “know” the realities they assume. For example, while I’m asleep I’m quite certain that I can float on an air mattress over the city of Paris. Waking up, I know with equal certainty that I can’t. Within seconds, the feeling of knowing has promiscuously switched sides. This is a daily lesson in the fragility of human certainties. It can make for a sobering, but instructive, start to the day.

+++++++++++++++++++++++++++++++++++++++

1. Burton, Robert Alan. On Being Certain: Believing You Are Right Even When You’re Not. St. Martin’s Griffin, 2009. I learned about this book from the estimable Dr. Ginger Campbell, host of the Brain Science Podcast.

2. Armstrong, Paul B. Stories and the Brain: The Neuroscience of Narrative. Johns Hopkins University Press, 2020.

3. Buzsáki, György. The Brain from Inside Out. Oxford University Press, 2021.

4. Thompson, Stuart A. “The War in Ukraine, as Seen on Russian TV.” The New York Times, 6 May 2022, https://www.nytimes.com/interactive/2022/05/06/technology/russian-propaganda-television.html.

Jon Ward

After a career in marketing I created a software tool, Braincat, to help people think better. Medium is where I share my thoughts about issues I care about.