
STEVEN PINKER: Reason To Believe

From time to time, we take deep dives into topics that run longer than our average piece. In today’s long read, cognitive scientist and Persuasion advisor Steven Pinker explores the nature of rationality, why people hold outlandish beliefs, and how modern universities are straying from their commitment to reason and truth.

— The Editors

When I tell people that I teach and write about human rationality, they aren’t curious about the canons of reason like logic and probability, nor the classic findings from the psychology lab on how people flout them. They want to know why humanity appears to be losing its mind.

Why do people believe in outlandish conspiracy theories, such as that Covid-19 was a plot by Bill Gates to implant trackable microchips in our bodies? Or in blatant fake news, such as that Joe Biden called Trump supporters “dregs of society”? Or in paranormal woo-woo, like astrology, extra-sensory perception (ESP), and spiritual energy in pyramids and crystals?

It won’t do to give the Spockian answer that humans are simply irrational. Our hunter-gatherer ancestors lived by their wits, outsmarting animals with traps, poisons, and ambushes, while protecting themselves from the elements with fire, clothing, and shelter. It was this ingenuity that allowed mankind, in Ambrose Bierce’s definition, to “infest the whole habitable earth and Canada.” Since then our species has plumbed the nature of matter, life, and mind, has slipped the surly bonds of Earth to explore other planets, and has blunted the scourges of war, pestilence, and famine, doubling our life expectancy. Even the everyday business of holding a job, keeping food in the fridge, and getting the kids clothed, fed, and off to school on time requires feats of reasoning that are beyond the ken of our best AI.

How, then, can we explain the pandemic of poppycock? My best answer comes in four parts.

The first is rooted in the very nature of rationality. Reason, almost by definition, is inference deployed in service of a goal; no one gets rationality credit merely for enumerating true but useless propositions. But that goal need not be an objective understanding of the world. It can also be to win an argument in which the stakes matter to you. As Upton Sinclair pointed out, “It is difficult to get a man to understand something when his salary depends upon his not understanding it.”

Or reason can be used to show how wise and moral your group is—your religion, your tribe, your political sect—and how stupid and evil a rival one is. This “myside bias,” explained in Keith Stanovich’s The Bias That Divides Us, appears to be the most pervasive of the hundreds of biases documented by cognitive science and social psychology. And there’s a perverse rationality behind it. It’s far from irrational for an individual to endorse a belief that gains them status as a hero for their side while avoiding ostracism as a traitor. The problem is that it’s irrational for a society as a whole to lurch between one faction’s dogmas and another’s rather than collectively arrive at the most accurate understanding of reality. These conflicting incentives place us in a tragedy of the rationality commons.

A second contributor to irrationality is that human reasoning is guided by deeply rooted folk intuitions, the evolutionary legacy of having to figure out the hidden laws of reality before the scientific revolution gave us a sound method for doing so. Those intuitions, while indispensable for navigating everyday human life, are incommensurate with our best modern understanding of the world. The mismatch makes us vulnerable to superstition and pseudoscience.

For example, we are intuitive dualists: we feel that people are composed of two things, a body and a mind. We don’t treat others as robots or meat puppets, but assume they have beliefs and desires and sentience like our own, which we ascribe to an intangible mind or soul, a ghost in the machine. (This contrasts with the dominant scientific understanding that mental life arises from neurons firing in patterns.) From there it’s a short step to believing that minds can part company with bodies, and so to believing in spirits, souls, ghosts, an afterlife, reincarnation, and ESP.

We are also intuitive essentialists, sensing that living things contain an invisible essence or lifeblood that gives them their form and powers. This instinct allowed our ancestors to discern the continuity beneath the differing appearances of a single species (such as its eggs, seeds, flowers, or larvae) and to extract foods, medicines, and poisons from their tissues. But from essentialist inklings it’s a short step to intuiting that disease must be caused by an adulteration of one’s essence by a foreign contaminant. That in turn makes it natural to recoil from vaccines, which, after all, introduce a bit of an infectious agent deep into the tissues of a healthy body. It invites homeopathy and herbal remedies, which infuse us with tinctures that seem to carry the essence of a healthful living thing. And it inspires the many forms of quackery in which people are subjected to purging, bloodletting, fasting, sweating, cupping, and other means of “getting rid of toxins.”

Finally, we are intuitive teleologists, aware that our own plans, tools, and artifacts are designed with a purpose. As the 18th-century theologian William Paley pointed out, when you find a watch lying on the ground, you correctly infer that it was designed by a watchmaker. But from there it’s a short step to inferring that the universe has a purpose, and so to believing in astrology, creationism, synchronicity, and the philosophy that “everything happens for a reason.” That mentality can also lead to conspiracy theorizing, abetted by the truth that our enemies really may plot ways to harm us, and that, when we are vigilant against them, false positives may be less costly than false negatives.

The third key to public irrationality lies in how we unlearn these folk intuitions and acquire a more sophisticated understanding. It’s certainly not by each of us exercising our inner genius. It’s by trusting legitimate expertise: scientists, journalists, historians, government record-keepers, and responsible, fact-checked authors. After all, few of us can really justify our beliefs by ourselves, including the true ones. Surveys have shown that creationists and climate change deniers are, on average, no less scientifically literate than believers in evolution and human-made climate change (many of whom attribute warming to the ozone hole, toxic waste dumps, or plastic straws in the ocean). The difference is political tribalism: the farther to the right, the more denial.

For my part, I’ve been vaccinated against Covid five times, but my understanding of how the vaccines work extends little deeper than “Something something mRNA antibodies immune system.” I basically trust the people in the white coats who say they work. Flaky beliefs, in contrast, persist in people who don’t trust the public health establishment—who see it as just another faction, one that competes against their trusted preachers, politicians, and celebrities. In other words, we all have to trust authorities; the difference between believers who are probably right and those who are almost certainly wrong is that the authorities the first group listens to engage in practices and belong to institutions that are explicitly designed to sift truths from falsehoods.

The last piece of the puzzle of why people believe outlandish things is: It depends on what you mean by “believe.” George Carlin observed, “Tell people there’s an invisible man in the sky who created the universe, and the vast majority will believe you. Tell them the paint is wet, and they have to touch it to be sure.” The distinction was drawn more formally by the social psychologist Robert Abelson in his classic article “Beliefs Are Like Possessions,” which distinguished between “distal” and “testable” beliefs.

Touching the paint and other empirical probing is how we navigate the zone of reality in which we live our lives: the physical objects around us, the people we deal with face to face, the memory of our interactions. Beliefs in this zone are testable, and the compos mentis among us hold them only if they are likely to be true. They have no choice: reality, which doesn’t go away when you stop believing in it, punishes people for delusions.

But people also hold beliefs in spheres of existence that are far from their everyday experience: the distant past, the unknowable future, faraway peoples and places, remote corridors of power, the microscopic, the cosmic, the counterfactual, the metaphysical. Our lived experience gives us no grounds for beliefs about how the universe began, or when humans first appeared, or what makes it rain, or why bad things happen to good people, or what really goes on in the Oval Office or Microsoft headquarters or the dining rooms of Davos. Nor, except for a few movers and shakers and deciders, do our beliefs on these matters make any difference.

But that doesn’t mean people abstain from forming beliefs about these imponderables. They can adopt beliefs about them that are entertaining, or uplifting, or empowering, or morally edifying. Whether beliefs in this mythology zone are objectively “true” or “false” is, to them, unknowable and hence irrelevant.

Here again there’s a kind of historical rationality to this insouciance. Until modern times—the scientific revolution, the Enlightenment, systematic historiography and journalism, public records and datasets—the truth about these remote realms really was unknowable, and mythology was as good a kind of belief as any. This, I suggest, is the default human intuition when it comes to beliefs about these recondite matters. To appreciate the naturalness of this mythological mindset, you don’t have to invoke hunter-gatherers on the Pleistocene savannah. You just have to think about humanity during the vast majority of its existence, and about the vast majority of people today who have not signed on to the Enlightenment conviction that all of reality is in principle knowable by the scientific-analytical mindset.

The result is that there are as many kinds of mythological belief as there are questions about the world beyond our senses. The most obvious is religion, which most of its adherents cheerfully concede is a matter of faith, rather than of reason or evidence.

Another consists of national myths about the glorious martyrs and heroes who founded our great nations. The real historians who expose their feet of clay are as popular as a skunk at a garden party.

Many conspiracy theories, too, are entertained as engrossing myths rather than credible hypotheses. Believers in vast nefarious conspiracies, such as 9/11 “truthers,” hold their meetings and publish their exposés openly, apparently unconcerned that the omnipotent regime will crack down on brave truth-tellers like them.

QAnon might be likened to a live-action role-playing game, with fans avidly trading clues and following leads. Its progenitor, Pizzagate (according to which Hillary Clinton ran a child sex ring out of the basement of a DC pizzeria), also had a make-believe quality. As the cognitive scientist Hugo Mercier has pointed out, virtually none of the adherents took steps commensurate with such an atrocity, like calling the police. (One of them did leave a one-star review on Google.) With the exception of the fanatic who burst into the restaurant with his guns blazing to rescue the children, among Pizzagate believers the proposition “I believe that Hillary Clinton ran a child sex ring” can really be translated as “I believe that Hillary Clinton is so depraved that she is capable of running a child sex ring”—or, perhaps even more accurately, “Hillary…Boo!” Beliefs outside our immediate experience, then, can be expressions of moral and political commitments rather than assertions of factual states of affairs.

Many of us are nonplussed by this way of thinking. It’s one thing to believe that Hillary Clinton is a morally compromised person—everyone is entitled to an opinion—but it’s quite another thing, and completely unacceptable, to express that opinion as a fabricated factual assertion.

But it’s our mindset that is exotic and unnatural. For many of us, it’s the dividend of a higher education which has imparted the sense that there is a fact of the matter about states of the world; that even if we don’t know it, there are ways of finding out; and that, as Bertrand Russell put it, “It is undesirable to believe a proposition when there is no ground whatsoever for supposing it is true.” Indeed, one could argue that this mindset is the most important dividend of higher education.

Or at least, it used to be. Here's another candidate for a mythology zone: the sacred creeds of academic and intellectual elites. These include the belief that we are born blank slates, that sex is a social construction, that every difference in the social statistics of ethnic groups is caused by racism, that the source of all problems in the developing world is European and American imperialism, and that repressed abuse and trauma are ubiquitous.

Many observers have been taken aback by the repression of dissent from these beliefs in contemporary universities—the deplatformings, the cancelings, the heckler’s vetoes, the defenestrations, the multi-signatory denunciations, the memory-holing of journal articles. Universities, after all, are supposed to be places in which propositions are interrogated and challenged and complexified and deconstructed, not criminalized. Yet these beliefs are treated not as empirical hypotheses but as axioms that decent members of the community may not challenge.

Academic cancel culture may be a regression to the default human intuition that distal beliefs are no more than moral expressions—in this case, opposition to bigotry and oppression. But the default intuition has also been intellectualized and fortified by the doctrines of relativism, postmodernism, critical theory, and social constructionism, according to which claims to objectivity and truth are mere pretexts to power. This marriage of intuition and theory may help us make sense of the mutual unintelligibility between Enlightenment liberal science, according to which beliefs are things about which decent people may be mistaken, and critical postmodern wokeism, according to which certain beliefs are self-incriminating. 

Can anything be done? Explicit instruction in “critical thinking” is a common suggestion. These curricula try to impart an awareness of fallacies such as arguments from anecdote and authority, and of cognitive biases such as motivated reasoning. They try to inculcate habits of active open-mindedness, namely to seek disconfirmatory as well as confirmatory evidence and to change one’s mind as the evidence changes.

But jaded teachers know that lessons tend to be forgotten as soon as the ink is dry on the exam. It’s hard, but vital, to embed active open-mindedness in our norms of intellectual exchange wherever it takes place. It should be conventional wisdom that humans are fallible, that misconceptions have been ubiquitous throughout human history, and that the only route to knowledge is to broach and then evaluate hypotheses. Arguing ad hominem or by anecdote should be as mortifying as arguing from horoscopes or animal entrails; repressing honest opinion should seem as risible as the doctrines of biblical inerrancy or papal infallibility.

But of course we can no more engineer norms of discourse than we can dictate hairstyles or tattoos. The norms of rationality must be implemented as the ground rules of institutions. It’s such institutions that resolve the paradox of how humanity has mustered feats of great rationality even though every human is vulnerable to fallacies. Though each of us is blind to the flaws in our own thinking, we tend to be better at spotting the flaws in other people’s thinking, and that is a talent that institutions can put to use. An arena in which one person broaches a hypothesis and others can evaluate it makes us more rational collectively than any of us is individually.

Examples of these rationality-promoting institutions include science, with its demands for empirical testing and peer review; democratic governance, with its checks and balances and freedom of speech and the press; journalism, with its demands for editing and fact-checking; and the judiciary, with its adversarial proceedings. Wikipedia, surprisingly reliable despite its decentralization, achieves its accuracy through a community of editors who correct one another’s work, all of them committed to principles of objectivity, neutrality, and sourcing. (The same cannot be said for web platforms that are driven by instant sharing and liking.)

If we are to have any hope of advancing rational beliefs against the riptide of myside bias, primitive intuitions, and mythological thinking, we must safeguard the credibility of these institutions. Experts such as public health officials should be prepared to show their work rather than deliver pronouncements ex cathedra. Fallibility should be acknowledged: we all start out ignorant about everything, and whenever changing evidence calls for changing advice, that change should be touted as a readiness to learn rather than concealed as an admission of weakness.

Perhaps most important, the gratuitous politicization of our truth-seeking institutions should be halted, since it stokes the cognitively crippling myside bias. Universities, scientific societies, scholarly journals, and public-interest nonprofits have increasingly been branding themselves with woke boilerplate and left-wing shibboleths. These institutions should not be surprised when they are then blown off by the center and the right, which together make up the majority of the population. The results have been disastrous, including resistance to climate action and vaccination.

The defense of freedom of speech and thought must not be allowed to suffer that fate. Its champions should have at their fingertips the historical examples in which free speech has been indispensable to progressive causes such as the abolition of slavery, women’s suffrage, civil rights, opposition to the war in Vietnam, and gay rights. They should go after the censors on the right as vigorously as those on the left, and should not give a pass to conservative intellectuals or firebrands who are no friends to free speech, but are merely enemies of their enemies.

The creed of universal truth-seeking is not the natural human way of believing. Submitting all of one’s beliefs to the trials of reason and evidence is cognitively unnatural. The norms and institutions that support this radical creed are constantly undermined by our backsliding into tribalism and magical thinking, and must constantly be cherished and secured.

This essay is adapted from a presentation given to the Stanford Academic Freedom Conference in November 2022.

Steven Pinker, a member of the Persuasion advisory board, is the Johnstone Family Professor of Psychology at Harvard University and the author of Rationality: What It Is, Why It Seems Scarce, Why It Matters.
