Critical Review
A Journal of Politics and Society
Volume 35, 2023 - Issue 1-2: Post-Truth
Essays

It’s Our Epistemic Environment, Not Our Attitude Toward Truth, That Matters

ABSTRACT

The widespread conviction that we are living in a post-truth era rests on two claims: that a large number of people believe things that are clearly false, and that their believing these things reflects a lack of respect for truth. In reality, however, fewer people believe clearly false things than surveys or social media suggest. In particular, relatively few people believe things that are widely held to be bizarre. Moreover, accepting false beliefs does not reflect a lack of respect for truth. Almost everyone’s beliefs are explained by rationally warranted trust in some sources rather than others; it is such trust, misplaced but rational, rather than disregard for truth, that explains why people have false beliefs.

In 2016, Oxford Dictionaries designated “post-truth” the word of the year (Flood 2016). In doing so, it captured a widespread feeling that truth itself has come under threat in the twenty-first century. Just as Brexit is widely supposed to represent a populist preference for ordinary common sense over expert opinion—encapsulated by Michael Gove’s claim that Britons “have had enough of experts” (Mance 2016)—so, too, Trump’s election is widely held to symbolize the rejection of a concern for truth in favor of what Stephen Colbert presciently called “truthiness” (Zimmer 2010). In our post-truth era, it is thought, facts take a backseat to what feels right or what is convenient to believe. When we give up on facts and on a common reality, we replace debate and compromise with naked power. “Post-truth is pre-fascism,” Timothy Snyder (2021) warns. The epistemological stakes might never have been greater.

For all that it warns us about the gravest of risks, the post-truth narrative is in many ways a comforting one to those who espouse it. In the post-truth view, we are on the side of truth and reason; they are strangers to these things. We are members of the reality-based community; they indulge in fantasy. Our politics follows the science and accepts facts; theirs rejects facts for feelings. The very fact that the narrative is so flattering to us should make us worry that it’s an instance of something very like the phenomenon it condemns: if not the rise of truthiness, exactly, at least the reign of confirmation bias. Perhaps we accept the post-truth narrative because it speaks so well of us.

In line with that possibility is the fact that the word of the year is a marketing device, and that its selection is not based on any scientific or systematic method (Oxford University Press 2022). And Gove was quoted out of context: he said that the British people were tired of “experts from organizations with acronyms saying that they know what is best and getting it consistently wrong” (my emphasis; see Friedman 2019 and Hazlett 2022 for discussion). In other words, Gove was not challenging the relevance of truth or the possibility of expertise; he was criticizing, as false, the truth-claims of a specific group of experts who had weighed in against Brexit but whose expertise was, in his view, bogus.

The post-truth narrative is wrong. For all sides, beliefs remain constrained by a concern with truth. There are, nevertheless, serious concerns about contemporary politics that can usefully be discussed by focusing on what people believe and on their epistemic actions (that is, their behavior within the sphere of knowledge and belief: gathering and sharing information, for example). If we are to address these genuine concerns, we must first identify the nature of the problems. The true picture is less flattering to any political “side,” but just for that reason might present less of an obstacle to establishing the ground rules for a consensus reality.

Here, in brief, is the true picture. Almost everyone continues to form and revise their beliefs in the light of genuine evidence, but the evidence to which beliefs respond is subjective, not objective. That should be obvious and familiar to anyone who’s ever watched or read crime fiction. Think of plots in which the real criminal has planted evidence to implicate an innocent person: the detective might be wrong but rational in arresting the victim of the scheme. Evidence that one rationally interprets as supporting a certain conclusion may be misleading, because what seems true may not be.

False beliefs are typically owed to trust in unreliable sources. We all need help in sorting through the evidence, but some of us are objectively unlucky in the help we receive. It is true that people often report beliefs that are not supported by good evidence, even when “evidence” is understood subjectively; in large part, however, this is because people often report beliefs they don’t in fact hold. I’ll focus on that phenomenon, endorsement without belief, in the next section, before returning in the following section to genuine beliefs that are objectively false but supported by subjective evidence.

Endorsement without Belief

Let me begin with rough definitions of three terms I will use throughout this essay. A belief is a mental representation: a representation of the way an agent takes some part of the world to be. We have all sorts of beliefs, from mundane beliefs about whether the cat has been fed to momentous beliefs about the existence of God or of moral facts. Most beliefs are truth-apt: they can be assessed for truth, where a belief is true just in case the world is the way the belief represents it as being (the cat has been fed). Some beliefs may not be truth-apt; perhaps there’s no way for the world to be such that a painting is beautiful. I set those beliefs aside. Beliefs are bad when they conflict with the actually reliable opinion of epistemic authorities (Levy 2021a).

I won’t try to clarify what makes someone an “epistemic authority” or what makes their beliefs “reliable,” beyond saying that reliability is owed, in very significant part, to the social structure of a mode of inquiry: someone is an epistemic expert if they are appropriately educated and enculturated into an epistemic community, and that community has the right kind of social structure (exemplified by the structure of scientific communities). (In addition, the area of inquiry has to be expertise-conducive: in some areas there are many confounds and/or feedback is too slow for expertise to be reliable [Kahneman and Klein 2009].) A bad belief is one that conflicts with the consensus of such epistemic authorities. There is more scientific consensus, understood as I am using the term, than many think: many scientific questions are highly contested, but there’s nevertheless a consensus on what responses to these questions are reasonable.

Finally, I rely on an intuitive sense of what makes a belief bizarre. Lots of beliefs are bad without being bizarre. Climate change denial is not bizarre, nor is the claim that vaccines cause autism. We can easily understand what it would be like for the world to be such that those beliefs were true, and such beliefs are consistent with our background beliefs about the natural and social world. However, the belief that scientists around the world are colluding to make it appear as though climate change is real is at least somewhat bizarre. The belief that world leaders are lizards in human form—a genuine conspiracy theory, allegedly believed by 12 million Americans (Oksman 2016)—is certainly bizarre. The world would have to be very different, in multiple ways, from the way it actually is for that belief to be true.[1]

The challenge for someone who does not believe that we live in a post-truth era is to explain why bad and bizarre beliefs are widely held. Obviously, however, the challenge is greater with regard to bizarre beliefs than to bad ones: we can easily see how people might end up with bad beliefs. Such beliefs are common, and none of us should be confident that we are free of them. As for bizarre beliefs, I will argue that they do not present the challenge they seem to, because they are much rarer than is widely thought. In fact, many of the people we dismiss as irrational because they hold bizarre beliefs not only fail to believe bizarre things; they have much the same sense that we do of what makes a belief bizarre.

Undeniably, people endorse bizarre beliefs. Trump first came to prominence in politics as a proponent of the “birther” conspiracy theory, according to which Barack Obama was not born in the United States. That is not a bizarre conspiracy theory (although it becomes more and more outlandish as it implicates more and more people and agencies), but Trump went on to hint at support for genuinely bizarre conspiracies, and many of his followers embraced them. One poll reported that more than a quarter of Americans believed that Obama was or might be the antichrist, and that 37 percent believed that global warming is a hoax (Harris 2013). An NPR/Ipsos poll found that one third of Americans believe that Joe Biden engaged in widespread fraud to win the 2020 election, and 17 percent believe that “a group of Satan-worshipping elites who run a child sex ring are trying to control our politics and media” (Rose 2020). Seventeen percent would amount to 56 million Americans. How can we explain this degree of susceptibility to bizarre beliefs if we’re not living in a post-truth era?
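A quick check of that figure (assuming, as the arithmetic suggests, that the percentage is applied to the total U.S. population of roughly 331 million, rather than to adults only):

$$0.17 \times 331{,}000{,}000 \approx 56{,}000{,}000.$$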

According to the post-truth narrative, people may come to hold beliefs like these because their beliefs are no longer constrained by evidence. According to a softer view, much more commonly accepted in academic circles, people come to hold bizarre beliefs because human belief formation is powerfully influenced by motivated cognition: a disposition to accept claims that are congenial to us over those that are uncongenial. Confirmation bias (or myside bias; the differences between them are irrelevant in this context) is supposed to be a central mechanism for motivated believing (Nickerson 1998; Stanovich 2021). This bias leads to an asymmetrical treatment of evidence: we carefully scrutinize evidence that conflicts with our prior attitudes and beliefs, looking for reasons to reject it, but we apply only the lightest of touches to evidence that is congenial to our views. On this latter approach, the acceptance of bad or bizarre beliefs arises from lazy (Pennycook and Rand 2019) or biased thinking (Baron and Jost 2018), or from epistemic vice (Cassam 2018).

However, while there are many bad, and some bizarre, beliefs out there, they are less common than we tend to think. People quite often espouse beliefs they do not in fact hold. There is extensive evidence that many reports of partisan beliefs—that is, beliefs congenial to one political party or ideology, producing a partisan gap in reported belief—are insincere (Hannon 2021; Levy and Ross 2021). These reports are not expressions of belief at all. Rather, they are expressions of support for one side that take the form of an expression of belief. Accordingly, the phenomenon is often called expressive responding (Berinsky 2017; Bullock et al. 2015).

A person’s beliefs are her map of the world: their function is the guidance of behavior. Evidence that behavior departs systematically from professed belief is evidence that the reported belief is not genuine. We suspect that the person who claims to value altruism but usually acts selfishly is trying to deceive us, or perhaps to deceive themselves. Similarly, Republicans and Democrats report divergent views of the health of the economy, depending on which party is in power at the time, but their economic behavior does not seem to align with these reports (McGrath 2017). Another way to tease out insincere expressions of belief is to show that monetary incentives narrow the partisan gap significantly (Bullock et al. 2015; Prior, Sood, and Khanna 2015). On the other hand, bigger incentives offer an opportunity to send an even stronger signal of support for one’s side by reporting an insincere but expressive belief.[2]

Other approaches highlight the sheer incredibility of the content of some reported beliefs. Brian F. Schaffner and Samantha Luks (2018) took advantage of the then very recent controversy over “alternative facts”—a controversy that seemed to reinforce the post-truth narrative—to test the sincerity of partisans’ reported beliefs. The alternative facts controversy arose when Sean Spicer, Trump’s press secretary, claimed that Trump’s inauguration was attended by the biggest crowd in the history of such events. Kellyanne Conway, then a senior adviser to Trump, defended Spicer by saying that he was giving “alternative facts” (Bradner 2017). Inevitably, the false initial claim became the subject of heated debate, with many media outlets taking the opportunity to compare aerial photographs of the crowds at Obama’s 2009 inauguration and of Trump’s relatively small crowd.

Schaffner and Luks gave their participants these photos without any identifying information and asked which of them depicted a bigger crowd. Only 3 percent of Clinton voters chose the picture of the Trump crowd as bigger, compared to 15 percent of Trump voters. It is not plausible that any of these people were reporting a genuine belief, since the crowd sizes were obviously disparate. Instead, most were reporting their attitudes toward Trump and/or Clinton. The fact that better-educated Republicans were more likely to pick the Trump photo reinforces the conclusion: they were more likely to choose that photo because they were more likely to be aware of the controversy and to recognize the photo and the opportunity to engage in expressive responding (Ross and Levy 2023 report a replication of this finding).

What about those 3 percent of Clinton voters (and 2 percent of independents) who chose the Trump photo as depicting a bigger crowd? Expressive responding is an important driver of insincere belief reports, but it’s not the only one. Some people enjoy trolling experimenters and those conducting polls (Lopez and Hillygus 2018). The blogger Scott Alexander claims to identify a “lizardman constant”: a numerical representation of the proportion of people in the ordinary population who will answer “yes” to questions like “do lizardmen control the world?” (Alexander 2013). He places the constant at 4 percent. I doubt there is any such constant: the proportion of trollers will change depending on the population sampled, the questions asked, and the (perceived) identity of the questioners. For example, conservatives might troll pollsters if they suspect (perhaps rightly) that the questions are designed to show widespread irrationality on the right. We might suspect, too, that samples that skew younger, and uncompensated samples, are more likely to troll.

In addition, bizarre questions might increase the prevalence of trolling, because they may function as cues not to take the survey seriously. Do I think Barack Obama is the literal antichrist? Sure, why not? A cabal of child-sacrificing Satanists in Congress? Uh huh. For a number of different reasons, then, people may respond insincerely to polls and survey questions. They may also engage in similar behavior unsolicited: sharing conspiracy theories on Facebook, for example, to express support for Trump or Sanders, or “for the lulz” (Kunzru 2020).

It is reasonably well established that expressive responding and trolling of surveys occur. In addition, I suspect that there is a lot of sincere responding that nevertheless is not driven by genuine belief. It has been noted that people are more willing to endorse conspiracy theories and the like in low-stakes situations than in high-stakes situations, and that they therefore tend to be unwilling to bet or to act on these theories (Mercier 2020). This fact is usually interpreted as showing insincerity in the responses, but a different interpretation explains some of these instances: the incentive makes respondents scrutinize their responses more carefully, perhaps attending to their implications and to their relation to their (other) beliefs. As a consequence, they come to realize they don’t in fact believe these things; not really.

Genuine beliefs possess the properties Neil Van Leeuwen (2014) calls cognitive governance and evidential vulnerability. That is, they drive cognition, and therefore behavior, across all contexts in which their content is relevant; and they respond to supportive or undermining evidence, with the person becoming more or less confident in response to such evidence. When a reported belief fails to exhibit these properties, it is not a genuine belief, even if the person takes it to be. Many sincere belief reports concerning conspiracy theories almost certainly aren’t driven by genuine beliefs.

Why would people take themselves to believe conspiracy theories? As their use in films and on television (as in The X-Files, Capricorn One, JFK, The Americans, Homeland, The Bourne Identity, and many more) suggests, conspiracy theories are fun (Blattberg 2021). So are ghost stories and tales of the paranormal. People enjoy these theories, and they enjoy pretending they’re true. We may easily become absorbed in such pretense, to such a degree that we lose track of the fact that it is pretense. Children almost never mistake their pretenses for reality (Weisberg 2013), but this is because reality pushes back against them. The child playing doctor with her stuffed toys is unlikely to mistake the inanimate giraffe for a real patient, but the adult absorbed in playing “Stolen Election” or “9/11 Inside Job” doesn’t experience any pushback from reality. Conspiracy theories, and paranormal theories too, are consistent with easily observed reality. They concern shadowy actors who cover their tracks, or intrinsically hard-to-detect entities, not obtrusive facts.

Of course, the person playing “conspiracy theory” will almost certainly be aware of pushback from media and experts; no doubt, this is a big part of the reason why people with low trust in mainstream sources are far more likely to endorse such theories (Bruder and Kunert 2022; Douglas et al. 2019). While we might all sometimes get absorbed in playing “conspiracy theory” or “paranormal phenomena” while watching the History Channel, for example, higher-trust individuals will tend to snap out of such play in the face of pushback. As Joseph E. Uscinski and Joseph M. Parent (2014) remind us, “conspiracy theories are for losers”: the beneficiaries of social arrangements are more likely to reject such theories because they are more likely to trust official sources. Those who have low trust in the institutions that push back against conspiracy theories are more likely to remain absorbed in them, and low trust is often explained by the fact that those who manifest it have fewer reasons to trust official sources. It is because higher-SES individuals tend to have higher trust in official sources that we (the people more likely to read this essay) are less likely to espouse bad or bizarre beliefs.[3] This is not in virtue of any special rationality or virtue of ours. And we are not immune to “epistemic vice”: especially if a bizarre belief doesn’t matter to how our life goes, and we are not forced to stake anything on it, we too might come to espouse it even if in fact we don’t hold it.

People might, and probably sometimes do, begin in pretense but gradually come to believe the theory that absorbs them. For many people, though, their endorsement remains pretense, whether they realize it or not. Consider the kinds of “evidence” people often cite for conspiracy theories. The Wayfair conspiracy theory (according to which the online retailer was involved in the trafficking of children) began when someone pointed out on Twitter that the company’s website featured high-priced storage cabinets with girls’ names. This was taken as evidence that the website was a front, and that by buying the cabinets one was purchasing a trafficked child (Spring 2020). Think, too, of the many Covid-19 conspiracy theories that cited as evidence such facts as that “media control” is an anagram of “delta omicron” (Plummer 2021) and that “omicron” is an anagram of “moronic” (O’Rourke 2021), or even the plots of movies as evidence that the pandemic was planned (Sardarizadeh 2021). It is more likely that those who cite this kind of fact are playing with evidence, rather than genuinely engaging with it. Think, too, of how highly gamified QAnon is, with its (entirely unmotivated) use of codes, ciphers, and community-wide attempts at decipherment (Berkowitz 2021; Thompson 2020; Zadrozny and Collins 2018). A number of people have argued that QAnon is unusual among conspiracy theories in being gamified, but that’s false: what is unusual is the degree of centralization of the role of gamemaster. In older conspiracy theories, anyone could play that role, as well as the role of code breaker (see Levy forthcoming for further discussion).
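The anagram claims, at least, are easy to verify; a minimal sketch in Python (my own illustration, not anything drawn from the sources cited) confirms that the letters do match:

```python
def is_anagram(a: str, b: str) -> bool:
    """True if the two phrases contain exactly the same letters, ignoring spaces and case."""
    def letters(s: str) -> list[str]:
        return sorted(s.replace(" ", "").lower())
    return letters(a) == letters(b)

# The two anagram claims cited in the text:
print(is_anagram("media control", "delta omicron"))  # True
print(is_anagram("omicron", "moronic"))              # True
```

That the check succeeds is, of course, evidence of nothing beyond coincidence; the point is precisely that treating such coincidences as evidence is a form of play.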

Genuine Belief

When we read breathless headlines like “One in Four Americans Believe that Barack Obama May Be the Antichrist” (Harris 2013), we should bear in mind the prevalence of expressive responding, trolling, and pretense.[4] However, there is no doubt that many people genuinely believe things that are wildly at variance with prevailing expert opinion. Consider climate change. Many people who report believing that it is a hoax are probably not reporting a sincere belief, but there can be little doubt that they don’t accept the strong scientific consensus. If they believed that climate change was the serious threat it actually is, they would not allow themselves the luxury of engaging in such games. Are their bad beliefs the product of a post-truth culture?

No: the prevalence of bad beliefs arises from universal and ancient human dispositions, not from a recent change in our relationship to truth. Human beings are epistemically social animals: most of what we know, we know on the basis of testimony. I know about the existence of cities I’ve never visited and about historical events that occurred before my birth on the basis of testimony. Equally, most of my scientific beliefs (my belief in the theory of evolution, for example) depend on testimony from those I take to be experts on the topic. Our reliance on testimony does not reflect our epistemic limitations, as though testimony were second best. Testimony is the only way anyone can acquire most knowledge about complex and difficult-to-observe phenomena. Our pervasive, flexible, and intelligent use of testimony allows for the division of cognitive labor, and that, in turn, allows us to understand the world around us.

Science, too, is heavily dependent on testimony. Contemporary science is complex, and the working scientist has first-hand experience of only a very narrow slice of it. For the rest, she is dependent on testimony: the tools (physical and conceptual) that she uses are inherited from others in the past, and her current work almost certainly relies heavily on input from others. All of us, even scientists, are pervasive outsourcers of cognition: we rely on the world to provide much of the content of our beliefs (Chater 2018; Rabb, Fernbach, and Sloman 2019), and we do so rationally. Outsourcing ensures that our beliefs track reality flexibly and rapidly, much more so than they could were we to rely on internal representations alone, since it allows the very facts that make our beliefs true to partially constitute them.

Human beings have always been heavily dependent on testimony for flourishing in the world. Cultural evolution has been the evolution of epistemic tools and dispositions as much as of practices (see Boyd et al. 2011; Henrich 2015; Henrich and Boyd 1998). With this epistemic dependence goes epistemic vulnerability: other people may seek to take advantage of us, or may simply be unreliable. We therefore filter testimony using cues for reliability and benevolence (Levy 2019a). We are sensitive to evidence of an informant’s past unreliability, evidence that an informant shares our values, evidence that the testimony represents majority opinion (which is more likely to be correct), and evidence of expertise (Harris 2012; Mascaro and Sperber 2009; Sperber et al. 2010).

Nevertheless, the acceptance of bad beliefs is to be expected of rational agents under certain conditions. Climate-change skepticism is most prevalent among political conservatives (Funk and Kennedy 2020), and this group also has the lowest trust in science more generally (Gauchat 2012) and in universities (Pew Research Center 2017). Republicans identify science and universities with the left; increasingly, this identification is accurate. While left dominance of the universities dates back to before World War II, it was largely confined to the social sciences and humanities until relatively recently; since the 1970s, it has increased significantly across all areas (Gross 2013). The growth in left dominance might in part be due to a feedback loop: Republican mistrust of these institutions filters out conservatives and causes those within them to respond to the hostility by moving away from conservatism. The net result is that conservatives do not see academics as benevolent, and therefore don’t trust their opinions. They therefore seek alternative sources: apparent experts (or news sources they take to be sufficiently in touch with such experts) who share their values.

One possible explanation for low trust in climate science on the right is “solution aversion” (Campbell and Kay 2014): solutions to the problems posed by climate change seem inevitably to involve interference with free markets (Bardon 2019; Keller 2015). I am skeptical: ideologies are flexible, and conservatives can value (and have valued) environmental protection. I suspect the perceived conflict is more a consequence than a cause of polarization, although not an epiphenomenal consequence. Instead, polarization may at least partly have been engineered by corporations and their paid surrogates. I suspect, that is, that merchants of doubt are central to the story (Oreskes and Conway 2011). Regardless, for the conservative today, the rejection or downplaying of climate science is typically rational. No more than any other laypeople are conservatives in a position to assess the science for themselves; indeed, attempts to assess the evidence, even by scientists, are always themselves heavily dependent on trust and testimony. Instead, we are all dependent on the say-so of trusted sources. Rational agents do not arrogantly “do their own research”; they defer. It is no surprise that conservatives tend to downplay climate change when those whom they trust tell them that it is a hoax. Indeed, better-educated Republicans, and those who know more about science, are more, not less, likely to reject the science (Kahan 2015). More sophistication entails a better sense of whom to defer to.[5]

There are many bad beliefs out there, and many endorsements of bad beliefs. There’s no reason to think that they arise from contempt or disregard for truth. Bad beliefs arise from rational processes: they reflect our cognitive processes working as they’re supposed to—given bad inputs. A misinformed rational agent will believe badly. Insincere responses, too, need not reflect a lack of concern for truth, and thus need not reflect irrationality. Those who troll may rationally believe that they don’t owe their interlocutors the truth. Moreover, a proper regard for truth leaves a great deal of space for play, and play is a valuable part of human life. Those who come to endorse bizarre beliefs as a consequence of play lack the feedback that would keep them grounded. They have the bad fortune to live in a social world in which the cues that play rational roles for them are not well calibrated. That is not their fault: it arises from their social setting, not from their attitudes to evidence, science, facts, or truth.

Prevalence and Problems

Why, then, is there a widespread impression that we live in a post-truth age? In part, the impression arises from incredulity that rational agents could vote for Trump or Brexit, or could reject consensus science. In part, the post-truth narrative is satisfying because it flatters us and holds our opponents up to ridicule. But another part of the explanation, I suspect, is the impression that expressions of bad belief are much more prevalent today than they once were. Why would that be?

Part of the explanation for an increase in prevalence may be the influence of social media and the internet more generally. The expression of ridiculous opinions may go viral, triggering mockery from the left and a defensive reaction from the right, where some may endorse these opinions expressively or get lost in pretending that they’re true. The virtual public sphere provides far more opportunities than ever before for expressive responding, for trolling, for playing, and far more opportunities for such activities to reach a wide audience. We are left with a polarized impression of each other that does not reflect our actual degree of polarization (Hannon 2021). At the same time, genuine bad beliefs may increase in prevalence as people become more aware of what their side—those who share their values—believes.

I have painted a picture in which the endorsement of bizarre beliefs is often insincere, while genuine bad beliefs arise from rational processes. Is this a more optimistic picture of current political reality than the post-truth account? At least in some ways it is. First, it might contribute to a reduction in political temperature and thereby open up hope for productive dialogue: our opponents are not less rational than we are, nor less concerned with the truth. Second, it reveals how minds can be changed, sometimes surprisingly rapidly: alter the institutional cues for belief, bring people to trust more reliable sources, and minds can change almost at once (Levy 2021b). On the other hand, many of the tools beloved by philosophers, such as teaching critical thinking, are unlikely to be of much help.

But we should not underestimate either the harms of bad beliefs and bad belief endorsements or the difficulty of addressing these harms. We are on the brink of—or are already experiencing—genuinely catastrophic climate change, and the failure of voters to support meaningful action to address it is partly attributable to the bad beliefs that climate change is not happening, or is not anthropogenic, or cannot be addressed.[6] Vaccine uptake is lower than it should be, and this, too, reflects bad beliefs. The endorsement of bad beliefs is also consequential: it gives rise to distortions in the epistemic landscape, causing polarization and probably contributing to bad beliefs (the person who repeatedly reads that the Democrats are Satanists may not come to believe the claim, but may come to believe that the Democrats are up to something nefarious). Such endorsements distract us from the real problems and pollute the epistemic environment so badly that it becomes difficult to identify reliable information.

Low trust in actually reliable sources is explanatorily central to bad beliefs, trolling, and the ludic acceptance of conspiracy theories. But it is difficult to restore trust when it is gone. It is especially difficult when bad actors—merchants of doubt and their allies—work to ensure that trust is kept at a low level. During the Covid pandemic, we saw that many people are ready to amplify the errors and ambiguities of epistemic authorities, and even to engage in outright deceit. An atmosphere of distrust is highly unconducive to eliciting trustworthy behavior. Scientists who are not trusted understandably find it difficult to venture into political territory that is hostile to them, and may come to feel contempt for those they see as obtuse.

Since testimony and the structure of the epistemic landscape—the distribution of sources and of trust in sources—are so central to our functioning as epistemic agents, a solution to our epistemic predicament will focus on this environment. Today, we live in an epistemically polluted world (Levy 2018): the very cues for epistemic reliability (credentials, consensus, track record, and so on) are widely mimicked and distorted by those with an interest in continued bad belief. Cleaning up that epistemic environment is a central task for those who would address our epistemic predicament and (thereby) some of our biggest challenges. Whether we can rise to this challenge remains an open question.[7]

Notes

1 There is a debate within philosophy of psychiatry concerning whether religious beliefs should be considered delusions: the diagnostic manual of the American Psychiatric Association specifically excludes them. A religious belief is not delusional when—as the DSM says—it is “one ordinarily accepted by other members of the person’s culture or subculture.” We need to bear this in mind when we attempt to categorize beliefs as bizarre or not. A belief is not bizarre if it is widely enough accepted within the subculture the person belongs to. I doubt that lizardpeople believers genuinely identify with a belief-sharing culture sufficiently strongly to render their belief non-bizarre, though of course such a community could emerge. In that case, their belief would be bad and not bizarre.

2 Some philosophers and cognitive scientists have argued that a signaling function explains many of our beliefs (Funkhouser 2017; Ganapini 2021; Mercier 2020; Sterelny 2018). These positions are (usually but not always) orthogonal to the expressive-responding literature: they do not claim that people report beliefs they don’t in fact hold in order to support their side, but rather that the signaling function explains why we come to hold some of our genuine beliefs.

3 It’s worth noting, though, that while we may not espouse bizarre beliefs, many of our cognitive representations might fail to be genuine beliefs. Many people espouse beliefs that lack any determinate content, and that therefore can’t guide their actions, because they are not capable of providing this content. For example, many people assent to the claim that “evolution is true,” but even those who have had some college education about evolution tend to have vague or inaccurate beliefs about what evolution actually is. It may be that all of us have representations of moral and political claims (the equality of all human beings, inalienable rights) that play the same sort of role in our cognition as conspiracy theories play in the cognition of those who espouse them. Such claims, when factual and empirical, are distinguished from bad and bizarre beliefs not by their cognitive underpinnings or by the role they play in cognition, but by the fact that they’re widely accepted by epistemic authorities.

4 It is also worth bearing in mind that belief reports may be inflated by bad survey design. People tend to be reluctant to report ignorance, and therefore might report believing in a conspiracy theory they have never actually heard of before. Provision of a “skip” option lowers the proportion of beliefs reported in comparison to a “don’t know” option (Motta et al. 2019).

5 Philosophers, psychologists, and educated liberal opinion generally tend to think that conservative rejection of climate science arises from motivated reasoning: conservatives reject the science because they are more biased than liberals, or perhaps because their biases kick in on these topics specifically (Bardon 2019). While psychological biases are no doubt real, they play a smaller role in cognition than is generally thought. In fact, much of the evidence that is commonly cited as showing the influence of bias actually shows our dependence on testimony. Many of the experiments inadvertently embed implicit testimony, and participants respond accordingly. Framing effects, for example, function by implicitly recommending certain options (see Levy 2019b for discussion).

6 It might seem that we can avoid attributing a false belief to those who espouse climate change denial through a combination of Dan Kahan’s (2015) view that climate change denial is rational because whether or not it is true makes no difference to how one’s life goes day to day, and Hugo Mercier’s (2020) plausible claim that when nothing is at stake for agents, their belief reports are often signals and not veridical. Most of us are not in a position to determine or even measurably affect policy, after all. What about policy makers? They may reasonably believe that they, too, can’t measurably affect whether or how rapidly climate change occurs, because they have little control over the emissions of China, India, and other developing economies. While I am confident that these facts make a difference to how easy climate change denial is to embrace, I doubt they can enable us to avoid attributing genuine (but false) beliefs to many of those who espouse climate-change denial. Most people have a stake in the future, and were they to grasp the urgency of cutting emissions, I find it implausible that they wouldn’t attempt to put pressure on legislators to solve the collective action problem gripping the world. Moreover, whether or not we succeed in reining in emissions, confronting climate change requires enormous investments in adaptation and mitigation. Policy makers who espouse inaction either genuinely believe what they say, or they care less for the wellbeing of their children than they do for playing games. I doubt they’re quite so heartless. I am grateful to Jeffrey Friedman for pressing me to think through this question.

7 I am grateful to the Australian Research Council (DP180102384) and the Arts and Humanities Research Council (AH/W005077/1) for their generous support of this research. I am especially grateful to Jeffrey Friedman for extensive comments on an earlier version.

REFERENCES

  • Alexander, Scott. 2013. “Lizardman’s Constant Is 4%.” Slate Star Codex. https://slatestarcodex.com/2013/04/12/noisy-poll-results-and-reptilian-muslim-climatologists-from-mars/ (accessed August 23, 2021).
  • Bardon, Adrian. 2019. The Truth About Denial: Bias and Self-Deception in Science, Politics, and Religion. Oxford: Oxford University Press.
  • Baron, Jonathan, and John T. Jost. 2018. “False Equivalence: Are Liberals and Conservatives in the U.S. Equally ‘Biased’?” Perspectives on Psychological Science 14(2): 292–303.
  • Berkowitz, Reed. 2021. “QAnon Resembles the Games I Design. But for Believers, There Is No Winning.” Washington Post, May 11.
  • Blattberg, Charles. 2021. “Antisemitism and the Aesthetic.” Philosophical Forum 52(3): 189–210.
  • Boyd, Robert, Peter J. Richerson, and Joseph Henrich. 2011. “The Cultural Niche: Why Social Learning Is Essential for Human Adaptation.” Proceedings of the National Academy of Sciences 108 (Supplement 2): 10918–25.
  • Bradner, Eric. 2017. “Conway: Trump White House Offered ‘Alternative Facts’ on Crowd Size.” CNN, January 23.
  • Bruder, Martin, and Laura Kunert. 2022. “The Conspiracy Hoax? Testing Key Hypotheses about the Correlates of Generic Beliefs in Conspiracy Theories During the COVID-19 Pandemic.” International Journal of Psychology 57(1): 43–48.
  • Bullock, John G., Alan S. Gerber, Seth J. Hill, and Gregory A. Huber. 2015. “Partisan Bias in Factual Beliefs about Politics.” Quarterly Journal of Political Science 10(4): 519–78.
  • Campbell, Troy H., and Aaron C. Kay. 2014. “Solution Aversion: On the Relation Between Ideology and Motivated Disbelief.” Journal of Personality and Social Psychology 107(5): 809–24.
  • Cassam, Quassim. 2018. Vices of the Mind: From the Intellectual to the Political. Oxford: Oxford University Press.
  • Chater, Nick. 2018. The Mind Is Flat: The Illusion of Mental Depth and The Improvised Mind. New Haven: Yale University Press.
  • Douglas, Karen, Robbie M. Sutton, and Aleksandra Cichocka. 2019. “Belief in Conspiracy Theories: Looking beyond Gullibility.” In The Social Psychology of Gullibility: Conspiracy Theories, Fake News and Irrational Beliefs, ed. J. Forgas and R. Baumeister. London: Routledge.
  • Flood, Alison. 2016. “‘Post-Truth’ Named Word of the Year by Oxford Dictionaries.” The Guardian, November 15.
  • Friedman, Jeffrey. 2019. “Populists as Technocrats.” Critical Review 31(3-4): 315–76.
  • Funk, Cary, and Brian Kennedy. 2020. “How Americans See Climate Change in 5 Charts.” Pew Research Center, April 21.
  • Funkhouser, Eric. 2017. “Beliefs as Signals: A New Function for Belief.” Philosophical Psychology 30(6): 809–31.
  • Ganapini, Marianna. 2021. “The Signaling Function of Sharing Fake Stories.” Mind & Language.
  • Gauchat, Gordon. 2012. “Politicization of Science in the Public Sphere: A Study of Public Trust in the United States, 1974 to 2010.” American Sociological Review 77(2): 167–87.
  • Gross, Neil. 2013. Why Are Professors Liberal and Why Do Conservatives Care? Cambridge, Mass.: Harvard University Press.
  • Hannon, Michael. 2021. “Disagreement or Badmouthing? The Role of Expressive Discourse in Politics.” In Political Epistemology, ed. Elizabeth Edenberg and Michael Hannon. Oxford: Oxford University Press.
  • Harris, Paul. 2012. Trusting What You’re Told. Cambridge, Mass.: Harvard University Press.
  • Harris, Paul. 2013. “One in Four Americans Think Obama May Be the Antichrist, Survey Says.” The Guardian, April 2.
  • Hazlett, Allan. 2022. “Populism, Expertise, and Intellectual Autonomy.” In Engaging Populism: Democracy and the Intellectual Virtues, ed. Gregory R. Peterson, Michael C. Berhow, and George Tsakiridis. Cham: Springer International Publishing.
  • Henrich, Joseph. 2015. The Secret of Our Success: How Culture Is Driving Human Evolution, Domesticating Our Species, and Making Us Smarter. Princeton: Princeton University Press.
  • Henrich, Joseph, and Robert Boyd. 1998. “The Evolution of Conformist Transmission and the Emergence of Between-Group Differences.” Evolution and Human Behavior 19(4): 215–41.
  • Kahan, Dan M. 2015. “Climate-Science Communication and the Measurement Problem.” Political Psychology 36(s1): 1–43.
  • Kahneman, Daniel, and Gary Klein. 2009. “Conditions for Intuitive Expertise: A Failure to Disagree.” American Psychologist 64(6): 515–26.
  • Keller, Simon. 2015. “Empathizing with Scepticism about Climate Change.” In Climate Change and Justice, ed. Jeremy Moss. Cambridge: Cambridge University Press.
  • Kunzru, Hari. 2020. “For the Lulz.” New York Review of Books 67(5): 4–8.
  • Levy, Neil. 2018. “Taking Responsibility for Health in an Epistemically Polluted Environment.” Theoretical Medicine and Bioethics 39: 123–41.
  • Levy, Neil. 2019a. “Due Deference to Denialism: Explaining Ordinary People’s Rejection of Established Scientific Findings.” Synthese 196(1): 313–27.
  • Levy, Neil. 2019b. “Nudge, Nudge, Wink, Wink: Nudging Is Giving Reasons.” Ergo: An Open Access Journal of Philosophy 6.
  • Levy, Neil. 2021a. Bad Beliefs: Why They Happen to Good People. Oxford: Oxford University Press.
  • Levy, Neil. 2021b. “Not So Hypocritical After All: Belief Revision Is Adaptive and Often Unnoticed.” In Empirically Engaged Evolutionary Ethics, ed. Johan De Smedt & Helen De Cruz. Cham: Springer International Publishing.
  • Levy, Neil. Forthcoming. “Conspiracy Theories as Serious Play.” Philosophical Topics.
  • Levy, Neil, and Robert M. Ross. 2021. “The Cognitive Science of Fake News.” In The Routledge Handbook of Political Epistemology, ed. Michael Hannon and Jeroen de Ridder. London: Routledge.
  • Lopez, Jesse, and D. Sunshine Hillygus. 2018. Why So Serious? Survey Trolls and Misinformation. Social Science Research Network, March 19.
  • Mance, Henry. 2016. “Britain Has Had Enough of Experts, Says Gove.” Financial Times, June 3.
  • Mascaro, Olivier, and Dan Sperber. 2009. “The Moral, Epistemic, and Mindreading Components of Children’s Vigilance towards Deception.” Cognition 112(3): 367–80.
  • McGrath, Mary C. 2017. “Economic Behavior and the Partisan Perceptual Screen.” Quarterly Journal of Political Science 11(4): 363–83.
  • Mercier, Hugo. 2020. Not Born Yesterday: The Science of Who We Trust and What We Believe. Princeton: Princeton University Press.
  • Motta, Matthew, Daniel Chapman, Dominik Stecula, and Kathryn Haglin. 2019. “An Experimental Examination of Measurement Disparities in Public Climate Change Beliefs.” Climatic Change 154(1): 37–47.
  • Nickerson, Raymond S. 1998. “Confirmation Bias: A Ubiquitous Phenomenon in Many Guises.” Review of General Psychology 2(2): 175–220.
  • Oksman, Olga. 2016. “Conspiracy Craze: Why 12 Million Americans Believe Alien Lizards Rule Us.” The Guardian, April 7.
  • Oreskes, Naomi, and Erik M. Conway. 2011. Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming. London: Bloomsbury.
  • O’Rourke, Ciara. 2021. “The New Coronavirus Variant Is Named for a Letter in the Greek Alphabet.” Politifact, November 30.
  • Oxford University Press. 2022. “Word of the Year: FAQs.” https://languages.oup.com/word-of-the-year-faqs/ (accessed November 5, 2022).
  • Pennycook, Gordon, and David G. Rand. 2019. “Lazy, Not Biased: Susceptibility to Partisan Fake News Is Better Explained by Lack of Reasoning than by Motivated Reasoning.” Cognition 188: 39–50.
  • Pew Research Center. 2017. “Sharp Partisan Divisions in Views of National Institutions.” Pew Research Center for the People and the Press, July 10.
  • Plummer, Kate. 2021. “Lord Ashcroft Posted a Bizarre Covid Anagram and It Immediately Backfired.” indy100, December 19.
  • Prior, Markus, Gaurav Sood, and Kabir Khanna. 2015. “You Cannot Be Serious: The Impact of Accuracy Incentives on Partisan Bias in Reports of Economic Perceptions.” Quarterly Journal of Political Science 10(4): 489–518.
  • Rabb, Nathaniel, Philip M. Fernbach, and Steven A. Sloman. 2019. “Individual Representation in a Community of Knowledge.” Trends in Cognitive Sciences 23(10): 891–902.
  • Rose, Joel. 2020. “Even If It’s ‘Bonkers,’ Poll Finds Many Believe QAnon And Other Conspiracy Theories.” NPR, December 30.
  • Ross, Robert M., and Neil Levy. 2023. “Expressive Responding in Support of Donald Trump: An Extended Replication of Schaffner and Luks (2018).” Collabra: Psychology 9(1): 68054.
  • Sardarizadeh, Shayan. 2021. “I Am Legend Screenwriter Dismisses Anti-Vax Claims Based on Film’s Plot.” BBC News, August 10.
  • Schaffner, Brian F., and Samantha Luks. 2018. “Misinformation or Expressive Responding? What an Inauguration Crowd Can Tell Us about the Source of Political Misinformation in Surveys.” Public Opinion Quarterly 82(1): 135–47.
  • Snyder, Timothy. 2021. “The American Abyss.” New York Times, January 9.
  • Sperber, Dan, Fabrice Clément, Christophe Heintz, Olivier Mascaro, Hugo Mercier, Gloria Origgi, and Deirdre Wilson. 2010. “Epistemic Vigilance.” Mind & Language 25(4): 359–93.
  • Spring, Marianna. 2020. “Wayfair: The False Conspiracy about a Furniture Firm and Child Trafficking.” BBC News, July 15.
  • Stanovich, Keith E. 2021. The Bias That Divides Us: The Science and Politics of Myside Thinking. Cambridge, Mass.: MIT Press.
  • Sterelny, Kim. 2018. “Religion Re-Explained.” Religion, Brain & Behavior 8(4): 406–25.
  • Thompson, Clive. 2020. “QAnon Is Like a Game—A Most Dangerous Game.” Wired, September 22.
  • Uscinski, Joseph E., and Joseph M. Parent. 2014. American Conspiracy Theories. New York: Oxford University Press.
  • Van Leeuwen, Neil. 2014. “Religious Credence Is Not Factual Belief.” Cognition 133(3): 698–715.
  • Weisberg, Deena Skolnick. 2013. “Distinguishing Imagination from Reality.” In The Oxford Handbook of the Development of Imagination, ed. Marjorie Taylor. Oxford: Oxford University Press.
  • Zadrozny, Brandy, and Ben Collins. 2018. “How Three Conspiracy Theorists Took ‘Q’ and Sparked Qanon.” NBC News, August 14.
  • Zimmer, Ben. 2010. “Truthiness.” New York Times Magazine, October 17.