It’s Our Epistemic Environment, Not Our Attitude Toward Truth, That Matters

ABSTRACT The widespread conviction that we are living in a post-truth era rests on two claims: that a large number of people believe things that are clearly false, and that their believing these things reflects a lack of respect for truth. In reality, however, fewer people believe clearly false things than surveys or social media suggest. In particular, relatively few people believe things that are widely held to be bizarre. Moreover, accepting false beliefs does not reflect a lack of respect for truth. Almost everyone’s beliefs are explained by rationally warranted trust in some sources rather than others. This allows us to explain why people have false beliefs.

According to the post-truth narrative, public discourse is now characterized by a rejection of a concern for truth in favor of what Stephen Colbert presciently called "truthiness" (Zimmer ). In our post-truth era, it is thought, facts take a backseat to what feels right or what is convenient to believe. When we give up on facts and on a common reality, we replace debate and compromise with naked power. "Post-truth is pre-fascism," Timothy Snyder () warns. The epistemological stakes might never have been greater.
For all that it warns us about the gravest of risks, the post-truth narrative is in many ways a comforting one to those who espouse it. In the post-truth view, we are on the side of truth and reason; they are strangers to these things. We are members of the reality-based community; they indulge in fantasy. Our politics follows the science and accepts facts; theirs rejects facts for feelings. The very fact that the narrative is so flattering to us should make us worry that it's an instance of something very like the phenomenon it condemns: if not the rise of truthiness, exactly, at least the reign of confirmation bias. Perhaps we accept the post-truth narrative because it speaks so well of us.
In line with that possibility is the fact that the word of the year is a marketing device, and that its selection is not based on any scientific or systematic method (Oxford University Press ). And Michael Gove was quoted out of context: he said that the British people were tired of "experts from organizations with acronyms saying that they know what is best and getting it consistently wrong" (my emphasis; see Friedman  and Hazlett  for discussion). In other words, Gove was not challenging the relevance of truth or the possibility of expertise; he was criticizing, as false, the truth-claims of a specific group of experts who had weighed in against Brexit but whose expertise was, in his view, bogus.
The post-truth narrative is wrong. For all sides, beliefs remain constrained by a concern with truth. There are, nevertheless, serious concerns about contemporary politics that can usefully be discussed by focusing on what people believe and on their epistemic actions (that is, their behavior within the sphere of knowledge and belief: gathering and sharing information, for example). If we are to address these genuine concerns, we must first identify the nature of the problems. The true picture is less flattering to any political "side," but just for that reason might present less of an obstacle to establishing the ground rules for a consensus reality.
Here, in brief, is the true picture. Almost everyone continues to form and revise their beliefs in the light of genuine evidence, but the evidence to which beliefs respond is subjective evidence, not objective. That should be obvious and familiar to anyone who's ever watched or read crime fiction. Think of plots where the real criminal has planted evidence to implicate an innocent person: the detective might be wrong but rational in arresting the victim of the scheme. Evidence that one rationally interprets as supporting a certain conclusion may nevertheless mislead, because what seems true may not be.
False beliefs are typically owed to trust in unreliable sources. We all need help in sorting through the evidence, but some of us are objectively unlucky in the help we receive. It is true that people often report beliefs that are not supported by good evidence, even when "evidence" is understood subjectively; often, however, this is because people report beliefs they don't in fact hold. I'll focus on that phenomenon, endorsement without belief, in the next section, before returning in the following section to genuine beliefs that are objectively false but supported by subjective evidence.

Endorsement without Belief
Let me begin with rough definitions of three terms I will use throughout this essay. A belief is a mental representation: a representation of the way an agent takes some part of the world to be. We have all sorts of beliefs, from mundane beliefs about whether the cat has been fed to momentous beliefs about the existence of God or of moral facts. Most beliefs are truth-apt: they can be assessed for truth, where a belief is true just in case the world is the way the belief represents it as being (the cat has been fed). Some beliefs may not be truth-apt; perhaps there's no way for the world to be such that a painting is beautiful. I set those beliefs aside. Beliefs are bad when they conflict with the actually reliable opinion of epistemic authorities (Levy a).
I won't try to clarify what makes someone an "epistemic authority" or what makes their beliefs "reliable," beyond saying that reliability is owed, in very significant part, to the social structure of a mode of inquiry: someone is an epistemic expert if they are appropriately educated and enculturated into an epistemic community, and that community has the right kind of social structure (exemplified by the structure of scientific communities). (In addition, the area of inquiry has to be expertise-conducive: in some areas there are many confounds and/or feedback is too slow for expertise to be reliable [Kahneman and Klein ].) A bad belief is one that conflicts with the consensus of such epistemic authorities. There is more scientific consensus, understood as I am using the term, than many think: many scientific questions are highly contested, but there's nevertheless a consensus on what responses to these questions are reasonable.
Finally, I rely on an intuitive sense of what makes a belief bizarre. Lots of beliefs are bad without being bizarre. Climate change denial is not bizarre, nor is the claim that vaccines cause autism. We can easily understand what it would be like for the world to be such that those beliefs were true, and such beliefs are consistent with our background beliefs about the natural and social world. However, the belief that scientists around the world are colluding to make it appear as though climate change were real is at least somewhat bizarre. The belief that world leaders are lizards in human form (a genuine conspiracy theory, allegedly believed by  million Americans [Oksman ]) is certainly bizarre. The world would have to be very different, in multiple ways, from the way it actually is for that belief to be true.

The challenge for someone who does not believe that we live in a post-truth era is to explain why bad and bizarre beliefs are widely held. Obviously, however, the challenge is greater with regard to bizarre beliefs than to bad ones: we can easily see how people might end up with bad beliefs. Such beliefs are common, and none of us should be confident that we are free of them. As for bizarre beliefs, I will argue that they do not represent the challenge they seem to, because they are much rarer than is widely thought. In fact, many of the people we dismiss as irrational because they hold bizarre beliefs not only fail to believe bizarre things; they have much the same sense that we do as to what makes a belief bizarre.
Undeniably, people endorse bizarre beliefs. Trump first came to prominence in politics as a proponent of the "birther" conspiracy theory, according to which Barack Obama was not born in the United States. That is not a bizarre conspiracy theory (although it becomes more and more outlandish as it implicates more and more people and agencies), but Trump went on to hint at support for genuinely bizarre conspiracies, and many of his followers embraced them. One poll reported that more than a quarter of Americans believed that Obama was or might be the antichrist, and that  percent believed that global warming is a hoax (Harris ). An NPR/Ipsos poll found that one third of Americans believe that Joe Biden engaged in widespread fraud to win the  election, and that  percent believe that "a group of Satan-worshipping elites who run a child sex ring are trying to control our politics and media" (Rose ). Seventeen percent would amount to  million Americans. How can we explain this degree of susceptibility to bizarre beliefs if we're not living in a post-truth era?
According to the post-truth narrative, people may come to hold beliefs like these because their beliefs are no longer constrained by evidence. According to a softer view, much more commonly accepted in academic circles, people come to hold bizarre beliefs because human belief formation is powerfully influenced by motivated cognition: a disposition to accept claims that are congenial to us over those that are uncongenial. Confirmation bias (or myside bias; the differences between them are irrelevant in this context) is supposed to be a central mechanism for motivated believing (Nickerson ; Stanovich ). This bias leads to an asymmetrical treatment of evidence: we carefully scrutinize evidence that conflicts with our prior attitudes and beliefs, looking for reasons to reject it, but we apply only the lightest of touches to evidence that is congenial to our views. On this latter approach, the acceptance of bad or bizarre beliefs arises from lazy (Pennycook and Rand ) or biased thinking (Baron and Jost ), or epistemic vice (Cassam ).
However, while there are many bad, and some bizarre, beliefs out there, they are less common than we tend to think. People quite often espouse beliefs they do not in fact hold. There is extensive evidence that many reports of partisan beliefs (that is, beliefs congenial to one political party or ideology, producing a partisan gap in reported belief) are insincere (Hannon ; Levy and Ross ). These reports are not expressions of belief at all. Rather, they are expressions of support for one side that take the form of an expression of belief. Accordingly, the phenomenon is often called expressive responding (Berinsky ; Bullock et al. ).
A person's beliefs are her map of the world: their function is the guidance of behavior. Evidence that behavior departs systematically from professed belief is evidence that the reported belief is not genuine. We suspect that the person who claims to value altruism but usually acts selfishly is trying to deceive us, or perhaps to deceive themselves. Similarly, Republicans and Democrats report divergent views of the health of the economy, depending on which party is in power at the time, but their economic behavior does not seem to align with these reports (McGrath ). Another way to tease out insincere expressions of belief is to show that monetary incentives narrow the partisan gap significantly (Bullock et al. ; Prior, Sood, and Khanna ). On the other hand, bigger incentives offer an opportunity to send an even stronger signal of support for one's side by reporting an insincere but expressive belief.  Other approaches highlight the sheer incredibility of the content of some reported beliefs. Brian F. Schaffner and Samantha Luks () took advantage of the then very recent controversy over "alternative facts"-a controversy that seemed to reinforce the post-truth narrative -to test the sincerity of partisans' reported beliefs. The alternative facts controversy arose when Sean Spicer, Trump's press secretary, claimed that Trump's inauguration was attended by the biggest crowd in the history of such events. Kelly-Anne Conway, a Trump campaign strategist, defended Spicer by saying that he was giving "alternative facts" (Bradner ). Inevitably, the false initial claim became subject to a heated debate, with many media outlets taking the opportunity to compare aerial photographs of the crowds at Obama's  inauguration and of Trump's relatively small crowd.
Schaffner and Luks gave their participants these photos without any identifying information and asked which of them depicted a bigger crowd. Only  percent of Clinton voters chose the picture of the Trump crowd as bigger, compared to  percent of Trump voters. It is not plausible that any of these people were reporting a genuine belief, since the crowd sizes were obviously disparate. Instead, most were reporting their attitudes toward Trump and/or Clinton. The fact that better-educated Republicans were more likely to pick the Trump photo reinforces the conclusion: they were more likely to choose the Trump photo because they were more likely to be aware of the controversy, and hence to recognize the photo and the opportunity to engage in expressive responding (Ross and Levy  report a replication of the finding).
What about those  percent of Clinton voters (and  percent of independents) who chose the Trump photo as depicting a bigger crowd? Expressive responding is an important driver of insincere belief report, but it's not the only one. Some people enjoy trolling experimenters and those conducting polls (Lopez and Hillygus ). The blogger Scott Alexander claims to identify a "lizardman constant": a numerical representation of the proportion of people in the ordinary population who will answer "yes" to questions like "do lizardmen control the world?" (Alexander ). He places the constant at  percent. I doubt there is any such constant: the proportion of trollers will change depending on the population sampled, the questions asked and the (perceived) identity of the questioners. For example, conservatives might troll pollsters if they suspect (perhaps rightly) that the questions are designed to show widespread irrationality on the right. We might suspect, too, that samples that skew younger, and uncompensated samples, are more likely to troll.
In addition, bizarre questions might increase the prevalence of trolling, because they may function as cues not to take the survey seriously. Do I think Barack Obama is the literal antichrist? Sure, why not? A cabal of child-sacrificing Satanists in Congress? Uh huh. For a number of different reasons, then, people may respond insincerely to polls and survey questions. They may also engage in similar behavior unsolicited: sharing conspiracy theories on Facebook, for example, to express support for Trump or Sanders, or "for the lulz" (Kunzru ).
It is reasonably well established that expressive responding and the trolling of surveys occur. In addition, I suspect that there is a lot of sincere responding that nevertheless is not driven by genuine belief. It has been noted that people are more willing to endorse conspiracy theories and the like in low-stakes situations than in high-stakes situations, and that they therefore tend to be unwilling to bet or to act on these theories (Mercier ). This fact is usually interpreted as showing insincerity in the responses, but a different interpretation explains some of these instances: the incentive leads people to scrutinize their responses more carefully, perhaps attending to their implications and to their relation to their (other) beliefs. As a consequence, they come to realize they don't in fact believe these things; not really.
Genuine beliefs possess the properties Neil Van Leeuwen () calls cognitive governance and evidential vulnerability. That is, they drive cognition, and therefore behavior, across all contexts in which their content is relevant; and they respond to supportive or undermining evidence, with the person becoming more or less confident in response to such evidence. When a reported belief fails to exhibit these properties, it is not a genuine belief, even if the person takes it to be. Many sincere belief reports concerning conspiracy theories almost certainly aren't driven by genuine beliefs.
Why would people take themselves to believe conspiracy theories? As their use in films and on television (as in The X-Files, Capricorn One, JFK, The Americans, Homeland, The Bourne Identity, and many more) suggests, conspiracy theories are fun (Blattberg ). So are ghost stories and tales of the paranormal. People enjoy these theories, and they enjoy pretending they're true. We may easily become absorbed in such pretense, to such a degree that we lose track of the fact that it is pretense. Children almost never mistake their pretenses for reality (Weisberg ), but this is because reality pushes back against them. The child playing doctor with her stuffed toys is unlikely to mistake the inanimate giraffe for a real patient, but the adult absorbed in playing "Stolen Election" or "9/11 Inside Job" doesn't experience any pushback from reality. Conspiracy theories, and paranormal theories too, are consistent with easily observed reality. They concern shadowy actors who cover their tracks, or intrinsically hard-to-detect entities, not obtrusive facts.
Of course, the person playing "conspiracy theory" will almost certainly be aware of pushback from media and experts; no doubt, this is a big part of the reason why people with low trust in mainstream sources are far more likely to endorse such theories (Bruder and Kunert ; Douglas et al. ). While we might all sometimes get absorbed in playing "conspiracy theory" or "paranormal phenomena" (while watching the History Channel, for example), higher-trust individuals will tend to snap out of such play in the face of pushback. As Joseph E. Uscinski and Joseph M. Parent () remind us, "conspiracy theories are for losers": the beneficiaries of social arrangements are more likely to reject such theories because they are more likely to trust official sources. Those who have low trust in the institutions that push back against conspiracy theories are more likely to remain absorbed in them, and low trust is often explained by the fact that those who manifest it have fewer reasons to trust official sources. It is because higher-SES individuals tend to have higher trust in official sources that we (the people more likely to read this essay) are less likely to espouse bad or bizarre beliefs. This is not in virtue of any special rationality or virtue of ours. And we are not immune to "epistemic vice": especially if a bizarre belief doesn't matter to how our life goes, and we are not forced to stake anything on it, we too might come to espouse it even if in fact we don't hold it.
People might, and probably sometimes do, begin in pretense but gradually come to believe the theory that absorbs them. For many people, though, their endorsement remains pretense, whether they realize it or not. Consider the kinds of "evidence" people often cite for conspiracy theories. The Wayfair conspiracy theory (the theory that the online retailer was involved in the trafficking of children) began when someone pointed out on Twitter that the company's website featured high-priced storage cabinets with girls' names. This was taken as evidence that the website was a front, and that by buying the cabinets one was purchasing a trafficked child (Spring ). Think, too, of the many Covid-19 conspiracy theories that cited as evidence such facts as that "media control" is an anagram of "delta omicron" (Plummer ) and that "omicron" is an anagram of "moronic" (O'Rourke ), or even the plots of movies as evidence that the pandemic was planned (Sardarizadeh ). It is more likely that those who cite this kind of fact are playing with evidence rather than genuinely engaging with it. Think, too, of how highly gamified QAnon is, with its (entirely unmotivated) use of codes, ciphers, and community-wide attempts at decipherment (Berkowitz ; Thompson ; Zadrozny and Collins ). A number of people have argued that QAnon is unusual among conspiracy theories in being gamified, but that's false: what is unusual is the degree of centralization of the role of gamemaster. In older conspiracy theories, anyone could play that role, as well as the role of code breaker (see Levy, in press, for further discussion).

Genuine Belief
When we read breathless headlines like "One in Four Americans Believe that Barack Obama May Be the Antichrist" (Harris ), we should bear in mind the prevalence of expressive responding, trolling, and pretense.  However, there is no doubt that many people genuinely believe things that are wildly at variance with prevailing expert opinion. Consider climate change. Many people who report believing that it is a hoax are probably not reporting a sincere belief, but there can be little doubt that they don't accept the strong scientific consensus. If they believed that climate change were the serious threat it actually is, they would not allow themselves the luxury of engaging in such games. Are their bad beliefs the product of a post-truth culture?
No: the prevalence of bad beliefs arises from universal and ancient human dispositions, not a recent change in our relationship to truth. Human beings are epistemically social animals: most of what we know, we know on the basis of testimony. I know about the existence of cities I've never visited and about historical events that occurred before my birth on the basis of testimony. Equally, most of my scientific beliefs (my belief in the theory of evolution, for example) depend on testimony from those I take to be experts on the topic. Our reliance on testimony does not reflect our epistemic limitations, as though testimony were a second-best route to knowledge. Testimony is the only way anyone can acquire most knowledge about complex and difficult-to-observe phenomena. Our pervasive, flexible, and intelligent use of testimony allows for the division of cognitive labor, and that, in turn, allows us to understand the world around us.
Science, too, is heavily dependent on testimony. Contemporary science is complex, and the working scientist has first-hand experience of only a very narrow slice of it. For the rest, she is dependent on testimony: the tools (physical and conceptual) that she uses are inherited from others in the past, and her current work almost certainly relies heavily on input from others. All of us, even scientists, are pervasive outsourcers of cognition: we rely on the world to provide much of the content of our beliefs (Chater ; Rabb, Fernbach, and Sloman ), and we do so rationally. Outsourcing ensures that our beliefs track reality flexibly and rapidly, much more so than they could were we to rely on internal representations alone, since it allows the very facts that make our beliefs true to partially constitute them.
Human beings have always been heavily dependent on testimony for flourishing in the world. Cultural evolution has been the evolution of epistemic tools and dispositions as much as of practices (see Boyd et al. ; Henrich ; Henrich and Boyd ). With this epistemic dependence goes epistemic vulnerability: other people may seek to take advantage of us, or may simply be unreliable. We therefore filter testimony using cues for reliability and benevolence (Levy a). We are sensitive to evidence of an informant's past unreliability, evidence that an informant shares our values, evidence that the testimony represents majority opinion (which is more likely to be correct), and evidence of expertise (Harris ; Mascaro and Sperber ; Sperber et al. ).
Nevertheless, the acceptance of bad beliefs is to be expected of rational agents under certain conditions. Climate-change skepticism is most prevalent among political conservatives (Funk and Kennedy ), and this group also has the lowest trust in science more generally (Gauchat ) and in universities (Pew Research Center ). Republicans identify science and universities with the left; increasingly, that identification is accurate. While left dominance of the universities dates back to before World War II, it was largely confined to the social sciences and humanities until relatively recently; since the s, it has increased significantly across all areas (Gross ). The growth in left dominance might in part be due to a feedback loop: Republican mistrust of these institutions filters out conservatives and causes those within them to respond to the hostility by moving away from conservatism. The net result is that conservatives do not see academics as benevolent, and therefore don't trust their opinions. They therefore seek alternative sources: apparent experts (or news sources they take to be sufficiently in touch with such experts) who share their values.
One possible explanation for low trust in climate science on the right is "solution aversion" (Campbell and Kay ): solutions to the problems posed by climate change seem inevitably to involve interference with free markets (Bardon ; Keller ). I am skeptical: ideologies are flexible, and conservatives can value, and have valued, environmental protection. I suspect the perceived conflict is more a consequence than a cause of polarization, although not an epiphenomenal consequence. Instead, polarization may at least partly have been engineered by corporations and their paid surrogates. I suspect, that is, that merchants of doubt are central to the story (Oreskes and Conway ). Regardless, for the conservative today, the rejection or downplaying of climate science is typically rational. No more than any other laypeople are conservatives in a position to assess the science for themselves; indeed, attempts to assess the evidence, even by scientists, are always themselves heavily dependent on trust and testimony. Instead, we are all dependent on the say-so of trusted sources. Rational agents do not arrogantly "do their own research"; they defer. It is no surprise that conservatives tend to downplay climate change when those whom they trust tell them that it is a hoax. Indeed, better-educated Republicans, and those who know more about science, are more, not less, likely to reject the science (Kahan ). More sophistication entails a better sense of whom to defer to.

There are many bad beliefs out there, and many endorsements of bad beliefs. There's no reason to think that they arise from contempt or disregard for truth. Bad beliefs arise from rational processes: they reflect our cognitive processes working as they're supposed to, given bad inputs. A misinformed rational agent will believe badly. Insincere responses, too, need not reflect a lack of concern for truth, and thus need not reflect irrationality.
Those who troll may rationally believe that they don't owe their interlocutors the truth. Moreover, a proper regard for truth leaves a great deal of space for play, and play is a valuable part of human life. Those who come to endorse bizarre beliefs as a consequence of play lack feedback to keep them grounded. They have the bad fortune to live in a social world in which the cues that play rational roles for them are not well calibrated. That is not their fault: it arises from their social setting, not from their attitudes to evidence, science, facts, or truth.

Prevalence and Problems
Why, then, is there a widespread impression that we live in a post-truth age? In part, the impression arises from incredulity that rational agents could vote for Trump or Brexit, or could reject consensus science. In part, the post-truth narrative flatters us and holds our opponents up to ridicule, and is therefore satisfying. But another part of the explanation, I suspect, is the impression that expressions of bad belief are so much more prevalent today. Why would this be the case?
Part of the explanation for an increase in prevalence may be the influence of social media and the internet more generally. The expression of ridiculous opinions may go viral, triggering mockery from the left and a defensive reaction from the right, where some may endorse these opinions expressively or get lost in pretending that they're true. The virtual public sphere provides far more opportunities than ever before for expressive responding, for trolling, and for playing, and far more opportunities for such activities to reach a wide audience. We are left with a polarized impression of each other that does not reflect our actual degree of polarization (Hannon ). At the same time, genuine bad beliefs may increase in prevalence as people become more aware of what their side (those who share their values) believes.
I have painted a picture in which the endorsement of bizarre beliefs is often insincere, while genuine bad beliefs arise from rational processes. Is this a more optimistic picture of current political reality than the post-truth account? At least in some ways it is. First, it might contribute to a reduction in political temperature and thereby open up hope for productive dialogue: our opponents are not less rational than we are, nor less concerned with the truth. Moreover, it reveals how minds can be changed, sometimes surprisingly rapidly: alter the institutional cues for belief, bring people to trust more reliable sources, and minds can change almost at once (Levy b). On the other hand, many of the tools beloved by philosophers, such as teaching critical thinking, are unlikely to be of much help.
But we should not underestimate either the harms of bad beliefs and bad belief endorsements or the difficulty of addressing these harms. We are on the brink of, or are already experiencing, genuinely catastrophic climate change, and the failure of voters to support meaningful action to address it is partly attributable to the bad beliefs that climate change is not happening, or is not anthropogenic, or cannot be addressed. Vaccine uptake is lower than it should be, and this, too, reflects bad beliefs. The endorsement of bad beliefs is also consequential: it gives rise to distortions in the epistemic landscape, causing polarization and probably contributing to bad beliefs (the person who repeatedly reads that the Democrats are Satanists may not come to believe the claim, but may come to believe that the Democrats are up to something nefarious). Such endorsements distract us from the real problems and pollute the epistemic environment so much that it becomes difficult to identify reliable information.
Low trust in actually reliable sources is explanatorily central to bad beliefs, trolling, and the ludic acceptance of conspiracy theories. But it is difficult to restore trust once it is gone. It is especially difficult when bad actors (merchants of doubt and their allies) work to ensure that trust is kept at a low level. During the Covid pandemic, we saw that many people are ready to amplify the errors and ambiguities of epistemic authorities, not to mention to engage in outright deceit. An atmosphere of distrust is highly unconducive to eliciting trustworthy behavior. Scientists who are not trusted understandably find it difficult to venture into political territory that is hostile to them, and may come to feel contempt for those they see as obtuse.
Since testimony and the structure of the epistemic landscape (the distribution of sources and of trust in sources) are so central to our functioning as epistemic agents, a solution to our epistemic predicament will focus on this environment. Today, we live in an epistemically polluted world (Levy ): the very cues for epistemic reliability (credentials, consensus, track record, and so on) are widely mimicked and distorted by those with an interest in continued bad belief. Cleaning up that epistemic environment is a central task for those who would address our epistemic predicament and (thereby) some of our biggest challenges. Whether we can rise to this challenge remains an open question.

NOTES

1. There is a debate within the philosophy of psychiatry concerning whether religious beliefs should be considered delusions: the diagnostic manual of the American Psychiatric Association specifically excludes them. A religious belief is not delusional when, as the DSM says, it is "one ordinarily accepted by other members of the person's culture or subculture." We need to bear this in mind when we attempt to categorize beliefs as bizarre or not. A belief is not bizarre if it is widely enough accepted within the subculture the person belongs to. I doubt that lizardpeople believers genuinely identify with a belief-sharing culture sufficiently strongly to render their belief non-bizarre, though of course such a community could emerge. In that case, their belief would be bad and not bizarre.

2. Some philosophers and cognitive scientists have argued that a signaling function explains many of our beliefs (Funkhouser ; Ganapini ; Mercier ; Sterelny ). These positions are (usually but not always) orthogonal to the expressive-responding literature: they do not claim that people report beliefs they don't in fact hold in order to support their side, but rather that the signaling function explains why we come to hold some of our genuine beliefs.

3. It's worth noting, though, that while we may not espouse bizarre beliefs, many of our cognitive representations might fail to be genuine beliefs. Many people espouse beliefs that lack any determinate content, and that therefore can't guide their actions, because they are not capable of providing this content. For example, many people assent to the claim that "evolution is true," but even those who have had some college education about evolution tend to have vague or inaccurate beliefs about what evolution actually is. It may be that all of us have representations of moral and political claims (the equality of all human beings, inalienable rights) that play the same sort of role in our cognition as conspiracy theories play in the cognition of those who espouse them. These factual and empirical claims are distinguished from bad and bizarre beliefs not by their cognitive underpinnings or by the role they play in cognition, but by the fact that they're widely accepted by epistemic authorities.

4. It is also worth bearing in mind that belief reports may be inflated by bad survey design. People tend to be reluctant to report ignorance, and therefore might report believing in a conspiracy theory they have never actually heard of before. Provision of a "skip" option lowers the proportion of beliefs reported in comparison to a "don't know" option (Motta et al. ).

5. Philosophers, psychologists, and general educated liberal opinion tend to think that conservative rejection of climate science arises from motivated reasoning: conservatives reject the science because they are more biased than liberals, or perhaps because their biases kick in on these topics specifically (Bardon ). While psychological biases are no doubt real, they actually play a smaller role in cognition than is generally thought. In fact, much of the evidence that is commonly cited as showing the influence of bias actually shows our dependence on testimony. Many of the experiments inadvertently embed implicit testimony, and participants respond accordingly. Framing effects, for example, function by implicitly recommending certain options (see Levy b for discussion).

6. It might seem that we can avoid attributing a false belief to those who espouse climate change denial through a combination of Dan Kahan's () view that climate change denial is rational because whether or not it is true makes no difference to how one's life goes day to day, and Hugo Mercier's () plausible claim that when nothing is at stake for agents, their belief reports are often signals and not veridical. Most of us are not in a position to determine or even measurably affect policy, after all. What about policy makers? They may reasonably believe that they, too, can't measurably affect whether or how rapidly climate change occurs, because they have little control over the emissions of China, India, and other developing economies. While I am confident that these facts make a difference to how easy climate change denial is to embrace, I doubt they can enable us to avoid attributing genuine (but false) beliefs to many of those who espouse climate-change denial. Most people have a stake in the future, and were they to grasp the urgency of cutting emissions, I find it implausible that they wouldn't attempt to put pressure on legislators to solve the collective action problem gripping the world. Moreover, whether or not we succeed in reining in emissions, confronting climate change requires enormous investments in adaptation and mitigation. Policy makers who espouse inaction either genuinely believe what they say, or they care less for the wellbeing of their children than they do for playing games. I doubt they're quite so heartless. I am grateful to Jeffrey Friedman for pressing me to think through this question.
I am grateful to the Australian Research Council (DP) and the Arts and Humanities Research Council (AH/W/) for their generous support of this research. I am especially grateful to Jeffrey Friedman for extensive comments on an earlier version.