The Moral Psychology of Misinformation: Why We Excuse Dishonesty in a Post-Truth World

Commentators say we have entered a “post-truth” era. As political lies and “fake news” flourish, citizens appear not only to believe misinformation, but also to condone misinformation they do not believe. The present article reviews recent research on three psychological factors that encourage people to condone misinformation: partisanship, imagination, and repetition. Each factor relates to a hallmark of “post-truth” society: political polarization, leaders who push “alternative facts,” and technology that amplifies disinformation. By lowering moral standards, convincing people that a lie’s “gist” is true, or dulling affective reactions, these factors not only reduce moral condemnation of misinformation, but can also amplify partisan disagreement. We discuss implications for reducing the spread of misinformation.


Pundits and scholars argue we have entered a "post-truth" era, surrounded by dishonest politicians and "fake news" that stymie public health efforts, provoke violence, and undermine democracy [1][2][3][4][5][6][7]. A central concern with this post-truth era is that people believe misinformation [8*-10]. A different concern, however, is that people condone misinformation [11]: they recognize it as false, but give it a moral pass. When people condone misinformation, political leaders can tell blatant lies without damaging their public image, and people may have little compunction about spreading misinformation themselves [12*,13**,14*-16]. Thus, to understand our "post-truth" era, we need to understand not only the psychology of belief, but also the moral psychology of misinformation.
Commentaries typically highlight three hallmarks of post-truth society: Citizens are politically polarized, leaders endorse "alternative realities," and technology amplifies misinformation [1,2,17,3]. Each hallmark at the societal level, we argue, is associated with a psychological factor at the individual level that encourages people to condone misinformation: partisanship, imagination, and repetition, respectively (see Figure 1). The present article reviews recent research on these psychological factors, explains the mechanisms behind each factor, and highlights how the second two factors can exacerbate political divisions in moral judgments of misinformation. We conclude by discussing implications for interventions.

Partisanship Shapes Moral Judgments of Misinformation
One hallmark of a post-truth society is political polarization [2,17,3], "the divergence of political attitudes and beliefs towards ideological extremes" [18**]. Polls reveal increasing concern that partisans cannot agree on the facts [19]. However, even when people do agree on the facts, partisanship may spark disagreement about the morality of lying about those facts [12*,20**]. Consider Press Secretary Sean Spicer's falsehood that Donald Trump's 2017 inauguration attracted the largest crowd in history [21]. Among Americans who correctly identified this falsehood as false, telling the falsehood seemed less unethical to Trump supporters than to Trump opponents. More generally, Democrats and Republicans alike judge misinformation they know to be false as less unethical when it aligns with their politics [12*,20**].

Motivated and cognitive processes both offer plausible accounts for this phenomenon [22]. Perhaps because partisans want to excuse misinformation that fits with their politics, they set lower moral standards for behavior that serves their own partisan interests (see Figure 1, path a) [23,24]. Alternatively, partisans may judge the gist, or general idea, of a falsehood as more true when it fits with their prior knowledge [20**] (see Figure 1, path b), and the truer a falsehood's gist, the less unethical the falsehood may seem. For example, the gist of Spicer's inauguration falsehood was that Trump enjoyed immense popularity. Trump supporters, more than Clinton supporters, may believe the gist that Trump is popular, even when they do not believe that his inauguration was the largest in history, leading them to consider this falsehood more excusable. When you support a leader, it may be easier to take their falsehoods seriously, if not literally [see 25]. In short, whether because of a motivated process, a cognitive process, or both, misinformation seems less unethical when it aligns with one's politics.

Imagining Alternatives to Reality Reduces Moral Condemnation
Another hallmark of post-truth society is that many citizens eschew facts and evidence to inhabit "alternative realities" endorsed by leaders and other elites [2,17,3]. However, to increase people's inclination to condone a falsehood, it may not be necessary to make them believe in alternatives to reality; it may be sufficient to get them to imagine such alternatives. For instance, after lying about the inauguration's size, Trump's administration suggested that attendance might have been higher if the weather had been nicer [21]. While lying about the existence of a medical technology, Theranos CEO Elizabeth Holmes conjured futures in which this technology would ultimately revolutionize healthcare [26]. Rather than merely arguing that their falsehoods are true, Trump officials and Holmes invite us to imagine two different types of alternative to reality: a counterfactual world [see 27] in which the falsehood could have been true (Trump), and a prefactual world [see 28] in which it might become true (Holmes). Research suggests that imagining either alternative to reality can reduce how much people condemn misinformation, even when they recognize the misinformation as false [12*,20**].

American participants in a series of studies read false political claims, such as the one about Trump's inauguration, that were clearly identified as false by reputable, non-partisan fact-checkers. Half of participants were randomly assigned to imagine an alternative to reality (a counterfactual or a prefactual, depending on the study) in which the falsehood was true.
Importantly, this manipulation did not reliably affect people's ability to distinguish fact from fiction, but it did affect the moral judgments of participants on both sides of the political aisle (e.g., both Trump and Clinton supporters). Imagining how a falsehood could have been true or might become true made the falsehood seem less unethical to spread, which in turn resulted in weaker intentions to punish the speaker and stronger intentions to like or share the falsehood on social media. Thus, simply imagining, without believing in, alternatives to reality can soften moral judgments of misinformation [see also 29,30].

Mechanisms for Imagination's Effect on Moral Judgment
Why does merely imagining a falsehood as true make it seem less unethical? Results suggest that although imagination does not make people think a falsehood is literally true, it does make the falsehood's gist seem truer (see Figure 1, path c). For example, consider the false claim, "guns kill 500 Americans daily" (the correct statistic is about one quarter as many [31]). Imagining whether guns might someday kill 500 Americans does not make people believe that guns currently kill 500 Americans. However, it does make the broader message that guns kill many Americans seem truer, and therefore the falsehood about the specific number of gun deaths seems less unethical.

Imagination Increases Political Divisions in Moral Judgments of Misinformation
Imagination can also increase partisan reactions to dishonesty. Imagining how a falsehood could have been true, or could become true, reduced partisans' moral condemnation of falsehoods to a greater extent when those falsehoods fit, versus conflicted, with their politics [12*,20**]. Two explanations could account for this partisan effect. First, imagination might be more likely to make the gist of the falsehood seem true if the falsehood is aligned (vs. misaligned) with your politics (see Figure 1, path d). For example, imagining "in the future, guns may kill 500 Americans daily" might strengthen Democrats' belief that guns kill many Americans, but have little effect on Republicans' beliefs about gun violence. Second, even when partisans agree that the gist of the falsehood is true, they may disagree about how justified it is to tell a falsehood with a truthful gist (see Figure 1, path e). For example, even if Democrats and Republicans agreed with the general idea that guns kill many Americans, Democrats might be more likely to think that this general idea justifies falsely claiming that guns kill 500 Americans daily. Thus, partisans on different sides of the aisle may disagree about the morality of telling a particular lie, not because they disagree on whether it is true, but because they disagree on whether it could have been true or could become true, and about the moral implications of these imagined scenarios.

Repeated Exposure to Misinformation Reduces Moral Condemnation
A third hallmark of a post-truth society is the existence of technology, such as social media platforms, that amplifies misinformation [see 1,2,3]. Such technologies allow fake news, "articles that are intentionally and verifiably false and that could mislead readers" [32], to spread fast and far [33*], sometimes in multiple periods of intense "contagion" across time [34].
When fake news does "go viral," the same person is likely to encounter the same piece of misinformation multiple times. Research suggests that these multiple encounters may make the misinformation seem less unethical to spread [35**,13**].
Participants in a series of studies viewed fake-news headlines that were clearly labelled as false, and that participants correctly identified as false. In each study, participants rated the headlines as less unethical to publish or share on social media if they had been shown these headlines earlier in the study than if they were seeing the headlines for the first time. Moreover, the less unethical they thought a headline was to spread, the less inclined they were to censure an acquaintance who shared it on social media, and the stronger their intentions to share it themselves [13**]. Whereas prior work suggests that repetition can increase belief in misinformation [36,37], these studies reveal that repetition can reduce how much people condemn misinformation they know to be false.

Mechanism for Repetition's Effect on Moral Judgments
This phenomenon seems to occur because repetition reduces the negative affective reaction people experience in response to fake news [35**] (see Figure 1, path f). When people first encounter a specific fake-news headline (and recognize it as fake), they may experience a "flash of negative affect" [38] that informs their moral judgments. For example, anger that someone would spread that piece of misinformation may lead people to judge it as particularly unethical. But encountering the same headline repeatedly dulls this anger, a desensitization process [39,40]. Because affect informs moral judgments [38], reduced anger means less severe moral judgments. This mechanism predicts, and experiments confirm, that wrongdoings across the moral domain (and not just fake-news sharing) seem less unethical when repeatedly encountered [35**].

Repetition Increases Political Divisions in Moral Judgments of Misinformation
In our studies, repetition reduced condemnation regardless of whether the falsehood fit or conflicted with participants' politics [13**], perhaps because desensitization to emotionally arousing stimuli does not depend on people's motivations or beliefs. Nonetheless, this "moral repetition effect" [35**] may still exacerbate partisan responses to misinformation because people disproportionately encounter misinformation that aligns with their politics [41,42]. Unlike politically discordant misinformation, politically concordant misinformation should become increasingly familiar over time, and thus seem increasingly permissible.

Implications for Anti-Misinformation Interventions
How can we stem the spread of misinformation in this post-truth world? Anti-misinformation efforts commonly aim to improve discernment between fact and fiction, or to make the inaccuracy of fake news more salient [e.g., 9,43*,44*-46]. Such interventions will be insufficient, however, if people intentionally spread misinformation they do not believe [47]. The three factors we have reviewed reduce moral condemnation through different mechanisms (see Figure 1), and thus would require different interventions to address. For example, repetition reduces moral condemnation by dulling negative affective reactions. By contrast, imagination can make falsehoods seem less unethical by lending credence to their gist. This mechanism relies more on cognition than affect. Accordingly, encouraging people to use reason when forming moral judgments did not significantly attenuate the effect of imagination on moral judgments [20**]. Two intervention strategies might be more promising.
First, encouraging people to focus on the precise truth of information they encounter, and not just its gist, should reduce their latitude to condone lies that are easy to imagine. Second, satirizing these persuasion attempts might inoculate people against their effects [see also 43,46]. For example, media outlets that want to hold leaders accountable for lying might present implausible prefactuals to communicate that one could imagine scenarios in which anything might become true (e.g., if a reincarnated Steve Jobs took control of Theranos, then Theranos technology would revolutionize healthcare).
Lastly, the effect of partisanship on moral judgments may be the most difficult to address because, as noted, it is likely multiply determined. It might help to show people that they use lower moral standards for falsehoods that are aligned versus misaligned with their politics [similar to 48] (Figure 1, path a), or again to shift focus away from the falsehood's gist (Figure 1, path b), but ultimately, reducing partisans' willingness to condone misinformation may require easing partisan animosity [49,50].

Conclusion
In a post-truth world, purveyors of misinformation need not convince the public that their lies are true. Instead, they can reduce the moral condemnation they receive by appealing to our politics (partisanship), convincing us a falsehood could have been true or might become true in the future (imagination), or simply exposing us to the same misinformation multiple times (repetition). Partisanship may lower moral standards; partisanship and imagination can both make the broader meaning of the falsehood seem true; and repetition can blunt people's negative affective reaction to falsehoods (see Figure 1). Moreover, because partisan alignment strengthens the effects of imagination and facilitates repeated contact with falsehoods, each of these processes can exacerbate partisan divisions in the moral condemnation of falsehoods.
Understanding these effects and their pathways informs interventions aimed at reducing the spread of misinformation.
Ultimately, the line of research we have reviewed offers a new perspective on our post-truth world. Our society is not just post-truth in that people can lie and be believed. We are post-truth in that it is concerningly easy to get a moral pass for dishonesty, even when people know you are lying.
