False polarization: debiasing as applied social epistemology

Synthese

Abstract

False polarization (FP) is an interpersonal bias on judgement, the effect of which is to lead people in contexts of disagreement to overestimate the differences between their respective views. I propose to treat FP as a problem of applied social epistemology—a barrier to reliable belief-formation in certain social domains—and to ask how best one may debias for FP. This inquiry leads more generally into questions about effective debiasing strategies; on this front, considerable empirical evidence suggests that intuitively attractive strategies for debiasing are not very effective, while more effective strategies are neither intuitive nor likely to be easily implemented. The supports for more effective debiasing seem either to be inherently social and cooperative, or at least to presuppose social efforts to create physical or decision-making infrastructure for mitigating bias. The upshot, I argue, is that becoming a less biased epistemic agent is a thoroughly socialized project.

Fig. 1


Notes

  1. The notion of a reliable belief-forming process plays a key conceptual role in some versions of epistemic reliabilism (Goldman 1986, pp. 44–51); but one need not be an epistemic reliabilist to think that reliable belief-forming processes are important things to cultivate.

  2. Willingham writes, “Can critical thinking actually be taught? Decades of cognitive research point to a disappointing answer: not really” (2007, p. 8). Yet his skepticism is mainly directed at the way critical thinking is typically taught—that is, as a small set of general-purpose skills, and with little emphasis on the ties between critical reasoning and the content of the claims at issue. This is probably not a blanket rejection of the utility of teaching critical thinking.

  3. Lilienfeld et al. (2009, pp. 392–394) briefly discuss these relatively primitive biases, proposing both that they can be partially constitutive of larger-scale phenomena (such as ideological extremism), and that they function individually as potential barriers to mitigating the biases with which they are linked.

  4. In the same vein, it might seem natural to suppose that FP arises simply from the cognitive or emotional investment that discussants have in a topic. The evidence suggests, however, that FP is unlikely to be entirely a matter of the partisanship of interlocutors. Pronin, Puccio and Ross note two studies finding that even people who were nonpartisan with respect to some socially charged topic also tended to overestimate the degree of disagreement between partisan interlocutors (2002a, p. 652). Partisanship just seems to make us worse at something we’re independently not great at doing: accurately estimating the prospects for conciliation between people who disagree over some fraught issue.

  5. This much applies to many kinds of human activity. For example, Strayer et al. (2012) note that drivers can enthusiastically criticize the poor performance that other drivers display when using mobile phones behind the wheel, while describing their own driving-while-phoning as safe.

  6. John Randolph is a deejay and activist popularly known as Jay Smooth. His use of this metaphor was brought to my attention by Alexis Shotwell.

  7. http://www.bulletin.uwaterloo.ca/2009/aug/14fr.html.

  8. The first two strategies correspond loosely to J.S. Mill’s Methods of Difference and Concomitant Variation.

  9. These examples also reveal a problem with treating self-debiasing as a process that people undertake once they have decided that they were in fact unduly biased (pace Wilson et al. 2002, pp. 187–189). Sometimes people run through a self-stimulated debiasing process merely because they think they might have been biased in a judgement, and they wish to check their reasoning.

  10. Perhaps for similar reasons, O’Brien (2009) found it to be a more effective confirmation-debiasing technique to have criminal investigators consider just one alternative suspect than to have them consider three alternative suspects. The more alternatives one must generate, and the greater the cognitive difficulty of generating them, the more apt one may be to regard the lower effort associated with the biased judgement as a kind of evidence in its favour (pp. 329–330).

  11. There is reason to doubt that the extreme case of FP overcorrection will amount to the complementary error of the False Consensus (FC) effect—the overestimation of the extent to which one’s views are shared by others, so named by Ross et al. (1977) in an early analysis of the phenomenon. FC is most strongly mediated by lack of evidence of what others actually think—for example, by the silence of interlocutors—while FP most easily arises from an (over-) interpretation of what interlocutors have said. The cognitive effort of interpreting a contrary utterance as an expression of complete agreement will plausibly constrain the prospect of overcorrection taking such an extreme form. FP bears important similarities to pluralistic ignorance, discussed in detail by Bicchieri and Fukui (1999), and by other work in this volume.

  12. For helpful comments on earlier drafts I am grateful to Guillaume Beaulac, Gerald Callaghan, Carla Fehr, Carlo Proietti, Frank Zenker, fellow participants in the April 2011 Copenhagen/Lund Workshop in Social Epistemology, and two anonymous referees for this issue. This work was supported in part by the Faculty of Arts, University of Waterloo, and by Social Sciences and Humanities Research Council of Canada Grant 410-2011-1737.

References

  • Berkowitz, L. (1971). Reporting an experiment: A case study in leveling, sharpening and assimilation. Journal of Experimental Social Psychology, 7(2), 237–243.

  • Bicchieri, C., & Fukui, Y. (1999). The great illusion: Ignorance, informational cascades, and the persistence of unpopular norms. In M. Galavotti & A. Pagnini (Eds.), Experience, reality, and scientific explanation (pp. 89–121). Dordrecht: Kluwer.


  • Bird, S. R., Fehr, C., Larson, L. M., & Sween, M. (2011). ISU ADVANCE Collaborative transformation project: Final focal department synthesis report. Iowa State University ADVANCE Program. Report available online at: http://www.advance.iastate.edu/resources/resources.shtml.

  • Fischhoff, B. (1982). Debiasing. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 422–444). Cambridge: Cambridge University Press.


  • Frantz, C., & Janoff-Bulman, R. (2000). Considering both sides: The limits of perspective-taking. Basic and Applied Social Psychology, 22, 31–42.


  • Frantz, C. (2006). I AM being fair: The bias blind spot as a stumbling block to seeing both sides. Basic and Applied Social Psychology, 28(2), 157–167.


  • Goldman, A. (1986). Epistemology and cognition. Cambridge, MA: Harvard University Press.


  • Hirt, E. R., & Markman, K. D. (1995). Multiple explanation: A consider-an-alternative strategy for debiasing judgments. Journal of Personality and Social Psychology, 69, 1069–1086.


  • Keltner, D., & Robinson, R. (1993). Imagined ideological differences in conflict escalation and resolution. The International Journal of Conflict Management, 4, 249–262.


  • Keltner, D., & Robinson, R. (1996). Extremism, power, and the imagined basis of social conflict. Current Directions in Psychological Science, 5, 101–105.


  • Kennedy, K. A., & Pronin, E. (2008). When disagreement gets ugly: Perceptions of bias and the escalation of conflict. Personality and Social Psychology Bulletin, 34, 833–848.


  • Lilienfeld, S., Ammirati, R., & Landfield, K. (2009). Giving debiasing away. Perspectives on Psychological Science, 4(4), 390–398.

  • Monin, B., & Norton, M. (2003). Perceptions of a fluid consensus: Uniqueness bias, false consensus, false polarization, and pluralistic ignorance in a water conservation crisis. Personality and Social Psychology Bulletin, 29(5), 559–567.

  • O’Brien, B. (2009). Prime suspect: An examination of factors that aggravate and counteract confirmation bias in criminal investigations. Psychology, Public Policy, and Law, 15(4), 315–334.

  • Pronin, E., Puccio, C., & Ross, L. (2002a). Understanding misunderstanding: Social psychological perspectives. In T. Gilovich, D. Griffin, & D. Kahneman (Eds.), Heuristics and biases: The psychology of intuitive judgement (pp. 636–665). Cambridge: Cambridge University Press.

  • Pronin, E., Lin, D., & Ross, L. (2002b). The bias blind spot: Perceptions of bias in self versus others. Personality and Social Psychology Bulletin, 28, 369–381.


  • Pronin, E., & Kugler, M. (2007). Valuing thoughts, ignoring behavior: The introspection illusion as a source of the bias blind spot. Journal of Experimental Social Psychology, 43(4), 565–578.

  • Puccio, C., & Ross, L. (1998). Real versus perceived ideological differences: Can we close the gap? Unpublished manuscript, Stanford University.

  • Randolph, J. (2012). How I learned to stop worrying and love discussing race. http://tedxtalks.ted.com/video/TEDxHampshireCollege-Jay-Smooth. Accessed April 20, 2012.

  • Robinson, R., Keltner, D., Ward, A., & Ross, L. (1995). Actual versus assumed differences in construal: “Naive realism” in intergroup perception and conflict. Journal of Personality and Social Psychology, 68, 404–417.


  • Ross, L., Greene, D., & House, P. (1977). The “false consensus effect”: An egocentric bias in social perception and attribution processes. Journal of Experimental Social Psychology, 13(3), 279–301.


  • Sanna, L., Stocker, S., & Schwarz, N. (2002). When debiasing backfires: Accessible content and accessibility experiences in debiasing hindsight. Journal of Experimental Psychology: Learning, Memory, and Cognition, 28(3), 497–502.


  • Strayer, D., Drews, F., & Johnston, W. (2012). The eye of the beholder: Cellular communication causes inattention blindness behind the wheel. In A. G. Gale (Ed.), Vision in vehicles X (pp. 142–148). Applied Vision Research Centre, Loughborough University.

  • Tetlock, P. (2005). Expert political judgment. Princeton, NJ: Princeton University Press.


  • Thompson, L. (1995). “They saw a negotiation”: Partisanship and involvement. Journal of Personality and Social Psychology, 68, 839–853.


  • West, R., Meserve, R., & Stanovich, K. (2012). Cognitive sophistication does not attenuate the bias blind spot. Journal of Personality and Social Psychology. Advance online publication. doi:10.1037/a0028857

  • Willingham, D. (2007). Critical thinking: Why is it so hard to teach? American Educator, 31(2), 8–19.


  • Wilson, T., Centerbar, D., & Brekke, N. (2002). Mental contamination and the debiasing problem. In T. Gilovich, D. Griffin, & D. Kahneman (Eds.), Heuristics and biases: The psychology of intuitive judgement (pp. 185–200). Cambridge: Cambridge University Press.

Author information

Correspondence to Tim Kenyon.

About this article

Cite this article

Kenyon, T. False polarization: debiasing as applied social epistemology. Synthese 191, 2529–2547 (2014). https://doi.org/10.1007/s11229-014-0438-x
