Self-imposed filter bubbles: Selective attention and exposure in online search

https://doi.org/10.1016/j.chbr.2022.100226
Open access under a Creative Commons license

Highlights

  • Partisan users tend to view links for longer when the content appears to align with their own political views.

  • Partisan users tend to select links whose content appears to align with their own political views.

  • Trust in the link source affects both the visual attention afforded to a link and the tendency to click it.

  • Selecting from a politically diverse set of search results is independent of link presentation order.

  • Policy aimed at addressing potential filter bubbles in online interactions may ultimately be ineffective.

Abstract

It is commonly assumed that algorithmic curation of search results creates filter bubbles, where users’ beliefs are continually reinforced and opposing views are suppressed. However, empirical evidence has failed to support this hypothesis. Instead, it has been suggested that filter bubbles may result from individuals engaging selectively with information in search engine results pages. Yet this “self-imposed filter bubble hypothesis” has remained empirically untested. In this study, we find support for the hypothesis using eye-tracking technology and link selection data. We presented partisan participants (n = 48) with sets of simulated Google Search results, controlling for the ideological leaning of each link. Participants spent more time viewing own-side links than other links (p = .037). In our sample, participants who identified as right-wing exhibited a greater such bias than those who identified as left-wing (p < .001). In addition, we found that both liberals and conservatives tended to select own-side links (p < .001). Finally, there was a significant effect of trust, such that links associated with less trusted sources received less visual attention and were selected less often by liberals and conservatives alike (p < .001). Our study challenges the efficacy of policies that aim to combat filter bubbles by presenting users with an ideologically diverse set of search results.

Keywords

Filter bubble
Online search
Selective exposure
Ingroup bias
Eye tracking
Trust
