
Scientists’ deficit perception of the public impedes their behavioral intentions to correct misinformation

  • Sera Choi,

    Roles Conceptualization, Data curation, Formal analysis, Methodology, Writing – original draft, Writing – review & editing

    Affiliation School of Communications, Grand Valley State University, Allendale, Michigan, United States of America

  • Ashley A. Anderson,

    Roles Conceptualization, Data curation, Funding acquisition, Supervision, Writing – original draft, Writing – review & editing

    Ashley.A.Anderson@colostate.edu

    Affiliation Department of Journalism and Media Communication, Colorado State University, Fort Collins, Colorado, United States of America

  • Shelby Cagle,

    Roles Data curation, Methodology, Writing – original draft

    Affiliation Department of Microbiology, Immunology, and Pathology, Colorado State University, Fort Collins, Colorado, United States of America

  • Marilee Long,

    Roles Writing – review & editing

    Affiliation Department of Journalism and Media Communication, Colorado State University, Fort Collins, Colorado, United States of America

  • Nicole Kelp

    Roles Data curation, Writing – original draft, Writing – review & editing

    Affiliation Department of Microbiology, Immunology, and Pathology, Colorado State University, Fort Collins, Colorado, United States of America

Abstract

This paper investigates the relationship between scientists’ communication experience and attitudes towards misinformation and their intention to correct misinformation. Specifically, the study focuses on two correction strategies: source-based correction and relational approaches. Source-based approaches combatting misinformation prioritize sharing accurate information from trustworthy sources to encourage audiences to trust reliable information over false information. On the other hand, relational approaches give priority to developing relationships or promoting dialogue as a means of addressing misinformation. In this study, we surveyed 416 scientists from U.S. land-grant universities using a self-report questionnaire. We find that scientists’ engagement in science communication activities is positively related to their intention to correct misinformation using both strategies. Moreover, the scientists’ attitude towards misinformation mediates the relationship between engagement in communication activities and intention to correct misinformation. The study also finds that the deficit model perception–that is, the assumption that scientists only need to transmit scientific knowledge to an ignorant public in order to increase understanding and support for science–moderates the indirect effect of engagement in science communication activities on behavioral intention to correct misinformation using relational strategies through attitude towards misinformation. Thus, the deficit model perception is a barrier to engaging in relational strategies to correct misinformation. We suggest that addressing the deficit model perception and providing science communication training that promotes inclusive worldviews and relational approaches would increase scientists’ behavioral intentions to address misinformation. The study concludes that scientists should recognize their dual positionality as scientists and members of their community and engage in respectful conversations with community members about science.

Introduction

In recent years, concerns over misinformation have become widespread, especially in light of the high economic and morbidity costs of the COVID-19 pandemic and of vaccine hesitancy [1,2]. The responsibility for preventing and stopping the spread of misinformation has been placed on a variety of stakeholders, including media producers, educators, health professionals, researchers, and funders [3]. These calls to action are more likely to succeed if we understand who is susceptible to misinformation, which techniques are effective in responding to misinformation, and how to motivate individuals to adopt them.

Progress has been made on two of these three fronts. We know more about who is susceptible to misinformation and why [e.g., 4,5] and which techniques are effective for correcting misinformation [6,7]. Research that identifies who addresses misinformation, why they do so, and what corrective techniques they use in different circumstances is limited but growing [8]. As calls for scientists to correct misinformation grow [9,10], it becomes increasingly important to understand the factors that increase scientists’ likelihood of engaging in these misinformation correction activities.

Scientists’ engagement with the public plays a critical role in addressing societal challenges [11]. Engagement activities include working at open-house events or science festivals, participating in public meetings, meeting with policymakers, delivering lectures to nonexpert audiences, writing blogs, giving interviews to journalists, and engaging on social media (e.g., writing about topics related to their research, participating in discussions, etc.) [12]. Existing scholarship has examined the factors that influence scientists’ public engagement activities [13–15]; however, little research has specifically investigated the relationship between scientists’ participation in engagement activities and their willingness to correct misinformation. Through their experience with science communication activities, scientists may be more willing to participate in addressing societal communication challenges such as correcting misinformation.

Individuals’ behavioral intentions to correct misinformation may be influenced by different factors. Previous studies have empirically examined the relationship between attitudes and behavioral intentions related to addressing misinformation in various contexts such as correcting COVID-19 rumors [16], sharing unverified information [17], and verifying the information before disseminating it [18]. Here, we examine the mediating role of attitude toward misinformation (i.e., concerns over misinformation) in the relationship between scientists’ communication activities and their misinformation correction behavioral intentions.

Many approaches for correcting misinformation involve developing audience members’ skills related to identifying misinformation or providing corrective information from trusted sources. The goal of such strategies is to combat misinformation by helping people hone their analytical skills, such as logically examining the material in question and critically evaluating the information source [19]. More recently, researchers have investigated relational approaches that prioritize fostering relationships or dialogue as a response to key science communication challenges [20]. These relational approaches provide another important tool to address misinformation. Thus, this study aims to provide a more granular understanding of the drivers of scientists’ willingness to partake in correcting misinformation using both source-based and relational strategies.

The deficit model perception, or the attitude that the public does not know enough about science, likely plays a role in scientists’ willingness to correct misinformation. While studies have shown that scientists continue to hold deficit model perceptions when communicating with non-scientist audiences [21,22], little is known about how the deficit model perception predicts their likelihood to engage in misinformation correction. Using a survey of scientists at land-grant universities in the United States, we examine the dynamics of past communication behavior along with two key attitudes–attitude toward misinformation (i.e., concern over misinformation) and the deficit model perception–to predict scientists’ likelihood of engaging in misinformation correction using both source-based and relational approaches.

The state of misinformation

Misinformation has been defined by many scholars, and these definitions often emphasize the audience member’s role. For instance, Kuklinski and colleagues state that misinformation refers to circumstances when “people hold inaccurate beliefs, and do so confidently” [23, p. 792]. Likewise, Lewandowsky et al. define misinformation as “any piece of information that is initially processed as valid but is subsequently retracted or corrected” [24, pp. 124–125]. Building on these two definitions, for this study, we define misinformation as information that people perceive to be accurate when it is not. We also conceptualize misinformation as distinct from disinformation, or the purposeful act of sharing misleading information [25].

Recent research highlights a growing emphasis on identifying effective ways to correct misinformation, often through “the presentation of information designed to rebut an inaccurate claim or a misperception” [26]. An individual may be exposed to misinformation correction by witnessing correction, correcting others, or being corrected [8]. Examples of corrective behaviors include citing a credible source [27,28], debunking, and prebunking [29,30]. Most of these strategies have been tested almost exclusively in social media spaces, where significant amounts of exposure to and correction of misinformation occur [26,31].

Strategies to correct misinformation

Source-based strategies.

Source-based approaches to addressing misinformation emphasize sharing corrective information from trusted sources in a manner that encourages audiences to believe the trusted, accurate information over misinformation. Such approaches are grounded in the principles of cognitive processing. Who people are (e.g., audience characteristics), the views they hold about a topic (e.g., attitudes or levels of knowledge), and the cues they rely upon when selecting an information source (e.g., likeability) contribute to the effectiveness of misinformation corrective techniques [19,32]. People’s susceptibility to misinformation can be reduced before they encounter the misinformation. Such techniques involve developing media literacy skills to check the source or prompting reflection on red flags in the message (a cognitive audience characteristic) [27]. Prebunking, which aims to change audience members’ knowledge, shows people examples of misinformation so that they will be better equipped to spot it and question it when they encounter it [29,30]. Other source-based approaches are focused on correcting misinformation that has already surfaced; these are often based on audience perceptions of the corrective source. Testing in the social media environment shows that when someone sees another person calling out misinformation while posting a link to a trusted source, misperceptions do not prevail [33].

Observational corrective techniques assume that institutional sources of information (e.g., news media, government) are best for mitigating misperceptions [33]. Yet, the ability of institutionally based sources to reach publics is limited. For example, trust in the CDC, one of the most widely tested sources of information in source-based misinformation correction studies, has decreased since the start of the COVID-19 pandemic [34]. A growing lack of trust in such institutional sources indicates that addressing misinformation will require community approaches grounded in relationship-building and interpersonal trust [35].

Relational strategies.

Science communication practitioners are increasingly calling upon scientists and organizations to use community-based approaches involving a variety of stakeholders to address science communication challenges [36]. Public health community leaders, such as those who work in county public health departments, often engage in dialogic approaches to misinformation correction that emphasize relationship-building and listening to individual concerns during person-to-person conversations [20]. Relational approaches promote a shared sense of community, which may drive the efficacy of responses to misinformation and the prevention of misinformation [37].

Relational approaches may work well when the misinformation aligns with people’s values [26]. People tend to accept information that supports or aligns with their preexisting beliefs, values, or notions, so they are more likely to believe misinformation when it aligns with those values [38]. Relational approaches may be most effective in overcoming misinformation bound within the dynamics of motivated reasoning because relational approaches are based on an interaction that allows for listening, acknowledgement of others’ perspectives and values, and tolerance of difference. Dialogic activities build trust [39], and people accept information that comes from a perceived trusted source [24]. Furthermore, individuals who are wrong about the information or disseminate untrue details about a particular fact respond more positively to being corrected when they share a relational bond or community connection with the corrector [40]. Those engaged in misinformation correction prefer approaches that rely on relationship-building because those approaches allow them to demonstrate politeness and express emotions [41].

In this study, we explore source-based and relational approaches to addressing misinformation separately. Source-based and relational approaches are distinct activities, and research on the motivations or barriers to each is growing.

Scientists’ engagement in science communication activities

Research shows that scientists who have participated in communication activities are likely to continue communicating about science with the public in the future [42,43]. In other words, communication begets communication. Research on scientists also suggests that participating in science communication is associated with increased self-efficacy, social norms, and positive attitudes about science communication activities [15,22].

Based on this evidence, we pose the following hypothesis:

  1. H1: Higher levels of engagement in science communication activities are positively related to behavioral intentions to correct misinformation using (a) source-based strategies and (b) relational strategies.

Relationship between engagement in science communication activities and willingness to correct misinformation

Scientists who have more experience in science communication activities may feel more concern about misinformation than scientists who have less experience. Through their engagement in science communication activities, scientists can broaden participation in tackling societal issues and improving public perception of science (e.g., public talks) [44,45]. Additionally, engaging in science communication helps scientists contribute to shaping the direction of political and policy decisions by educating citizens about the challenges affecting the world [45]. Concern towards misinformation may predict scientists’ engagement in science communication activities and intentions to correct misinformation. Therefore, the following hypothesis is proposed:

  1. H2: Higher levels of engagement in science communication activities are positively related to negative attitudes toward misinformation.

Previous studies have examined the association between individuals’ attitudes and their behavioral intentions to combat the spread of misinformation [16,46]. For instance, Ding et al.’s study found that positive attitudes toward COVID-19 rumor recognition (i.e., positive attitudes toward verifying information that people are skeptical about) and intentions to identify COVID-19 rumors are positively correlated [16].

Other studies have found a relationship between attitudes and behaviors around spreading misinformation online [17,18]. According to Khan and Idris, those who hold positive attitudes toward information verification are less likely to share unverified information on social media [17]. Pundir et al. found that attitudes toward news verification are positively related to the intention to validate news before disseminating it [18].

Although limited, findings from the studies discussed in this section indicate that attitudes toward misinformation are associated with a positive behavioral intention to correct misinformation. Indeed, research has shown that higher levels of perceived severity of misinformation are associated with intentions to correct misinformation [47]. Thus, the following hypothesis is proposed:

  1. H3: Attitudes toward misinformation mediate the effect of engagement in science communication activities on behavioral intentions to correct misinformation using (a) source-based strategies and (b) relational strategies.

Barriers to addressing misinformation

While trust and relationship-building are foundational to disseminating accurate, verified information and correcting misinformation, research in these areas is limited, especially research focused on scientists. Instead, most existing research suggests that scientists embrace a deficit model attitude in which they take a superior position when speaking to non-scientists. For example, Dudo and Besley found that scientists’ main objectives for public engagement are to “inform the public about science” and to “defend science from misinformation” [22]. Scientists’ lowest priorities are to strengthen the public’s trust in science (i.e., build trust) and to connect to their audience through science stories (i.e., tailor messages) [22].

The role of deficit model perception on behavioral intention to correct misinformation.

There are multiple positionalities by which scientists can communicate with non-scientists. These can be divided into “traditional” models that focus on transmitting science to non-expert audiences and “non-traditional” models that focus on discussions involving knowledge outside of science [48]. These models have been further sub-categorized by various authors, such as Brossard and Lewenstein’s description of deficit/science literacy, contextual, lay expertise, and public participation models [49]. Trench described the deficit/dissemination, dialogue, and conversation/participation models [50]. Akin and Scheufele outlined the deficit model, dialogue model, and communication in context model [51]. Regardless of specific terminology in these and other studies characterizing science communication models, deficit-based models assume non-scientist publics are ignorant or monolithic, while more participatory models of science communication tend to focus on eliciting multiple perspectives from individuals and groups with diverse identities and expertise.

The deficit model has been criticized because it unnecessarily characterizes those with concerns about science as ignorant or lacking adequate knowledge about science [52,53], yet it remains persistent in science communication efforts [52,54]. An analysis of 515 science engagement and outreach activities in Australia found that most followed a mix of deficit or dialogue approaches and lacked inclusive/participatory approaches [55]. This is problematic because the deficit model assumes that scientific knowledge is superior to other worldviews [56], thus reinforcing racism and exclusion in science and STEM [57].

There are several reasons for the persistence of the deficit model. Science communication arose amid a model of one-directional communication to the public [54], with success defined as the diffusion of information from a sender to a receiver [58]. Additionally, science itself has been conceptualized as the production of knowledge that occurs separately from a society that is then informed of findings [54]. Furthermore, scientists lack training in communication and often see themselves as rational decision-makers while the public is some sort of deficient “other” [59,60]. Finally, STEM scientists’ lack of respect for social science can contribute to their deficit view of the public [60]. Scientists who have a positive attitude toward social sciences are more likely to stray from the deficit model of scientific opinion formation [60].

In a deficit model of science communication, experts need only inform the audience and provide data and scientific facts in an understandable way. In a more dialogic model of science communication, scientific experts have the responsibilities of “sharing input that is well received by others, listening to and learning the input of others, and investing in a relationship with others” [56]. Thus, utilizing non-deficit science communication approaches requires different skills on the part of scientists, and the science community needs additional training in these skills [52].

There are multiple ways that different science communication models can intersect with misinformation. In terms of communication about health, the move to more participatory approaches can build more one-on-one trusting relationships in science and health information; conversely, it could also lead to the rise of misinformation as scientific information is disseminated on social media and other sources instead of being filtered through experts [61]. When it comes to correcting misinformation, the deficit model assumes that any public skepticism in science is due to a lack of knowledge and can be solved with more information [62], suggesting that literacy- and source-based approaches to correcting misinformation may be appropriate. Conversely, more dialogic models recognize the various contexts that may influence public distrust or skepticism of science [62], suggesting that contextual and relational approaches to correcting misinformation may be more effective. Thus, there are complex interactions between science communication models and the rise, dissemination, and correction of misinformation.

In this study, with a focus on scientists themselves and their communication activities, we were especially interested in how scientists’ perceptions of the public, and particularly how strongly they see the public as a deficient “other” [60], influence their desire to correct misinformation. Fig 1 presents the conceptual model for the following research question.

  1. RQ1: Do the direct and indirect effects of engagement in science communication activities on behavioral intentions to correct misinformation using (a) source-based strategies and (b) relational strategies differ based on the level of the deficit model perception?

Method

Ethics statement

This study and its consent procedure were approved by the Institutional Review Board at Colorado State University (IRB: #3219), where the authors work. Before beginning the online survey via Qualtrics, all participants provided written informed consent. Personally identifying information (e.g., IP addresses) was not collected from any respondent. Participants consented to sharing their data in the aggregate and not as individual responses. To uphold the ethical guidelines of our Institutional Review Board and respect the privacy of our participants, we cannot share the data set in a public repository. Data requests may be sent to the Colorado State University Institutional Review Board (CSU_IRB@colostate.edu).

Procedures and participants

This study utilized a geographically diverse sample of U.S. academic scientists from 25 land-grant universities. Within the six accreditation regions of the Council for Higher Education Accreditation [63] in the United States (i.e., New England, Middle States, North Central, Southern, Western, and Northwest), four land-grant institutions were randomly chosen from each region.

The sample was also stratified by type of land-grant university. In the United States, there are three types of land-grant institutions [64]: (1) the original 57 public universities of 1862, which comprise more than 50% of the land-grant universities (57 out of 112); (2) the 19 historically Black colleges and universities that were added in 1890; and (3) the 36 tribal colleges and universities that were added in 1994 [64]. For each accreditation region, two institutions were randomly chosen from the 1862 group, one institution was randomly chosen from the 1890 group, and one was randomly chosen from the 1994 group. If a region had no institutions from the 1890 group, an additional institution from the 1994 group was randomly chosen, and vice versa. If neither the 1890 group nor the 1994 group was represented in an accreditation region, all four institutions were randomly chosen from the 1862 group.
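For illustration only, the stratified draw described above could be scripted roughly as follows in R (a minimal sketch, not the authors' code), assuming a hypothetical data frame named institutions with columns name, region, and type (type coded "1862", "1890", or "1994"); all object names are illustrative:

  set.seed(2022)

  draw_region <- function(inst) {
    pick <- function(type, n) {
      pool <- inst$name[inst$type == type]
      if (length(pool) == 0 || n == 0) return(character(0))
      sample(pool, min(n, length(pool)))
    }
    has_1890 <- any(inst$type == "1890")
    has_1994 <- any(inst$type == "1994")
    if (!has_1890 && !has_1994) return(pick("1862", 4))  # neither group present: all four from 1862
    n_1890 <- 1; n_1994 <- 1
    if (!has_1890) n_1994 <- 2   # no 1890 institution in the region: its slot goes to the 1994 group
    if (!has_1994) n_1890 <- 2   # and vice versa
    c(pick("1862", 2), pick("1890", n_1890), pick("1994", n_1994))
  }

  # Two 1862, one 1890, and one 1994 institution per accreditation region
  sampled <- unlist(lapply(split(institutions, institutions$region), draw_region))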

Within each land-grant university, six departments were chosen: five departments were randomly chosen from STEM (science, technology, engineering, and mathematics) fields based on the U.S. National Science Foundation (NSF) list [65], and one department was randomly chosen from the agriculture field. In addition, the authors’ land-grant university was added to the institution list.

Five research assistants visited the websites of selected departments to compile a list of faculty and researchers. Then, using manual search methods, the assistants gathered the email addresses of these individuals from each institution’s website and recorded them in a database. A total of 5,184 emails were obtained from 24 land-grant universities. Between May and June 2022, potential respondents received an initial invitation followed by two reminders. The survey was also distributed over a listserv that included all academic researchers and faculty at the authors’ institution.

Although 454 responses were collected, not all respondents answered all the questions. Respondents (n = 38) who only answered the first section of the questionnaire were eliminated from the sample, leaving a final sample size of 416. In addition, 13% of the data for key analysis variables were missing, and we assessed the structure of the missing values using visualization methods [66,67]. As the visual inspection showed unstructured missing-data patterns with no evident mechanism, missing data were handled using hot-deck imputation with the VIM package in R. Hot-deck imputation replaces a missing value (the recipient) with a value from a similar case that has no missing data (the donor) [66,68]. This method allowed all 416 remaining responses to be used in the analyses.
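As a concrete illustration of this step (a minimal sketch, not the authors' code), the missingness inspection and hot-deck imputation described above can be run with the VIM package roughly as follows; the data frame and variable names are hypothetical:

  library(VIM)

  # Visualize the pattern of missing values across the key variables
  aggr(dat, numbers = TRUE, sortVars = TRUE)

  # Hot-deck imputation: each missing value (recipient) is filled with the value of a
  # similar complete case (donor); hotdeck() also appends logical *_imp indicator columns
  dat <- hotdeck(dat, variable = c("sci_comm_engagement", "misinfo_attitude",
                                   "deficit_perception", "relational_intent",
                                   "source_intent"))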

Of the 416 respondents, 218 were male (52.4%) and 168 were female (40.3%), followed by those who preferred not to disclose their gender (6%) and those who identified as non-binary (1.2%). The scientists who responded were from life sciences (22.6%), social sciences (21.2%), math/computer sciences (14.9%), physical sciences (13.2%), humanities (12.3%), agriculture (8.7%), and material sciences and engineering (7.2%). When asked about the decade in which they received their highest degree, respondents reported receiving it in the 2020s (15.9%), the 2010s (29.6%), the 2000s (22.4%), the 1990s (18.3%), or prior to the 1980s (13.9%). The majority of scientists reported themselves to be White (76.2%), followed by those who indicated “prefer not to answer” (6.7%), mixed ethnicity (5.8%), Hispanic (4.8%), and Asian (4.1%). The remainder of the respondents indicated they were either Black, Native American, or other (2.4%).

Measurements

Engagement in science communication.

Engagement in science communication was measured with five items. Respondents were asked, “During the past 5 years, about how often have you participated in the following activities?”: (1) Met with local, state, or federal policymakers, (2) Participated in public meetings, deliberative forums, or science festivals, (3) Given an interview with a journalist, (4) Written a news article, press release, blog post, or op-ed, and (5) Posted about a scientific topic on social media [42]. A 5-point Likert-type scale was used, ranging from “not at all” to “very often”; responses were combined into one variable (α = .70).

Attitudes toward misinformation.

To measure attitudes toward misinformation, respondents were asked, “Misinformation can impact different areas of society. How concerned are you about the harm of misinformation to the following?”: (1) environment, (2) human health, (3) animal health, (4) political climate, (5) government, (6) news media, and (7) society as a whole (adapted from [69]). These items used a 5-point Likert-type scale ranging from “not very concerned” to “very concerned” and were combined into one variable (α = .85).

Deficit model perception.

We adopted Yuan et al.’s scale to assess how scientists view the general public [70]. Because the original items were designed to measure scientists’ attitudes toward the public, we used the single item that captures the extent to which scientists hold a deficit model perception of the public: “the general public has little knowledge about science.” A 5-point Likert-type scale was used, ranging from “strongly disagree” to “strongly agree” (M = 4.05, SD = 1.05).

Behavioral intention to correct misinformation using relational strategies.

Individuals’ intention to correct misinformation using relational strategies was assessed by asking respondents whether they would correct misinformation using relational strategies on social media and in one-on-one conversations. The questions included: “How likely are you to do the following to correct misinformation in the future? (1) use my own words to respond to a post on social media, and (2) talk with someone one-on-one.” A 5-point Likert-type scale was also used ranging from “extremely unlikely” to “extremely likely,” and these two items were averaged into a single item (α = .71).

Behavioral intention to correct misinformation using source-based strategies.

Individuals’ intention to correct misinformation using source-based strategies was assessed by asking “How likely are you to do the following to correct misinformation in the future? (1) provide corrective information from a government organization, (2) provide corrective information from news media, and (3) provide corrective information from schools, faith organizations, or other community organizations.” A 5-point Likert-type scale was used ranging from “extremely unlikely” to “extremely likely,” and these three items were averaged into a single item (α = .82).
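For readers who want to reproduce this kind of scale construction, a minimal R sketch is shown below; the data frame and item names are illustrative (not the authors’), and psych::alpha() is one common way to obtain Cronbach’s alpha values like those reported above. The behavioral-intention items are described above as averaged; for simplicity, the sketch averages all four scales.

  library(psych)   # for alpha()

  # dat holds the raw survey items; all names below are hypothetical
  engage_items     <- dat[, c("eng_policy", "eng_meetings", "eng_interview",
                              "eng_writing", "eng_social")]
  attitude_items   <- dat[, paste0("concern_", 1:7)]
  relational_items <- dat[, c("corr_ownwords", "corr_onetoone")]
  source_items     <- dat[, c("corr_gov", "corr_news", "corr_community")]

  # Reliability (Cronbach's alpha) for each multi-item scale
  alpha(engage_items)$total$raw_alpha      # reported above as .70
  alpha(attitude_items)$total$raw_alpha    # .85
  alpha(relational_items)$total$raw_alpha  # .71
  alpha(source_items)$total$raw_alpha      # .82

  # Composite scores used in the analyses
  dat$sci_comm_engagement <- rowMeans(engage_items, na.rm = TRUE)
  dat$misinfo_attitude    <- rowMeans(attitude_items, na.rm = TRUE)
  dat$relational_intent   <- rowMeans(relational_items, na.rm = TRUE)
  dat$source_intent       <- rowMeans(source_items, na.rm = TRUE)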

Control variables.

Previous studies, such as [71], have controlled for demographic variables, such as gender, race/ethnicity, and discipline. Therefore, in our analyses, we controlled for participants’ gender, race/ethnicity, and discipline.

Statistical analyses.

To investigate hypotheses 1 and 2, we used hierarchical ordinary least squares regression, which enabled us to enter variables in distinct blocks and to test the incremental change in R2 for each block as well as the relative effects of variables while accounting for those entered simultaneously or in earlier steps [72]. To test hypothesis 3 and the research question, we used PROCESS Model 4 and Model 14, respectively, with 5,000 bootstrap iterations to evaluate the unstandardized indirect effects. The PROCESS macro [73] has been widely used to evaluate indirect effects in mediation models.
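This analytic sequence might look roughly as follows in R (an illustrative sketch only, not the authors' code), assuming Hayes' PROCESS macro for R (process.R) has been sourced, that the composite variables are named as in the measurement sketch above, and that the control variables are coded as factors named gender, race, and discipline (names illustrative):

  # Hierarchical OLS for H1/H2: Block 1 = controls, Block 2 adds engagement
  m1 <- lm(relational_intent ~ gender + race + discipline, data = dat)
  m2 <- update(m1, . ~ . + sci_comm_engagement)
  anova(m1, m2)   # test of the incremental R^2
  summary(m2)

  # Hayes' PROCESS for R must be sourced manually; it is not distributed on CRAN
  source("process.R")

  # H3: simple mediation through attitude toward misinformation (PROCESS Model 4);
  # control variables omitted here for brevity, though PROCESS accepts covariates
  process(data = dat, y = "relational_intent", x = "sci_comm_engagement",
          m = "misinfo_attitude", model = 4, boot = 5000, seed = 1234)

  # RQ1: deficit model perception moderating the mediator-to-outcome path (Model 14)
  process(data = dat, y = "relational_intent", x = "sci_comm_engagement",
          m = "misinfo_attitude", w = "deficit_perception", model = 14,
          boot = 5000, seed = 1234)

  # The same calls with y = "source_intent" address the source-based outcome.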

Results

We hypothesized that higher levels of engagement in science communication activities are positively related to behavioral intentions to correct misinformation using (a) source-based strategies and (b) relational strategies. As shown in Table 1, H1(a) and H1(b) were supported. Engagement in science communication activities was positively correlated with behavioral intentions to correct misinformation using source-based strategies (β = 3.53, p < .001) and using relational strategies (β = 5.47, p < .001).

H2 predicted that engagement in science communication activities is positively related to attitudes toward misinformation. The results, as shown in Table 1, indicated that engagement in science communication activities is positively related to attitude toward misinformation (β = .10, p < .05).

Hypothesis 3 predicted that attitudes toward misinformation mediate the effect of science communication activities on behavioral intentions to correct misinformation using (a) source-based strategies and (b) relational strategies. As shown in Table 2, our results supported H3(a) and H3(b). As predicted, the indirect effect via attitudes toward misinformation was significant for behavioral intentions to correct misinformation using source-based strategies (indirect effect = .0435, 95% CI .0073, .0819) and using relational strategies (indirect effect = .0297, 95% CI .0046, .0608).

Table 2. Bootstrap results of the direct and indirect effects of engagement in science communication activities on outcome variables.

https://doi.org/10.1371/journal.pone.0287870.t002

A moderated mediation analysis was performed to test the research question about the deficit model perception. Model 14 from Hayes [73] showed a significant moderated mediation, suggesting a conditional indirect effect on behavioral intentions to correct misinformation using relational strategies (index = -.0114, SE = .0080, 95% CI -.0314, -.0003). Under the condition of a low deficit model perception, attitude toward misinformation mediated the effects of science communication activities on behavioral intentions to correct misinformation using relational strategies (indirect effect = .0374, SE = .0181, 95% CI .0064, .0777, significant as the CI excludes zero). However, under the condition of a high deficit model perception, the mediation was diminished (indirect effect = .0146, SE = .0130, 95% CI -.0063, .0441) and non-significant (overall model fit: adjusted R2 = .1309, F(4, 408) = 8.78, p < .001). See Tables 3 and 4, and Fig 2.

Fig 2. Interaction of scientists’ attitude toward misinformation and their deficit model perception on their intentions to correct misinformation using relational strategies.

https://doi.org/10.1371/journal.pone.0287870.g002

Table 3. Model coefficients for the conditional process model (process model 14).

https://doi.org/10.1371/journal.pone.0287870.t003

Table 4. Conditional indirect effect at specific levels of the deficit model perception.

https://doi.org/10.1371/journal.pone.0287870.t004

On the other hand, the data showed no significant moderated mediation of the deficit model perception on behavioral intentions to correct misinformation using source-based strategies (index = -.0066, SE = .0063, 95% CI -.0223, .0025).

Discussion

The goal of this study was to understand how scientists’ communication experience and attitudes toward misinformation relate to their intentions to correct misinformation using two common strategies: source-based and relational approaches. Of note, our study identified an important potential barrier to the likelihood that scientists will engage in relational approaches: the deficit model perception. Despite increased sophistication in understanding how publics approach, interpret and respond to scientific issues, the deficit model–i.e., the idea that scientists need to simply transmit scientific knowledge to the public to increase understanding and support for science–persists. Therefore, our study raises important considerations for understanding what impedes scientists’ involvement in the pressing societal challenge of addressing misinformation.

Limitations

Before we address the study findings further, it is important to note several limitations of our study. First, the study highlights the importance of the deficit model perception, a potential barrier to scientists’ likelihood of engaging in relational approaches. However, a single-item measure of the deficit model perception may not be sufficient; multiple-item measures perform consistently better than their single-item equivalents [74]. Future work should consider exploring different dimensions of the deficit model perception. Furthermore, while the current study utilized self-reported survey data to explore factors that affect scientists’ use of source-based and relational strategies to address misinformation, the survey cannot confirm the causal relationships proposed in the conceptual model. Future studies should employ experimental designs to test these causal relationships. Finally, our sampling of scientists at land-grant universities limits the generalizability of the findings to scientists at other institutions. Scientists at land-grant institutions, however, represent a group that may be particularly likely to participate in activities related to addressing misinformation, given that public engagement and outreach are central to the mission of such institutions. We also sampled widely across geographical regions and other characteristics, such as type of land-grant institution.

Findings

Our study identifies how land-grant university scientists’ engagement in science communication activities contributes to their behavioral intentions to correct misinformation using source-based strategies and relational strategies. As expected, scientists’ higher levels of past engagement in science communication activities are positively associated with behavioral intentions to correct misinformation using source-based strategies and relational strategies. This finding is consistent with research showing that scientists’ public communication activities are related to other science communication activities [15,22]. Also as expected, scientists who have more engagement in science communication activities perceive misinformation as more of a concern. Research in other contexts shows that increased involvement in communication activities, such as social media use, is positively associated with attitudes toward the same communication activity [75]. Scientists who have experience in science communication activities may have an increased awareness of prominent communication challenges in society, leading to their willingness to address those challenges.

Our results also show that scientists’ attitudes toward misinformation served as a significant mediator of the relationship between engagement in science communication activities and behavioral intentions to correct misinformation using source-based strategies and relational strategies. That is, when scientists were concerned about misinformation as a result of engagement in science communication activities, they were more likely to intend to correct misinformation using source-based and relational strategies. This finding is consistent with previous research showing that attitudes are an indicator of behavioral intentions in the misinformation literature. For example, attitudes of social media users are positively associated with stronger behavioral intentions to recognize rumors in emergencies [16]. In addition, misinformation has been characterized as a prominent risk in society [69], and when people form higher risk perceptions of issues, they are more likely to act on them.

Our study revealed a moderated mediation relationship, such that engagement in science communication activities had an indirect effect on behavioral intentions to correct misinformation through attitude towards misinformation, and this relationship was moderated by the deficit model perception. Specifically, for scientists with a stronger belief in the deficit model, the indirect effect was weaker, indicating that their attitude towards misinformation played a smaller role in explaining the relationship between engagement in science communication activities and behavioral intentions to correct misinformation using relational strategies. That is, when scientists think the public knows little about science, they may be less likely to engage in relational conversations to address misinformation in their communities.

Implications

It is possible that if scientists are not engaging in conversations with diverse members of the public, these members of the public may seek other spaces where their perspectives are heard, such as echo chambers where misinformation can spread [76]. Thus, coupled with public perceptions of uncertainties in science itself and a lack of consensus within the scientific community [77], scientists’ communication attitudes and behaviors can be a source of polarization. Scientists should instead grow as boundary spanners [78], recognizing their dual positionality as scientists and as members of their neighborhoods and other community groups. They have a responsibility to engage in respectful conversations about their community with fellow scientists and about science with community members. Greater attention should be paid to relational strategies for addressing misinformation. We suggest that science communication training grounded in inclusive worldviews and skills, rather than in the deficit model perception [79], would increase behavioral intentions to address misinformation relationally.

Policy recommendations

Scientists’ engagement in addressing key science communication challenges, such as misinformation, has the potential to shape important societal outcomes. Misinformation negatively impacts human physical and mental health, social cohesion, and environmental systems [80,81]. Additionally, given the increased emphasis on communication in digital and social media environments, other communication problems may exacerbate the impacts of misinformation. Problems such as cyberbullying can have a greater impact on those who are more socially vulnerable, such as isolated elderly individuals [82] and adolescents [83]. Communication and human networks are part of the ecological system that enables positive health outcomes [84]. Encouraging greater engagement with communication among various publics in society, including scientists, will contribute to social change.

A number of factors go into developing capacities for scientists to engage in public communication activities, and many of those factors fall under organizational support for scientists. Training opportunities that promote inclusive, rather than deficit-model, science communication engagement should be built into the curriculum for STEM students rather than treated as something outside of the degree-seeking experience [79]. Similarly, models of scholarly engagement that expand beyond traditional outputs (e.g., peer-reviewed publications) should be adopted at universities for scientific faculty. When scientists and STEM students are personally motivated to participate in science communication to address misinformation [85], institutions should provide direction and support for communication engagement that satisfies those motivations.

Conclusions

This study investigated the relationships between scientists’ communication activities, attitudes towards misinformation, the deficit model perception, and their intentions to correct misinformation using source-based and relational strategies. The study showed that scientists’ engagement in science communication activities and their attitudes towards misinformation are significantly related to their behavioral intentions to correct misinformation using source-based and relational strategies. In addition, our results highlighted that the deficit model perception moderates the indirect effect of engagement in science communication activities on behavioral intentions to correct misinformation using relational strategies through attitude towards misinformation. In light of the urgent social challenge that misinformation poses, there is a need to focus more on employing effective strategies to address it. This calls for a collaborative effort among scientists, policymakers, and other stakeholders to establish science communication policies that promote training in and adoption of relational strategies to address misinformation.

Acknowledgments

AA received an award (#6476450) supported by Colorado State University’s Office of the Vice President for Research’s (https://www.research.colostate.edu/) “Accelerating Innovations in Pandemic Disease” initiative, made possible through support from The Anschutz Foundation.

References

  1. Micah AE, Bhangdia K, Cogswell IE, Lasher D, Lidral-Porter B, Maddison ER, et al. Global investments in pandemic preparedness and COVID-19: development assistance and domestic spending on health between 1990 and 2026. The Lancet Global Health. 2023;11: e385–e413. pmid:36706770
  2. Aqeel M, Rehna T, Shuja KH, Abbas J. Comparison of Students’ Mental Wellbeing, Anxiety, Depression, and Quality of Life During COVID-19’s Full and Partial (Smart) Lockdowns: A Follow-Up Study at a 5-Month Interval. Front Psychiatry. 2022;13: 835585. pmid:35530024
  3. U.S. Surgeon General. Confronting Health Misinformation: The U.S. Surgeon General’s Advisory on Building a Healthy Information Environment. U.S. Surgeon General; 2021. Available: https://www.hhs.gov/sites/default/files/surgeon-general-misinformation-advisory.pdf.
  4. Anspach NM, Carlson TN. Not who you think? Exposure and vulnerability to misinformation. New Media & Society. 2022; 146144482211304.
  5. Scherer LD, Pennycook G. Who Is Susceptible to Online Health Misinformation? Am J Public Health. 2020;110: S276–S277. pmid:33001736
  6. Ecker UKH, Lewandowsky S, Cook J, Schmid P, Fazio LK, Brashier N, et al. The psychological drivers of misinformation belief and its resistance to correction. Nat Rev Psychol. 2022;1: 13–29.
  7. Roozenbeek J, van der Linden S. How to Combat Health Misinformation: A Psychological Approach. Am J Health Promot. 2022;36: 569–575. pmid:35164546
  8. Bode L, Vraga EK. Correction Experiences on Social Media During COVID-19. Social Media + Society. 2021;7: 205630512110088.
  9. Union of Concerned Scientists. Countering disinformation in your community. Union of Concerned Scientists. 2022. Available: www.ucsusa.org/resources/what-you-can-do-about-disinformation.
  10. National Academies of Sciences, Engineering, and Medicine. Addressing Inaccurate and Misleading Information About Biological Threats Through Scientific Collaboration and Communication in Southeast Asia. Washington, D.C.: National Academies Press; 2022. p. 26466. https://doi.org/10.17226/26466
  11. Dudo A. Toward a Model of Scientists’ Public Communication Activity: The Case of Biomedical Researchers. Science Communication. 2013;35: 476–501.
  12. Rose KM, Holesovsky CM, Brossard D, Markowitz E. Faculty Public Engagement Attitudes and Practices at Land-Grant Universities in the United States. Madison, WI: University of Wisconsin-Madison, Department of Life Sciences Communication. 2019; 94.
  13. Martín-Sempere MJ, Garzón-García B, Rey-Rocha J. Scientists’ motivation to communicate science and technology to the public: surveying participants at the Madrid Science Fair. Public Underst Sci. 2008;17: 349–367.
  14. Pearson G. The participation of scientists in public understanding of science activities: The policy and practice of the U.K. Research Councils. Public Underst Sci. 2001;10: 121–137.
  15. Poliakoff E, Webb TL. What Factors Predict Scientists’ Intentions to Participate in Public Engagement of Science Activities? Science Communication. 2007;29: 242–263.
  16. Ding X, Zhang X, Fan R, Xu Q, Hunt K, Zhuang J. Rumor recognition behavior of social media users in emergencies. Journal of Management Science and Engineering. 2022;7: 36–47.
  17. Khan ML, Idris IK. Recognise misinformation and verify before sharing: a reasoned action and information literacy perspective. Behaviour & Information Technology. 2019;38: 1194–1212.
  18. Pundir V, Devi EB, Nath V. Arresting fake news sharing on social media: a theory of planned behavior approach. Management Research Review. 2021;44: 1108–1138.
  19. National Academies of Sciences, Engineering, and Medicine. Addressing Health Misinformation with Health Literacy Strategies: Proceedings of a Workshop—in Brief. Wojtowicz A, editor. Washington (DC): National Academies Press (US); 2020. Available: http://www.ncbi.nlm.nih.gov/books/NBK565935/.
  20. Wallerstein N, Duran B. Community-Based Participatory Research Contributions to Intervention Research: The Intersection of Science and Practice to Improve Health Equity. Am J Public Health. 2010;100: S40–S46. pmid:20147663
  21. Davies SR. Constructing communication: Talking to scientists about talking to the public. Science Communication. 2008; 413–434.
  22. Dudo A, Besley JC. Scientists’ Prioritization of Communication Objectives for Public Engagement. PLOS ONE. 2016;11: e0148867. pmid:26913869
  23. Kuklinski JH, Quirk PJ, Jerit J, Schwieder D, Rich RF. Misinformation and the Currency of Democratic Citizenship. The Journal of Politics. 2000;62: 790–816.
  24. Lewandowsky S, Ecker UKH, Seifert CM, Schwarz N, Cook J. Misinformation and Its Correction: Continued Influence and Successful Debiasing. Psychol Sci Public Interest. 2012;13: 106–131. pmid:26173286
  25. Guess AM, Lyons BA. Misinformation, Disinformation, and Online Propaganda. Social Media and Democracy: The State of the Field, Prospects for Reform. Cambridge University Press; 2020.
  26. Vraga EK, Bode L. Correction as a Solution for Health Misinformation on Social Media. Am J Public Health. 2020;110: S278–S280. pmid:33001724
  27. Vraga EK, Bode L, Tully M. Creating News Literacy Messages to Enhance Expert Corrections of Misinformation on Twitter. Communication Research. 2022;49: 245–267.
  28. Vraga EK, Bode L. Defining Misinformation and Understanding its Bounded Nature: Using Expertise and Evidence for Describing Misinformation. Political Communication. 2020;37: 136–144.
  29. Traberg CS, Roozenbeek J, van der Linden S. Psychological Inoculation against Misinformation: Current Evidence and Future Directions. The ANNALS of the American Academy of Political and Social Science. 2022;700: 136–151.
  30. Vraga EK, Tully M, Maksl A, Craft S, Ashley S. Theorizing News Literacy Behaviors. Communication Theory. 2021;31: 1–21.
  31. Bode L, Vraga E. Value for Correction: Documenting Perceptions about Peer Correction of Misinformation on Social Media in the Context of COVID-19. JQD. 2021;1.
  32. Chou W-YS, Gaysynsky A, Cappella JN. Where We Go From Here: Health Misinformation on Social Media. Am J Public Health. 2020;110: S273–S275. pmid:33001722
  33. Vraga EK, Bode L. Using Expert Sources to Correct Health Misinformation in Social Media. Science Communication. 2017;39: 621–645.
  34. Pollard MS, Davis LM. Decline in Trust in the Centers for Disease Control and Prevention During the COVID-19 Pandemic. Rand Health Q. 2022;9: 23. pmid:35837520
  35. Malhotra P. A Relationship-Centered and Culturally Informed Approach to Studying Misinformation on COVID-19. Social Media + Society. 2020;6: 205630512094822. pmid:34192033
  36. Wirz CD, Cate A, Brauer M, Brossard D, DiPrete Brown L, Chen K, et al. Science communication during COVID-19: when theory meets practice and best practices meet reality. JCOM. 2022;21: N01.
  37. Lee J, Britt BC, Kanthawala S. Taking the lead in misinformation-related conversations in social media networks during a mass shooting crisis. Internet Research. 2022;ahead-of-print.
  38. Peterson E, Iyengar S. Partisan Gaps in Political Information and Information‐Seeking Behavior: Motivated Reasoning or Cheerleading? American Journal of Political Science. 2021;65: 133–147.
  39. Carpini MXD, Cook FL, Jacobs LR. Public deliberation, discursive participation, and citizen engagement: A review of the empirical literature. Annual Review of Political Science. 2004;7: 315–344.
  40. Margolin DB, Hannak A, Weber I. Political Fact-Checking on Twitter: When Do Corrections Have an Effect? Political Communication. 2017;35: 196–219.
  41. Pearce KE, Malhotra P. Inaccuracies and Izzat: Channel Affordances for the Consideration of Face in Misinformation Correction. Journal of Computer-Mediated Communication. 2022;27: zmac004.
  42. Rose KM, Markowitz EM, Brossard D. Scientists’ incentives and attitudes toward public communication. Proc Natl Acad Sci USA. 2020; 201916740. pmid:31911470
  43. Hara N, Abbazio J, Perkins K. An emerging form of public engagement with science: Ask Me Anything (AMA) sessions on Reddit r/science. PLOS ONE. 2019;14: e0216789. pmid:31091264
  44. Bultitude K. The Why and How of Science Communication. In: Rosulek, editor. Science Communication. Pilsen: European Commission; 2011. Available: https://www.scifode-foundation.org/attachments/article/38/Karen_Bultitude_-_Science_Communication_Why_and_How.pdf.
  45. Jucan MS, Jucan CN. The Power of Science Communication. Procedia—Social and Behavioral Sciences. 2014;149: 461–466.
  46. Husain F, Shahnawaz MG, Khan NH, Parveen H, Savani K. Intention to get COVID-19 vaccines: Exploring the role of attitudes, subjective norms, perceived behavioral control, belief in COVID-19 misinformation, and vaccine confidence in Northern India. Human Vaccines & Immunotherapeutics. 2021;17: 3941–3953. pmid:34546837
  47. Bautista JR, Zhang Y, Gwizdka J. Professional Identity and Perceived Crisis Severity as Antecedents of Healthcare Professionals’ Responses to Health Misinformation on Social Media. In: Smits M, editor. Information for a Better World: Shaping the Global Future. Cham: Springer International Publishing; 2022. pp. 273–291. https://doi.org/10.1007/978-3-030-96960-8_19
  48. Secko DM, Amend E, Friday T. Four Models of Science Journalism. Journalism Practice. 2013;7: 62–80.
  49. Brossard D, Lewenstein BV. A Critical Appraisal of Models of Public Understanding of Science: Using Practice to Inform Theory. Communicating Science. Routledge; 2009.
  50. Trench B. Towards an Analytical Framework of Science Communication Models. In: Cheng D, Claessens M, Gascoigne T, Metcalfe J, Schiele B, Shi S, editors. Communicating Science in Social Contexts: New models, new practices. Dordrecht: Springer Netherlands; 2008. pp. 119–135. https://doi.org/10.1007/978-1-4020-8598-7_7
  51. Akin H, Scheufele DA. Overview of the Science of Science Communication. In: Jamieson KH, Kahan D, Scheufele DA, editors. The Oxford Handbook of the Science of Science Communication. Oxford, New York: Oxford University Press; 2017. pp. 25–33.
  52. Besley JC, Tanner AH. What Science Communication Scholars Think About Training Scientists to Communicate. Science Communication. 2011;33: 239–263.
  53. Priest SHP. Misplaced Faith: Communication Variables as Predictors of Encouragement for Biotechnology Development. Science Communication. 2001;23: 97–110.
  54. Suldovsky B. In science communication, why does the idea of the public deficit always return? Exploring key influences. Public Underst Sci. 2016;25: 415–426. pmid:27117769
  55. Metcalfe J. Comparing science communication theory with practice: An assessment and critique using Australian data. Public Underst Sci. 2019;28: 382–400. pmid:30755086
  56. Reincke CM, Bredenoord AL, van Mil MH. From deficit to dialogue in science communication: The dialogue communication model requires additional roles from scientists. EMBO Rep. 2020;21. pmid:32748995
  57. Callwood KA, Weiss M, Hendricks R, Taylor TG. Acknowledging and Supplanting White Supremacy Culture in Science Communication and STEM: The Role of Science Communication Trainers. Frontiers in Communication. 2022;7. Available: https://www.frontiersin.org/articles/10.3389/fcomm.2022.787750.
  58. Bucchi M. Of deficits, deviations and dialogues: Theories of public communication of science. Handbook of Public Communication of Science and Technology. Routledge; 2008.
  59. Besley JC, Nisbet M. How scientists view the public, the media and the political process. Public Underst Sci. 2013;22: 644–659. pmid:23885050
  60. Simis MJ, Madden H, Cacciatore MA, Yeo SK. The lure of rationality: Why does the deficit model persist in science communication? Public Understanding of Science. 2016;25: 400–414. pmid:27117768
  61. Ko H. In science communication, why does the idea of a public deficit always return? How do the shifting information flows in healthcare affect the deficit model of science communication? Public Underst Sci. 2016;25: 427–432. pmid:27117770
  62. Houtman D, Vijlbrief B, Riedijk S. Experts in science communication. EMBO reports. 2021;22: e52988. pmid:34269513
  63. Council for Higher Education Accreditation. Regional Accrediting Organizations. [cited 6 Jul 2022]. Available: https://www.chea.org/regional-accrediting-organizations-accreditor-type.
  64. National Institute of Food and Agriculture. College Partners Directory. In: National Institute of Food and Agriculture [Internet]. [cited 6 Jul 2022]. Available: http://www.nifa.usda.gov/land-grant-colleges-and-universities-partner-website-directory.
  65. NSF. Research Areas. [cited 6 Jul 2022]. Available: https://www.nsf.gov/about/research_areas.jsp.
  66. Kowarik A, Templ M. Imputation with the R Package VIM. Journal of Statistical Software. 2016;74: 1–16.
  67. Templ M, Alfons A, Filzmoser P. Exploring incomplete data using visualization techniques. Adv Data Anal Classif. 2012;6: 29–47.
  68. Gill J, Cranmer S, Jackson N, Murr A, Armstrong D, Heuberger S. Multiple Hot-Deck Imputation. 2021. Available: https://cran.r-project.org/web/packages/hot.deck/hot.deck.pdf.
  69. Krause NM, Freiling I, Beets B, Brossard D. Fact-checking as risk communication: the multi-layered risk of misinformation in times of COVID-19. Journal of Risk Research. 2020;23: 1052–1059.
  70. Yuan S, Besley JC, Dudo A. A comparison between scientists’ and communication scholars’ views about scientists’ public engagement activities. Public Underst Sci. 2019;28: 101–118. pmid:30175667
  71. Copple J, Bennett N, Dudo A, Moon W-K, Newman TP, Besley J, et al. Contribution of Training to Scientists’ Public Engagement Intentions: A Test of Indirect Relationships Using Parallel Multiple Mediation. Science Communication. 2020;42: 508–537.
  72. Cohen P, Cohen P, West SG, Aiken LS. Applied Multiple Regression/Correlation Analysis for the Behavioral Sciences. 2nd ed. New York: Psychology Press; 2014. https://doi.org/10.4324/9781410606266
  73. Hayes AF. Introduction to Mediation, Moderation, and Conditional Process Analysis, Second Edition: A Regression-Based Approach. Guilford Publications; 2017.
  74. Kamakura WA. Measure twice and cut once: the carpenter’s rule still applies. Mark Lett. 2015;26: 237–243.
  75. Yu S, Abbas J, Draghici A, Negulescu OH, Ain NU. Social Media Application as a New Paradigm for Business Communication: The Role of COVID-19 Knowledge, Social Distancing, and Preventive Attitudes. Front Psychol. 2022;13: 903082. pmid:35664180
  76. Wollebæk D, Karlsen R, Steen-Johnsen K, Enjolras B. Anger, Fear, and Echo Chambers: The Emotional Basis for Online Behavior. Social Media + Society. 2019;5: 2056305119829859.
  77. O’Connor C, Weatherall JO. Scientific polarization. Euro Jnl Phil Sci. 2018;8: 855–875.
  78. Shah H, Simeon J, Fisher KQ, Eddy SL. Talking Science: Undergraduates’ Everyday Conversations as Acts of Boundary Spanning That Connect Science to Local Communities. LSE. 2022;21: ar12. pmid:35179951
  79. Vickery R, Murphy K, McMillan R, Alderfer S, Donkoh J, Kelp N. Analysis of Inclusivity of Published Science Communication Curricula for Scientists and STEM Students. Prevost L, editor. LSE. 2023;22: ar8. pmid:36637377
  80. CCA (Council of Canadian Academies). Fault Lines. Ottawa, ON: Expert Panel on the Socioeconomic Impacts of Science and Health Misinformation, CCA; 2023.
  81. Shoib S, Gaitán Buitrago JET, Shuja KH, Aqeel M, de Filippis R, Abbas J, et al. Suicidal behavior sociocultural factors in developing countries during COVID-19. L’Encéphale. 2022;48: 78–82. pmid:34654566
  82. Su Z, Cheshmehzangi A, Bentley BL, McDonnell D, Šegalo S, Ahmad J, et al. Technology-based interventions for health challenges older women face amid COVID-19: a systematic review protocol. Syst Rev. 2022;11: 271. pmid:36514147
  83. Yao J, Ziapour A, Abbas J, Toraji R, NeJhaddadgar N. Assessing puberty-related health needs among 10–15-year-old boys: A cross-sectional study approach. Archives de Pédiatrie. 2022;29: 307–311. pmid:35292195
  84. Iorember PT, Iormom B, Jato TP, Abbas J. Understanding the bearable link between ecology and health outcomes: the criticality of human capital development and energy use. Heliyon. 2022;8: e12611. pmid:36619406
  85. Bennett N, Dudo A, Besley J. STEM Graduate Students’ Perspectives on Science Communication and Their Sense of Belonging in These Spaces. Center for Media Engagement. 2022. Available: https://mediaengagement.org/research/stem-graduate-students-perspectives-on-science-communication.