Using virtual presence and survey instructions to minimize careless responding on Internet-based surveys
Introduction
Advances in technology have spurred the extensive use of Internet-based surveys. Data from Internet-based surveys support knowledge creation in research and inform applied work in many organizations (e.g., Acquavita, 2009; Anderson, 2010; Berta, 2006; Marx, 2012; Mervis, 2007; Patrick, 2012). While Internet-based surveys offer decided advantages, this mode of administration brings its own challenges. For example, respondents may ignore important parts of the survey, submit responses multiple times, and often exhibit careless responding (CR; Barak and English, 2002; Barge and Gehlbach, 2012; Berry et al., 1992; Curran et al., 2010; Hardré et al., 2012; Johnson, 2005; Meade and Craig, 2012; Robinson-Cimpian, 2014). That is, either intentionally or unintentionally, respondents may answer survey items in a manner that does not accurately reflect their true sentiments. Understanding and manipulating features of Internet-based surveys that encourage attentiveness may decrease CR, provide cleaner datasets to support conclusions, and promote better theory development and application. The primary aim of this study is to examine how certain features of survey design can prevent CR by increasing attentiveness among respondents.
Section snippets
Psychometric problems associated with Internet-based surveying
CR occurs when a person responds to a survey item in a way that reflects inaccuracy rather than that person’s true sentiment. The person may or may not take into account the content of the survey item. Nichols, Greene, and Schmolck (1989) describe CR as manifesting itself in one of two ways. Content-responsive faking occurs when responses relate to the content of items and exhibit some level of inaccuracy. Respondents may intentionally engage in content-responsive faking or unintentionally
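In practice, CR that ignores item content often surfaces in the data as long runs of identical answers ("straight-lining"). As a minimal sketch (an illustration of this screening idea, not a procedure reported in this study), a long-string index counts the longest run of consecutive identical responses:

```python
def longest_string(responses):
    """Return the length of the longest run of identical consecutive answers.

    Unusually long runs on a multi-item scale are a common screen for
    content-nonresponsive careless responding.
    """
    if not responses:
        return 0
    best = run = 1
    for prev, cur in zip(responses, responses[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

# Illustrative (made-up) 10-item response strings on a 1-5 scale:
attentive = [4, 2, 5, 3, 4, 1, 5, 2, 3, 4]
careless = [3, 3, 3, 3, 3, 3, 3, 3, 4, 3]
print(longest_string(attentive))  # 1
print(longest_string(careless))   # 8
```

A respondent whose index approaches the scale length is a candidate for flagging, though any cutoff must be chosen relative to the instrument.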
Reasons for CR and how it might be prevented
Preventing CR requires an understanding of why this form of responding occurs. Despite many advantages to online data collection, administrators of Internet-based surveys relinquish much of the control they had when overseeing paper and pencil surveys. Researchers have posited that less direct interaction between the administrator and participant (Johnson, 2005), more environmental distractions (Carrier, Cheever, Rosen, Benitez, & Chang, 2009), multitasking (Zwarun & Hall, 2014), lower
Participants
A total of 502 participants were recruited from a pool of students enrolled in introductory psychology courses at a large Southeastern United States university. Deleting participants who omitted one or more pages of the survey or who were multivariate outliers reduced the final sample to 410. Cell sizes per experimental condition ranged from 29 to 64 people due to random assignment and deletion of missing data. Deleted cases ranged from 2%
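Multivariate-outlier screening of this kind is commonly done by comparing squared Mahalanobis distances to a chi-square critical value. The study does not report its exact criterion, so the alpha level and cutoff rule below are conventional placeholders, not the authors' choices:

```python
import numpy as np
from scipy.stats import chi2

def mahalanobis_outliers(X, alpha=0.001):
    """Flag rows of X as multivariate outliers.

    Squared Mahalanobis distances from the sample centroid are compared
    to a chi-square critical value with df = number of variables.
    The alpha level is a conventional assumption, not the study's.
    """
    X = np.asarray(X, dtype=float)
    mu = X.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(X, rowvar=False))
    diff = X - mu
    # Squared Mahalanobis distance for each row: d_i^2 = diff_i' S^-1 diff_i
    d2 = np.einsum("ij,jk,ik->i", diff, inv_cov, diff)
    cutoff = chi2.ppf(1 - alpha, df=X.shape[1])
    return d2 > cutoff
```

Flagged cases would then be dropped (or inspected) before hypothesis tests, mirroring the deletion step described above.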
Analysis of the fidelity of manipulations
Prior to testing the hypotheses of this study, we ran a series of manipulation checks to verify that participants perceived differences among the types of instructions and among the types of virtual presence. Participants indicated their levels of agreement with items that described the instructions, such as “I will receive feedback about the quality of my survey responses.” Items also checked that participants perceived the intended virtual presence, e.g., “There was an animated picture of a
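Agreement ratings on such check items are typically compared across experimental conditions with an ANOVA or t-test. A minimal illustration using a one-way ANOVA on made-up ratings (the group names and values are hypothetical, not the study's data or its reported analysis):

```python
from scipy.stats import f_oneway

# Hypothetical 1-7 agreement ratings with the check item
# "I will receive feedback about the quality of my survey responses,"
# split by instruction condition (illustrative values only).
feedback_instructions = [6, 7, 5, 6, 7, 6, 5, 7]
standard_instructions = [2, 3, 1, 2, 2, 3, 2, 1]

f_stat, p_value = f_oneway(feedback_instructions, standard_instructions)
# A significant difference indicates participants perceived the intended
# instructions, i.e., the manipulation "took."
print(f_stat, p_value)
```

An analogous check, with items about the on-screen agent, would verify the virtual-presence manipulation.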
Discussion
The increasing popularity of Internet-based surveys has brought with it the unintended consequence of widespread CR and the various psychometric problems that result from CR. In this study, we proposed and tested a new approach to CR that uses design elements of the Internet-based survey to increase attentiveness among survey respondents. As predicted, manipulating survey instructions and virtual presence can reduce some forms of CR. The forms of CR most affected by survey design manipulations
References (65)
- et al. (2011). Similarity effects in online training: Effects with computerized trainer agents. Computers in Human Behavior.
- Carrier et al. (2009). Multitasking across generations: Multitasking choices and difficulty ratings in three generations of Americans. Computers in Human Behavior.
- et al. (2002). Human neural systems for face recognition and social communication. Biological Psychiatry.
- Johnson (2005). Ascertaining the validity of individual protocols from web-based personality inventories. Journal of Research in Personality.
- Zwarun & Hall (2014). What’s going on? Age, distraction, and multitasking during online survey taking. Computers in Human Behavior.
- Acquavita (2009). Personal and organizational diversity factors’ impact on social workers’ job satisfaction: Results from a national Internet-based survey. Administration in Social Work.
- et al. Avatars.
- et al. (2001). Social facilitation from Triplett to electronic performance monitoring. Group Dynamics: Theory, Research, and Practice.
- et al. (1993). Computer monitoring of work performance: Extending the social facilitation framework to electronic presence. Journal of Applied Social Psychology.
- Anderson (2010). Assessment of qualifications needed by environmental health graduates entering private-sector employment (cover story). Journal of Environmental Health.