
Computers in Human Behavior

Volume 48, July 2015, Pages 554-568

Using virtual presence and survey instructions to minimize careless responding on Internet-based surveys

https://doi.org/10.1016/j.chb.2015.01.070

Highlights

  • Instructions and virtual presence were used to prevent careless responding in online surveys.

  • MANOVAs showed warning instructions reduced some forms of careless responding.

  • Virtual presence enhances the utility of survey instructions in preventing careless responding.

  • Feedback instructions hold promise as a non-aversive way to prevent careless responding.

Abstract

Internet-based survey data inform knowledge creation in research and justify work activities in organizations. While there are advantages to online surveys, this mode of administration comes with its own set of challenges. Survey respondents may engage in careless responding (i.e., insufficient effort responding or satisficing) by intentionally or unintentionally answering in a manner that does not accurately reflect their true sentiments. Careless responding can create psychometric problems even after careless respondents (e.g., mischievous responders) have been correctly identified and removed. This study aimed to improve survey methodology by preventing careless responding. Using a 3 × 3 between-subjects experimental design, we manipulated both virtual presence (none, animated shape, and virtual human) and type of instructions (anonymous, warning, and feedback), with indicators of careless responding as the dependent variables. Results showed that, beyond characteristics of survey items, survey design elements can prevent careless responding. The effects of the interventions differed by type of careless responding: instructions and the interaction of instructions and virtual presence significantly reduced careless responding, but virtual presence alone did not. Virtual human presence increased the salience of instructions. Although effective, instructions warning of punitive consequences may make it harder to recruit participants. Future research should continue investigating non-aversive ways to prevent careless responding on Internet-based surveys.

Introduction

Advances in technology have spurred the extensive use of Internet-based surveys. Data from Internet-based surveys support knowledge creation in research and inform applied work in many organizations (e.g., Acquavita, 2009, Anderson, 2010, Berta, 2006, Marx, 2012, Mervis, 2007, Patrick, 2012). While there are decided advantages to Internet-based surveys, this mode of administration comes with its own set of challenges. For example, respondents may ignore important parts of the survey, submit responses multiple times, and often exhibit careless responding (CR; Barak and English, 2002, Barge and Gehlbach, 2012, Berry et al., 1992, Curran et al., 2010, Hardré et al., 2012, Johnson, 2005, Meade and Craig, 2012, Robinson-Cimpian, 2014). That is, either intentionally or unintentionally, respondents may answer survey items in a manner that does not accurately reflect their true sentiments. Understanding and manipulating features of Internet-based surveys that encourage attentiveness may decrease CR, provide cleaner datasets to support conclusions, and promote better theory development and application. The primary aim of this study is to examine how certain features of survey design can prevent CR by increasing respondents' attentiveness.

Section snippets

Psychometric problems associated with Internet-based surveying

CR occurs when a person responds to a survey item in a way that reflects inaccuracy rather than that person’s true sentiment. The person may or may not take into account the content of the survey item. Nichols, Greene, and Schmolck (1989) describe CR as manifesting itself in one of two ways. Content-responsive faking occurs when responses relate to the content of items and exhibit some level of inaccuracy. Respondents may intentionally engage in content-responsive faking or unintentionally...
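The snippet does not show which CR indices the study used. As a hedged illustration only, the sketch below computes one widely used indicator, the longstring index (the longest run of identical consecutive answers; cf. Johnson, 2005, Meade and Craig, 2012); the function name and example data are hypothetical, not the authors' materials.

    def longstring(responses):
        # Longest run of identical consecutive answers in one
        # respondent's row of Likert-type responses; unusually long
        # runs suggest straight-lining, a common form of CR.
        longest = current = 1
        for prev, cur in zip(responses, responses[1:]):
            current = current + 1 if cur == prev else 1
            longest = max(longest, current)
        return longest

    # Example: a respondent who straight-lined the last six items.
    print(longstring([3, 4, 4, 2, 5, 5, 5, 5, 5, 5]))  # -> 6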

Reasons for CR and how it might be prevented

Preventing CR requires an understanding of why this form of responding occurs. Despite many advantages to online data collection, administrators of Internet-based surveys relinquish much of the control they had when overseeing paper and pencil surveys. Researchers have posited that less direct interaction between the administrator and participant (Johnson, 2005), more environmental distractions (Carrier, Cheever, Rosen, Benitez, & Chang, 2009), multitasking (Zwarun & Hall, 2014), lower...

Participants

A total of 502 participants were recruited from a pool of students enrolled in introductory psychology courses at a large Southeastern United States university. Deleting participants who omitted one or more pages of the survey or who were multivariate outliers reduced the final sample to 410. Cell sizes per experimental condition ranged from 29 to 64 people due to random assignment and the deletion of missing data. Deleted cases ranged from 2%...
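The snippet does not state the outlier criterion. A common convention, assumed here rather than taken from the paper, flags cases whose squared Mahalanobis distance from the sample centroid exceeds a chi-square cutoff; the function name, data layout, and alpha level are assumptions for illustration.

    import numpy as np
    from scipy.stats import chi2

    def flag_multivariate_outliers(X, alpha=0.001):
        # X: n-by-p matrix of scale scores (rows = respondents).
        # Returns a boolean mask of rows whose squared Mahalanobis
        # distance exceeds the chi-square cutoff at the given alpha.
        X = np.asarray(X, dtype=float)
        centered = X - X.mean(axis=0)
        inv_cov = np.linalg.inv(np.cov(X, rowvar=False))
        d2 = np.einsum('ij,jk,ik->i', centered, inv_cov, centered)
        return d2 > chi2.ppf(1 - alpha, df=X.shape[1])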

Analysis of the fidelity of manipulations

Prior to testing the hypotheses of this study, we ran a series of manipulation checks to verify that participants perceived differences among the types of instructions and among the types of virtual presence. Participants indicated their levels of agreement with items that described the instructions, such as, “I will receive feedback about the quality of my survey responses.” Items also checked that participants perceived the intended virtual presence, e.g., “There was an animated picture of a...
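The highlights mention MANOVAs; a minimal sketch of how such a manipulation check might be run with statsmodels is below. The file name, column names, and factor labels are hypothetical and stand in for the authors' actual variables.

    import pandas as pd
    from statsmodels.multivariate.manova import MANOVA

    # Hypothetical layout: two manipulation-check ratings plus the
    # two experimental factors (instructions, presence) per row.
    df = pd.read_csv('survey_data.csv')
    fit = MANOVA.from_formula(
        'check_instructions + check_presence'
        ' ~ C(instructions) * C(presence)',
        data=df)
    print(fit.mv_test())  # multivariate tests (e.g., Wilks' lambda) per effect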

Discussion

The increasing popularity of Internet-based surveys has brought with it the unintended consequence of widespread CR and the various psychometric problems that result. In this study, we proposed and tested a new approach to CR that uses design elements of the Internet-based survey to increase attentiveness among survey respondents. As predicted, manipulating survey instructions and virtual presence can reduce some forms of CR. The forms of CR most affected by survey design manipulations...

References (65)

  • Averager [Computer software]. University of Aberdeen, Aberdeen, UK: Face Research...
  • Barak, A., et al. (2002). Prospects and limitations of psychological testing on the Internet. Journal of Technology in Human Services.
  • Barge, S., et al. (2012). Using the theory of satisficing to evaluate the quality of survey data. Research in Higher Education.
  • Behrend, T., et al. (2012). Using animated agents in learner-controlled training: The effects of design control. The International Journal of Training and Development.
  • Berry, D. T. R., et al. (1992). MMPI-2 random responding indices: Validation using a self-report methodology. Psychological Assessment.
  • Berta, D. (2006, November 13). Caribou Coffee Co. refines hiring process with online applicant-screening program....
  • Colarelli, S. M., et al. (1987). Comparative effects of personal and situational influences on job outcomes of new professionals. Journal of Applied Psychology.
  • Crowne, D. P., et al. (1960). A new scale of social desirability independent of psychopathology. Journal of Consulting Psychology.
  • Curran, P. G., Kotrba, L., & Denison, D. (2010). Careless responding in surveys: Applying traditional techniques to organizational...
  • De Dreu, C. W., et al. (2009). Self-interest and other-orientation in organizational behavior: Implications for job performance, prosocial behavior, and personal initiative. Journal of Applied Psychology.
  • Diefendorff, J. M., et al. Motivating employees.
  • Ehlers, C., Greene-Shortridge, T. M., Weekley, J. A., & Zaiack, M. D. (2009). The exploration of statistical methods in...
  • Feinberg, J. M., et al. (2006). Social facilitation: A test of competing theories. Journal of Applied Social Psychology.
  • Feinberg, J. M., et al. (2010). The effect of challenge and threat appraisals under evaluative presence. Journal of Applied Social Psychology.
  • Field, A., et al. (2010). Discovering statistics using SAS.
  • Gehlbach, H., et al. (2012). Anchoring and adjusting in questionnaire responses. Basic and Applied Social Psychology.
  • Goldberg, L. R. (1999). A broad-bandwidth, public domain, personality inventory measuring the lower-level facets of several five-factor models. Personality Psychology in Europe.
  • Gratch, J., Wang, N., Okhmatovskaia, A., Lamothe, F., Morales, M., van der Werf, R. J., et al. (2007). Can virtual...
  • Griffith, T. L. (1993). Monitoring and performance: A comparison of computer and supervisor monitoring. Journal of Applied Social Psychology.
  • Hardré, P. L., et al. (2012). Examining contexts-of-use for web-based and paper-based questionnaires. Educational and Psychological Measurement.
  • Huang, J., et al. (2012). Detecting and deterring insufficient effort responding to surveys. Journal of Business and Psychology.
  • Jackson, D. N. (1977). Jackson Vocational Interest Survey manual.