Is It Too Much to Ask? The Role of Question Difficulty in Survey Response Accuracy for Measures of Online Behavior

Abstract:

While the market research capabilities of online panels have never been greater, the challenges facing these panels are in many ways just as great. Over the past few years, online panels that recruit members through nonprobability/opt-in methods have come under increased scrutiny and criticism over data quality concerns such as respondent identity and increased satisficing. These concerns draw attention to the heart of the issue: the accuracy, or truthfulness, of data provided by opt-in panel respondents. This issue is of utmost importance given the recently established link between opt-in panel samples and poor survey data quality (see Yeager et al. 2011).

In this study, we tackle this issue head-on by assessing the accuracy of multiple survey measures of online behavior through comparisons with behavioral data collected via respondents’ membership in a metered panel. The survey measures were questions about visits to various websites, including a popular social networking website. While research assessing the accuracy of self-reports has a long history (see McMorris & Ambrosino 1973; Miller & Groves 1985) and has generally involved comparing self-reports against administrative/official records, this study is unique in that it assessed accuracy by comparing self-reports against actual behavior. The behavioral data consist of respondents’ online activity captured passively through a software meter installed on their primary household computer. Results revealed an interesting relationship between the difficulty of a survey question and the accuracy of the response: accuracy was higher for simple than for complex questions about online behavior, and this held for most websites irrespective of their popularity. Furthermore, the propensity to provide an accurate response, after controlling for question complexity, showed interesting demographic effects of age and cognitive ability. Arguing against the prevailing wisdom that opt-in surveys are less accurate, these findings affirm the notion that surveys, even opt-in ones, can provide accurate and valid data when done right, i.e., with reasonable task expectations. The findings have practical implications for researchers using opt-in survey data for methodological research. The study concludes with a call for further research on response accuracy in opt-in surveys and recommendations for future work.

Recommended Citation:

Rao, K., Zhang, M., & Luo, T. (2014). Is It Too Much to Ask? The Role of Question Difficulty in Survey Response Accuracy for Measures of Online Behavior. Paper presented at the American Association for Public Opinion Research, Anaheim, CA.

Attached Documents:

  • AAPOR 2014 Program (see page 159 for the mention)
  • For a copy of this presentation, please leave a comment with your email address in the box below.