Archive for May 17th, 2014

Who Is behind That Screen? Solving the Puzzle of Within-Home Computer Sharing among Household Members

Abstract:

The number of US households with access to a computer at home continues to grow. According to the 2011 Computer and Internet Use report published by the US Department of Commerce, 77% of US households had a computer at home, up from 62% in 2003. Many households, however, do not have a dedicated computer for each member living in the house. As a result, computer sharing among household members is likely a prevalent feature of home computer use. Understanding this within-home sharing and identifying the person most likely to be behind the screen is of interest to market researchers and practitioners, particularly those studying effective ways to target online ads based on users' online activities. For survey researchers attempting to recruit hard-to-reach individuals such as teens and young adults, an understanding of computer sharing could help establish contact at times when those individuals are most likely to be behind the computer. Despite its prevalence, within-home computer sharing has received almost no research attention. This study aims to open this phenomenon to scientific inquiry. Read more

Is It Too Much to Ask? The Role of Question Difficulty in Survey Response Accuracy for Measures of Online Behavior

Abstract:

While the market research capabilities of online panels have never been greater, the challenges facing these panels are in many ways just as great. Over the past few years, online panels that recruit members using nonprobability (opt-in) methods have come under increased scrutiny and criticism over data quality concerns such as respondent identity and increased satisficing. These concerns have drawn attention to the heart of the issue: the accuracy, or truthfulness, of data provided by opt-in panel respondents. This issue is of utmost importance given the recently established link between opt-in panel samples and poor survey data quality (see Yeager et al. 2011). Read more

Evaluation of Alternative Weighting Approaches to Reduce Nonresponse Bias

Abstract:

With declining response rates, surveys increasingly rely on weighting adjustments to correct for potential nonresponse bias. The resulting need to improve survey weights faces two key challenges. First, additional auxiliary data are needed to augment the models used to estimate the weights; depending on the properties of these auxiliary data, nonresponse bias can be reduced, left unchanged, or even increased. The second challenge, then, is evaluating the alternative weights, since the assumption that "different estimates mean less bias" may not hold. Ideally, data should be collected from as many nonrespondents as possible to provide direct estimates of nonresponse bias. Read more
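To make the mechanics concrete, here is a minimal sketch of one common adjustment of this kind, a weighting-class (inverse response rate) adjustment, in Python. The cell variable, base weights, and data below are hypothetical illustrations; the abstract does not specify which weighting models the authors evaluate.

```python
# Illustrative weighting-class nonresponse adjustment (not the authors'
# method). Cells are defined by a hypothetical auxiliary variable known
# for the full sample, e.g. an age group taken from the sampling frame.

from collections import defaultdict

# Toy data: (base_weight, cell, responded) for each sampled case
sample = [
    (1.0, "18-34", False),
    (1.0, "18-34", True),
    (1.0, "18-34", True),
    (1.0, "35+", True),
    (1.0, "35+", True),
    (1.0, "35+", False),
]

# Sum of base weights per cell, overall and among respondents only
total_w = defaultdict(float)
resp_w = defaultdict(float)
for w, cell, responded in sample:
    total_w[cell] += w
    if responded:
        resp_w[cell] += w

# Adjustment factor per cell is the inverse of the weighted response
# rate, so respondents in low-response cells receive larger weights.
adjusted = [
    (w * total_w[cell] / resp_w[cell], cell)
    for w, cell, responded in sample
    if responded
]
print(adjusted)  # [(1.5, '18-34'), (1.5, '18-34'), (1.5, '35+'), (1.5, '35+')]
```

The abstract's point that auxiliary data can reduce, leave unchanged, or even increase bias maps onto this sketch directly: the adjustment removes bias only to the extent that, within cells, response propensity is unrelated to the survey outcomes. A cell variable unrelated to the outcomes leaves estimates essentially unchanged, while a poorly chosen one can add variance or bias, which is why evaluating alternative weights is the harder second challenge.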