Is It Too Much to Ask? The Role of Question Difficulty in Survey Response Accuracy for Measures of Online Behavior

Abstract:

While the market research capabilities of online panels have never been greater, the challenges facing these panels are in many ways just as great. Over the past few years, online panels that recruit members through nonprobability (opt-in) methods have come under increased scrutiny and criticism over data quality concerns such as respondent identity and increased satisficing. These concerns draw attention to the heart of the issue: the accuracy, or truthfulness, of the data provided by opt-in panel respondents. This issue is of utmost importance given the recently established link between opt-in panel samples and poor survey data quality (see Yeager et al. 2011).

Evaluation of Alternative Weighting Approaches to Reduce Nonresponse Bias

Abstract:

With declining response rates, surveys increasingly rely on weighting adjustments to correct for potential nonresponse bias. The resulting need to improve survey weights faces two key challenges. First, additional auxiliary data are needed to augment the models used to estimate the weights. Depending on the properties of these auxiliary data, nonresponse bias can be reduced, left the same, or even increased. The second challenge, then, is evaluating the alternative weights when the assumption that "different estimates mean less bias" may not hold. Ideally, data need to be collected from as many nonrespondents as possible to provide direct estimates of nonresponse bias.
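As a toy illustration of the kind of weighting adjustment at issue (a minimal sketch of a classic weighting-class adjustment, not the method evaluated in this paper, with hypothetical data), respondents within each cell defined by the auxiliary data are up-weighted to stand in for the nonrespondents who share their auxiliary profile:

```python
from collections import Counter

# Hypothetical sample: (auxiliary class, responded?) for each sampled case.
# Weighting-class adjustment: respondents in each class receive the weight
# (number sampled in class) / (number responding in class).
sample = [
    ("urban", True), ("urban", False), ("urban", True), ("urban", True),
    ("rural", True), ("rural", False), ("rural", False), ("rural", True),
]

sampled = Counter(cls for cls, _ in sample)
responded = Counter(cls for cls, did_respond in sample if did_respond)
weights = {cls: sampled[cls] / responded[cls] for cls in sampled}

print(weights)  # rural respondents (2 of 4 responding) get weight 2.0
```

Whether such an adjustment actually reduces bias depends on how strongly the auxiliary classes relate to both response propensity and the survey outcome, which is exactly the evaluation problem the abstract raises.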

Difficult Data: Comparing the Quality of Behavioral, Recall, and Proxy Data Across Survey Modes

Abstract:

The mode choice literature is rife with evidence on the impact of different survey modes on response rates, respondent cooperation, and data quality. However, insufficient attention has been paid to the quality of “difficult data” provided when respondents cannot choose the mode and thus cannot maximize their comfort with the survey. Here, “difficult data” correspond to questions that are burdensome for respondents to think about – e.g., very specific details on a behavior, on past events, or on the behavior of other persons.

The Use of GIS Information as Auxiliary Data for Nonresponse Bias Analysis

Abstract:

The Nielsen Company, like most other market research firms, is concerned with falling response rates and the threat of nonresponse bias. The challenge in understanding and adjusting for nonresponse is always the lack of information about the nonresponding cases. In recent years, researchers have considered using paradata, interviewer observations, and aggregate Census information to learn about nonrespondents. This study introduces and evaluates a new type of GIS-based data, called POI (Points-of-Interest), as a potential auxiliary data source for nonresponse bias analysis.

Should the Third Reminder be Sent? The Role of Survey Response Timing on Web Survey Results

Abstract:

Web surveys are being increasingly incorporated into national survey data collection programs in the United States because of their cost and time efficiencies. Yet, response rates and data quality in web surveys remain important challenges. As a basic study designed to better understand data quality in a mixed-mode national survey, this article investigates the degree to which web versus mail survey modes affect unit and item responses. Findings indicate that the web survey mode produces a lower unit response rate compared to the mail mode. However, the web mode elicits higher data quality in terms of item responses to both closed- and open-ended questions.

Humanizing the Internet Cookie? Key Learnings from an Online Panel

Abstract:

Online advertising is booming. Companies are increasingly relying on Internet cookies to target online ads to consumers and to track ad exposure and website click-through rates. Cookies are also used for survey targeting of specific Internet user groups, for measuring ad effectiveness, and for political polling. Companies use the cookies they have dropped as proxies for the consumers they are ultimately trying to reach. However, this approach of humanizing cookies suffers from several measurement errors, due to cookie deletion, cookie blocking, multiple people in a household using the same browser, or one person using multiple browsers.
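A toy example (hypothetical browsers and users, not Nielsen data) makes the cookie-to-person mismatch concrete:

```python
# Each browser instance carries one cookie; people may use several
# browsers, and a browser may be shared. Hypothetical household:
browser_users = {
    "laptop_chrome":  ["alice"],          # alice's primary browser
    "laptop_firefox": ["alice"],          # same person, second browser
    "phone_safari":   ["bob"],
    "family_tablet":  ["bob", "carol"],   # shared browser, one cookie
}

cookies = len(browser_users)                                          # 4
people = len({p for users in browser_users.values() for p in users})  # 3

print(cookies, people)  # 4 cookies stand in for only 3 people
```

Cookie deletion and blocking push the counts in the other direction, so cookie tallies can both overstate and understate the human audience.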

Measuring Audience Duplication in a Cross-Media Ad Campaign Involving TV and Internet

Abstract:

The Nielsen Company (Nielsen) has extended its Campaign Ratings solution to include the measurement of cross-platform campaigns. By leveraging its industry-standard Nielsen People Meter panel and its innovative Online Campaign Ratings product, Nielsen can now quantify the unduplicated reach, frequency, and gross rating points (GRPs) for ads running on Television and Internet.
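The underlying arithmetic can be sketched with toy panel data (hypothetical panelist IDs and exposure counts, not Nielsen's estimator): unduplicated reach is the union of the TV and online audiences, and GRPs decompose as reach times frequency.

```python
# Hypothetical panelist IDs exposed to the campaign on each platform.
tv_audience = {1, 2, 3, 4, 5}
online_audience = {4, 5, 6, 7}   # panelists 4 and 5 reached on both platforms
panel_size = 10

# Total ad exposures per reached panelist (toy counts).
exposures = {1: 1, 2: 1, 3: 2, 4: 3, 5: 2, 6: 1, 7: 1}

reached = tv_audience | online_audience       # set union removes duplication
reach_pct = 100 * len(reached) / panel_size   # 70.0
impressions = sum(exposures.values())         # 11
frequency = impressions / len(reached)        # average exposures per reached panelist
grps = 100 * impressions / panel_size         # 110.0, and GRPs == reach% * frequency

print(reach_pct, frequency, grps)
```

Note that without deduplication, naively summing the platform-level reaches (5 + 4 = 9 panelists) would overstate the audience by the two panelists reached on both platforms.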

Recruitment and Retention in Multi-Mode Survey Panels

Abstract:

This study builds on a previously published panel recruitment experiment (Rao, Kaminska, and McCutcheon 2010), extending that analysis to an examination of the effectiveness of pre-recruitment factors such as mode and response inducements on three post-recruitment panel participation effects: attrition rates, survey completion rates, and panel data quality. The panel recruitment experiment, conducted with the Gallup Panel, netted 1,282 households with 2,042 panel members. For these recruited members, we collected data on panel participation and retention and used these data for the analyses in this study.

Is Past, the Future? Resampling Past Respondents to Improve Current Response Rates

Abstract:

The Nielsen TV Ratings Diary service uses a one-week TV diary survey to measure TV ratings. While the service is long established, it recently underwent a sampling redesign to address the diminishing coverage associated with landline random-digit dialing (RDD) surveys. Address-based sampling (ABS) replaced RDD as the sampling methodology for the diary service.

Home or Work or Both? Assessing the Role of Duplication of Website Visitations Using an Online Metered Panel

Abstract:

In this study, we estimate, for multiple websites, the duplicated audience reach between home and work Internet access locations. By employing a probability-based matching technique, we use metered-panel data from non-overlapping home and work panels to create a virtual overlapping home-work panel.
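As a minimal sketch of the matching idea (toy records, a single matching variable, and simple nearest-neighbor matching, not the study's actual probability-based procedure), each home panelist can be paired with the most similar work panelist, and site duplication read off the fused records:

```python
# Hypothetical non-overlapping panels; "visited" marks visiting a given site.
home = [{"age": 25, "visited": True},
        {"age": 40, "visited": False},
        {"age": 33, "visited": True}]
work = [{"age": 24, "visited": True},
        {"age": 41, "visited": True},
        {"age": 30, "visited": False}]

def match(h):
    # Nearest-neighbor match on age; a real match would use a richer
    # set of demographics and probability weights.
    return min(work, key=lambda w: abs(w["age"] - h["age"]))

pairs = [(h, match(h)) for h in home]
duplicated = sum(h["visited"] and w["visited"] for h, w in pairs)
print(duplicated / len(pairs))  # share of fused records visiting from both locations
```

The fused pairs behave like a virtual overlapping panel: the home-work duplicated audience for a site is estimated as the fraction of pairs in which both the home and the matched work record visited it.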