How Face-to-Face Interviews and Cognitive Skill Affect
Item Non-Response: A Randomized Experiment Assigning
Mode of Interview*
ANDREW GOOCH AND LYNN VAVRECK
Technology and the decreased cost of survey research have made it possible for researchers to
collect data using new and varied modes of interview. These data are often analyzed
as if they were generated using similar processes, but the modes of interview may produce dif-
ferences in response simply due to the presence or absence of an interviewer. In this paper, we
explore the differences in item non-response that result from different modes of interview and find
that mode makes a difference. The data are from an experiment in which we randomly assigned an
adult population to an in-person or self-completed survey after subjects agreed to participate in a
short poll. For nearly every topic and format of question, we find less item non-response in the
self-complete mode. Furthermore, we find that the difference in non-response across modes is exacerbated
for respondents with low levels of cognitive ability. Moving from high to low levels of cogni-
tive ability, an otherwise average respondent can be up to six times more likely to say “don’t know”
in a face-to-face interview than in a self-completed survey, depending on the type of question.
Online, self-completed surveys are growing in popularity. The reduced cost of collecting
data via online surveys has opened up data collection for hundreds of researchers who
otherwise would not have been able to collect their own data. While the use of
self-completed surveys has grown over the last decade, few studies isolate the pure mode
differences between in-person interviewing—as happens on the phone or in a face-to-face
interview—and self-complete surveying. What are the differences between a respondent’s
answer to the same question when it is posed by another person compared with when it is read
on a computer screen? Several analyses of internet surveys have addressed this question,
but comparative studies routinely conflate sampling method with mode of
interview, making it impossible to discern what is driving the differences in the data. As a result,
our ability to make inferences about the effects on survey responses due to mode alone is
limited. A clearer path to causal identification comes from randomly assigning the mode of
interview after a sample has been drawn and a respondent’s participation has been guaranteed.
In this paper, we attempt to go beyond observational comparisons of survey data collected via
different modes by recruiting people to take a survey and then randomly assigning them to a
mode of interview after they agree to complete the survey. In this way, we isolate the effects of
mode alone and learn about the potential effects of switching between these modes of interview.
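To make this identification logic concrete, the following sketch simulates the design in Python. The sample size, response probabilities, and variable names are invented for illustration only and are not estimates from our experiment; the point is simply that once mode is randomized after recruitment, the raw difference in non-response rates between the two arms reflects the effect of mode alone.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration of the design: recruit respondents first,
# then randomly assign each one to a mode of interview.
n = 1_000                                  # recruited respondents
mode = rng.integers(0, 2, size=n)          # 0 = face-to-face, 1 = self-complete

# Suppose (purely for illustration) that "don't know" responses are more
# likely when an interviewer is present.
p_dont_know = np.where(mode == 0, 0.12, 0.04)
dont_know = rng.random(n) < p_dont_know

# Because mode is assigned at random after recruitment, the simple
# difference in non-response rates estimates the mode effect.
rate_f2f = dont_know[mode == 0].mean()
rate_self = dont_know[mode == 1].mean()
print(f"face-to-face: {rate_f2f:.3f}, self-complete: {rate_self:.3f}, "
      f"difference: {rate_f2f - rate_self:.3f}")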
* Andrew Gooch, Postdoctoral Fellow, Institution for Social and Policy Studies and the Center for the Study of
American Politics, 77 Prospect Street, New Haven, CT 06511 (andrew.gooch@yale.edu). Lynn Vavreck, Pro-
fessor of Political Science and Communication Studies, University of California, Los Angeles, 4289 Bunche Hall
Los Angeles, CA 90095 (lvavreck@ucla.edu). This research is supported by a grant from the National Science
Foundation (SES-1023940). The authors thank Brian Law for managing the project at the MGM Grand and
Felipe Nunes, Sylvia Friedel, Gilda Rodriguez, Adria Tinnin, and Chris Tausanovitch for their participation in
Las Vegas. Doug Rivers and Jeff Lewis provided programming support; John Aldrich, Larry Bartels, Alan
Gerber, Gary Jacobson, Simon Jackman, Vince Hutchings, Gary Segura, John Zaller, and Brian Humes helped
with the design of the experiment. Finally, the authors are grateful to Mike Thies who provided valuable
feedback on drafts of the paper. To view supplementary material for this article, please visit http://dx.doi.org/
10.1017/psrm.2016.20