Do millennial undergraduates' views of writing differ when surveyed online versus on paper?

Ayshegul B. Musallam a, Diane L. Schallert b, Hyunjin Kim b

a Foreign Language Education, University of Texas at Austin, Austin, TX 78712, United States
b Department of Educational Psychology, University of Texas at Austin, Austin, TX 78712, United States

Article history: Available online 24 May 2011

Keywords: Survey formats; Online survey; Electronic medium; Survey delivery; Writing experiences of millennial undergraduates

Abstract

The purpose of this study was to test whether different survey delivery conditions made a difference in assessing college students' practices, affect, and conceptions of academic and nonacademic writing. The delivery conditions represented combinations of three underlying factors: survey format (online versus paper-and-pencil), location (classroom, lab, or home), and supervision (proctored or not). Participants (N = 268) were randomly assigned to one of three conditions: (1) a paper version of the survey administered in classrooms at scheduled proctored sessions; (2) an online version administered in scheduled proctored sessions in a computer lab; (3) an online version completed at a location and time of the students' choice. The survey comprised 103 closed-ended and three open-response questions. Results showed different participation rates across conditions and more variability in time spent for the "home online" group. However, there were few differences by condition in responses to substantive questions regarding the students' practices, conceptions, and affective responses associated with writing. The only place where responses differed by condition was the optional open-ended evaluation of the survey itself.

© 2011 Elsevier Ltd. All rights reserved.

1. Introduction

When conducting survey research, with its goal of gathering the perceptions, perspectives, and opinions of individuals, researchers increasingly face the choice of using online sites as a way of administering their surveys. Yet there is always a concern that changing the way a survey is delivered might substantially change the results obtained. In this study, we compared methods of survey delivery as we gathered college students' perceptions about their practices and affective responses when engaged in in-school and out-of-school writing. Although previous investigations comparing web-based with traditional methods of survey delivery have typically concluded that there are either no or only small differences in the data obtained under these different survey conditions, we found few studies of delivery effects that had dealt with educational issues (except those associated with course instructor surveys or school satisfaction) or that had used random assignment of participants to conditions. With these concerns in mind, we designed a study to test whether delivery conditions would affect participation and attrition, the amount of time spent on the survey, and the quality of substantive responses to the survey questions. Particularly because our survey was long, included open-ended questions, and dealt with writing, an educational activity both liked and disliked by college students but increasingly important in all settings (Brandt, 2001; Schallert & Wade, 2005; Selfe & Hawisher, 2004), the method of obtaining data might make a difference.

In their theoretical work and empirical tests of hypotheses relevant to education, researchers today are increasingly making use of online delivery methods to administer surveys.
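The random assignment to delivery conditions described above can be sketched in a few lines of Python. This is an illustrative sketch only, not the authors' actual procedure; the function name and condition labels are hypothetical. It builds a balanced list of condition labels, shuffles it, and pairs each participant with a label, so that group sizes differ by at most one.

```python
import random

def assign_conditions(n_participants, conditions, seed=None):
    """Randomly assign participants to conditions in (nearly) equal numbers.

    Repeats the condition labels to cover all participants, shuffles the
    resulting list, and maps each participant index to a label.
    """
    rng = random.Random(seed)
    labels = [conditions[i % len(conditions)] for i in range(n_participants)]
    rng.shuffle(labels)
    return {pid: label for pid, label in enumerate(labels)}

# Hypothetical labels for the three delivery conditions in the study design:
# paper survey in a proctored classroom, online survey in a proctored lab,
# and online survey at a time and place of the student's choosing.
CONDITIONS = ["paper_classroom", "online_lab", "online_home"]
assignment = assign_conditions(268, CONDITIONS, seed=42)
```

With 268 participants and three conditions, the groups come out as 90, 89, and 89; seeding the generator makes the assignment reproducible for auditing.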
Previous tests of such methods of delivery have generally reported that online, web-based administrations are comparable to more traditional techniques that require meeting respondents in person and having paper versions of surveys filled out in supervised sessions (Cronk & West, 2002; Gosling, Vazire, Srivastava, & John, 2004; Martins, 2010). Nevertheless, researchers continue to worry about the reliability and validity of data collected online via the web, owing to increased variability, participant dropout and attrition, disinhibition of participants, low motivation to provide conscientious responses, lack of a controlled environment, or worries about the authenticity of the identity of the respondent (e.g., Cronk & West, 2002; Fouladi, McCarthy, & Moller, 2002; Martins, 2010; Yang, Levine, Xu, & Lopez Rivas, 2009). One issue that concerns researchers is that students filling out a survey at their own leisure, choosing the time and location of their preference, may find themselves easily distracted from the task of reading the survey items carefully by other demands on their

Computers in Human Behavior 27 (2011) 1915–1921. doi:10.1016/j.chb.2011.04.016