Assessing Writing 33 (2017) 12–24
Assessing peer and instructor response to writing: A corpus analysis from an expert survey

Ian G. Anson a,*, Chris M. Anson b

a Department of Political Science, University of Maryland, Baltimore County, 1000 Hilltop Cir., 305 PUP, Baltimore, MD 21250, United States
b North Carolina State University, Box 8105, Raleigh, NC 27695, United States
Article info
Article history:
Received 22 September 2016
Received in revised form 7 February 2017
Accepted 1 March 2017
Keywords:
Peer response
Peer review
Teacher response
Corpus analysis
Abstract
Over the past 30 years, considerable scholarship has critically examined the nature of instructor response on written assignments in the context of higher education (see Straub, 2006). However, as Haswell (2008) has noted, less is currently known about the nature of peer response, especially as it compares with instructor response. In this study, we critically examine some of the properties of instructor and peer response to student writing. Using the results of an expert survey that provided a lexically based index of high-quality response, we evaluate a corpus of nearly 50,000 peer responses produced at a four-year public university. Combined with the results of this survey, a large-scale automated content analysis shows first that instructors have adopted some of the field's lexical estimation of high-quality response, and second that student peer response reflects the early acquisition of this lexical estimation, although at a further remove from their instructors. The results suggest promising directions for the parallel improvement of both instructor and peer response.
© 2017 Elsevier Inc. All rights reserved.
When responding to written work, do teachers use preferred practices? Do their students learn and model those practices? Scholars from a variety of disciplines have investigated the quality and content of instructor response to writing, often concluding that instructors focus their responses on superficial or "lower-order" concerns such as grammar, spelling, and wording at the expense of more complex rhetorical, structural, and meaning-based considerations (e.g., Connors & Lunsford, 1988). Some recent work has offered more reason for optimism, arguing that instructor response may be undergoing a "generational shift" toward higher-order considerations by virtue of scholarship in writing studies and writing-across-the-curriculum initiatives (Dixon & Moxley, 2013). By higher-order, we refer to feedback that assesses broader, conceptual-level features of writing, such as "the development of ideas, organization, and the overall focus" of a text (Keh, 1990, p. 296; see also Nystrand, 1984). However, these developments have yet to influence practice across a wide variety of higher education contexts (e.g., Bailey & Garner, 2010).
In this study, we extend what is known about these questions by examining the properties of teacher and peer response on drafts of writing assignments. As a strategy for improving metacognition, revision and editing, awareness of audience, and other crucial skills, peer response represents an increasingly popular method used by writing instructors (e.g., Berg, 1999; DiPardo & Freedman, 1988; Lundstrom & Baker, 2009). We first seek to refine the prevailing understanding of response by establishing a lexicon of response terms from a national survey of experienced writing instructors and scholars. Respondents in this study value response that communicates principles of audience and purpose, argumentation, clarity and cohesion,
* Corresponding author.
E-mail addresses: iganson@umbc.edu (I.G. Anson), chris anson@ncsu.edu (C.M. Anson).
http://dx.doi.org/10.1016/j.asw.2017.03.001