A conceptually loose use of the variable “search
experience” makes it difficult for researchers to perform
meaningful cross-study comparisons. The purpose of this
study was to examine how search experience is defined
and measured when used as a research variable. We
conducted a qualitative analysis of 32 library and information
science (LIS) research articles and found inconsistent
terminology and measurements.
Specifically, there were 21 unique labels to describe
the search experience and 18 different measurements. The
majority of the studies used the generic label “search
experience” and relied on the reader to infer, from the
description of the overall research design, the specific
electronic information retrieval environment to which
the variable applied. In addition, there was a strong
preference for measures based on subjective self-reports
of the level of exposure to some information retrieval
system. It is evident that articles need to contain detailed
definitions of search experience variables for readers to
truly understand the findings.
Introduction
The information users’ facility with electronic information
retrieval (IR) systems has been widely recognized as an
important element in the research on information systems
usage. While many researchers have accounted for search
experience in their research designs, there appears to be
little consistency in operational definitions and
measurements. In their comprehensive categorization of research
variables applied in user studies and IR research, Meadow,
Marchionini, and Cherry (1994), and Yuan and Meadow (1999)
reinforced the importance of the consistent use of variables
as a prerequisite for better communication among the
researchers in the field. A conceptually loose use
of the variable “search experience” makes it difficult for
researchers to perform meaningful cross-study comparisons,
prevents them from building on the outcomes of
previous studies, and ultimately contributes to conflicting
findings about the impact of a user’s search experience on
the processes and outcomes of IR systems use.
As an initial step to facilitate future use of search experi-
ence as a research variable, we conducted a comparative
review of how this variable has been measured in a sample of
32 studies published in the key library and information sci-
ence (LIS) research publications. In this article, we use the
concept “search experience” as a label that broadly refers to
any skill an individual possesses for using some electronic IR
system. This concept consists of two key elements:
• “Search” refers to techniques one applies when using an
electronic IR system to find records of interest.
• “Experience” means the accumulation of knowledge and
skills needed to use an electronic IR system.
We do not imply that “search experience” is the best possible
label, only that it is broad enough to communicate
effectively for the purposes of this article.
Background
The rapid development of IR technologies has provided
people with abundant opportunities to search for information
with a variety of IR tools, and has therefore led to
various types of personal experience with IR systems.
Before the mid-1990s, researchers focused on users’
experiences with command-driven online bibliographic
databases (e.g., ERIC, MEDLINE, Dialog) and online
public-access catalogs (e.g., Fenichel, 1979; Hsieh-Yee,
1993; Marchionini, Dwiggins, Katz, & Lin, 1993; Penniman,
1981).
1981). This period also includes studies of some early, stand-
alone hypertext systems (e.g., Marchionini, Lin, & Dwiggins,
1990). With the proliferation of the Internet and the Web,
Web search engines gained popularity among users, which in
turn ignited research interest in users’ experiences with
Web-based IR systems (Lazonder, Biemans, & Wopereis,
2000; Navarro-Prieto, Scaife, & Rogers, 1999; Palmquist &
Kim, 2000).
A review of research studies that focused on the impact of
user experience on IR systems usage reveals many mixed
findings. For example, early work by
The Search Experience Variable in Information Behavior Research
Joi L. Moore, Sanda Erdelez, and Wu He
School of Information Science and Learning Technologies, University of Missouri-Columbia, 303 Townsend Hall, Columbia, MO 65211. E-mail: moorejoi@missouri.edu
Journal of the American Society for Information Science and Technology, 58(10):1529–1546, 2007
Received August 15, 2005; revised July 24, 2006; accepted December 5, 2006.
© 2007 Wiley Periodicals, Inc. Published online 18 June 2007 in Wiley InterScience (www.interscience.wiley.com). DOI: 10.1002/asi.20635