This article has been accepted for inclusion in a future issue of this journal. Content is final as presented, with the exception of pagination.
IEEE TRANSACTIONS ON ENGINEERING MANAGEMENT
Incentive Alignment and Risk Perception:
An Information Security Application
Fariborz Farahmand, Senior Member, IEEE, Mikhail (Mike) J. Atallah, Fellow, IEEE,
and Eugene H. Spafford, Fellow, IEEE
Abstract—Technologies and procedures for effectively securing the enterprise in cyberspace exist, but are largely underdeployed. Reasons for this shortcoming include the neglect of the role of stakeholder perceptions in organizational reward systems, and misaligned incentives for effective allocation of resources. We present a methodology for practitioners to employ, with examples, for identifying perverse incentives—situations where the interests of a manager or employee are not aligned with those of the organization—and for estimating the damage caused by incentive misalignment. We present our revision to the risk perception model developed by Fischhoff and Slovic. We also present the results of our interviews with 42 information security executives across the U.S. about the role of risk perception and incentives in information security decisions. We discuss how to identify and correct misalignments, how to develop efficient incentive structures, and how to make information security a property of the organizational environment by incorporating perceptual principles and security governance. This research contributes to the practice and theory of information security, and has several implications for practitioners and researchers in the alignment of incentives and the symmetrization of information across organizations.

Index Terms—Alignment, decision-making, incentives, information security, perceptions, risk.
The internal incentives that shape how the group perceives risks and rewards may be very different from the reality of the risks and rewards in the external marketplace. Those incentives can distort risk perception.

Daniel Kahneman [35].
I. INTRODUCTION
Incentives are as important as technical design in achieving dependability [2]. Most research being carried out today in information security focuses on developing new technologies, yet much of the currently existing technology is not being utilized because of problems that relate to risk perceptions and misaligned incentives.
Whereas technologically sophisticated analysts employ risk assessment to evaluate hazards, the majority of experts and the general public rely on intuitive risk judgments, typically called “risk perceptions.” The underlying experience that informs those judgments tends to derive from news media, which rather thoroughly document mishaps and threats occurring throughout the world [37]. Risk perception, in information security, has been identified as the first and most common area that can cause the feeling of security to diverge from the reality of security [34].

Manuscript received September 6, 2010; revised January 18, 2011, August 10, 2011, and December 7, 2011; accepted January 5, 2012. Date of publication; date of current version. Review of this manuscript was arranged by Department Editor B. C. Y. Tan. This work was supported in part by the Center for Education and Research in Information Assurance and Security, Purdue University, and by the National Science Foundation under Grant 0725152.

The authors are with Purdue University, West Lafayette, IN 47907 USA (e-mail: fariborz@purdue.edu; mja@cs.purdue.edu; spaf@cerias.purdue.edu).

Digital Object Identifier 10.1109/TEM.2012.2185801
Incentives, in the context of information security, are defined as: “The motive that the people guarding and maintaining the systems have to do their job properly and also the motive that the attackers have to try to defeat your policy” [1]. Anderson identifies incentives, along with policy, mechanism, and assurance, as the four interacting elements of security engineering analysis, and indicates that misperception of risk underlies many policy problems [1].
Ba et al. [3] identify important unresolved problems along the incentive-alignment dimension of information systems, and present a research agenda to address them. They mention behavioral theories from other disciplines (e.g., economics, psychology), and argue that research in these areas may illuminate both how to resolve incentive-alignment issues in information systems design and how to change the underlying assumptions of an information system. Here, we present a model that aligns with the framework proposed by Ba et al. and is intended to help align stakeholder perceptions of information security risks with governance incentives. This paper advances Ba et al.’s “conceptual” framework by presenting an algorithm to “quantify” incentive misalignment in information security decisions. In particular, our methodology seeks to identify and correct perverse incentives, as documented in the economics literature (e.g., [17], [24]). In our methodology, we acknowledge that practitioners may base their decisions on subjective beliefs that may well be objectively erroneous.
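To make the idea of quantifying misalignment concrete, consider the following toy sketch. This is our own illustration, not the algorithm developed in this paper; the action names, probabilities, and payoff figures are all invented. It frames a perverse incentive as the gap between the organization's best achievable expected payoff and the expected payoff it actually receives when an agent optimizes his or her own reward.

```python
def expected_value(payoffs, probs):
    """Expected value of an action given its outcome payoffs and probabilities."""
    return sum(p * v for p, v in zip(probs, payoffs))

def misalignment_cost(actions, probs, org_payoff, agent_payoff):
    """Expected loss to the organization when the agent chooses the action
    that maximizes the agent's own (possibly subjective) expected payoff,
    rather than the action that is best for the organization."""
    org_best = max(actions, key=lambda a: expected_value(org_payoff[a], probs[a]))
    agent_pick = max(actions, key=lambda a: expected_value(agent_payoff[a], probs[a]))
    return (expected_value(org_payoff[org_best], probs[org_best])
            - expected_value(org_payoff[agent_pick], probs[agent_pick]))

# Invented numbers: "patch now" costs the manager visible downtime but avoids
# a possible breach; "defer" looks cheap to the manager, who bears little of
# the breach cost that falls on the organization.
actions = ["patch_now", "defer"]
probs = {"patch_now": [1.0], "defer": [0.9, 0.1]}        # defer: 10% breach chance
org_payoff = {"patch_now": [-10], "defer": [0, -500]}    # org loses 500 on a breach
agent_payoff = {"patch_now": [-5], "defer": [0, -20]}    # manager's personal stakes

print(misalignment_cost(actions, probs, org_payoff, agent_payoff))  # prints 40.0
```

Here the manager rationally defers (expected personal cost 2 versus 5), while the organization would prefer patching (expected cost 10 versus 50); the 40-unit gap is the quantified damage of the perverse incentive.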
After presenting some related work and theoretical issues, we describe a methodology for identifying perverse incentives, provide illustrative examples, and explain how to quantify incentive-misalignment risk. The remainder of this paper deals with perceptions of information security risks, including those in incentive alignment, that may confront practitioners. We briefly present our revision to the seminal risk perception model developed by Fischhoff and Slovic, and the results of our interviews with 42 information security executives across the U.S. about the role of risk perception and incentives in information security decisions. Finally, we discuss why a focus on monetary incentives alone cannot resolve information security issues, and thus, why managers need to address a variety of issues, including perceptual principles and security governance.
0018-9391/$31.00 © 2012 IEEE