Painful monitoring: Automatic pain monitoring using the UNBC-McMaster shoulder pain expression archive database ☆

Patrick Lucey a,b,c,⁎, Jeffrey F. Cohn b,c, Kenneth M. Prkachin d, Patricia E. Solomon e, Sien Chew f, Iain Matthews a,c

a Disney Research Pittsburgh, Pittsburgh, PA, United States
b Department of Psychology, University of Pittsburgh, Pittsburgh, PA, United States
c Robotics Institute, Carnegie Mellon University, Pittsburgh, PA, United States
d Department of Psychology, University of Northern British Columbia, Canada
e School of Rehabilitation Sciences, McMaster University, Hamilton, Canada
f SAIVT Laboratory, Queensland University of Technology, Brisbane, Australia
article info

Article history:
Received 1 July 2011
Received in revised form 26 November 2011
Accepted 4 December 2011

Keywords:
Pain
Active Appearance Models (AAMs)
Action Units (AUs)
FACS

abstract
It has recently been shown that substantial improvements in patient outcomes can be gained in hospital intensive care units when medical staff periodically monitor patient pain levels. However, given the workload and stress that staff are already under, this type of monitoring has been difficult to sustain, so an automatic solution could be an ideal remedy. Using an automatic facial expression system for this purpose is an achievable pursuit, as pain can be described via a number of facial action units (AUs). To facilitate this work, the "University of Northern British Columbia-McMaster Shoulder Pain Expression Archive Database" was collected, containing video of the faces of participants who were suffering from shoulder pain while they performed a series of range-of-motion tests. Each frame of this data was AU-coded by certified FACS coders, and self-report and observer measures were also taken at the sequence level. To promote and facilitate research into pain and to augment current datasets, we have made publicly available a portion of this database, which includes 200 sequences across 25 subjects, containing more than 48,000 coded frames of spontaneous facial expressions with 66-point AAM-tracked facial feature landmarks. In addition to describing the data distribution, we give baseline pain and AU detection results on a frame-by-frame basis at the binary level (i.e. AU vs. no-AU and pain vs. no-pain) using our AAM/SVM system. A further contribution is the classification of pain intensity at the sequence level using facial expressions and 3D head pose changes.
© 2011 Elsevier B.V. All rights reserved.
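The frame-level baseline described above pairs AAM-tracked landmarks with a binary SVM. As a minimal, hedged sketch of that idea (not the authors' implementation: the feature normalization, training data, and SVM solver here are stand-in assumptions, and the data are synthetic rather than drawn from the UNBC-McMaster database), a linear SVM can be trained on per-frame landmark feature vectors, with 66 (x, y) landmarks flattened to 132 dimensions:

```python
import numpy as np

# Hedged sketch: frame-level binary pain detection with a linear SVM on
# AAM-style landmark features. Synthetic stand-in data only.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 132))                    # 400 "frames", 66 (x, y) landmarks
y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1, -1)   # toy labels: pain vs. no-pain

def train_linear_svm(X, y, lam=0.01, epochs=50):
    """Pegasos-style stochastic subgradient descent for a linear SVM."""
    w = np.zeros(X.shape[1])
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            t += 1
            eta = 1.0 / (lam * t)           # decaying learning rate
            if y[i] * X[i].dot(w) < 1:      # margin violated: hinge-loss step
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
            else:
                w = (1 - eta * lam) * w     # regularization-only step
    return w

w = train_linear_svm(X, y)
pred = np.where(X.dot(w) > 0, 1, -1)        # frame-by-frame binary decision
print("training accuracy:", (pred == y).mean())
```

In practice, the decision value `X.dot(w)` for each frame gives a continuous score that can be thresholded for the binary AU/no-AU or pain/no-pain decisions reported in the baseline.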
1. Introduction
In his recent book "The Checklist Manifesto" [1], Atul Gawande notes that tremendous improvements in patient outcomes in intensive care unit (ICU) settings have been achieved through adherence to standardized hygiene and monitoring checklists. One item on these checklists is pain monitoring, in which a medical staff member (i.e. a nurse or physician) checks on a patient every 4 hours to evaluate whether they are suffering from pain and to make any adjustments in pain medication, treatment, or diagnosis that may be warranted. However, due to the large workload that hospital staff currently experience, monitoring pain consistently and reliably has been difficult to achieve. As such, a method to automatically monitor pain could be an ideal solution.
Measuring or monitoring pain is normally conducted via self-report, as it is convenient and requires no special skill or staffing. However, self-report measures cannot be used when patients cannot communicate verbally (e.g. due to unconsciousness, breathing tubes interfering with speech, or a lack of functional speech, as in infants), so an observer rating is required, in which the observer chooses the face on the "faces of pain" scale that most resembles the patient's facial expression [3]. Both of these measures are problematic, however, as they are subjective and do not give a continuous output over time (i.e. is the pain increasing, decreasing, or spiking?).¹ Many researchers have pursued the goal of obtaining a continuous, objective measure of pain through analyses of tissue pathology, neurological "signatures", imaging procedures, testing of muscle strength, and so on [4]. These approaches
Image and Vision Computing 30 (2012) 197–205
☆ This paper has been recommended for acceptance by special issue Guest Editors
Rainer Stiefelhagen, Marian Stewart Bartlett and Kevin Bowyer.
⁎ Corresponding author at: Disney Research Pittsburgh, Pittsburgh, PA, United States. Tel.: +1 412 298 6976.
E-mail address: pjlucey@gmail.com (P. Lucey).
¹ See Williams et al.'s work [2] for a full description and analysis of these factors.
doi:10.1016/j.imavis.2011.12.003