Copyright © by the Association of American Medical Colleges. Unauthorized reproduction of this article is prohibited.

Letters to the Editor

Academic Medicine, Vol. 90, No. 8 / August 2015

uncovering sensitive information that had no clinical or educational merit, then we would be guilty of snooping, which is clearly unethical, regardless of the source of data or our level of training.

When completing clinical rotations, we sign over patients for whom the diagnosis has not yet been made and/or the response to treatment established. From our own experience, and in discussion with our colleagues, it is common practice to ruminate on cases, ask colleagues for updates, and review progress via paper and electronic charts. Rather than snooping, the primary driver of information-seeking behavior is our need for cognition.3,4 And, while we appreciate that individuals may behave differently online versus in person,5 we should still be capable of meeting our learning needs without compromising the privacy needs of patients.

Disclosures: None reported.

Kevin McLaughlin, MB ChB (Hons), PhD
Assistant dean of undergraduate medical education, Cumming School of Medicine, University of Calgary, Calgary, Alberta, Canada; kmclaugh@ucalgary.ca.

Sylvain Coderre, MD, MSc
Associate dean of undergraduate medical education, Cumming School of Medicine, University of Calgary, Calgary, Alberta, Canada.

References
1 Brisson GE, Johnson Neely K, Tyler PD, Barnard C. Privacy versus confidentiality: More on the use of the electronic health record for learning. Acad Med. 2015;90:1001.
2 Canadian Medical Association. CMA Code of Ethics. Updated 2004. http://policybase.cma.ca/dbtw-wpd/PolicyPDF/PD04-06.pdf. Accessed May 4, 2015.
3 Wilson TD. Human information behavior. Informing Sci. 2000;3:49–55.
4 Verplanken B, Hazenberg PT, Palenéwen GR. Need for cognition and external information search effort. J Res Pers. 1992;26:128–136.
5 Suler J. The online disinhibition effect. Cyberpsychol Behav. 2004;7:321–326.

Differentiating Standardized Clinical Assessment and Management Plans From Clinical Practice Guidelines

To the Editor: We would like to thank Sox and Stewart1 for their extensive commentary on our Standardized Clinical Assessment and Management Plan (SCAMP) initiative,2 an effort that now extends to more than 60 SCAMPs touching nearly all aspects of medical care across several dozen network institutions. Their commentary focuses on the similarities between clinical practice guidelines (CPGs) and SCAMPs, contending that the major difference is that SCAMPs encourage deviations from the plan and thus use insights from variation to improve care. In one sense, Sox and Stewart are exactly right: SCAMPs are, at the outset, no different from CPGs in that they rely on established literature and “expert” opinion to create a standardized care algorithm. The major advance of SCAMPs is not, in fact, the encouragement of deviations as a tool for learning. Rather, it is the focused prospective collection of relevant clinical data, using targeted data statements that attempt to predict how the SCAMP will affect an episode of care. This collection of a limited data set, based on known uncertainties in an episode of care, is fundamentally Bayesian in nature. Because the data collection (including, but by no means limited to, deviation data) is tightly focused, the data can be collected and analyzed in a time frame unprecedented in medical care.

Until the first data are analyzed, a SCAMP is a CPG. After the clinicians receive their first analysis, everything changes: The CPG becomes a SCAMP. Clinicians learn from the deviations as well as from the targeted information collected and can improve the SCAMP using persuasive data (not, as is the case for CPGs, expert opinion or “conclusive” data). This process continues and even accelerates, and clinicians are invariably surprised by prior clinical beliefs that are shown to be flawed based on real data from their own patients.

SCAMPs are becoming increasingly popular among thoughtful academic clinical leaders who practice medicine on a day-to-day basis. The innovation responsible for this acceptance goes beyond the assessment of deviations: it is the use of targeted data statements to direct data collection and analysis, predict what happens in real-life medicine, and permit a continuous-improvement process. The focus of the commentary by Sox and Stewart indicates that we have not done very well in communicating why SCAMPs work where other efforts have failed. SCAMPs provide a framework that facilitates the collection and analysis of targeted, relevant clinical data. A well-designed SCAMP (note that not all SCAMPs have been well designed, although with six years and 50,000 patients’ worth of experience, we are getting better) will result in persuasive data that not only promote rapid improvement in care but also generate excitement among frontline clinicians eager to learn more. This excitement is a fundamental requirement if we are to change our health care system for the better.

Disclosures: None reported.

Michael Farias, MD, MS, MBA
Pediatric cardiology fellow, Department of Cardiology, Boston Children’s Hospital, and Department of Pediatrics, Harvard Medical School, Boston, Massachusetts.

Kevin G. Friedman, MD
Staff cardiologist, Department of Cardiology, Boston Children’s Hospital, and Department of Pediatrics, Harvard Medical School, Boston, Massachusetts.

James E. Lock, MD
Cardiologist-in-chief and professor of pediatrics, Department of Cardiology, Boston Children’s Hospital, and Department of Pediatrics, Harvard Medical School, Boston, Massachusetts.

Jane W. Newburger, MD, MPH
Associate cardiologist-in-chief and professor of pediatrics, Department of Cardiology, Boston Children’s Hospital, and Department of Pediatrics, Harvard Medical School, Boston, Massachusetts.

Rahul H. Rathod, MD
Staff cardiologist, Department of Cardiology, Boston Children’s Hospital, and Department of Pediatrics, Harvard Medical School, Boston, Massachusetts; Rahul.Rathod@childrens.harvard.edu.

References
1 Sox HC, Stewart WF. Algorithms, clinical practice guidelines, and standardized clinical assessment and management plans: Evidence-based patient management standards in evolution. Acad Med. 2015;90:129–132.
2 Farias M, Friedman KG, Lock JE, Newburger JW, Rathod RH. Gathering and learning from relevant clinical data: A new framework. Acad Med. 2015;90:143–148.
The Role of Professional Medical Education Societies in Fostering Professional Identity

To the Editor: I read with keen interest the article by Sabel and colleagues1 describing the barriers to professional identity construction as a medical educator among physicians in the United Kingdom. The experience of anesthesiology in the United States may serve to illustrate one facet of fostering professional identity as a medical educator.