CAP Laboratory Improvement Programs

Professional Practice Evaluation for Pathologists: The Development, Life, and Death of the Evalumetrics Program

Keith E. Volmar, MD; Shannon J. McCall, MD; Ronald B. Schifman, MD; Michael L. Talbert, MD; Joseph A. Tworek, MD; Keren I. Hulkower, PhD; Anthony J. Guidi, MD; Raouf E. Nakhleh, MD; Rhona J. Souers, MS; Christine P. Bashleben, MT(ASCP); Barbara J. Blond, MBA, MT(ASCP)

Context.—In 2008, the Joint Commission (JC) implemented a standard mandating formal monitoring of physician professional performance as part of the process of granting and maintaining practice privileges.

Objective.—To create a pathology-specific management tool to aid pathologists in constructing a professional practice-monitoring program, thereby meeting the JC mandate.

Design.—A total of 105 College of American Pathologists (CAP)–defined metrics were created. Metrics were based on the job descriptions of pathologists' duties in the laboratory, and metric development was aided by experience from the Q-Probes and Q-Tracks programs. The program was offered in a Web-based format, allowing secure data entry, customization of metrics, and central data collection for future benchmarking.

Results.—The program was live for 3 years, with 347 pathologists subscribed from 61 practices (median, 4 per institution; range, 1–35). Subscribers used 93 of the CAP-defined metrics and created 109 custom metrics. The median number of CAP-defined metrics used per pathologist was 5 (range, 1–43), and the median number of custom-defined metrics per pathologist was 2 (range, 1–5). Most frequently, 1 to 3 metrics were monitored (42.7%), with 20% each following 4 to 6 metrics, 5 to 9 metrics, or greater than 10 metrics. Anatomic pathology metrics were used more commonly than clinical pathology metrics. Owing to low registration, the program was discontinued in 2016.
Conclusions.—Through careful vetting of metrics it was possible to develop a pathologist-specific management tool to address the JC mandate. While this initial product failed, valuable metrics were developed and implementation knowledge was gained that may be used to address new regulatory requirements for emerging value-based payment systems.

(Arch Pathol Lab Med. 2017;141:551–558; doi: 10.5858/arpa.2016-0275-CP)

In 2008, the Joint Commission (JC) implemented a new standard mandating evaluation of practitioners' professional performance as part of the process of granting and maintaining practice privileges in a health care organization.1–3 The mandate included 2 distinct forms of evaluation: (1) Ongoing Professional Practice Evaluation (OPPE) and (2) Focused Professional Practice Evaluation (FPPE). OPPE is intended to be a means of performance evaluation that is conducted on an ongoing basis, with the aims to monitor competency, identify areas for possible improvement, and use objective data in decisions for continuance of practice privileges. FPPE involves more specific and time-limited monitoring of performance in 3 situations: (1) when a provider is initially granted privileges, (2) when new privileges are requested for an already privileged provider, and (3) when performance problems involving a privileged provider are identified (either through the OPPE process or by any other means, such as complaints or significant departure from accepted practice).

At the heart of the JC mandate is the concept that professional practice evaluation must reflect the competency of individual pathologists (not the laboratory or pathology department as a whole). This must be accomplished through objective measures, or metrics, that assess an individual pathologist's quality and timeliness of work, as well as compliance with requirements of continuing medical education and institutional service.
The JC recommends that OPPE/FPPE programs organize performance metrics based on the 6 competency areas defined by the Accreditation Council for Graduate Medical Education (ACGME).

Accepted for publication September 7, 2016.

From the Department of Pathology, Rex Pathology Associates, Raleigh, North Carolina (Dr Volmar); the Department of Pathology, Duke University Medical Center, Durham, North Carolina (Dr McCall); the Department of Pathology, Southern Arizona VA Healthcare System, Tucson (Dr Schifman); the Department of Pathology, Oklahoma University Health Science Center, Oklahoma City (Dr Talbert); the Department of Pathology, St. Joseph Mercy Hospital, Ann Arbor, Michigan (Dr Tworek); Structured Data Team (Dr Hulkower), Department of Biostatistics (Ms Souers), Department of Laboratory Improvement–Surveys (Ms Bashleben), Department of Laboratory Improvement–Biostatistics (Ms Blond), College of American Pathologists, Northfield, Illinois; the Department of Pathology, Newton-Wellesley Hospital, Newton, Massachusetts (Dr Guidi); and the Department of Pathology, Mayo Clinic Jacksonville, Jacksonville, Florida (Dr Nakhleh).

The authors have no relevant financial interest in the products or companies described in this article.

Reprints: Keith E. Volmar, MD, Department of Pathology, Rex Pathology Associates, 4420 Lake Boone Trail, Raleigh, NC 27607 (email: keith.volmar@unchealth.unc.edu).

Arch Pathol Lab Med—Vol 141, April 2017