Journal of Consulting and Clinical Psychology
1998, Vol. 66, No. 1, 163-167
Copyright 1998 by the American Psychological Association, Inc. 0022-006X/98/$3.00

Being Bolder With the Boulder Model: The Challenge of Education and Training in Empirically Supported Treatments

Gerald C. Davison
University of Southern California

A number of factors interfere with the realization of the scientist-practitioner model of training in applied psychology. Resistance to empirically supported treatments (ESTs) may arise from both academic faculty and internship supervisors who have an investment in approaches of longer standing but with less empirical justification. A possible problem with ESTs, however, is that they typically derive from studies that use treatment manuals, which, originally developed to define the independent variables in psychotherapy research, have become central in graduate training. Because manuals can constrain clinician behavior and because they are almost always associated with categorically defined diagnostic categories, one can lose sight of the idiographic analysis of single cases. Reliance on manualized treatment can discourage functional analysis of the complexities of individual cases. Achieving some synthesis of this dialectic poses a significant challenge to the continuing development of the science and profession of applied psychology.

There is a sense in which discussion today of empirically supported treatments (and psychopathology and assessment, one presumes) is almost quaint. After all, was not the scientist-professional model of training at the very core of the earliest conception of clinical psychology following World War II?
The ideal of the clinical (or, more generally speaking, the professional or applied) psychologist has for at least the past half century been that of training the student and the professional to think like a scientist and to look to findings from controlled research for clues to understanding psychopathology and for devising and evaluating the most effective and most efficient interventions and assessments. But we all know that this has not happened. The ratio of unbridled speculation and appeals to authority vis-à-vis anything we would call scientific data is very large indeed. And it is inaccurate and unfair to heap all of the blame on the professional school movement that began back in the 1960s, when we feared a shortage of doctoral-level clinical psychologists. A disappointment that I believe most of us in avowedly Boulder model training programs share is that we often find ourselves spending time and effort teaching our students intervention and assessment procedures and approaches that lack empirical justification or are, at best, inefficient ways to gather information and design humane and effective interventions. It is for this reason, the fact that our scientist-professional rhetoric outdistances our training and educational practices, that the efforts of Division 12 of the American Psychological Association (APA) are timely and welcome (Crits-Christoph, Frank, Chambless, Brody, & Karp, 1995; Task Force on Promotion and Dissemination of Psychological Procedures, 1995). No doubt the chances of future success are enhanced by the availability in the late 1990s of a variety of interventions that have been investigated in controlled outcome and process studies.

Author note: Correspondence concerning this article should be addressed to Gerald C. Davison, Department of Psychology, University of Southern California, Los Angeles, California 90089-1061. Electronic mail may be sent to gdaviso@almaak.usc.edu.
As a consequence of this empirical study, explicit and detailed treatment manuals are available that have not only defined the independent variables in research but also constitute useful instructional tools for the education of our future colleagues as well as for the in-service training of those of us who are not familiar with these findings and associated treatment materials.

Calhoun, Moras, Pilkonis, and Rehm (1998) laid out clearly and effectively many of the advantages and challenges inherent in the availability of empirically supported treatments (ESTs) with respect to graduate education. I have some reactions to some of what they put forth, along with some elaborations and extensions of several of their points.

Eschewing Unverified Procedures

There are hurdles to bringing ESTs into our training programs. Courses are sometimes retained out of deference to tradition or to the special interests of a colleague. It can be awkward interpersonally and politically to try to shift course and practicum offerings to approaches and procedures that enjoy more empirical support than what some clinical faculty have been doing for years and believe to be effective, especially when one is dealing with tenured (and respected) faculty. This has to be done, however, if we are to be true to our scientist-professional heritage as most recently articulated at the Gainesville Conference (Belar & Perry, 1992).¹

I would add one further impediment to concentrating on sci-

¹ On the other hand, issues of academic freedom come into play when one considers constraints on what and how faculty teach. This is not an easy matter to resolve. One hopes that the faculty selection process as well as the scholarly environment in one's academic unit coalesce to encourage and support teaching, research, and clinical supervision that are tied as closely as possible to the emergent scientific picture.