CES Conference ’99 – Evaluation: An Essential Contribution

PRELIMINARY PROGRAM
Updated March 16, 1999 (Subject to Change)

Papers • Panels • Posters • Symposia

Sessions will be delivered in the language in which the abstract was submitted.

Monday, May 17, 1999 – 10:30 am – 12:00 pm

1. Panel Presentation – Building Evaluation Capacity within a Volunteer Organization: The Canadian Cancer Society
H.S. Campbell, C. Nykiforuk, R. Cameron, Centre for Behavioural Research and Program Evaluation
K.S. Brown, Faculty of Actuarial Sciences and Statistics, University of Waterloo
M. Asselbergs, A. Davis, A. Vezina, Canadian Cancer Society (National Office)

In 1993 the Canadian Cancer Society (CCS) and the National Cancer Institute of Canada created the Centre for Behavioural Research and Program Evaluation (CBRPE). Its mandate was to increase the quality and quantity of sociobehavioural research applied to cancer in Canada and to assist the CCS in program development and evaluation. Over its first five years of operation, CBRPE conducted a number of program evaluations for the CCS that shaped the development of informational programs and emotional support programs. Over the next five years, the goal of CBRPE and the CCS will be to increase the evaluation capacity of the CCS, at both the national and divisional levels, and to develop and utilize performance monitoring and planning that will allow the organization to become consistently focused on effectiveness and quality.
This presentation will address the challenges faced in building evaluation capacity in a volunteer organization and explore strategies for meeting these challenges: 1) securing buy-in and cooperation from program planners as well as those at the grassroots level who deliver programs; 2) setting priorities, identifying program goals, and linking them to performance monitoring and quality improvement; 3) developing protocols and tools that can be used at both the local and organizational levels for measuring program implementation, reach, impact, and costs; and 4) developing systems by which evaluation data become part of ongoing monitoring and quality improvement. Examples from current CCS programs (Cancer Information Service, Emotional Support Programs) will be used to illustrate the above. This session is designed to be interactive and to focus discussion on the challenges facing volunteer organizations as they seek to improve organizational effectiveness.

2. Paper Presentations – Health Care Performance

A. Report Cards and Quality Improvement for Stroke Care
Adalstein D. Brown, University of Western Ontario
Geoffrey M. Anderson, Department of Health Administration, University of Toronto

Background: Health care report cards frequently report the outcomes of care provided for stroke. We explore the value of currently available quality improvement tools specific to stroke care.

Methods: We identified the most commonly reported health care outcomes for stroke and quality improvement tools such as guidelines and protocols for stroke care.

Results: Currently, the continuum of stroke care is not well supported by comprehensive, consistent, and integrated clinical guidelines. Guidelines do not include explicit criteria for the evaluation of patient care following implementation. Guidelines vary in their relevance to specific outcome measures and may provide conflicting advice on appropriate patient management.
Discussion: Guidelines and other quality improvement tools are inadequate to support quality improvement as measured in report cards.