A comparison of an electronic version of the SF-36 General Health Questionnaire to the standard paper version

Judy M. Ryan 1, John R. Corry 2, Robyn Attewell 3 & Michael J. Smithson 1
1 School of Psychology, Australian National University (E-mail: judy.ryan@anu.edu.au); 2 Occupational Health and Rehabilitation Services; 3 Covance Pty Ltd, Canberra, Australia

Accepted in revised form 16 October 2001

Abstract

Because of its sound psychometric properties the SF-36 General Health Questionnaire is used throughout the world, yet it is difficult to analyse and score. Using a newly developed software package, onto which any questionnaire can be loaded, we developed an electronic version of the SF-36 General Health Questionnaire. The purpose of this study is to test the effect of the electronic mode of administration on the measurement properties of the SF-36. In a randomised cross-over design study, 79 healthy individuals and 36 chronic pain patients completed both electronic and paper versions of the SF-36. Seventy-one percent preferred the electronic SF-36, 7% stated no preference, and 22% preferred the paper version. Completion time for the electronic SF-36 was slightly less, and there were no missing or problematical responses, whereas 44% of participants had at least one missing or problematical response in the paper version. Data entry and auditing time was 8 hours. There was less than 4% inter-version difference for any of the SF-36 sub-scales. The electronic SF-36 was well accepted and slightly quicker to complete than the paper version. We conclude that the electronic SF-36 is equivalent in performance to, and more effective than, the paper version.

Key words: Data quality, Electronic administration mode, SF-36

Introduction

Quality of life questionnaires such as the short form-36 (SF-36) General Health Questionnaire are increasingly used in research and clinical practice [1].
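The scoring difficulty the abstract refers to stems from the published SF-36 scoring procedure: item responses are recoded, summed within each of the eight sub-scales, and then linearly transformed onto a 0–100 scale (higher scores indicating better health). A minimal sketch of that final transformation is shown below; the function name and the worked example are illustrative, with item counts for the Physical Functioning scale taken from the standard scoring manual.

```python
def transform_to_0_100(raw_score, lowest_possible, raw_range):
    """Linearly transform a raw SF-36 sub-scale score onto a
    0-100 scale, per the standard scoring formula:
    100 * (raw - lowest possible raw) / possible raw range."""
    return (raw_score - lowest_possible) / raw_range * 100.0

# Example: the Physical Functioning scale comprises 10 items, each
# scored 1-3, so raw scores run from 10 to 30 (a range of 20).
pf = transform_to_0_100(raw_score=25, lowest_possible=10, raw_range=20)
# pf == 75.0
```

An electronic administration mode can apply this transformation automatically at the point of data capture, which is one reason the paper contrasts it with manual entry and auditing of paper responses.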
While questionnaires are important in assessing the benefits of therapy for patients, they have inherent problems. Patients may misunderstand or fail to answer particular questions. They may also mark more than one response or place a mark overlapping more than one response. Patients' responses are either scanned or manually entered into a database, which then needs to be checked for scanning or keying errors. Several studies indicate that manually entering or scanning individual questionnaires into a database is time consuming and results in an increased error rate [2–4].

Over the last decade, interactive software packages such as Medquest¹ (Green Turtle Healthware 'Medquest') have been developed. Essentially, Medquest is a package of electronic questionnaires based on original validated questionnaires which are in the public domain. The package can be tailored to suit different clinical settings by creating electronic versions of particular questionnaires. The Medquest software can be operated on a standard IBM-compatible PC, which could be located in a semi-private area of a clinician's waiting room. Patients enter their responses to individual questions, which appear on the screen, by clicking on the appropriate radio button or numeral with the computer mouse, or via the

¹ Medquest software has been developed by the second author of this paper. At present it is available for free trial.

Quality of Life Research 11: 19–26, 2002. © 2002 Kluwer Academic Publishers. Printed in the Netherlands.