© 1997 The British Association of Oral and Maxillofacial Surgeons

Computed intraoperative navigation guidance-a preliminary report on a new technique

G. Enislidis, A. Wagner, O. Ploder, R. Ewers
University Clinic for Maxillofacial Surgery, Vienna, Austria

SUMMARY. Objective-To assess the value of a computer-assisted three-dimensional guidance system (Virtual Patient System) in maxillofacial operations. Design-Laboratory and open clinical study. Setting-Teaching hospital, Austria. Subjects-Six patients undergoing various procedures including removal of foreign body (n = 3) and biopsy, maxillary advancement, and insertion of implants (n = 1 each). Interventions-Storage of computed tomographic (CT) pictures on an optical disc, and imposition of intraoperative video images on to these. The resulting display is shown to the surgeon on a micromonitor in his head-up display for guidance during the operation. Main outcome measures-To improve orientation during complex or minimally invasive maxillofacial procedures and to make such operations easier and less traumatic. Results-Successful transfer of computed navigation technology into an operating room environment and positive evaluation of the method by the surgeons involved. Conclusions-Computer-assisted three-dimensional guidance systems have the potential to make complex or minimally invasive procedures easier to do, thereby reducing postoperative morbidity.
INTRODUCTION

In recent years, three-dimensional imaging techniques have proved to be useful adjuncts to planning, simulating and validating operations.1,2 Imaging facilities have been linked to surgical instruments or systems of guidance to generate a new type of environment, in which the surgeon is provided with information on deep tissue structures during the operation.3 The aim is to provide reliable information on which to base surgical decision-making during the operation without neglecting safety and precision.4 Technical workers and clinicians are currently collaborating to develop new interventions and investigate their applications. Our aim is to describe one recent advance.

PATIENTS AND METHODS

In an attempt to integrate the evolving technology of 'virtual reality' into a stereotactic navigation system, Artma Biomedical, Inc., Vienna, Austria,5 developed a new device called the 'Virtual Patient System®' (Figs 1 and 2), which was adapted for use in maxillofacial procedures by the Vienna Clinic for Maxillofacial Surgery.6 Cephalograms, computed tomograms (CT) or magnetic resonance images (MRI) on optical disks were transferred to the workstation for correlation with video recordings of the patient's face. In this fusion of images, the spatial coordinates of radiological and video images are imposed on each other. The three-dimensional geometry of all the images is then matched with the patient's anatomy (Fig. 3). During preoperative planning, the surgeon marks important structures such as foreign bodies, routes of access or anatomical landmarks on the images. To visualize deep structures and to detect the target, these planning graphics can be superimposed on the live video camera images. The combined images, live video and overlay graphics, are then shown on a micromonitor in the surgeon's head-up display (Figs 4 and 5), which is a head-mounted device using a prism to project the images (Fig. 6) onto the user's retina.
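The matching of the images' three-dimensional geometry with the patient's anatomy is, in essence, a rigid point-based registration problem: corresponding landmarks are identified in the CT coordinate system and on the patient, and a rotation and translation are found that map one set onto the other. The paper does not describe the Artma system's internal algorithm; the sketch below is only an illustrative least-squares solution (the Kabsch/SVD method), with invented landmark data:

```python
import numpy as np

def rigid_register(ct_points, patient_points):
    """Least-squares rotation R and translation t mapping CT landmark
    coordinates onto patient-space landmarks (Kabsch/SVD method)."""
    ct_c = ct_points - ct_points.mean(axis=0)          # centre both clouds
    pa_c = patient_points - patient_points.mean(axis=0)
    H = ct_c.T @ pa_c                                  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))             # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = patient_points.mean(axis=0) - R @ ct_points.mean(axis=0)
    return R, t

# Toy example: patient landmarks are the CT landmarks shifted by (5, -2, 3) mm
ct = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
pa = ct + np.array([5., -2., 3.])
R, t = rigid_register(ct, pa)
assert np.allclose(R, np.eye(3))
assert np.allclose(t, [5., -2., 3.])
```

In practice at least three non-collinear landmarks are needed, and the residual error after registration gives a direct measure of navigation accuracy.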
During the procedure, the surgeon is able to see the surgical site and at the same time to watch the images on the micromonitor in the head-up display. This so-called 'semi-immersive environment' guides the surgeon and increases freedom of communication, motion, and view. There is no need to twist the neck and stare at a monitor while operating, nor is there any need for head clamps or additional instruments. This is defined as 'augmented reality', which lies between reality and 'virtual' reality. Virtual reality implies full immersion in a virtual environment generated by audiovisual stimuli, whereas augmented reality allows the recognition of structures in the peripheral environment and in the full-color image display simultaneously. Safety features such as unobstructed sight and range of movement are guaranteed.

Among the variety of ways of continuously matching the artificial environment to reality, we chose an electromagnetic tracking system (Polhemus Inc., Colchester, Vermont, USA) and special software (Artma Biomedical, Inc., Vienna, Austria) to integrate into our prototype. Electromagnetic sensors are fixed to the patient's forehead, to instruments, to implants and to a video camera, which can be mounted on the surgeon's head. Whenever an endo-
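Continuous matching means that, for each video frame, the tracked pose of the head-mounted camera is used to reproject the planned target into the current image so the overlay follows any movement of the surgeon's head or the patient. The helper names, camera parameters and poses below are our own illustrative assumptions, not the Artma or Polhemus interfaces; the sketch only shows the per-frame geometry:

```python
import numpy as np

def pose_matrix(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t,
    as a tracker might report for the head-mounted camera."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def project(point_3d, T_cam_from_patient, focal=800.0, cx=320.0, cy=240.0):
    """Pinhole projection of a patient-space point into camera pixels
    (focal length and principal point are invented example values)."""
    p = T_cam_from_patient @ np.append(point_3d, 1.0)  # into camera frame
    x, y, z = p[:3]
    return np.array([focal * x / z + cx, focal * y / z + cy])

# A planned target, already registered into patient space
target = np.array([0.0, 0.0, 0.0])

# Tracker reports the camera 500 mm in front of the target, looking at it
T_cam = pose_matrix(np.eye(3), np.array([0.0, 0.0, 500.0]))
px = project(target, T_cam)
# The target lies on the optical axis, so it projects to the image centre
assert np.allclose(px, [320.0, 240.0])
```

Repeating this projection whenever the tracker reports a new camera or patient pose is what keeps the overlay graphics registered to the live video in the head-up display.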