Hybrid Navigation Interface for Orthopedic and Trauma Surgery

Joerg Traub 1, Philipp Stefan 1, Sandro Michael Heining 2, Tobias Sielhorst 1, Christian Riquarts 2, Ekkehard Euler 2, and Nassir Navab 1

1 Chair for Computer Aided Medical Procedures (CAMP), TU Munich, Germany
{traub, stefanp, sielhors, navab}@cs.tum.edu
2 Chirurgische Klinik und Poliklinik - Innenstadt, LMU Munich, Germany
{Sandro-Michael.Heining, Christian.Riquarts, Ekkehard.Euler}@med.uni-muenchen.de

Abstract. Several visualization methods for intraoperative navigation systems have been proposed in the past. In standard slice based navigation, three dimensional imaging data is visualized on a two dimensional user interface in the operating room. Another technology is in-situ visualization, i.e. the superimposition of imaging data directly into the view of the surgeon, spatially registered with the patient. Thus, the three dimensional information is represented on a three dimensional interface. We created a hybrid navigation interface combining an augmented reality visualization system, based on a stereoscopic head mounted display, with a standard two dimensional navigation interface. In an experimental setup, trauma surgeons performed a drilling task using the standard slice based navigation system, different visualization modes of an augmented reality system, and the combination of both. The integration of a standard slice based navigation interface into an augmented reality visualization overcomes the shortcomings of both systems.

1 Introduction

Standard slice based navigation systems are commonly used and commercially available for orthopedic and trauma surgery. In general, they consist of a position and orientation tracking system and a two dimensional user interface. These systems visualize navigation information, derived from three dimensional medical imaging data, on an external monitor.
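The core operation behind such slice based guidance can be illustrated with a minimal sketch: a tracked tool-tip position, reported in the tracking system's coordinate frame, is mapped through the patient-to-image registration transform into CT voxel coordinates, from which the displayed axial slice is chosen. The transform values, function names, and voxel spacing below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Assumed rigid transform from tracker coordinates to CT image coordinates (mm),
# as estimated by patient-to-image registration. Values are purely illustrative.
T_image_from_tracker = np.array([
    [1.0, 0.0, 0.0,  40.0],
    [0.0, 1.0, 0.0,  25.0],
    [0.0, 0.0, 1.0, -10.0],
    [0.0, 0.0, 0.0,   1.0],
])

def tool_tip_to_voxel(tip_tracker_mm, voxel_size_mm):
    """Map a tracked tool-tip position (mm, tracker frame) to CT voxel indices."""
    tip_h = np.append(tip_tracker_mm, 1.0)       # homogeneous coordinates
    tip_image_mm = T_image_from_tracker @ tip_h  # into image space (mm)
    return np.round(tip_image_mm[:3] / voxel_size_mm).astype(int)

# Example: tracked tip at (10, 5, 30) mm, isotropic 1 mm voxels.
ijk = tool_tip_to_voxel(np.array([10.0, 5.0, 30.0]), np.array([1.0, 1.0, 1.0]))
axial_slice = ijk[2]  # index of the slice shown on the 2-D navigation monitor
```

A real system would additionally track a reference frame attached to the patient and update this transform continuously; the sketch only shows the single coordinate mapping that links tracking data to the slice display.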
The three major drawbacks of state-of-the-art navigation systems are: a) every imaging and navigation device comes with its own user interface, b) guidance information based on three dimensional data is visualized on two dimensional user interfaces, and c) the navigation information is not visualized directly on the operation situs, forcing the surgeon to observe the navigation information at a location different from where the action is performed.

Augmented reality visualization was introduced as an alternative user interface for navigated surgery. The navigation information is superimposed onto the surgeon's view of the real world. In the past decade, numerous applications and hardware setups using augmented reality visualization in medical navigation have been proposed.

R. Larsen, M. Nielsen, and J. Sporring (Eds.): MICCAI 2006, LNCS 4190, pp. 373-380, 2006. © Springer-Verlag Berlin Heidelberg 2006