Kidney Cancer Augmented Reality: A New Tool To Improve Surgical Accuracy during Laparoscopic Partial Nephrectomy? Preliminary In Vitro and In Vivo Results

Dogu Teber a, Selcuk Guven b, Tobias Simpfendörfer c, Mathias Baumhauer c, Esref Oguz Güven d, Faruk Yencilek e, Ali Serdar Gözen a, Jens Rassweiler a,*

a Department of Urology, SLK-Kliniken Heilbronn, University of Heidelberg, Heidelberg, Germany
b Department of Urology, Meram Medical School, Selcuk University, Konya, Turkey
c Department of Medical and Biological Informatics, German Cancer Research Center, Heidelberg, Germany
d Department of Urology, Mustafa Kemal University, Antakya, Hatay, Turkey
e Department of Urology, Yeditepe University, Istanbul, Turkey

EUROPEAN UROLOGY 56 (2009) 332–338

Article info
Article history: Accepted May 6, 2009. Published online ahead of print on May 19, 2009.
Keywords: Computer-Assisted Surgery; Augmented Reality; Soft Tissue Navigation; Laparoscopic Partial Nephrectomy

Abstract

Background: Use of an augmented reality (AR)–based soft tissue navigation system in urologic laparoscopic surgery is an evolving technique.

Objective: To evaluate a novel soft tissue navigation system developed to enhance the surgeon's perception and to provide decision-making guidance directly before initiation of kidney resection for laparoscopic partial nephrectomy (LPN).

Design, setting, and participants: Custom-designed navigation aids, a mobile C-arm capable of cone-beam imaging, and a standard personal computer were used. The feasibility and reproducibility of the inside-out tracking principle were evaluated in vitro in a porcine model with an artificially created intraparenchymal tumor. The same algorithm was then incorporated into clinical practice during LPN.
Interventions: Evaluation of the fully automated inside-out tracking system was repeated in exactly the same way for 10 different porcine renal units. Additionally, 10 patients underwent retroperitoneal LPN under manual AR guidance by one surgeon.

Measurements: The navigation errors and image-acquisition times were determined in vitro. The mean operative time, time to locate the tumor, and positive surgical margin status were assessed in vivo.

Results and limitations: The system was able to navigate and superimpose the virtually created images onto the real-time images with an error margin of only 0.5 mm, and fully automated initial image acquisition took 40 ms. The mean operative time was 165 min (range: 135–195 min), and the mean time to locate the tumor was 20 min (range: 13–27 min). None of the cases required conversion to open surgery. Definitive histology revealed tumor-free margins in all 10 cases.

Conclusions: This novel AR tracking system proved to be functional with a reasonable margin of error and image-to-image registration time. Superimposing pre- or intraoperative imaging data onto real-time videoendoscopic images will simplify laparoscopic procedures and increase their precision.

© 2009 European Association of Urology. Published by Elsevier B.V. All rights reserved.

* Corresponding author. Department of Urology, SLK-Kliniken Heilbronn, Am Gesundbrunnen 20-26, 74078 Heilbronn, Germany. Tel. +49 7131492401; Fax: +49 7131492429. E-mail address: jens.rassweiler@slk-kliniken.de (J. Rassweiler).

doi:10.1016/j.eururo.2009.05.017
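The abstract does not describe the authors' registration algorithm, but the 0.5 mm navigation error they report is the residual of aligning tracked navigation-aid positions with their positions in the cone-beam image. A minimal sketch of such a step, assuming rigid point-based registration via the standard Kabsch (SVD) method with hypothetical fiducial coordinates, and the root-mean-square residual as the error metric:

```python
import numpy as np

def rigid_register(src, dst):
    """Estimate rotation R and translation t with dst ~ R @ src + t (Kabsch)."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so R is a proper rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

def registration_error_mm(src, dst, R, t):
    """Root-mean-square residual distance (mm) after applying (R, t) to src."""
    residuals = (src @ R.T + t) - dst
    return float(np.sqrt((residuals ** 2).sum(axis=1).mean()))

# Hypothetical navigation-aid positions (mm): six fiducials seen in the
# cone-beam image (src) and the same points under a known pose plus noise
# standing in for tracking measurements (dst).
rng = np.random.default_rng(0)
src = rng.uniform(0.0, 100.0, size=(6, 3))
angle = np.radians(30.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([5.0, -3.0, 12.0])
dst = src @ R_true.T + t_true + rng.normal(scale=0.3, size=src.shape)

R, t = rigid_register(src, dst)
err = registration_error_mm(src, dst, R, t)
print(f"registration error: {err:.2f} mm")
```

This is an illustrative toy computation only; the published system additionally maps the recovered pose into the endoscope's camera frame to superimpose the segmented tumor model on the live video.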