COMPUTER ANIMATION AND VIRTUAL WORLDS
Comp. Anim. Virtual Worlds 2005; 16: 1–10
Published online in Wiley InterScience (www.interscience.wiley.com). DOI: 10.1002/cav.52

An augmented reality system to guide radio-frequency tumour ablation

By S. Nicolau*, A. Garcia, X. Pennec, L. Soler and N. Ayache

*Correspondence to: S. Nicolau, IRCAD, 1 Place de l'Hôpital, 67091 Strasbourg, France. E-mail: stephane.nicolau@ircad.u-strasbg.fr

Radio-frequency ablation is a difficult operative task that requires precise needle positioning at the centre of the pathology. This article presents an augmented reality system for hepatic therapy guidance that superimposes, in real time, 3D reconstructions (from CT acquisition) and a virtual model of the needle on external views of a patient. The superimposition of the reconstructed models is performed with a 3D/2D registration based on radio-opaque markers stuck on to the patient's skin. The characteristics of the problem (accuracy, robustness and processing time) led us to develop automatic procedures to extract and match the markers and to track the needle in real time. Experimental studies confirmed that our algorithms are robust and reliable. Preliminary experiments conducted on a human abdomen phantom showed that our system is highly accurate (needle positioning error within 3 mm) and enables the surgeon to reach a target in less than 1 minute on average. Our next step will be to perform an in vivo evaluation. Copyright © 2005 John Wiley & Sons, Ltd.

Received: 23 December 2003; Accepted: 26 April 2004

KEY WORDS: augmented reality; radio-frequency ablation; 3D/2D registration; needle tracking

Introduction

The treatment of liver tumours by radio-frequency ablation (RFA) is an evolving technology that uses coagulative necrosis to treat patients with unresectable primary or metastatic hepatic cancers [1]. The guidance procedure to reach tumours with the needle is usually performed visually, using intraoperative two-dimensional (2D) cross-sections of the patient obtained with either ultrasound (US) or computed tomography (CT) acquisition. Because of the difficulty of locating the tumour's centre in three dimensions (3D), needle positioning is not very accurate and the targeting procedure is very time consuming, since it requires many trials.

Real-time superimposition of images reconstructed in 3D from CT acquisition onto the real patient, so-called augmented reality (AR), may improve accuracy and decrease complications in interventions such as RFA needle placement. Such AR guidance systems are routinely used in neurosurgery and orthopaedic surgery [2–8]. However, there are few applications in the abdominal and thoracic zones. Mourgues [9] superimposes coronary arteries on endoscopic images with an accuracy of about 5 pixels, and Langø [10] provides preliminary results in laparoscopy with an accuracy of about 1 cm.

Our purpose is to build a guidance system for needle positioning that superimposes 3D reconstructions of the liver and its tumours onto video images of the patient's abdomen. To provide significant help to surgeons, the overall superimposition error has to be less than 5 mm and the computation time of the data processing has to be under 10 minutes.

In this article, we present the technical part of our AR system. Twenty-five radio-opaque markers are stuck on to the skin of the human abdomen (see phantom in Figure 1a), which is CT-scanned afterwards (slice thickness of 1 mm). A 3D segmentation, followed by a reconstruction (skin, liver and markers), is performed with specific software [11] (see Figures 1b and 1c). Two jointly calibrated cameras [12] are oriented toward the patient from two different points of view. To superimpose the 3D reconstruction on the video images, we perform a 3D/2D point registration between the markers localized on the reconstruction and those visible in the video images. Since an interactive procedure to localize and match the markers in the CT scan and the video images takes several minutes, we need to perform these tasks automatically. Firstly, we describe the
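The exact 3D/2D registration criterion the authors use is not detailed in this excerpt, so the sketch below is only an illustrative stand-in, not the paper's algorithm. It shows one classical way to register CT-frame markers to two jointly calibrated camera views: triangulate each matched marker from the two images (linear DLT triangulation), then fit a least-squares rigid transform (Kabsch/Umeyama) from the CT points to the triangulated points. All camera matrices, poses and marker coordinates here are synthetic assumptions for the demonstration.

```python
import numpy as np

def triangulate(P1, P2, u1, u2):
    """DLT triangulation of one 3D point from two calibrated views.
    P1, P2: 3x4 projection matrices; u1, u2: 2D pixel observations."""
    A = np.array([
        u1[0] * P1[2] - P1[0],
        u1[1] * P1[2] - P1[1],
        u2[0] * P2[2] - P2[0],
        u2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)          # null vector of A is the homogeneous point
    X = Vt[-1]
    return X[:3] / X[3]

def rigid_fit(src, dst):
    """Least-squares rigid transform (R, t) with dst ~ R @ src + t (Kabsch)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # reflection-safe rotation
    return R, cd - R @ cs

def project(P, X):
    """Pinhole projection of a 3D point to pixel coordinates."""
    h = P @ np.append(X, 1.0)
    return h[:2] / h[2]

# --- synthetic setup (hypothetical values, not from the paper) ---
rng = np.random.default_rng(0)
markers_ct = rng.uniform(-0.1, 0.1, (25, 3))      # 25 skin markers, metres, CT frame

a = 0.3                                            # ground-truth CT->camera pose
R_true = np.array([[np.cos(a), -np.sin(a), 0],
                   [np.sin(a),  np.cos(a), 0],
                   [0,          0,         1]])
t_true = np.array([0.05, -0.02, 1.0])

K = np.array([[800., 0., 320.], [0., 800., 240.], [0., 0., 1.]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])  # camera 1 at the origin
b = 0.2                                            # camera 2 offset to the side
Rb = np.array([[np.cos(b), 0, np.sin(b)],
               [0,         1, 0        ],
               [-np.sin(b), 0, np.cos(b)]])
P2 = K @ np.hstack([Rb, np.array([[-0.2], [0.0], [0.1]])])

pts_cam = markers_ct @ R_true.T + t_true           # markers in the camera frame
obs1 = np.array([project(P1, X) for X in pts_cam]) # matched 2D detections, view 1
obs2 = np.array([project(P2, X) for X in pts_cam]) # matched 2D detections, view 2

# triangulate, then register CT markers to the triangulated positions
tri = np.array([triangulate(P1, P2, u1, u2) for u1, u2 in zip(obs1, obs2)])
R_est, t_est = rigid_fit(markers_ct, tri)
```

On noise-free correspondences this recovers the pose exactly; with real pixel noise, a common refinement is to minimize the 2D reprojection error of the markers directly with a nonlinear least-squares solver, which weights the two views by image evidence rather than by intermediate triangulated points.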