Dynamic organ modeling for minimally-invasive cardiac surgery

Stanislaw Szpala 1, Marcin Wierzbicki 1,3, Gerard Guiraudon 2,3,4, Terry Peters 1,2,3
1 Robarts Research Institute, 2 Lawson Health Research Institute, 3 University of Western Ontario, 4 Canadian Surgical Technologies and Advanced Robotics (C-STAR), London, Ontario, Canada
{sszpala, mwierz, tpeters}@imaging.robarts.ca

Keywords: Virtual endoscopy, minimally-invasive cardiac surgery, image guidance, image warping.

ABSTRACT

While most currently available minimally invasive, robotically assisted cardiac surgical systems do not employ 3D image guidance, such support can be generated from preoperative images such as CT. Previously we demonstrated a virtual model of the thorax with simulated surgical instruments, and a pulsating virtual model of the coronary arteries. In this paper we report the overlay of optical endoscopic images of a beating heart phantom with CT-based dynamic volumetric images of the same phantom. Spatial matching is obtained through optical tracking of the endoscope and of the phantom, while temporal synchronization of the displayed model relies on ECG gating. The spatial accuracy between the optical and virtual images ranges from about 0.8 mm to 2.6 mm, while the temporal discrepancy depends on the frame rate at which the virtual model is refreshed, and is typically 50-100 ms. Although the CT-based dynamic images are sufficient for animating the model, artefacts associated with the image registration prevent seamless animation. Instead, to reconstruct the various phases of the cardiac cycle, we used a high-quality semi-static image of the diastolic phase of the phantom and warped it to match the CT-based images corresponding to the other phases.

1. INTRODUCTION

Patient trauma and incision size have been progressively reduced by the medical community, especially after the first laparoscopic cholecystectomy in 1985 [1], but the concept of minimally invasive surgery was not applied to the coronary artery bypass graft (CABG) procedure until almost thirteen years later. Stevenson et al. [2] performed the first totally endoscopic CABG on animals in 1998. The following year, Loulmet et al. [3] and Mohr et al. [4] independently reported the procedure performed on humans, while Kiaii et al. [5] performed the first minimally invasive robotic coronary artery bypass (MIRCAB). This long delay was partially caused by the slow development of reliable telemanipulators, with ARTEMIS (1992) being one of the first capable of operating in six degrees of freedom. The Zeus and da Vinci telerobotic devices are currently the systems most commonly used in surgical practice [6].

It soon became apparent that the limited view of optical endoscopes, the primary intra-operative monitoring tool, as well as the inaccessibility of visually obstructed organs, could be complemented by a virtual environment generated from preoperative data obtained from imaging modalities such as three-dimensional (3D) computed tomography (CT) or MRI. Vining et al. [7] demonstrated that 3D reconstruction of CT images was useful in examining the tracheobronchial tree, while Gulbins et al. [8] showed that 3D imaging was invaluable in planning minimally invasive CABG. At our institution we have developed a virtual cardiac surgical platform using pre- and intra-operative imaging modalities: Chiu et al. [9] demonstrated an overlay of endoscopic images with a CT-reconstructed thorax phantom, and Lehmann et al. [10][11] developed virtual animated coronary arteries derived from 2D bi-plane angiograms. Recently, we superimposed tracked-endoscope images with CT-derived organ models in a neurosurgical context [12], while in other work Shahidi et al.
[13] applied similar methodology to clinical cases. Mourgues et al. [14] developed an image-based refinement procedure for orienting a stereoscopic endoscope with respect to an animal heart.

In this paper we present our results on the fusion of CT-based preoperative images of a beating heart phantom with images from an endoscope that is optically tracked with respect to the phantom. We demonstrate real-time fusion of the endoscopic images with the virtual environment that is robust with respect to the position and orientation of both the endoscope and the phantom, as well as to the phase of the cardiac cycle of the heart phantom. Building on earlier work [15], we report further enhancements to the system, including improved tracking of the endoscope pose (position and orientation) and a new method of animating the virtual model of the beating heart. The new animation warps the 3D image of the diastolic phase to the images of the other phases of the cardiac cycle, and provides visually superior results compared with our previously published animation based on direct images from the cardiac cycle. This work is a continuation of the development of the virtual cardiac surgical planning (VCSP) environment [10], and part of an effort to provide comprehensive imaging support to an intraoperative cardiac surgical assistance environment that employs a telemanipulation system. Our long-term objectives are similar to those of Adhami and Coste-Manière [16].

Medical Imaging 2004: Visualization, Image-Guided Procedures, and Display, edited by Robert L. Galloway, Jr., Proceedings of SPIE Vol. 5367 (SPIE, Bellingham, WA, 2004) · 1605-7422/04/$15 · doi: 10.1117/12.536401
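The spatial matching described above rests on composing the rigid transforms reported by the optical tracker for the endoscope and the phantom with a calibration between the endoscope body and its camera. The following sketch illustrates the general idea of such a transform chain; the frame names, placeholder poses, and the assumption of a known body-to-camera calibration are all illustrative, not the calibration procedure actually used in this work:

```python
import numpy as np

def rigid(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Poses reported by the tracker (placeholder values; these update every frame):
T_tracker_from_scope = rigid(np.eye(3), [100.0, 0.0, 50.0])    # tracker <- endoscope body
T_tracker_from_phantom = rigid(np.eye(3), [120.0, 10.0, 50.0]) # tracker <- phantom

# Assumed body-to-camera calibration: endoscope body <- camera optical frame.
T_scope_from_camera = rigid(np.eye(3), [0.0, 0.0, 5.0])

# The virtual model lives in phantom (CT) coordinates; express it in camera
# coordinates by chaining: camera <- scope <- tracker <- phantom.
T_camera_from_phantom = (np.linalg.inv(T_scope_from_camera)
                         @ np.linalg.inv(T_tracker_from_scope)
                         @ T_tracker_from_phantom)

p_phantom = np.array([0.0, 0.0, 0.0, 1.0])  # a model point, homogeneous coords
p_camera = T_camera_from_phantom @ p_phantom
```

Once a model point is in camera coordinates, the usual perspective projection of the (calibrated) endoscope camera places it in the endoscopic image for overlay.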
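The temporal side of the fusion, ECG gating, amounts to mapping the time elapsed since the last detected R-wave onto one of the reconstructed cardiac phases of the CT model. A minimal sketch of that mapping (the uniform division of the R-R interval and the function name are assumptions for illustration):

```python
def phase_index(t_since_r, rr_interval, n_phases):
    """Map time since the last R-wave to one of n_phases reconstructed CT phases.

    Assumes the phases were reconstructed at uniform fractions of the R-R
    interval, so the current phase is just the elapsed fraction of the cycle
    scaled to the number of phases.
    """
    frac = (t_since_r % rr_interval) / rr_interval  # fraction of cycle elapsed
    return int(frac * n_phases) % n_phases
```

For example, 400 ms after the R-wave in an 800 ms cycle with 10 phases, `phase_index(0.4, 0.8, 10)` selects phase 5. Note that the 50-100 ms temporal discrepancy quoted in the abstract is consistent with such a scheme: if the virtual model is refreshed at a finite frame rate, the displayed phase lags the true cardiac phase by up to one refresh period.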
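The new animation method deforms the high-quality diastolic volume toward the other cardiac phases. A sketch of the final resampling step, assuming a dense backward displacement field (in voxel units) has already been obtained by registering the diastolic image to each phase image; computing that field is the registration problem itself and is not shown:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warp_volume(volume, displacement):
    """Warp a 3D volume with a backward displacement field.

    displacement has shape (3, *volume.shape): for each output voxel it gives
    the offset (in voxels) at which to sample the source volume.
    """
    grid = np.indices(volume.shape).astype(np.float64)
    coords = grid + displacement  # sampling locations in the source volume
    # order=1 gives trilinear interpolation; edge voxels clamp to the border.
    return map_coordinates(volume, coords, order=1, mode='nearest')
```

Animating the heartbeat then reduces to precomputing one displacement field per cardiac phase and warping the single diastolic volume on the fly, so every displayed frame inherits the image quality of the diastolic acquisition.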