Dynamic Data Driven Coupling of Continuous and Discrete Methods for 3D Tracking

Dimitris Metaxas and Gabriel Tsechpenakis

Center for Computational Biomedicine, Imaging and Modeling (CBIM), Computer Science Dept., Rutgers University, 110 Frelinghuysen Rd, Piscataway, NJ 08854
{dnm, gabrielt}@cs.rutgers.edu

Abstract. We present a new framework for robust 3D tracking, using a dynamic data driven coupling of continuous and discrete methods to overcome their respective limitations. Our method relies primarily on continuous-based tracking, which is replaced by the discrete one to obtain model re-initializations when necessary. We use the error in the continuous tracking to learn off-line, based on SVMs, when the continuous-based tracking fails, and switch between the two methods accordingly. We develop a novel discrete method for 3D shape configuration estimation, which utilizes both frame and multi-frame features, taking into account the most recent input frames within a time-window. We thereby overcome the error accumulation over time that most continuous methods suffer from, and simultaneously reduce the discrete method's complexity and prevent possible multiple solutions in shape estimation. We demonstrate the power of our framework on complex hand tracking sequences with large rotations, articulations, lighting changes and occlusions.

1 Introduction

There are generally two major types of approaches to deformable and articulated shape and motion estimation: (i) the continuous ones, which exploit both the static and the temporal information in images, and (ii) the discrete ones, which use only static information, i.e., they estimate the object's configuration based on a single frame. Continuous approaches are usually faster and more accurate than discrete approaches, but when they lose track they cannot easily recover, due to error accumulation. On the other hand, discrete approaches can give a good approximation of an object's configuration without error accumulation over time.
However, they have a high computational cost and are based on searching databases with a limited number of object configurations. In this paper, we introduce a new framework for robust 3D object tracking that achieves both high accuracy and robustness. Focusing on a specific case of tracking,

This research has been funded by an NSF-ITR/NGS-0313134 and an NSF-ITR-[ASE+ECS]-0428231 Collaborative Project to the first author.

V.S. Sunderam et al. (Eds.): ICCS 2005, LNCS 3515, pp. 712–720, 2005. © Springer-Verlag Berlin Heidelberg 2005
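The coupling outlined above (continuous tracking as the primary mode, with a learned failure detector that triggers a discrete re-initialization from a time-window of recent frames) can be sketched as a control loop. This is a minimal illustrative sketch, not the paper's implementation: the names `continuous_step`, `discrete_reinitialize`, and `failure_classifier` are assumptions, and a fixed error threshold stands in for the off-line trained SVM.

```python
# Illustrative sketch of coupling a continuous tracker with a discrete
# re-initializer. All function names and the scalar "state" are hypothetical;
# the threshold classifier is a stand-in for the paper's off-line SVM.
from collections import deque

WINDOW = 5  # time-window of most recent frames used by the discrete method


def failure_classifier(error):
    # Stand-in for the trained SVM: declare failure when the
    # continuous-tracking error exceeds a fixed threshold.
    return error > 0.8


def continuous_step(state, frame):
    # Placeholder continuous update: move the state toward the frame
    # observation and report the residual tracking error.
    new_state = 0.9 * state + 0.1 * frame
    error = abs(new_state - frame)
    return new_state, error


def discrete_reinitialize(window):
    # Placeholder discrete estimator: re-initialize the model from the
    # recent frames only (here, their mean), discarding accumulated drift.
    return sum(window) / len(window)


def track(frames):
    state = frames[0]
    window = deque(maxlen=WINDOW)
    reinit_count = 0
    for frame in frames:
        window.append(frame)
        state, error = continuous_step(state, frame)
        if failure_classifier(error):
            # Continuous tracking has failed: switch to the discrete
            # method for a model re-initialization.
            state = discrete_reinitialize(window)
            reinit_count += 1
    return state, reinit_count
```

On a steady input the continuous tracker runs alone; a sudden jump in the observations drives the error past the classifier's boundary and the discrete re-initialization takes over until the state catches up.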