New dynamic zoom calibration technique for a stereo-vision based multi-view 3D modeling system

Tao Xian, Soon-Yong Park, Murali Subbarao
Dept. of Electrical & Computer Engineering*
State Univ. of New York at Stony Brook, Stony Brook, NY, USA 11794-2350

ABSTRACT

A new technique is proposed for calibrating a 3D modeling system with variable zoom based on multi-view stereo image analysis. The 3D modeling system uses a stereo camera with a variable zoom setting and a turntable for rotating an object. Given an object whose complete 3D model (mesh and texture map) needs to be generated, the object is placed on the turntable and stereo images of it are captured from multiple views by rotating the turntable. Partial 3D models generated from different views are integrated to obtain a complete 3D model of the object. Changing the zoom to accommodate objects of different sizes and at different distances from the stereo camera changes several internal camera parameters, such as the focal length and image center; the parameters of the turntable's rotation axis also change. We present camera calibration techniques for estimating the camera parameters and the rotation axis for different zoom settings. The Perspective Projection Matrices (PPMs) of the cameras are calibrated at a selected set of zoom settings. Each PPM is decomposed into intrinsic parameters, orientation angles, and a translation vector. Camera parameters at an arbitrary intermediate zoom setting are estimated from the nearest calibrated zoom positions through interpolation. A performance evaluation of this technique is presented with experimental results. We also present a refinement technique for stereo rectification that improves partial shape recovery. In addition, the rotation axis for multi-view integration at different zoom settings is estimated without further calibration. Complete 3D models obtained with our techniques are presented.
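As a concrete illustration of the interpolation step described above, the following sketch estimates intrinsic parameters at an arbitrary zoom setting from the nearest calibrated zoom positions. The zoom positions and parameter values are hypothetical, and simple linear interpolation between the two nearest calibrated settings is assumed; the paper's actual interpolation scheme may differ.

```python
import numpy as np

# Hypothetical calibrated zoom settings (motor positions) and the
# corresponding intrinsic parameters recovered from the decomposed PPMs:
# focal length f (in pixels) and image-center coordinate u0.
zoom_positions = np.array([0.0, 20.0, 40.0, 60.0, 80.0])
focal_lengths  = np.array([820.0, 1010.0, 1240.0, 1530.0, 1890.0])
centers_u      = np.array([318.0, 320.5, 323.1, 326.0, 329.4])

def interp_param(zoom, positions, values):
    """Estimate a camera parameter at an arbitrary zoom setting by
    linear interpolation between the two nearest calibrated settings."""
    return float(np.interp(zoom, positions, values))

# Zoom setting 30 lies between the calibrated positions 20 and 40,
# so each parameter is interpolated from those two settings.
f_est  = interp_param(30.0, zoom_positions, focal_lengths)
u0_est = interp_param(30.0, zoom_positions, centers_u)
```

Each intrinsic parameter is interpolated independently; the same routine applies to the image-center coordinate v0, the orientation angles, and the translation components recovered from the PPM decomposition.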
Keywords: Stereo vision, 3D modeling, dynamic zoom calibration, multi-view rotation axis estimation

1. INTRODUCTION

Recent advances in consumer digital cameras have made low-cost 3D modeling systems feasible. Conventional 3D modeling techniques use a fixed zoom setting for 3D reconstruction. For objects at different distances and/or of different sizes, a vision system with variable zoom is critical for 3D modeling. With a fixed zoom setting, the relative positions of the lens components are static; when the zoom setting changes, the camera parameters vary as well. To extend the fixed-zoom camera model to adjustable zoom settings, several algorithms have been presented. Wilson and Shafer [1, 2] introduced an iterative trial-and-error procedure in which four camera parameters are selected: the effective focal length f, the image center (u0, v0), and the translation T3 along the optical axis. A polynomial of degree up to 5 is used to estimate the camera parameters from fixed sampled points. Atienza and Zelinsky [3] extended this calibration technique to gaze detection under the assumption that the orientation of the camera coordinate system remains unchanged during zoom changes. However, when the optical configuration of a vision system changes, this assumption is not valid, and a trial-and-error procedure is needed to determine the critical parameters.

There are several main problems in employing variable/dynamic zoom in 3D modeling. First, many internal camera parameters vary nonlinearly with the zoom setting. Their variations are too complex to be expressed analytically, even for a simple lens system. Second, the relation between the camera coordinate system and the turntable rotation axis

* E-mail: {txian, parksy, murali}@ece.sunysb.edu; Tel: 1 631 632-9149; WWW: www.ece.sunysb.edu/~cvl
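The polynomial estimation of camera parameters described above can be sketched as follows. The zoom positions and focal-length samples are synthetic illustrative values, and a degree-2 fit stands in for the polynomials of degree up to 5 used by Wilson and Shafer.

```python
import numpy as np

# Hypothetical calibration samples: zoom motor positions and the effective
# focal length f (in pixels) measured at each fixed sampled point.
zoom = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0, 60.0, 70.0])
f    = np.array([800.0, 890.0, 1000.0, 1130.0, 1280.0, 1450.0, 1640.0, 1850.0])

# Fit a low-degree polynomial to the sampled focal lengths (Wilson and
# Shafer use polynomials of degree up to 5; degree 2 suffices for this
# synthetic data, which follows f = 800 + 8z + 0.1z^2 exactly).
coeffs  = np.polyfit(zoom, f, deg=2)
f_model = np.poly1d(coeffs)

# Evaluate the model at an unsampled zoom setting.
f_at_35 = float(f_model(35.0))
```

The same least-squares fit is repeated for each of the selected parameters (f, u0, v0, T3) as a function of the zoom setting, so that parameters at unsampled settings can be predicted without recalibration.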