A unified approach for simultaneous and cooperative estimation of defocus blur and spatial shifts

F. Deschênes a,b, D. Ziou a,*, P. Fuchs b

a Département de mathématiques et d'informatique, Faculté des sciences, Université de Sherbrooke, 2500, boul. de l'Université, Sherbrooke, Que., Canada J1K 2R1
b Centre de robotique, École des Mines de Paris, 60, boul. Saint-Michel, 75272 Paris Cedex 06, France

Received in revised form 25 July 2003

Abstract

This paper presents an algorithm for the cooperative and simultaneous estimation of two depth cues: defocus blur and spatial shifts (stereo disparities, two-dimensional (2D) motion, and/or zooming disparities). These cues are estimated from two images of the same scene acquired by a camera evolving in time and/or space and whose intrinsic parameters are known. The algorithm is based on a generalized moment expansion. We show that the more blurred image can be expressed as a function of the partial derivatives of the two images, the blur difference, and the horizontal and vertical shifts. These depth cues can therefore be computed by solving a system of equations. The behavior of the algorithm is studied for constant and linear images, step edges, lines, and junctions. The rules governing the choice of its parameters are then discussed. The proposed algorithm is tested on synthetic and real images. The results obtained are accurate and dense. They confirm that defocus blurs and spatial shifts (stereo disparities, 2D motion, and/or zooming disparities) can be computed simultaneously without using the epipolar geometry. They thus implicitly show that the unified approach allows: (1) blur estimation even if the spatial locations of corresponding pixels do not match perfectly; (2) spatial shift estimation even if some of the intrinsic parameters of the camera have been modified during capture.

© 2003 Elsevier B.V. All rights reserved.
Keywords: Three-dimensional computer vision; Unified model; Feature extraction; Depth from defocus; Blur; Disparity; Motion; Zoom; Moment expansion; S-transform

1. Introduction

In computer vision, the three-dimensional (3D) perception of a real scene allows us to understand the 3D relationships of objects in world space. It has many applications in fields such as medicine, cinema, robotics, the aerospace industry, and remote sensing. 3D perception is generally related to the computation of depth information, which involves the extraction of relevant image features (depth cues) such as shadows, motion, blur, disparity, etc. [29]. Inspired by the 3D perception system of human beings, many researchers suggest that the use of only one depth cue is insufficient and that it is essential to consider complementary sources of information [1,3,10,19,33]. They argue that the limitations associated with the use of a single depth cue can be overcome by taking into account complementary information obtained from additional cues. Our work addresses this problem. Specifically, we are interested in the simultaneous and cooperative estimation of blur differences (depth from defocus) and spatial shifts (stereo disparities, two-dimensional (2D) motion, and/or zooming disparities).

Let us consider $I(\vec{r}(t,p), t, p, g)$, where $\vec{r}(t,p) = (x(t,p), y(t,p))^T$, the images of a real scene obtained by a camera evolving in time $(t)$ and/or space, and for which the values of the extrinsic parameters ($p$: position and orientation) and intrinsic parameters ($g$: aperture, focal length, lens radius, etc.) are known. The horizontal $(d_x)$ and vertical $(d_y)$ spatial shifts in the image plane are defined, respectively, as $d_x = x(t_1, p_1) - x(t_2, p_2)$ and $d_y = y(t_1, p_1) - y(t_2, p_2)$, that is, the distances in $X$ and $Y$ between a pair of corresponding image
doi:10.1016/j.imavis.2003.08.003

Image and Vision Computing 22 (2004) 35–57
www.elsevier.com/locate/imavis

* Corresponding author. Tel.: +1-819-821-8000 x2859; fax: +1-819-821-8200.
E-mail addresses: djemel.ziou@dmi.usherb.ca (D. Ziou); francois.deschenes@dmi.usherb.ca (F. Deschênes); philippe.fuchs@caor.ensmp.fr (P. Fuchs).
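The abstract's central claim, that the more blurred image can be written in terms of the partial derivatives of the sharper image, the blur difference, and the shifts, so that all unknowns follow from one system of equations, can be illustrated with a small numerical sketch. This is our own construction under a simple first-order model, I2 ≈ I1 + d_x ∂I1/∂x + d_y ∂I1/∂y + (Δσ²/2)∇²I1 (a Gaussian-blur approximation), not the paper's generalized moment expansion; all parameter values are arbitrary.

```python
# Illustrative sketch only: jointly recover a 2D shift and a blur difference
# from two images via the first-order model
#   I2 ≈ I1 + dx*dI1/dx + dy*dI1/dy + (dsigma^2 / 2) * Laplacian(I1),
# solved as a single least-squares system over all interior pixels.
import numpy as np
from scipy.ndimage import gaussian_filter, shift, laplace

rng = np.random.default_rng(0)
# Smooth synthetic scene so the first-order expansion holds well.
I1 = gaussian_filter(rng.standard_normal((128, 128)), sigma=4.0)

true_dx, true_dy, true_sigma = 0.6, -0.4, 1.0  # assumed ground truth
# Build I2 so that I2[i, j] = blur(I1)[i + true_dy, j + true_dx]
# (scipy's shift convention: output[i] = input[i - shift]).
I2 = gaussian_filter(shift(I1, (-true_dy, -true_dx), order=3),
                     sigma=true_sigma)

gy, gx = np.gradient(I1)     # derivatives along rows (y) and columns (x)
lap = laplace(I1)            # discrete Laplacian

c = slice(8, -8)             # drop boundaries affected by shift/blur padding
A = np.column_stack([gx[c, c].ravel(),
                     gy[c, c].ravel(),
                     0.5 * lap[c, c].ravel()])
b = (I2 - I1)[c, c].ravel()
(mdx, mdy, dsig2), *_ = np.linalg.lstsq(A, b, rcond=None)

print(f"shift x: {mdx:+.2f} (true {true_dx:+.2f})")
print(f"shift y: {mdy:+.2f} (true {true_dy:+.2f})")
print(f"blur difference sigma^2: {dsig2:.2f} (true {true_sigma**2:.2f})")
```

Because shift and blur are estimated in the same system, neither estimate requires the other to be known beforehand, which is the cooperative behavior the abstract describes; the recovered values are approximate, since the first-order model ignores higher-order terms of both the shift and the blur.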