Further investigations on ToF cameras distance errors and their corrections

D. Falie, V. Buzuloiu
The Image Processing and Analysis Laboratory, Polytechnic University of Bucharest, Romania
dfalie@alpha.imag.pub.ro, buzuloiu@alpha.imag.pub.ro

Abstract—The distance calibration of Time of Flight (ToF) cameras is a difficult task because of the errors produced by multiple reflections of the light, both inside the camera body and outside it. In any room, the active light emitted by the camera is reflected by the walls and by the objects in the scene. The light reflected back by an object is therefore the sum of this indirect light and of the direct light emitted by the camera, and the distance information is affected by the indirect component. The calibration method we propose can be performed not only under laboratory conditions but under any conditions. The distance errors for all objects in the scene can be corrected if white or black tags (labels) are attached to the objects.

I. INTRODUCTION

ToF cameras -- a type of 3D camera dedicated to the various applications where an image of the distances to the objects in the scene is the essential new ingredient -- are succinctly described in [4], [5], [6] and in our companion paper [3] of this conference, and we invite the interested reader to refer to it for a description of the working principle of ToF cameras. Here we just mention that, at this stage of the technology, the distance measurements are affected by very important errors due to multiple reflections of the direct light on external objects as well as inside the camera, and that a main task towards the improvement of these cameras is to analyze this perturbing component of the "active" light arriving at a pixel and to find methods to compensate for it.
These cameras have their own source of light: infrared radiation (produced by an array of LEDs) amplitude modulated at 20 MHz, which is reflected by the objects in the scene and then detected by each pixel detector as an amplitude and a phase (relative to the emitted 20 MHz wave). Thus each pixel outputs, at a given instant, a complex signal

    I_m(i) = A_m(i) · e^{j·φ_m(i)}    (1)

where the index m stands for "measured". If the only component of the incoming light were the direct one, for the point in the scene corresponding to the pixel i in the image:

    I_d(i) = A_d(i) · e^{j·φ_d(i)}    (2)

then the phase φ_d(i) would be in a direct and simple relation to the distance d(i) of that point of the scene to the camera [4]:

    2·d(i) = (c / (2π·f)) · φ_d(i)    (3)

where c is the speed of light and f is the modulation frequency (see also Fig. 1).

Figure 1. The object in the scene is illuminated directly by the modulated light and indirectly by the light reflected by other objects. Inside the camera body the incoming light is reflected by the chip surface to the lens surface and back to the chip.

In any real setting the incoming light has many other components, but only those coming from our "active source" influence the output of the pixel detector; unfortunately, there are such parasite components, produced by multiple reflections (of the light from the active source) on various objects in the scene as well as by multiple reflections inside the camera. We denote these two kinds of parasite components by

    I_r1(i) = A_r1(i) · e^{j·φ_r1(i)}    (4)

and

    I_r2(i) = A_r2(i) · e^{j·φ_r2(i)}    (5)

and we shall put
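The effect of the parasite components on the measured distance can be sketched numerically: the measured signal (1) is the phasor sum of the direct component (2) and the parasites (4), (5), and the phase of that sum, inverted through relation (3), no longer gives the true distance. The sketch below assumes illustrative amplitudes and path lengths for the parasite components; they are not values from the paper.

```python
import cmath
import math

C = 299_792_458.0   # speed of light (m/s)
F_MOD = 20e6        # modulation frequency of the LED array (Hz), per the paper


def phasor(amplitude, distance):
    """Complex signal A·e^{j·φ} for light travelling out and back over `distance`.
    From eq. (3), 2·d = c/(2π·f)·φ, i.e. φ = 4π·f·d / c."""
    phase = 4 * math.pi * F_MOD * distance / C
    return amplitude * cmath.exp(1j * phase)


def distance_from_phase(phase):
    """Invert eq. (3): d = c·φ / (4π·f)."""
    return C * phase / (4 * math.pi * F_MOD)


# Direct component I_d for an object at 2.0 m (eq. 2)
i_d = phasor(1.0, 2.0)
# Parasite components (illustrative amplitudes/paths): scene multi-reflection
# I_r1 travels a longer path; in-camera reflection I_r2 (eqs. 4 and 5)
i_r1 = phasor(0.15, 3.5)
i_r2 = phasor(0.05, 2.2)

# The pixel measures the phasor sum (eq. 1)
i_m = i_d + i_r1 + i_r2

d_true = 2.0
d_meas = distance_from_phase(cmath.phase(i_m))
print(f"true distance:     {d_true:.3f} m")
print(f"measured distance: {d_meas:.3f} m")
print(f"error:             {100 * (d_meas - d_true):.1f} cm")
```

Because the parasite phasors arrive over longer optical paths, the phase of the sum is rotated away from φ_d and the recovered distance is biased by several centimetres, which is the kind of error the calibration method in this paper aims to correct.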