IMAGE-BASED AND INTRINSIC-FREE VISUAL NAVIGATION OF A MOBILE ROBOT DEFINED AS A GLOBAL VISUAL SERVOING TASK

C. Pérez, N. García-Aracil, J.M. Azorín, J.M. Sabater, L. Navarro 2, R. Saltarén 1

1 Departamento de Automática, Electrónica e Informática Industrial, Universidad Politécnica de Madrid
2 Dept. Ingeniería de Sistemas Industriales, Universidad Miguel Hernández, Avd. de la Universidad s/n, Edif. Torreblanca, 03202 Elche, Spain

Keywords: Visual servoing, mobile robot navigation, continuous path control.

Abstract: The new contribution of this paper is the definition of visual navigation as a global visual control task, which raises continuity problems caused by changes in the visibility of image features during navigation. A new smooth task function is proposed, and a continuous control law is obtained by imposing an exponential decrease of this task function to zero. The visual servoing techniques used to carry out the navigation are the image-based and the intrinsic-free approaches. Both are robust to calibration errors, which is very useful since obtaining a good calibration is difficult in this kind of system. Moreover, the intrinsic-free technique makes it possible to control the camera in spite of variations in its intrinsic parameters: the camera zoom can be modified, for instance to capture more detail, while the camera is simultaneously driven to its reference position. An exhaustive set of experiments using virtual-reality worlds that simulate a typical indoor environment has been carried out.

1 INTRODUCTION

The image-based visual servoing approach is now a well-known control framework (Hutchinson et al., 1996). A new visual servoing approach, which makes it possible to control a camera whose intrinsic parameters change, has been published in recent years (Malis and Cipolla, 2000; Malis, 2002c).
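The exponential decrease imposed on the task function corresponds, in plain image-based visual servoing, to the classic law v = -λ L⁺ (s - s*). The following numerical sketch uses the standard textbook interaction matrix for a 2-D image point; it is not the authors' code, and the point coordinates, depths and gain are illustrative choices only:

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Standard interaction matrix of one normalized image point (x, y) at depth Z."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def ibvs_step(s, s_ref, depths, lam=0.5, dt=0.05):
    """One simulated control step; returns the updated feature vector."""
    e = s - s_ref
    L = np.vstack([interaction_matrix(s[2 * i], s[2 * i + 1], depths[i])
                   for i in range(len(depths))])
    v = -lam * np.linalg.pinv(L) @ e   # 6-DOF camera velocity screw
    return s + (L @ v) * dt            # first-order feature evolution

# Three non-collinear points driven back to their reference positions.
s_ref = np.array([-0.1, -0.1, 0.1, -0.1, 0.0, 0.1])
s = s_ref + 0.05                       # perturbed initial features
depths = np.full(3, 1.0)
for _ in range(200):
    s = ibvs_step(s, s_ref, depths)
print(np.linalg.norm(s - s_ref))       # error norm has decayed close to zero
```

With three non-collinear points the stacked matrix is square and generically invertible, so the simulated error decays by the factor (1 - λ dt) at every step, i.e. exponentially.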
In both approaches, the reference image corresponding to a desired position of the robot is generally acquired first (during an off-line step), and some image features are extracted. Features extracted from the initial image, or invariant features computed from them, are used together with those obtained from the desired image to drive the robot back to its reference position. The proposed framework for robot navigation is based on image features pre-recorded during a training walk. The mobile robot then repeats the same walk by means of image-based and intrinsic-free visual servoing techniques. The main contribution of this paper is the definition of visual navigation as a global visual control task; this raises continuity problems caused by changes in the visibility of image features during navigation, and requires the computation of a continuous control law. To our knowledge, the proposed approach to navigation is entirely new in the way it deals with the features that enter or leave the image plane along the path, and similar to some references (Matsumoto et al., 1996) in the way the path to be followed by the robot is specified.

2 AUTONOMOUS NAVIGATION USING VISUAL SERVOING TECHNIQUES

The strategy of the navigation method used in this paper is shown in Figure 1. The key idea of this method is to divide autonomous navigation into two stages: a training step and an autonomous navigation step. During the training step, the robot is commanded by a human (via a radio link or any other interface); at every sample time it acquires an image, computes the features and stores them in memory. Then, starting near its initial position, the robot repeats the same walk using the reference features acquired during the training step.
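The two-stage scheme above can be sketched on a toy one-dimensional "robot" whose single image feature is simply its position. All names here (ToyRobot, training_walk, autonomous_walk) and the gains are illustrative placeholders, not the authors' implementation:

```python
class ToyRobot:
    """Stand-in for the mobile robot: one scalar feature equals its position."""
    def __init__(self, x=0.0):
        self.x = x
    def acquire_feature(self):          # stands in for image capture + extraction
        return self.x
    def send_velocity(self, v, dt=0.1):
        self.x += v * dt

def training_walk(robot, velocities):
    """Stage 1: human-commanded walk; store one reference feature per sample time."""
    path = []
    for v in velocities:
        robot.send_velocity(v)
        path.append(robot.acquire_feature())
    return path

def autonomous_walk(robot, path, lam=2.0, tol=1e-3, max_iter=500):
    """Stage 2: repeat the walk by servoing on each stored reference feature in turn."""
    for s_ref in path:
        for _ in range(max_iter):
            e = robot.acquire_feature() - s_ref
            if abs(e) < tol:
                break                   # close enough: switch to the next waypoint
            robot.send_velocity(-lam * e)   # exponential decrease of the error

path = training_walk(ToyRobot(0.0), [1.0] * 5)   # teach: drive forward 5 samples
follower = ToyRobot(0.05)                        # start near the initial position
autonomous_walk(follower, path)
print(follower.x, path[-1])                      # follower ends at the last taught waypoint
```

Each inner loop multiplies the error by (1 - λ dt) per step, so every stored waypoint is reached with exponentially decreasing error, mirroring the behaviour imposed on the task function in the paper.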
Published in: Proceedings of the Second International Conference on Informatics in Control, Automation and Robotics - Robotics and Automation, pages 189-195. DOI: 10.5220/0001155301890195. Copyright © SciTePress.