Mobile Robot Localization Under Varying Illumination*

Matjaž Jogan, Aleš Leonardis
Faculty of Computer and Information Science, University of Ljubljana, Tržaška 25, 1001 Ljubljana, Slovenia

Horst Wildenauer
Pattern Recognition and Image Processing Group, Institute for Computer Aided Automation, Vienna University of Technology, Favoritenstrasse 9/1832, A-1040 Vienna, Austria

Horst Bischof
Institute for Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 2. OG, A-8010 Graz, Austria

* H. W. was supported by a grant from the Austrian National Fonds zur Förderung der wissenschaftlichen Forschung (P13981INF). M. J. and A. L. acknowledge the support of the Ministry of Education, Science and Sport of the Republic of Slovenia (Research Program 506). H. B. was supported by the K plus Competence Center ADVANCED COMPUTER VISION.

Abstract

Methods for mobile robot localization that use eigenspaces of panoramic snapshots of the environment are in general sensitive to changes in the illumination of the environment. In this paper we therefore propose an approach that achieves reliable localization under severe illumination conditions by means of illumination-insensitive eigenspaces, obtained by gradient filtering of the eigenimages. The method was tested on images captured by a mobile robot and, as we show, it clearly outperforms the other known methods.

1. Introduction

To enable localization of a mobile robot in an outdoor or indoor environment, a number of methods have been proposed that construct an appearance model of the environment by capturing panoramic views of locations with an omnidirectional sensor [1, 3, 9]. The model of appearance is predominantly constructed by compressing the set of visual snapshots captured at different locations using PCA, resulting in the eigenspace representation, which has been used successfully in many areas of computer vision [6, 8]. With this approach, images captured during the learning process are represented as points in a low-dimensional eigenspace spanned by the principal components of the data, the eigenimages. Localization can then be performed by projecting the momentary panoramic view onto the eigenspace, followed by a search for the nearest coefficient vector among those of the training images.

The eigenspace method was mainly used in a straightforward way, classifying target appearances by projection without accounting for possible discrepancies between the learned data and the subsequent images that have to be recognized during the localization phase. This leads to problems when we want the robot to be able to estimate its position in a dynamic environment, with changing configurations of moving objects and persons and with changing illumination conditions. To cope with occlusions caused by objects, a robust algorithm for the calculation of the eigenimage coefficients was proposed [4, 3]. While this method can also tolerate some artifacts that arise from illumination (e.g., specularities or dark shadows), it results in erroneous localization when dealing with global or smooth illumination changes. Some approaches attempt to alleviate the problem of global illumination changes by normalization [5]. In the presence of occlusions, however, such an approach cannot be applied, since normalization is inherently nonrobust. With panoramic images the problem becomes even harder, since they depict 360 degrees of the surroundings and thus integrate several local lighting conditions; such variety clearly cannot be handled by simple normalization.

In this paper we describe a method for mobile robot localization under varying illumination that achieves illumination invariance of the recognition process by convolving the eigenimages with a bank of linear filters.
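To make the idea concrete, the following is a minimal NumPy sketch of eigenspace localization with a filtered eigenspace. It is an illustration, not the paper's implementation: the function names are hypothetical, a single horizontal-gradient filter stands in for the bank of linear filters used in the actual method, and the recovery of coefficients is done by plain least squares rather than the robust procedure of [4, 3]. The key property it exercises is that, since filtering is linear, a filtered query image can be matched against filtered eigenimages while reusing the coefficients of the unfiltered training set.

```python
import numpy as np

def build_eigenspace(train_images, k):
    """PCA over vectorized training snapshots -> mean, k eigenimages, coefficients."""
    X = np.stack([im.ravel() for im in train_images]).astype(float)
    mean = X.mean(axis=0)
    # SVD of the centered data; rows of Vt are the eigenimages
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    eig = Vt[:k]                      # (k, H*W)
    coeffs = (X - mean) @ eig.T      # one coefficient vector per training location
    return mean, eig, coeffs

def gradient_filter(vec, shape):
    """Horizontal gradient: one simple illumination-insensitive filter choice.
    A global brightness offset is constant, so its gradient vanishes."""
    img = vec.reshape(shape)
    g = np.zeros_like(img)
    g[:, 1:] = img[:, 1:] - img[:, :-1]
    return g.ravel()

def localize(query, mean, eig, coeffs, shape):
    """Project the filtered query onto filtered eigenimages; nearest neighbor wins."""
    feig = np.stack([gradient_filter(e, shape) for e in eig])
    q = gradient_filter(query.ravel() - mean, shape)
    # Filtering is linear, so the filtered eigenimages span the filtered
    # (centered) training images with the same coefficients.
    a, *_ = np.linalg.lstsq(feig.T, q, rcond=None)
    d = np.linalg.norm(coeffs - a, axis=1)
    return int(np.argmin(d))
```

For example, a training snapshot with a global brightness offset added still maps onto the coefficients of its original location, because the gradient filter removes the constant offset before projection; the same query projected onto the unfiltered eigenspace would drift toward the mean-brightness direction and could be misclassified.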
As a starting point we use the method that was presented and tested on object recognition by Bischof et al. [2], and modify it to make it applicable to the task of mobile robot localization. As we will demonstrate, we achieve excellent results even under severe illumination conditions. In Section 2 we first briefly review the eigenspace method. Then we show how it is possible to calculate the coef-