C-space evaluation using hierarchical data structures

F.J. Blanco, V. Moreno, B. Curto, R. Therón
jblanco@abedul.usal.es, vmoreno@abedul.usal.es, bcurto@abedul.usal.es, theron@usal.es
Dpto. Computer Science and Automation, Universidad de Salamanca
Plaza de la Merced s/n, Salamanca, SPAIN

Abstract

In this work we present a hierarchical algorithm to evaluate the configuration space of a mobile robot. Although it is based on the discrete convolution of the workspace and the robot, quadtree representations have been introduced in order to obtain a quadtree representation of the C-space directly. Because these representations compact the information highly, the algorithm can be used for workspaces of high dimension and/or resolution. Moreover, since the algorithm is hierarchical, calculations at high resolutions are performed only in those regions where they are necessary.

1 Introduction

This paper addresses the problem of representing the obstacles of the Configuration Space (C-space) in a hierarchical data structure for a mobile robot. The C-space representation is necessary to achieve autonomous operation of robots, where a main concern is to avoid collisions with the obstacles of the robot environment while a task is performed. Collision avoidance, as a task, can be handled more easily in the C-space than in the robot workspace since, in the former, the robot position and orientation are characterized by just one point [10].

In the past, the representation of obstacles in the C-space has been closely related to path planning procedures. Lozano-Pérez proposes carrying out the planning task in two different steps: findspace, where the collision-free robot configurations are found, and findpath, where a sequence of configurations is found to move the robot from a given location to another. This paper focuses on the first step, providing a structure that represents the geometrical constraints imposed on the robot movements by the obstacles.
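The quadtree representation mentioned in the abstract can be sketched as follows: a square region of the occupancy grid is classified as empty, full, or mixed, and only mixed regions are split into four quadrants. This is a minimal illustrative sketch, assuming a simple tuple-based node layout; it is not the paper's exact data structure.

```python
# Illustrative quadtree over a binary occupancy grid. A uniform block
# becomes a single leaf; only non-uniform (MIXED) blocks are subdivided,
# which is what makes the representation compact.
EMPTY, FULL, MIXED = 0, 1, 2

def build_quadtree(grid, x, y, size):
    """Recursively classify the size x size block whose corner is (x, y).

    Returns (label, children): children is None for a uniform leaf,
    otherwise the four quadrant subtrees in NW, NE, SW, SE order.
    """
    vals = {grid[y + j][x + i] for j in range(size) for i in range(size)}
    if vals == {0}:
        return (EMPTY, None)
    if vals == {1}:
        return (FULL, None)
    h = size // 2
    return (MIXED, tuple(build_quadtree(grid, x + dx, y + dy, h)
                         for dy in (0, h) for dx in (0, h)))

# 4x4 grid: the occupied upper-left quadrant compacts into one FULL leaf.
g = [[1, 1, 0, 0],
     [1, 1, 0, 0],
     [0, 0, 0, 0],
     [0, 0, 0, 0]]
tree = build_quadtree(g, 0, 0, 4)
```

Here sixteen cells are stored as one MIXED root with four leaves, and a fully empty grid would collapse to a single leaf regardless of resolution.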
Nevertheless, the resulting C-space can be used in different stages or modules of an autonomous robot system, such as the tracking controller, trajectory generation and so on [11].

Most of the previous works consider a robot as a rigid object that moves freely in a workspace partially occupied by a set of obstacles. The obstacles and the robot are first considered as convex objects, and later the study is extended to the non-convex case ([10, 13]). The computational time depends directly on the shape and number of the objects and, more precisely, on the number of vertices. In a practical implementation, a procedure to detect vertices is needed, so the computational load increases. Moreover, these methods only work with polygonal objects (polyhedral ones in 3-D), and an approximation may be necessary to represent a real workspace.

Other works approach this task not as a geometrical computation but as a convolution product of the robot and the workspace [8]. In these works, real representations of the workspace and the robot are considered, since what they actually compute is the convolution of two functions that represent both. By the convolution theorem, this can be seen as the simple multiplication of the two functions obtained by applying the Fourier transform to the original ones. In practice, to achieve higher speed, the Fast Fourier Transform (FFT) is used to convolve discrete representations of the workspace and the robot. Although this is a great achievement, this way of representing the workspace, the robot, and the C-space requires a large amount of memory when high precision is needed.
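The convolution view of C-space evaluation can be sketched directly, for a robot at one fixed orientation. The following is a minimal pure-Python sketch of the discrete convolution itself; the FFT-based works cited above replace the nested loops with a pointwise product in the Fourier domain. The grids and function name are illustrative assumptions, not the cited papers' implementation.

```python
# Discrete convolution of a binary obstacle map with a robot footprint:
# every position where the shifted footprint overlaps an obstacle cell is
# a colliding configuration of the C-space slice for that orientation.

def cspace_slice(workspace, robot):
    """Return a grid whose nonzero cells mark colliding robot positions.

    workspace: list of lists, 1 where a cell is occupied by an obstacle.
    robot: list of lists, 1 where the robot footprint covers a cell
           (one fixed orientation).
    The result has the full-convolution size (H+rh-1) x (W+rw-1).
    """
    H, W = len(workspace), len(workspace[0])
    rh, rw = len(robot), len(robot[0])
    out = [[0] * (W + rw - 1) for _ in range(H + rh - 1)]
    for i in range(H):
        for j in range(W):
            if not workspace[i][j]:
                continue
            for u in range(rh):
                for v in range(rw):
                    if robot[u][v]:
                        out[i + u][j + v] = 1  # overlap means collision
    return out

# One obstacle cell at (4, 4) and a 3x3 square robot: the C-space obstacle
# is the workspace obstacle "grown" by the robot footprint.
ws = [[0] * 8 for _ in range(8)]
ws[4][4] = 1
robot = [[1] * 3 for _ in range(3)]
cs = cspace_slice(ws, robot)
```

This direct form costs O(HW · rh·rw) per orientation, which is exactly the product the convolution theorem reduces to an O(N log N) FFT multiplication.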
For instance, a typical two-dimensional space of size 10 m × 10 m with a resolution of 1 cm, represented as an array with 10^4 pixels per side and assuming eight bytes¹ to represent the values of the elements in the matrices, requires 10^8 × 8 bytes = 800 Mbytes, which is not a great amount, but for larger spaces or

¹ As the Fourier transform is used, it is necessary to work with complex numbers built from single-precision floats.

1623 — Proceedings of ICAR 2003, The 11th International Conference on Advanced Robotics, Coimbra, Portugal, June 30 - July 3, 2003
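The memory figure above follows directly from the grid size stated in the text:

```python
# Reproducing the estimate: a grid of 10^4 cells per side, with eight
# bytes per element (a complex value built from two single-precision
# floats, as the FFT requires), needs 10^8 x 8 bytes.
cells_per_side = 10 ** 4
bytes_per_cell = 8  # two 4-byte floats per complex entry
total_bytes = cells_per_side ** 2 * bytes_per_cell
print(total_bytes)  # 800000000 bytes = 800 Mbytes
```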