3D Object Following Based on Visual Information for Unmanned Aerial Vehicles

Iván F. Mondragón, Pascual Campoy, Miguel A. Olivares-Mendez, Carol Martinez
Computer Vision Group, Centro de Automática y Robótica (CAR) UPM-CSIC
C. José Gutiérrez Abascal 2, 28006 Madrid, Spain
Email: imondragon@etsii.upm.es Web: www.vision4uav.com

Abstract—This article presents a novel system and control strategy for the visual following of a 3D moving object by an Unmanned Aerial Vehicle (UAV). The presented strategy is based only on the visual information given by an adaptive tracking method based on color information, which, together with the dynamics of a camera fixed to a rotary wing UAV, is used to develop an Image-Based Visual Servoing (IBVS) system. This system is focused on continuously following a 3D moving target object, maintaining it at a fixed distance and centered on the image plane. The algorithm is validated in real flights in outdoor scenarios, showing the robustness of the proposed system against wind perturbations, illumination and weather changes, among others. The obtained results indicate that the proposed algorithm is suitable for complex control tasks, such as object following and pursuit and flying in formation, as well as for indoor navigation.

Keywords- Visual Servoing, UAV, Object Following.

I. INTRODUCTION

Our research interest focuses on developing computer vision techniques to provide UAVs with an additional source of information to perform visually guided tasks; these include tracking and visual servoing, inspection, autonomous object following, pursuit, and flying in formation, among others. Different works have combined a vision system with a range sensor for object following tests. In [1], an omnidirectional visual system is used as a bearing sensor, while the distance to the target is measured using a range sensor, in order to control a robotic wheelchair indoors.
Other systems based only on visual information have been proposed for cooperative robotics [2]. Visual information has also been proposed in aerial robotics for flying in formation [3]. Several approaches have likewise been proposed for fixed wing UAVs flying at constant altitude along circular paths, in order to pursue a moving object on a planar ground surface [4], [5]. In the same way, several approaches have been proposed for rotary wing UAVs following a terrestrial target [6], [7]. Visual servoing has also been successfully implemented on aerial vehicles. Pose-based methods, in which it is necessary to estimate the 3D position, have been employed for applications like autonomous landing on moving objects [8], while image-based methods have been used for positioning [9], generally assuming a fixed distance to the object, which reduces the complexity of the derived controller and removes the need to estimate the reference depth.

This paper presents a real-time flying-object following method based only on visual information, used to generate a Dynamic Look and Move control architecture built on our previous visual control architecture developed for UAVs [10]. Section II presents the flying-object following problem statement. Section III explains how an adaptive color tracking method is used to identify and track the target object on the image plane. This information is then employed to derive an interaction matrix that relates feature changes on the image plane to the dynamics of the camera fixed to a rotary wing UAV, as presented in Section IV. The integration of the developed system on an electric quadcopter UAV is presented in Section V. Finally, Section VI shows the test results of the proposed algorithm running onboard a UAV, validating our approach for an autonomous flying-object following method based on visual information.

II. PROBLEM STATEMENT

Consider a flying object T moving with an unknown trajectory in the world space R^3, and a flying robot O with an attached, fixed, calibrated pinhole camera, both having idealized flying dynamics. The control goal is to command the flying robot to track the target object, keeping it always within the camera field of view (FOV) at a fixed separation distance. Following figure 1 and modeling the target object as a 3D spherical surface, the target is projected onto the camera image plane as a circular region defined by its center of projection x_t = [x_t, y_t]^T and its circumference diameter ø_t. Because the target is an ideally spherical surface, the projection point x_t can be considered the image projection of the sphere's centroid, whose coordinates in the camera frame are defined as X_Tc = [X_Tc, Y_Tc, Z_Tc]^T. The circumference detected on the image plane (with a diameter of ø_t pixels) corresponds to the projection of the sphere perimeter (with a fixed diameter Ø_T) obtained by intersecting the sphere with the plane whose normal is parallel to the vector from the camera optical center to the sphere centroid, and which divides the target into two hemispheres. The projected diameter can also be used to estimate the distance to the target, because it is inversely proportional to the distance from the camera to the object.
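Under this ideal pinhole model, the geometric relations above reduce to a few lines of code. The following minimal Python sketch (not part of the paper; the function name and all numeric values are illustrative assumptions) recovers the depth Z_Tc from the projected diameter via ø_t = f · Ø_T / Z_Tc and then back-projects the centroid to obtain the target position in the camera frame:

```python
# Minimal sketch of the sphere-projection model, assuming an ideal pinhole
# camera with the principal point at the image center. Focal length, sphere
# diameter and pixel measurements below are illustrative, not from the paper.

def target_camera_coords(x_t, y_t, d_px, f_px, D_m):
    """
    x_t, y_t : projected sphere centroid, in pixels, relative to the
               principal point.
    d_px     : projected circle diameter (ø_t) in pixels.
    f_px     : camera focal length in pixels.
    D_m      : real sphere diameter (Ø_T) in metres.
    Returns (X_Tc, Y_Tc, Z_Tc) in metres, in the camera frame.
    """
    # Projected diameter is inversely proportional to depth: ø_t = f * Ø_T / Z
    Z = f_px * D_m / d_px
    # Back-project the centroid with the pinhole equations x = f*X/Z, y = f*Y/Z
    X = x_t * Z / f_px
    Y = y_t * Z / f_px
    return X, Y, Z

# Example: a 0.5 m spherical target seen as a 50-pixel circle, f = 700 px
X, Y, Z = target_camera_coords(40.0, -20.0, 50.0, 700.0, 0.5)
print(X, Y, Z)  # Z = 700 * 0.5 / 50 = 7.0 m
```

Substituting the calibrated focal length (in pixels) and the known target diameter Ø_T yields X_Tc, which is the quantity the visual servoing loop regulates to hold the fixed separation distance.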