Hardware Implementation of a Real Time Lucas and Kanade Optical Flow

N. Roudel, F. Berry, J. Serot
LASMEA
24 avenue des Landais
63177 Aubière, France
roudel,berry,serot@univ-bpclermont.fr

L. Eck
CEA List
8 route du Panorama, BP6
92265 Fontenay-aux-roses, France
Laurent.eck@cea.fr

Abstract

This paper presents an FPGA-based design that applies real-time vision processing, specifically optical flow estimation. The main goal of this work is to embed it in a micro air robot in order to provide critical information for autonomous flight. The motion field is thus one of the dominant pieces of information for the safety of the robot. Based on this motion information, obstacle avoidance, for example, could be added to increase the degree of autonomy of the robot.

1 Introduction

For many years, many projects on the development of autonomous land or air robots (UAV, for Unmanned Aerial Vehicle) have been launched. Several reasons may explain the enthusiasm for these topics. Indeed, specific tasks can be unsafe or even impossible for a human (areas of fighting, nuclear radiation, hazardous areas, ...). However, in spite of the hostile environments, the integrity of the robot must be ensured. For this reason, different strategies of navigation and exploration can be used. In most of these strategies, knowledge of the ego-motion and the measurement of potential moving targets are a keystone of numerous algorithms. Motion evaluation can be done by different kinds of sensors (inertial units, cameras, ...). Using a camera implies the computation of optical flow, which is defined as the pattern of apparent motion of objects, surfaces, and edges in a visual scene caused by the relative motion between an observer and the scene.
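The apparent-motion field described above is, in the Lucas and Kanade method adopted later in this paper, estimated by solving a small least-squares system over a local window around each pixel. The following NumPy sketch illustrates the principle only; it is not the paper's FPGA pipeline, and the function name, window size, and derivative filters are our own illustrative choices.

```python
import numpy as np

def lucas_kanade(I1, I2, x, y, w=7):
    """Estimate the optical flow (u, v) at pixel (x, y) by the
    Lucas-Kanade least-squares fit over a (2w+1) x (2w+1) window."""
    # Spatial derivatives (central differences) and temporal derivative.
    Ix = np.gradient(I1, axis=1)
    Iy = np.gradient(I1, axis=0)
    It = I2 - I1
    # Stack the brightness-constancy equations Ix*u + Iy*v = -It
    # for every pixel of the local window into an overdetermined system.
    win = (slice(y - w, y + w + 1), slice(x - w, x + w + 1))
    A = np.stack([Ix[win].ravel(), Iy[win].ravel()], axis=1)  # N x 2
    b = -It[win].ravel()
    # Least-squares solution of A [u v]^T = b.
    flow, *_ = np.linalg.lstsq(A, b, rcond=None)
    return flow  # array [u, v]

# Synthetic check: a smooth pattern translated by one pixel along x.
ys, xs = np.mgrid[0:64, 0:64].astype(float)
I1 = np.sin(xs / 4.0) + np.cos(ys / 5.0)
I2 = np.sin((xs - 1.0) / 4.0) + np.cos(ys / 5.0)  # true flow: u = 1, v = 0
u, v = lucas_kanade(I1, I2, 32, 32)
```

On this synthetic pair, the estimate recovers a horizontal flow close to one pixel per frame, which is the behavior the on-board motion-field measurement relies on.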
However, the extraction of optical flow has a high computational cost, and the usual strategy to evaluate optical flow with an air robot consists in sending the image stream (via wireless communication) to remote hardware, where the motion is computed. Once the computation is done, a safety strategy is elaborated from this information and the appropriate action is sent back to the robot. This implies that on long-distance flights (considering a static processing base on the ground), loss of communication can occur and the safety of the UAV is not guaranteed. Consequently, the autonomy of the robot's flight is highly limited. On the other hand, a UAV imposes several constraints, such as weight and/or power consumption. So the robot designer has to select the best match between sensing devices, algorithms and processing hardware. In this work, we propose to use a camera associated with an FPGA-embedded optical flow algorithm in order to measure the motion field around the robot.

Some papers dealing with hardware implementations of optical flow computation can be found. [1] and [2] proposed optical flow implementations with a computation speed of about 20-30 FPS for a small image resolution (less than VGA). In our application, this speed is too low for safeguarding the robot. In [3], a high-speed real-time optical flow implementation is presented on a PCI-e card including two FPGAs; this kind of work does not take the embedded aspect into account. Other papers, such as [4] and [5], proposed real-time optical flow estimation based on different algorithms.

The structure of this paper is as follows. In Section 2, the Lucas and Kanade optical flow algorithm is presented and a few considerations on its implementation are proposed. The next sections (3 and 4) introduce the data flow design and propose the hardware implementation of the process.
Finally, experimental results on a realistic image sequence, obtained with an implementation on our smart camera (SeeMos), are given in Section 5.

2 Lucas and Kanade Algorithm

Based on the work of Barron [6], which compares the performance of correlation-, gradient-, energy- and phase-based optical flow extraction methods, the Lucas and Kanade method [7] has been chosen for its non-recursive nature and low computational complexity while providing good accuracy. This method is a "local" method, which is only valid on small