27

AMR Vision System for Perception, Job Detection and Identification in Manufacturing

Sarbari Datta and Ranjit Ray
Robotics and Automation Group, Central Mechanical Engineering Research Institute
India

1. Introduction

Autonomous mobile robots are becoming an integral part of flexible manufacturing systems, especially for material transport, cleaning and assembly. The advantage of this type of robot is that the existing manufacturing environment need not be altered or modified, as it must be for conventional AGVs, which require permanent cable layouts or markers for navigation. These robots are also used extensively for survey, inspection, surveillance, bomb and mine disposal, underwater inspection and space robotics.

For autonomous navigation, proprioceptive and exteroceptive sensors are mounted on these mobile robots. Because proprioceptive sensors measure the kinematic states of the robot, they accrue error over time; they are therefore supplemented by exteroceptive sensors such as ultrasonic and laser range finders, cameras and global positioning systems, which provide knowledge of the local environment that the robot subsequently uses to navigate. Here we describe the vision system of the first indigenous autonomous mobile robot with manipulator, AMR, used for environment perception during navigation and for the job detection and identification required for material handling in a manufacturing environment.

1.1 Autonomous Mobile Robot System (AMR)

The ultimate goal of research on autonomous navigation of mobile robots is to endow these robots with enough practical intelligence that they can relieve or replace human operators in tedious and repetitive tasks; for this reason, manufacturing is one area where mobile robots are becoming a necessity.
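The point made above about sensing, that proprioceptive odometry accrues error over time and must be corrected by exteroceptive measurements, can be sketched as a minimal one-dimensional example. All numbers and the simple weighted-average update below are hypothetical illustrations, not the AMR's actual estimator:

```python
# 1-D illustration of dead-reckoning drift and exteroceptive correction.
# A small wheel-odometry bias accrues error over time; an exteroceptive
# fix (e.g. from a laser range finder) pulls the estimate back.

def dead_reckon(position, velocity, dt, bias):
    """Integrate biased wheel odometry; the error grows with time."""
    return position + (velocity + bias) * dt

def fuse(odometry_estimate, exteroceptive_measurement, gain=0.8):
    """Blend the drifting odometry estimate with an external fix."""
    return odometry_estimate + gain * (exteroceptive_measurement - odometry_estimate)

true_pos, est = 0.0, 0.0
for _ in range(100):                      # 100 steps of 0.1 s at 1 m/s
    true_pos += 1.0 * 0.1
    est = dead_reckon(est, 1.0, 0.1, bias=0.02)   # 2 cm/s odometry bias

drift = abs(est - true_pos)               # error accrued by odometry alone
corrected = fuse(est, true_pos + 0.01)    # external fix with 1 cm noise
print(drift, abs(corrected - true_pos))   # correction shrinks the error
```

Even this crude weighted update reduces the accumulated odometry error by roughly the fusion gain, which is why practical robots pair both sensor classes rather than relying on either alone.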
Among ongoing research on autonomous mobile robots for manufacturing-related applications, the University of Massachusetts Amherst is developing a mobile robot with a comprehensive suite of sensors, including a laser range finder and vision, along with a dexterous manipulator; mobility extends the workspace of the manipulator but poses new challenges by permitting the robot to operate in unstructured environments (Katz et al., 2006). Bundeswehr University Munich is developing vision-guided intelligent robots for automated manufacturing, materials handling and services: its vision-guided mobile robots ATHENE I and II navigate in structured environments based on recognition of the current situation, and a calibration-free manipulator handles various objects using a stereo-vision system (Bischoff & Graefe, 1998).

Source: Vision Systems: Applications, ISBN 978-3-902613-01-1, edited by Goro Obinata and Ashish Dutta, pp. 608, I-Tech, Vienna, Austria, June 2007. Open Access Database: www.i-techonline.com