Robotics and Autonomous Systems 37 (2001) 297–309
Vision-based integrated system for object inspection and handling
Josef Pauli a,∗,1, Arne Schmidt b, Gerald Sommer a

a Christian-Albrechts-Universität zu Kiel, Institut für Informatik und Praktische Mathematik, Preusserstrasse 1-9, D-24105 Kiel, Germany
b Heidelberger Druckmaschinen AG, Center of Expertise Engineering, Dr.-Hell-Strasse, D-24107 Kiel, Germany
Received 16 February 2001; received in revised form 19 July 2001
Communicated by F.C.A. Groen
Abstract
Image-based effector servoing is a process of perception–action cycles for guiding a robot effector under continual visual feedback. This paper applies visual servoing mechanisms not only to object handling, but also to camera calibration and object inspection. A 6-DOF manipulator and a stereo camera head are mounted on separate platforms and steered independently. In the first phase (calibration phase), camera features such as the optical axes and the fields of sharp view are determined. In the second phase (inspection phase), the robot hand carries an object into the field of view of one camera, approaches the camera along its optical axis, rotates the object to reach an optimal view, and finally the object shape is inspected in detail. In the third phase (assembly phase), the system localizes a board containing holes of different shapes, determines the hole that fits the object shape best, and then approaches and aligns the object accordingly. The final object insertion is based on haptic sensors and is not treated in this paper. At present, the robot system can handle cylindrical and cuboid pegs; for other object categories, the system can be extended with more sophisticated strategies in the inspection and/or assembly phase. © 2001 Elsevier Science B.V. All rights reserved.
Keywords: Visual feedback control; Optical axis estimation; Shape inspection; Object assembly
1. Introduction
Image-based robot servoing (visual servoing for short) is the backbone of robot vision systems. The book edited by Hashimoto [8] collects various approaches to the automatic control of mechanical systems using visual sensory feedback. A tutorial introduction to visual servo control of robotic manipulators has been published by Hutchinson et al. [10]. Quite recently, a special issue of the International Journal of Computer Vision has been devoted to image-based robot servoing [9].
∗ Corresponding author. Tel.: +49-431-560484; fax: +49-431-560481. E-mail address: jpa@ks.informatik.uni-kiel.de (J. Pauli).
1 URL: www.ks.informatik.uni-kiel.de/∼jpa/
Frequently, papers on visual servoing treat isolated sub-tasks, e.g. guiding an object to a target location [7]. In contrast, this work demonstrates the usefulness of servoing for a whole spectrum of sub-tasks involved in an overall robotic application. The novelty is to consider servoing as a universal mechanism for camera–robot calibration, active viewing, shape inspection, and object assembly. Furthermore, minimalism principles are followed by extracting just the necessary image information and avoiding 3D reconstruction, which enables real-time usage. For peg-in-hole assembly operations, it is favorable to integrate video and force information [11]. However, this paper focuses on the vision-related sub-tasks of the overall peg-in-hole application, which take place primarily in the run-up phase prior to the actual insertion.
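The perception–action cycle underlying all of these sub-tasks can be illustrated by the classic image-based control law, in which the commanded velocity screw is v = −λ L⁺ (s − s*) for current features s, desired features s*, gain λ, and interaction matrix L. The following is a minimal sketch in Python/NumPy, not the authors' implementation; it assumes point features at a known common depth Z, and the gain value and function names are illustrative:

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """2x6 interaction matrix of one normalized image point (x, y) at depth Z,
    relating the camera velocity screw to the point's image motion."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def ibvs_step(current, desired, Z, gain=0.5):
    """One perception-action cycle: compute the camera velocity screw
    (vx, vy, vz, wx, wy, wz) that drives the feature error toward zero."""
    error = (current - desired).reshape(-1)          # stacked feature error
    L = np.vstack([interaction_matrix(x, y, Z) for x, y in current])
    return -gain * np.linalg.pinv(L) @ error

# Toy example: four coplanar points, each offset from its goal position.
goal = np.array([[0.1, 0.1], [-0.1, 0.1], [-0.1, -0.1], [0.1, -0.1]])
current = goal + 0.05
v = ibvs_step(current, goal, Z=1.0)                  # 6-vector velocity command
```

In a servoing loop this step is repeated after each new image, so calibration errors are compensated by feedback rather than by an explicit 3D model, in keeping with the minimalism principle above.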