A study on the sensitivity of photogrammetric camera calibration and stitching

Jason de Villiers
Council for Scientific and Industrial Research, Pretoria, South Africa
Email: jdvilliers@csir.co.za

Fred Nicolls
University of Cape Town, Cape Town, South Africa
Email: fred.nicolls@uct.ac.za

Abstract—This paper presents a detailed simulation study of an automated robotic photogrammetric camera calibration system. The system's performance was tested for sensitivity to noise in the robot movement, the camera mounting, and the image processing of the light sources. The real-world applicability of the calibrations is assessed by quantifying the accuracy with which they generate a photogrammetrically stitched panorama. It was found that system performance is robust in the presence of noise, with focal length accuracy being a prime determinant of overall calibration accuracy and stitching performance.

I. INTRODUCTION

This paper investigates, by means of simulation, the suitability of using a robot arm to calibrate a camera for use in real-time photogrammetric stitching. Such a calibration procedure would allow automated and adaptable calibration of a variety of cameras [1]. Section I-A provides more detail on robotic camera calibration. Photogrammetric stitching is useful in applications ranging from surveillance to navigation; it is discussed in more detail in Section I-B.

A. Robotic camera calibration

Photogrammetric camera calibration is the determination of camera parameters such that the pixel coordinates corresponding to an object in the world coordinate frame can be found and, conversely, a vector in the world coordinate frame corresponding to a pixel coordinate can be determined. Specifically, the following parameters are sought:

1) Distorted to Undistorted (DU) pixel domain mapping.
2) Undistorted to Distorted (UD) pixel domain mapping.
3) Camera focal length.
4) Pixel dimensions.
5) Pixel skewness.
6) Camera principal point.
7) The 6 Degree of Freedom (DOF) position, hereafter called the pose, of the camera in world coordinates.

It is common to use a checkerboard or another regular grid of lines or circles as an optical reference to calibrate a camera. Examples include the popular Open Computer Vision (OpenCV) [2] and California Institute of Technology [3] calibration toolboxes, as well as numerous academic articles, for example [4], [5], [6], [7], [8], [9], [10].

All of these systems translate poorly to cameras of other sensitivity spectra, and may require multiple targets for cameras of different fields of view (FOV) and resolutions [11]. The calibrations are also not always repeatable, as there is often a human component. Much work has therefore been done to automate the calibration process and make it robust to changes of camera. Examples include Peters et al.'s [12] work on automatic stereo calibration and de Villiers' single-camera system [1], [11]. The latter is applicable to any number of cameras, regardless of the amount of overlap in their FOVs, and provides all the required parameters listed above. The modelling and simulation performed in this work are based on the latter system.

The details of the camera calibration are covered in the patent [1] and are described here only at a high level, to provide context. A robot arm is mounted on an optical table. On the end of the robot arm is mounted a light source (LS), which can be removed and replaced with high precision. This facilitates swapping LSs to match different camera sensitivity spectra. The camera to be calibrated is then placed on a highly repeatable mount looking at the robot and its LS. The robot is commanded through a sequence of discrete positions to trace either a 2D grid or another pattern, depending on the exact calibration parameter being measured. At each point in the movement sequence an image of the LS is captured and processed to find the pixel position of the centre of the LS.
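The patent does not prescribe how the LS centre is localised; one common choice for a bright, roughly symmetric light source is an intensity-weighted centroid over thresholded pixels. The sketch below illustrates that approach; the function name and the `threshold` parameter are illustrative assumptions, not the paper's method.

```python
import numpy as np

def light_source_centre(image, threshold=50.0):
    """Estimate the sub-pixel centre of a bright light source.

    A minimal sketch (assumed method, not the paper's exact one):
    threshold away the background, then take the intensity-weighted
    mean of the remaining pixel coordinates.
    """
    img = np.asarray(image, dtype=np.float64)
    mask = img > threshold          # keep only bright LS pixels
    if not mask.any():
        return None                 # no light source detected
    rows, cols = np.nonzero(mask)
    weights = img[rows, cols]
    # Weighted mean of pixel coordinates gives a sub-pixel centre.
    cy = np.sum(rows * weights) / np.sum(weights)
    cx = np.sum(cols * weights) / np.sum(weights)
    return cx, cy

# Synthetic 2D Gaussian spot centred at column 12, row 8.
yy, xx = np.mgrid[0:32, 0:32]
spot = 255.0 * np.exp(-((xx - 12.0) ** 2 + (yy - 8.0) ** 2) / 8.0)
print(light_source_centre(spot))    # → (12.0, 8.0)
```

Because the synthetic spot is symmetric about an integer pixel, the weighted centroid recovers its centre exactly; for real images, noise in this step is one of the sensitivities studied in the paper.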
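Once the parameters listed in Section I-A are known, a world point can be mapped to pixel coordinates. The sketch below shows that forward mapping under a simple pinhole model; the rotation `R`, translation `t`, focal length in pixels, principal point, and single radial term `k1` are illustrative stand-ins for the calibrated parameters, not the paper's exact camera model.

```python
import numpy as np

def project_point(world_pt, R, t, focal_px, principal, k1=0.0):
    """Project a world point onto a camera's image plane.

    A minimal pinhole-model sketch under assumed parameters:
    R, t give the camera pose, focal_px is the focal length in
    pixels, principal is the principal point, and k1 is a single
    radial distortion coefficient (a stand-in for the UD mapping).
    """
    # World -> camera coordinates.
    pc = R @ (np.asarray(world_pt, dtype=np.float64) - t)
    if pc[2] <= 0:
        return None                     # point is behind the camera
    # Perspective division onto the normalised image plane.
    x, y = pc[0] / pc[2], pc[1] / pc[2]
    # Simple radial lens distortion.
    r2 = x * x + y * y
    x, y = x * (1.0 + k1 * r2), y * (1.0 + k1 * r2)
    # Scale by the focal length and shift by the principal point.
    u = focal_px * x + principal[0]
    v = focal_px * y + principal[1]
    return u, v

# A point 10 m straight ahead of a camera at the origin lands on
# the principal point.
R = np.eye(3)
t = np.zeros(3)
print(project_point([0.0, 0.0, 10.0], R, t, 800.0, (640.0, 360.0)))
# → (640.0, 360.0)
```

The inverse of this mapping, recovering a world direction from a pixel, is what the stitching process in Section I-B exercises.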
This centre position and the pose of the robot are recorded. After completion of the movement sequence, the robot poses and LS pixel positions are processed to determine the camera parameter being measured. Some of the calibrations require the camera to capture a movement sequence from several different mounting locations whose relative poses are known.

B. Photogrammetric stitching

Photogrammetric stitching is the process of creating a panorama from an array of images without using their image content. This is performed by making use of the cameras' photogrammetric parameters, determined by prior calibration of the camera array. Several systems appear to use such a process, including Thales' Gatekeeper [13] and Point Grey's Ladybug family of omnidirectional cameras [14].

Essentially, the stitching is performed by hypothesising a set of points in the real world and then projecting each point onto each camera's focal plane (catering for lensing effects)