Lighting design for machine vision application

Sunil Kumar Kopparapu

Cognitive Systems Research Laboratory, Tata Consultancy Services Limited, Plot No 14, Sector 24, Vashi-Turbe, Navi Mumbai, Maharashtra 400 705, India

Received 19 December 2005; accepted 19 December 2005

Abstract

Low-level image processing is an essential first step in any machine vision application. Low-level vision tasks need good lighting in the work environment to function robustly. Hence, good and uniform illumination from external light sources is essential for machine vision applications. In this paper, we propose a design procedure to obtain uniform illumination of the scene being imaged using several light sources. We pose the problem of determining the optimal positions of the light sources as a minimisation problem. Simulation results show the effectiveness and suitability of the proposed procedure in illuminating the scene uniformly. © 2006 Elsevier B.V. All rights reserved.

Keywords: Lighting design; Machine vision; Optimal light placement; Simulated annealing

1. Introduction

The selection and placement of cameras and light sources is one of the most important steps in creating a successful vision system, because obtaining high-quality images can greatly simplify the vision algorithms and improve their reliability [1]. Low-level image processing tasks such as segmentation are an essential first component of any machine vision application. These low-level vision tasks operate on grey level images and hence perform differently under different lighting conditions. There are two ways of addressing this: (i) make the low-level vision task robust to illumination changes, or (ii) use controlled illumination from an external light source.
For real-time machine vision applications, where time is a major constraint, it is preferable to use an external light source to illuminate the scene rather than spend valuable processor cycles on making the vision algorithm robust to spatial or temporal variations in ambient lighting. External lighting therefore becomes important for machine vision applications, and that lighting must be good and uniform. Non-uniform illumination by an external light source can cause more harm than good and can make the segmentation process fail (Fig. 1). Fig. 1(a) shows the original grey level image captured without any external lighting, and Fig. 1(b) shows the segmented image obtained using the k-means segmentation algorithm [2]. Similarly, Fig. 1(c) and (d) show the grey level image and the segmented image under external lighting. Clearly, the scene in Fig. 1(c) and (d) is segmented poorly (the wall in the scene is partitioned into different segments), and the effect of the external lighting is evident in the form of circles; this segmented image, which forms the basis for further machine vision processing, would cause the subsequent task to perform poorly. On the other hand, the segmentation is good when no external light sources illuminate the scene (Fig. 1(a) and (b)). Observe that the scene under natural lighting is segmented better and correctly (the wall is marked as a single segment) than the same scene subjected to external lighting (compare Fig. 1(b) and (d)). While this is true, it is also true that natural lighting conditions are dynamic and change all the time, and, as discussed earlier, for a time-critical machine vision application it is not feasible to implement algorithms that are robust to lighting conditions without burning valuable computational time. This motivates us to design procedures for positioning external light sources so that the scene is uniformly illuminated.
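To see why an intensity gradient breaks segmentation of this kind, the behaviour can be sketched with a minimal k-means clustering of pixel intensities. This is an illustrative sketch only; the actual implementation in [2] may differ in initialisation, features, and stopping criteria.

```python
import numpy as np

def kmeans_grey_segment(image, k=3, iters=20, seed=0):
    """Segment a grey level image by k-means clustering of pixel intensities.

    Minimal illustration: each pixel is assigned to the nearest of k
    intensity centres, and centres are updated to their cluster means.
    """
    rng = np.random.default_rng(seed)
    pixels = image.reshape(-1).astype(float)
    # Initialise centres with k distinct intensities drawn at random.
    centres = rng.choice(np.unique(pixels), size=k, replace=False)
    for _ in range(iters):
        # Assign each pixel to its nearest centre (1-D Euclidean distance).
        labels = np.argmin(np.abs(pixels[:, None] - centres[None, :]), axis=1)
        # Move each centre to the mean intensity of its cluster.
        for j in range(k):
            if np.any(labels == j):
                centres[j] = pixels[labels == j].mean()
    return labels.reshape(image.shape), centres
```

Because the clustering uses only intensity, a bright spot cast by a badly placed lamp pulls pixels of one physical surface into different clusters, which is exactly the failure visible as circles in Fig. 1(d).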
There has been significant interest in the vision and robotics communities in the more general area of sensor placement ([1,3–7] to cite a few). Mersch [8] gives an overview of machine vision lighting techniques and discusses the most commonly encountered problems in machine vision lighting. The issue of light source placement in particular is discussed in

Image and Vision Computing 24 (2006) 720–726
www.elsevier.com/locate/imavis
0262-8856/$ - see front matter © 2006 Elsevier B.V. All rights reserved.
doi:10.1016/j.imavis.2005.12.016

* Part of the work was done at CSIRO Manufacturing Science and Technology, Queensland Centre for Advanced Technologies, P.O. Box 883, Kenmore, Qld 4069, Australia.
* Tel.: +91 2256163251; fax: +91 2227839926.
E-mail address: sunil.kopparapu@tcs.com (S. Kopparapu).
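The abstract poses light placement as a minimisation problem, and the keywords name simulated annealing as the solver. The following is a hypothetical sketch of that idea, not the paper's formulation: it minimises the variance of illuminance over a grid of scene points under a simple inverse-square point-source model, with an assumed linear cooling schedule and Gaussian position perturbations.

```python
import math
import random

def illuminance(lights, point):
    """Total illuminance at a scene point from point sources (inverse-square law)."""
    total = 0.0
    for (lx, ly, lz) in lights:
        d2 = (lx - point[0])**2 + (ly - point[1])**2 + (lz - point[2])**2
        total += 1.0 / max(d2, 1e-9)
    return total

def uniformity_cost(lights, points):
    """Variance of illuminance over the scene: zero means perfectly uniform."""
    vals = [illuminance(lights, p) for p in points]
    mean = sum(vals) / len(vals)
    return sum((v - mean)**2 for v in vals) / len(vals)

def anneal_lights(points, n_lights=2, steps=2000, t0=1.0, seed=0):
    """Simulated annealing over light (x, y) positions at a fixed height z = 1."""
    rng = random.Random(seed)
    lights = [(rng.uniform(0, 1), rng.uniform(0, 1), 1.0) for _ in range(n_lights)]
    cost = uniformity_cost(lights, points)
    best_lights, best_cost = lights, cost
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-6        # linear cooling schedule
        i = rng.randrange(n_lights)
        x, y, z = lights[i]
        cand = list(lights)
        cand[i] = (x + rng.gauss(0, 0.05), y + rng.gauss(0, 0.05), z)
        c = uniformity_cost(cand, points)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if c < cost or rng.random() < math.exp(-(c - cost) / t):
            lights, cost = cand, c
            if cost < best_cost:
                best_lights, best_cost = lights, cost
    return best_lights, best_cost

# Scene: a 5x5 grid of sample points on the plane z = 0.
scene = [(i / 4, j / 4, 0.0) for i in range(5) for j in range(5)]
best_lights, best_cost = anneal_lights(scene)
```

The cost function, light model, and annealing schedule here are all illustrative assumptions; the paper's own objective and optimisation details are developed in the sections that follow.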