Pattern Recognition 35 (2002) 361–371

Line segment Hausdorff distance on face matching

Yongsheng Gao*, Maylor K.H. Leung
School of Computer Engineering, Nanyang Technological University, Singapore 639798, Singapore

* Corresponding author. Tel.: +65-790-4319; fax: +65-792-6559. E-mail addresses: asysgao@ntu.edu.sg (Y. Gao), asmkleung@ntu.edu.sg (M.K.H. Leung).

Received 17 March 2000; received in revised form 28 November 2000; accepted 28 November 2000

Abstract

A novel concept of line segment Hausdorff distance is proposed in this paper. The Hausdorff distance is commonly applied to measure the similarity of two point sets; it is extended here to match two sets of line segments. The new approach has the advantage of incorporating structural and spatial information into the similarity computation. This added information conceptually provides greater discriminative capability for recognition, strengthening the matching of similar objects such as faces. The proposed technique has been applied to line segments generated from the edge maps of faces, with encouraging results that support the concept experimentally. The results also suggest that line segments can provide sufficient information for face recognition, which may point to a new way of coding and recognizing faces. © 2001 Pattern Recognition Society. Published by Elsevier Science Ltd. All rights reserved.

Keywords: Line segment Hausdorff distance; Hausdorff distance; Line segment; Structure; Disparity; Face recognition

1. Introduction

Psychological studies [1,2] indicated that humans recognize line drawings as quickly and almost as accurately as grey-level pictures. These results suggest that edge images of objects can be used for object recognition and achieve accuracy similar to that of grey-level images. Takács [3] used the Hausdorff distance to measure the similarity of two binary frontal face images, and achieved 92% accuracy in identifying the input.
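The classical Hausdorff distance mentioned above measures how far two point sets are from each other: each point is matched to its nearest neighbour in the other set, and the worst such mismatch is taken. A minimal sketch (the point coordinates are illustrative, not from the paper):

```python
import math

def directed_hausdorff(A, B):
    """Directed Hausdorff distance h(A, B): for each point in A,
    take the distance to its nearest neighbour in B, then keep
    the worst (largest) of these values."""
    return max(min(math.dist(a, b) for b in B) for a in A)

def hausdorff(A, B):
    """Undirected Hausdorff distance H(A, B) = max(h(A, B), h(B, A))."""
    return max(directed_hausdorff(A, B), directed_hausdorff(B, A))

# Two small 2-D point sets as an example.
A = [(0.0, 0.0), (1.0, 0.0)]
B = [(0.0, 0.0), (1.0, 1.0)]
print(hausdorff(A, B))  # 1.0
```

Because it operates on bare points, this measure ignores how points group into contours, which is exactly the limitation the paper's line segment extension addresses.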
He argued that the process of face recognition may start at a much earlier stage, and that edge images can be used for fast screening of faces without the involvement of high-level cognitive functions. Olson and Huttenlocher [4] proposed an oriented Hausdorff distance that makes use of the direction of the gradient to cut down false positives. The orientation information improves recognition performance for dense and complicated object matching. Based on these studies, a novel object recognition approach is proposed here to harness the structural and spatial information of an object's edge map. Edge images have the advantage of demanding less storage space, and they are less sensitive to illumination changes. After thinning the edge map, a polygonal line fitting process is applied to generate the line segment representation of an object. This representation, using dominant points (i.e. the end points of line segments) on the curves, further reduces the storage requirement. The proposed line segment Hausdorff distance (LHD) measure is then employed to match objects. Compared to conventional applications of the Hausdorff distance, LHD has better discriminative power because it makes use of the additional attributes of line orientation and line-point association: matching two lines with a large orientation difference is discouraged, and all the points on one line must match points on a single corresponding line. The proposed technique has been applied to line segments generated from the edge maps of faces for face identification, and encouraging results that support the concept experimentally were obtained. The results also suggest that line segments can provide sufficient information for face recognition, which may point to a new way of coding and recognizing faces. In the following, Section 2 gives a brief introduction to the Hausdorff distance together with a discussion of its weaknesses. In Section 3, a novel line segment Hausdorff
distance for line matching is proposed and described in detail. Encouraging experimental

0031-3203/01/$20.00 © 2001 Pattern Recognition Society. Published by Elsevier Science Ltd. All rights reserved. PII: S0031-3203(01)00049-8