DGPF Tagungsband 22 / 2013 – Dreiländertagung DGPF, OVG, SGPF 481

Co-registration of Time-of-Flight (TOF) camera generated 3d point clouds and thermal infrared images (IR)

LUDWIG HOEGNER 1, MARTIN WEINMANN 2, BORIS JUTZI 2, STEFAN HINZ 2 & UWE STILLA 1

Abstract: This article presents an investigation of the co-registration of the 3d point cloud and intensity image of a near-infrared (NIR) time-of-flight (TOF) camera system with a thermal infrared (TIR) camera. The TIR camera is relatively oriented to the TOF camera. As the radiometric behavior of the TIR camera and the NIR image of the TOF camera differs for most pixels, corners are detected in both images and co-registered. The intensity values of the TIR camera are projected into the TOF point cloud and from there into the TOF camera image plane. The quality of the projection is evaluated by comparing the TIR image projected into the view of the TOF camera with the TOF NIR image.

Zusammenfassung: This article addresses the co-registration of a 3d point cloud and a near-infrared (NIR) image of a time-of-flight (TOF) camera with a thermal infrared (TIR) camera. The TIR camera is relatively oriented to the TOF camera. Since the radiometric properties of the TIR camera and the NIR image of the TOF camera differ for most pixels, corners are detected in both images and co-registered. The intensity values of the TIR camera are projected into the point cloud of the TOF camera and from there into the image plane of the TOF camera. The quality of the projection is verified by overlaying the TIR image projected into the TOF camera with the NIR image of the TOF camera.

1 Motivation

Deriving an appropriate 3D description of man-made and natural environments is of great interest in Computer Vision, Photogrammetry and Remote Sensing. Most current approaches are based on the use of image and/or range data (HARTLEY & ZISSERMAN, 2008).
Sets of images and image sequences can be used to simultaneously calculate the relative orientation of the images and generate 3d point clouds (POLLEFEYS et al., 2008; MAYER, 2007). Image-based methods for deriving 3d point clouds allow the observation and reconstruction of moving objects and provide color information for the 3d points. These methods are, however, limited to textured scenes in which corresponding homologous image points can be found. Range-measuring sensors such as laser scanners overcome this limitation but are in general not capable of capturing moving objects. The combination of a laser scanner and a camera allows the generation of a colored 3d point cloud. By simultaneously capturing high-quality intensity information as well as range information in a single measurement, new active time-of-flight (TOF) sensors seem well-suited for combining the advantages of range measurement and images (WEINMANN et al., 2011). However, the acquired intensity typically represents information of the visual domain, and hence only radiometric and geometric surface properties of the observed objects are captured.

1) Photogrammetrie & Fernerkundung, Technische Universität München (TUM), Arcisstraße 21, 80333 München, www.pf.bv.tum.de
2) Institut für Photogrammetrie und Fernerkundung, Karlsruhe Institute of Technology (KIT), Englerstr. 7, 76131 Karlsruhe, http://www.ipf.kit.edu
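The projection step described in the abstract, mapping TIR intensity values onto the TOF point cloud and back into an image plane via a known relative orientation, amounts to a standard pinhole camera projection. The following is a minimal sketch of that geometric step, not the authors' implementation; the rotation matrix R, translation vector t, and intrinsics f, cx, cy are hypothetical placeholders for the calibrated relative orientation and TIR camera parameters.

```python
def project_point(X, R, t, f, cx, cy):
    """Project a 3D point X (given in the TOF camera frame) into the
    image plane of a second (e.g. TIR) camera.

    R, t : relative orientation of the second camera w.r.t. the TOF
           camera (3x3 rotation as nested lists, 3-vector translation).
    f, cx, cy : focal length in pixels and principal point of the
                second camera (hypothetical values for illustration).
    Returns (u, v) pixel coordinates, or None if the point lies behind
    the camera.
    """
    # Transform the point into the second camera frame: Xc = R * X + t
    Xc = [sum(R[i][j] * X[j] for j in range(3)) + t[i] for i in range(3)]
    if Xc[2] <= 0:
        return None  # behind the camera, no valid projection
    # Pinhole projection onto the image plane
    u = f * Xc[0] / Xc[2] + cx
    v = f * Xc[1] / Xc[2] + cy
    return (u, v)
```

Sampling the TIR image at the returned (u, v) assigns a thermal intensity to each 3D point; re-projecting the colored point cloud into the TOF camera then yields the TIR image as seen from the TOF viewpoint, which is the basis of the evaluation described above.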