Calibration of the optical 3D sensor "Flying Triangulation"

M. Schröter, F. Willomitzer, O. Arold, S. Ettl, G. Häusler
Institute of Optics, Information and Photonics
Friedrich-Alexander University Erlangen-Nürnberg, Germany
mailto:maximilian.schroeter@physik.uni-erlangen.de

The optical 3D measurement principle "Flying Triangulation" enables the acquisition of 3D surface data of complex objects. To obtain metric 3D data with the best possible accuracy, a calibration of the sensor is needed. The current calibration of our sensor is difficult, time-consuming, and requires considerable user interaction, which runs counter to the spirit of our measurement principle. We present a novel calibration method that is easy, fast, and accurate.

1 Introduction

In [1] we introduced the optical 3D measurement principle "Flying Triangulation". It enables an easy, hand-guided, motion-robust measurement of complex objects. Owing to sophisticated registration algorithms, no external tracking is necessary. The sensor principle is freely scalable, which makes it possible to build sensors for a wide range of complex objects (see Fig. 1).

Fig. 1: The Flying Triangulation principle enables a motion-robust 3D acquisition of a wide range of objects [2].

To obtain metric 3D data with the best possible accuracy, a calibration of the sensor is needed. Currently, we apply a model-free calibration method which has several weaknesses: it is time-consuming, requires a considerable amount of user interaction, and relies on expensive tools. The Flying Triangulation measurement process itself, however, is easy, fast, and accurate; our calibration should inherit these properties.

2 Sensor properties

Flying Triangulation is based on an extension of the well-known light-sectioning principle. A pattern of multiple lines is projected onto the object and observed by the camera under the triangulation angle θ (see Fig. 2). In this way, we obtain sparse 3D data along the projected lines for each camera image. While the sensor is moved around the object, a dense 3D point cloud arises after only a few seconds. Since both the projector and the camera are optical systems affected by aberrations, both require a calibration.

Fig. 2: Setup sketch of the Flying Triangulation measurement principle. The camera observes the projected line pattern under the triangulation angle θ.

3 Current calibration method

Fig. 3: Left: current calibration body. Right: positioning of the sensor on a steel bar. The waviness of the marker plate, which reduces the accuracy, is clearly visible.
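As a side note to Sec. 2, the light-sectioning geometry that makes calibration critical can be sketched with the textbook first-order triangulation relation. The notation below (lateral camera magnification β′ and observed line shift Δx′ in the image plane) is ours for illustration and is not taken from the paper; it is an idealized relation, not the exact sensor model:

```latex
% First-order light-sectioning geometry (illustrative sketch):
% a height variation \Delta z of the object surface shifts the
% observed line laterally in the camera image by \Delta x'.
\Delta x' = \beta' \,\Delta z \,\sin\theta
\quad\Longrightarrow\quad
\Delta z = \frac{\Delta x'}{\beta' \sin\theta}
```

This also illustrates why both components need calibration: any aberration-induced deviation of the effective magnification β′ or of the projected line position maps directly into a depth error Δz.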