Validating LiDAR Sensor Surveillance Technology versus Conventional Out-the-Window View for Safety-Critical Airport Operations

Hannes Brassel, Alexander Zouhar and Hartmut Fricke
Chair of Air Transport Technology and Logistics
Technische Universität Dresden
Dresden, Germany
{hannes.brassel, alexander.zouhar, hartmut.fricke}@tu-dresden.de

Abstract—This paper presents a quantitative comparison between established surveillance based on the Out-The-Window View and Light Detection And Ranging (LiDAR) under identical visibility conditions. The aim is to understand how LiDAR technology can potentially improve the situational awareness of the Apron Air Traffic Controller (ATCo). Our work extends previous evaluations of LiDAR technology for apron surveillance by explicitly comparing the visual performance of LiDAR and the Out-The-Window View. ATCo scanning activities on the apron conceptually follow (a) object detection, (b) size estimation, (c) class recognition, and (d) identification, where “size” constitutes an essential distinctive visual feature. Hence, comparing different surveillance techniques reduces to judging the level of performance each technology candidate achieves on these visual tasks. To that end, the presented quantitative comparison relies on performance indicators derived from a field experiment. Our results show that LiDAR outperforms the Out-The-Window View under degraded visibility conditions, which we judge to be the safety-critical setting. During good weather, however, no winner can be identified, specifically for the higher-level vision tasks (c) and (d). We thus conclude that LiDAR is a valuable candidate to significantly enhance the situational awareness of the ATCo, especially during adverse weather conditions, and should be considered as a safety barrier.

Keywords—LiDAR, OTWV, airport ground surveillance, adverse weather, object detection, object size, class recognition, identification, LVO

I. INTRODUCTION

In today’s Air Traffic Management (ATM), weather still constitutes a hardly predictable, uncontrollable, and performance-sensitive factor for both the Aircraft Operator (AO) and the Air Traffic Controller (ATCo). In particular, weather conditions have a significant impact on the safety performance of flight operations both en-route and on the ground [1]. As for airport ground operations, a large number of ATC procedures and advisories still rely on line-of-sight conditions. Consequently, airport ground surveillance largely requires Out-The-Window View (OTWV) capability for the ATCo. Depending on local settings, the OTWV may be supported by, e.g., video cameras (CCTV), Surface Movement Radar (SMR), secondary radar-based Multilateration (MLAT) and/or magnetic field sensors. The OTWV is, however, directly tied to the prevailing weather and lighting conditions, e.g., fog during times of low visibility as defined in European Regulation (EU) No 965/2012 (IR-OPS Annex V Subpart E), for which ATC shall apply Low Visibility Operations (LVO). These operations significantly reduce traffic throughput [2, 3]. The rationale of LVO and other such ATC measures is always to ensure the safety of flight operations at all times. While the resulting capacity backlogs during LVO typically lead to economic losses (e.g., delays or traffic regulated by the Network Manager), a more critical case arises if degraded ATCo situational awareness leads to a reduced ability to recognize conflicts and thus to poor decision making. This hazard was confirmed in several data studies, such as the Aviation Safety Reporting System (ASRS) analysis of ATCo-related safety-relevant occurrences, where 72.4% of all occurrences resulted from the failure to perceive information or were attributable to misperceived information [4]. According to the IATA accident category distribution (2014-2018), over 10% of all aviation accidents in Europe result in ground damage, which is also in line with recent Boeing statistical summaries [5, 6].
The economic damage of a single ramp accident is estimated at $250,000 on average and at $5 billion in total per year [7]. The International Civil Aviation Organization’s (ICAO) concept for Advanced Surface Movement Guidance and Control Systems (A-SMGCS) aims at overcoming the weather/lighting and line-of-sight dependencies of the OTWV in airport ground surveillance [8]. The most recent Eurocontrol Specification for A-SMGCS Services also considers automated controller assistance functions such as, e.g., the detection of conflicts between aircraft or vehicles on or near the runway [9]. Light Detection and Ranging (LiDAR) sensors combined with computer vision algorithms for object recognition appear to be a promising candidate for a cost-effective augmented-reality solution that complements the sole OTWV, functions robustly, and does not rely on the controlled objects being cooperative [10].

A. Problem statement and overview

Airport ground surveillance critically relies on the real-time availability of highly precise and (weather/lighting) robust sensor data with high integrity and continuity levels capturing the local traffic situation on the movement area. It is therefore crucial to quantify to what extent the sensor-based performance