Research Article

Neuromorphic Vision Based Multivehicle Detection and Tracking for Intelligent Transportation System

Guang Chen,1,2 Hu Cao,3 Muhammad Aafaque,2 Jieneng Chen,4 Canbo Ye,1 Florian Röhrbein,2 Jörg Conradt,5 Kai Chen,1 Zhenshan Bing,2 Xingbo Liu,1 Gereon Hinz,2 Walter Stechele,6 and Alois Knoll2

1 College of Automotive Engineering, Tongji University, China
2 Robotics and Embedded Systems, Technische Universität München, Germany
3 State Key Laboratory of Advanced Design and Manufacturing for Vehicle Body, Hunan University, China
4 College of Electronics and Information Engineering, Tongji University, China
5 Department of Computational Science and Technology, KTH Royal Institute of Technology, Sweden
6 Integrated Systems, Technische Universität München, Germany

Correspondence should be addressed to Guang Chen; guang@in.tum.de

Received 10 August 2018; Revised 1 October 2018; Accepted 6 November 2018; Published 2 December 2018

Academic Editor: Krzysztof Okarma

Copyright © 2018 Guang Chen et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

The neuromorphic vision sensor is a new passive, frameless sensing modality with a number of advantages over traditional cameras. Instead of wastefully sending entire images at a fixed frame rate, a neuromorphic vision sensor transmits only the local pixel-level changes caused by movement in a scene, at the time they occur. This results in advantageous characteristics in terms of low energy consumption, high dynamic range, sparse event stream, and low response latency, which can be very useful in intelligent perception systems for modern intelligent transportation systems (ITS) that require efficient wireless data communication and low-power embedded computing resources.
In this paper, we propose the first neuromorphic vision based multivehicle detection and tracking system in ITS. The performance of the system is evaluated with a dataset recorded by a neuromorphic vision sensor mounted on a highway bridge. We performed a preliminary multivehicle tracking-by-clustering study using three classical clustering approaches and four tracking approaches. Our experimental results indicate that, by making full use of the low latency and sparse event stream, we can easily integrate an online tracking-by-clustering system running at a high frame rate, which far exceeds the real-time capabilities of traditional frame-based cameras. If accuracy is prioritized, the tracking task can also be performed robustly at a relatively high rate with different combinations of algorithms. We also provide our dataset and evaluation approaches to serve as the first neuromorphic benchmark in ITS, which we hope can motivate further research on neuromorphic vision sensors for ITS solutions.

1. Introduction

Neuromorphic vision sensors, inspired by biological vision, use an event-driven, frameless approach to capture transients in visual scenes. In contrast to conventional cameras, neuromorphic vision sensors transmit only local pixel-level changes (called "events") caused by movement in a scene at the time of occurrence and provide an information-rich stream of events with a latency within tens of microseconds. To be specific, a single event is a tuple (x, y, t, p), where x, y are the pixel coordinates of the event in 2D space, t is the time-stamp of the event, and p is the polarity of the event, which is the sign of the brightness change (increasing or decreasing). Furthermore, the requirements for data storage and computational resources are drastically reduced due to the sparse nature of the event stream. Apart from the low latency and high storage efficiency, neuromorphic vision sensors also enjoy a high dynamic range of 120 dB.
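The event tuple described above maps naturally onto a small data structure, and the tracking-by-clustering idea can be illustrated by grouping spatially nearby events into vehicle candidates. The sketch below is a hypothetical, greedy stand-in for the classical clustering algorithms the paper evaluates; the `Event` type, `cluster_events` function, and `radius` threshold are illustrative assumptions, not the authors' implementation:

```python
from typing import List, NamedTuple


class Event(NamedTuple):
    """A single neuromorphic-camera event, as defined in the text."""
    x: int    # pixel column in 2D space
    y: int    # pixel row in 2D space
    t: float  # timestamp (microsecond-scale resolution in practice)
    p: int    # polarity: +1 brightness increase, -1 decrease


def cluster_events(events: List[Event], radius: float = 5.0) -> List[List[Event]]:
    """Greedily assign each event to the first cluster whose most recent
    event lies within `radius` pixels; otherwise start a new cluster.
    Each resulting cluster is a crude vehicle-candidate hypothesis."""
    clusters: List[List[Event]] = []
    for ev in events:
        for cluster in clusters:
            last = cluster[-1]
            if (ev.x - last.x) ** 2 + (ev.y - last.y) ** 2 <= radius ** 2:
                cluster.append(ev)
                break
        else:
            clusters.append([ev])
    return clusters


# Two nearby events form one candidate; a distant event forms another.
stream = [Event(10, 10, 0.0, 1), Event(11, 10, 1e-5, 1), Event(100, 100, 2e-5, -1)]
candidates = cluster_events(stream)
```

In a real pipeline the event stream would be windowed in time before clustering, and a more robust density-based method (e.g., DBSCAN) would replace the greedy pass; the sparsity of the stream is what keeps such per-event processing cheap.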
In combination, these properties of neuromorphic vision sensors inspire entirely new designs of intelligent transportation systems. In order

Hindawi Journal of Advanced Transportation, Volume 2018, Article ID 4815383, 13 pages. https://doi.org/10.1155/2018/4815383