Received 10 July 2022, accepted 27 July 2022, date of publication 8 August 2022, date of current version 17 August 2022.

Digital Object Identifier 10.1109/ACCESS.2022.3197665

Multiagent Information Fusion for Connected Driving: A Review

JAMES KLUPACS, AMIRALI KHODADADIAN GOSTAR, THARINDU RATHNAYAKE, IQBAL GONDAL, ALIREZA BAB-HADIASHAR (Senior Member, IEEE), AND REZA HOSEINNEZHAD

Royal Melbourne Institute of Technology, Melbourne, VIC 3000, Australia

Corresponding author: Amirali Khodadadian Gostar (amirali.khodadadian@rmit.edu.au)

This work was supported by the Australian Research Council under Grant DE210101181.

The associate editor coordinating the review of this manuscript and approving it for publication was Byung-Seo Kim.

ABSTRACT This paper reviews the state-of-the-art multi-sensor fusion approaches applicable to next-generation intelligent transportation systems in which connected vehicles are driven cooperatively for maximum safety and efficiency. The review finds that complementary sensor fusion over a time-varying distributed network is required, and that for such applications the state of the art is sensor fusion in the random finite set filtering framework. The fundamental bases of random finite set filters are reviewed, with particular elaboration on the Labeled Multi-Bernoulli filter. An information-theoretic approach to data fusion based on minimizing the information divergence between statistical densities is presented, along with how different divergence functions can be used for sensor fusion. The different approaches are evaluated for tracking performance and computational cost in a realistic simulation scenario, and their advantages and disadvantages for real-time implementation in a connected driving scenario are discussed.

INDEX TERMS Random finite sets, intelligent transport systems, multi-object tracking, information fusion.

I. INTRODUCTION

Connected devices are increasingly making the world around us smarter, safer, and more efficient. The world of transportation and driving is no different. Connected vehicles can help us avoid obstacles, reduce risks on the road, and make the driving experience more enjoyable. Having its own connection to the internet, a connected vehicle shares data with other devices around it. This makes it possible for vehicles connected in a centralized or distributed network to share their sensory information with each other.

Through the integration of all the information received from onboard sensors with that from neighboring vehicles, a connected vehicle can achieve more accurate and comprehensive situational awareness. This can take multiple forms, such as multi-modal information fusion from multiple onboard sensors (e.g., audio sensors and video-capturing devices), or external GPS signals combined with onboard localization devices to obtain more accurate localization results. Finally, information from multiple vehicles may be combined, such as video from opposite ends of a road, to gain a more complete awareness of the surroundings; this is known as complementary fusion and is the main focus of this paper. It effectively contributes to the increased intelligence of the connected vehicle in making trajectory planning and local maneuvering decisions, whether as part of an advanced driver-assist capability or of the vehicle's own self-driving capability.
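To make the gain from integrating estimates concrete, the minimal sketch below fuses two Gaussian position estimates of the same object, one produced onboard the ego vehicle and one received from a neighboring vehicle, using the weighted geometric-mean rule that, for Gaussian densities, minimizes a weighted sum of Kullback-Leibler divergences; this is the basic divergence-based fusion idea reviewed in detail later in the paper. The function name gm_fuse, the equal weights, and all numerical values are illustrative assumptions rather than part of any reviewed method.

import numpy as np

def gm_fuse(means, covs, weights):
    # Weighted geometric-mean fusion of Gaussian estimates (covariance-
    # intersection form): the fused information matrix and information
    # vector are convex combinations of the inputs' information quantities.
    infos = [w * np.linalg.inv(P) for w, P in zip(weights, covs)]
    omega = sum(infos)                               # fused information matrix
    q = sum(I @ m for I, m in zip(infos, means))     # fused information vector
    P_fused = np.linalg.inv(omega)
    return P_fused @ q, P_fused

# Hypothetical 2-D position estimates (metres) of the same object:
m_ego, P_ego = np.array([12.0, 4.0]), np.diag([4.0, 4.0])  # ego vehicle's onboard estimate
m_nbr, P_nbr = np.array([11.2, 4.6]), np.diag([1.0, 9.0])  # estimate received from a neighbor
m_f, P_f = gm_fuse([m_ego, m_nbr], [P_ego, P_nbr], [0.5, 0.5])
print(m_f, np.diag(P_f))  # fused mean and (conservative but consistent) variances

A useful property of this rule is that the fused covariance remains consistent even when the cross-correlation between the two estimates is unknown, which is precisely the situation in a vehicular network where shared estimates may contain common information.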
In this context, efficient multi-sensor data fusion is an intrinsic part of the design of any intelligent transportation system (ITS) that involves connected vehicles. Sensor fusion solutions have attracted strong interest in multi-vehicle applications within the ITS domain [1]. Several methods have been proposed to fuse the information gathered by a relatively large number of sensors in a multi-vehicle network [2]. These include different information fusion techniques and metrics for the main types of network topologies. Although a single vehicle can be configured for 360° sensing through radar, lidar, or cameras that provide information about its immediate surroundings, events outside the sensors' field-of-view (FoV) range cannot be detected.