International Research Journal of Engineering and Technology (IRJET) e-ISSN: 2395-0056
Volume: 07 Issue: 07 | July 2020 www.irjet.net p-ISSN: 2395-0072
© 2020, IRJET | Impact Factor value: 7.529 | ISO 9001:2008 Certified Journal | Page 1073
Multi-Sensor Fusion and Sensor Calibration for Autonomous Vehicles
Smitha Gogineni
Staff Engineer Instrumentation & Controls, Texas, USA
Abstract - The use of autonomous vehicles has grown in recent years, even though their safety record remains in question. While manufacturers are adding more sophisticated sensors and technology to make these vehicles safer, the vehicles still struggle to classify and detect the objects around them. Sensors, combined with automotive software and computers, play a vital role in autonomous driving: they monitor the surroundings, detect obstacles, and allow the automation system to take full control of the vehicle, saving drivers significant time by performing tasks more efficiently and safely and by planning routes and paths autonomously. Although autonomous vehicle technology appears to be developing at a steady pace, no commercially available vehicle has yet reached the level 4 ranking required for road-safe autonomous operation, as contemporary autonomous vehicle (AV) systems face critical obstacles on the road to their primary safety and reliability goals. Substantial technology improvements are still needed to ensure autonomous vehicle safety on the roads. This paper presents current advancements in autonomous driving technologies and points to the performance challenges that still stand in the way of level 5, fully automated driving.
Key Words: Autonomous vehicles, Advanced driver
assistance systems, Autonomous driving, Automotive,
Intelligent vehicles, LIDAR, Sensor Calibration, Sensor Fusion
1. INTRODUCTION
This year was supposed to be a remarkable one for self-driving cars: years ago, all the major autonomous vehicle (AV) makers boldly declared that by now full driving automation would be achieved and the human would hold permanent backseat-driver status. Even with extraordinary efforts from many of the leading automakers, fully autonomous cars remain out of reach, and almost every one of those predictions has been rolled back as the engineering teams at these companies realized the complexity of the target and that progress would be a much more incremental process. Perhaps the major technical hindrance is replicating the human intelligence that enables driving, which was taken for granted as easily transferable to autonomous driving systems, proving the earlier predictions far too optimistic. There remains a clear gap: current levels of vehicle autonomy are still low, and the accountability for supervisory actions, and for operating the vehicle safely, still resides with the driver. Autonomous vehicles promise welfare benefits: they could reduce emissions, increase traffic efficiency, and improve road safety by making accurate driving decisions free of the human infirmities of fatigue, misperception, distraction, and intoxication. As such, there is a strong need to develop autonomous vehicles to SAE level 5 in order to reap the outcome of this autonomous revolution.
2. AUTONOMOUS VEHICLE SENSORS
The levels of driving automation follow the J3016 standard adopted by the international engineering and automotive industry association, the Society of Automotive Engineers (SAE), and by the U.S. Department of Transportation's National Highway Traffic Safety Administration (NHTSA), as follows:
Level 0: Driver only: the human driver controls everything
independently, steering, throttle, brakes, etc.
Level 1: Assisted driving: assistance systems help during
vehicle operation (Cruise Control, ACC) – Year 2000
Level 2: Partial automation: the operator must monitor
the system at all times. At least one system, such as cruise
control or lane centering, is fully automated – Year 2013
Level 3: Conditional automation: the operator monitors
the system and can intervene when necessary. Safety-critical
functions, under certain circumstances, are shifted to the
vehicle – Year 2018
Level 4: High automation: no monitoring by the driver is
required. Vehicles are designed to perform safety-critical
functions and monitor road conditions for an entire trip.
However, these functions do not cover every driving
scenario and are limited to the operational design domain of
the vehicle – Year 2024
Level 5: Full automation: operator-free driving – Year 2030
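The taxonomy above can be sketched as a simple lookup table. This is an illustrative data structure only; the level descriptions are condensed from the text, and the helper function name is a hypothetical choice, not part of the J3016 standard.

```python
# SAE J3016 automation levels, condensed from the descriptions above.
SAE_LEVELS = {
    0: "Driver only: the human driver controls everything",
    1: "Assisted driving: assistance systems help during operation",
    2: "Partial automation: the operator must monitor at all times",
    3: "Conditional automation: the operator intervenes when necessary",
    4: "High automation: no driver monitoring within the design domain",
    5: "Full automation: operator-free driving",
}

def requires_driver_monitoring(level: int) -> bool:
    """At levels 0-3, supervisory responsibility still rests with the driver."""
    return level <= 3

print(requires_driver_monitoring(2))  # True: driver must monitor
print(requires_driver_monitoring(4))  # False: no monitoring required
```

The key boundary for the safety discussion in this paper falls between levels 3 and 4, where supervisory responsibility shifts from the driver to the vehicle.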
2.1 Cameras
The most commonly used primary sensors among the leading driverless-technology developers are video cameras, radar sensors, ultrasonic sensors, and lidar sensors. These sensors are further classified as active or passive: active sensors, such as radar, send out energy in the form of a wave and detect objects based upon the returned signal, while passive sensors, such as cameras, simply receive information from the environment without emitting a wave.
Camera/Image Sensors
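The active/passive classification above can be modeled as a minimal sketch. The class and field names here are assumptions for illustration; only the sensor types and their active/passive assignments come from the text.

```python
from dataclasses import dataclass

@dataclass
class Sensor:
    name: str
    active: bool  # True if the sensor emits a wave to probe the scene

# The four primary AV sensor types and their classification.
SENSORS = [
    Sensor("camera", active=False),     # passive: receives ambient light only
    Sensor("radar", active=True),       # active: emits radio waves
    Sensor("lidar", active=True),       # active: emits laser pulses
    Sensor("ultrasonic", active=True),  # active: emits sound waves
]

active_names = [s.name for s in SENSORS if s.active]
print(active_names)  # ['radar', 'lidar', 'ultrasonic']
```

Note that the camera is the only passive sensor in this set, which is why it depends on ambient illumination while the active sensors can operate in darkness.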
Cars manufactured from 2018 onward already have reverse
cameras and front cameras for lane departure warning