978-3-9810801-8-6/DATE12/©2012 EDAA
Low-power Embedded System for Real-Time
Correction of Fish-Eye Automotive Cameras
Mauro Turturici, Sergio Saponara
Department of Information Engineering
University of Pisa
Via G. Caruso 16, 56122, Pisa (PI), I
Luca Fanucci
Consorzio Pisa Ricerche scarl
Corso Italia 116 56125 Pisa (PI), I
Emilio Franchi
R.I.CO. srl
Via Adriatica 17, 60022, Castelfidardo
(AN), I
Abstract— The design and implementation of a flexible and
cost-effective embedded system for real-time correction of fish-
eye automotive cameras is presented. Nowadays many car
manufacturers have already introduced on-board video systems,
equipped with fish-eye lenses, to provide the driver with a better
view of the so-called blind zones. A fish-eye lens achieves a larger
field of view (FOV) but, on the other hand, causes distortion,
both radial and tangential, of the images projected on the image
sensor. Since radial distortion is noticeable and potentially
dangerous, a real-time system for its correction is presented,
whose low-power, low-cost and flexibility features make it
suitable for automotive applications.
Keywords— Fish-eye camera, video automotive assistance
systems, real-time image processing, distortion correction, radial
distortion, fish-eye lens, blind zones.
I. INTRODUCTION AND REVIEW OF STATE-OF-THE-ART
FISH-EYE CORRECTION SYSTEMS
In recent years the use of cameras in automotive applications
has increased considerably [1][2]. Nowadays many car
manufacturers offer video systems on their vehicles to give the
driver a better view of the so-called “blind spots”. Fish-eye
lenses are commonly used in automotive applications due to
their large FOV but, on the other hand, they suffer from radial
and tangential distortion. Since it is very important to present a
correct view to the driver, the video captured by a camera
equipped with a fish-eye lens must be corrected. Correction of
images affected by fish-eye distortion has been treated in the
literature, but existing solutions mostly refer to off-line
correction of still pictures with software running on a PC. On
the contrary, real-time processing is needed for automotive
driver assistance.
A low-power and low-cost implementation platform is also
required for automotive applications, which are characterized
by a large-volume market and where power efficiency is
becoming a main issue. Few solutions have been proposed for
real-time fish-eye correction. FPGA-based solutions have been
announced by Altera [3] and Xylon [4]. Both are based on
volatile SRAM technology, so an external non-volatile
memory device is needed. Moreover, these solutions implement
just a fixed correction algorithm while, to adapt to different
types of lenses, cameras and displays, a higher level of
flexibility is required. Another solution has been recently
announced by Techwell [5]. Further information about this
solution is not available, but it is known that the system is
based on a proprietary, custom Intersil Image Signal Processor,
specifically designed for Techwell surveillance devices, which
reduces its flexibility for other cameras with different
correction requirements.
To overcome the above issues, this paper presents a low-
cost, flexible and real-time DSP solution for correcting the
video stream captured by cameras equipped with fish-eye
lenses. The paper focuses on the implementation aspects, while
the fish-eye lens theory and the adopted correction algorithms
are discussed and detailed in [6] and [7].
II. FISH-EYE EFFECT CORRECTION
A fish-eye optic can easily reach an angular FOV wider
than 180 degrees but causes two image distortion effects: radial
and tangential. The radial one is the most noticeable and hence
the bottleneck for the successful application of fish-eye
cameras to automotive video systems. Several types of fish-eye
lens exist, each differing from the others in its mapping
function, i.e. the mathematical formula that associates points of
the image sensor with points of the scene. Let R_fish be the
distance between the optical axis (the line that ideally goes
from the center of the scene to the center of the image sensor,
perpendicular to it) and the projected point on the image
sensor, and let θ be the angle between a point in the scene and
the optical axis. An example of mapping function is given by
Eq. 1, where the parameter f is the distance between the
objective and the image sensor.
R_fish = 2f · sin(θ/2)    (Eq. 1)
Since the mapping function of a normal (without any
distortion) lens is given by Eq. 2, it is possible to re-arrange
pixels of the source distorted image to get a new target image
without any noticeable distortion.
R_norm = f · tan(θ)    (Eq. 2)
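As a minimal illustration (a Python sketch for clarity, not the paper's DSP implementation), the two mapping functions of Eq. 1 and Eq. 2 can be written as follows; f and θ are assumed to be in consistent units (e.g. millimeters and radians):

```python
import math

def r_fish(theta, f):
    """Fish-eye mapping (Eq. 1): radius on the sensor of a ray at angle theta."""
    return 2.0 * f * math.sin(theta / 2.0)

def r_norm(theta, f):
    """Distortion-free (rectilinear) mapping (Eq. 2)."""
    return f * math.tan(theta)
```

Note that for the same θ the fish-eye radius is smaller than the rectilinear one (the periphery is compressed, which is the radial distortion), and it stays finite even as θ approaches 180 degrees, where tan(θ) diverges; this is why a fish-eye optic can cover such a wide FOV on a finite sensor.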
Consider a blank image with the same resolution as the
distorted source one. For every target pixel (x_t, y_t) of the
blank image it is possible to compute the coordinates of the
corresponding source pixel (x_s, y_s) by inverting and re-
arranging Eq. 1 and Eq. 2, assuming that tangential distortion
is negligible. The result of this operation is shown in Eq. 3.
(x_s, y_s) = (2f / √(x_t² + y_t²)) · sin( (1/2) · tan⁻¹( √(x_t² + y_t²) / f ) ) · (x_t, y_t)    (Eq. 3)
Further details about this method, called “back-mapping
method”, are given in [7].
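The back-mapping step described above can be sketched as follows (a Python illustration under the assumption of purely radial distortion; pixel coordinates are taken relative to the optical center):

```python
import math

def back_map(xt, yt, f):
    """Back-mapping (Eq. 3): given a target (corrected) pixel (xt, yt),
    return the coordinates (xs, ys) of the pixel to fetch from the
    distorted source image."""
    r_norm = math.hypot(xt, yt)               # radius in the corrected image
    if r_norm == 0.0:
        return 0.0, 0.0                       # the optical axis maps to itself
    theta = math.atan(r_norm / f)             # viewing angle, inverting Eq. 2
    r_fish = 2.0 * f * math.sin(theta / 2.0)  # distorted radius, from Eq. 1
    s = r_fish / r_norm                       # radial scale factor of Eq. 3
    return xt * s, yt * s
```

In a full correction loop, one would scan every target pixel, apply this mapping, and sample the source image at (or interpolate around) the resulting coordinates; since the fish-eye radius is smaller than the rectilinear one, the source point always lies closer to the optical center than the target point.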