Surface Projection for Mixed Pixel Restoration
Robert L. Larkins, Michael J. Cree, Adrian A. Dorrington, John P. Godbaz
Department of Engineering, University of Waikato
Hamilton, New Zealand
Email: RLL6@students.waikato.ac.nz
Abstract—Amplitude modulated full-field range-imagers are
measurement devices that determine the range to an object
simultaneously for each pixel in the scene, but due to the nature of
this operation, they commonly suffer from the significant problem
of mixed pixels. Once mixed pixels are identified, a common
procedure is to remove them from the scene; this solution is not
ideal as the captured point cloud may become damaged. This
paper introduces an alternative approach, in which mixed pixels
are projected onto the surface to which they should belong. This is
achieved by dividing the area around an identified mixed pixel
into two classes. A parametric surface is then fitted to the class
closest to the mixed pixel, and the mixed pixel is then projected
onto this surface. The restoration procedure was tested using
twelve simulated scenes designed to determine its accuracy and
robustness. For these simulated scenes, 93% of the mixed pixels
were restored to the surface to which they belong. This mixed
pixel restoration process is shown to be accurate and robust for
both simulated and real-world scenes, thus providing a reliable
alternative to removing mixed pixels that can be easily adapted
to any mixed pixel detection algorithm.
I. INTRODUCTION
Full-field range-imaging cameras provide the simultaneous
acquisition of range data at each pixel in an image. Full-
field amplitude modulated continuous wave (AMCW) lidar
systems [1] achieve this by illuminating a scene with amplitude
modulated light, and determine the range by measuring the
phase offset between the received light and the transmitted
light. The range to an object at each pixel of the camera is
determined from the phase difference and knowledge of the
speed of light.
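As a concrete sketch of this phase-to-range relationship (my illustration, not code from the paper), the range and the ambiguity distance follow directly from the modulation frequency, which is an assumed free parameter here:

```python
import math

C = 299792458.0  # speed of light (m/s)

def range_from_phase(phase_rad, mod_freq_hz):
    """Convert a measured phase offset to range for an AMCW lidar.

    The modulation travels to the object and back, so the range is
    half the distance implied by the phase delay.
    """
    wavelength = C / mod_freq_hz  # modulation wavelength (m)
    return (phase_rad / (2.0 * math.pi)) * wavelength / 2.0

def ambiguity_distance(mod_freq_hz):
    """Maximum unambiguous range, reached at a full 2*pi phase wrap."""
    return C / (2.0 * mod_freq_hz)
```

For example, at a 30 MHz modulation frequency the ambiguity distance is roughly 5 m, and a phase offset of pi corresponds to half that range.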
The mixed pixel phenomenon occurs in AMCW systems
when the light that a pixel captures is contaminated by multiple
reflections from the scene. The range calculated for a mixed
pixel under the assumption of a single return can erroneously
be anything from zero up to the ambiguity distance [2]. The
nature of mixed pixels is described in greater detail in
Section II below.
There has been little research into identifying mixed pixels
in a range image despite mixed pixels being a significant
source of error. The most common approach reported in the
literature is to identify mixed pixels in the generated point
cloud, as the characteristics of a mixed pixel in a point
cloud tend to differ from those of unmixed pixels. The ability
of three mixed pixel identification algorithms, namely the
normal-angle filter, the edge-length filter and the cone algorithm,
is investigated by Tang et al. [3]. They found that none
of these three algorithms performed exceptionally well, with
the normal-angle method performing the best of the three.
Two simplistic methods of dealing with mixed pixels, namely
identifying isolated points in three-dimensional coordinate
space and median filtering, were mentioned by Hebert and
Krotkov [2], but were not elaborated upon further. Alternative
approaches of detecting and even correcting mixed pixels have
been investigated; these include decomposing the mixed pixels
into their distinct components [4], detecting discontinuities
in the returned signal amplitude [5], and deconvolving the
returned signal to identify the range and intensity of all signal
returns seen by each pixel [6].
Once a mixed pixel is identified, the general approach is
to remove it outright. This deals with the problem of the
mixed pixel, but has potential to introduce other errors, such
as distorting object edges and creating holes in surfaces due
to falsely detected mixed pixels. The method presented in this
paper is designed to restore the mixed pixels to the surface to
which they belong, and has the advantage of not removing points.
This paper specifically focusses on the restoration of mixed
pixels, but the presented technique can also help reposition
points affected by noise by utilising the locations of their
neighbours.
In this paper mixed pixel restoration using surface projection
is achieved via a series of steps. The first step of mixed pixel
identification begins once a point cloud has been produced
from a range imaging camera, and is described in Section II.
Section III details how Otsu thresholding is used to segment
the neighbours of a mixed pixel into two clusters. A parametric
surface is fitted to each class and each point is projected onto
the closest surface. This surface modelling and projection is
described in Section IV. Testing of the mixed pixel restoration
is carried out by simulating a set of scenes that determine the
precision of this process; this is detailed in Section V.
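As a minimal sketch of the surface-fitting and projection step (my illustration under simplifying assumptions: a least-squares plane stands in for the paper's parametric surface, and `numpy` is assumed available):

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through a set of 3-D points.

    Returns (centroid, unit normal); the normal is the right singular
    vector of the centred points with the smallest singular value.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[-1]

def project_onto_plane(point, centroid, normal):
    """Orthogonally project a (mixed) point onto the fitted plane."""
    p = np.asarray(point, dtype=float)
    return p - np.dot(p - centroid, normal) * normal
```

In this simplified picture, the plane would be fitted to the neighbour class closest to the mixed pixel, and the mixed point projected onto it; Section IV describes the surface model actually used.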
II. MIXED PIXELS AND THEIR IDENTIFICATION
Mixed pixels are a significant problem prevalent in full-field
AMCW lidar systems which use modulated light to illuminate
a set of objects in a scene. Each pixel of the camera sensor
captures a piece of the returned light, and by determining
the phase of the captured light with respect to the reference
modulation the distance to the area viewed by the pixel is
determined. Mixed pixels occur when the sensor picks up light
that has been reflected back from two or more objects in the
scene. This occurs, for example, when a single pixel sees the
boundary of two adjoining objects at different ranges.
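To illustrate how multiple returns corrupt the measured phase (a sketch under the single-frequency AMCW model, not code from the paper), each return can be treated as a complex phasor; the pixel effectively reports the phase of their sum:

```python
import cmath
import math

def mixed_pixel_phase(returns):
    """Combine multiple AMCW returns as complex phasors.

    returns: list of (amplitude, phase_rad) pairs, one per surface
    seen by the pixel. A single-return model reports the phase of the
    summed phasor, which for a mixed pixel matches neither surface.
    """
    total = sum(a * cmath.exp(1j * p) for a, p in returns)
    return abs(total), cmath.phase(total) % (2.0 * math.pi)
```

For two equal-amplitude returns the reported phase lies midway between the component phases (possibly offset by half a wrap), so the point appears to float between the two surfaces; with unequal amplitudes or widely separated phases, the erroneous range can land anywhere up to the ambiguity distance, as noted in [2].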
The capture of a scene produces points in a spherical
coordinate system centred on the camera, thus the pixel sees
24th International Conference Image and Vision Computing New Zealand (IVCNZ 2009)