TUNING ASYMBOOST CASCADES IMPROVES FACE DETECTION
I. Visentini, C. Micheloni, G.L. Foresti
University of Udine
Department of Computer Science
Via delle Scienze 206, 33100 Udine, ITALY
visentin, michelon, foresti@dimi.uniud.it
ABSTRACT
The face detection problem is certainly one of the most studied topics in artificial vision. This interest arises from the awareness that face localization is a crucial step for every system that uses biometric information. Video surveillance and security systems, biometrics, HCI and multimedia applications are some examples of systems that exploit face localization to improve their robustness. AdaBoost and AsymBoost based classifiers are widely used to achieve high performance while saving computational time. In this paper, a new reactive strategy to build a strong classifier cascade is provided; at each stage of the cascade a different tradeoff between accuracy and computational complexity is explored. The results show that this method is effective, and we propose a way to construct a rapid and robust multipose detector.
Index Terms— AdaBoost, AsymBoost, Boosting, Reac-
tive learning, Face detection
1. INTRODUCTION
In recent years, the face detection field has been deeply investigated. Its wide range of possible applications has generated increasing interest. In fact, a face detector is a fundamental step in many systems that use biometric information, such as video surveillance, security, Human-Computer Interaction, games and multimedia, and face and facial expression recognition.
When an object has to be detected in real and complex scenarios, the problem becomes a pattern recognition task. Boosting techniques [1, 2, 3] have proved to be very efficient at handling this problem, especially in the case of face detection. The growing interest was driven by the need for a robust detector that cuts off false negatives (missed faces) while keeping false positives (false alarms) low.
AdaBoost [4] was the first implementation of a Boosting algorithm, but unfortunately it did not address the disparity between the positive samples (i.e., faces) and negative samples (i.e., non-faces) available in real images. Other considerable developments of the boosting idea are RealAdaBoost [5], in which the confidence is a real value, FloatBoost [6], which uses a heuristic to backtrack and discard weak hypotheses, and AsymBoost [3], which introduces a mechanism able to asymmetrically weight the samples of the two classes. The face detection problem applied to a multipose case is presented in [7], where a rapid multiface detector is shown to be effective and capable of handling several pose variations.
The novelty of our solution lies in a strategy that exploits the asymmetry of boosting problems. The asymmetry parameter is tuned during the learning rounds as a function of the False Positive (FP) rate achieved so far, allowing an active reaction. We further developed this strategy to apply it also in the context of a cascade of classifiers. We will show how, by using an automated tuning strategy during both the learning of the single classifiers and the construction of the cascade, the number of False Negatives can be significantly reduced without affecting the performance of the overall system.
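To give a concrete flavor of this reactive idea, the following minimal sketch adjusts an AsymBoost-style asymmetry parameter k between learning rounds from the FP rate measured so far. The update rule, the target rate, the gain, and the clamping bounds are illustrative assumptions, not the rule developed in this paper.

```python
import math

# Hypothetical reactive update of the asymmetry parameter k (illustrative,
# not the authors' exact rule). A FP rate above the target lowers k, pushing
# the learner toward rejecting negatives; a FP rate below the target raises
# k, protecting the positive class (faces).
def tune_asymmetry(k, fp_rate, target_fp=0.01, gain=0.5, k_min=1.0, k_max=8.0):
    """Return the asymmetry parameter to use in the next learning round."""
    # multiplicative update proportional to the log-distance from the target
    k_new = k * math.exp(-gain * math.log(max(fp_rate, 1e-9) / target_fp))
    return min(max(k_new, k_min), k_max)
```

The clamping keeps the parameter in a sensible range so that a single noisy FP estimate cannot swing the training regime too far in one round.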
In the remainder of the paper, Section 2 gives a short description of boosting techniques, with particular attention paid to the AdaBoost and AsymBoost algorithms and their use in cascades. In Section 3, we present the new concept of reactive control of the asymmetry during the learning levels. Finally, in Section 5 we show how the proposed strategy works in the multipose case, ensuring good performance while saving computational effort.
2. BOOSTING, ADABOOST AND ASYMBOOST
Boosting algorithms are iterative procedures which produce a linear combination of simple hypotheses $h_1, \ldots, h_T$ to generate a robust ensemble [1]

H(x) = \mathrm{sign}\left( \sum_{t=1}^{T} \alpha_t h_t(x) \right)    (1)

where each hypothesis $h_t$ is slightly better than random guessing, and the $\alpha_t$ are the coefficients of the linear combination.
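To make the weighted vote of Eq. (1) concrete, here is a minimal, self-contained sketch of discrete AdaBoost using threshold stumps as weak hypotheses. The dataset, function names, and stump form are illustrative assumptions, not taken from the paper.

```python
import math

# Weak hypothesis h_t: a threshold stump returning +1 or -1 (illustrative).
def stump(theta, polarity):
    return lambda x: polarity * (1 if x > theta else -1)

def adaboost(xs, ys, thresholds, T):
    """Discrete AdaBoost: T rounds of stump selection over 1-D samples."""
    n = len(xs)
    w = [1.0 / n] * n                      # uniform initial sample weights
    ensemble = []                          # list of (alpha_t, h_t) pairs
    for _ in range(T):
        # pick the stump with the lowest weighted error on the current weights
        best = None
        for theta in thresholds:
            for pol in (+1, -1):
                h = stump(theta, pol)
                err = sum(wi for wi, x, y in zip(w, xs, ys) if h(x) != y)
                if best is None or err < best[0]:
                    best = (err, h)
        err, h = best
        err = max(err, 1e-10)              # guard against a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, h))
        # re-weight: misclassified samples gain weight, then renormalize
        w = [wi * math.exp(-alpha * y * h(x)) for wi, x, y in zip(w, xs, ys)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def H(ensemble, x):
    """Strong classifier of Eq. (1): sign of the weighted vote."""
    s = sum(alpha * h(x) for alpha, h in ensemble)
    return 1 if s >= 0 else -1
```

AsymBoost follows the same skeleton but additionally multiplies the weights of one class by an asymmetry factor at every round, biasing the selection toward hypotheses that protect that class.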
Such an iterative batch learning algorithm is based on two
main ideas:
IV - 477 1-4244-1437-7/07/$20.00 ©2007 IEEE ICIP 2007