Signal Processing
journal homepage: www.elsevier.com/locate/sigpro
Optimal constraint vectors for set-membership affine projection algorithms☆

Wallace A. Martins a,⁎, Markus V.S. Lima a, Paulo S.R. Diniz a, Tadeu N. Ferreira b

a DEL–DEE/Poli & PEE/COPPE, Federal University of Rio de Janeiro (UFRJ), Rio de Janeiro, Brazil
b Fluminense Federal University, Rio de Janeiro, Brazil
ARTICLE INFO

Keywords: Adaptive signal processing; Set-membership filtering; Affine projection; Convex optimization
ABSTRACT
There is a growing interest in adaptive filtering solutions whose learning processes are data selective, bringing about computational reduction and energy savings while improving estimation accuracy. The set-membership affine projection algorithms are a representative family of algorithms including data-selection mechanisms. The update process of these algorithms depends on the choice of a constraint vector (CV) which, up to now, has been based on heuristics. In this paper we propose an optimal CV and discuss some of its inherent properties. The resulting problem falls into a convex optimization framework, allowing some unexpected features to surface; for instance, the widely used simple choice CV is asymptotically optimal for statistically white stationary inputs. Simulations indicate that the optimal CV outperforms the simple choice CV in terms of update rates and steady-state mean squared error for statistically colored inputs.
1. Introduction
The set-membership (SM) algorithms [1–23] rely on the concept of
set-membership filtering (SMF), which allows the reduction of compu-
tational burden by updating the filter coefficients only in the cases
where the error is greater than a prescribed threshold, i.e., the
innovation in the observed data is checked before the data are used
in the learning process. This SMF property, known as data selection, is
also responsible for the robustness against noise of the SM algorithms.
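The data-selection mechanism described above can be illustrated with a minimal sketch of a set-membership update in the style of SM-NLMS. This is a hedged illustration under our own notation, not the paper's exact formulation: the function name and the regularization constant `delta` are assumptions introduced here for numerical safety.

```python
import numpy as np

def sm_nlms_update(w, x, d, gamma_bar, delta=1e-8):
    """One SM-NLMS-style iteration: the coefficients are updated only
    when the a priori error magnitude exceeds the prescribed threshold
    gamma_bar (the data-selection / innovation check)."""
    e = d - w @ x                       # a priori error
    if abs(e) > gamma_bar:              # innovation check: is the datum informative?
        mu = 1.0 - gamma_bar / abs(e)   # data-dependent step size
        w = w + mu * e * x / (x @ x + delta)
    return w
```

Note that when the update fires, the step size is chosen so the a posteriori error magnitude lands (approximately) on the threshold boundary, which is the geometric intuition behind set-membership filtering.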
The set-membership affine projection (SM-AP) algorithm is an
interesting alternative to the AP algorithm [24], for it combines data
reuse [24–30] with data selection, thus yielding a computationally
efficient algorithm with low steady-state mean squared error (MSE)
and high convergence speed. The SM-AP generalizes many algorithms,
including the set-membership normalized least mean square (SM-
NLMS) [5], the set-membership binormalized LMS (SM-BNLMS) [8],
and their non-SM counterparts with unit convergence factor [28].
The only difference between the cost functions defining the update rules of the SM-AP and AP algorithms is that the SM-AP does not impose a null a posteriori error constraint, whereas the AP does. In fact, the SM-AP constrains the a posteriori error to be equal to the so-called constraint vector (CV), which guarantees that the a posteriori error stays within the aforementioned prescribed threshold. This property, along with the input-innovation check, is the key aspect fostering the success of the SM-AP algorithm across different applications.
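A compact sketch of an SM-AP-style recursion with a generic CV may help fix ideas. This is our own illustrative code, not the paper's notation: the shapes, the innovation check on the newest error entry, and the regularization constant `delta` are assumptions made here.

```python
import numpy as np

def sm_ap_update(w, X, d, gamma, gamma_bar, delta=1e-8):
    """One SM-AP-style iteration (illustrative sketch).
    X: N x (L+1) matrix whose columns are the L+1 most recent input vectors,
    d: the corresponding desired samples,
    gamma: constraint vector with |gamma_i| <= gamma_bar."""
    e = d - X.T @ w                             # a priori error vector
    if abs(e[0]) > gamma_bar:                   # innovation check on the newest datum
        A = X.T @ X + delta * np.eye(len(d))    # regularized Gram matrix
        w = w + X @ np.linalg.solve(A, e - gamma)
    return w
```

After an update, the a posteriori error vector d - X.T @ w equals gamma up to the regularization, which is exactly the CV constraint the text describes.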
Although CVs play a central role in SM-AP algorithms, all previous
solutions employ heuristically defined CVs, such as the so-called simple
choice CV (SC-CV) [7,8]. Besides yielding good performance, these heuristic CVs have reasonable geometrical justifications (see, for instance, [31]), which partly explains why they have been deemed satisfactory and why optimal CV solutions have not been pursued. Nonetheless, the question remains open: do optimal CVs exist, and how do they relate to the most commonly used heuristic solution, the SC-CV? This work answers these questions by recasting the problem in a convex optimization framework.
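For concreteness, the SC-CV heuristic as commonly described in the SM-AP literature can be sketched as follows. This is a hedged illustration; the function name is ours, and the construction reflects the usual description (newest entry on the threshold boundary, remaining entries unchanged), which should be checked against the paper's own definition.

```python
import numpy as np

def simple_choice_cv(e, gamma_bar):
    """Simple choice CV (SC-CV): force the newest a priori error onto the
    threshold boundary and keep the remaining entries unchanged, so the
    corresponding a posteriori errors equal the a priori ones."""
    gamma = e.copy()
    gamma[0] = gamma_bar * np.sign(e[0])
    return gamma
```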
1.1. Main contributions
• It is shown how to solve the original SM-AP optimization problem. In fact, all previous works yield suboptimal solutions in general setups; the key advantage of those suboptimal solutions is their computational simplicity regarding the update process.
• It is shown, through geometric and formal algebraic arguments, when the widely used SC-CV coincides with the optimal CV. A key result is that the SC-CV is asymptotically optimal for statistically white inputs.
http://dx.doi.org/10.1016/j.sigpro.2016.11.025
Received 5 August 2016; Received in revised form 26 October 2016; Accepted 29 November 2016
☆ This work was supported by Faperj (E-26/201.390/2014, E-26/202.890/2015, and E-26/010.001573/2016), Capes (23038.009440/2012-42), and CNPq (304112/2013-5, 453868/2014-2, and 311153/2015-1), Brazilian Research Councils.
⁎ Corresponding author.
E-mail addresses: wallace.martins@smt.ufrj.br (W.A. Martins), markus.lima@ieee.org (M.V.S. Lima), diniz@smt.ufrj.br (P.S.R. Diniz), tadeu_ferreira@id.uff.br (T.N. Ferreira).
Signal Processing 134 (2017) 285–294
Available online 02 December 2016
0165-1684/ © 2016 Elsevier B.V. All rights reserved.