Title
A comparative study of two popular families of sparsity-aware adaptive filters
Abstract
In this paper, we review two families of sparsity-aware adaptive filters. Proportionate-type NLMS filters attempt to accelerate convergence by assigning each filter weight an individual gain that depends on its current magnitude. Sparsity-norm regularized filters penalize the cost function minimized by the filter with sparsity-promoting norms (such as ℓ0 or ℓ1) and derive new stochastic gradient descent rules from the regularized cost function. We compare both families of algorithms in terms of computational complexity and of how well they handle the tradeoff between convergence speed and steady-state error. We conclude that sparsity-norm regularized filters are computationally less expensive and can achieve a better tradeoff, making them in principle more attractive. However, selecting the strength of the regularization term appears to be critical to the good performance of these filters.
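For concreteness, the two update rules the abstract contrasts can be sketched as below. This is a minimal illustration assuming the standard PNLMS gain rule and the zero-attracting (ℓ1-regularized) LMS update from the wider literature, not the paper's own implementation; the step sizes mu, the gain parameters rho and delta, and the regularization strength gamma are hypothetical tuning values.

    import numpy as np

    def pnlms_update(w, x, d, mu=0.5, rho=0.01, delta=0.01, eps=1e-6):
        # Proportionate-type NLMS step: each weight gets a gain proportional
        # to its current magnitude, floored so inactive taps keep adapting.
        e = d - w @ x                                 # a priori output error
        g = np.maximum(rho * max(delta, np.max(np.abs(w))), np.abs(w))
        g /= np.mean(g)                               # normalize gains to unit mean
        return w + mu * e * g * x / (x @ (g * x) + eps)

    def za_lms_update(w, x, d, mu=0.005, gamma=5e-5):
        # Zero-attracting LMS: the stochastic gradient of the l1-regularized
        # cost adds a shrinkage term -gamma * sign(w) to the plain LMS step.
        e = d - w @ x
        return w + mu * e * x - gamma * np.sign(w)

    # Toy sparse system identification (i.i.d. white regressors are used
    # instead of a tapped-delay line to keep the sketch short).
    rng = np.random.default_rng(0)
    L = 64
    h = np.zeros(L)
    h[[3, 20, 45]] = [1.0, -0.5, 0.3]                 # sparse unknown system
    w_p, w_z = np.zeros(L), np.zeros(L)
    for _ in range(5000):
        x = rng.standard_normal(L)
        d = h @ x + 1e-3 * rng.standard_normal()      # noisy desired signal
        w_p = pnlms_update(w_p, x, d)
        w_z = za_lms_update(w_z, x, d)
    print(np.linalg.norm(w_p - h), np.linalg.norm(w_z - h))

The per-iteration cost difference the abstract alludes to is visible here: the zero-attracting step adds only a sign computation to LMS, while the proportionate step computes and normalizes a full gain vector every iteration.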
Year
2014
DOI
10.1109/CIP.2014.6844507
Venue
CIP
Keywords
adaptive filters,computational complexity,convergence of numerical methods,gradient methods,least mean squares methods,convergence,normalized least mean squares,proportionate-type NLMS filters,regularized cost function,sparsity-aware adaptive filters,sparsity-norm regularized filters,sparsity-promoting norms,steady-state error tradeoff,stochastic gradient descent rules,ℓ1 regularization,sparse system identification,proportionate adaptive filters,algorithm design and analysis,adaptive systems,cost function,signal to noise ratio,steady state
Field
Convergence (routing),Mathematical optimization,Stochastic gradient descent,Regularization (mathematics),Adaptive filter,Recursive least squares filter,Mathematics,Computational complexity theory
DocType
Conference
ISSN
2327-1671
Citations
2
PageRank
0.44
References
7
Authors
6