Title
A Fast Stochastic Gradient Algorithm: Maximal Use of Sparsification Benefits under Computational Constraints
Abstract
In this paper, we propose a novel stochastic gradient algorithm for efficient adaptive filtering. The basic idea is to sparsify the initial error vector and maximize the benefits of the sparsification under computational constraints. To this end, we formulate the algorithm-design task as a constrained optimization problem and derive its (non-trivial) closed-form solution. The computational constraints are formed by exploiting the fact that the energy of the sparsified error vector concentrates in its first few components. Numerical examples demonstrate that the proposed algorithm converges as fast as the computationally expensive method based on the optimization without the computational constraints.
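The paper's exact update rule and closed-form solution are not reproduced in this record, but the core idea described in the abstract (driving a stochastic-gradient adaptive-filter update with a sparsified error vector) can be illustrated with a minimal sketch. The sketch below assumes an NLMS-style data-reuse setting with a window of K recent input vectors; the sparsification rule (keeping the r largest-magnitude error components), the normalization, and all names and parameters are illustrative assumptions, not the optimized scheme derived in the paper.

```python
import numpy as np

def sparsify_error(e, r):
    """Keep only the r largest-magnitude components of the error vector e.
    (Illustrative rule; the paper's optimized sparsification is not reproduced.)"""
    s = np.zeros_like(e)
    keep = np.argsort(np.abs(e))[-r:]
    s[keep] = e[keep]
    return s

def sparsified_sg_step(w, X, d, mu=0.5, r=2, eps=1e-8):
    """One stochastic-gradient step driven by a sparsified error vector.

    w : current filter coefficients, shape (N,)
    X : K most recent input (regression) vectors, shape (K, N)
    d : corresponding desired samples, shape (K,)
    """
    e = d - X @ w                                # a priori errors over the data-reuse window
    e_s = sparsify_error(e, r)                   # only r error components drive the update
    g = X.T @ e_s                                # gradient direction from the sparsified error
    return w + mu * g / (np.sum(X * X) + eps)    # normalized (NLMS-like) step

# Toy usage: identify an unknown 8-tap filter from noisy observations.
rng = np.random.default_rng(0)
h_true = rng.standard_normal(8)
w = np.zeros(8)
for _ in range(2000):
    X = rng.standard_normal((4, 8))              # window of K=4 input vectors
    d = X @ h_true + 0.01 * rng.standard_normal(4)
    w = sparsified_sg_step(w, X, d)
print(np.linalg.norm(w - h_true))                # should be small after adaptation
```

Restricting the gradient to a few error components reduces the per-iteration cost; the paper's contribution is choosing how to exploit this sparsification optimally under an explicit computational budget.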
Year
2010
DOI
10.1587/transfun.E93.A.467
Venue
IEICE TRANSACTIONS ON FUNDAMENTALS OF ELECTRONICS COMMUNICATIONS AND COMPUTER SCIENCES
Keywords
adaptive filter, stochastic gradient algorithm, proportionate adaptive filtering
Field
Convergence (routing), Gradient method, Mathematical optimization, Algorithm, Adaptive filter, Constrained optimization problem, Mathematics
DocType
Journal
Volume
E93A
Issue
2
ISSN
0916-8508
Citations
1
PageRank
0.35
References
16
Authors
2
Name               Order  Citations  PageRank
Masahiro Yukawa    1      272        30.44
Wolfgang Utschick  2      1755       176.66