Abstract |
---|
The proportionate normalized least mean square (PNLMS) algorithm and its variants are by far the most popular adaptive filters used to identify sparse systems. The convergence of the PNLMS algorithm, though very fast initially, slows down at a later stage, eventually becoming even slower than that of sparsity-agnostic adaptive filters like the NLMS. In this paper, we address this problem by introducing a carefully constructed l1 norm (of the coefficients) penalty in the PNLMS cost function, which favors sparsity. This gives rise to zero-attractor terms in the PNLMS weight update equation that shrink the coefficients, especially the inactive taps, thereby arresting the slowdown of convergence and also yielding a lower steady-state excess mean square error (EMSE). We also carry out a convergence analysis (in mean) of the proposed algorithm. |
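The abstract describes adding an l1-norm penalty to the PNLMS cost function, which yields a zero-attractor (sign-based shrinkage) term in the weight update. Below is a minimal NumPy sketch of one such iteration, assuming a standard PNLMS proportionate gain matrix and a plain sign-based attractor; the function name, parameter names, and default values are illustrative, not the paper's exact formulation.

```python
import numpy as np

def za_pnlms_update(w, x, d, mu=0.5, delta=1e-2, rho=0.01, rho_za=1e-4):
    """One iteration of a zero-attracting PNLMS filter (illustrative sketch).

    w : current tap-weight vector (length L)
    x : most recent L input samples, newest first
    d : desired output sample
    """
    e = d - w @ x                         # a priori estimation error
    # PNLMS proportionate gains: larger taps receive larger step sizes
    abs_w = np.abs(w)
    gamma = np.maximum(rho * max(delta, abs_w.max()), abs_w)
    g = gamma / gamma.sum()               # normalized gains (diagonal of G)
    # NLMS-style normalization with proportionate step sizes
    w_new = w + mu * g * x * e / (x @ (g * x) + delta)
    # zero attractor from the l1 penalty: shrinks (mainly inactive) taps toward 0
    w_new -= rho_za * np.sign(w_new)
    return w_new, e
```

Run in a loop over input samples, this converges quickly on a sparse impulse response while the attractor keeps the inactive taps pinned near zero; the attractor strength `rho_za` trades shrinkage speed against bias on the active taps.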
Year | Venue | Field
---|---|---|
2015 | CoRR | Attractor, Least mean squares filter, Convergence (routing), Excess mean square error, Mathematical optimization, Normalization (statistics), Algorithm, Adaptive filter, Mathematics

DocType | Volume | Citations
---|---|---|
Journal | abs/1507.02921 | 0

PageRank | References | Authors
---|---|---|
0.34 | 15 | 2
Name | Order | Citations | PageRank |
---|---|---|---|
Rajib Lochan Das | 1 | 35 | 4.97 |
Mrityunjoy Chakraborty | 2 | 124 | 28.63 |