Title
Online Bayesian Shrinkage Regression
Abstract
This work introduces a new online regression method that extends shrinkage via limit of Gibbs sampler (SLOG) to the online learning setting. In particular, we show theoretically how the proposed online SLOG (OSLOG) is derived within the Bayesian framework without resorting to the Gibbs sampler or to a hierarchical representation. Moreover, to establish a performance guarantee for OSLOG, we derive an upper bound on its cumulative squared loss; it is the only sparse online regression algorithm with a logarithmic regret bound. Furthermore, we compare OSLOG empirically with two state-of-the-art algorithms along three aspects, namely normality, sparsity and multicollinearity, and show that it achieves an excellent trade-off between these properties.
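The record contains no pseudocode for OSLOG, so the following is only a minimal illustrative sketch of the online regression protocol the abstract describes (predict, observe, suffer squared loss, update) using a generic recursive ridge update with soft-thresholding to induce sparsity. The class name OnlineShrinkageRegressor and the parameters a (prior strength) and lam (threshold level) are hypothetical and stand in for, but do not reproduce, the paper's Bayesian derivation of OSLOG.

import numpy as np

class OnlineShrinkageRegressor:
    # Illustrative stand-in, NOT the paper's OSLOG update. Maintains the
    # online ridge sufficient statistics A_t = a*I + sum_t x_t x_t^T and
    # b_t = sum_t y_t x_t, and soft-thresholds the ridge weights so that
    # small coefficients are shrunk exactly to zero (sparsity).
    def __init__(self, dim, a=1.0, lam=0.1):
        self.A = a * np.eye(dim)   # regularised Gram matrix
        self.b = np.zeros(dim)     # response-weighted feature sum
        self.lam = lam             # assumed soft-threshold level

    def predict(self, x):
        w = np.linalg.solve(self.A, self.b)                      # ridge weights
        w = np.sign(w) * np.maximum(np.abs(w) - self.lam, 0.0)   # shrink to zero
        return float(w @ x)

    def update(self, x, y):
        # Fold the new example into the sufficient statistics.
        self.A += np.outer(x, x)
        self.b += y * x

# Online protocol: predict, observe the label, accumulate squared loss, update.
rng = np.random.default_rng(0)
w_true = np.array([2.0, 0.0, 0.0, -1.5, 0.0])   # sparse ground truth (synthetic)
model = OnlineShrinkageRegressor(dim=5, a=1.0, lam=0.05)
cumulative_loss = 0.0
for t in range(1000):
    x = rng.normal(size=5)
    y = w_true @ x + 0.1 * rng.normal()
    y_hat = model.predict(x)
    cumulative_loss += (y - y_hat) ** 2
    model.update(x, y)
print(f"cumulative squared loss after 1000 rounds: {cumulative_loss:.2f}")

The cumulative squared loss accumulated in this loop is the quantity whose growth the paper bounds; the logarithmic-regret claim concerns OSLOG's own update, not this generic sketch.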
Year
2019
DOI
10.1007/s00521-020-04947-y
Venue
NEURAL COMPUTING & APPLICATIONS
Keywords
Regression, Regularisation, Online learning, Competitive analysis
DocType
Conference
Volume
32
Issue
23
ISSN
0941-0643
Citations
0
PageRank
0.34
References
0
Authors
2
Name              Order  Citations  PageRank
Waqas Jamil       1      0          0.34
Hamid Bouchachia  2      6          2.84