Title
Online Regression with Model Selection
Abstract
Online learning algorithms have a wide variety of applications in large-scale machine learning problems due to their low computational and memory requirements. However, standard online learning methods still suffer from issues such as slow convergence rates and a limited capability to select features or to recover the true features. In this paper, we present a novel framework for online learning based on running averages and introduce online versions of popular existing offline algorithms such as Adaptive Lasso, Elastic Net, and Feature Selection with Annealing. We prove the equivalence between our online methods and their offline counterparts and give theoretical feature selection and convergence guarantees for some of them. In contrast to existing online methods, the proposed methods can extract a model with any desired sparsity level at any time. Numerical experiments indicate that our new methods enjoy high feature selection accuracy and a fast convergence rate compared with standard stochastic algorithms and offline learning algorithms. We also present applications to large datasets, where the proposed framework again shows competitive results compared to popular online and offline algorithms.
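The abstract's "running averages" framework is not detailed in this record; a minimal sketch, assuming the running averages are the sufficient statistics (1/n)·XᵀX and (1/n)·Xᵀy maintained incrementally, shows how the exact offline least-squares (or ridge) solution can then be recovered at any time from constant memory. The class name and the ridge solve are illustrative assumptions, not the paper's actual method:

```python
import numpy as np

class RunningAverageRegression:
    """Hypothetical sketch: keep running averages of x x^T and y x,
    so the offline regression solution is recoverable at any time."""

    def __init__(self, p):
        self.n = 0
        self.Sxx = np.zeros((p, p))  # running average of x x^T
        self.Sxy = np.zeros(p)       # running average of y * x

    def update(self, x, y):
        # Incremental mean update: S <- S + (new_term - S) / n
        self.n += 1
        self.Sxx += (np.outer(x, x) - self.Sxx) / self.n
        self.Sxy += (y * x - self.Sxy) / self.n

    def solve(self, lam=0.0):
        # Ridge (lam > 0) or ordinary least squares (lam = 0),
        # computed from the running averages alone.
        p = self.Sxy.shape[0]
        return np.linalg.solve(self.Sxx + lam * np.eye(p), self.Sxy)

# Usage: stream 5000 observations from a sparse linear model.
rng = np.random.default_rng(0)
beta = np.array([2.0, -1.0, 0.0, 0.0])  # true sparse coefficients
model = RunningAverageRegression(p=4)
for _ in range(5000):
    x = rng.standard_normal(4)
    y = x @ beta + 0.01 * rng.standard_normal()
    model.update(x, y)
est = model.solve(lam=1e-6)
```

Because the averages summarize the full data stream, sparse estimators built on these statistics (e.g. the online Lasso or Elastic Net variants mentioned above) can be re-solved at any point with any desired sparsity level, which is the property the abstract emphasizes.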
Year
2018
Venue
arXiv: Machine Learning
Field
Convergence (routing), Offline learning, Feature selection, Elastic net regularization, Lasso (statistics), Model selection, Online and offline, Rate of convergence, Artificial intelligence, Mathematics, Machine learning
DocType
Volume
abs/1803.11521
Citations
0
Journal
PageRank
0.34
References
9
Authors
2
Name          Order  Citations  PageRank
Lizhe Sun     1      0          0.34
Adrian Barbu  2      768        58.59