Title
Scaled Least Squares Estimator for GLMs in Large-Scale Problems.
Abstract
We study the problem of efficiently estimating the coefficients of generalized linear models (GLMs) in the large-scale setting where the number of observations $n$ is much larger than the number of predictors $p$, i.e. $n \gg p \gg 1$. We show that in GLMs with random (not necessarily Gaussian) design, the GLM coefficients are approximately proportional to the corresponding ordinary least squares (OLS) coefficients. Using this relation, we design an algorithm that achieves the same accuracy as the maximum likelihood estimator (MLE) through iterations that attain up to a cubic convergence rate, and that are cheaper than any batch optimization algorithm by at least a factor of $\mathcal{O}(p)$. We provide theoretical guarantees for our algorithm, and analyze the convergence behavior in terms of data dimensions. Finally, we demonstrate the performance of our algorithm through extensive numerical studies on large-scale real and synthetic datasets, and show that it outperforms several other widely used optimization algorithms.
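The proportionality relation described in the abstract can be illustrated with a short sketch. The Python code below is a hypothetical rendering for a logistic model only: it computes the OLS fit once and then searches for a single proportionality constant by a one-dimensional Newton iteration along the OLS direction. The function name sls_logistic, the use of the profiled log-likelihood for the scale search, and the synthetic-data usage are illustrative assumptions, not the paper's exact procedure.

import numpy as np

def sls_logistic(X, y, n_newton=10):
    """Scaled-least-squares sketch: return c * beta_ols for a logistic model."""
    # Step 1: ordinary least squares fit (a single p x p linear solve).
    beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
    eta = X @ beta_ols                         # linear predictor under the OLS fit

    # Step 2: one-dimensional Newton search for the proportionality constant c,
    # maximizing the log-likelihood restricted to the ray {c * beta_ols : c > 0}.
    c = 1.0
    for _ in range(n_newton):
        mu = 1.0 / (1.0 + np.exp(-c * eta))    # logistic mean at scale c
        grad = eta @ (y - mu)                  # first derivative in c
        hess = -(eta * mu * (1.0 - mu)) @ eta  # second derivative in c (negative)
        c -= grad / hess
    return c * beta_ols

# Toy usage on synthetic data with n >> p.
rng = np.random.default_rng(0)
n, p = 100_000, 20
X = rng.standard_normal((n, p))
beta_true = rng.standard_normal(p) / np.sqrt(p)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))
beta_hat = sls_logistic(X, y)

The key point the sketch captures is the cost profile: the expensive work is one least-squares solve, after which each Newton step is a one-dimensional update over the fitted linear predictor rather than a full $p$-dimensional optimization step.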
Year
2016
Venue
NIPS
Field
Convergence (routing), Least squares, Mathematical optimization, Ordinary least squares, Maximum likelihood, Cubic convergence, Generalized linear model, Gaussian, Optimization algorithm, Mathematics
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
3
Name                Order  Citations  PageRank
Erdogdu, Murat A.   1      140        10.70
Lee H. Dicker       2      3          3.02
Mohsen Bayati       3      678        44.93