Title: Technical Note—Consistency Analysis of Sequential Learning Under Approximate Bayesian Inference
Abstract:
“Bayesian learning works even with censored information.” We often learn about a problem from information that is incomplete or censored: for example, a medical treatment may cause side effects with no indication of what the right dose should have been. Bayesian belief models are useful in such settings, but they cannot be constructed using traditional methods; as a result, practitioners have developed ways of constructing them approximately. These approximations have been very successful in many application domains, yet until now they have lacked theoretical support. The paper “Consistency Analysis of Sequential Learning Under Approximate Bayesian Inference,” by Chen and Ryzhov, links approximate Bayesian learning to stochastic approximation theory. Using this link, the authors prove, for the first time, the consistency of a suite of approximate Bayesian methods culled from the literature. One highlight is an entirely new consistency proof for Bayesian logistic regression, a well-established approximation technique that essentially treats logistic regression as if it were ordinary least squares.

Approximate Bayesian inference is a powerful methodology for constructing computationally efficient statistical mechanisms for sequential learning from incomplete or censored information. Approximate Bayesian learning models have proven successful in a variety of operations research and business problems; however, prior work in this area has been primarily computational, and the consistency of approximate Bayesian estimators has been a largely open problem. We develop a new consistency theory by interpreting approximate Bayesian inference as a form of stochastic approximation (SA) with an additional “bias” term. We prove the convergence of a general SA algorithm of this form and leverage this analysis to derive the first consistency proofs for a suite of approximate Bayesian models from the recent literature.
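To make the abstract's key idea concrete, the sketch below shows one common style of approximate Bayesian update for logistic regression, in which each Bernoulli observation is replaced by a Gaussian “working” observation (the classical IRLS linearization), so that the posterior update takes the same recursive least-squares form as Bayesian linear regression. This is a minimal illustrative sketch under stated assumptions, not the authors' algorithm or proofs; the function name, simulation setup, and numerical values are invented for illustration only.

```python
# Illustrative sketch (assumptions, not the paper's method): an online
# approximate Bayesian update for logistic regression that treats each
# observation "as if it were least squares" via the IRLS working response.
import numpy as np

def approx_bayes_logistic_update(m, S, x, y):
    """One approximate conjugate update of a Gaussian belief N(m, S) on the
    logistic-regression coefficients after observing (x, y), y in {0, 1}.

    The logistic likelihood is locally approximated by a Gaussian likelihood
    on the working response z = x'm + (y - p) / (p(1 - p)) with precision
    w = p(1 - p); the resulting update is exactly the Bayesian
    linear-regression (recursive least squares) update for that
    pseudo-observation.
    """
    eta = x @ m
    p = 1.0 / (1.0 + np.exp(-eta))        # predicted success probability
    w = max(p * (1.0 - p), 1e-8)          # working precision (guard against 0)
    z = eta + (y - p) / w                 # Gaussian "working" response
    S_inv = np.linalg.inv(S)
    S_new = np.linalg.inv(S_inv + w * np.outer(x, x))  # posterior covariance
    m_new = S_new @ (S_inv @ m + w * z * x)            # posterior mean
    return m_new, S_new

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    theta = np.array([1.0, -2.0])          # assumed "true" coefficients
    m, S = np.zeros(2), 10.0 * np.eye(2)   # diffuse Gaussian prior belief
    for n in range(5000):
        x = rng.normal(size=2)
        y = float(rng.random() < 1.0 / (1.0 + np.exp(-(x @ theta))))
        m, S = approx_bayes_logistic_update(m, S, x, y)
    print("posterior mean:", m)            # should drift toward theta
```

Viewed as a recursion in the posterior mean, each such update is a stochastic approximation step whose effective stepsize is governed by the shrinking posterior covariance, plus a bias introduced by the Gaussian linearization; this SA-with-bias structure is the general form whose consistency the paper analyzes.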
Year: 2020
DOI: 10.1287/opre.2019.1850
Venue: Operations Research
Keywords: statistical learning, approximate Bayesian inference, censored information, incomplete information, Bayesian logistic regression
Field: Mathematical optimization, Technical note, Bayesian inference, Medical treatment, Stochastic modelling, Statistical learning, Artificial intelligence, Sequence learning, Complete information, Machine learning, Mathematics, Consistency analysis
DocType: Journal
Volume: 68
Issue: 1
ISSN: 0030-364X
Citations: 0
PageRank: 0.34
References: 0
Authors: 2
Name | Order | Citations | PageRank
Ye Chen | 1 | 0 | 0.34
Ilya O. Ryzhov | 2 | 128 | 14.12