Title
A Modified Stein Variational Inference Algorithm with Bayesian and Gradient Descent Techniques
Abstract
This paper introduces a novel variational inference (VI) method that combines Bayesian and gradient descent techniques. To facilitate the approximation of posterior distributions over model parameters, the Stein method has been used in Bayesian variational inference algorithms in recent years. Unfortunately, previous methods fail to explicitly describe the influence of the history of the particle trajectories (denoted Q(x) in this paper) on the approximation, even though this history carries important information for the particle search. In this paper, Q(x) is incorporated into the design of the operator B_p, which increases the chance of jumping out of local optima, especially in the case of complex distributions. To address these issues, a modified Stein variational inference algorithm is proposed, which makes the gradient descent on the Kullback-Leibler (KL) divergence more stochastic. In our method, a group of particles approximates the target distribution by minimizing the KL divergence, whose decrease is governed by a newly defined kernelized Stein discrepancy. Furthermore, the usefulness of the proposed technique is demonstrated on four data sets, with Bayesian logistic regression used for classification. Statistical measures such as parameter estimates, classification accuracy, F1, NRMSE, and others are used to validate the algorithm's performance.
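For orientation, the following is a minimal sketch of the standard Stein variational gradient descent (SVGD) baseline that the paper modifies: a set of particles is moved along the kernelized Stein direction that greedily decreases KL(q || p). The RBF kernel, median-heuristic bandwidth, toy Gaussian target, and all names here are illustrative assumptions taken from the baseline algorithm of Liu and Wang (2016), not the authors' modified operator B_p or their history term Q(x).

```python
# Minimal sketch of baseline SVGD (assumed baseline, not the paper's modified method).
import numpy as np

def rbf_kernel(X):
    # Pairwise differences (n, n, d) and squared distances (n, n).
    diffs = X[:, None, :] - X[None, :, :]
    sq_dists = np.sum(diffs ** 2, axis=-1)
    # Median-heuristic bandwidth, as in the standard SVGD baseline.
    h2 = np.median(sq_dists) / max(np.log(X.shape[0]), 1.0) + 1e-8
    K = np.exp(-sq_dists / h2)
    # grad_K[j, i] = gradient of k(x_j, x_i) with respect to x_j.
    grad_K = -2.0 * diffs / h2 * K[:, :, None]
    return K, grad_K

def svgd_step(X, grad_log_p, step=0.05):
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    # First term pulls particles toward high density; second term repels them apart.
    n = X.shape[0]
    K, grad_K = rbf_kernel(X)
    phi = (K @ grad_log_p(X) + grad_K.sum(axis=0)) / n
    return X + step * phi

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(loc=-3.0, scale=0.5, size=(100, 1))  # initial particles
    for _ in range(1000):
        X = svgd_step(X, lambda x: -x)  # toy target N(0, 1): grad log p(x) = -x
    print(X.mean(), X.std())  # should approach 0 and 1
```

In this baseline the update is deterministic given the particles; the paper's contribution, per the abstract, is to make this descent more stochastic via the history term Q(x), which is not reproduced here.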
Year
2022
DOI
10.3390/sym14061188
Venue
SYMMETRY-BASEL
Keywords
Stein method, Bayesian variational inference, KL divergence, Bayesian logistic regression
DocType
Journal
Volume
14
Issue
6
ISSN
2073-8994
Citations
0
PageRank
0.34
References
0
Authors
4
Name           Order  Citations  PageRank
Limin Zhang    1      0          2.70
Jing Dong      2      0          0.34
Junfang Zhang  3      0          0.34
Junzi Yang     4      0          0.34