Title
STDP-Compatible Approximation of Backpropagation in an Energy-Based Model
Abstract
We show that Langevin Markov chain Monte Carlo inference in an energy-based model with latent variables has the property that the early steps of inference, starting from a stationary point, correspond to propagating error gradients into internal layers, similar to backpropagation. The backpropagated error is with respect to output units that have received an outside driving force pushing them away from the stationary point. Backpropagated error gradients correspond to temporal derivatives with respect to the activation of hidden units. These lead to a weight update proportional to the product of the presynaptic firing rate and the temporal rate of change of the postsynaptic firing rate. Simulations and a theoretical argument suggest that this rate-based update rule is consistent with those associated with spike-timing-dependent plasticity. The ideas presented in this article could be an element of a theory for explaining how brains perform credit assignment in deep hierarchies as efficiently as backpropagation does, with neural computation corresponding to both approximate inference in continuous-valued latent variables and error backpropagation, at the same time.
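The rate-based update rule described in the abstract (a weight change proportional to the presynaptic firing rate times the temporal derivative of the postsynaptic firing rate) can be sketched concretely. The following is a minimal NumPy illustration, not the authors' code: the logistic rate function rho, the simple quadratic energy behind the inference step, the fixed presynaptic layer, the noise-free dynamics, and all step sizes (eps, beta, lr, dt) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rho(s):
    # Assumed firing-rate nonlinearity (logistic); the paper leaves rho generic.
    return 1.0 / (1.0 + np.exp(-s))

n_pre, n_post = 4, 3
W = rng.normal(scale=0.1, size=(n_pre, n_post))   # synaptic weights
s_pre = rng.normal(size=n_pre)                    # presynaptic state (held fixed here)
s_post = rng.normal(size=n_post)                  # postsynaptic latent state
target = np.ones(n_post)                          # outside driving force on output units

eps, beta, lr, dt = 0.1, 0.5, 0.01, 1.0           # hypothetical step sizes

for step in range(20):
    # One noise-free, Langevin-style inference step: relax s_post toward lower
    # energy (-dE/ds_post for an assumed quadratic energy), nudged by the
    # external driving force pushing the outputs away from the stationary point.
    grad = rho(s_pre) @ W - s_post
    s_post_new = s_post + eps * (grad + beta * (target - rho(s_post)))

    # Rate-based plasticity: presynaptic rate times the temporal rate of
    # change of the postsynaptic rate, as in the abstract's update rule.
    d_rho_post = (rho(s_post_new) - rho(s_post)) / dt
    W += lr * np.outer(rho(s_pre), d_rho_post)

    s_post = s_post_new
```

The sign structure of this update mirrors the abstract's claim: when the driving force pushes a postsynaptic rate up, weights from currently active presynaptic units strengthen, which is the behavior the paper argues is consistent with spike-timing-dependent plasticity.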
Year
2017
DOI
10.1162/NECO_a_00934
Venue
Neural Computation
Field
Momentum (technical analysis), Markov chain Monte Carlo, Inference, Models of neural computation, Approximate inference, Latent variable, Stationary point, Artificial intelligence, Backpropagation, Mathematics, Machine learning
DocType
Journal
Volume
29
Issue
3
ISSN
0899-7667
Citations
9
PageRank
0.48
References
4
Authors
5
Name             Order  Citations  PageRank
Yoshua Bengio    1      42677      3039.83
Thomas Mesnard   2      9          0.48
Asja Fischer     3      139        11.26
Saizheng Zhang   4      57         2.43
Yuhuai Wu        5      158        9.68