Title: Expressive Power and Approximation Errors of Restricted Boltzmann Machines
Abstract: We present explicit classes of probability distributions that can be learned by Restricted Boltzmann Machines (RBMs) depending on the number of units that they contain, and which are representative of the expressive power of the model. We use this to show that the maximal Kullback-Leibler divergence to the RBM model with $n$ visible and $m$ hidden units is bounded from above by $n - \left\lfloor \log(m+1) \right\rfloor - \frac{m+1}{2^{\left\lfloor\log(m+1)\right\rfloor}} \approx (n - 1) - \log(m+1)$. In this way we can specify the number of hidden units that guarantees a sufficiently rich model, containing different classes of distributions and respecting a given error tolerance.
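As a quick illustration of the stated bound, below is a minimal Python sketch that evaluates it and searches for the smallest number of hidden units meeting a given error tolerance. It assumes the logarithm in the abstract is base 2 (divergence measured in bits); the names kl_upper_bound and min_hidden_units are illustrative, not from the paper.

import math

def kl_upper_bound(n, m):
    # Bound from the abstract: n - floor(log2(m+1)) - (m+1)/2**floor(log2(m+1)),
    # approximately (n - 1) - log2(m + 1).
    k = math.floor(math.log2(m + 1))
    return n - k - (m + 1) / 2**k

def min_hidden_units(n, tol):
    # Smallest m whose stated bound is at most tol (brute-force sketch;
    # the bound reaches 0 at m = 2**(n - 1) - 1, so the search terminates
    # for any tol >= 0).
    m = 0
    while kl_upper_bound(n, m) > tol:
        m += 1
    return m

# Example: n = 10 visible units.
for m in (1, 3, 7, 15):
    print(m, kl_upper_bound(10, m))      # 8.0, 7.0, 6.0, 5.0
print(min_hidden_units(10, 1.0))         # 255

Note that the bound vanishes at m = 2^(n-1) - 1 hidden units, consistent with the regime in which RBMs are known to be universal approximators of distributions on {0,1}^n.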
Year: 2011
Venue: NIPS
Field: Discrete mathematics, Boltzmann machine, Divergence, Error tolerance, Probability distribution, Expressive power, Mathematics, Bounded function
DocType: Conference
ISSN: Advances in Neural Information Processing Systems 24, pages 415-423, 2011
Citations: 12
PageRank: 0.77
References: 14
Authors: 3
Name             Order   Citations   PageRank
Guido Montúfar   1       223         31.42
Johannes Rauh    2       152         16.63
Nihat Ay         3       358         47.47