Abstract
---
We develop a novel modeling framework for Boltzmann machines, augmenting each hidden unit with a latent transformation assignment variable that selects the transformed view of the canonical connection weights associated with the unit. This enables the model's inferences to transform in a stable and predictable way in response to transformed input data, and avoids learning multiple features that differ only by a transformation. Extending prior work on translation equivariant (convolutional) models, we develop translation and rotation equivariant restricted Boltzmann machines (RBMs) and deep belief nets (DBNs), and demonstrate their effectiveness in learning frequently occurring statistical structure from artificial and natural images.
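The core idea in the abstract can be illustrated with a minimal sketch. The following is an assumption-laden toy example, not the paper's actual model: a single hidden unit, 3x3 patches, and only 90-degree rotations as the transformation set, with the latent assignment chosen by a hard argmax over rotated views of one canonical filter. It shows the equivariance property the abstract describes: rotating the input shifts the transformation assignment rather than requiring a separate learned feature.

```python
import numpy as np

# Toy sketch of a rotation-equivariant hidden unit (assumed setup, not
# the paper's full RBM energy): one canonical filter, four 90-degree
# rotated views, and a latent assignment that picks the best view.

def responses(patch, canonical_w, n_rot=4):
    """Filter responses of all rotated views of the canonical weights."""
    return np.array([np.sum(np.rot90(canonical_w, k) * patch)
                     for k in range(n_rot)])

def infer(patch, canonical_w):
    """Hidden activation and latent transformation assignment.

    The unit pools over transformed views of one canonical filter: the
    assignment selects the best-matching rotation, so a rotated input
    changes the assignment while the activation stays the same.
    """
    r = responses(patch, canonical_w)
    assignment = int(np.argmax(r))                     # latent transformation variable
    activation = 1.0 / (1.0 + np.exp(-r[assignment]))  # sigmoid of best response
    return activation, assignment

# Canonical weights: a horizontal edge detector.
w = np.array([[1.0, 1.0, 1.0],
              [0.0, 0.0, 0.0],
              [-1.0, -1.0, -1.0]])

act0, k0 = infer(w, w)               # patch matches the canonical view
act1, k1 = infer(np.rot90(w, 1), w)  # same patch, rotated 90 degrees
print(k0, k1)                  # assignment shifts with the rotation: 0 1
print(np.isclose(act0, act1))  # activation is unchanged: True
```

In the paper's models this pooling is embedded in the RBM energy function and applied convolutionally; the sketch only captures why one canonical weight set suffices across the transformation set.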
Year | DOI | Venue
---|---|---
2011 | 10.1007/978-3-642-21735-7_1 | ICANN (1)

Keywords | Field | DocType
---|---|---
canonical connection weight, boltzmann machine, input data, rotation equivariant, deep belief net, transformation equivariant boltzmann machine, natural image, translation equivariant, multiple feature, hidden unit, latent transformation assignment variable | Deep belief nets, Boltzmann machine, Equivariant map, Computer science, Canonical connection, Artificial intelligence, Boltzmann constant, Machine learning | Conference

Volume | ISSN | Citations
---|---|---
6791 | 0302-9743 | 9

PageRank | References | Authors
---|---|---
0.65 | 11 | 2
Name | Order | Citations | PageRank |
---|---|---|---
Jyri J. Kivinen | 1 | 51 | 3.66 |
Christopher K. I. Williams | 2 | 6807 | 631.16 |