Abstract |
---|
While much work has been done on unsupervised learning in feedforward neural network architectures, its potential with (theoretically more powerful) recurrent networks and time-varying inputs has rarely been explored. Here we train Long Short-Term Memory (LSTM) recurrent networks to maximize two information-theoretic objectives for unsupervised learning: Binary Information Gain Optimization (BINGO) and Nonparametric Entropy Optimization (NEO). LSTM learns to discriminate different types of temporal sequences and group them according to a variety of features. |
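The NEO objective named in the abstract is built on a nonparametric (Parzen-window) estimate of the entropy of the network's outputs. A minimal sketch of that estimate, assuming an isotropic Gaussian kernel and an illustrative kernel width (neither taken from the paper):

```python
import numpy as np

def parzen_entropy(outputs, sigma=0.1):
    """Leave-one-out Parzen-window entropy estimate of a batch of
    network outputs -- the quantity a NEO-style objective drives up
    or down.  `outputs` is (N, d); `sigma` is an assumed kernel width."""
    n, d = outputs.shape
    # Pairwise squared distances between output vectors.
    diff = outputs[:, None, :] - outputs[None, :, :]
    sq = np.sum(diff ** 2, axis=-1)
    # Isotropic Gaussian kernel evaluated at every pair.
    k = np.exp(-sq / (2.0 * sigma ** 2)) / ((2.0 * np.pi * sigma ** 2) ** (d / 2.0))
    np.fill_diagonal(k, 0.0)          # leave-one-out: exclude self-density
    p = k.sum(axis=1) / (n - 1)       # density estimate at each sample
    # Monte-Carlo entropy estimate: H ~= -mean(log p).
    return -np.mean(np.log(p + 1e-12))

rng = np.random.default_rng(0)
tight = parzen_entropy(rng.normal(0.0, 0.01, size=(64, 2)))   # clustered outputs
spread = parzen_entropy(rng.normal(0.0, 1.0, size=(64, 2)))   # dispersed outputs
```

Tightly clustered outputs yield a lower estimated entropy than dispersed ones, which is what lets a gradient on this estimate shape the output distribution during unsupervised training.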
Year | DOI | Venue
---|---|---
2001 | 10.1007/3-540-44668-0_95 | Int. Conference on Artificial Neural Networks
Keywords | Field | DocType
---|---|---
recurrent network, unsupervised learning, long short-term memory, discriminate different type, feedforward neural network architecture, lstm recurrent neural networks, nonparametric entropy optimization, time-varying input, temporal sequence, information-theoretic objective, binary information gain optimization, recurrent neural network, feedforward neural network, information gain, long short term memory | Feedforward neural network, Binary information, Computer science, Recurrent neural network, Nonparametric statistics, Unsupervised learning, Artificial intelligence, Deep learning, Artificial neural network, Machine learning, Feed forward | Conference
Volume | ISSN | ISBN
---|---|---
2130 | 0302-9743 | 3-540-42486-5
Citations | PageRank | References
---|---|---
9 | 0.49 | 29
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---
Magdalena Klapper-Rybicka | 1 | 9 | 0.49 |
Nicol N. Schraudolph | 2 | 1185 | 164.26 |
Jürgen Schmidhuber | 3 | 17836 | 1238.63 |