Abstract |
---|
Long Short-Term Memory (LSTM) is one of the best recent supervised sequence learning methods. Using gradient descent, it trains memory cells represented as differentiable computational graph structures. Interestingly, LSTM's cell structure seems somewhat arbitrary. In this paper we optimize its computational structure using a multi-objective evolutionary algorithm. The fitness function reflects the structure's usefulness for learning various formal languages. The evolved cells help to understand crucial features that aid sequence learning. |
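
For context, the "memory cells represented as differentiable computational graph structures" are standard LSTM cells; the paper's evolutionary algorithm searches over variants of this graph (its gates and nonlinearities), scoring each variant by how well it learns formal languages. Below is a minimal NumPy sketch of one step of the standard cell as a starting point; the dimensions, initialization, and the helper name `lstm_step` are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch only: one step of a standard LSTM memory cell in NumPy,
# i.e. the kind of differentiable computational graph the paper evolves.
# Sizes, initialization, and names below are assumptions, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One step of a standard LSTM cell.

    W, U, b stack the parameters of the input, forget, and output gates
    and the candidate update (4 * n_hid rows).
    """
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:n])        # input gate
    f = sigmoid(z[n:2*n])      # forget gate
    o = sigmoid(z[2*n:3*n])    # output gate
    g = np.tanh(z[3*n:4*n])    # candidate cell update
    c = f * c_prev + i * g     # cell state: gated, additive memory
    h = o * np.tanh(c)         # cell output
    return h, c

# Tiny usage example with arbitrary sizes.
n_in, n_hid = 3, 5
W = rng.normal(scale=0.1, size=(4 * n_hid, n_in))
U = rng.normal(scale=0.1, size=(4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
for t in range(4):             # feed a short input sequence
    x = rng.normal(size=n_in)
    h, c = lstm_step(x, h, c, W, U, b)
print(h)
```

A structure-evolving run in the spirit of the abstract would mutate which gates and activation functions appear in this graph, briefly train each candidate by gradient descent on formal-language data, and select survivors with a multi-objective evolutionary algorithm.
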
Year | DOI | Venue |
---|---|---
2009 | 10.1007/978-3-642-04277-5_76 | ICANN (2) |
Keywords | Field | DocType
---|---|---
evolving memory cell structures, long short-term memory, fitness function, differentiable computational graph structure, aid sequence learning, cell structure, sequence learning, memory cell, computational structure, gradient descent, crucial feature, recent supervised sequence, formal language | Gradient descent, Formal language, Pattern recognition, Evolutionary algorithm, Computer science, Recurrent neural network, Fitness function, Artificial intelligence, Sequence learning, Machine learning, Reinforcement learning, Memory cell | Conference

Volume | ISSN | Citations
---|---|---
5769 | 0302-9743 | 23

PageRank | References | Authors
---|---|---
1.81 | 14 | 4

Name | Order | Citations | PageRank |
---|---|---|---
Justin Bayer | 1 | 157 | 32.38 |
Daan Wierstra | 2 | 5412 | 255.92 |
Julian Togelius | 3 | 2765 | 219.94 |
Jürgen Schmidhuber | 4 | 17836 | 1238.63 |