Title
---
Investigating Long Short-Term Memory Networks for Various Pattern Recognition Problems
Abstract
---
The purpose of this paper is to further investigate how and why long short-term memory (LSTM) networks perform so well on several pattern recognition problems. Our contribution is three-fold. First, we describe the main highlights of the LSTM architecture, especially in comparison to standard recurrent neural networks (SRNs). Second, we give an overview of previous studies that analyze the behavior of LSTMs on toy problems and on realistic data from the speech recognition domain. Third, we analyze the behavior of LSTMs on novel problems that are relevant for pattern recognition research. Specifically, we analyze the ability of LSTMs to classify long sequences containing specific patterns at arbitrary positions while iteratively increasing the complexity of the problem under constant training conditions. We also compare the behavior of LSTMs to that of SRNs for text vs. non-text sequence classification on a real-world problem with significant non-local time dependencies, where the features are computed only locally. Finally, we discuss why LSTMs with standard training methods are not suited for the task of signature verification.
Year | DOI | Venue
---|---|---
2013 | 10.1007/978-3-319-08979-9_37 | Lecture Notes in Artificial Intelligence
DocType | Volume | ISSN
---|---|---
Conference | 8556 | 0302-9743
Citations | PageRank | References
---|---|---
1 | 0.34 | 13
Authors
---
3
Name | Order | Citations | PageRank
---|---|---|---
Sebastian Otte | 1 | 47 | 12.57 |
Marcus Liwicki | 2 | 1292 | 101.35 |
Dirk Krechel | 3 | 44 | 13.19 |