Title
Parallel Long Short-Term Memory for multi-stream classification
Abstract
Recently, machine learning methods have provided a broad spectrum of original and efficient algorithms based on Deep Neural Networks (DNN) to automatically predict an outcome with respect to a sequence of inputs. Recurrent hidden cells allow such DNN-based models, including Recurrent Neural Networks (RNN) and Long Short-Term Memory (LSTM) networks, to manage long-term dependencies. Nevertheless, these RNNs process a single input stream in one (LSTM) or two (Bidirectional LSTM) directions, whereas most of the information available nowadays comes from multi-stream or multimedia documents and requires RNNs to process this information synchronously during training. This paper presents an original LSTM-based architecture, named Parallel LSTM (PLSTM), that processes multiple parallel synchronized input sequences in order to predict a common output. The proposed PLSTM method can be used for parallel sequence classification purposes. The PLSTM approach is evaluated on an automatic telecast genre sequence classification task and compared with different state-of-the-art architectures. Results show that the proposed PLSTM method outperforms both the baseline n-gram models and the state-of-the-art LSTM approach.
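The paper itself provides no code; the sketch below is only an illustration of the parallel-LSTM idea described in the abstract, assuming a PyTorch implementation in which each synchronized stream is encoded by its own LSTM and the per-stream final hidden states are concatenated before a shared classifier predicting the common output. All names, layer sizes, and the concatenation-based fusion are hypothetical, not the authors' reference implementation.

```python
# Minimal sketch of a Parallel LSTM (PLSTM) classifier, assuming PyTorch.
# Hyperparameters and the fusion scheme are illustrative assumptions.
import torch
import torch.nn as nn

class PLSTM(nn.Module):
    def __init__(self, n_streams, input_dim, hidden_dim, n_classes):
        super().__init__()
        # One LSTM per synchronized input stream, run in parallel.
        self.lstms = nn.ModuleList(
            nn.LSTM(input_dim, hidden_dim, batch_first=True)
            for _ in range(n_streams)
        )
        # Per-stream final states are fused into one common prediction.
        self.classifier = nn.Linear(n_streams * hidden_dim, n_classes)

    def forward(self, streams):
        # streams: list of n_streams tensors, each (batch, seq_len, input_dim),
        # assumed synchronized (same sequence length across streams).
        finals = []
        for lstm, x in zip(self.lstms, streams):
            _, (h_n, _) = lstm(x)          # h_n: (1, batch, hidden_dim)
            finals.append(h_n.squeeze(0))  # (batch, hidden_dim)
        fused = torch.cat(finals, dim=-1)  # (batch, n_streams * hidden_dim)
        return self.classifier(fused)      # logits for the common output

# Usage example: 4 parallel streams of 10-step sequences, 3-way classification.
model = PLSTM(n_streams=4, input_dim=16, hidden_dim=32, n_classes=3)
streams = [torch.randn(8, 10, 16) for _ in range(4)]
logits = model(streams)  # shape: (8, 3)
```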
Year
2016
DOI
10.1109/SLT.2016.7846268
Venue
2016 IEEE Spoken Language Technology Workshop (SLT)
Keywords
long short-term memory, sequence classification, stream structuring
DocType
Conference
Volume
abs/1702.03402
ISSN
2639-5479
ISBN
978-1-5090-4904-2
Citations
1
PageRank
0.39
References
14
Authors
5
Name               Order   Citations   PageRank
Mohamed Bouaziz    1       1           0.72
Mohamed Morchid    2       84          22.79
Richard Dufour     3       98          23.98
Georges Linarès    4       136         29.55
Renato De Mori     5       960         161.75