Title
Long-term Forecasting using Tensor-Train RNNs.
Abstract
We present Tensor-Train RNN (TT-RNN), a novel family of neural sequence architectures for multivariate forecasting in environments with nonlinear dynamics. Long-term forecasting in such systems is highly challenging due to long-term temporal dependencies, higher-order correlations, and sensitivity to error propagation. Our proposed tensor recurrent architecture addresses these issues by learning the nonlinear dynamics directly, using higher-order moments and higher-order state transition functions. Furthermore, we decompose the higher-order structure using the tensor-train (TT) decomposition to reduce the number of parameters while preserving model performance. We theoretically establish the approximation properties of Tensor-Train RNNs for general sequence inputs, guarantees that are not available for standard RNNs. We also demonstrate significant long-term prediction improvements over standard RNN and LSTM architectures on a range of simulated environments with nonlinear dynamics, as well as on real-world climate and traffic data.
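To make the architecture described in the abstract concrete, the following minimal NumPy sketch shows one hidden-state update of a higher-order recurrent cell whose transition weight tensor is kept in tensor-train form, so the dense weight tensor over all products of lagged hidden states is never materialized. The function names (tt_cores, tt_higher_order_transition, tt_rnn_step), the specific core layout (output mode stored in the first core), and all hyperparameters are illustrative assumptions, not the authors' reference implementation.

import numpy as np

def tt_cores(out_dim, in_dim, order, rank, rng):
    # Random TT cores for a weight tensor of shape (out_dim, in_dim, ..., in_dim);
    # the first core carries the output mode, the remaining `order` cores carry input modes.
    ranks = [1] + [rank] * order + [1]
    cores = [rng.normal(scale=0.1, size=(ranks[0], out_dim, ranks[1]))]
    for p in range(order):
        cores.append(rng.normal(scale=0.1, size=(ranks[p + 1], in_dim, ranks[p + 2])))
    return cores

def tt_higher_order_transition(cores, s):
    # Contract the TT-factored weight tensor with the order-P outer product of s,
    # returning a vector of length out_dim without ever forming the full tensor.
    acc = cores[0][0]                              # (out_dim, r1)
    for core in cores[1:]:
        mat = np.einsum('inj,n->ij', core, s)      # contract one input mode with s
        acc = acc @ mat
    return acc[:, 0]                               # final TT rank is 1

def tt_rnn_step(x_t, h_hist, W_x, cores, b):
    # One update: linear input term plus a TT-factored polynomial term over lagged states.
    s = np.concatenate([[1.0]] + list(h_hist))     # s = [1, h_{t-1}, ..., h_{t-L}]
    return np.tanh(W_x @ x_t + tt_higher_order_transition(cores, s) + b)

# Toy usage with assumed sizes: input dim 3, hidden dim 8, lag 2, order 3, TT rank 4.
rng = np.random.default_rng(0)
d_in, d_hid, lag, order, rank = 3, 8, 2, 3, 4
n = 1 + lag * d_hid                                # length of the concatenated state s
W_x = rng.normal(scale=0.1, size=(d_hid, d_in))
b = np.zeros(d_hid)
cores = tt_cores(d_hid, n, order, rank, rng)
h_hist = [np.zeros(d_hid) for _ in range(lag)]     # most recent state first
for x_t in rng.normal(size=(10, d_in)):            # random stand-in for an input sequence
    h_t = tt_rnn_step(x_t, h_hist, W_x, cores, b)
    h_hist = [h_t] + h_hist[:-1]

Keeping the transition weights in TT form is what gives the parameter reduction mentioned in the abstract: a dense order-P tensor over an n-dimensional state with a d-dimensional output would need on the order of d * n^P parameters, while the TT cores need roughly d * r + P * n * r^2.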
Year
2017
Venue
arXiv: Learning
Field
Time series, Higher order moments, Mathematical optimization, Propagation of uncertainty, Nonlinear system, Tensor, Computer science, Multivariate statistics, Artificial intelligence, Tensor train, Machine learning
DocType
Journal
Volume
abs/1711.00073
Citations
6
PageRank
0.49
References
15
Authors
4
Name, Order, Citations, PageRank
Qi Yu, 1, 188, 12.87
Qi Yu, 2, 188, 12.87
Stephan Zheng, 3, 14, 2.63
Animashree Anandkumar, 4, 1629, 116.30