Abstract |
---|
Complex numbers have long been favoured for digital signal processing, yet complex representations rarely appear in deep learning architectures. RNNs, widely used to process time series and sequence data, could benefit greatly from complex representations. We present a novel complex gated recurrent cell. When used together with norm-preserving state transition matrices, our complex gated RNN exhibits excellent stability and convergence properties. We demonstrate competitive performance of our complex gated RNN on the synthetic memory and adding tasks, as well as on the real-world task of human motion prediction. |
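The abstract's key mechanism is a norm-preserving (unitary) state transition on a complex-valued hidden state. As a minimal sketch of that idea (not the paper's actual parameterisation), a unitary matrix can be obtained from the QR decomposition of a random complex matrix, and applying it leaves the hidden-state norm unchanged:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4  # hidden-state size (illustrative)

# Build a unitary matrix Q via QR decomposition of a complex Gaussian matrix.
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
Q, _ = np.linalg.qr(A)

# A norm-preserving state transition: h_next = Q @ h.
h = rng.standard_normal(n) + 1j * rng.standard_normal(n)
h_next = Q @ h

# Because Q is unitary (Q^H Q = I), ||h_next|| equals ||h|| up to float error,
# which is what prevents hidden-state norms from exploding or vanishing.
print(np.linalg.norm(h), np.linalg.norm(h_next))
```

The trained model would parameterise and learn such a unitary transition alongside gating; this snippet only illustrates why the norm-preservation property aids stability over long sequences.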
Year | Venue | Field
---|---|---
2018 | neural information processing systems | Convergence (routing), Digital signal processing, Complex number, Matrix (mathematics), Recurrent neural network, Human motion, Artificial intelligence, Deep learning, Machine learning, Mathematics, Process time

DocType | Volume | Citations
---|---|---
Journal | abs/1806.08267 | 1

PageRank | References | Authors
---|---|---
0.35 | 0 | 2
Name | Order | Citations | PageRank |
---|---|---|---
Moritz Wolter | 1 | 1 | 3.73 |
Angela Yao | 2 | 582 | 28.10