Title
Self-Consistent Learning of Neural Dynamical Systems From Noisy Time Series
Abstract
We introduce a new method that, given a single noisy time series, provides unsupervised filtering, state-space reconstruction, efficient learning of the unknown governing multivariate dynamical system, and deterministic forecasting. We construct both the underlying trajectories and a latent dynamical system using deep neural networks. Under the assumption that the trajectories follow the latent dynamical system, we determine the unknowns of the dynamical system and filter out stochastic outliers in the measurements; in this sense the method is self-consistent. The embedding dimension is determined iteratively during training using the false-nearest-neighbors algorithm and is implemented as an attention map applied to the state vector. This allows state-space reconstruction without a priori information about the signal. By exploiting the differentiability of the neural solution trajectory, we can define the neural dynamical system locally at each time, avoiding the forward and backward passes through numerical solvers required by the canonical adjoint method. On a chaotic time series masked by additive Gaussian noise, we demonstrate that the denoising ability and predictive power of the proposed method stem mainly from the self-consistency and are insensitive to the method used for state-space reconstruction.
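The false-nearest-neighbors test mentioned in the abstract can be sketched as follows. This is a minimal illustration of the classic distance-ratio criterion (Kennel et al.) for choosing an embedding dimension, not the paper's attention-map implementation; the tolerance `rtol` and the acceptance threshold are conventional values assumed for illustration.

```python
import numpy as np

def delay_embed(x, dim, tau=1):
    """Time-delay embedding of a scalar series x into R^dim."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def fnn_fraction(x, dim, tau=1, rtol=15.0):
    """Fraction of false nearest neighbors at embedding dimension `dim`.

    A neighbor in dim dimensions is 'false' if adding the (dim+1)-th
    delay coordinate moves it away by more than `rtol` times the
    original neighbor distance (distance-ratio test only).
    """
    emb = delay_embed(x, dim, tau)
    emb_next = delay_embed(x, dim + 1, tau)
    n = len(emb_next)            # keep only points present in both embeddings
    emb = emb[:n]
    false_count = 0
    for i in range(n):
        d = np.linalg.norm(emb - emb[i], axis=1)
        d[i] = np.inf            # exclude the point itself
        j = np.argmin(d)         # nearest neighbor in dim dimensions
        extra = abs(emb_next[i, -1] - emb_next[j, -1])
        if d[j] > 0 and extra / d[j] > rtol:
            false_count += 1
    return false_count / n

def estimate_embedding_dim(x, max_dim=10, tau=1, threshold=0.01):
    """Smallest dim whose FNN fraction drops below `threshold`."""
    for dim in range(1, max_dim + 1):
        if fnn_fraction(x, dim, tau) < threshold:
            return dim
    return max_dim
```

A coarsely sampled sinusoid, for example, unfolds completely in two delay coordinates, so the FNN fraction collapses once `dim` reaches 2.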
Year: 2022
DOI: 10.1109/TETCI.2022.3146332
Venue: IEEE Transactions on Emerging Topics in Computational Intelligence
Keywords: Denoising and deterministic forecasting, neural dynamical systems, self-consistent learning
DocType: Journal
Volume: 6
Issue: 5
ISSN: 2471-285X
Citations: 0
PageRank: 0.34
References: 6
Authors: 2
Name         Order  Citations  PageRank
Zhe Wang     1      0          0.34
Claude Guet  2      0          0.34