Abstract |
---|
Continuous deep learning architectures have recently re-emerged as variants of Neural Ordinary Differential Equations (Neural ODEs). The infinite-depth approach offered by these models theoretically bridges the gap between deep learning and dynamical systems; however, deciphering their inner workings is still an open challenge, and most of their applications are currently limited to their inclusion as generic black-box modules. In this work, we "open the box" and offer a system-theoretic perspective, including state augmentation strategies and robustness, with the aim of clarifying the influence of several design choices on the underlying dynamics. We also introduce novel architectures: among them, a Galerkin-inspired depth-varying parameter model and Neural ODEs with data-controlled vector fields. |
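The abstract mentions Neural ODEs with data-controlled vector fields, i.e. dynamics conditioned on the network's input. A minimal sketch of the idea, using a toy tanh vector field, random illustrative weights, and a plain explicit-Euler integrator (all names, dimensions, and the solver choice are assumptions for illustration, not the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions and weights (not taken from the paper)
dim = 2
W = rng.normal(scale=0.5, size=(dim, 2 * dim))  # vector-field parameters
b = np.zeros(dim)

def vector_field(x, x0):
    """Data-controlled vector field: the dynamics of the hidden state x
    are conditioned on the original input x0."""
    return np.tanh(W @ np.concatenate([x, x0]) + b)

def neural_ode_forward(x0, t1=1.0, steps=100):
    """Forward pass: integrate dx/dt = f(x, x0) from t=0 to t=t1
    with a fixed-step explicit Euler scheme (toy solver)."""
    x = x0.copy()
    dt = t1 / steps
    for _ in range(steps):
        x = x + dt * vector_field(x, x0)
    return x

x0 = np.array([1.0, -1.0])
out = neural_ode_forward(x0)
```

Conditioning the vector field on `x0` lets different inputs induce different flows, which is what distinguishes the data-controlled variant from a single shared vector field.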
Year | Venue | DocType |
---|---|---|
2020 | NIPS 2020 | Conference |
Citations | PageRank | References
---|---|---|
0 | 0.34 | 0
Authors |
---|
5 |
Name | Order | Citations | PageRank |
---|---|---|---|
Stefano Massaroli | 1 | 0 | 2.37 |
Poli Michael | 2 | 0 | 0.68 |
Jinkyoo Park | 3 | 12 | 7.83 |
Atsushi Yamashita | 4 | 321 | 67.29 |
Hajime Asama | 5 | 826 | 237.10 |