Title
Echo State Networks With Self-Normalizing Activations On The Hyper-Sphere
Abstract
Among the various architectures of Recurrent Neural Networks, Echo State Networks (ESNs) emerged due to their simplified and inexpensive training procedure. These networks are known to be sensitive to the setting of hyper-parameters, which critically affect their behavior. Results show that their performance is usually maximized in a narrow region of hyper-parameter space called edge of criticality. Finding such a region requires searching the hyper-parameter space in a sensible way: hyper-parameter configurations marginally outside such a region might yield networks exhibiting fully developed chaos, hence producing unreliable computations. The performance gain due to optimizing hyper-parameters can be studied by considering the memory-nonlinearity trade-off, i.e., the fact that increasing the nonlinear behavior of the network degrades its ability to remember past inputs, and vice versa. In this paper, we propose a model of ESNs that eliminates critical dependence on hyper-parameters, resulting in networks that provably cannot enter a chaotic regime and, at the same time, exhibit nonlinear behavior in phase space characterized by a large memory of past inputs, comparable to that of linear networks. Our contribution is supported by experiments corroborating our theoretical findings, showing that the proposed model displays dynamics that are rich enough to approximate many common nonlinear systems used for benchmarking.
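The abstract's point about hyper-parameter sensitivity refers to the standard ESN setup, where the reservoir weight matrix is rescaled to a chosen spectral radius that governs proximity to the edge of criticality. A minimal sketch of a conventional ESN reservoir (not the paper's self-normalizing hyper-sphere variant; all sizes and values below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, n_in = 100, 1  # reservoir and input dimensions (illustrative)

# Random reservoir matrix, rescaled so its spectral radius is 0.9.
# The spectral radius is the hyper-parameter that conventionally
# controls how close the network operates to the edge of criticality.
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.standard_normal((n_res, n_in))

def run_reservoir(inputs, W, W_in):
    """Drive the reservoir with a 1-D input sequence; return all states."""
    x = np.zeros(W.shape[0])
    states = []
    for u in inputs:
        # Standard (non-leaky) ESN state update with tanh nonlinearity.
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
        states.append(x.copy())
    return np.array(states)

states = run_reservoir(np.sin(np.arange(200) * 0.1), W, W_in)
print(states.shape)  # (200, 100)
```

In standard ESN training only a linear readout fitted on these collected states is learned; the reservoir weights stay fixed, which is what makes the training procedure inexpensive.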
Year
2019
DOI
10.1038/s41598-019-50158-4
Venue
SCIENTIFIC REPORTS
DocType
Journal
Volume
9
Issue
1
ISSN
2045-2322
Citations
1
PageRank
0.35
References
0
Authors
3
Name	Order	Citations	PageRank
Pietro Verzelli	1	1	1.71
Cesare Alippi	2	1040	115.84
Lorenzo Livi	3	304	25.67