| Title |
|---|
| partial-FORCE: a fast and robust online training method for recurrent neural networks |
| Abstract |
|---|
| Recurrent neural networks (RNNs) are useful tools for modeling dynamical systems realized by neuronal populations, but training RNNs efficiently remains challenging. In recent years, a recursive least squares (RLS) based method for modifying all the recurrent connections, called the full-FORCE method, has been gaining attention as a fast and robust online training rule. This method introduces a second network (called the teacher reservoir) during training to provide suitable target dynamics to all the hidden units of the task-performing network (called the student network). Thanks to the RLS-based approach, the full-FORCE method can be applied to training continuous-time networks and spiking neural networks. In this study, we propose a generalized version of the full-FORCE method: the partial-FORCE method. In the proposed method, only part of the student network neurons (called supervised neurons) is supervised by only part of the teacher reservoir neurons (called supervising neurons). As a result of this relaxation, the size of the student network and that of the teacher reservoir can differ, which is biologically plausible as a possible model of memory transfer in the brain. Furthermore, we numerically show that the partial-FORCE method converges faster and is more robust against variations in parameter values and initial conditions than the full-FORCE method, without incurring additional computational cost. |
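The RLS-based teacher–student supervision described in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration under assumed settings, not the authors' implementation: the network sizes, the sinusoidal target signal, the Euler integration, and all variable names (`J_D`, `sup_student`, etc.) are our own assumptions. A driven teacher reservoir supplies target recurrent drive, and RLS adjusts only the rows of the student's recurrent matrix belonging to a supervised subset of neurons, matched against a subset of supervising teacher neurons, as the partial-FORCE relaxation allows.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: in partial-FORCE the teacher reservoir and the
# student network may have different sizes.
N_TEACHER, N_STUDENT, N_SUP = 120, 100, 60
T_STEPS, DT, TAU, G = 500, 0.1, 1.0, 1.5

# Target output the task-performing (student) network should learn.
t = np.arange(T_STEPS) * DT
f_out = np.sin(2 * np.pi * 0.1 * t)

# Fixed random recurrent weights for the teacher (J_D); the student's
# recurrent matrix J is what RLS will train.
J_D = G * rng.standard_normal((N_TEACHER, N_TEACHER)) / np.sqrt(N_TEACHER)
u_D = rng.uniform(-1.0, 1.0, N_TEACHER)  # feeds f_out into the teacher
J = G * rng.standard_normal((N_STUDENT, N_STUDENT)) / np.sqrt(N_STUDENT)

# Partial-FORCE relaxation: only a subset of student neurons is
# supervised, by only a subset of teacher neurons.
sup_student = rng.choice(N_STUDENT, N_SUP, replace=False)
sup_teacher = rng.choice(N_TEACHER, N_SUP, replace=False)

P = np.eye(N_STUDENT)  # shared RLS inverse-correlation matrix (alpha = 1)
x_t = 0.1 * rng.standard_normal(N_TEACHER)  # teacher state
x_s = 0.1 * rng.standard_normal(N_STUDENT)  # student state

errs = []
for k in range(T_STEPS):
    r_t, r_s = np.tanh(x_t), np.tanh(x_s)

    # Target recurrent drive supplied by the supervising teacher neurons.
    target = (J_D @ r_t + u_D * f_out[k])[sup_teacher]
    e = J[sup_student] @ r_s - target  # instantaneous training error
    errs.append(np.mean(np.abs(e)))

    # RLS update of the supervised rows of the student's recurrent matrix.
    Pr = P @ r_s
    c = 1.0 / (1.0 + r_s @ Pr)
    P -= c * np.outer(Pr, Pr)
    J[sup_student] -= c * np.outer(e, Pr)

    # Leaky continuous-time dynamics (Euler step): the teacher is driven
    # by the target signal, while the student runs autonomously.
    x_t += DT / TAU * (-x_t + J_D @ r_t + u_D * f_out[k])
    x_s += DT / TAU * (-x_s + J @ r_s)
```

A single shared matrix `P` suffices because every supervised row regresses on the same student rate vector `r_s`; the per-row cost is then one rank-one update, which is why enlarging or shrinking the supervised subset does not change the dominant O(N²) cost per step.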
| Year | DOI | Venue |
|---|---|---|
| 2021 | 10.1109/IJCNN52387.2021.9533964 | 2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN) |
| Keywords | DocType | ISSN |
|---|---|---|
| recurrent neural network, supervised learning, reservoir computing, FORCE learning, full-FORCE | Conference | 2161-4393 |
| Citations | PageRank | References |
|---|---|---|
| 0 | 0.34 | 0 |
| Authors |
|---|
| 2 |

| Name | Order | Citations | PageRank |
|---|---|---|---|
| Hiroto Tamura | 1 | 0 | 1.35 |
| Gouhei Tanaka | 2 | 51 | 11.80 |