Title
Evolving and merging Hebbian learning rules: increasing generalization by decreasing the number of rules
Abstract
Generalization to out-of-distribution (OOD) circumstances after training remains a challenge for artificial agents. To improve the robustness of plastic Hebbian neural networks, we evolve a set of Hebbian learning rules in which multiple connections are assigned to a single rule. Inspired by the biological phenomenon of the genomic bottleneck, we show that allowing multiple connections in the network to share the same local learning rule drastically reduces the number of trainable parameters while producing a more robust agent. By iteratively applying simple K-Means clustering to combine rules during evolution, our Evolve & Merge approach reduces the number of trainable parameters from 61,440 to 1,920 while simultaneously improving robustness, all without increasing the number of generations used. While the agents are optimized on a standard quadruped robot morphology, we evaluate their performance under slight morphology modifications on a total of 30 unseen morphologies. Our results add to the discussion on generalization, overfitting and OOD adaptation. To create agents that can adapt to a wider array of unexpected situations, Hebbian learning combined with a regularising "genomic bottleneck" could be a promising research direction.
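The merging step summarized above can be sketched in a few lines: per-connection Hebbian rule vectors are clustered with plain K-Means, and all connections in a cluster share the centroid rule. This is an illustrative sketch only; the decomposition into 12,288 connections with five coefficients per rule (consistent with the stated 61,440 → 1,920 parameter counts) and the random stand-in rule vectors are assumptions, not details taken from the paper.

```python
import numpy as np

def merge_rules(rule_params, n_clusters, n_iters=10, seed=0):
    """Merge per-connection Hebbian rule vectors into n_clusters shared
    rules via plain K-Means; returns (labels, centroids)."""
    rng = np.random.default_rng(seed)
    # Initialize centroids by sampling existing rule vectors.
    centroids = rule_params[
        rng.choice(len(rule_params), n_clusters, replace=False)
    ].copy()
    for _ in range(n_iters):
        # Squared Euclidean distances, shape (connections, clusters).
        d = ((rule_params ** 2).sum(1)[:, None]
             - 2.0 * rule_params @ centroids.T
             + (centroids ** 2).sum(1)[None, :])
        labels = d.argmin(axis=1)
        # Recompute each centroid as its cluster mean (skip empty clusters).
        for k in range(n_clusters):
            members = rule_params[labels == k]
            if len(members):
                centroids[k] = members.mean(axis=0)
    return labels, centroids

# Hypothetical setup matching the abstract's parameter counts:
# 12,288 connections x 5 rule coefficients = 61,440 parameters,
# merged down to 384 shared rules x 5 = 1,920 parameters.
rules = np.random.default_rng(1).normal(size=(12288, 5))
labels, shared = merge_rules(rules, n_clusters=384, n_iters=5)
```

After merging, the genome only needs to encode the 384 centroid rules plus the connection-to-rule assignment, which is the "genomic bottleneck" regularization the abstract refers to.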
Year
2021
DOI
10.1145/3449639.3459317
Venue
Genetic and Evolutionary Computation Conference
Keywords
Plastic neural networks, local learning, indirect encoding, generalization
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
2
Name                      Order  Citations  PageRank
Joachim Winther Pedersen  1      0          0.68
Sebastian Risi            2      460        54.67