Title
Hebbian semi-supervised learning in a sample efficiency setting
Abstract
We propose to address the issue of sample efficiency in Deep Convolutional Neural Networks (DCNNs) with a semi-supervised training strategy that combines Hebbian learning with gradient descent: all internal layers (both convolutional and fully connected) are pre-trained using an unsupervised approach based on Hebbian learning, and the last fully connected layer (the classification layer) is trained using Stochastic Gradient Descent (SGD). Since Hebbian learning is an unsupervised method, its potential lies in the possibility of training the internal layers of a DCNN without labels; only the final fully connected layer has to be trained with labeled examples.
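To make the two-phase strategy concrete, here is a minimal sketch in PyTorch. Oja's rule is assumed as the Hebbian update and the internal layer is a single fully connected one; the toy data, layer sizes, and learning rates are illustrative assumptions, not the configuration used in the paper.

    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    # Toy stand-in data: flattened inputs, 10 classes; only a small subset is labeled.
    x_unlabeled = torch.randn(256, 784)
    x_labeled = torch.randn(32, 784)
    y_labeled = torch.randint(0, 10, (32,))

    # One internal layer (convolutional in the paper's DCNNs; fully connected here
    # to keep the sketch short) plus the final classification layer.
    hidden = nn.Linear(784, 64, bias=False)
    classifier = nn.Linear(64, 10)

    # Phase 1: unsupervised Hebbian pre-training of the internal layer, no labels.
    # Oja's rule is assumed: dW = lr * (y x^T - diag(y^2) W); the second term
    # keeps the weight norms bounded.
    lr_hebb = 0.01
    with torch.no_grad():
        for _ in range(5):
            for x in x_unlabeled:
                y = hidden(x)  # post-synaptic activations
                hidden.weight += lr_hebb * (
                    torch.outer(y, x) - (y ** 2).unsqueeze(1) * hidden.weight
                )

    # Phase 2: supervised SGD on the classification layer only, using the labels.
    hidden.weight.requires_grad_(False)  # Hebbian features stay frozen
    opt = torch.optim.SGD(classifier.parameters(), lr=0.1)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(20):
        feats = torch.relu(hidden(x_labeled))  # features from the pre-trained layer
        loss = loss_fn(classifier(feats), y_labeled)
        opt.zero_grad()
        loss.backward()
        opt.step()

Because phase 1 consumes only unlabeled data, the labeled set needed for phase 2 can stay small, which is the sample-efficiency benefit the abstract describes.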
Year
2021
DOI
10.1016/j.neunet.2021.08.003
Venue
Neural Networks
Keywords
Convolutional Neural Networks, Computer vision, Semi-supervised learning, Hebbian learning, Sample efficiency
DocType
Journal
Volume
143
Issue
1
ISSN
0893-6080
Citations
1
PageRank
0.41
References
0
Authors
4
Name              Order  Citations  PageRank
Gabriele Lagani   1      1          1.76
Fabrizio Falchi   2      459        55.65
Claudio Gennaro   3      490        57.23
Giuseppe Amato    4      505        106.68