Title
Stochastic Neuromorphic Learning Machines For Weakly Labeled Data
Abstract
On learning tasks where humans typically outperform computers, neuromorphic learning machines offer potential advantages in power and complexity over mainstream technologies. Here, we present Synaptic Sampling Machines (S2M), a class of stochastic neural networks that use stochasticity at the connections (synapses) to implement energy-efficient semi-supervised and unsupervised learning for weakly labeled or unlabeled data. Stochastic synapses play the dual role of a regularizer during learning and a mechanism for implementing stochasticity in neural networks. We present an S2M network architecture that is well suited for a dedicated digital implementation and is potentially a hundredfold more energy efficient than equivalent algorithms running on GPUs.
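To illustrate the core mechanism the abstract describes, the sketch below applies Bernoulli "blank-out" noise to a synaptic weight matrix on every forward pass, so the same stochastic connections act both as a regularizer and as the network's source of stochasticity. This is a minimal sketch under stated assumptions, not the paper's implementation: the function name, the blank-out probability `p_blankout`, the sigmoid units, and the use of NumPy are all illustrative choices.

```python
import numpy as np

def stochastic_synapse_forward(x, W, p_blankout=0.5, rng=None):
    """Forward pass through one layer with stochastic ("blank-out") synapses.

    Each synapse is independently silenced with probability p_blankout on
    every presentation, so repeated passes over the same input yield
    different activations: the synaptic noise itself makes the network
    stochastic while also regularizing learning (similar in spirit to
    DropConnect). Names and defaults here are illustrative assumptions.
    """
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(W.shape) >= p_blankout      # 1 = synapse transmits
    pre_activation = x @ (W * mask)               # masked synaptic weights
    return 1.0 / (1.0 + np.exp(-pre_activation))  # sigmoid unit activations

# Example: two passes over the same input give different stochastic outputs.
rng = np.random.default_rng(0)
x = rng.random((1, 8))          # one input pattern, 8 visible units
W = rng.normal(size=(8, 4))     # 8x4 synaptic weight matrix
print(stochastic_synapse_forward(x, W, rng=rng))
print(stochastic_synapse_forward(x, W, rng=rng))
```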
Year
2016
Venue
PROCEEDINGS OF THE 34TH IEEE INTERNATIONAL CONFERENCE ON COMPUTER DESIGN (ICCD)
Field
Online machine learning, Competitive learning, Semi-supervised learning, Instance-based learning, Computer science, Stochastic neural network, Neuromorphic engineering, Unsupervised learning, Artificial intelligence, Artificial neural network, Machine learning
DocType
Conference
ISSN
1063-6404
Citations
2
PageRank
0.35
References
7
Authors
1
Name
Emre Neftci
Order
1
Citations
183
PageRank
17.52