Title
Origin of the computational hardness for learning with binary synapses.
Abstract
Through supervised learning in a binary perceptron, one can classify an extensive number of random patterns by a proper assignment of binary synaptic weights. However, finding such an assignment in practice is a highly nontrivial task. The relation between the structure of the weight space and this algorithmic hardness is not yet fully understood. To address this, we analytically derive the Franz-Parisi potential for the binary perceptron problem, starting from an equilibrium solution of the weights and exploring the structure of the weight space around it. Our result reveals the geometrical organization of the weight space: it is composed of isolated solutions rather than clusters of exponentially many nearby solutions. These point-like clusters, lying far apart from each other in the weight space, explain the previously observed glassy behavior of stochastic local search heuristics.
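For orientation, the following is a minimal sketch of the Franz-Parisi construction specialized to the binary perceptron, assuming the standard setup of N binary weights w_i ∈ {-1,+1} and P = αN random input-output pairs (ξ^μ, σ^μ); the notation and normalization are illustrative and may differ from the paper's own conventions.

\[
Z(\{\xi,\sigma\}) = \sum_{\mathbf{w}\in\{-1,+1\}^{N}} \prod_{\mu=1}^{P}
\Theta\!\left(\frac{\sigma^{\mu}}{\sqrt{N}} \sum_{i=1}^{N} w_{i}\,\xi_{i}^{\mu}\right),
\qquad
V(q) = -\frac{1}{N}\,\mathbb{E}_{\xi,\sigma}\!\left\langle
\ln \sum_{\mathbf{w}\in\{-1,+1\}^{N}} \prod_{\mu=1}^{P}
\Theta\!\left(\frac{\sigma^{\mu}}{\sqrt{N}} \sum_{i=1}^{N} w_{i}\,\xi_{i}^{\mu}\right)
\delta\!\left(\sum_{i=1}^{N} w_{i}\,\tilde{w}_{i} - Nq\right)
\right\rangle_{\tilde{\mathbf{w}}}.
\]

Here Θ is the Heaviside step function, Z counts the weight assignments that classify all P patterns, the reference solution \tilde{\mathbf{w}} is drawn from the flat equilibrium measure over solutions, and V(q) is, up to sign, the entropy (log-number per weight) of solutions constrained to lie at overlap q with that reference. In this language, the isolated-solution scenario described above corresponds to this constrained entropy being non-positive at all overlaps q just below 1.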
Year
2014
DOI
10.1103/PhysRevE.90.052813
Venue
PHYSICAL REVIEW E
DocType
Journal
Volume
90
Issue
5
ISSN
1539-3755
Citations
3
PageRank
0.47
References
0
Authors
2
Name                  Order  Citations  PageRank
Haiping Huang         1      5          1.95
Yoshiyuki Kabashima   2      136        27.83