Abstract |
---|
Several variants of a stochastic local search process for constructing the synaptic weights of an Ising perceptron are studied. In this process, binary patterns are sequentially presented to the Ising perceptron and are learned as the synaptic weight configuration is modified through a chain of single- or double-weight flips within the weight configuration space compatible with the previously learned patterns. This process reaches a storage capacity of α ≈ 0.63 for pattern length N = 101 and α ≈ 0.41 for N = 1001. If a relearning process is additionally exploited, the learning performance improves further, to a storage capacity of α ≈ 0.80 for N = 101 and α ≈ 0.42 for N = 1001. We find that, for a given learning task, the solutions constructed by the random-walk learning process are separated by a typical Hamming distance, which decreases with the constraint density α of the learning task; at a fixed value of α, the width of the Hamming-distance distribution decreases with N. |
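The single-flip variant described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact procedure: the function name `random_walk_learn`, the stability criterion J·ξ > 0, and the `max_flips` cutoff are all assumptions made for the example.

```python
import random

def stable(J, xi):
    # A pattern xi is taken as learned when the output field J . xi is positive
    # (N is chosen odd in the example, so the field is never exactly zero).
    return sum(j * x for j, x in zip(J, xi)) > 0

def random_walk_learn(patterns, N, max_flips=200000, rng=None):
    """Hypothetical sketch of single-weight-flip random-walk learning:
    patterns are presented sequentially, and each new pattern is learned
    by random single-weight flips that stay inside the configuration
    space compatible with the earlier-learned patterns."""
    rng = rng or random.Random(0)
    J = [rng.choice([-1, 1]) for _ in range(N)]  # binary (Ising) weights
    learned = []
    for xi in patterns:
        flips = 0
        while not stable(J, xi):
            if flips >= max_flips:
                return None  # give up: the walk failed to learn this pattern
            i = rng.randrange(N)
            J[i] = -J[i]  # propose a single-weight flip
            # Reject the flip if it breaks any earlier-learned pattern,
            # i.e. leaves the compatible configuration space.
            if not all(stable(J, mu) for mu in learned):
                J[i] = -J[i]
            flips += 1
        learned.append(xi)
    return J
```

At low constraint density α = P/N the compatible space is large and the walk finds a solution quickly; as α grows toward the capacities quoted in the abstract, the walk needs ever longer chains of flips and eventually fails.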
Year | DOI | Venue |
---|---|---|
2010 | 10.1088/1742-5468/2010/08/P08014 | Journal of Statistical Mechanics: Theory and Experiment |
Keywords | Field | DocType |
---|---|---|
disordered systems (theory), neuronal networks (theory), analysis of algorithms, stochastic search | Discrete mathematics, Combinatorics, Random walk, Quantum mechanics, Analysis of algorithms, Ising model, Hamming distance, Local search (optimization), Perceptron, Synaptic weight, Mathematics, Configuration space | Journal |
Volume | Issue | ISSN |
---|---|---|
abs/1003.1 | 08 | 1742-5468 |
Citations | PageRank | References |
---|---|---|
2 | 0.46 | 2 |
Authors |
---|
2 |
Name | Order | Citations | PageRank |
---|---|---|---|
Haiping Huang | 1 | 5 | 1.95 |
Haijun Zhou | 2 | 76 | 12.53 |