Abstract |
---|
We present results of computational experiments with an extension of the Perceptron algorithm by a special type of simulated annealing. The simulated annealing procedure employs a logarithmic cooling schedule c(k) = Γ/ln(k+2), where Γ is a parameter that depends on the underlying configuration space. For sample sets S of n-dimensional vectors generated by randomly chosen polynomial threshold functions w1·x1^a1 + ··· + wn·xn^an ⩾ ϑ, we try to approximate the positive and negative examples by linear threshold functions. The approximations are computed by both the classical Perceptron algorithm and our extension with logarithmic cooling schedules. For n = 256, …, 1024 and ai = 3, …, 7, the extension outperforms the classical Perceptron algorithm by about 15% when the sample size is sufficiently large. The parameter Γ was chosen according to estimations of the maximum escape depth from local minima of the associated energy landscape. |
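The cooling schedule c(k) = Γ/ln(k+2) can be illustrated with a generic simulated-annealing search over integer weight vectors, using the number of misclassified examples as the energy. This is a minimal sketch under stated assumptions, not the authors' exact extension: the proposal move (perturbing one coordinate by ±1), the energy function, and the value of Γ are placeholder choices for illustration only.

```python
import math
import random

def cooling(k, gamma):
    """Logarithmic cooling schedule c(k) = gamma / ln(k + 2) from the abstract."""
    return gamma / math.log(k + 2)

def energy(w, theta, samples, labels):
    """Number of misclassified examples under the linear threshold w·x >= theta."""
    bad = 0
    for x, y in zip(samples, labels):
        s = sum(wj * xj for wj, xj in zip(w, x)) - theta
        pred = 1 if s >= 0 else -1
        if pred != y:
            bad += 1
    return bad

def anneal_threshold(samples, labels, gamma=0.5, steps=5000, seed=0):
    """Metropolis-style annealing of a linear threshold function
    (illustrative sketch; move set and Γ are assumptions)."""
    rng = random.Random(seed)
    n = len(samples[0])
    w, theta = [0.0] * n, 0.0
    e = energy(w, theta, samples, labels)
    for k in range(steps):
        # propose a ±1 perturbation of one weight coordinate or the threshold
        j = rng.randrange(n + 1)
        delta = rng.choice((-1.0, 1.0))
        if j < n:
            w[j] += delta
        else:
            theta += delta
        e_new = energy(w, theta, samples, labels)
        d = e_new - e
        # accept improvements always; accept worse states with
        # probability exp(-d / c(k)), which shrinks as c(k) cools
        if d <= 0 or rng.random() < math.exp(-d / cooling(k, gamma)):
            e = e_new
        else:
            # revert the rejected move
            if j < n:
                w[j] -= delta
            else:
                theta -= delta
    return w, theta, e
```

Because c(k) decays only logarithmically, worse states remain acceptable for a long time, which is what allows escapes from local minima of the energy landscape; the appropriate Γ depends on the maximum escape depth, as the abstract notes.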
Year | DOI | Venue |
---|---|---|
2001 | 10.1023/A:1011369322571 | Neural Processing Letters |
Keywords | Field | DocType
---|---|---|
cooling schedules, neural networks, perceptron algorithm, simulated annealing, threshold functions | Simulated annealing, Polynomial, Maxima and minima, Adaptive simulated annealing, Artificial intelligence, Logarithm, Estimation theory, Perceptron, Mathematics, Machine learning, Configuration space | Journal
Volume | Issue | ISSN
---|---|---|
14 | 1 | 1573-773X
Citations | PageRank | References
---|---|---|
6 | 0.55 | 13
Authors |
---|
2 |
Name | Order | Citations | PageRank |
---|---|---|---|
Andreas A Albrecht | 1 | 164 | 31.44 |
C. K. Wong | 2 | 1459 | 513.44 |