Title |
---|
Discrete symbolic optimization and Boltzmann sampling by continuous neural dynamics: Gradient Symbolic Computation |
Abstract |
---|
Gradient Symbolic Computation (GSC) is proposed as a means of solving discrete global optimization problems using a neurally plausible continuous stochastic dynamical system. Gradient symbolic dynamics involves two free parameters that must be adjusted as a function of time to obtain the global maximizer at the end of the computation. We summarize what is known about the GSC dynamics for special settings of the parameters, and establish that there is a schedule for the two parameters under which convergence to the correct answer occurs with high probability. These results put the empirical results already obtained for GSC on a sound theoretical footing. |
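The abstract's core idea, a continuous stochastic dynamical system whose two time-varying parameters are annealed so that the state commits to a discrete optimum, can be illustrated with a toy sketch. This is not the paper's actual GSC equations: the harmony function `H`, the quantization penalty, and the schedules `q(t)` and `T(t)` below are all illustrative assumptions, showing only the general pattern of rising commitment strength and falling noise temperature.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "harmony" to maximize over binary vectors x in {0,1}^n:
# H(x) = 0.5 * x @ W @ x + b @ x  (W, b are arbitrary illustrative data).
n = 6
W = rng.standard_normal((n, n))
W = (W + W.T) / 2.0
np.fill_diagonal(W, 0.0)
b = rng.standard_normal(n)

def harmony(x):
    return 0.5 * x @ W @ x + b @ x

def quantization_grad(x, q):
    # Ascent-direction gradient of the penalty -q * sum(x^2 * (1-x)^2),
    # which pushes each coordinate toward the discrete values {0, 1}.
    return -q * 2.0 * x * (1.0 - x) * (1.0 - 2.0 * x)

# Annealed Langevin-style dynamics: the commitment strength q(t) grows
# while the noise temperature T(t) shrinks (assumed schedules).
x = np.full(n, 0.5)
dt = 0.01
for t in range(20000):
    q = 0.001 * t                 # increasing quantization pressure
    T = 2.0 / (1.0 + 0.01 * t)    # decreasing noise temperature
    grad = W @ x + b + quantization_grad(x, q)
    x = x + dt * grad + np.sqrt(2.0 * T * dt) * rng.standard_normal(n)
    x = np.clip(x, -0.2, 1.2)     # keep the relaxation in a bounded box

# Read off the discrete answer the dynamics has committed to.
x_discrete = (x > 0.5).astype(float)
print(x_discrete, harmony(x_discrete))
```

At early times the high temperature lets the state explore; as `q` rises and `T` falls, the quantization penalty dominates and the continuous state is driven into a corner of the hypercube, which is then rounded to a binary vector. The paper's contribution, on this reading, is proving that a suitable joint schedule makes this endpoint the global maximizer with high probability.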
Year | Venue | Field
---|---|---
2018 | arXiv: Computation and Language | Convergence (routing), Symbolic dynamics, Applied mathematics, Computer science, Symbolic computation, Sampling (statistics), Artificial intelligence, Boltzmann constant, Dynamical system, Machine learning, Computation, Free parameter
DocType | Volume | Citations
---|---|---
Journal | abs/1801.03562 | 0

PageRank | References | Authors
---|---|---
0.34 | 0 | 3
Name | Order | Citations | PageRank
---|---|---|---
Paul Tupper | 1 | 0 | 0.68
Paul Smolensky | 2 | 215 | 93.76
Pyeong Whan Cho | 3 | 0 | 0.34