Abstract
---
Recent work by Siegelmann and Sontag has demonstrated that polynomial time on linear saturated recurrent neural networks equals polynomial time on standard computational models: Turing machines if the weights of the net are rationals, and nonuniform circuits if the weights are reals. Here we develop further connections between the languages recognized by such neural nets and other complexity classes. We present connections to space-bounded classes, simulation of parallel computational models such as Vector Machines, and a discussion of the characterizations of various nonuniform classes in terms of Kolmogorov complexity.
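The networks in the abstract apply a saturated-linear (clipped identity) activation at each recurrent step. As a minimal sketch of that update rule, assuming the standard formulation x(t+1) = σ(W·x(t) + V·u(t) + c) with σ clipping to [0, 1] (the names `sat`, `step`, `W`, `V`, `c` are illustrative, not from the paper):

```python
import numpy as np

def sat(x):
    """Saturated-linear activation: identity on [0, 1], clipped outside."""
    return np.clip(x, 0.0, 1.0)

def step(x, u, W, V, c):
    """One recurrent update: affine combination of state x and input u,
    passed through the saturation."""
    return sat(W @ x + V @ u + c)

# Tiny example: two state units, one input line.
x = np.array([0.2, 0.9])
u = np.array([1.0])
W = np.array([[0.5, 0.0],
              [0.0, 2.0]])
V = np.zeros((2, 1))
c = np.zeros(2)
x_next = step(x, u, W, V, c)  # second unit saturates at 1.0
```

With rational entries in `W`, `V`, `c` such a net matches Turing machines in polynomial time; with real entries it matches nonuniform circuits, per the Siegelmann–Sontag result the abstract cites.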
Year | DOI | Venue |
---|---|---|
1993 | 10.1109/SCT.1993.336521 | Structure in Complexity Theory Conference |
Keywords | Field | DocType |
Turing machines,computational complexity,parallel algorithms,recurrent neural nets,Kolmogorov complexity,Vector Machines,complexity classes,linear saturated recurrent neural networks,neural computation,nonuniform circuits,parallel computational models,polynomial time,space-bounded classes,structural complexity | Quantum complexity theory,PH,Complexity class,Structural complexity theory,Kolmogorov complexity,Computer science,Algorithm,Theoretical computer science,Descriptive complexity theory,Time complexity,Computational complexity theory | Conference
Citations | PageRank | References |
11 | 0.99 | 5 |
Authors
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
José L. Balcázar | 1 | 701 | 62.06 |
Ricard Gavaldà | 2 | 1265 | 81.30 |
Hava T. Siegelmann | 3 | 980 | 145.09 |
Eduardo D. Sontag | 4 | 3134 | 781.88 |