Title
Some Structural Complexity Aspects of Neural Computation
Abstract
Recent work by Siegelmann and Sontag has demonstrated that polynomial time on linear saturated recurrent neural networks equals polynomial time on standard computational models: Turing machines if the weights of the net are rationals, and nonuniform circuits if the weights are reals. Here we develop further connections between the languages recognized by such neural nets and other complexity classes. We present connections to space-bounded classes, simulation of parallel computational models such as Vector Machines, and a discussion of the characterizations of various nonuniform classes in terms of Kolmogorov complexity.
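The "linear saturated" nets of the abstract use the saturated-linear activation, which is the identity on [0, 1] and clipped outside it. The following is a minimal illustrative sketch of one synchronous update step of such a net, assuming the standard update x⁺ = σ(Ax + bu + c); the function names and the tiny example net are hypothetical, not taken from the paper.

```python
# Illustrative sketch of a saturated-linear recurrent net update step
# (hypothetical names and dimensions; not the paper's construction).

def sigma(x):
    """Saturated-linear activation: identity on [0, 1], clipped outside."""
    return min(1.0, max(0.0, x))

def step(A, b, c, x, u):
    """One synchronous update: x+ = sigma(A x + b u + c).

    A: n x n weight matrix (list of lists), b: length-n input weights,
    c: length-n biases, x: current state, u: scalar input symbol.
    """
    n = len(x)
    return [sigma(sum(A[i][j] * x[j] for j in range(n)) + b[i] * u + c[i])
            for i in range(n)]

# Tiny example: a one-neuron net that accumulates its input and saturates at 1.
A = [[1.0]]
b = [1.0]
c = [0.0]
x = [0.0]
for u in (0.5, 0.5, 0.5):
    x = step(A, b, c, x, u)
# x[0] reaches the saturation value 1.0 after the second step and stays there
```

With rational weights such a net simulates a Turing machine in polynomial time; with real weights it corresponds to nonuniform circuit classes, which is the dichotomy the abstract refers to.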
Year
1993
DOI
10.1109/SCT.1993.336521
Venue
Structure in Complexity Theory Conference
Keywords
Turing machines,computational complexity,parallel algorithms,recurrent neural nets,Kolmogorov complexity,Vector Machines,complexity classes,linear saturated recurrent neural networks,neural computation,nonuniform circuits,parallel computational models,polynomial time,space-bounded classes,structural complexity
Field
Quantum complexity theory,PH,Complexity class,Structural complexity theory,Kolmogorov complexity,Computer science,Algorithm,Theoretical computer science,Descriptive complexity theory,Time complexity,Computational complexity theory
DocType
Conference
Citations
11
PageRank
0.99
References
5
Authors
4
Name (Order): Citations/PageRank
José L. Balcázar (1): 70162.06
Ricard Gavaldà (2): 126581.30
Hava T. Siegelmann (3): 980145.09
Eduardo D. Sontag (4): 3134781.88