Abstract |
---|
Coding neural network simulators by hand is often a tedious and error-prone task. In this paper, we seek to remedy this situation by presenting a code generator that produces efficient C++ simulation code for a wide variety of backpropagation networks. We define a high-level, Maple-like language that allows the specification of such networks. This language is compiled to C++ code segments that are then linked with pre-existing generic code for backpropagation networks to form an executable simulator. Our generator allows the specification of arbitrary network topologies (with the restriction of full connections between layers) and weight-change formulae, while the activation rule and error propagation rule remain fixed. With this tool, future research on learning rules for backpropagation networks can be made more efficient by eliminating routine work and producing code that is guaranteed to be error-free. |
Year | DOI | Venue |
---|---|---|
1993 | 10.1007/3-540-56798-4_173 | IWANN |
Keywords | Field | DocType |
---|---|---|
neural network simulation, automatic generation, backpropagation, network topology, neural network, error propagation, code generation | Computer science, Network topology, Probabilistic neural network, Theoretical computer science, Learning rule, Time delay neural network, Artificial intelligence, Artificial neural network, Backpropagation, Code (cryptography), Machine learning, Executable | Conference |
ISBN | Citations | PageRank |
---|---|---|
3-540-56798-4 | 0 | 0.34 |
References | Authors |
---|---|
6 | 2 |
Name | Order | Citations | PageRank |
---|---|---|---|
Stephan Dreiseitl | 1 | 338 | 34.80 |
Dongming Wang | 2 | 458 | 55.77 |