Title
Neural network tomography: Network replication from output surface geometry
Abstract
Multilayer perceptron networks whose outputs are affine combinations of tanh hidden units are universal function approximators and are commonly used for regression, typically trained by minimizing the mean squared error (MSE) with backpropagation. We present a neural network weight-learning algorithm that directly positions the hidden units within the input space by numerically analyzing the curvature of the output surface. Our results show that, under certain sampling requirements, this method can reliably recover the parameters of a neural network used to generate a data set.
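A minimal sketch of the baseline setup the abstract refers to, assuming a single hidden layer: an MLP whose output is an affine combination of tanh hidden units, fit by minimizing the MSE with backpropagation (plain batch gradient descent). This is not the paper's curvature-based tomography algorithm; the function names, learning rate, and network sizes below are illustrative assumptions.

import numpy as np

# Sketch only (not the paper's method): one-hidden-layer MLP with tanh hidden
# units and an affine output, trained by batch gradient descent on the MSE.

rng = np.random.default_rng(0)

def init_params(n_in, n_hidden):
    """Small random weights; W1/b1 map inputs to hidden units, w2/b2 form the affine output."""
    return {
        "W1": 0.5 * rng.standard_normal((n_hidden, n_in)),
        "b1": np.zeros(n_hidden),
        "w2": 0.5 * rng.standard_normal(n_hidden),
        "b2": 0.0,
    }

def forward(params, X):
    """X has shape (n_samples, n_in); returns hidden activations and scalar outputs."""
    H = np.tanh(X @ params["W1"].T + params["b1"])   # (n, n_hidden)
    y = H @ params["w2"] + params["b2"]              # (n,)
    return H, y

def mse_step(params, X, t, lr=0.05):
    """One gradient-descent step on the MSE; returns the current loss."""
    n = X.shape[0]
    H, y = forward(params, X)
    err = y - t
    loss = np.mean(err ** 2)

    # Backpropagated gradients of the MSE.
    grad_y = 2.0 * err / n                           # dL/dy
    grad_w2 = H.T @ grad_y
    grad_b2 = grad_y.sum()
    grad_H = np.outer(grad_y, params["w2"])          # (n, n_hidden)
    grad_pre = grad_H * (1.0 - H ** 2)               # tanh' = 1 - tanh^2
    grad_W1 = grad_pre.T @ X
    grad_b1 = grad_pre.sum(axis=0)

    params["W1"] -= lr * grad_W1
    params["b1"] -= lr * grad_b1
    params["w2"] -= lr * grad_w2
    params["b2"] -= lr * grad_b2
    return loss

# Toy usage: fit a two-unit "student" to data generated by a two-unit tanh "teacher".
X = rng.uniform(-2.0, 2.0, size=(256, 1))
t = 1.5 * np.tanh(2.0 * X[:, 0] - 0.5) - 0.8 * np.tanh(-1.0 * X[:, 0] + 1.0)

params = init_params(n_in=1, n_hidden=2)
for step in range(5000):
    loss = mse_step(params, X, t)
print(f"final MSE: {loss:.6f}")

The toy usage mirrors, in spirit, the parameter-recovery setting described in the abstract: the training targets are produced by a known tanh network, and the student is fit to that data, albeit here by standard MSE minimization rather than by analyzing output-surface curvature.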
Year
2011
DOI
10.1016/j.neunet.2011.01.006
Venue
Neural Networks
Keywords
function approximation, supervised learning, regression, backpropagation, neural network, activation function, multilayer perceptron
Field
Feedforward neural network, Activation function, Algorithm, Probabilistic neural network, Time delay neural network, Multilayer perceptron, Artificial intelligence, Echo state network, Backpropagation, Artificial neural network, Machine learning, Mathematics
DocType
Journal
Volume
24
Issue
5
ISSN
0893-6080
Citations
4
PageRank
0.46
References
11
Authors
4
Name                   Order  Citations  PageRank
Rupert C. J. Minnett   1      4          0.46
Andrew T. Smith        2      60         9.22
William C. Lennon Jr.  3      4          0.46
Robert Hecht-Nielsen   4      448        110.50