Title: A new extension of self-optimizing neural networks for topology optimization
Abstract
The paper introduces a new extension of the ontogenic Self-Optimizing Neural Networks (SONNs) [4] that makes it possible to optimize a neural network (NN) topology for a whole training data (TD) set at once. The classical SONNs optimize a NN topology only for the subnetworks related to the trained classes; the described extension optimizes the topology for all classes simultaneously. Moreover, this extension makes it possible to compute a minimal SONN topology for a given TD set, which can sometimes be insufficient for good generalization. The SONN extension computes better discrimination coefficients and automatically develops a topology that reflects all well-discriminating data features, in order to achieve good generalization. Furthermore, the extension can automatically reduce the input dimension of any TD set and can automatically recognize and correctly classify inverted inputs (especially important for image classification). All extended SONN computations are fully automatic and deterministic; no user-supplied parameters are required. The SONNs are free from many training problems, e.g. initialization, convergence, and overfitting. The extended SONNs can also be used for unsupervised training.
Year: 2005
DOI: 10.1007/11550822_65
Venue: ICANN (1)
Keywords: whole training data, classical SONNs, neural network, topology optimization, new extension, NN topology, extended SONNs, training problems, unsupervised training, minimal SONN topology, extended SONN computation, SONN extension, image classification
Field: Convergence (routing), Computer science, Algorithm, Image processing, Unsupervised learning, Artificial intelligence, Topology optimization, Overfitting, Contextual image classification, Artificial neural network, Machine learning, Computation
DocType: Conference
Volume: 3696
ISSN: 0302-9743
ISBN: 3-540-28752-3
Citations: 4
PageRank: 0.66
References: 1
Authors: 1
Name: Adrian Horzyk
Order: 1
Citations: 53
PageRank: 12.76