Title
Some Issues of the Paradigm of Multi-Learning Machine-Modular Neural Networks
Abstract
This paper addresses several issues in the weighted linear integration of modular neural networks (MNN), a paradigm of hybrid multi-learning machines. First, from the general perspective of variable-weight and variable-element synthesis, three basic kinds of integrated models are discussed: intrinsic-factors-determined, extrinsic-factors-determined, and hybrid-factors-determined. The authors point out that integrations dominated by both internal and external factors correlate strongly not only with the historical quality of the sub-networks but also with the environment in which the information is processed. In the sense of mean squared error (MSE), several sufficient conditions are given under which deleting one or more sub-networks from the whole population improves the system's overall performance. Conversely, when the overall performance of the current MNN system is unsatisfactory, an improvement strategy that adds one or more sub-networks is presented. For the optimal weight vector under the framework of a weighted sum of the sub-networks' outputs, the authors point out that some constraint forms on the sub-networks' integration weights are unreasonable, present a general form, and briefly describe the corresponding computational algorithms. The authors also present a new sub-network training algorithm named "Expert in one thing and good at many" (EOGM), in which every sub-network is trained on a primary dataset with some of its near neighbors as auxiliary datasets. Simulation results with a dynamic integration method show the effectiveness of these algorithms; the algorithm with EOGM performs better than the same algorithm with a common training method.
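As a rough illustration of the weighted-sum integration framework summarized above, the following minimal Python sketch computes MSE-optimal integration weights under a simple sum-to-one constraint and then empirically checks which sub-network deletions lower the MSE of a plain equal-weight ensemble. The sum-to-one constraint, the helper names (`optimal_weights`, `ensemble_mse`, `improving_deletions`), and the purely empirical deletion check are illustrative assumptions, not the authors' exact formulation or sufficient conditions.

```python
# Illustrative sketch only (not the authors' formulation): weighted linear
# integration of sub-network outputs. It computes MSE-optimal integration
# weights under a sum-to-one constraint and empirically checks which
# sub-network deletions lower the MSE of a plain equal-weight ensemble.
import numpy as np


def optimal_weights(F, y):
    """MSE-optimal weights w for the ensemble output F @ w, subject to
    the equality constraint sum(w) == 1, solved via the KKT linear system.

    F : (n_samples, n_subnets) matrix of sub-network outputs
    y : (n_samples,) target vector
    """
    n = F.shape[1]
    kkt = np.zeros((n + 1, n + 1))
    kkt[:n, :n] = 2.0 * F.T @ F          # gradient of the squared error
    kkt[:n, n] = 1.0                     # Lagrange multiplier column
    kkt[n, :n] = 1.0                     # sum-to-one constraint row
    rhs = np.concatenate([2.0 * F.T @ y, [1.0]])
    return np.linalg.solve(kkt, rhs)[:n]  # drop the multiplier


def ensemble_mse(F, y, w):
    """Mean squared error of the weighted-sum ensemble."""
    residual = F @ w - y
    return float(np.mean(residual ** 2))


def improving_deletions(F, y):
    """Indices whose deletion lowers the MSE of the equal-weight ensemble."""
    n = F.shape[1]
    base = ensemble_mse(F, y, np.full(n, 1.0 / n))
    better = []
    for i in range(n):
        reduced = np.delete(F, i, axis=1)
        if ensemble_mse(reduced, y, np.full(n - 1, 1.0 / (n - 1))) < base:
            better.append(i)
    return better


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y = np.sin(np.linspace(0.0, 3.0, 200))
    # Simulated outputs of three sub-networks; the third is much noisier.
    F = np.column_stack([y + 0.05 * rng.standard_normal(200),
                         y + 0.10 * rng.standard_normal(200),
                         y + 0.50 * rng.standard_normal(200)])
    w = optimal_weights(F, y)
    print("optimal weights:", np.round(w, 3))
    print("ensemble MSE   :", round(ensemble_mse(F, y, w), 5))
    print("deletions that help the equal-weight ensemble:", improving_deletions(F, y))
```

Here the equality-constrained least-squares problem is handled with a small KKT system; other constraint forms (for example, nonnegative weights) would instead require a quadratic-programming solver.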
Year
2009
DOI
10.1080/15501320802540058
Venue
International Journal of Distributed Sensor Networks
Keywords
general form, optimal weights vector, current mnn system, corresponding computational algorithm, multi-learning machine-modular neural networks, common training method, corresponding improved strategy, new training algorithm, general meaning, dynamic integration method, whole performance
DocType
Journal
Volume
5
Issue
1
ISSN
1550-1477
Citations
1
PageRank
0.36
References
0
Authors
3
Name	Order	Citations	PageRank
Pan Wang	1	87	6.95
Shuai Feng	2	45	2.59
Zhun Fan	3	106	13.81