Title: An Online Self-Constructive Normalized Gaussian Network With Localized Forgetting
Abstract:
In this paper, we introduce a self-constructive Normalized Gaussian Network (NGnet) for online learning tasks. In online tasks, data samples arrive sequentially and domain knowledge is often limited, so the NGnet requires learning methods that perform robustly and dynamically select an appropriate model size. We revise a previously proposed localized forgetting approach for the NGnet and adapt unit manipulation mechanisms to it for dynamic model selection. These mechanisms are made more robust in environments prone to negative interference, and a new merge manipulation is introduced to deal with model redundancy. The effectiveness of the proposed method is compared with the previous localized forgetting approach and an established learning method for the NGnet in several experiments on a function approximation task and a chaotic time series forecasting task. The proposed approach shows robust and favorable performance across all testbeds and learning situations.
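To make the model class concrete, the following is a minimal sketch of an NGnet forward pass: each unit is a Gaussian paired with a local linear model, and the Gaussian activations are normalized so the units softly partition the input space. All names and the isotropic-width simplification are illustrative assumptions, not the paper's implementation (the paper's contribution, localized forgetting with unit split/merge manipulations, is not shown here).

```python
import numpy as np

class NGnet:
    """Minimal Normalized Gaussian Network sketch.

    Illustrative only: isotropic Gaussian units and a batch-free
    predict(); the paper's online EM updates, localized forgetting,
    and unit manipulations are not implemented here.
    """

    def __init__(self, centers, widths, weights, biases):
        self.mu = np.asarray(centers, dtype=float)    # (M, d) Gaussian centers
        self.sigma = np.asarray(widths, dtype=float)  # (M,)   isotropic widths
        self.W = np.asarray(weights, dtype=float)     # (M, d) linear weights per unit
        self.b = np.asarray(biases, dtype=float)      # (M,)   biases per unit

    def predict(self, x):
        x = np.asarray(x, dtype=float)
        # Unnormalized Gaussian activation of each unit at input x
        d2 = np.sum((self.mu - x) ** 2, axis=1)
        g = np.exp(-d2 / (2.0 * self.sigma ** 2))
        # Normalization: activations sum to one, so the units
        # softly partition the input space
        n = g / np.sum(g)
        # Output is the normalized mixture of the local linear models
        return float(np.sum(n * (self.W @ x + self.b)))
```

Because the normalized activations sum to one, the network output is a convex combination of the local linear predictions, which is what makes adding, removing, or merging units a localized change to the overall function.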
Year: 2017
DOI: 10.1587/transfun.E100.A.865
Venue: IEICE TRANSACTIONS ON FUNDAMENTALS OF ELECTRONICS COMMUNICATIONS AND COMPUTER SCIENCES
Keywords: Normalized Gaussian Networks, dynamic model selection, online learning, chaotic time series forecasting
Field: Online learning, Forgetting, Normalization (statistics), Constructive, Computer science, Theoretical computer science, Gaussian, Artificial intelligence
DocType: Journal
Volume: E100A
Issue: 3
ISSN: 0916-8508
Citations: 0
PageRank: 0.34
References: 10
Authors: 5

Name                Order  Citations  PageRank
Jana Backhus        1      0          1.01
Ichigaku Takigawa   2      209        18.15
Hideyuki Imai       3      103        25.08
Mineichi Kudo       4      927        116.09
Masanori Sugimoto   5      775        95.39