Title
Automatic Construction of Multi-layer Perceptron Network from Streaming Examples
Abstract
Autonomous construction of deep neural networks (DNNs) is desirable for data streams because it potentially offers two advantages: appropriate model capacity and quick reaction to drift and shift. While the self-organizing mechanism of DNNs remains an open issue, this task is even more challenging for standard multi-layer DNNs than for different-depth structures, because adding a new layer causes loss of previously trained knowledge. A Neural Network with Dynamically Evolved Capacity (NADINE) is proposed in this paper. NADINE features a fully open structure whose network depth and width can be automatically evolved from scratch in an online manner and without problem-specific thresholds. NADINE is built on a standard MLP architecture, and the catastrophic forgetting issue during the hidden-layer addition phase is resolved by the proposed soft-forgetting and adaptive memory methods. The advantages of NADINE, namely its elastic structure and online learning trait, are numerically validated on nine data stream classification and regression problems, where it demonstrates performance improvements over prominent algorithms in all problems. In addition, it handles data stream regression and classification problems equally well.
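The abstract does not spell out how the network structure is evolved; the sketch below only illustrates the general idea of growing an MLP's width and depth online with function-preserving initializations, not the actual NADINE procedure. All class and method names are assumptions, and the paper's growth triggers, soft-forgetting, and adaptive memory are omitted.

```python
import numpy as np

class GrowingMLP:
    """Illustrative MLP whose width and depth can grow online.
    This is NOT the NADINE algorithm; growth criteria and
    forgetting mitigation from the paper are not modeled here."""

    def __init__(self, n_in, n_out, rng=None):
        self.rng = rng if rng is not None else np.random.default_rng(0)
        # Start from scratch: one hidden layer with a single neuron.
        self.weights = [self.rng.standard_normal((n_in, 1)) * 0.1,
                        self.rng.standard_normal((1, n_out)) * 0.1]

    def forward(self, x):
        h = x
        for w in self.weights[:-1]:
            h = np.maximum(h @ w, 0.0)   # ReLU hidden layers
        return h @ self.weights[-1]      # linear output layer

    def grow_width(self, layer):
        """Append one neuron to hidden layer `layer` (0-indexed)."""
        w_in, w_out = self.weights[layer], self.weights[layer + 1]
        new_in = self.rng.standard_normal((w_in.shape[0], 1)) * 0.1
        self.weights[layer] = np.hstack([w_in, new_in])
        # The new neuron's outgoing weights start at zero, so the
        # network's predictions are unchanged at the moment of growth.
        self.weights[layer + 1] = np.vstack(
            [w_out, np.zeros((1, w_out.shape[1]))])

    def grow_depth(self):
        """Insert a new hidden layer just before the output layer,
        initialized to the identity so previously learned behavior
        is preserved at insertion time (ReLU passes non-negative
        activations through unchanged)."""
        n = self.weights[-1].shape[0]
        self.weights.insert(len(self.weights) - 1, np.eye(n))
```

Because both growth operations are initialized to leave the current input-output mapping intact, the model can be expanded mid-stream without an immediate drop in accuracy; subsequent online updates then exploit the added capacity.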
Year: 2019
DOI: 10.1145/3357384.3357946
Venue: Proceedings of the 28th ACM International Conference on Information and Knowledge Management
Keywords: concept drifts, continual learning, data streams, deep learning, online learning
Field: Data mining, Computer science, Multilayer perceptron
DocType: Conference
ISBN: 978-1-4503-6976-3
Citations: 5
PageRank: 0.41
References: 0
Authors: 5
Name | Order | Citations/PageRank
Mahardhika Pratama | 1 | 70250.02
Choiru Za'in | 2 | 71.79
Andri Ashfahani | 3 | 52.10
Yew-Soon Ong | 4 | 26323.35
Weiping Ding | 5 | 27844.96