Title
Distributed ARTMAP: a neural network for fast distributed supervised learning
Abstract
Distributed coding at the hidden layer of a multi-layer perceptron (MLP) endows the network with memory compression and noise tolerance capabilities. However, an MLP typically requires slow off-line learning to avoid catastrophic forgetting in an open input environment. An adaptive resonance theory (ART) model is designed to guarantee stable memories even with fast on-line learning. However, ART stability typically requires winner-take-all coding, which may cause category proliferation in a noisy input environment. Distributed ARTMAP (dARTMAP) seeks to combine the computational advantages of MLP and ART systems in a real-time neural network for supervised learning. An implementation algorithm presented here describes one class of dARTMAP networks. This system incorporates elements of the unsupervised dART model, as well as new features, including a content-addressable memory (CAM) rule for improved contrast control at the coding field. A dARTMAP system reduces to fuzzy ARTMAP when coding is winner-take-all. Simulations show that dARTMAP retains fuzzy ARTMAP accuracy while significantly improving memory compression. (C) 1998 Elsevier Science Ltd. All rights reserved.
Year
1998
DOI
10.1016/S0893-6080(98)00019-7
Venue
Neural Networks
Keywords
art, supervised learning, neural network
Field
Forgetting, Adaptive resonance theory, Computer science, Fuzzy logic, Supervised learning, Coding (social sciences), Multilayer perceptron, Artificial intelligence, Artificial neural network, Perceptron, Machine learning
DocType
Journal
Volume
11
Issue
5
ISSN
0893-6080
Citations
69
PageRank
9.91
References
13
Authors
3
Name                  Order  Citations  PageRank
Gail A. Carpenter     1      29097      60.83
Boriana L. Milenova   2      148        15.68
Benjamin W. Noeske    3      69         9.91