Title
Decaying Potential Fields Neural Network: An Approach For Parallelizing Topologically Indicative Mapping Exemplars
Abstract
Mapping methodologies aim to extract meaning and connections from raw data. The human mind can process images quickly and efficiently through the visual cortex, in part due to its parallel nature. A basic Kohonen self-organizing feature map (SOFM) is one example of a mapping methodology in the class of neural networks that does this very well. Ideally, the result is a well-organized network representative of the data set; however, SOFMs do not translate well to parallelized architectures. The problem stems from the neighborhoods established between the neurons, which create race conditions when updating winning neurons. We propose a fully parallelized mapping architecture, based loosely on SOFM, called the decaying potential fields neural network (DPFNN). We show that DPFNN uses neurons that are computationally uncoupled but symbolically linked. Through analysis we show that this allows the neurons to reach convergence while having only a passive data dependency on one another, as opposed to a hazard-generating direct dependency. We have designed this network to reflect the efficiency and speed of a parallel approach, with results that rival or exceed those of similar topological networks such as SOFM.
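The race condition the abstract describes follows from the standard Kohonen update rule, in which every neuron in the winner's neighborhood is written for each input. The sketch below illustrates that classic rule on a 1-D map (it is not the paper's DPFNN); the function name `sofm_step` and the small toy values are illustrative assumptions.

```python
import math

def sofm_step(weights, x, winner, sigma, lr):
    """One standard Kohonen SOFM update on a 1-D map: every neuron in the
    winner's neighborhood is pulled toward the input x.
    (Illustrative sketch of the classic SOFM rule, not the paper's DPFNN.)"""
    for i, w in enumerate(weights):
        # Gaussian neighborhood strength, decaying with grid distance
        # from the winning neuron.
        h = math.exp(-((i - winner) ** 2) / (2 * sigma ** 2))
        for d in range(len(w)):
            w[d] += lr * h * (x[d] - w[d])
    return weights

# Two inputs whose winners (neurons 1 and 2) are grid neighbors: their
# neighborhood updates touch shared weight rows, so a naive
# thread-per-input parallelization has conflicting writes -- the race
# condition the abstract attributes to SOFM neighborhoods.
w = [[0.0, 0.0] for _ in range(5)]
sofm_step(w, [1.0, 0.0], winner=1, sigma=1.0, lr=0.5)
sofm_step(w, [0.0, 1.0], winner=2, sigma=1.0, lr=0.5)
```

Because both calls write the overlapping rows 0-4 with nonzero neighborhood strength, running them concurrently without synchronization leaves the shared rows in a nondeterministic state; the abstract's DPFNN avoids this by decoupling the neurons computationally.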
Year
2015
DOI
10.1109/ICMLA.2015.56
Venue
2015 IEEE 14th International Conference on Machine Learning and Applications (ICMLA)
Keywords
self-organization, parallelization, recall of data, potentials
Field
Convergence (routing), Data dependency, Architecture, Pattern recognition, Visual cortex, Computer science, Self-organizing map, Artificial intelligence, Artificial neural network, Machine learning
DocType
Conference
Citations
0
PageRank
0.34
References
6
Authors
2
Name            Order  Citations  PageRank
Clinton Rogers  1      0          0.34
Iren Valova     2      136        25.44