Title
Balancing the learning ability and memory demand of a perceptron-based dynamically trainable neural network
Abstract
Artificial neural networks (ANNs) have become a popular means of solving complex problems in prediction-based applications such as image and natural language processing. Two prominent challenges in the neural network domain are the practicality of hardware implementation and dynamically training the network. In this study, we address these challenges with a development methodology that balances the hardware footprint and the quality of the ANN. We use the well-known perceptron-based branch prediction problem as a case study for demonstrating this methodology. This problem is well suited to analyzing dynamic hardware implementations of ANNs because it is already implemented in hardware and trains dynamically. Using our hierarchical configuration search space exploration, we show that we can decrease the memory footprint of a standard perceptron-based branch predictor by a factor of 2.3 with only a 0.6% decrease in prediction accuracy.
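For context, the core of a perceptron-based branch predictor of the kind studied here (in the style of Jimenez and Lin's original design) can be sketched in a few lines of C. The table size, history length, and threshold constant below are illustrative assumptions, not the tuned configuration explored in the paper:

    /* Minimal sketch of a perceptron branch predictor. Table size,
       history length, and threshold are assumed values for illustration. */
    #include <stdint.h>
    #include <stdlib.h>

    #define TABLE_SIZE  256                              /* number of perceptrons (assumed) */
    #define HISTORY_LEN 16                               /* global history bits (assumed) */
    #define THRESHOLD   ((int)(1.93 * HISTORY_LEN + 14)) /* training threshold */

    static int16_t weights[TABLE_SIZE][HISTORY_LEN + 1]; /* index 0 is the bias weight */
    static int8_t  history[HISTORY_LEN];                 /* +1 = taken, -1 = not taken */

    /* Predict by taking the dot product of the selected perceptron's
       weights with the global history; non-negative output means "taken". */
    static int predict(uint32_t pc, int *out_sum)
    {
        int16_t *w = weights[pc % TABLE_SIZE];
        int sum = w[0];
        for (int i = 0; i < HISTORY_LEN; i++)
            sum += w[i + 1] * history[i];
        *out_sum = sum;
        return sum >= 0;
    }

    /* Train on a misprediction, or whenever the output magnitude falls
       below the threshold, then shift the outcome into the history. */
    static void train(uint32_t pc, int sum, int taken)
    {
        int16_t *w = weights[pc % TABLE_SIZE];
        int t = taken ? 1 : -1;
        if ((sum >= 0) != taken || abs(sum) <= THRESHOLD) {
            w[0] += t;
            for (int i = 0; i < HISTORY_LEN; i++)
                w[i + 1] += t * history[i];
        }
        for (int i = HISTORY_LEN - 1; i > 0; i--)
            history[i] = history[i - 1];
        history[0] = t;
    }

The storage of such a predictor is dominated by the weights table (TABLE_SIZE x (HISTORY_LEN + 1) weights); a hardware implementation would additionally saturate the weights at a small bit width, and parameters like these are exactly what a configuration search can vary to trade memory footprint against prediction accuracy.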
Year
2018
DOI
https://doi.org/10.1007/s11227-018-2374-x
Venue
The Journal of Supercomputing
Keywords
Artificial neural network, Branch prediction, Perceptron, SimpleScalar
Field
Computer science, Space exploration, Artificial intelligence, Footprint, Train, Memory footprint, Artificial neural network, Perceptron, Branch predictor, Complex problems, Distributed computing
DocType
Journal
Volume
74
Issue
7
ISSN
0920-8542
Citations
0
PageRank
0.34
References
20
Authors
5

Name               Order  Citations  PageRank
Edward Richter     1      0          1.35
Spencer Valancius  2      0          0.34
Josiah McClanahan  3      0          0.34
John Mixter        4      0          0.34
Ali Akoglu         5      157        29.40