Title
Memory-error tolerance of scalable and highly parallel architecture for restricted Boltzmann machines in Deep Belief Network
Abstract
A key aspect of constructing highly scalable deep-learning microelectronic systems is implementing fault tolerance in the learning sequence. Error-injection analyses for memory are performed using a custom hardware model implementing parallelized restricted Boltzmann machines (RBMs). It is confirmed that RBMs in Deep Belief Networks (DBNs) provide remarkable robustness against memory errors. Fine-tuning significantly recovers accuracy after static errors, at either the cell level or the block level, are injected into the structural data of RBMs during and after learning. The memory-error tolerance is observable using our hardware networks with fine-grained memory distribution.
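The abstract distinguishes cell-level errors (individual memory cells) from block-level errors (contiguous memory regions). A minimal sketch of how such static error injection into RBM weight memory might look, assuming a stuck-at-zero fault model and NumPy weight matrices (all function names and parameters here are illustrative, not the paper's implementation):

```python
import numpy as np

def inject_cell_errors(W, rate, rng):
    """Corrupt a random fraction `rate` of individual weight cells
    (cell-level static errors, stuck-at-zero fault model)."""
    W = W.copy()
    mask = rng.random(W.shape) < rate
    W[mask] = 0.0
    return W

def inject_block_errors(W, n_blocks, block_shape, rng):
    """Corrupt `n_blocks` contiguous sub-arrays of `block_shape`
    (block-level static errors, e.g. a failed memory block)."""
    W = W.copy()
    bh, bw = block_shape
    for _ in range(n_blocks):
        i = rng.integers(0, W.shape[0] - bh + 1)
        j = rng.integers(0, W.shape[1] - bw + 1)
        W[i:i + bh, j:j + bw] = 0.0
    return W

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(784, 500))  # visible x hidden RBM weights
W_cell = inject_cell_errors(W, rate=0.05, rng=rng)
W_block = inject_block_errors(W, n_blocks=4, block_shape=(28, 28), rng=rng)
```

In an experiment along the lines the abstract describes, the corrupted weights would then be used for continued training or fine-tuning, and classification accuracy compared against the error-free baseline.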
Year
2016
DOI
10.1109/ISCAS.2016.7527244
Venue
2016 IEEE International Symposium on Circuits and Systems (ISCAS)
Keywords
Deep Learning, restricted Boltzmann machines (RBMs), fault tolerance
Field
Data modeling, Boltzmann machine, Computer science, Parallel computing, Deep belief network, Robustness (computer science), Fault tolerance, Artificial intelligence, Deep learning, Memory errors, Scalability
DocType
Conference
ISSN
0271-4302
ISBN
978-1-4799-5342-4
Citations
2
PageRank
0.63
References
5
Authors
5
Name (Order, Citations, PageRank)
Kodai Ueyoshi (1, 3, 1.65)
Takao Marukame (2, 4, 2.69)
Tetsuya Asai (3, 121, 26.53)
Masato Motomura (4, 91, 27.81)
Alexandre Schmid (5, 29, 11.91)