Title
Selective Neuron Re-Computation (SNRC) for Error-Tolerant Neural Networks
Abstract
Artificial Neural Networks (ANNs) are widely used to solve classification problems in many machine learning applications. When errors occur in the computational units of an ANN implementation, due for example to radiation effects, the result of an arithmetic operation can be changed and, therefore, the predicted class may be erroneously affected. This is not acceptable when ANNs are used in safety-critical applications, because an incorrect classification may result in a system failure. Existing error-tolerant techniques usually rely on physically replicating parts of the ANN implementation or incur a significant computation overhead. Therefore, efficient protection schemes are needed for ANNs that run on a processor and are used in resource-limited platforms. A technique referred to as Selective Neuron Re-Computation (SNRC) is proposed in this paper. Based on the ANN structure and its algorithmic properties, SNRC identifies the cases in which errors have no impact on the outcome; errors therefore need to be handled by re-computation only when the classification result is detected as unreliable. Compared with existing temporal-redundancy-based protection schemes, SNRC saves more than 60 percent (more than 90 percent in many cases) of the re-computation overhead required to achieve complete error protection, as assessed over a wide range of datasets. Different activation functions are also evaluated.
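The record gives only this high-level description of SNRC. Purely as an illustration, the Python sketch below shows one way a selective re-computation scheme of this kind could be organized, assuming a simple margin test between the two largest network outputs as the "unreliable" check; the function names, the margin parameter, and the check itself are assumptions for illustration and are not taken from the paper.

import numpy as np

def sigmoid(x):
    # Logistic activation (the sigmoid mentioned in the keywords).
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, weights, biases):
    # Plain feed-forward pass through fully connected layers.
    a = x
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)
    return a

def classify_with_selective_recompute(x, weights, biases, margin=0.05):
    # Hypothetical selective re-computation: if the two largest output
    # scores are closer than `margin`, treat the result as unreliable and
    # repeat the forward pass so a transient arithmetic error cannot
    # silently flip the predicted class. The margin test is an assumed
    # stand-in for the paper's actual reliability criterion.
    y = forward(x, weights, biases)
    top_two = np.sort(y)[-2:]
    if top_two[1] - top_two[0] < margin:
        y = forward(x, weights, biases)  # selective temporal redundancy
    return int(np.argmax(y))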
Year
2022
DOI
10.1109/TC.2021.3056992
Venue
IEEE Transactions on Computers
Keywords
Neural networks, machine learning, sigmoid, error-tolerance
DocType
Journal
Volume
71
Issue
3
ISSN
0018-9340
Citations
0
PageRank
0.34
References
0
Authors
3
Name                 Order    Citations    PageRank
Shanshan Liu         1        5            4.50
Pedro Reviriego      2        527          75.56
Fabrizio Lombardi    3        0            0.68