Abstract |
---|
FPGAs have been used successfully to implement dedicated accelerators for a wide range of machine learning problems. Inference in so-called Sum-Product Networks (SPNs) can also be accelerated efficiently using a pipelined FPGA architecture. However, because Sum-Product Networks compute exact probability values, the required arithmetic precision poses different challenges than those encountered with Neural Networks. In previous work, this precision was maintained by using double-precision floating-point formats, which are expensive to implement on FPGAs. In this work, we propose a logarithmic number system format tailored specifically to inference in Sum-Product Networks. The evaluation of our optimized arithmetic hardware operators shows that logarithmic number formats save up to 50% of hardware resources compared to double-precision floating point, while maintaining sufficient precision for SPN inference at almost identical performance. |
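The idea underlying the abstract: in a logarithmic number system, probabilities are stored as logarithms, so the multiplications at SPN product nodes become cheap additions, while the additions at sum nodes require a log-sum-exp-style computation. A minimal Python sketch of this log-domain evaluation (the function names and the toy SPN below are illustrative, not taken from the paper):

```python
import math

def product_node(log_children):
    # Product node: multiplying probabilities is just adding their logs.
    return sum(log_children)

def sum_node(log_children, weights):
    # Weighted sum node: adds probabilities in the log domain via the
    # log-sum-exp trick, which avoids underflow for tiny probabilities.
    terms = [lc + math.log(w) for lc, w in zip(log_children, weights)]
    m = max(terms)
    return m + math.log(sum(math.exp(t - m) for t in terms))

# Toy SPN computing P = 0.3 * (0.1 * 0.2) + 0.7 * 0.5
log_prod = product_node([math.log(0.1), math.log(0.2)])
log_p = sum_node([log_prod, math.log(0.5)], [0.3, 0.7])
print(math.exp(log_p))  # ~0.356
```

This is why a dedicated LNS operator pays off in hardware: the frequent multiplications reduce to integer-style adders, and only the sum nodes need the more involved log-add circuitry.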
Year | DOI | Venue |
---|---|---|
2019 | 10.1109/ICFPT47387.2019.00040 | 2019 International Conference on Field-Programmable Technology (ICFPT) |
Keywords | Field | DocType
---|---|---|
FPGA, SPN, Machine Learning, Graphical Models, Deep Models | Floating point, Computer science, Inference, Parallel computing, Field-programmable gate array, Arithmetic, Operator (computer programming), Logarithmic number system, Logarithm, Graphical model, Artificial neural network | Conference
ISBN | Citations | PageRank
---|---|---|
978-1-7281-2944-0 | 0 | 0.34
References | Authors
---|---|
4 | 6
Name | Order | Citations | PageRank |
---|---|---|---|
Lukas Weber | 1 | 3 | 4.89 |
Lukas Sommer | 2 | 8 | 7.53 |
Julian Oppermann | 3 | 30 | 6.88 |
Alejandro Molina | 4 | 46 | 15.04 |
Kristian Kersting | 5 | 1932 | 154.03 |
Andreas Koch | 6 | 155 | 29.56 |