Title
Designing by Training: Acceleration Neural Network for Fast High-Dimensional Convolution
Abstract
High-dimensional convolution is widely used across disciplines but suffers from a serious performance problem due to its high computational complexity. For decades, fast algorithms for Gaussian convolution have been designed by hand. Recently, demand for various non-Gaussian convolutions has emerged and keeps growing, yet hand-crafting an acceleration scheme for each of them is no longer feasible, since doing so is a time-consuming and painstaking job. Instead, we propose an Acceleration Network (AccNet) that turns the work of designing new fast algorithms into training the AccNet. This is done by (1) interpreting the splatting, blurring and slicing operations as convolutions and (2) turning these convolutions into gCP layers to build AccNet. After training, the activation function g together with the AccNet weights automatically defines the new splatting, blurring and slicing operations. Experiments demonstrate that AccNet can design acceleration algorithms for a wide range of convolutions, both Gaussian and non-Gaussian, and produces state-of-the-art results.
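For context, the splatting, blurring and slicing operations mentioned in the abstract come from the standard grid-based pipeline for fast high-dimensional (e.g. bilateral/Gaussian) filtering. Below is a minimal NumPy sketch of that hand-designed pipeline in a 1-D toy setting; the function names, the 32-cell grid and the fixed binomial blur kernel are illustrative assumptions, not the paper's gCP layers or its learned activation g.

import numpy as np

def splat(values, coords, grid_size):
    """Accumulate samples onto a coarse grid by nearest-cell assignment."""
    grid = np.zeros(grid_size)
    weight = np.zeros(grid_size)
    idx = np.clip(np.round(coords).astype(int), 0, grid_size - 1)
    np.add.at(grid, idx, values)
    np.add.at(weight, idx, 1.0)
    return grid, weight

def blur(grid, kernel):
    """Blur the grid with a small convolution kernel (hand-designed here)."""
    return np.convolve(grid, kernel, mode="same")

def slice_back(grid, coords, grid_size):
    """Read the filtered result back at the original sample positions."""
    idx = np.clip(np.round(coords).astype(int), 0, grid_size - 1)
    return grid[idx]

# Toy usage: smooth noisy 1-D samples via the coarse grid.
rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 2 * np.pi, 256)) + 0.1 * rng.standard_normal(256)
coords = np.linspace(0, 31, 256)               # sample positions in a 32-cell grid
kernel = np.array([1.0, 4.0, 6.0, 4.0, 1.0])   # fixed binomial (Gaussian-like) kernel
kernel /= kernel.sum()

g_val, g_w = splat(signal, coords, 32)
num = slice_back(blur(g_val, kernel), coords, 32)
den = slice_back(blur(g_w, kernel), coords, 32)
filtered = num / np.maximum(den, 1e-8)         # normalized filter output

In AccNet, exactly these hand-made choices (the blur kernel and the splat/slice weights) are expressed as convolutions, rebuilt as gCP layers, and learned from data, so that the trained weights together with the activation g define new splatting, blurring and slicing operations.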
Year
2018
Venue
Advances in Neural Information Processing Systems 31 (NIPS 2018)
Keywords
fast algorithms, neural network, activation function
Field
Mathematical optimization, Activation function, Convolution, Computer science, Slicing, Algorithm, Gaussian, Acceleration, Artificial neural network, Computational complexity theory
DocType
Conference
Volume
31
ISSN
1049-5258
Citations
0
PageRank
0.34
References
0
Authors
4
Name            Order  Citations  PageRank
Longquan Dai    1      23         2.73
Liang Tang      2      45         14.11
Yuan Xie        3      407        27.48
Jinhui Tang     4      5180       212.18