Title
SALO: an efficient spatial accelerator enabling hybrid sparse attention mechanisms for long sequences
Abstract
The attention mechanisms of transformers effectively extract pertinent information from the input sequence. However, the quadratic complexity of self-attention with respect to the sequence length incurs heavy computational and memory burdens, especially for tasks with long sequences. Existing accelerators face performance degradation on these tasks. To this end, we propose SALO to enable hybrid sparse attention mechanisms for long sequences. SALO contains a data scheduler that maps hybrid sparse attention patterns onto hardware and a spatial accelerator that performs the attention computation efficiently. We show that SALO achieves 17.66x and 89.33x speedup on average compared to GPU and CPU implementations, respectively, on typical workloads, i.e., Longformer and ViL.
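For context, the hybrid sparse attention patterns referenced in the abstract combine a sliding-window (local) pattern with a small set of global tokens, as in Longformer and ViL, so each token attends to only O(window) positions instead of the full sequence. The sketch below illustrates that pattern with a dense mask in NumPy; the function name, parameters, and masking strategy are illustrative assumptions for exposition, not SALO's data scheduler or hardware dataflow.

```python
import numpy as np

def hybrid_sparse_attention(Q, K, V, window=4, global_idx=(0,)):
    """Sliding-window attention combined with a few global tokens
    (the Longformer/ViL-style hybrid pattern SALO targets).
    Illustrative only: names and arguments are assumptions, and the
    dense masking here is not how a spatial accelerator computes it."""
    n, d = Q.shape
    scores = Q @ K.T / np.sqrt(d)                      # (n, n) attention scores

    # Build the hybrid sparsity mask: local window plus global rows/columns.
    mask = np.zeros((n, n), dtype=bool)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        mask[i, lo:hi] = True                          # sliding-window neighborhood
    g = list(global_idx)
    mask[:, g] = True                                  # every token attends to global tokens
    mask[g, :] = True                                  # global tokens attend to every token

    # Masked softmax: excluded positions contribute zero weight.
    scores = np.where(mask, scores, -np.inf)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                                 # (n, d) attention output

# Toy usage: a length-16 sequence with 8-dimensional heads.
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((16, 8)) for _ in range(3))
out = hybrid_sparse_attention(Q, K, V, window=2, global_idx=(0,))
print(out.shape)  # (16, 8)
```

A real kernel or accelerator would not materialize the full n-by-n score matrix; the point of the sketch is only to show which query-key pairs the hybrid pattern keeps, which is what a data scheduler must map onto hardware.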
Year
2022
DOI
10.1145/3489517.3530504
Venue
Design Automation Conference (DAC)
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
6
Name           Order  Citations  PageRank
Guan Shen      1      0          0.34
Jieru Zhao     2      2          2.09
Quan Chen      3      12         5.97
Jingwen Leng   4      49         12.97
Chao Li        5      344        37.85
Minyi Guo      6      35         14.13