Title
A Lightweight Transformer with Convolutional Attention
Abstract
Neural machine translation (NMT) has developed rapidly thanks to the application of various deep learning techniques. In particular, how to construct a more effective NMT architecture has attracted increasing attention. The Transformer is a state-of-the-art architecture in NMT; it relies entirely on the self-attention mechanism instead of recurrent neural networks (RNNs). Multi-head attention is the crucial component that implements self-attention, and it also strongly affects the size of the model. In this paper, we present a new multi-head attention that incorporates a convolution operation. Compared with the base Transformer, our approach effectively reduces the number of parameters. We also carry out a controlled experiment, and the results show that the performance of the new model is comparable to that of the base model.
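The abstract does not spell out how the convolution is combined with multi-head attention, so the following is only a minimal illustrative sketch of one way such a combination could reduce parameters: depthwise 1-D convolutions standing in for the usual linear Q/K/V projections. The module name ConvMultiHeadAttention, the kernel size, and all shapes are assumptions for illustration, not the authors' design.

```python
# Hypothetical sketch: multi-head attention whose Q/K/V projections use
# depthwise 1-D convolutions instead of full linear layers.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConvMultiHeadAttention(nn.Module):
    def __init__(self, d_model: int, n_heads: int, kernel_size: int = 3):
        super().__init__()
        assert d_model % n_heads == 0
        self.d_model = d_model
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        pad = kernel_size // 2
        # Depthwise convolution: one filter per channel, so each projection
        # costs d_model * kernel_size weights instead of d_model ** 2.
        self.q_conv = nn.Conv1d(d_model, d_model, kernel_size, padding=pad, groups=d_model)
        self.k_conv = nn.Conv1d(d_model, d_model, kernel_size, padding=pad, groups=d_model)
        self.v_conv = nn.Conv1d(d_model, d_model, kernel_size, padding=pad, groups=d_model)
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        b, t, _ = x.shape
        x_c = x.transpose(1, 2)                      # (batch, d_model, seq_len)
        q = self.q_conv(x_c).transpose(1, 2)
        k = self.k_conv(x_c).transpose(1, 2)
        v = self.v_conv(x_c).transpose(1, 2)

        # Split into heads: (batch, n_heads, seq_len, d_head)
        def split(z):
            return z.view(b, t, self.n_heads, self.d_head).transpose(1, 2)

        q, k, v = split(q), split(k), split(v)
        # Standard scaled dot-product attention per head.
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.d_head)
        attn = F.softmax(scores, dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, t, self.d_model)
        return self.out_proj(out)


if __name__ == "__main__":
    layer = ConvMultiHeadAttention(d_model=512, n_heads=8)
    x = torch.randn(2, 10, 512)
    print(layer(x).shape)  # torch.Size([2, 10, 512])
```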
Year
2020
DOI
10.1109/iCAST51195.2020.9319489
Venue
2020 11th International Conference on Awareness Science and Technology (iCAST)
Keywords
neural machine translation, Transformer, CNN, Multi-head attention
DocType
Conference
ISSN
2325-5986
ISBN
978-1-7281-9120-1
Citations
0
PageRank
0.34
References
0
Authors
2
Name            Order   Citations   PageRank
Kungan Zeng     1       0           0.34
Incheon Paik    2       241         38.80