Title
Self-Attention with Structural Position Representations
Abstract
Although self-attention networks (SANs) have advanced the state of the art on various NLP tasks, one criticism of SANs is their limited ability to encode the positions of input words (Shaw et al., 2018). In this work, we propose to augment SANs with structural position representations that model the latent structure of the input sentence, complementing the standard sequential position representations. Specifically, we use a dependency tree to represent the grammatical structure of a sentence and propose two strategies to encode the positional relationships among words in the tree. Experimental results on the NIST Chinese-to-English and WMT14 English-to-German translation tasks show that the proposed approach consistently improves performance over both absolute and relative sequential position representations.
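The abstract only sketches the approach, so the Python snippet below illustrates one plausible reading of it: each word receives a structural position derived from the dependency tree (assumed here to be its hop distance to the root), which is embedded and added to the token embedding alongside the usual sequential position. This is a minimal sketch rather than the authors' implementation; the head-index format, the class name PositionAugmentedEmbedding, and all dimensions are hypothetical.

```python
# Minimal sketch (not the paper's released code): combine sequential and
# structural (dependency-tree) position embeddings with token embeddings.
import torch
import torch.nn as nn


def tree_depths(heads):
    """heads[i] is the index of token i's head in the dependency tree;
    the root points to itself. Returns each token's hop distance to the root."""
    depths = []
    for i in range(len(heads)):
        d, node = 0, i
        while heads[node] != node:   # walk up toward the root
            node = heads[node]
            d += 1
        depths.append(d)
    return depths


class PositionAugmentedEmbedding(nn.Module):
    """Token embedding + sequential position + structural (tree-depth) position."""

    def __init__(self, vocab_size, d_model=512, max_pos=512):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, d_model)
        self.seq_pos = nn.Embedding(max_pos, d_model)     # standard sequential positions
        self.struct_pos = nn.Embedding(max_pos, d_model)  # structural positions (assumed tree depth)

    def forward(self, token_ids, heads):
        seq = torch.arange(token_ids.size(0))
        struct = torch.tensor(tree_depths(heads))
        # Sum the three signals; this is the simplest way to combine them.
        return self.tok(token_ids) + self.seq_pos(seq) + self.struct_pos(struct)


# Toy example: 6 tokens whose dependency heads come from a parser (indices are made up).
tokens = torch.tensor([5, 12, 7, 3, 9, 21])
heads = [1, 1, 3, 1, 1, 4]  # token 1 is the root (it points to itself)
emb = PositionAugmentedEmbedding(vocab_size=100)(tokens, heads)
print(emb.shape)  # torch.Size([6, 512])
```

In a full model these augmented embeddings would feed into the self-attention layers; the paper also describes a relative structural variant, which this sketch does not cover.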
Year
2019
DOI
10.18653/v1/D19-1145
Venue
EMNLP/IJCNLP (1)
DocType
Conference
Volume
D19-1
Citations
1
PageRank
0.39
References
0
Authors
4
Name            Order   Citations   PageRank
Xing Wang       1       58          10.07
Zhaopeng Tu     2       518         39.95
Longyue Wang    3       72          18.24
Shuming Shi     4       620         58.27