Title
Supervised Learning of Universal Sentence Representations from Natural Language Inference Data
Abstract
Many modern NLP systems rely on word embeddings, previously trained in an unsupervised manner on large corpora, as base features. Efforts to obtain embeddings for larger chunks of text, such as sentences, have not been as successful: several attempts at learning unsupervised sentence representations have not reached performance satisfactory enough to be widely adopted. In this paper, we show how universal sentence representations trained using the supervised data of the Stanford Natural Language Inference datasets can consistently outperform unsupervised methods like SkipThought vectors on a wide range of transfer tasks. Much as computer vision uses ImageNet to obtain features that can then be transferred to other tasks, our work indicates the suitability of natural language inference for transfer learning to other NLP tasks. Our encoder is publicly available.
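As a rough illustration of the kind of sentence representation the abstract describes (and not the paper's actual implementation, whose best encoder is a BiLSTM with max pooling over hidden states), the pooling step can be sketched in NumPy. The vocabulary and random vectors below are hypothetical stand-ins for pretrained word embeddings:

```python
import numpy as np

# Hypothetical stand-in for pretrained word embeddings; in practice these
# would come from a pretrained table (e.g. 300-dimensional GloVe vectors).
rng = np.random.default_rng(0)
vocab = {w: rng.standard_normal(300) for w in "the cat sat on the mat".split()}

def sentence_embedding(tokens, vocab):
    """Max-pool word vectors over the time axis. In the paper's model the
    pooling is applied to BiLSTM hidden states; here, for illustration,
    it is applied directly to the word vectors."""
    vecs = np.stack([vocab[t] for t in tokens])  # (seq_len, dim)
    return vecs.max(axis=0)                      # (dim,)

emb = sentence_embedding(["the", "cat", "sat"], vocab)
print(emb.shape)  # (300,)
```

The resulting fixed-size vector can then be fed as a feature to a downstream classifier, which is how the paper evaluates transfer.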
Year
2017
DOI
10.18653/v1/D17-1070
Venue
EMNLP
DocType
Conference
Volume
abs/1705.02364
Citations
147
PageRank
3.54
References
35
Authors
5
Name | Order | Citations | PageRank
Alexis Conneau | 1 | 342 | 15.03
Douwe Kiela | 2 | 549 | 40.86
Holger Schwenk | 3 | 2533 | 228.83
Loïc Barrault | 4 | 284 | 22.91
Antoine Bordes | 5 | 3289 | 157.12