Title
Sequential Network Transfer: Adapting Sentence Embeddings to Human Activities and Beyond
Abstract
We study the problem of adapting neural sentence embedding models to the domain of human activities so that they capture the relations between activities along multiple dimensions. We introduce a novel approach, Sequential Network Transfer, and show that it substantially improves performance on all dimensions. We also extend this approach to other semantic similarity datasets and show that the resulting embeddings outperform traditional transfer learning approaches in many cases, achieving state-of-the-art results on the Semantic Textual Similarity (STS) Benchmark. To account for the improvements, we provide an interpretation of what the networks have learned. Our results suggest that Sequential Network Transfer is highly effective for a variety of sentence embedding models and tasks.
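The abstract only outlines the approach, so the sketch below illustrates the general idea of sequential fine-tuning of a sentence encoder across several similarity datasets. The encoder, datasets, loss, and hyperparameters are placeholders chosen for illustration, not the authors' actual architecture or data.

import torch
import torch.nn as nn

class MeanPoolEncoder(nn.Module):
    # Toy sentence encoder: mean of word embeddings followed by a linear projection.
    def __init__(self, vocab_size=10000, dim=128):
        super().__init__()
        self.embed = nn.EmbeddingBag(vocab_size, dim, mode="mean")
        self.proj = nn.Linear(dim, dim)

    def forward(self, token_ids):  # token_ids: (batch, seq_len) LongTensor
        return self.proj(self.embed(token_ids))

def fine_tune(encoder, dataset, epochs=3, lr=1e-3):
    # Fine-tune the encoder on one dataset of (sent_a, sent_b, score) batches,
    # regressing the cosine similarity of the two embeddings onto the gold score.
    opt = torch.optim.Adam(encoder.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for sent_a, sent_b, score in dataset:
            sim = nn.functional.cosine_similarity(encoder(sent_a), encoder(sent_b))
            loss = loss_fn(sim, score)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return encoder

def random_batches(n=4, batch=8, seq=12, vocab=10000):
    # Placeholder data: random token ids and random gold scores in [0, 1].
    return [(torch.randint(0, vocab, (batch, seq)),
             torch.randint(0, vocab, (batch, seq)),
             torch.rand(batch)) for _ in range(n)]

# Sequential Network Transfer, schematically: instead of fine-tuning once on the
# target data, fine-tune the same network on a sequence of datasets (e.g., a large
# generic similarity corpus first, then the target-domain activity data),
# carrying the weights over between stages.
encoder = MeanPoolEncoder()
for stage_data in [random_batches(), random_batches()]:  # stand-ins for real datasets
    encoder = fine_tune(encoder, stage_data)

In the paper's actual experiments the encoder is a pre-trained sentence embedding model and the stages are existing similarity datasets applied in order; this toy version only mirrors the control flow of training on each dataset in sequence with shared weights.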
Year
2018
Venue
arXiv: Computation and Language
Field
Semantic similarity, Embedding, Computer science, Transfer of learning, Artificial intelligence, Natural language processing, Sentence
DocType
Journal
Volume
abs/1804.07835
Citations
0
PageRank
0.34
References
0
Authors
3
Name                Order   Citations   PageRank
Li Zhang            1       445         61.28
Steven R. Wilson    2       12          7.21
Rada Mihalcea       3       6460        445.54