Title
Sentence transition matrix: An efficient approach that preserves sentence semantics
Abstract
Sentence embedding is an influential research topic in natural language processing (NLP). Generating sentence vectors that reflect the intrinsic meaning of sentences is crucial for improving performance in various NLP tasks. Therefore, numerous supervised and unsupervised sentence-representation approaches have been proposed since the advent of the distributed representation of words. These approaches have been evaluated on semantic textual similarity (STS) tasks designed to measure the degree of semantic information preservation; neural network-based supervised embedding models typically deliver state-of-the-art performance. However, these models have numerous learnable parameters and thus require large amounts of specific types of labeled training data. Pretrained language model-based approaches, which have become a predominant trend in the NLP field, alleviate this issue to some extent; however, sufficient labeled data must still be collected for the fine-tuning process. Herein, we propose an efficient approach that learns a transition matrix that tunes a sentence embedding vector to capture the latent semantic meaning. Our proposed method has two practical advantages: (1) it can be applied to any sentence embedding method, and (2) it delivers robust performance in STS tasks with only a few training examples.
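The transition-matrix idea described in the abstract can be illustrated with a minimal sketch. Note that the function name, the least-squares objective, and the identity regularizer below are illustrative assumptions, not the authors' exact formulation: a matrix W is learned to pull the embeddings of paraphrase pairs together, while a regularizer keeps W close to the identity so that the original semantics are largely preserved.

```python
import numpy as np

def learn_transition_matrix(pairs, dim, lam=0.1, lr=0.01, epochs=200):
    """Illustrative sketch: learn a transition matrix W that maps
    paraphrase embeddings close together.

    pairs : list of (a, b) embedding-vector tuples, where a and b
            are embeddings of paraphrased sentences.
    lam   : strength of the regularizer pulling W toward the identity
            (preserves the original embedding semantics).
    """
    W = np.eye(dim)
    for _ in range(epochs):
        # Regularizer gradient: keep W near the identity matrix.
        grad = lam * (W - np.eye(dim))
        for a, b in pairs:
            # Data term: gradient of 0.5 * ||W(a - b)||^2 w.r.t. W,
            # which contracts the directions in which paraphrases differ.
            d = a - b
            grad += np.outer(W @ d, d) / len(pairs)
        W -= lr * grad
    return W
```

With paraphrase pairs whose embeddings differ only by small perturbations, the learned W shrinks the paraphrase-difference directions, so transformed paraphrase embeddings end up closer together than the raw ones, which is the effect a transition matrix trained on few examples would aim for.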
Year
2019
DOI
10.1016/j.csl.2021.101266
Venue
COMPUTER SPEECH AND LANGUAGE
Keywords
Sentence embedding, Sentence semantics, Transition matrix, Paraphrase, Natural language processing
DocType
Journal
Volume
71
ISSN
0885-2308
Citations
0
PageRank
0.34
References
30
Authors
2

Name             Order  Citations  PageRank
Myeongjun Jang   1      2          0.74
Pilsung Kang     2      339        28.22