Title
Semantic Composition via Probabilistic Model Theory
Abstract
Semantic composition remains an open problem for vector space models of semantics. In this paper, we explain how the probabilistic graphical model used in the framework of Functional Distributional Semantics can be interpreted as a probabilistic version of model theory. Building on this, we explain how various semantic phenomena can be recast in terms of conditional probabilities in the graphical model. This connection between formal semantics and machine learning is helpful in both directions: it gives us an explicit mechanism for modelling context-dependent meanings (a challenge for formal semantics), and also gives us well-motivated techniques for composing distributed representations (a challenge for distributional semantics). We present results on two datasets that go beyond word similarity, showing how these semantically-motivated techniques improve on the performance of vector models.
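The abstract's central move, treating a predicate's truth as a random variable so that semantic questions become conditional probabilities in a graphical model, can be illustrated with a toy sketch. The Python snippet below is not the paper's model: the predicates (p_animal, p_barks), their logistic semantic functions, and the Gaussian prior over the latent entity space are hypothetical stand-ins, whereas Functional Distributional Semantics uses a structured graphical model over entities linked by dependency structure. The sketch only shows the general mechanism of predicates as probabilistic truth-conditional functions, with inference by Monte Carlo estimation.

import numpy as np

rng = np.random.default_rng(0)

DIM = 2  # toy dimensionality of the latent entity space

def semantic_function(weights, bias):
    """A predicate as a probabilistic truth-conditional function:
    maps an entity representation x to P(predicate is true of x)."""
    def prob_true(x):
        return 1.0 / (1.0 + np.exp(-(x @ weights + bias)))
    return prob_true

# Two hypothetical predicates with hand-picked parameters.
p_animal = semantic_function(np.array([2.0, 0.0]), -1.0)
p_barks = semantic_function(np.array([1.5, 1.5]), -2.0)

# Prior over entities: a standard Gaussian, for illustration only.
samples = rng.standard_normal((100_000, DIM))

# Probability of truth of each predicate for each sampled entity.
t_animal = p_animal(samples)
t_barks = p_barks(samples)

# Marginal probability that 'barks' is true of a random entity.
print("P(barks) =", t_barks.mean())

# Conditional probability P(barks | animal), estimated by weighting
# each sample by the probability that 'animal' is true of it
# (truth values are conditionally independent given the entity).
print("P(barks | animal) =", (t_barks * t_animal).sum() / t_animal.sum())

Because both hypothetical predicates load on the same latent dimension, the conditional estimate comes out higher than the marginal, illustrating how shared structure in the entity space yields the kind of probabilistic inference the abstract describes.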
Year
2017
DOI
10.17863/CAM.17024
Venue
IWCS
DocType
Conference
Volume
abs/1709.00226
Citations
1
PageRank
0.36
References
28
Authors
2
Name           Order  Citations  PageRank
Guy Emerson    1      15         3.62
Ann Copestake  2      862        95.10