Title
Knowledge-Based Distant Regularization in Learning Probabilistic Models.
Abstract
Exploiting an appropriate inductive bias based on knowledge of the data is essential for achieving good performance in statistical machine learning. In practice, however, the domain knowledge of interest often provides only distant information on the relationships among data attributes, which hinders direct utilization of such domain knowledge in popular regularization methods. In this paper, we propose the knowledge-based distant regularization framework, in which the distant information encoded in a knowledge graph is utilized to regularize the estimation of probabilistic models. In particular, we propose to impose prior distributions on model parameters specified by knowledge graph embeddings. As an instance of the proposed framework, we present a factor analysis model with the knowledge-based distant regularization. We show results of preliminary experiments on the improvement of the generalization capability of such a model.
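To make the regularization idea concrete, the following is a minimal sketch, assuming one plausible instantiation rather than the paper's exact formulation: each data attribute's factor-loading vector receives a Gaussian prior whose mean is a linear projection of that attribute's pretrained knowledge graph embedding, so the penalty below is the negative log-prior up to a constant. All names (E, A, W, distant_regularizer) and the specific Gaussian form are illustrative assumptions.

# Hypothetical sketch: Gaussian prior on factor-loading rows centered on
# linearly projected knowledge-graph embeddings (names are illustrative).
import numpy as np

rng = np.random.default_rng(0)
n_attrs, n_factors, emb_dim = 20, 5, 16

# Pretrained knowledge-graph embeddings, one per data attribute (assumed given).
E = rng.normal(size=(n_attrs, emb_dim))

# Factor-analysis loading matrix W (n_attrs x n_factors) being estimated,
# and a projection A mapping embeddings into the loading space.
W = rng.normal(size=(n_attrs, n_factors))
A = rng.normal(size=(emb_dim, n_factors))

def distant_regularizer(W, A, E, lam=1.0):
    """Negative log of the assumed prior N(w_d | A^T e_d, (1/lam) I), up to a
    constant; it pulls each attribute's loading vector toward its KG-informed mean."""
    return 0.5 * lam * np.sum((W - E @ A) ** 2)

# In MAP estimation, this penalty would be added to the negative log-likelihood
# of the factor analysis model and minimized jointly over W (and optionally A).
penalty = distant_regularizer(W, A, E, lam=0.1)
print(penalty)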
Year
2018
Venue
arXiv: Learning
Field
Inductive bias, Knowledge graph, Domain knowledge, Regularization (mathematics), Artificial intelligence, Statistical model, Probabilistic logic, Machine learning, Mathematics
DocType
Journal
Volume
abs/1806.11332
Citations
1
PageRank
0.35
References
3
Authors
2
Name | Order | Citations | PageRank
Naoya Takeishi | 1 | 30 | 7.16
Kosuke Akimoto | 2 | 4 | 0.78