Title
Multi-relational Learning Using Weighted Tensor Decomposition with Modular Loss
Abstract
We propose a modular framework for multi-relational learning via tensor decomposition. In our learning setting, the training data contains multiple types of relationships among a set of objects, which we represent by a sparse three-mode tensor. The goal is to predict the values of the missing entries. To do so, we model each relationship as a function of a linear combination of latent factors. We learn this latent representation by computing a low-rank tensor decomposition, using quasi-Newton optimization of a weighted objective function. Sparsity in the observed data is captured by the weighted objective, leading to improved accuracy when training data is limited. Exploiting sparsity also improves efficiency, potentially up to an order of magnitude over unweighted approaches. In addition, our framework accommodates arbitrary combinations of smooth, task-specific loss functions, making it better suited for learning different types of relations. For the typical cases of real-valued functions and binary relations, we propose several loss functions and derive the associated parameter gradients. We evaluate our method on synthetic and real data, showing significant improvements in both accuracy and scalability over related factorization techniques.
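As a rough illustration of the approach the abstract describes, the sketch below implements a weighted low-rank factorization of a three-mode relation tensor with a pluggable (modular) loss per relation slice, optimized with a quasi-Newton method (L-BFGS via SciPy). This is a minimal sketch under stated assumptions, not the authors' implementation: the bilinear A R_t A^T parameterization, the names squared_loss, logistic_loss, and factorize, and the regularization scheme are illustrative choices.

# Minimal sketch (an assumption-laden illustration, not the authors' code):
# weighted low-rank factorization of a three-mode relation tensor with a
# pluggable per-relation loss, optimized by a quasi-Newton method (L-BFGS).
import numpy as np
from scipy.optimize import minimize


def squared_loss(pred, target):
    # Squared loss and its gradient w.r.t. the prediction (real-valued relations).
    diff = pred - target
    return 0.5 * diff ** 2, diff


def logistic_loss(pred, target):
    # Logistic loss and its gradient w.r.t. the prediction ({0,1} relations).
    p = 1.0 / (1.0 + np.exp(-pred))
    eps = 1e-12
    return -(target * np.log(p + eps) + (1 - target) * np.log(1 - p + eps)), p - target


def factorize(X, W, losses, rank, reg=0.1, seed=0):
    # X: (n, n, k) tensor of relation values; W: 0/1 weights marking observed
    # entries (sparsity); losses: one loss function per relation slice.
    n, _, k = X.shape
    rng = np.random.RandomState(seed)
    A0 = 0.1 * rng.randn(n, rank)        # shared latent factors for the objects
    R0 = 0.1 * rng.randn(k, rank, rank)  # per-relation interaction matrices

    def unpack(theta):
        A = theta[:n * rank].reshape(n, rank)
        R = theta[n * rank:].reshape(k, rank, rank)
        return A, R

    def objective(theta):
        A, R = unpack(theta)
        f = reg * (np.sum(A ** 2) + np.sum(R ** 2))
        gA = 2.0 * reg * A
        gR = 2.0 * reg * R
        for t in range(k):
            pred = A @ R[t] @ A.T                   # reconstruction of slice t
            val, dpred = losses[t](pred, X[:, :, t])
            G = W[:, :, t] * dpred                  # weights zero out unobserved cells
            f += np.sum(W[:, :, t] * val)
            gA += G @ A @ R[t].T + G.T @ A @ R[t]
            gR[t] += A.T @ G @ A
        return f, np.concatenate([gA.ravel(), gR.ravel()])

    theta0 = np.concatenate([A0.ravel(), R0.ravel()])
    res = minimize(objective, theta0, jac=True, method="L-BFGS-B")
    return unpack(res.x)

In this sketch, mixing loss functions per relation (for example, squared_loss for real-valued slices and logistic_loss for binary ones) mirrors the modular-loss idea in the abstract, and the 0/1 weight tensor W confines both the objective and its gradient to observed entries.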
Year
2013
Venue
CoRR
Keywords
relational learning
Field
Linear combination, Tensor, Binary relation, Statistical relational learning, Artificial intelligence, Mathematical optimization, Pattern recognition, Factorization, Modular design, Mathematics, Machine learning, Tensor decomposition, Scalability
DocType
Journal
Volume
abs/1303.1733
Citations
10
PageRank
0.77
References
12
Authors
4
Name                   Order   Citations   PageRank
Ben London             1       77          7.01
Theodoros Rekatsinas   2       178         18.65
Bert Huang             3       563         39.09
Lise Getoor            4       4365        320.21