Title
Learning Functions on Multiple Sets using Multi-Set Transformers
Abstract
We propose a general deep architecture for learning functions on multiple permutation-invariant sets. We also show how to generalize this architecture to sets whose elements have any dimension via dimension equivariance. We demonstrate that our architecture is a universal approximator of these functions, and show results superior to existing methods on a variety of tasks including counting, alignment, distinguishability, and statistical distance measurement. This last task is particularly important in machine learning: although our approach is quite general, we show that it produces estimates of KL divergence and mutual information that are more accurate than those of previous techniques designed specifically to approximate these statistical distances.
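As a concrete illustration of what a permutation-invariant function of two sets can look like, the sketch below combines within-set self-attention, between-set cross-attention, and mean pooling to produce one scalar per pair of sets (e.g. a statistical-distance estimate). This is a minimal sketch in PyTorch under its own assumptions; the module names (MultiSetAttentionBlock, MultiSetRegressor) and hyperparameters are illustrative and are not the architecture or code from the paper.

```python
import torch
import torch.nn as nn


class MultiSetAttentionBlock(nn.Module):
    """One block of within-set self-attention plus between-set cross-attention.

    Outputs are equivariant to permutations of the elements within each set.
    (Illustrative sketch, not the paper's implementation.)
    """

    def __init__(self, dim, heads=4):
        super().__init__()
        self.self_x = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.self_y = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.cross_xy = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.cross_yx = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.ff = nn.Sequential(
            nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, dim)
        )

    def forward(self, x, y):
        # x: (batch, n, dim), y: (batch, m, dim); n and m may differ
        sx, _ = self.self_x(x, x, x)    # X attends to itself
        cx, _ = self.cross_xy(x, y, y)  # X attends to Y
        sy, _ = self.self_y(y, y, y)    # Y attends to itself
        cy, _ = self.cross_yx(y, x, x)  # Y attends to X
        x = x + self.ff(torch.cat([sx, cx], dim=-1))
        y = y + self.ff(torch.cat([sy, cy], dim=-1))
        return x, y


class MultiSetRegressor(nn.Module):
    """Stacks attention blocks, then mean-pools each set so the final
    scalar output is invariant to permutations within either input set."""

    def __init__(self, in_dim, dim=64, heads=4, blocks=2):
        super().__init__()
        self.embed = nn.Linear(in_dim, dim)
        self.blocks = nn.ModuleList(
            MultiSetAttentionBlock(dim, heads) for _ in range(blocks)
        )
        self.head = nn.Linear(2 * dim, 1)

    def forward(self, x, y):
        x, y = self.embed(x), self.embed(y)
        for blk in self.blocks:
            x, y = blk(x, y)
        pooled = torch.cat([x.mean(dim=1), y.mean(dim=1)], dim=-1)
        return self.head(pooled).squeeze(-1)


# Two batches of i.i.d. samples of different sizes drawn from two distributions.
x = torch.randn(8, 100, 2)  # 8 sets of 100 two-dimensional points
y = torch.randn(8, 150, 2)  # 8 sets of 150 two-dimensional points
model = MultiSetRegressor(in_dim=2)
print(model(x, y).shape)    # torch.Size([8]) -- one scalar estimate per pair of sets
```

Note that the attention layers on their own are only permutation equivariant; it is the mean pooling over each set that makes the final scalar permutation invariant.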
Year
2022
Venue
International Conference on Uncertainty in Artificial Intelligence
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
5
Name                    Order  Citations  PageRank
Kira Selby              1      0          0.34
Ahmad Azad Ab Rashid    2      3          5.03
Ivan Kobyzev            3      0          0.34
Mehdi Rezagholizadeh    4      3          8.82
Pascal Poupart          5      1352       105.24