Title |
---|
Partially Exchangeable Networks and Architectures for Learning Summary Statistics in Approximate Bayesian Computation |
Abstract |
---|
We present a novel family of deep neural architectures, named partially exchangeable networks (PENs), that leverage probabilistic symmetries. By design, PENs are invariant to block-switch transformations, which characterize the partial exchangeability properties of conditionally Markovian processes. Moreover, we show that any block-switch invariant function has a PEN-like representation. The DeepSets architecture is a special case of PEN, and we can therefore also target fully exchangeable data. We employ PENs to learn summary statistics in approximate Bayesian computation (ABC). When comparing PENs to previous deep learning methods for learning summary statistics, our results are highly competitive, both considering time series and static models. Indeed, PENs provide more reliable posterior samples even when using less training data. |
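The fully exchangeable special case mentioned in the abstract (DeepSets) can be illustrated with a minimal sketch: sum pooling over per-observation embeddings makes the resulting summary statistic invariant to any reordering of the data. This is not the authors' PEN implementation; the function names and the random weights below are illustrative assumptions only.

```python
import numpy as np

# Minimal DeepSets-style sketch (illustrative, not the paper's code):
# a summary statistic s(x) = rho(sum_i phi(x_i)) is permutation
# invariant because sum pooling discards the order of observations.
rng = np.random.default_rng(0)
W_phi = rng.normal(size=(1, 8))   # inner embedding phi: R -> R^8
W_rho = rng.normal(size=(8, 2))   # outer map rho: R^8 -> R^2 (2 summary stats)

def deepsets_summary(x):
    """x: 1-D array of observations; returns a 2-D summary vector."""
    h = np.tanh(x[:, None] @ W_phi)   # embed each observation independently
    pooled = h.sum(axis=0)            # sum pooling => permutation invariance
    return pooled @ W_rho

x = np.array([0.3, -1.2, 2.5, 0.7])
s1 = deepsets_summary(x)
s2 = deepsets_summary(x[::-1])        # same data, reversed order
print(np.allclose(s1, s2))           # True: the summary ignores ordering
```

A PEN generalizes this by pooling only over transformations that respect the Markovian block structure of the data, rather than over all permutations.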
Year | Venue | Field
---|---|---
2019 | arXiv: Machine Learning | Training set, Approximate Bayesian computation, Computer science, Invariant (mathematics), Artificial intelligence, Summary statistics, Probabilistic logic, Deep learning, Machine learning
DocType | Volume | Citations
---|---|---
Journal | abs/1901.10230 | 0

PageRank | References | Authors
---|---|---
0.34 | 4 | 4
Name | Order | Citations | PageRank
---|---|---|---
Samuel Wiqvist | 1 | 0 | 0.34 |
Pierre-Alexandre Mattei | 2 | 1 | 2.04 |
Umberto Picchini | 3 | 9 | 2.99 |
Jes Frellsen | 4 | 46 | 5.77 |