| Title |
| --- |
| Scalable transformed additive signal decomposition by non-conjugate Gaussian process inference |
| Abstract |
| --- |
| Many functions and signals of interest are formed by the addition of multiple underlying components, often nonlinearly transformed and corrupted by noise. Examples may be found in the literature on Generalized Additive Models [1], Underdetermined Source Separation [2], and other mode decomposition techniques. Recovery of the underlying component processes often depends on finding and exploiting statistical regularities within them. Gaussian Processes (GPs) [3] have become the dominant way to model statistical expectations over functions. Recent advances make inference of the GP posterior efficient for large-scale datasets and arbitrary likelihoods [4, 5]. Here we extend these methods to the additive GP case [6, 7], thus achieving scalable marginal posterior inference over each latent function in settings such as those above. |
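The additive-GP idea summarized in the abstract can be illustrated in its conjugate special case: when two latent functions with different kernels are summed and observed under Gaussian noise, the marginal posterior mean of each component has a closed form, E[f_i | y] = K_i (K_1 + K_2 + σ²I)⁻¹ y. This is only a minimal sketch of that classical identity; the paper's contribution, scalable inference under non-conjugate (non-Gaussian) likelihoods, is not implemented here, and all names and parameter values below are illustrative.

```python
import numpy as np

def rbf(x, y, lengthscale):
    # Squared-exponential kernel on 1-D inputs.
    d = x[:, None] - y[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
n = 200
x = np.linspace(0.0, 10.0, n)

# Two latent components with different lengthscales: a slow trend and a fast wiggle.
K1 = rbf(x, x, lengthscale=3.0)
K2 = rbf(x, x, lengthscale=0.3)
jitter = 1e-8 * np.eye(n)
f1 = rng.multivariate_normal(np.zeros(n), K1 + jitter)
f2 = rng.multivariate_normal(np.zeros(n), K2 + jitter)

# Observe the sum under Gaussian noise (the conjugate special case).
sigma2 = 0.1
y = f1 + f2 + rng.normal(0.0, np.sqrt(sigma2), n)

# Marginal posterior mean of each component:
#   E[f_i | y] = K_i @ (K1 + K2 + sigma2 * I)^{-1} @ y
Kyy = K1 + K2 + sigma2 * np.eye(n)
alpha = np.linalg.solve(Kyy, y)
m1 = K1 @ alpha
m2 = K2 @ alpha

# Sanity check: the component posterior means sum to the posterior mean of f1 + f2.
assert np.allclose(m1 + m2, (K1 + K2) @ alpha)
```

With a non-Gaussian likelihood (e.g. the observed signal passes through a nonlinearity), the linear solve above no longer applies, which is why the paper resorts to scalable approximate (variational) inference over each latent function.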
| Year | DOI | Venue |
| --- | --- | --- |
| 2016 | 10.1109/MLSP.2016.7738855 | 2016 IEEE 26th International Workshop on Machine Learning for Signal Processing (MLSP) |
| Keywords | Field | DocType |
| --- | --- | --- |
| scalable transformed additive signal decomposition, non-conjugate Gaussian process inference, nonlinear transformation, generalized additive model, underdetermined source separation, mode decomposition, statistical regularity, statistical expectation, scalable marginal posterior inference | Kernel (linear algebra), Random variable, Underdetermined system, Inference, Computer science, Artificial intelligence, Gaussian process, Generalized additive model, Machine learning, Source separation, Scalability | Conference |
| ISSN | ISBN | Citations |
| --- | --- | --- |
| 2161-0363 | 978-1-5090-0747-9 | 5 |
| PageRank | References | Authors |
| --- | --- | --- |
| 0.53 | 8 | 3 |
| Name | Order | Citations | PageRank |
| --- | --- | --- | --- |
| Vincent Adam | 1 | 25 | 2.54 |
| James Hensman | 2 | 265 | 20.05 |
| Maneesh Sahani | 3 | 441 | 50.70 |