Abstract |
---|
Gaussian processes (GPs) provide a gold standard for performance in online settings, such as sample-efficient control and black-box optimization, where we need to update a posterior distribution as we acquire data sequentially. However, updating a GP posterior to accommodate even a single new observation after having observed n points incurs at least O(n) computation in the exact setting. We show how to use structured kernel interpolation to efficiently reuse computations for constant-time O(1) online updates with respect to the number of points n, while retaining exact inference. We demonstrate the promise of our approach in a range of online regression and classification settings, Bayesian optimization, and active sampling to reduce error in malaria incidence forecasting. Code is available at https://github.com/wjmaddox/online_gp. |
Year | Venue | DocType
---|---|---|
2021 | 24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS) | Conference

Volume | ISSN | Citations
---|---|---|
130 | 2640-3498 | 1

PageRank | References | Authors
---|---|---|
0.34 | 0 | 4
Name | Order | Citations | PageRank |
---|---|---|---|
Stanton Samuel | 1 | 1 | 1.69 |
Wesley J. Maddox | 2 | 1 | 0.68 |
Ian Delbridge | 3 | 1 | 0.34 |
Andrew Gordon Wilson | 4 | 277 | 32.68 |