Abstract
We develop a method, called EnbPI, to construct distribution-free prediction intervals for dynamic time series; it wraps around any bootstrap ensemble estimator to construct sequential prediction intervals. EnbPI is closely related to the conformal prediction (CP) framework but does not require data exchangeability. Theoretically, these intervals attain finite-sample, approximately valid marginal coverage for broad classes of regression functions and time series with strongly mixing stochastic errors. Computationally, EnbPI avoids overfitting and requires neither data splitting nor training multiple ensemble estimators; it efficiently aggregates already-trained bootstrap estimators. In general, EnbPI is easy to implement, scalable to producing arbitrarily many prediction intervals sequentially, and well-suited to a wide range of regression functions. We perform extensive real-data analyses to demonstrate its effectiveness.
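The abstract's core recipe (train B bootstrap models once, compute leave-one-out aggregated residuals on the training data, and use their empirical quantile to widen the ensemble prediction) can be sketched as follows. This is a minimal illustration of the idea, not the paper's full algorithm: it omits the sliding residual update used for sequential prediction, and `fit_predict` is a hypothetical user-supplied callable, not an API from the paper.

```python
import numpy as np

def enbpi_intervals(X_train, y_train, X_test, fit_predict, B=20, alpha=0.1, rng=None):
    """Sketch of EnbPI-style intervals (no data splitting, single training pass).

    fit_predict(X_fit, y_fit, X_eval) -> predictions on X_eval (hypothetical signature).
    Returns (lower, upper) bounds for each test point at miscoverage level alpha.
    """
    rng = np.random.default_rng(rng)
    n = len(X_train)
    boot_train = np.zeros((B, n))                 # each model's predictions on train points
    boot_test = np.zeros((B, len(X_test)))        # each model's predictions on test points
    in_boot = np.zeros((B, n), dtype=bool)        # which train points each resample used
    X_eval = np.vstack([X_train, X_test])
    for b in range(B):
        idx = rng.integers(0, n, size=n)          # bootstrap resample with replacement
        in_boot[b, idx] = True
        preds = fit_predict(X_train[idx], y_train[idx], X_eval)
        boot_train[b], boot_test[b] = preds[:n], preds[n:]
    # Leave-one-out aggregation: score each train point only with models
    # whose bootstrap sample did NOT contain it, avoiding overfitting.
    resid = np.empty(n)
    for i in range(n):
        out = ~in_boot[:, i]
        mu_i = boot_train[out, i].mean() if out.any() else boot_train[:, i].mean()
        resid[i] = abs(y_train[i] - mu_i)
    mu_test = boot_test.mean(axis=0)              # aggregate ensemble prediction
    q = np.quantile(resid, 1 - alpha)             # empirical residual quantile
    return mu_test - q, mu_test + q
```

As a usage example, `fit_predict` could simply refit a least-squares model on each resample; any regression function works, which is the point of the wrapper design.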
| Year | Venue | DocType |
|---|---|---|
| 2021 | INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139 | Conference |

| Volume | ISSN | Citations |
|---|---|---|
| 139 | 2640-3498 | 0 |

| PageRank | References | Authors |
|---|---|---|
| 0.34 | 0 | 2 |