Conference Paper: Conformal Bayesian Computation
Title | Conformal Bayesian Computation
---|---
Authors | Fong, Edwin; Holmes, Chris
Issue Date | 2021
Citation | Advances in Neural Information Processing Systems, 2021, v. 22, p. 18268-18279
Abstract | We develop scalable methods for producing conformal Bayesian predictive intervals with finite sample calibration guarantees. Bayesian posterior predictive distributions, p(y \| x), characterize subjective beliefs on outcomes of interest, y, conditional on predictors, x. Bayesian prediction is well-calibrated when the model is true, but the predictive intervals may exhibit poor empirical coverage when the model is misspecified, under the so-called M-open perspective. In contrast, conformal inference provides finite sample frequentist guarantees on predictive confidence intervals without the requirement of model fidelity. Using 'add-one-in' importance sampling, we show that conformal Bayesian predictive intervals are efficiently obtained from re-weighted posterior samples of model parameters. Our approach contrasts with existing conformal methods that require expensive refitting of models or data-splitting to achieve computational efficiency. We demonstrate the utility on a range of examples, including extensions to partially exchangeable settings such as hierarchical models.
Persistent Identifier | http://hdl.handle.net/10722/330822
ISSN | 1049-5258
2020 SCImago Journal Rankings | 1.399
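The 'add-one-in' importance sampling step described in the abstract can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the authors' code: the function and variable names (`conformal_bayes_pvalue`, `loglik`, `loglik_new`) are hypothetical. Given log predictive densities evaluated at posterior samples, it re-weights the samples by the candidate outcome's likelihood, scores each point by its predictive density under the augmented posterior, and ranks the candidate among the observed points to obtain a conformal p-value.

```python
import numpy as np

def conformal_bayes_pvalue(loglik, loglik_new):
    """Conformal p-value for one candidate outcome, via re-weighted posterior samples.

    loglik     : (T, n) array, log p(y_i | x_i, theta_t) for posterior samples theta_t
    loglik_new : (T,)   array, log p(y_cand | x_new, theta_t) for the candidate outcome
    """
    # 'add-one-in' importance weights: w_t proportional to p(y_cand | x_new, theta_t),
    # normalised in log space for numerical stability
    logw = loglik_new - np.logaddexp.reduce(loglik_new)

    # conformity score = log posterior-predictive density under the augmented posterior
    score_train = np.logaddexp.reduce(logw[:, None] + loglik, axis=0)  # shape (n,)
    score_new = np.logaddexp.reduce(logw + loglik_new)                 # scalar

    # rank the candidate's conformity among the n observed points
    n = loglik.shape[1]
    return (1.0 + np.sum(score_train <= score_new)) / (n + 1.0)
```

A candidate y is included in the level-(1 − α) predictive interval when its p-value exceeds α; sweeping y over a grid yields the interval with no model refitting, since the same posterior samples are re-used for every candidate.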
DC Field | Value | Language
---|---|---
dc.contributor.author | Fong, Edwin | -
dc.contributor.author | Holmes, Chris | -
dc.date.accessioned | 2023-09-05T12:14:55Z | -
dc.date.available | 2023-09-05T12:14:55Z | -
dc.date.issued | 2021 | -
dc.identifier.citation | Advances in Neural Information Processing Systems, 2021, v. 22, p. 18268-18279 | -
dc.identifier.issn | 1049-5258 | -
dc.identifier.uri | http://hdl.handle.net/10722/330822 | -
dc.description.abstract | We develop scalable methods for producing conformal Bayesian predictive intervals with finite sample calibration guarantees. Bayesian posterior predictive distributions, p(y \| x), characterize subjective beliefs on outcomes of interest, y, conditional on predictors, x. Bayesian prediction is well-calibrated when the model is true, but the predictive intervals may exhibit poor empirical coverage when the model is misspecified, under the so-called M-open perspective. In contrast, conformal inference provides finite sample frequentist guarantees on predictive confidence intervals without the requirement of model fidelity. Using 'add-one-in' importance sampling, we show that conformal Bayesian predictive intervals are efficiently obtained from re-weighted posterior samples of model parameters. Our approach contrasts with existing conformal methods that require expensive refitting of models or data-splitting to achieve computational efficiency. We demonstrate the utility on a range of examples, including extensions to partially exchangeable settings such as hierarchical models. | -
dc.language | eng | -
dc.relation.ispartof | Advances in Neural Information Processing Systems | -
dc.relation.ispartof | NeurIPS 2021 (06/12/2021-14/12/2021, Virtual) | -
dc.title | Conformal Bayesian Computation | -
dc.type | Conference_Paper | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.scopus | eid_2-s2.0-85132048467 | -
dc.identifier.volume | 22 | -
dc.identifier.spage | 18268 | -
dc.identifier.epage | 18279 | -