Conference Paper: Conformal Bayesian Computation

Title: Conformal Bayesian Computation
Authors: Fong, Edwin; Holmes, Chris
Issue Date: 2021
Citation: Advances in Neural Information Processing Systems, 2021, v. 22, p. 18268-18279
Abstract: We develop scalable methods for producing conformal Bayesian predictive intervals with finite sample calibration guarantees. Bayesian posterior predictive distributions, p(y | x), characterize subjective beliefs on outcomes of interest, y, conditional on predictors, x. Bayesian prediction is well-calibrated when the model is true, but the predictive intervals may exhibit poor empirical coverage when the model is misspecified, under the so-called M-open perspective. In contrast, conformal inference provides finite sample frequentist guarantees on predictive confidence intervals without the requirement of model fidelity. Using 'add-one-in' importance sampling, we show that conformal Bayesian predictive intervals are efficiently obtained from re-weighted posterior samples of model parameters. Our approach contrasts with existing conformal methods that require expensive refitting of models or data-splitting to achieve computational efficiency. We demonstrate the utility on a range of examples including extensions to partially exchangeable settings such as hierarchical models.
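The 'add-one-in' importance sampling scheme described in the abstract can be sketched for a toy Gaussian location model. The function below is an illustrative assumption, not the authors' released implementation: posterior draws are re-weighted by the likelihood of each candidate outcome, so that conformity scores under the augmented dataset are obtained without refitting the model.

```python
import numpy as np

def conformal_bayes_interval(y_obs, theta_samples, y_grid, alpha=0.1, sigma=1.0):
    """Sketch of a conformal Bayesian predictive interval via
    'add-one-in' importance sampling, for a Gaussian location model
    y_i ~ N(theta, sigma^2).  Hypothetical helper, not the paper's code.

    y_obs         : observed outcomes, shape (n,)
    theta_samples : posterior draws of theta given y_obs, shape (T,)
    y_grid        : candidate outcomes to test for inclusion
    """
    n = len(y_obs)
    # Unnormalized predictive density of each observed y_i under each
    # posterior draw (the 1/sqrt(2*pi*sigma^2) constant cancels in ranks).
    lik = np.exp(-0.5 * ((y_obs[None, :] - theta_samples[:, None]) / sigma) ** 2)
    included = []
    for y_new in y_grid:
        # 'Add-one-in' weights: re-weight posterior draws as if y_new
        # were part of the data, w_t proportional to p(y_new | theta_t).
        p_new = np.exp(-0.5 * ((y_new - theta_samples) / sigma) ** 2)
        w = p_new / p_new.sum()
        # Conformity scores: re-weighted posterior predictive densities.
        scores = w @ lik        # one score per observed point
        score_new = w @ p_new   # score of the candidate itself
        # Conformal p-value: rank of the candidate among the data.
        p_value = (1 + np.sum(scores <= score_new)) / (n + 1)
        if p_value > alpha:
            included.append(y_new)
    return (min(included), max(included)) if included else None
```

With posterior draws from, e.g., a conjugate normal posterior, scanning a grid of candidate outcomes yields an interval with the finite-sample coverage guarantee sketched in the abstract; each candidate requires only a re-weighting of the same posterior samples, not a model refit.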
Persistent Identifier: http://hdl.handle.net/10722/330822
ISSN: 1049-5258
2020 SCImago Journal Rankings: 1.399

 

DC Field: Value
dc.contributor.author: Fong, Edwin
dc.contributor.author: Holmes, Chris
dc.date.accessioned: 2023-09-05T12:14:55Z
dc.date.available: 2023-09-05T12:14:55Z
dc.date.issued: 2021
dc.identifier.citation: Advances in Neural Information Processing Systems, 2021, v. 22, p. 18268-18279
dc.identifier.issn: 1049-5258
dc.identifier.uri: http://hdl.handle.net/10722/330822
dc.description.abstract: We develop scalable methods for producing conformal Bayesian predictive intervals with finite sample calibration guarantees. Bayesian posterior predictive distributions, p(y | x), characterize subjective beliefs on outcomes of interest, y, conditional on predictors, x. Bayesian prediction is well-calibrated when the model is true, but the predictive intervals may exhibit poor empirical coverage when the model is misspecified, under the so-called M-open perspective. In contrast, conformal inference provides finite sample frequentist guarantees on predictive confidence intervals without the requirement of model fidelity. Using 'add-one-in' importance sampling, we show that conformal Bayesian predictive intervals are efficiently obtained from re-weighted posterior samples of model parameters. Our approach contrasts with existing conformal methods that require expensive refitting of models or data-splitting to achieve computational efficiency. We demonstrate the utility on a range of examples including extensions to partially exchangeable settings such as hierarchical models.
dc.language: eng
dc.relation.ispartof: Advances in Neural Information Processing Systems
dc.relation.ispartof: NeurIPS 2021 (06/12/2021-14/12/2021, Virtual)
dc.title: Conformal Bayesian Computation
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.scopus: eid_2-s2.0-85132048467
dc.identifier.volume: 22
dc.identifier.spage: 18268
dc.identifier.epage: 18279
