Conference Paper: Scalable nonparametric sampling from multimodal posteriors with the posterior bootstrap

Title: Scalable nonparametric sampling from multimodal posteriors with the posterior bootstrap
Authors: Fong, Edwin; Lyddon, Simon; Holmes, Chris
Issue Date: 2019
Citation: 36th International Conference on Machine Learning, ICML 2019, 2019, v. 2019-June, p. 3443-3464
Abstract: Increasingly complex datasets pose a number of challenges for Bayesian inference. Conventional posterior sampling based on Markov chain Monte Carlo can be too computationally intensive, is serial in nature and mixes poorly between posterior modes. Furthermore, all models are misspecified, which brings into question the validity of the conventional Bayesian update. We present a scalable Bayesian nonparametric learning routine that enables posterior sampling through the optimization of suitably randomized objective functions. A Dirichlet process prior on the unknown data distribution accounts for model misspecification, and admits an embarrassingly parallel posterior bootstrap algorithm that generates independent and exact samples from the nonparametric posterior distribution. Our method is particularly adept at sampling from multimodal posterior distributions via a random restart mechanism, and we demonstrate this on Gaussian mixture model and sparse logistic regression examples.
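
As a reading aid, the sketch below illustrates the sampling scheme the abstract describes; it is a hypothetical simplification, not the authors' code. It omits the Dirichlet process prior pseudo-samples (reducing to the weighted likelihood bootstrap limit), and the function name, restart scale, and toy unit-variance normal model are illustrative assumptions. Each posterior draw maximizes an independently Dirichlet-weighted log-likelihood from a randomly perturbed starting point.

    import numpy as np
    from scipy.optimize import minimize

    def posterior_bootstrap_draw(x, loglik, theta_init, rng):
        # One draw: maximize a Dirichlet-randomized weighted log-likelihood.
        w = rng.dirichlet(np.ones(len(x)))              # random weights on the data
        start = theta_init + rng.normal(size=np.shape(theta_init))  # random restart
        objective = lambda theta: -np.dot(w, loglik(theta, x))
        return minimize(objective, start).x

    # Toy model: mean of a unit-variance normal. Each draw is independent,
    # so the loop below is embarrassingly parallel in practice.
    rng = np.random.default_rng(0)
    data = rng.normal(loc=2.0, size=100)
    loglik = lambda theta, x: -0.5 * (x - theta[0]) ** 2
    draws = np.array([posterior_bootstrap_draw(data, loglik, np.zeros(1), rng)
                      for _ in range(200)])
    print(draws.mean(), draws.std())                    # approx. 2.0 and 0.1

Because every draw is a fresh optimization from a perturbed start, restarts can settle in different local optima of the randomized objective, which is how the random restart mechanism reaches distinct posterior modes in the multimodal examples.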
Persistent Identifier: http://hdl.handle.net/10722/330632

DC Field                    Value
dc.contributor.author       Fong, Edwin
dc.contributor.author       Lyddon, Simon
dc.contributor.author       Holmes, Chris
dc.date.accessioned         2023-09-05T12:12:30Z
dc.date.available           2023-09-05T12:12:30Z
dc.date.issued              2019
dc.identifier.citation      36th International Conference on Machine Learning, ICML 2019, 2019, v. 2019-June, p. 3443-3464
dc.identifier.uri           http://hdl.handle.net/10722/330632
dc.description.abstract     (identical to the Abstract above)
dc.language                 eng
dc.relation.ispartof        36th International Conference on Machine Learning (10/06/2019-15/06/2019, Long Beach Convention Center, Long Beach)
dc.title                    Scalable nonparametric sampling from multimodal posteriors with the posterior bootstrap
dc.type                     Conference_Paper
dc.description.nature       link_to_subscribed_fulltext
dc.identifier.scopus        eid_2-s2.0-85079453005
dc.identifier.volume        2019-June
dc.identifier.spage         3443
dc.identifier.epage         3464
