Appears in Collections: Conference Paper: Distributed, partially collapsed MCMC for Bayesian Nonparametrics
Title | Distributed, partially collapsed MCMC for Bayesian Nonparametrics |
---|---|
Authors | Dubey, KA; Zhang, MM; Xing, EP; Williamson, SA |
Issue Date | 2020 |
Publisher | ML Research Press. The Proceedings' web site is located at http://proceedings.mlr.press/ |
Citation | The 23rd International Conference on Artificial Intelligence and Statistics (AISTATS) 2020, Virtual Conference, Palermo, Italy, 26-28 August 2020. In Proceedings of Machine Learning Research (PMLR), v. 108, p. 3685-3695 |
Abstract | Bayesian nonparametric (BNP) models provide elegant methods for discovering underlying latent features within a data set, but inference in such models can be slow. We exploit the fact that completely random measures, in terms of which commonly used models such as the Dirichlet process and the beta-Bernoulli process can be expressed, are decomposable into independent sub-measures. We use this decomposition to partition the latent measure into a finite measure containing only instantiated components, and an infinite measure containing all other components. We then select different inference algorithms for the two components: uncollapsed samplers mix well on the finite measure, while collapsed samplers mix well on the infinite, sparsely occupied tail. The resulting hybrid algorithm can be applied to a wide class of models, and can be easily distributed to allow scalable inference without sacrificing asymptotic convergence guarantees. |
Persistent Identifier | http://hdl.handle.net/10722/306011 |
ISSN | 2640-3498 |
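The abstract's central idea, splitting a completely random measure into a finite sub-measure over instantiated components and an infinite tail over everything else, can be illustrated for the Dirichlet process case via its predictive distribution, where instantiated components carry weight proportional to their counts and the tail carries total mass equal to the concentration parameter. This is a minimal sketch, not the paper's distributed algorithm; the function name `sample_component` and the Chinese-restaurant-style weights are assumptions made for illustration.

```python
import random

def sample_component(counts, alpha, rng=random):
    """One predictive draw from a Dirichlet process posterior, split into
    the two sub-measures of the decomposition: a finite measure over
    instantiated components (weight n_k for component k) and an infinite
    tail measure (total mass alpha) for not-yet-instantiated components."""
    total = sum(counts.values()) + alpha
    u = rng.random() * total
    for k, n_k in counts.items():
        u -= n_k
        if u < 0:
            return k  # landed in the finite, instantiated part
    # Landed in the infinite tail: instantiate a fresh component.
    return max(counts, default=-1) + 1
```

In the paper's hybrid scheme, draws that land in the finite part would be handled by an uncollapsed sampler, while draws from the tail would be handled by a collapsed sampler; this sketch only shows the underlying partition of probability mass.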
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Dubey, KA | - |
dc.contributor.author | Zhang, MM | - |
dc.contributor.author | Xing, EP | - |
dc.contributor.author | Williamson, SA | - |
dc.date.accessioned | 2021-10-20T10:17:33Z | - |
dc.date.available | 2021-10-20T10:17:33Z | - |
dc.date.issued | 2020 | - |
dc.identifier.citation | The 23rd International Conference on Artificial Intelligence and Statistics (AISTATS) 2020, Virtual Conference, Palermo, Italy, 26-28 August 2020. In Proceedings of Machine Learning Research (PMLR), v. 108, p. 3685-3695 | - |
dc.identifier.issn | 2640-3498 | - |
dc.identifier.uri | http://hdl.handle.net/10722/306011 | - |
dc.description.abstract | Bayesian nonparametric (BNP) models provide elegant methods for discovering underlying latent features within a data set, but inference in such models can be slow. We exploit the fact that completely random measures, in terms of which commonly used models such as the Dirichlet process and the beta-Bernoulli process can be expressed, are decomposable into independent sub-measures. We use this decomposition to partition the latent measure into a finite measure containing only instantiated components, and an infinite measure containing all other components. We then select different inference algorithms for the two components: uncollapsed samplers mix well on the finite measure, while collapsed samplers mix well on the infinite, sparsely occupied tail. The resulting hybrid algorithm can be applied to a wide class of models, and can be easily distributed to allow scalable inference without sacrificing asymptotic convergence guarantees. | - |
dc.language | eng | - |
dc.publisher | ML Research Press. The Proceedings' web site is located at http://proceedings.mlr.press/ | - |
dc.relation.ispartof | Proceedings of Machine Learning Research (PMLR) | - |
dc.relation.ispartof | The 23rd International Conference on Artificial Intelligence and Statistics (AISTATS) 2020 | - |
dc.title | Distributed, partially collapsed MCMC for Bayesian Nonparametrics | - |
dc.type | Conference_Paper | - |
dc.identifier.email | Zhang, MM: mzhang18@hku.hk | - |
dc.identifier.authority | Zhang, MM=rp02776 | - |
dc.identifier.hkuros | 327706 | - |
dc.identifier.volume | 108: Proceedings of AISTATS 2020 | - |
dc.identifier.spage | 3685 | - |
dc.identifier.epage | 3695 | - |
dc.publisher.place | United States | - |