Links to full text (may require subscription):
- Scopus: eid_2-s2.0-85057281190
- WOS: WOS:000683379206017
Conference Paper: Stochastic variance-reduced Hamilton Monte Carlo methods
Title | Stochastic variance-reduced Hamilton Monte Carlo methods |
---|---|
Authors | Zou, Difan; Xu, Pan; Gu, Quanquan |
Issue Date | 2018 |
Citation | 35th International Conference on Machine Learning, ICML 2018, 2018, v. 13, p. 9647-9656 |
Abstract | We propose a fast stochastic Hamilton Monte Carlo (HMC) method for sampling from a smooth and strongly log-concave distribution. At the core of our proposed method is a variance reduction technique inspired by recent advances in stochastic optimization. We show that, to achieve $\epsilon$ accuracy in 2-Wasserstein distance, our algorithm achieves $\tilde{O}\big(n + \kappa^2 d^{1/2}/\epsilon + \kappa^{4/3} d^{1/3} n^{2/3}/\epsilon^{2/3}\big)$ gradient complexity (i.e., number of component gradient evaluations), which outperforms the state-of-the-art HMC and stochastic gradient HMC methods in a wide regime. We also extend our algorithm to sampling from smooth and general log-concave distributions, and prove the corresponding gradient complexity as well. Experiments on both synthetic and real data demonstrate the superior performance of our algorithm. |
Persistent Identifier | http://hdl.handle.net/10722/316505 |
ISI Accession Number ID | WOS:000683379206017 |
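The abstract above combines an SVRG-style variance-reduced gradient estimator with the leapfrog steps of stochastic gradient HMC. The sketch below is a minimal illustration of that idea, not the paper's exact algorithm: the toy sum-decomposable quadratic potential, the function names (`grad_sum`, `svr_hmc`, `vr_grad`), and all hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sum-decomposable potential U(x) = sum_i 0.5 * (a_i . x - y_i)^2,
# which is smooth and strongly log-concave, matching the paper's setting.
n, d = 1000, 2
A = rng.normal(size=(n, d))
y = A @ np.array([1.0, -0.5]) + 0.1 * rng.normal(size=n)

def grad_sum(x, idx):
    """Sum of the component gradients grad U_i(x) over the given indices."""
    r = A[idx] @ x - y[idx]
    return A[idx].T @ r

def svr_hmc(num_epochs=50, inner_steps=10, leapfrog=5, batch=10, eta=1e-3):
    """Unadjusted HMC whose gradients come from an SVRG-style estimator."""
    x = np.zeros(d)
    samples = []
    for _ in range(num_epochs):
        x_snap = x.copy()                        # snapshot iterate
        g_snap = grad_sum(x_snap, np.arange(n))  # full gradient at snapshot

        def vr_grad(x):
            # Unbiased estimate of the full gradient; its variance shrinks
            # as x approaches the snapshot.
            idx = rng.integers(0, n, size=batch)
            return g_snap + (n / batch) * (grad_sum(x, idx) - grad_sum(x_snap, idx))

        for _ in range(inner_steps):
            v = rng.normal(size=d)               # resample momentum
            for _ in range(leapfrog):            # leapfrog with stochastic grads
                v -= 0.5 * eta * vr_grad(x)
                x = x + eta * v
                v -= 0.5 * eta * vr_grad(x)
            samples.append(x.copy())
    return np.array(samples)

samples = svr_hmc()
print("posterior mean estimate:", samples[len(samples) // 2:].mean(axis=0))
```

The key property is that the estimator stays unbiased while its variance vanishes near the snapshot, so small minibatches retain near-full-gradient accuracy; this is the mechanism behind the improved gradient complexity stated in the abstract.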
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Zou, Difan | - |
dc.contributor.author | Xu, Pan | - |
dc.contributor.author | Gu, Quanquan | - |
dc.date.accessioned | 2022-09-14T11:40:38Z | - |
dc.date.available | 2022-09-14T11:40:38Z | - |
dc.date.issued | 2018 | - |
dc.identifier.citation | 35th International Conference on Machine Learning, ICML 2018, 2018, v. 13, p. 9647-9656 | - |
dc.identifier.uri | http://hdl.handle.net/10722/316505 | - |
dc.description.abstract | We propose a fast stochastic Hamilton Monte Carlo (HMC) method for sampling from a smooth and strongly log-concave distribution. At the core of our proposed method is a variance reduction technique inspired by recent advances in stochastic optimization. We show that, to achieve $\epsilon$ accuracy in 2-Wasserstein distance, our algorithm achieves $\tilde{O}\big(n + \kappa^2 d^{1/2}/\epsilon + \kappa^{4/3} d^{1/3} n^{2/3}/\epsilon^{2/3}\big)$ gradient complexity (i.e., number of component gradient evaluations), which outperforms the state-of-the-art HMC and stochastic gradient HMC methods in a wide regime. We also extend our algorithm to sampling from smooth and general log-concave distributions, and prove the corresponding gradient complexity as well. Experiments on both synthetic and real data demonstrate the superior performance of our algorithm. | - |
dc.language | eng | - |
dc.relation.ispartof | 35th International Conference on Machine Learning, ICML 2018 | - |
dc.title | Stochastic variance-reduced Hamilton Monte Carlo methods | - |
dc.type | Conference_Paper | - |
dc.description.nature | link_to_OA_fulltext | - |
dc.identifier.scopus | eid_2-s2.0-85057281190 | - |
dc.identifier.volume | 13 | - |
dc.identifier.spage | 9647 | - |
dc.identifier.epage | 9656 | - |
dc.identifier.isi | WOS:000683379206017 | - |