Links for fulltext (may require subscription):
- Scopus: eid_2-s2.0-85090170599
- WOS: WOS:000534424303078
Conference Paper: Stochastic Gradient Hamiltonian Monte Carlo Methods with Recursive Variance Reduction
Title | Stochastic Gradient Hamiltonian Monte Carlo Methods with Recursive Variance Reduction |
---|---|
Authors | Zou, Difan; Xu, Pan; Gu, Quanquan |
Issue Date | 2019 |
Citation | Advances in Neural Information Processing Systems, 2019, v. 32 |
Abstract | Stochastic Gradient Hamiltonian Monte Carlo (SGHMC) algorithms have received increasing attention in both theory and practice. In this paper, we propose a Stochastic Recursive Variance-Reduced gradient HMC (SRVR-HMC) algorithm. It makes use of a semi-stochastic gradient estimator that recursively accumulates the gradient information to reduce the variance of the stochastic gradient. We provide a convergence analysis of SRVR-HMC for sampling from a class of non-log-concave distributions and show that SRVR-HMC converges faster than all existing HMC-type algorithms based on underdamped Langevin dynamics. Thorough experiments on synthetic and real-world datasets validate our theory and demonstrate the superiority of SRVR-HMC. |
Persistent Identifier | http://hdl.handle.net/10722/316553 |
ISSN | 1049-5258 |
2020 SCImago Journal Rankings | 1.399 |
ISI Accession Number ID | WOS:000534424303078 |
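The abstract describes a semi-stochastic gradient estimator that recursively accumulates gradient information to reduce variance, plugged into an underdamped-Langevin HMC update. The sketch below illustrates the general idea under stated assumptions: a SARAH/SPIDER-style recursive estimator with periodic full-gradient refreshes, combined with a simple Euler discretization of underdamped Langevin dynamics. All function and parameter names (`grad_fi`, `eta`, `gamma`, `u`, `epoch_len`) are illustrative, not the paper's notation, and the step sizes are not tuned to match the paper's theory.

```python
import numpy as np

def srvr_hmc_sketch(grad_fi, n, x0, n_iters=100, epoch_len=10,
                    batch=10, eta=0.01, gamma=1.0, u=1.0, rng=None):
    """Illustrative sketch of an SRVR-HMC-style sampler.

    grad_fi(x, idx) must return the MEAN of the component gradients
    grad f_i(x) over the index array idx. Hyperparameters are
    placeholders, not the paper's recommended settings.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = len(x0)
    x, v = x0.copy(), np.zeros(d)
    x_prev = x.copy()
    g = grad_fi(x, np.arange(n))  # full gradient at the epoch anchor
    for k in range(n_iters):
        if k % epoch_len == 0:
            # periodic full-gradient refresh at the start of each epoch
            g = grad_fi(x, np.arange(n))
        else:
            idx = rng.integers(0, n, size=batch)  # uniform minibatch
            # recursive (SARAH/SPIDER-style) semi-stochastic estimator:
            # reuse the previous estimate and correct it with the
            # minibatch gradient difference between consecutive iterates
            g = g + grad_fi(x, idx) - grad_fi(x_prev, idx)
        x_prev = x.copy()
        # Euler step for underdamped Langevin dynamics:
        # noise enters only the velocity equation
        noise = np.sqrt(2.0 * gamma * u * eta) * rng.standard_normal(d)
        v = v - eta * (gamma * v + u * g) + noise
        x = x + eta * v
    return x
```

For example, with component potentials f_i(x) = ||x - a_i||^2 / 2 the mean gradient over a minibatch is simply `x - a[idx].mean(axis=0)`, and the sampler draws approximately from a Gaussian centered at the data mean.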
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Zou, Difan | - |
dc.contributor.author | Xu, Pan | - |
dc.contributor.author | Gu, Quanquan | - |
dc.date.accessioned | 2022-09-14T11:40:44Z | - |
dc.date.available | 2022-09-14T11:40:44Z | - |
dc.date.issued | 2019 | - |
dc.identifier.citation | Advances in Neural Information Processing Systems, 2019, v. 32 | - |
dc.identifier.issn | 1049-5258 | - |
dc.identifier.uri | http://hdl.handle.net/10722/316553 | - |
dc.description.abstract | Stochastic Gradient Hamiltonian Monte Carlo (SGHMC) algorithms have received increasing attention in both theory and practice. In this paper, we propose a Stochastic Recursive Variance-Reduced gradient HMC (SRVR-HMC) algorithm. It makes use of a semi-stochastic gradient estimator that recursively accumulates the gradient information to reduce the variance of the stochastic gradient. We provide a convergence analysis of SRVR-HMC for sampling from a class of non-log-concave distributions and show that SRVR-HMC converges faster than all existing HMC-type algorithms based on underdamped Langevin dynamics. Thorough experiments on synthetic and real-world datasets validate our theory and demonstrate the superiority of SRVR-HMC. | - |
dc.language | eng | - |
dc.relation.ispartof | Advances in Neural Information Processing Systems | - |
dc.title | Stochastic Gradient Hamiltonian Monte Carlo Methods with Recursive Variance Reduction | -
dc.type | Conference_Paper | - |
dc.description.nature | link_to_OA_fulltext | - |
dc.identifier.scopus | eid_2-s2.0-85090170599 | - |
dc.identifier.volume | 32 | - |
dc.identifier.isi | WOS:000534424303078 | - |