Conference Paper: Stochastic Gradient Hamiltonian Monte Carlo Methods with Recursive Variance Reduction

Title: Stochastic Gradient Hamiltonian Monte Carlo Methods with Recursive Variance Reduction
Authors: Zou, Difan; Xu, Pan; Gu, Quanquan
Issue Date: 2019
Citation: Advances in Neural Information Processing Systems, 2019, v. 32
Abstract: Stochastic Gradient Hamiltonian Monte Carlo (SGHMC) algorithms have received increasing attention in both theory and practice. In this paper, we propose a Stochastic Recursive Variance-Reduced gradient HMC (SRVR-HMC) algorithm. It makes use of a semi-stochastic gradient estimator that recursively accumulates the gradient information to reduce the variance of the stochastic gradient. We provide a convergence analysis of SRVR-HMC for sampling from a class of non-log-concave distributions and show that SRVR-HMC converges faster than all existing HMC-type algorithms based on underdamped Langevin dynamics. Thorough experiments on synthetic and real-world datasets validate our theory and demonstrate the superiority of SRVR-HMC.
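To make the abstract concrete, here is a minimal NumPy sketch of the idea it describes: a SPIDER/SARAH-style recursive gradient estimator plugged into a discretized underdamped Langevin sampler. Everything in the sketch is an illustrative assumption, not the paper's method: the names (grad_fi, epoch_len, batch_size, eta, gamma) are not the paper's notation, the simple Euler-Maruyama step stands in for whatever integrator the paper actually uses, and hyperparameter values are arbitrary.

```python
import numpy as np

def srvr_hmc_sketch(grad_fi, n, dim, n_iters=1000, epoch_len=10,
                    batch_size=10, eta=1e-3, gamma=1.0, rng=None):
    """Illustrative sketch of recursive variance-reduced SGHMC.

    grad_fi(x, idx) is assumed to return the average gradient of the
    component functions f_i, i in idx, of a sum-decomposable negative
    log-density F(x) = (1/n) * sum_i f_i(x). A SPIDER/SARAH-style
    recursive estimator v tracks grad F(x) while a Euler-Maruyama
    discretization of underdamped Langevin dynamics drives sampling.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.zeros(dim)        # position
    u = np.zeros(dim)        # velocity (momentum)
    x_prev = x.copy()
    v = np.zeros(dim)        # recursive gradient estimator
    samples = []
    for k in range(n_iters):
        if k % epoch_len == 0:
            # Epoch anchor: a full-batch gradient resets the estimator.
            v = grad_fi(x, np.arange(n))
        else:
            # Recursive update: keep the previous estimate and correct
            # it with a minibatch of gradient *differences*, so gradient
            # information accumulates across iterations.
            idx = rng.integers(0, n, size=batch_size)
            v = v + grad_fi(x, idx) - grad_fi(x_prev, idx)
        x_prev = x.copy()
        # Underdamped Langevin step (Euler-Maruyama, unit temperature).
        noise = np.sqrt(2.0 * gamma * eta) * rng.standard_normal(dim)
        u = u - eta * (gamma * u + v) + noise
        x = x + eta * u
        samples.append(x.copy())
    return np.array(samples)

# Illustrative usage: sample from N(0, I), where the average component
# gradient of F(x) = ||x||^2 / 2 is simply x for any index set.
if __name__ == "__main__":
    draws = srvr_hmc_sketch(lambda x, idx: x, n=100, dim=2, n_iters=5000)
    print(draws[-5:])  # last few (approximate) draws
```

The recursive correction, as opposed to SVRG-style re-centering at a fixed snapshot, is what lets the estimator accumulate gradient information across iterations; this is the mechanism the abstract credits for the reduced variance and the faster convergence relative to prior underdamped-Langevin HMC variants.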
Persistent Identifier: http://hdl.handle.net/10722/316553
ISSN: 1049-5258
2020 SCImago Journal Rankings: 1.399
ISI Accession Number ID: WOS:000534424303078

DC Field: Value

dc.contributor.author: Zou, Difan
dc.contributor.author: Xu, Pan
dc.contributor.author: Gu, Quanquan
dc.date.accessioned: 2022-09-14T11:40:44Z
dc.date.available: 2022-09-14T11:40:44Z
dc.date.issued: 2019
dc.identifier.citation: Advances in Neural Information Processing Systems, 2019, v. 32
dc.identifier.issn: 1049-5258
dc.identifier.uri: http://hdl.handle.net/10722/316553
dc.description.abstract: Stochastic Gradient Hamiltonian Monte Carlo (SGHMC) algorithms have received increasing attention in both theory and practice. In this paper, we propose a Stochastic Recursive Variance-Reduced gradient HMC (SRVR-HMC) algorithm. It makes use of a semi-stochastic gradient estimator that recursively accumulates the gradient information to reduce the variance of the stochastic gradient. We provide a convergence analysis of SRVR-HMC for sampling from a class of non-log-concave distributions and show that SRVR-HMC converges faster than all existing HMC-type algorithms based on underdamped Langevin dynamics. Thorough experiments on synthetic and real-world datasets validate our theory and demonstrate the superiority of SRVR-HMC.
dc.language: eng
dc.relation.ispartof: Advances in Neural Information Processing Systems
dc.title: Stochastic Gradient Hamiltonian Monte Carlo Methods with Recursive Variance Reduction
dc.type: Conference_Paper
dc.description.nature: link_to_OA_fulltext
dc.identifier.scopus: eid_2-s2.0-85090170599
dc.identifier.volume: 32
dc.identifier.isi: WOS:000534424303078
