File Download

There are no files associated with this item.

Links for fulltext (may require subscription)

Article: Laplacian smoothing stochastic gradient Markov chain Monte Carlo

Title: Laplacian smoothing stochastic gradient Markov chain Monte Carlo
Authors: Wang, Bao; Zou, Difan; Gu, Quanquan; Osher, Stanley J.
Keywords: Langevin dynamics; Laplacian smoothing; Stochastic gradient
Issue Date: 2021
Citation: SIAM Journal on Scientific Computing, 2021, v. 43, n. 1, p. A26-A53
Abstract: As an important Markov chain Monte Carlo (MCMC) method, the stochastic gradient Langevin dynamics (SGLD) algorithm has achieved great success in Bayesian learning and posterior sampling. However, SGLD typically suffers from a slow convergence rate due to its large variance caused by the stochastic gradient. In order to alleviate these drawbacks, we leverage the recently developed Laplacian smoothing technique and propose a Laplacian smoothing stochastic gradient Langevin dynamics (LS-SGLD) algorithm. We prove that for sampling from both log-concave and non-log-concave densities, LS-SGLD achieves strictly smaller discretization error in 2-Wasserstein distance, although its mixing rate can be slightly slower. Experiments on both synthetic and real datasets verify our theoretical results and demonstrate the superior performance of LS-SGLD on different machine learning tasks including posterior sampling, Bayesian logistic regression, and training Bayesian convolutional neural networks.
Persistent Identifier: http://hdl.handle.net/10722/316575
ISSN: 1064-8275
2023 Impact Factor: 3.0
2023 SCImago Journal Rankings: 1.803
ISI Accession Number ID: WOS:000623833100011
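
The abstract describes the LS-SGLD update only at a high level. As a rough, non-authoritative sketch of how Laplacian smoothing can be combined with an SGLD step, the Python snippet below applies a smoothing operator A_sigma = I - sigma*L (with L the one-dimensional discrete Laplacian under periodic boundary conditions, inverted cheaply via the FFT, as in the Laplacian smoothing gradient descent literature) to the stochastic gradient, and scales the injected Gaussian noise by A_sigma^{-1/2} as in generic preconditioned Langevin dynamics. The function names, the FFT-based implementation, and the exact noise scaling are illustrative assumptions, not the paper's reference method.

import numpy as np

def laplacian_smooth(v, sigma, power=-1.0):
    # Apply A_sigma^power to a flat parameter vector v, where A_sigma = I - sigma * L
    # and L is the 1-D discrete Laplacian with periodic boundary conditions.
    # A_sigma is diagonalized by the discrete Fourier transform, with eigenvalues
    # 1 + 2*sigma - 2*sigma*cos(2*pi*k/d), so any power of it is cheap to apply.
    d = v.shape[0]
    eig = 1.0 + 2.0 * sigma - 2.0 * sigma * np.cos(2.0 * np.pi * np.arange(d) / d)
    return np.real(np.fft.ifft(np.fft.fft(v) * eig ** power))

def ls_sgld_step(theta, stochastic_grad, step_size, sigma, rng):
    # One hypothetical LS-SGLD step: smooth the stochastic gradient with A_sigma^{-1}
    # and scale the Gaussian noise by A_sigma^{-1/2} (preconditioned-Langevin form).
    g = stochastic_grad(theta)  # noisy gradient of the negative log-posterior
    drift = laplacian_smooth(g, sigma, power=-1.0)
    noise = laplacian_smooth(rng.standard_normal(theta.shape), sigma, power=-0.5)
    return theta - step_size * drift + np.sqrt(2.0 * step_size) * noise

# Toy usage: draw approximate samples from a 10-dimensional standard Gaussian,
# whose negative log-density has gradient theta.
rng = np.random.default_rng(0)
theta = np.zeros(10)
samples = []
for _ in range(5000):
    theta = ls_sgld_step(theta, lambda t: t, step_size=1e-2, sigma=1.0, rng=rng)
    samples.append(theta.copy())

With sigma = 0 the smoothing operator reduces to the identity and the step falls back to ordinary SGLD, which is a convenient sanity check for the implementation.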

 

DC Field / Value

dc.contributor.author: WANG, BAO
dc.contributor.author: ZOU, DIFAN
dc.contributor.author: GU, QUANQUAN
dc.contributor.author: OSHER, STANLEY J.
dc.date.accessioned: 2022-09-14T11:40:47Z
dc.date.available: 2022-09-14T11:40:47Z
dc.date.issued: 2021
dc.identifier.citation: SIAM Journal on Scientific Computing, 2021, v. 43, n. 1, p. A26-A53
dc.identifier.issn: 1064-8275
dc.identifier.uri: http://hdl.handle.net/10722/316575
dc.description.abstract: As an important Markov chain Monte Carlo (MCMC) method, the stochastic gradient Langevin dynamics (SGLD) algorithm has achieved great success in Bayesian learning and posterior sampling. However, SGLD typically suffers from a slow convergence rate due to its large variance caused by the stochastic gradient. In order to alleviate these drawbacks, we leverage the recently developed Laplacian smoothing technique and propose a Laplacian smoothing stochastic gradient Langevin dynamics (LS-SGLD) algorithm. We prove that for sampling from both log-concave and non-log-concave densities, LS-SGLD achieves strictly smaller discretization error in 2-Wasserstein distance, although its mixing rate can be slightly slower. Experiments on both synthetic and real datasets verify our theoretical results and demonstrate the superior performance of LS-SGLD on different machine learning tasks including posterior sampling, Bayesian logistic regression, and training Bayesian convolutional neural networks.
dc.language: eng
dc.relation.ispartof: SIAM Journal on Scientific Computing
dc.subject: Langevin dynamics
dc.subject: Laplacian smoothing
dc.subject: Stochastic gradient
dc.title: Laplacian smoothing stochastic gradient markov chain monte carlo
dc.type: Article
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1137/19M1294356
dc.identifier.scopus: eid_2-s2.0-85102541128
dc.identifier.volume: 43
dc.identifier.issue: 1
dc.identifier.spage: A26
dc.identifier.epage: A53
dc.identifier.eissn: 1095-7197
dc.identifier.isi: WOS:000623833100011
