File Download
There are no files associated with this item.
Links for fulltext (may require subscription):
- Publisher Website: 10.1137/19M1294356
- Scopus: eid_2-s2.0-85102541128
- Web of Science: WOS:000623833100011
Article: Laplacian Smoothing Stochastic Gradient Markov Chain Monte Carlo
Title | Laplacian Smoothing Stochastic Gradient Markov Chain Monte Carlo |
---|---|
Authors | Wang, Bao; Zou, Difan; Gu, Quanquan; Osher, Stanley J. |
Keywords | Langevin dynamics; Laplacian smoothing; Stochastic gradient |
Issue Date | 2021 |
Citation | SIAM Journal on Scientific Computing, 2021, v. 43, n. 1, p. A26-A53 |
Abstract | As an important Markov chain Monte Carlo (MCMC) method, the stochastic gradient Langevin dynamics (SGLD) algorithm has achieved great success in Bayesian learning and posterior sampling. However, SGLD typically suffers from a slow convergence rate due to its large variance caused by the stochastic gradient. In order to alleviate these drawbacks, we leverage the recently developed Laplacian smoothing technique and propose a Laplacian smoothing stochastic gradient Langevin dynamics (LS-SGLD) algorithm. We prove that for sampling from both log-concave and non-log-concave densities, LS-SGLD achieves strictly smaller discretization error in 2-Wasserstein distance, although its mixing rate can be slightly slower. Experiments on both synthetic and real datasets verify our theoretical results and demonstrate the superior performance of LS-SGLD on different machine learning tasks including posterior sampling, Bayesian logistic regression, and training Bayesian convolutional neural networks. (An illustrative sketch of the LS-SGLD update follows the record below.) |
Persistent Identifier | http://hdl.handle.net/10722/316575 |
ISSN | 1064-8275 (2023 Impact Factor: 3.0; 2023 SCImago Journal Rankings: 1.803) |
ISI Accession Number ID | WOS:000623833100011 |
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Wang, Bao | -
dc.contributor.author | Zou, Difan | -
dc.contributor.author | Gu, Quanquan | -
dc.contributor.author | Osher, Stanley J. | -
dc.date.accessioned | 2022-09-14T11:40:47Z | - |
dc.date.available | 2022-09-14T11:40:47Z | - |
dc.date.issued | 2021 | - |
dc.identifier.citation | SIAM Journal on Scientific Computing, 2021, v. 43, n. 1, p. A26-A53 | - |
dc.identifier.issn | 1064-8275 | - |
dc.identifier.uri | http://hdl.handle.net/10722/316575 | - |
dc.description.abstract | As an important Markov chain Monte Carlo (MCMC) method, the stochastic gradient Langevin dynamics (SGLD) algorithm has achieved great success in Bayesian learning and posterior sampling. However, SGLD typically suffers from a slow convergence rate due to its large variance caused by the stochastic gradient. In order to alleviate these drawbacks, we leverage the recently developed Laplacian smoothing technique and propose a Laplacian smoothing stochastic gradient Langevin dynamics (LS-SGLD) algorithm. We prove that for sampling from both log-concave and non-log-concave densities, LS-SGLD achieves strictly smaller discretization error in 2-Wasserstein distance, although its mixing rate can be slightly slower. Experiments on both synthetic and real datasets verify our theoretical results and demonstrate the superior performance of LS-SGLD on different machine learning tasks including posterior sampling, Bayesian logistic regression, and training Bayesian convolutional neural networks. | - |
dc.language | eng | - |
dc.relation.ispartof | SIAM Journal on Scientific Computing | - |
dc.subject | Langevin dynamics | - |
dc.subject | Laplacian smoothing | - |
dc.subject | Stochastic gradient | - |
dc.title | Laplacian Smoothing Stochastic Gradient Markov Chain Monte Carlo | -
dc.type | Article | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1137/19M1294356 | - |
dc.identifier.scopus | eid_2-s2.0-85102541128 | - |
dc.identifier.volume | 43 | - |
dc.identifier.issue | 1 | - |
dc.identifier.spage | A26 | - |
dc.identifier.epage | A53 | - |
dc.identifier.eissn | 1095-7197 | - |
dc.identifier.isi | WOS:000623833100011 | - |
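The abstract above describes the LS-SGLD update only in words. The sketch below is a minimal, hedged illustration of one plausible reading: the usual SGLD step with the stochastic gradient preconditioned by A_sigma^{-1} and the injected Gaussian noise by A_sigma^{-1/2}, where A_sigma = I - sigma*L and L is the 1-D circulant discrete Laplacian from the Laplacian smoothing literature. Because A_sigma is circulant, both preconditioners can be applied with FFTs. This is a sketch under those assumptions, not the paper's reference implementation; the function names and the `stoch_grad` callback are hypothetical.

```python
import numpy as np

def laplacian_smoothing_ops(d, sigma):
    """Build A_sigma^{-1} and A_sigma^{-1/2} for A_sigma = I - sigma * L,
    with L the 1-D circulant discrete Laplacian (d >= 2). A_sigma is
    circulant, so both operators reduce to elementwise FFT divisions."""
    # First column of A_sigma: [1 + 2*sigma, -sigma, 0, ..., 0, -sigma].
    c = np.zeros(d)
    c[0] = 1.0 + 2.0 * sigma
    c[1] += -sigma
    c[-1] += -sigma
    # Eigenvalues 1 + 2*sigma - 2*sigma*cos(2*pi*k/d) are real and >= 1,
    # so the divisions below are numerically stable.
    eig = np.fft.fft(c).real
    inv = lambda v: np.fft.ifft(np.fft.fft(v) / eig).real                # A_sigma^{-1} v
    inv_sqrt = lambda v: np.fft.ifft(np.fft.fft(v) / np.sqrt(eig)).real  # A_sigma^{-1/2} v
    return inv, inv_sqrt

def ls_sgld(theta0, stoch_grad, eta, sigma, n_steps, rng=None):
    """Hypothetical LS-SGLD loop:
    theta <- theta - eta * A^{-1} g + sqrt(2 * eta) * A^{-1/2} * noise."""
    rng = rng or np.random.default_rng()
    theta = np.asarray(theta0, dtype=float).copy()
    inv, inv_sqrt = laplacian_smoothing_ops(theta.size, sigma)
    samples = []
    for _ in range(n_steps):
        g = stoch_grad(theta)  # unbiased estimate of grad f(theta) from a minibatch
        noise = rng.standard_normal(theta.size)
        theta = theta - eta * inv(g) + np.sqrt(2.0 * eta) * inv_sqrt(noise)
        samples.append(theta.copy())
    return samples
```

With `sigma = 0`, `A_sigma` is the identity and the loop reduces to plain SGLD, which matches the abstract's framing of LS-SGLD as a smoothed variant of SGLD. Smoothing the gradient term is what damps the stochastic-gradient variance the abstract points to; whether the noise term is preconditioned exactly this way is part of the stated assumption.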