Article: Distributed Metropolis algorithm: convergence condition and optimal parallel speed-up

Title: Distributed Metropolis algorithm: convergence condition and optimal parallel speed-up
Authors: Feng, Weiming; Yi, Yitong
Keywords: Coupling; Distributed sampling; Markov chain Monte Carlo; Mixing time; Spin system
Issue Date: 2022
Citation: Scientia Sinica Informationis, 2022, v. 52, n. 2, p. 287-313
Abstract: The Metropolis algorithm is a fundamental Markov chain Monte Carlo (MCMC) sampling technique that can be used to draw random samples from high-dimensional probability distributions (i.e., Gibbs distributions) represented by probabilistic graphical models. The traditional Metropolis algorithm is a sequential algorithm. A classic result on its fast convergence states that when the Dobrushin-Shlosman condition for the Metropolis algorithm is satisfied, the algorithm converges rapidly within O(n log n) steps, where n is the number of variables. This paper studies a distributed variant of the Metropolis algorithm, called the local-Metropolis algorithm. We analyze the correctness and convergence of this new algorithm and show that it always converges to the correct Gibbs distribution; moreover, for a natural class of triangle-free probabilistic graphical models, as long as the same Dobrushin-Shlosman condition is satisfied, the local-Metropolis algorithm converges within O(log n) rounds of distributed computing. Compared to the traditional sequential Metropolis algorithm, this achieves an asymptotically optimal Ω(n) factor of speed-up. Concrete applications include distributed sampling algorithms for graph coloring, the hardcore model, and the Ising model.
Persistent Identifier: http://hdl.handle.net/10722/355015
ISSN: 1674-7267
2023 SCImago Journal Rankings: 0.310
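
For context, the following is a minimal illustrative sketch (not taken from the paper) of the classical sequential Metropolis chain for one of the applications named in the abstract, the hardcore (independent set) model. The function name metropolis_hardcore and the toy graph are hypothetical; the paper's contribution, the local-Metropolis algorithm, parallelizes such single-site updates across all variables per round and is not reproduced here.

    import random

    def metropolis_hardcore(adj, lam, steps, seed=0):
        """Sequential single-site Metropolis chain for the hardcore
        (independent set) model: the stationary weight of an
        independent set S is lam**len(S).

        adj   -- dict mapping each vertex to the set of its neighbours
        lam   -- fugacity lambda > 0
        steps -- number of single-site update steps
        """
        rng = random.Random(seed)
        state = {v: 0 for v in adj}      # start from the empty independent set
        vertices = list(adj)
        for _ in range(steps):
            v = rng.choice(vertices)     # pick a uniformly random vertex
            # propose "occupied" with probability lam / (1 + lam)
            proposal = 1 if rng.random() < lam / (1.0 + lam) else 0
            # "occupied" is accepted only if no neighbour is occupied;
            # "unoccupied" is always accepted
            if proposal == 0 or all(state[u] == 0 for u in adj[v]):
                state[v] = proposal
        return state

    # toy usage on a 4-cycle
    cycle = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
    sample = metropolis_hardcore(cycle, lam=0.5, steps=1000)

Under the Dobrushin-Shlosman-type condition discussed in the abstract, O(n log n) such single-site steps suffice for mixing; the paper's local-Metropolis variant updates all variables in parallel and needs only O(log n) synchronous rounds, which is the source of the Ω(n) speed-up.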

 

DC Fields
dc.contributor.author: Feng, Weiming
dc.contributor.author: Yi, Yitong
dc.date.accessioned: 2025-03-21T09:10:37Z
dc.date.available: 2025-03-21T09:10:37Z
dc.date.issued: 2022
dc.identifier.citation: Scientia Sinica Informationis, 2022, v. 52, n. 2, p. 287-313
dc.identifier.issn: 1674-7267
dc.identifier.uri: http://hdl.handle.net/10722/355015
dc.description.abstract: The Metropolis algorithm is a fundamental Markov chain Monte Carlo (MCMC) sampling technique that can be used to draw random samples from high-dimensional probability distributions (i.e., Gibbs distributions) represented by probabilistic graphical models. The traditional Metropolis algorithm is a sequential algorithm. A classic result on its fast convergence states that when the Dobrushin-Shlosman condition for the Metropolis algorithm is satisfied, the algorithm converges rapidly within O(n log n) steps, where n is the number of variables. This paper studies a distributed variant of the Metropolis algorithm, called the local-Metropolis algorithm. We analyze the correctness and convergence of this new algorithm and show that it always converges to the correct Gibbs distribution; moreover, for a natural class of triangle-free probabilistic graphical models, as long as the same Dobrushin-Shlosman condition is satisfied, the local-Metropolis algorithm converges within O(log n) rounds of distributed computing. Compared to the traditional sequential Metropolis algorithm, this achieves an asymptotically optimal Ω(n) factor of speed-up. Concrete applications include distributed sampling algorithms for graph coloring, the hardcore model, and the Ising model.
dc.language: eng
dc.relation.ispartof: Scientia Sinica Informationis
dc.subject: Coupling
dc.subject: Distributed sampling
dc.subject: Markov chain Monte Carlo
dc.subject: Mixing time
dc.subject: Spin system
dc.title: Distributed Metropolis algorithm: convergence condition and optimal parallel speed-up
dc.type: Article
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1360/SSI-2021-0127
dc.identifier.scopus: eid_2-s2.0-85127464169
dc.identifier.volume: 52
dc.identifier.issue: 2
dc.identifier.spage: 287
dc.identifier.epage: 313
dc.identifier.eissn: 2095-9486
