Article: Distributed Inference With Variational Message Passing in Gaussian Graphical Models: Tradeoffs in Message Schedules and Convergence Conditions

Title: Distributed Inference With Variational Message Passing in Gaussian Graphical Models: Tradeoffs in Message Schedules and Convergence Conditions
Authors: Li, Bin; Wu, Nan; Wu, Yik Chung
Keywords: convergence analysis; distributed inference; Gaussian graphical models; message schedule; Variational message passing
Issue Date: 8-Apr-2024
Publisher: Institute of Electrical and Electronics Engineers
Citation: IEEE Transactions on Signal Processing, 2024, v. 72, p. 2021-2035
Abstract: Message passing algorithms on graphical models offer a low-complexity, distributed paradigm for performing marginalization of a high-dimensional distribution. However, the convergence behavior of message passing algorithms can be heavily affected by the adopted message update schedule. In this paper, we focus on variational message passing (VMP) applied to Gaussian graphical models and analyze its convergence under different schedules. In particular, based on the update equations of VMP under the mean-field assumption, we prove that the mean vectors obtained from VMP are the exact marginal mean vectors under any valid message passing schedule, establishing the legitimacy of using VMP in Gaussian graphical models. Furthermore, three categories of valid message passing schedules, namely the serial, parallel, and randomized schedules, are considered for the VMP update. Under the basic serial schedule, VMP converges unconditionally but can be slow in large-scale distributed networks. To speed up the serial schedule, a group serial schedule is proposed that still guarantees VMP convergence. On the other hand, the parallel schedule and its damped variant are applied to accelerate VMP, and the corresponding necessary and sufficient convergence conditions are derived. To allow nodes with different local computation resources to compute messages more flexibly and efficiently, a randomized schedule is proposed for the VMP update, and its probabilistic necessary and sufficient convergence conditions are presented. Finally, numerical results and applications are presented to illustrate the trade-offs in the ease and speed of convergence.
Persistent Identifier: http://hdl.handle.net/10722/351697
ISSN: 1053-587X
2023 Impact Factor: 4.6
2023 SCImago Journal Rankings: 2.520
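The abstract above describes schedule-dependent convergence of mean-field VMP in Gaussian graphical models. As a rough illustration (not the authors' implementation), the sketch below assumes the standard reduction of the mean-field VMP mean updates for a Gaussian with precision matrix J and potential vector h to coordinate-wise iterations for solving J mu = h; the function vmp_means and its schedule, damping, and p_update parameters are hypothetical names used only for this example.

import numpy as np

# Hedged sketch: assumes the mean-field VMP mean updates in a Gaussian model
# with precision matrix J (symmetric positive definite) and potential vector h
# reduce to coordinate iterations for solving J @ mu = h.  The schedule decides
# which coordinates are refreshed in each sweep.
def vmp_means(J, h, schedule="serial", damping=0.0, p_update=1.0,
              iters=200, rng=None):
    """Iterate per-variable mean updates under a chosen schedule.

    schedule: "serial"   -- update variables one by one, each using the most
                            recent values of its neighbours (Gauss-Seidel-like);
              "parallel" -- update all variables from the previous sweep
                            (Jacobi-like), optionally damped;
              "random"   -- each variable refreshes with probability p_update.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(h)
    mu = np.zeros(n)
    d = np.diag(J)
    for _ in range(iters):
        if schedule == "serial":
            for i in range(n):
                # remove variable i's own contribution before re-solving for it
                mu[i] = (h[i] - J[i] @ mu + d[i] * mu[i]) / d[i]
        elif schedule == "parallel":
            new = (h - J @ mu + d * mu) / d
            mu = (1.0 - damping) * new + damping * mu  # damped parallel update
        elif schedule == "random":
            new = (h - J @ mu + d * mu) / d
            mask = rng.random(n) < p_update  # each node updates with prob. p_update
            mu = np.where(mask, new, mu)
    return mu

# Usage: any fixed point satisfies J @ mu = h, so the returned means should
# match the exact marginal means of N(J^{-1} h, J^{-1}).
J = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 5.0]])
h = np.array([1.0, 2.0, 3.0])
print(vmp_means(J, h, schedule="serial"))
print(np.linalg.solve(J, h))  # reference marginal means

Under this assumed reduction, the serial sweep behaves like a Gauss-Seidel iteration (convergent for any symmetric positive definite precision matrix), while the parallel sweep behaves like a (damped) Jacobi iteration whose convergence depends on the spectral radius of the iteration matrix, which mirrors the unconditional versus conditional convergence contrast highlighted in the abstract.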

 

DC Field: Value
dc.contributor.author: Li, Bin
dc.contributor.author: Wu, Nan
dc.contributor.author: Wu, Yik Chung
dc.date.accessioned: 2024-11-22T00:35:13Z
dc.date.available: 2024-11-22T00:35:13Z
dc.date.issued: 2024-04-08
dc.identifier.citation: IEEE Transactions on Signal Processing, 2024, v. 72, p. 2021-2035
dc.identifier.issn: 1053-587X
dc.identifier.uri: http://hdl.handle.net/10722/351697
dc.description.abstract: Message passing algorithms on graphical models offer a low-complexity, distributed paradigm for performing marginalization of a high-dimensional distribution. However, the convergence behavior of message passing algorithms can be heavily affected by the adopted message update schedule. In this paper, we focus on variational message passing (VMP) applied to Gaussian graphical models and analyze its convergence under different schedules. In particular, based on the update equations of VMP under the mean-field assumption, we prove that the mean vectors obtained from VMP are the exact marginal mean vectors under any valid message passing schedule, establishing the legitimacy of using VMP in Gaussian graphical models. Furthermore, three categories of valid message passing schedules, namely the serial, parallel, and randomized schedules, are considered for the VMP update. Under the basic serial schedule, VMP converges unconditionally but can be slow in large-scale distributed networks. To speed up the serial schedule, a group serial schedule is proposed that still guarantees VMP convergence. On the other hand, the parallel schedule and its damped variant are applied to accelerate VMP, and the corresponding necessary and sufficient convergence conditions are derived. To allow nodes with different local computation resources to compute messages more flexibly and efficiently, a randomized schedule is proposed for the VMP update, and its probabilistic necessary and sufficient convergence conditions are presented. Finally, numerical results and applications are presented to illustrate the trade-offs in the ease and speed of convergence.
dc.language: eng
dc.publisher: Institute of Electrical and Electronics Engineers
dc.relation.ispartof: IEEE Transactions on Signal Processing
dc.rights: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.subject: convergence analysis
dc.subject: distributed inference
dc.subject: Gaussian graphical models
dc.subject: message schedule
dc.subject: Variational message passing
dc.title: Distributed Inference With Variational Message Passing in Gaussian Graphical Models: Tradeoffs in Message Schedules and Convergence Conditions
dc.type: Article
dc.identifier.doi: 10.1109/TSP.2024.3385576
dc.identifier.scopus: eid_2-s2.0-85190173643
dc.identifier.volume: 72
dc.identifier.spage: 2021
dc.identifier.epage: 2035
dc.identifier.eissn: 1941-0476
dc.identifier.issnl: 1053-587X
