Conference Paper: Dynamic inference in probabilistic graphical models

Title: Dynamic inference in probabilistic graphical models
Authors: Feng, Weiming; He, Kun; Sun, Xiaoming; Yin, Yitong
Keywords: Dynamic inference; Gibbs sampling; Markov random field; Probabilistic graphical model
Issue Date: 2021
Citation: Leibniz International Proceedings in Informatics, LIPIcs, 2021, v. 185, article no. 25
Abstract: Probabilistic graphical models, such as Markov random fields (MRFs), are useful for describing high-dimensional distributions in terms of local dependence structures. Probabilistic inference is a fundamental problem for graphical models, and sampling is a main approach to it. In this paper, we study probabilistic inference problems when the graphical model itself changes dynamically over time. Such dynamic inference problems arise naturally in today’s applications, e.g. multivariate time-series data analysis and practical learning procedures. We give a dynamic algorithm for sampling-based probabilistic inference in MRFs, where each dynamic update can change the underlying graph and all parameters of the MRF simultaneously, as long as the total amount of change is bounded. More precisely, suppose that the MRF has n variables and polylogarithmically bounded maximum degree, and that N(n) independent samples suffice for the inference, for a polynomial function N(·). Our algorithm dynamically maintains an answer to the inference problem using Õ(nN(n)) space and Õ(N(n) + n) incremental time per update to the MRF, as long as the MRF satisfies the Dobrushin-Shlosman condition. This well-known condition has long been used to guarantee the efficiency of Markov chain Monte Carlo (MCMC) sampling in the traditional static setting. Compared to the static case, which requires Ω(nN(n)) time to redraw all N(n) samples whenever the MRF changes, our dynamic algorithm gives an Ω̃(min{n, N(n)})-factor speedup. Our approach relies on a novel dynamic sampling technique, which transforms local Markov chains (a.k.a. single-site dynamics) into dynamic sampling algorithms, and an “algorithmic Lipschitz” condition that we establish for sampling from graphical models: when the MRF changes by a small difference, samples can be modified to reflect the new distribution, with cost proportional to the difference in the MRF.
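
The abstract centers on two ingredients: single-site dynamics (Gibbs sampling) and repairing existing samples after a small change to the MRF. The Python sketch below illustrates only those two ideas on a toy Ising-type MRF. All names (IsingMRF, gibbs_sweep, repair_sample) are hypothetical, and the repair loop is a naive stand-in meant to convey that the work scales with the size of the change; it is not the paper's maintenance procedure and carries none of its Õ(N(n) + n) guarantee.

# Illustrative sketch only: a single-site Gibbs sampler ("local Markov chain")
# for a toy Ising-type MRF, plus a naive local repair step after a small
# parameter change. Hypothetical names; NOT the paper's dynamic algorithm.
import math
import random

class IsingMRF:
    def __init__(self, neighbors, beta):
        # neighbors: dict mapping vertex -> list of adjacent vertices
        # beta: dict mapping (min(u,v), max(u,v)) -> edge interaction strength
        self.neighbors = neighbors
        self.beta = beta

    def conditional_prob_plus(self, sigma, v):
        # P(sigma_v = +1 | spins of v's neighbors), determined by local structure.
        field = sum(self.beta[(min(v, u), max(v, u))] * sigma[u]
                    for u in self.neighbors[v])
        return 1.0 / (1.0 + math.exp(-2.0 * field))

def gibbs_sweep(mrf, sigma, rng):
    # One pass of single-site (Glauber) dynamics: resample each spin
    # from its conditional distribution given its neighbors.
    for v in mrf.neighbors:
        sigma[v] = +1 if rng.random() < mrf.conditional_prob_plus(sigma, v) else -1
    return sigma

def repair_sample(mrf_new, sigma, changed_vertices, rng, rounds=10):
    # Naive local repair after a small update: re-run single-site updates only
    # around the changed region. The paper's actual maintenance relies on a
    # coupling / "algorithmic Lipschitz" argument; this loop merely conveys
    # that the cost is proportional to the size of the change.
    affected = set(changed_vertices)
    for v in changed_vertices:
        affected.update(mrf_new.neighbors[v])
    for _ in range(rounds):
        for v in affected:
            sigma[v] = +1 if rng.random() < mrf_new.conditional_prob_plus(sigma, v) else -1
    return sigma

if __name__ == "__main__":
    # Tiny 3-vertex path graph with uniform interaction 0.2.
    neighbors = {0: [1], 1: [0, 2], 2: [1]}
    beta = {(0, 1): 0.2, (1, 2): 0.2}
    mrf = IsingMRF(neighbors, beta)
    rng = random.Random(0)
    sigma = {v: rng.choice([-1, +1]) for v in neighbors}
    for _ in range(100):              # burn-in sweeps of the static chain
        gibbs_sweep(mrf, sigma, rng)
    mrf.beta[(1, 2)] = 0.3            # a small dynamic update: change one edge weight in place
    repair_sample(mrf, sigma, changed_vertices=[1, 2], rng=rng)
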
Persistent Identifier: http://hdl.handle.net/10722/355007
ISSN: 1868-8969
2023 SCImago Journal Rankings: 0.796

 

DC Field: Value
dc.contributor.author: Feng, Weiming
dc.contributor.author: He, Kun
dc.contributor.author: Sun, Xiaoming
dc.contributor.author: Yin, Yitong
dc.date.accessioned: 2025-03-21T09:10:35Z
dc.date.available: 2025-03-21T09:10:35Z
dc.date.issued: 2021
dc.identifier.citation: Leibniz International Proceedings in Informatics, LIPIcs, 2021, v. 185, article no. 25
dc.identifier.issn: 1868-8969
dc.identifier.uri: http://hdl.handle.net/10722/355007
dc.description.abstract: Probabilistic graphical models, such as Markov random fields (MRFs), are useful for describing high-dimensional distributions in terms of local dependence structures. Probabilistic inference is a fundamental problem for graphical models, and sampling is a main approach to it. In this paper, we study probabilistic inference problems when the graphical model itself changes dynamically over time. Such dynamic inference problems arise naturally in today’s applications, e.g. multivariate time-series data analysis and practical learning procedures. We give a dynamic algorithm for sampling-based probabilistic inference in MRFs, where each dynamic update can change the underlying graph and all parameters of the MRF simultaneously, as long as the total amount of change is bounded. More precisely, suppose that the MRF has n variables and polylogarithmically bounded maximum degree, and that N(n) independent samples suffice for the inference, for a polynomial function N(·). Our algorithm dynamically maintains an answer to the inference problem using Õ(nN(n)) space and Õ(N(n) + n) incremental time per update to the MRF, as long as the MRF satisfies the Dobrushin-Shlosman condition. This well-known condition has long been used to guarantee the efficiency of Markov chain Monte Carlo (MCMC) sampling in the traditional static setting. Compared to the static case, which requires Ω(nN(n)) time to redraw all N(n) samples whenever the MRF changes, our dynamic algorithm gives an Ω̃(min{n, N(n)})-factor speedup. Our approach relies on a novel dynamic sampling technique, which transforms local Markov chains (a.k.a. single-site dynamics) into dynamic sampling algorithms, and an “algorithmic Lipschitz” condition that we establish for sampling from graphical models: when the MRF changes by a small difference, samples can be modified to reflect the new distribution, with cost proportional to the difference in the MRF.
dc.language: eng
dc.relation.ispartof: Leibniz International Proceedings in Informatics, LIPIcs
dc.subject: Dynamic inference
dc.subject: Gibbs sampling
dc.subject: Markov random field
dc.subject: Probabilistic graphical model
dc.title: Dynamic inference in probabilistic graphical models
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.4230/LIPIcs.ITCS.2021.25
dc.identifier.scopus: eid_2-s2.0-85115245563
dc.identifier.volume: 185
dc.identifier.spage: article no. 25
dc.identifier.epage: article no. 25
