Dynamic Inference in Probabilistic Graphical Models

Authors: Weiming Feng, Kun He, Xiaoming Sun, and Yitong Yin



File
  • LIPIcs.ITCS.2021.25.pdf (0.66 MB, 20 pages)

Author Details

Weiming Feng
  • State Key Laboratory for Novel Software Technology, Nanjing University, China
Kun He
  • Shenzhen University, China
  • Shenzhen Institute of Computing Sciences, China
Xiaoming Sun
  • State Key Laboratory of Computer Architecture, Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China
  • University of Chinese Academy of Sciences, Beijing, China
Yitong Yin
  • State Key Laboratory for Novel Software Technology, Nanjing University, China

Cite As

Weiming Feng, Kun He, Xiaoming Sun, and Yitong Yin. Dynamic Inference in Probabilistic Graphical Models. In 12th Innovations in Theoretical Computer Science Conference (ITCS 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 185, pp. 25:1-25:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)
https://doi.org/10.4230/LIPIcs.ITCS.2021.25

Abstract

Probabilistic graphical models, such as Markov random fields (MRFs), are useful for describing high-dimensional distributions in terms of local dependence structures. Probabilistic inference is a fundamental problem for graphical models, and sampling is a main approach to it. In this paper, we study probabilistic inference problems when the graphical model itself changes dynamically over time. Such dynamic inference problems arise naturally in today's applications, e.g., multivariate time-series data analysis and practical learning procedures. We give a dynamic algorithm for sampling-based probabilistic inference in MRFs, where each dynamic update can change the underlying graph and all parameters of the MRF simultaneously, as long as the total amount of change is bounded. More precisely, suppose that the MRF has n variables and polylogarithmically bounded maximum degree, and that N(n) independent samples are sufficient for the inference, for a polynomial function N(⋅). Our algorithm dynamically maintains an answer to the inference problem using Õ(n N(n)) space and Õ(N(n) + n) incremental time per update to the MRF, as long as the Dobrushin-Shlosman condition is satisfied by the MRFs. This well-known condition has long been used to guarantee the efficiency of Markov chain Monte Carlo (MCMC) sampling in the traditional static setting. Compared to the static case, which requires Ω(n N(n)) time to redraw all N(n) samples whenever the MRF changes, our dynamic algorithm gives an Ω̃(min{n, N(n)})-factor speedup. Our approach relies on a novel dynamic sampling technique, which transforms local Markov chains (a.k.a. single-site dynamics) into dynamic sampling algorithms, and on an "algorithmic Lipschitz" condition that we establish for sampling from graphical models: when the MRF changes by a small difference, samples can be modified to reflect the new distribution at a cost proportional to the difference in the MRF.
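To make the high-level idea concrete, the following Python sketch illustrates the intuition only, not the paper's actual algorithm: maintain a sample of an Ising-type MRF via single-site (Glauber/Gibbs) dynamics and, when a few couplings change, repair the existing sample by resampling only around the changed edges instead of redrawing from scratch. All names here (IsingMRF, resample_site, repair_after_update) and the choice of the one-hop repair region and sweep counts are hypothetical illustration choices, not from the paper.

import math
import random

class IsingMRF:
    """Ising model on an undirected graph: pi(x) is proportional to
    exp(sum over edges {u,v} of beta[u,v] * x_u * x_v), with x_v in {-1, +1}."""
    def __init__(self, n, beta):
        self.n = n                      # number of variables
        self.beta = dict(beta)          # {(u, v): coupling}, keys with u < v
        self.adj = {v: [] for v in range(n)}
        for (u, v) in self.beta:
            self.adj[u].append(v)
            self.adj[v].append(u)

    def edge(self, u, v):
        return self.beta.get((min(u, v), max(u, v)), 0.0)

def resample_site(mrf, x, v):
    """Single-site (Glauber) update: redraw x[v] from its conditional distribution."""
    field = sum(mrf.edge(u, v) * x[u] for u in mrf.adj[v])
    p_plus = 1.0 / (1.0 + math.exp(-2.0 * field))
    x[v] = +1 if random.random() < p_plus else -1

def gibbs_sweeps(mrf, x, sweeps):
    """Static sampler: run single-site dynamics for a number of full sweeps."""
    for _ in range(sweeps):
        for v in range(mrf.n):
            resample_site(mrf, x, v)

def repair_after_update(mrf, x, changed_edges, local_sweeps=20):
    """After a small change to the MRF (a few couplings changed), repair the
    existing sample by resampling only near the change. This mirrors the
    'algorithmic Lipschitz' intuition: the repair cost scales with the size
    of the update rather than with n."""
    for (u, v), new_beta in changed_edges.items():
        mrf.beta[(min(u, v), max(u, v))] = new_beta
        if v not in mrf.adj[u]:         # new edge: update adjacency
            mrf.adj[u].append(v)
            mrf.adj[v].append(u)
    touched = {w for e in changed_edges for w in e}
    region = set(touched)
    for w in touched:                   # one-hop neighbourhood around the update
        region.update(mrf.adj[w])
    for _ in range(local_sweeps):
        for v in region:
            resample_site(mrf, x, v)

if __name__ == "__main__":
    random.seed(0)
    n = 100
    # path graph with weak couplings (a comfortably high-temperature regime)
    beta = {(v, v + 1): 0.1 for v in range(n - 1)}
    mrf = IsingMRF(n, beta)
    x = [random.choice([-1, +1]) for _ in range(n)]
    gibbs_sweeps(mrf, x, sweeps=200)                 # draw an initial sample
    repair_after_update(mrf, x, {(10, 11): -0.1})    # small update, local repair
    print(sum(x))

In this toy setting the repair touches only a constant-size region, whereas the paper's algorithm maintains all N(n) samples simultaneously and proves, under the Dobrushin-Shlosman condition, that the expected repair cost per update is bounded as stated in the abstract.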

Subject Classification

ACM Subject Classification
  • Theory of computation → Design and analysis of algorithms
  • Computing methodologies → Machine learning
Keywords
  • Dynamic inference
  • probabilistic graphical model
  • Gibbs sampling
  • Markov random field
