Alexandre Dubray, Pierre Schaus, Siegfried Nijssen
Licensed under the Creative Commons Attribution 4.0 International license (CC BY 4.0)
Weighted model counting (WMC) plays a central role in probabilistic reasoning. Given that this problem is #P-hard, harder instances can generally only be addressed using approximate techniques based on sampling, which provide statistical convergence guarantees: the longer a sampling process runs, the more accurate the WMC estimate is likely to be. In this work, we propose a deterministic search-based approach that can also be stopped at any time and provides hard lower- and upper-bound guarantees on the true WMC. This approach uses a value heuristic that guides exploration first towards models with a high weight and leverages Limited Discrepancy Search to make the bounds converge faster. The validity, scalability, and convergence of our approach are tested and compared with state-of-the-art baseline methods on the problem of computing marginal probabilities in Bayesian networks and reliability estimation in probabilistic graphs.
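To make the abstract's bounding idea concrete, here is a minimal sketch of how a depth-first search over variable assignments can maintain hard anytime bounds on a WMC: the weight of satisfying subtrees accumulates into a lower bound, the weight of refuted subtrees tightens the upper bound, and anything unexplored when the budget runs out stays between the two. The function name, CNF encoding, and plain recursive DFS are illustrative assumptions; the paper's actual algorithm additionally uses Limited Discrepancy Search and targets projected WMC, neither of which is reproduced here.

```python
def wmc_anytime(clauses, w, budget):
    """Anytime lower/upper bounds on a weighted model count (sketch).

    clauses: CNF as a list of clauses; each clause is a list of nonzero
             ints, where v is the positive literal of variable v and -v
             its negation.
    w:       dict mapping each variable v to the weight of v=True; the
             weight of v=False is 1 - w[v], so the weights of all 2^n
             full assignments sum to 1.
    budget:  maximum number of search nodes to expand before stopping.
    """
    lower = 0.0    # total weight of models found so far (hard lower bound)
    refuted = 0.0  # total weight of subtrees proven unsatisfiable
    nodes = [0]    # mutable node counter shared with the nested search

    def simplify(cls, v, val):
        """Condition the CNF on v=val; None signals a conflict."""
        lit = v if val else -v
        out = []
        for c in cls:
            if lit in c:
                continue                   # clause satisfied, drop it
            reduced = [l for l in c if l != -lit]
            if not reduced:
                return None                # empty clause: unsatisfiable
            out.append(reduced)
        return out

    def dfs(cls, remaining, mass):
        nonlocal lower, refuted
        if nodes[0] >= budget:
            return           # budget spent: leave this subtree undecided
        nodes[0] += 1
        if cls is None:
            refuted += mass                # whole subtree has no model
            return
        if not cls:
            lower += mass                  # every extension is a model
            return
        v, rest = remaining[0], remaining[1:]
        # Value heuristic from the abstract: explore the heavier branch
        # first, so the lower bound grows as quickly as possible.
        vals = (True, False) if w[v] >= 0.5 else (False, True)
        for val in vals:
            branch_mass = mass * (w[v] if val else 1.0 - w[v])
            dfs(simplify(cls, v, val), rest, branch_mass)

    dfs(clauses, sorted(w), 1.0)
    # Unexplored mass lies between the two bounds at any stopping point.
    return lower, 1.0 - refuted
```

For the clause (x1 OR x2) with both weights 0.5, `wmc_anytime([[1, 2]], {1: 0.5, 2: 0.5}, 100)` converges to the exact count 0.75 on both bounds, while a budget of 1 returns the trivially valid (0.0, 1.0).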
@InProceedings{dubray_et_al:LIPIcs.CP.2024.10,
author = {Dubray, Alexandre and Schaus, Pierre and Nijssen, Siegfried},
title = {{Anytime Weighted Model Counting with Approximation Guarantees for Probabilistic Inference}},
booktitle = {30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
pages = {10:1--10:16},
series = {Leibniz International Proceedings in Informatics (LIPIcs)},
ISBN = {978-3-95977-336-2},
ISSN = {1868-8969},
year = {2024},
volume = {307},
editor = {Shaw, Paul},
publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
address = {Dagstuhl, Germany},
URL = {https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.10},
URN = {urn:nbn:de:0030-drops-206956},
doi = {10.4230/LIPIcs.CP.2024.10},
annote = {Keywords: Projected Weighted Model Counting, Limited Discrepancy Search, Approximate Method, Probabilistic Inference}
}