Weighted model counting (WMC) plays a central role in probabilistic reasoning. Since the problem is #P-hard, harder instances can generally only be addressed with approximate techniques based on sampling, which provide statistical convergence guarantees: the longer a sampling process runs, the more accurate its estimate of the WMC is likely to be. In this work, we propose a deterministic search-based approach that can also be stopped at any time and provides hard lower- and upper-bound guarantees on the true WMC. This approach uses a value heuristic that guides exploration first towards models with a high weight and leverages Limited Discrepancy Search to make the bounds converge faster. The validity, scalability, and convergence of our approach are tested and compared with state-of-the-art baseline methods on the problems of computing marginal probabilities in Bayesian networks and estimating reliability in probabilistic graphs.
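To make the anytime-bounds idea concrete, the following is a minimal sketch (not the authors' implementation) of how hard lower and upper bounds on a WMC can be maintained during an exhaustive depth-first search over a CNF formula. It branches on the heavier literal first as a simple value heuristic; the Limited Discrepancy Search ordering and all other refinements from the paper are omitted, and the `anytime_wmc` function, its CNF encoding, and the literal-weight dictionary are illustrative assumptions.

```python
from typing import Dict, Iterator, List, Tuple

# Illustrative encoding (not the paper's): a clause is a list of integer
# literals (positive = variable true, negative = variable false), a formula is
# a list of clauses, and `weights` maps every literal to a non-negative weight.
Clause = List[int]
Formula = List[Clause]
Weights = Dict[int, float]


def anytime_wmc(formula: Formula, variables: List[int],
                weights: Weights) -> Iterator[Tuple[float, float]]:
    """Depth-first search over assignments that yields (lower, upper) bounds
    on the weighted model count; stopping early keeps the bounds valid."""
    n = len(variables)

    # rem[d] = total weight mass of the variables not yet assigned at depth d.
    rem = [1.0] * (n + 1)
    for i in range(n - 1, -1, -1):
        v = variables[i]
        rem[i] = rem[i + 1] * (weights[v] + weights[-v])

    state = {"lb": 0.0, "ub": rem[0]}  # 0 <= WMC <= total mass holds trivially

    def falsified(assignment: Dict[int, bool]) -> bool:
        # A clause is falsified once all its literals are assigned and false.
        return any(all(abs(lit) in assignment and
                       assignment[abs(lit)] != (lit > 0) for lit in clause)
                   for clause in formula)

    def explore(depth: int, assignment: Dict[int, bool],
                partial: float) -> Iterator[Tuple[float, float]]:
        mass = partial * rem[depth]  # weight mass of this whole subtree
        if falsified(assignment):
            state["ub"] -= mass      # subtree refuted: tighten the upper bound
            yield state["lb"], state["ub"]
            return
        if depth == n:
            state["lb"] += mass      # complete, non-falsified assignment is a model
            yield state["lb"], state["ub"]
            return
        var = variables[depth]
        # Value heuristic: branch on the heavier literal first so high-weight
        # models are found early and the lower bound rises quickly.
        order = [True, False] if weights[var] >= weights[-var] else [False, True]
        for value in order:
            assignment[var] = value
            yield from explore(depth + 1, assignment,
                               partial * (weights[var] if value else weights[-var]))
            del assignment[var]

    yield from explore(0, {}, 1.0)


if __name__ == "__main__":
    # Toy formula (x1 or x2); exact WMC = 0.9*0.7 + 0.9*0.3 + 0.1*0.7 = 0.97.
    formula = [[1, 2]]
    weights = {1: 0.9, -1: 0.1, 2: 0.7, -2: 0.3}
    for lb, ub in anytime_wmc(formula, [1, 2], weights):
        print(f"lower={lb:.2f}  upper={ub:.2f}")
```

At every point of the search, the confirmed model mass is a valid lower bound and the total mass minus the refuted mass is a valid upper bound, so interrupting the loop at any iteration still yields hard guarantees on the true WMC.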