19 Search Results for "Capelli, Florent"


Document
Research
Designing Output Sensitive Algorithms for Subgraph Enumeration

Authors: Alessio Conte, Kazuhiro Kurita, Andrea Marino, Giulia Punzi, Takeaki Uno, and Kunihiro Wasa

Published in: OASIcs, Volume 132, From Strings to Graphs, and Back Again: A Festschrift for Roberto Grossi's 60th Birthday (2025)


Abstract
The enumeration of all subgraphs respecting some structural property is a fundamental task in theoretical computer science, with practical applications in many branches of data mining and network analysis. It is often of interest to only consider solutions (subgraphs) that are maximal under inclusion, and to achieve output-sensitive complexity, i.e., bounding the running time with respect to the number of subgraphs produced. In this paper, we provide a survey of techniques for designing output-sensitive algorithms for subgraph enumeration, including partition-based approaches such as flashlight search, solution-graph traversal methods such as reverse search, and cost amortization strategies such as push-out amortization. We also briefly discuss classes of efficiency, hardness of enumeration, and variants such as approximate enumeration. The paper is meant as an accessible handbook for learning the basics of the field and as a practical reference for selecting state-of-the-art subgraph enumeration strategies that fit one's needs.
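
To make the flashlight (binary partition) technique concrete, the following Python sketch (illustrative only, not code from the paper) enumerates the simple s-t paths of a graph: each branching step extends the current path only with neighbours from which the target is still reachable, so every branch leads to at least one output and the delay stays polynomial.

def reachable(graph, source, target, forbidden):
    """Check whether target is reachable from source while avoiding 'forbidden' vertices."""
    seen, stack = {source}, [source]
    while stack:
        v = stack.pop()
        if v == target:
            return True
        for w in graph[v]:
            if w not in seen and w not in forbidden:
                seen.add(w)
                stack.append(w)
    return False

def enumerate_st_paths(graph, s, t, prefix=None):
    """Flashlight search: only extend the current path in directions that
    provably lead to at least one solution, so no branch is wasted."""
    prefix = prefix or [s]
    v = prefix[-1]
    if v == t:
        yield list(prefix)
        return
    for w in graph[v]:
        if w in prefix:
            continue
        # extendability test: is t still reachable once w is added to the path?
        if reachable(graph, w, t, set(prefix)):
            prefix.append(w)
            yield from enumerate_st_paths(graph, s, t, prefix)
            prefix.pop()

# toy usage
g = {1: [2, 3], 2: [1, 3, 4], 3: [1, 2, 4], 4: [2, 3]}
for p in enumerate_st_paths(g, 1, 4):
    print(p)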

Cite as

Alessio Conte, Kazuhiro Kurita, Andrea Marino, Giulia Punzi, Takeaki Uno, and Kunihiro Wasa. Designing Output Sensitive Algorithms for Subgraph Enumeration. In From Strings to Graphs, and Back Again: A Festschrift for Roberto Grossi's 60th Birthday. Open Access Series in Informatics (OASIcs), Volume 132, pp. 19:1-19:40, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{conte_et_al:OASIcs.Grossi.19,
  author =	{Conte, Alessio and Kurita, Kazuhiro and Marino, Andrea and Punzi, Giulia and Uno, Takeaki and Wasa, Kunihiro},
  title =	{{Designing Output Sensitive Algorithms for Subgraph Enumeration}},
  booktitle =	{From Strings to Graphs, and Back Again: A Festschrift for Roberto Grossi's 60th Birthday},
  pages =	{19:1--19:40},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-391-1},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{132},
  editor =	{Conte, Alessio and Marino, Andrea and Rosone, Giovanna and Vitter, Jeffrey Scott},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.Grossi.19},
  URN =		{urn:nbn:de:0030-drops-238180},
  doi =		{10.4230/OASIcs.Grossi.19},
  annote =	{Keywords: Graph algorithms, Graph enumeration, Output sensitive enumeration}
}
Document
Practically Feasible Proof Logging for Pseudo-Boolean Optimization

Authors: Wietze Koops, Daniel Le Berre, Magnus O. Myreen, Jakob Nordström, Andy Oertel, Yong Kiam Tan, and Marc Vinyals

Published in: LIPIcs, Volume 340, 31st International Conference on Principles and Practice of Constraint Programming (CP 2025)


Abstract
Certifying solvers have long been standard for decision problems in Boolean satisfiability (SAT), allowing for proof logging and checking with very limited overhead, but developing similar tools for combinatorial optimization has remained a challenge. A recent promising approach covering a wide range of solving paradigms is pseudo-Boolean proof logging, but this has mostly consisted of proof-of-concept works far from delivering the performance required for real-world deployment. In this work, we present an efficient toolchain based on VeriPB and CakePB for formally verified pseudo-Boolean optimization. We implement proof logging for the full range of techniques in the state-of-the-art solvers RoundingSat and Sat4j, including core-guided search and linear programming integration with Farkas certificates and cut generation. Our experimental evaluation shows that proof logging and checking performance in this much more expressive paradigm is now quite close to the level of SAT solving, and hence is clearly practically feasible.
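
Pseudo-Boolean proof logging builds on cutting-planes-style reasoning over 0-1 linear constraints. The following Python sketch (a generic illustration of the rules, not the VeriPB proof syntax) shows three basic derivation steps, linear combination, division, and saturation, on constraints of the form sum(a_i * l_i) >= A with non-negative integer coefficients over literals.

from math import ceil

def normalize(coefs, degree):
    """Cancel opposite literals using x + ~x = 1 and drop zero coefficients."""
    for lit in [l for l in coefs if l > 0]:
        if -lit in coefs:
            c = min(coefs[lit], coefs[-lit])
            coefs[lit] -= c
            coefs[-lit] -= c
            degree -= c
    coefs = {l: a for l, a in coefs.items() if a > 0}
    return coefs, max(degree, 0)

def add(c1, c2, k1=1, k2=1):
    """Linear combination k1*c1 + k2*c2 of two >= constraints."""
    (a1, d1), (a2, d2) = c1, c2
    coefs = {}
    for lits, k in ((a1, k1), (a2, k2)):
        for lit, a in lits.items():
            coefs[lit] = coefs.get(lit, 0) + k * a
    return normalize(coefs, k1 * d1 + k2 * d2)

def divide(c, d):
    """Division rule: divide by d, rounding coefficients and degree up."""
    coefs, degree = c
    return {l: ceil(a / d) for l, a in coefs.items()}, ceil(degree / d)

def saturate(c):
    """Saturation: no coefficient needs to exceed the degree."""
    coefs, degree = c
    return {l: min(a, degree) for l, a in coefs.items()}, degree

# literals: +i is x_i, -i is its negation
c1 = ({1: 2, 2: 1}, 2)        # 2*x1 + x2 >= 2
c2 = ({-1: 2, 3: 1}, 2)       # 2*~x1 + x3 >= 2
print(add(c1, c2))            # x2 + x3 >= 2 (the 2*x1 and 2*~x1 cancel)
print(divide(({1: 3, 2: 3}, 4), 3))   # x1 + x2 >= 2
print(saturate(({1: 5, 2: 1}, 2)))    # 2*x1 + x2 >= 2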

Cite as

Wietze Koops, Daniel Le Berre, Magnus O. Myreen, Jakob Nordström, Andy Oertel, Yong Kiam Tan, and Marc Vinyals. Practically Feasible Proof Logging for Pseudo-Boolean Optimization. In 31st International Conference on Principles and Practice of Constraint Programming (CP 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 340, pp. 21:1-21:27, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{koops_et_al:LIPIcs.CP.2025.21,
  author =	{Koops, Wietze and Le Berre, Daniel and Myreen, Magnus O. and Nordstr\"{o}m, Jakob and Oertel, Andy and Tan, Yong Kiam and Vinyals, Marc},
  title =	{{Practically Feasible Proof Logging for Pseudo-Boolean Optimization}},
  booktitle =	{31st International Conference on Principles and Practice of Constraint Programming (CP 2025)},
  pages =	{21:1--21:27},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-380-5},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{340},
  editor =	{de la Banda, Maria Garcia},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2025.21},
  URN =		{urn:nbn:de:0030-drops-238825},
  doi =		{10.4230/LIPIcs.CP.2025.21},
  annote =	{Keywords: proof logging, certifying algorithms, combinatorial optimization, certification, pseudo-Boolean solving, 0-1 integer linear programming}
}
Document
Certifying Projected Knowledge Compilation

Authors: Randal E. Bryant, Yong Kiam Tan, and Marijn J. H. Heule

Published in: LIPIcs, Volume 341, 28th International Conference on Theory and Applications of Satisfiability Testing (SAT 2025)


Abstract
Knowledge compilers convert Boolean formulas, given in conjunctive normal form (CNF), into representations that enable efficient evaluation of unweighted and weighted model counts, as well as a variety of other useful properties. With projected knowledge compilation, the generated representation describes the restriction of the formula to a designated set of data variables, with the remaining ones eliminated by existential quantification. Projected knowledge compilation has applications in a variety of domains, including formal verification and synthesis. This paper describes a formally verified proof framework for certifying the output of a projected knowledge compiler. It builds on an earlier clausal proof framework for certifying the output of a standard knowledge compiler. Extending the framework to projected compilation requires a method to represent Skolem assignments, describing how the quantified variables can be assigned, given an assignment for the data variables. We do so by extending the representation generated by the knowledge compiler to also encode Skolem assignments. We also refine the earlier framework, moving beyond purely clausal proofs to enable scaling certification to larger formulas. We present experimental results obtained by making small modifications to the D4 projected knowledge compiler and extensions of our earlier proof generator. We detail a soundness argument stating that a compiler output that passes our certifier is logically equivalent to the quantified input formula; the soundness argument has been formally validated using the HOL4 proof assistant. The checker also ensures that the compiler output satisfies the properties required for efficient unweighted and weighted model counting. We have developed two proof checkers for the certification framework: one written in C and designed for high performance and one written in CakeML and formally verified in HOL4.
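
For reference, projected model counting, the quantity the certified compiler output must support, asks for the number of assignments to the data variables X that extend to at least one model of the CNF. The brute-force Python sketch below (illustrative only, exponential in the number of variables) makes that semantics concrete on tiny inputs.

from itertools import product

def satisfies(cnf, assignment):
    """cnf: list of clauses, each a list of non-zero ints (DIMACS-style literals)."""
    return all(any(assignment[abs(l)] == (l > 0) for l in clause) for clause in cnf)

def projected_model_count(cnf, variables, data_vars):
    """Count assignments to data_vars that extend to at least one model of cnf."""
    other = [v for v in variables if v not in data_vars]
    count = 0
    for data_bits in product([False, True], repeat=len(data_vars)):
        partial = dict(zip(data_vars, data_bits))
        # existentially quantify the remaining variables
        for rest_bits in product([False, True], repeat=len(other)):
            assignment = {**partial, **dict(zip(other, rest_bits))}
            if satisfies(cnf, assignment):
                count += 1
                break
    return count

# (x1 or x2) and (not x1 or x3), projected onto X = {x1, x2}
cnf = [[1, 2], [-1, 3]]
print(projected_model_count(cnf, [1, 2, 3], [1, 2]))   # 3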

Cite as

Randal E. Bryant, Yong Kiam Tan, and Marijn J. H. Heule. Certifying Projected Knowledge Compilation. In 28th International Conference on Theory and Applications of Satisfiability Testing (SAT 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 341, pp. 8:1-8:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{bryant_et_al:LIPIcs.SAT.2025.8,
  author =	{Bryant, Randal E. and Tan, Yong Kiam and Heule, Marijn J. H.},
  title =	{{Certifying Projected Knowledge Compilation}},
  booktitle =	{28th International Conference on Theory and Applications of Satisfiability Testing (SAT 2025)},
  pages =	{8:1--8:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-381-2},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{341},
  editor =	{Berg, Jeremias and Nordstr\"{o}m, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SAT.2025.8},
  URN =		{urn:nbn:de:0030-drops-237422},
  doi =		{10.4230/LIPIcs.SAT.2025.8},
  annote =	{Keywords: Knowledge Compilation, Propositional model counting, Proof checking}
}
Document
A Simple Algorithm for Worst Case Optimal Join and Sampling

Authors: Florent Capelli, Oliver Irwin, and Sylvain Salvati

Published in: LIPIcs, Volume 328, 28th International Conference on Database Theory (ICDT 2025)


Abstract
We present an elementary branch and bound algorithm with a simple analysis of why it achieves worst-case optimality for join queries on classes of databases defined respectively by cardinality or acyclic degree constraints. We then show that if one is given a reasonable way for recursively estimating upper bounds on the number of answers of the join queries, our algorithm can be turned into an algorithm for uniformly sampling answers with expected running time Õ(UP/OUT) where UP is the upper bound, OUT is the actual number of answers and Õ(⋅) ignores polylogarithmic factors. Our approach recovers recent results on worst-case optimal join algorithms and sampling in a modular, clean and elementary way.
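
The variable-at-a-time branching that worst-case optimal join algorithms share can be sketched in a few lines. The Python fragment below (a generic illustration on the triangle query, not the paper's branch and bound procedure) instantiates one variable at a time and only branches on values allowed by every relation constraining that variable.

from collections import defaultdict

def index(rel):
    """Index a binary relation by its first column."""
    idx = defaultdict(set)
    for x, y in rel:
        idx[x].add(y)
    return idx

def triangles(R, S, T):
    """Enumerate Q(a,b,c) = R(a,b), S(b,c), T(a,c) one variable at a time,
    always branching on candidates that every relevant relation allows."""
    Ridx, Sidx, Tidx = index(R), index(S), index(T)
    for a in set(Ridx) & set(Tidx):             # a must appear in R and T
        for b in Ridx[a] & set(Sidx):           # b consistent with R(a,b) and S
            for c in Sidx[b] & Tidx[a]:         # c consistent with S(b,c) and T(a,c)
                yield (a, b, c)

R = {(1, 2), (1, 3), (2, 3)}
S = {(2, 3), (3, 1), (3, 4)}
T = {(1, 3), (2, 1), (1, 4)}
print(sorted(triangles(R, S, T)))   # [(1, 2, 3), (1, 3, 4), (2, 3, 1)]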

Cite as

Florent Capelli, Oliver Irwin, and Sylvain Salvati. A Simple Algorithm for Worst Case Optimal Join and Sampling. In 28th International Conference on Database Theory (ICDT 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 328, pp. 23:1-23:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{capelli_et_al:LIPIcs.ICDT.2025.23,
  author =	{Capelli, Florent and Irwin, Oliver and Salvati, Sylvain},
  title =	{{A Simple Algorithm for Worst Case Optimal Join and Sampling}},
  booktitle =	{28th International Conference on Database Theory (ICDT 2025)},
  pages =	{23:1--23:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-364-5},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{328},
  editor =	{Roy, Sudeepa and Kara, Ahmet},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICDT.2025.23},
  URN =		{urn:nbn:de:0030-drops-229647},
  doi =		{10.4230/LIPIcs.ICDT.2025.23},
  annote =	{Keywords: join queries, worst-case optimality, uniform sampling}
}
Document
Database Theory in Action
Database Theory in Action: Making Provenance and Probabilistic Database Theory Work in Practice (Invited Talk)

Authors: Silviu Maniu and Pierre Senellart

Published in: LIPIcs, Volume 328, 28th International Conference on Database Theory (ICDT 2025)


Abstract
There has been a rich literature in database theory on how to model and manage the provenance of data (for instance using the semiring framework) and its uncertainty (in particular via probabilistic databases). In this article, we explain how these results have been used as the basis for practical implementations, notably in the ProvSQL system, and how these implementations need to be adapted for the efficient management of provenance and probability for real-world data.
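
In the semiring framework, each input tuple carries an annotation; joint use of tuples multiplies annotations and alternative derivations add them. The Python sketch below (a generic illustration of the framework, not ProvSQL code) evaluates a small join-project query and records Boolean provenance, annotating each output with the sets of input tuples that jointly derive it.

from itertools import product

def join_project(R, S, ann_R, ann_S, plus, times, zero):
    """Provenance-aware evaluation of Q(a) = exists b, c: R(a,b) and S(b,c).
    Annotations of joined tuples are multiplied, alternative derivations are added."""
    out = {}
    for (a, b), (b2, c) in product(R, S):
        if b == b2:
            prov = times(ann_R[(a, b)], ann_S[(b2, c)])
            out[a] = plus(out.get(a, zero), prov)
    return out

R = [("alice", 1), ("alice", 2), ("bob", 2)]
S = [(1, "x"), (2, "y")]
ann_R = {t: {frozenset({f"r{i}"})} for i, t in enumerate(R)}
ann_S = {t: {frozenset({f"s{i}"})} for i, t in enumerate(S)}

# Boolean provenance: a polynomial is a set of monomials,
# each monomial a frozenset of tuple identifiers.
plus = lambda p, q: p | q
times = lambda p, q: {m1 | m2 for m1, m2 in product(p, q)}
print(join_project(R, S, ann_R, ann_S, plus, times, set()))
# alice: {r0,s0} + {r1,s1};  bob: {r2,s1}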

Cite as

Silviu Maniu and Pierre Senellart. Database Theory in Action: Making Provenance and Probabilistic Database Theory Work in Practice (Invited Talk). In 28th International Conference on Database Theory (ICDT 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 328, pp. 33:1-33:6, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{maniu_et_al:LIPIcs.ICDT.2025.33,
  author =	{Maniu, Silviu and Senellart, Pierre},
  title =	{{Database Theory in Action: Making Provenance and Probabilistic Database Theory Work in Practice (Invited Talk)}},
  booktitle =	{28th International Conference on Database Theory (ICDT 2025)},
  pages =	{33:1--33:6},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-364-5},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{328},
  editor =	{Roy, Sudeepa and Kara, Ahmet},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICDT.2025.33},
  URN =		{urn:nbn:de:0030-drops-229746},
  doi =		{10.4230/LIPIcs.ICDT.2025.33},
  annote =	{Keywords: provenance, probabilistic data, ProvSQL}
}
Document
An FPRAS for Model Counting for Non-Deterministic Read-Once Branching Programs

Authors: Kuldeep S. Meel and Alexis de Colnet

Published in: LIPIcs, Volume 328, 28th International Conference on Database Theory (ICDT 2025)


Abstract
Non-deterministic read-once branching programs, also known as non-deterministic free binary decision diagrams (nFBDD), are a fundamental data structure in computer science for representing Boolean functions. In this paper, we focus on #nFBDD, the problem of model counting for non-deterministic read-once branching programs. The #nFBDD problem is #P-hard, and it is known that there exists a quasi-polynomial randomized approximation scheme for #nFBDD. In this paper, we provide the first FPRAS for #nFBDD. Our result relies on the introduction of new analysis techniques that focus on bounding the dependence of samples.
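
For contrast with the non-deterministic case studied here, counting is straightforward when the branching program is deterministic: on a complete OBDD, where every variable is tested exactly once on every root-to-sink path, each accepting path corresponds to exactly one model, so a single bottom-up pass counts them. The Python sketch below (illustrative only, assuming that complete deterministic form) shows this; with non-determinism, several accepting paths may encode the same assignment, which is why exact counting becomes #P-hard and an approximation scheme is needed.

# A node is ("var", lo, hi) testing one variable, or a Boolean leaf.
# Assumption: a complete, deterministic OBDD, so every root-to-sink path
# tests each variable exactly once and every accepted path is one model.
EXAMPLE = ("x1",
           ("x2", False, True),    # x1 = 0 branch
           ("x2", True, True))     # x1 = 1 branch

def count_models(node):
    if node is True:
        return 1
    if node is False:
        return 0
    _, lo, hi = node
    return count_models(lo) + count_models(hi)

print(count_models(EXAMPLE))   # 3 models of (x1 or x2)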

Cite as

Kuldeep S. Meel and Alexis de Colnet. An FPRAS for Model Counting for Non-Deterministic Read-Once Branching Programs. In 28th International Conference on Database Theory (ICDT 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 328, pp. 30:1-30:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{meel_et_al:LIPIcs.ICDT.2025.30,
  author =	{Meel, Kuldeep S. and de Colnet, Alexis},
  title =	{{An FPRAS for Model Counting for Non-Deterministic Read-Once Branching Programs}},
  booktitle =	{28th International Conference on Database Theory (ICDT 2025)},
  pages =	{30:1--30:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-364-5},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{328},
  editor =	{Roy, Sudeepa and Kara, Ahmet},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICDT.2025.30},
  URN =		{urn:nbn:de:0030-drops-229717},
  doi =		{10.4230/LIPIcs.ICDT.2025.30},
  annote =	{Keywords: Approximate model counting, FPRAS, Knowledge compilation, nFBDD}
}
Document
Generalized Covers for Conjunctive Queries

Authors: Paraschos Koutris

Published in: LIPIcs, Volume 328, 28th International Conference on Database Theory (ICDT 2025)


Abstract
Covers of query results were introduced as succinct lossless representations of join query outputs. A cover is a subset of the query result from which we can efficiently enumerate the output with constant delay and linear preprocessing time. However, covers are dependent on a single tree decomposition of the query. In this work, we generalize the notion of a cover to a set of multiple tree decompositions. We show that this generalization can potentially produce asymptotically smaller covers while maintaining the properties of constant-delay enumeration and linear preprocessing time. In particular, given a set of tree decompositions, we can determine exactly the asymptotic size of a minimum cover, which is tied to the notion of entropic width of the query. We also provide a simple greedy algorithm that computes this cover efficiently. Finally, we relate covers to semiring circuits when the semiring is idempotent.
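
On the two-bag decomposition of the path query Q(a,b,c) = R(a,b) ∧ S(b,c), a cover can be pictured as follows: it is a subset of the answers whose projections onto the bags {a,b} and {b,c}, joined back together, recover the full answer set. The Python sketch below (an illustrative toy based on that characterization, not the paper's algorithms) checks this property and shows a cover that is strictly smaller than the result.

from itertools import product

def project(tuples, positions):
    return {tuple(t[i] for i in positions) for t in tuples}

def join_bags(p_ab, p_bc):
    """Re-expand: join the bag projections over {a,b} and {b,c} back together."""
    return {(a, b, c) for (a, b), (b2, c) in product(p_ab, p_bc) if b == b2}

def is_cover(candidate, result):
    """candidate is a cover (for the decomposition with bags {a,b}, {b,c})
    if joining its bag projections recovers the full query result."""
    return join_bags(project(candidate, (0, 1)), project(candidate, (1, 2))) == result

# Q(a,b,c) = R(a,b), S(b,c) on a small database
R = {("a1", "b1"), ("a2", "b1")}
S = {("b1", "c1"), ("b1", "c2")}
result = {(a, b, c) for (a, b) in R for (b2, c) in S if b == b2}   # 4 answers

cover = {("a1", "b1", "c1"), ("a2", "b1", "c2")}                   # only 2 tuples
print(is_cover(cover, result))                  # True: all 4 answers are recoverable
print(is_cover({("a1", "b1", "c1")}, result))   # False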

Cite as

Paraschos Koutris. Generalized Covers for Conjunctive Queries. In 28th International Conference on Database Theory (ICDT 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 328, pp. 28:1-28:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{koutris:LIPIcs.ICDT.2025.28,
  author =	{Koutris, Paraschos},
  title =	{{Generalized Covers for Conjunctive Queries}},
  booktitle =	{28th International Conference on Database Theory (ICDT 2025)},
  pages =	{28:1--28:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-364-5},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{328},
  editor =	{Roy, Sudeepa and Kara, Ahmet},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICDT.2025.28},
  URN =		{urn:nbn:de:0030-drops-229698},
  doi =		{10.4230/LIPIcs.ICDT.2025.28},
  annote =	{Keywords: Conjunctive Query, tree decomposition, cover}
}
Document
Dynamic Direct Access of MSO Query Evaluation over Strings

Authors: Pierre Bourhis, Florent Capelli, Stefan Mengel, and Cristian Riveros

Published in: LIPIcs, Volume 328, 28th International Conference on Database Theory (ICDT 2025)


Abstract
We study the problem of evaluating a Monadic Second Order (MSO) query over strings under updates in the setting of direct access. We present an algorithm that, given an MSO query with first-order free variables represented by an unambiguous variable-set automaton 𝒜 with state set Q and variables X and a string s, computes a data structure in time 𝒪(|Q|^ω⋅ |X|² ⋅ |s|) and, then, given an index i retrieves, using the data structure, the i-th output of the evaluation of 𝒜 over s in time 𝒪(|Q|^ω ⋅ |X|³ ⋅ log(|s|)²) where ω is the exponent for matrix multiplication. Ours is the first efficient direct access algorithm for MSO query evaluation over strings; such algorithms so far had only been studied for first-order queries and conjunctive queries over relational data. Our algorithm gives the answers in lexicographic order where, in contrast to the setting of conjunctive queries, the order between variables can be freely chosen by the user without degrading the runtime. Moreover, our data structure can be updated efficiently after changes to the input string, allowing more powerful updates than in the enumeration literature, e.g. efficient deletion of substrings, concatenation and splitting of strings, and cut-and-paste operations. Our approach combines a matrix representation of MSO queries and a novel data structure for dynamic word problems over semi-groups which yields an overall algorithm that is elegant and easy to formulate.
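
One ingredient mentioned above is a data structure for dynamic word problems over semigroups: maintain a word of semigroup elements under letter updates while answering product queries in logarithmic time. The Python sketch below (a standard segment tree, given as a generic illustration rather than the paper's data structure) does exactly this; instantiating the elements with an automaton's Boolean transition matrices gives the kind of |Q|-dependent update and query costs described in the abstract.

class SegmentTree:
    """Maintain a word w[0..n-1] over a monoid under point updates,
    supporting queries for the product w[l] * ... * w[r-1]."""
    def __init__(self, elems, op, identity):
        self.n = len(elems)
        self.op, self.e = op, identity
        self.tree = [identity] * (2 * self.n)
        self.tree[self.n:] = list(elems)
        for i in range(self.n - 1, 0, -1):
            self.tree[i] = op(self.tree[2 * i], self.tree[2 * i + 1])

    def update(self, i, value):                 # O(log n) per letter change
        i += self.n
        self.tree[i] = value
        while i > 1:
            i //= 2
            self.tree[i] = self.op(self.tree[2 * i], self.tree[2 * i + 1])

    def query(self, l, r):                      # product over w[l:r], O(log n)
        left, right = self.e, self.e
        l, r = l + self.n, r + self.n
        while l < r:
            if l & 1:
                left = self.op(left, self.tree[l])
                l += 1
            if r & 1:
                r -= 1
                right = self.op(self.tree[r], right)
            l //= 2
            r //= 2
        return self.op(left, right)

# monoid of 2x2 Boolean transition matrices (a 2-state automaton)
def matmul(A, B):
    return tuple(tuple(any(A[i][k] and B[k][j] for k in range(2)) for j in range(2))
                 for i in range(2))

I2 = ((True, False), (False, True))
step_a = ((False, True), (False, True))   # letter 'a': both states go to state 1
step_b = ((True, False), (True, False))   # letter 'b': both states go to state 0
st = SegmentTree([step_a, step_b, step_a, step_a], matmul, I2)
print(st.query(0, 4))        # composed transition of the whole word
st.update(1, step_a)         # change the second letter, then re-query in O(log n)
print(st.query(0, 4))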

Cite as

Pierre Bourhis, Florent Capelli, Stefan Mengel, and Cristian Riveros. Dynamic Direct Access of MSO Query Evaluation over Strings. In 28th International Conference on Database Theory (ICDT 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 328, pp. 26:1-26:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{bourhis_et_al:LIPIcs.ICDT.2025.26,
  author =	{Bourhis, Pierre and Capelli, Florent and Mengel, Stefan and Riveros, Cristian},
  title =	{{Dynamic Direct Access of MSO Query Evaluation over Strings}},
  booktitle =	{28th International Conference on Database Theory (ICDT 2025)},
  pages =	{26:1--26:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-364-5},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{328},
  editor =	{Roy, Sudeepa and Kara, Ahmet},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICDT.2025.26},
  URN =		{urn:nbn:de:0030-drops-229675},
  doi =		{10.4230/LIPIcs.ICDT.2025.26},
  annote =	{Keywords: Query evaluation, direct access, MSO queries}
}
Document
Structure-Guided Automated Reasoning

Authors: Max Bannach and Markus Hecher

Published in: LIPIcs, Volume 327, 42nd International Symposium on Theoretical Aspects of Computer Science (STACS 2025)


Abstract
Algorithmic meta-theorems state that problems definable in a fixed logic can be solved efficiently on structures with certain properties. An example is Courcelle’s Theorem, which states that all problems expressible in monadic second-order logic can be solved efficiently on structures of small treewidth. Such theorems are usually proven by algorithms for the model-checking problem of the logic, which is often complex and rarely leads to highly efficient solutions. Alternatively, we can solve the model-checking problem by grounding the given logic to propositional logic, for which dedicated solvers are available. Such encodings will, however, usually not preserve the input’s treewidth. This paper investigates whether all problems definable in monadic second-order logic can efficiently be encoded into SAT such that the input’s treewidth bounds the treewidth of the resulting formula. We answer this in the affirmative and, hence, provide an alternative proof of Courcelle’s Theorem. Our technique can naturally be extended: There are treewidth-aware reductions from the optimization version of Courcelle’s Theorem to MAXSAT and from the counting version of the theorem to #SAT. By using encodings to SAT, we obtain, ignoring polynomial factors, the same running time for the model-checking problem as we would with dedicated algorithms. Another immediate consequence is a treewidth-preserving reduction from the model-checking problem of monadic second-order logic to integer linear programming (ILP). We complement our upper bounds with new lower bounds based on ETH; and we show that the block size of the input’s formula and the treewidth of the input’s structure are tightly linked. Finally, we present various side results needed to prove the main theorems: a treewidth-preserving encoding of cardinality constraints, treewidth-preserving encodings from CNFs into DNFs, and a treewidth-aware quantifier elimination scheme for QBF implying a treewidth-preserving reduction from QSAT to SAT. We also present a reduction from projected model counting to #SAT that increases the treewidth by at most a factor of 2^{k+3.59}, yielding an algorithm for projected model counting that beats the currently best running time of 2^{2^{k+4}}⋅poly(|ψ|).
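
The flavour of a structure-aware propositional encoding can already be seen on cardinality constraints, which the abstract lists among the side results. The Python sketch below (standard pairwise and sequential at-most-one encodings, given as a generic illustration rather than the paper's construction) contrasts the pairwise encoding, whose clauses form a clique over the variables, with the sequential one, whose clauses only link neighbouring positions and therefore keep the primal graph path-like, i.e., of bounded treewidth.

from itertools import combinations

def at_most_one_pairwise(xs):
    """Naive encoding: a clause for every pair, O(n^2) clauses, clique structure."""
    return [[-a, -b] for a, b in combinations(xs, 2)]

def at_most_one_sequential(xs, first_aux):
    """Sequential (ladder) encoding with auxiliaries s_i = 'some x_j with j <= i is true'.
    Every clause only mentions neighbouring indices, so the clause structure
    follows the variable order instead of forming a clique."""
    n = len(xs)
    s = list(range(first_aux, first_aux + n - 1))
    clauses = []
    for i in range(n - 1):
        clauses.append([-xs[i], s[i]])          # x_i -> s_i
    for i in range(1, n - 1):
        clauses.append([-s[i - 1], s[i]])       # s_{i-1} -> s_i
    for i in range(1, n):
        clauses.append([-xs[i], -s[i - 1]])     # x_i -> not s_{i-1}
    return clauses

xs = [1, 2, 3, 4]
print(len(at_most_one_pairwise(xs)), "pairwise clauses")    # 6
print(at_most_one_sequential(xs, first_aux=5))              # 8 local clauses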

Cite as

Max Bannach and Markus Hecher. Structure-Guided Automated Reasoning. In 42nd International Symposium on Theoretical Aspects of Computer Science (STACS 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 327, pp. 15:1-15:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{bannach_et_al:LIPIcs.STACS.2025.15,
  author =	{Bannach, Max and Hecher, Markus},
  title =	{{Structure-Guided Automated Reasoning}},
  booktitle =	{42nd International Symposium on Theoretical Aspects of Computer Science (STACS 2025)},
  pages =	{15:1--15:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-365-2},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{327},
  editor =	{Beyersdorff, Olaf and Pilipczuk, Micha{\l} and Pimentel, Elaine and Thắng, Nguy\~{ê}n Kim},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2025.15},
  URN =		{urn:nbn:de:0030-drops-228408},
  doi =		{10.4230/LIPIcs.STACS.2025.15},
  annote =	{Keywords: automated reasoning, treewidth, satisfiability, max-sat, sharp-sat, monadic second-order logic, fixed-parameter tractability}
}
Document
Circuits, Proofs and Propositional Model Counting

Authors: Sravanthi Chede, Leroy Chew, and Anil Shukla

Published in: LIPIcs, Volume 323, 44th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2024)


Abstract
In this paper we present a new proof system framework CLIP (Circuit Linear Induction Proposition) for propositional model counting (#SAT). A CLIP proof firstly involves a Boolean circuit, calculating the cumulative function (or running count) of models counted up to a point, and secondly a propositional proof arguing for the correctness of the circuit. This concept is remarkably simple and CLIP is modular so it allows us to use existing checking formats from propositional logic, especially strong proof systems. CLIP has polynomial-size proofs for XOR-pairs which are known to require exponential-size proofs in MICE [Fichte et al., 2022]. The existence of a strong proof system that can tackle these hard problems was posed as an open problem in Beyersdorff et al. [Olaf Beyersdorff et al., 2023]. In addition, CLIP systems can p-simulate all other existing #SAT proof systems (KCPS(#SAT) [Florent Capelli, 2019], CPOG [Bryant et al., 2023], MICE). Furthermore, CLIP has a theoretical advantage over the other #SAT proof systems in the sense that CLIP only has lower bounds from its propositional proof system or if 𝖯^#𝖯 is not contained in P/poly, which is a major open problem in circuit complexity. CLIP uses unrestricted circuits in its proof as compared to restricted structures used by the existing #SAT proof systems. In this way, CLIP avoids hardness or limitations due to circuit restrictions.
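
The central object in a CLIP proof is the running count of models up to a given point in a fixed order of assignments. The brute-force Python sketch below (illustrative only; in CLIP this function is represented by a Boolean circuit and its correctness is established by a propositional proof) shows the two facts such a certificate encodes: the running count increases exactly at the models, and its final value is the model count.

from itertools import product

def satisfies(cnf, bits):
    return all(any(bits[abs(l) - 1] == (l > 0) for l in clause) for clause in cnf)

def cumulative_counts(cnf, n):
    """counts[i] = number of models among the first i assignments in
    lexicographic order of (x1, ..., xn)."""
    counts = [0]
    for bits in product([False, True], repeat=n):
        counts.append(counts[-1] + (1 if satisfies(cnf, bits) else 0))
    return counts

cnf = [[1, 2], [-1, 3]]            # (x1 or x2) and (~x1 or x3), n = 3
counts = cumulative_counts(cnf, 3)
# local check: the running count steps by 1 exactly at the models
assert all(counts[i + 1] - counts[i] == (1 if satisfies(cnf, bits) else 0)
           for i, bits in enumerate(product([False, True], repeat=3)))
print(counts[-1])                  # model count of the formula: 4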

Cite as

Sravanthi Chede, Leroy Chew, and Anil Shukla. Circuits, Proofs and Propositional Model Counting. In 44th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 323, pp. 18:1-18:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{chede_et_al:LIPIcs.FSTTCS.2024.18,
  author =	{Chede, Sravanthi and Chew, Leroy and Shukla, Anil},
  title =	{{Circuits, Proofs and Propositional Model Counting}},
  booktitle =	{44th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2024)},
  pages =	{18:1--18:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-355-3},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{323},
  editor =	{Barman, Siddharth and Lasota, S{\l}awomir},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2024.18},
  URN =		{urn:nbn:de:0030-drops-222079},
  doi =		{10.4230/LIPIcs.FSTTCS.2024.18},
  annote =	{Keywords: Propositional model counting, Boolean circuits, #SAT, Proof Systems, Certified Partition Operation Graph (CPOG)}
}
Document
Direct Access for Conjunctive Queries with Negations

Authors: Florent Capelli and Oliver Irwin

Published in: LIPIcs, Volume 290, 27th International Conference on Database Theory (ICDT 2024)


Abstract
Given a conjunctive query Q and a database 𝐃, a direct access to the answers of Q over 𝐃 is the operation of returning, given an index j, the j-th answer for some order on its answers. While this problem is #P-hard in general with respect to combined complexity, many conjunctive queries have an underlying structure that allows for a direct access to their answers for some lexicographical ordering that takes polylogarithmic time in the size of the database after a polynomial time precomputation. Previous work has precisely characterised the tractable classes and given fine-grained lower bounds on the precomputation time needed depending on the structure of the query. In this paper, we generalise these tractability results to the case of signed conjunctive queries, that is, conjunctive queries that may contain negative atoms. Our technique is based on a class of circuits that can represent relational data. We first show that this class supports tractable direct access after a polynomial time preprocessing. We then give bounds on the size of the circuit needed to represent the answer set of signed conjunctive queries depending on their structure. Both results combined together allow us to prove the tractability of direct access for a large class of conjunctive queries. On the one hand, we recover the known tractable classes from the literature in the case of positive conjunctive queries. On the other hand, we generalise and unify known tractability results about negative conjunctive queries - that is, queries having only negated atoms. In particular, we show that the class of β-acyclic negative conjunctive queries and the class of bounded nest set width negative conjunctive queries admit tractable direct access.
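
Direct access here means answering requests of the form "return the j-th answer in the chosen order" without materializing the answer set. A minimal illustration (a toy on a Cartesian product, not the paper's circuit-based construction) decodes the index digit by digit from precomputed counts, as in the Python sketch below.

def direct_access(domains, j):
    """Return the j-th tuple (0-based) of domains[0] x ... x domains[k-1]
    in lexicographic order, in time O(k) after trivial preprocessing."""
    # suffix_count[i] = number of answers over the last i attributes
    suffix_count = [1]
    for d in reversed(domains):
        suffix_count.append(suffix_count[-1] * len(d))
    suffix_count.reverse()
    answer = []
    for i, d in enumerate(domains):
        block = suffix_count[i + 1]       # answers sharing a fixed value here
        answer.append(sorted(d)[j // block])
        j %= block
    return tuple(answer)

domains = [["a", "b"], [1, 2, 3], ["x", "y"]]
print([direct_access(domains, j) for j in range(4)])
# [('a', 1, 'x'), ('a', 1, 'y'), ('a', 2, 'x'), ('a', 2, 'y')]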

Cite as

Florent Capelli and Oliver Irwin. Direct Access for Conjunctive Queries with Negations. In 27th International Conference on Database Theory (ICDT 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 290, pp. 13:1-13:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{capelli_et_al:LIPIcs.ICDT.2024.13,
  author =	{Capelli, Florent and Irwin, Oliver},
  title =	{{Direct Access for Conjunctive Queries with Negations}},
  booktitle =	{27th International Conference on Database Theory (ICDT 2024)},
  pages =	{13:1--13:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-312-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{290},
  editor =	{Cormode, Graham and Shekelyan, Michael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICDT.2024.13},
  URN =		{urn:nbn:de:0030-drops-197958},
  doi =		{10.4230/LIPIcs.ICDT.2024.13},
  annote =	{Keywords: Conjunctive queries, factorized databases, direct access, hypertree decomposition}
}
Document
Ranked Enumeration for MSO on Trees via Knowledge Compilation

Authors: Antoine Amarilli, Pierre Bourhis, Florent Capelli, and Mikaël Monet

Published in: LIPIcs, Volume 290, 27th International Conference on Database Theory (ICDT 2024)


Abstract
We study the problem of enumerating the satisfying assignments for certain circuit classes from knowledge compilation, where assignments are ranked in a specific order. In particular, we show how this problem can be used to efficiently perform ranked enumeration of the answers to MSO queries over trees, with the order being given by a ranking function satisfying a subset-monotonicity property. Assuming that the number of variables is constant, we show that we can enumerate the satisfying assignments in ranked order for so-called multivalued circuits that are smooth, decomposable, and in negation normal form (smooth multivalued DNNF). There is no preprocessing and the enumeration delay is linear in the size of the circuit times the number of values, plus a logarithmic term in the number of assignments produced so far. If we further assume that the circuit is deterministic (smooth multivalued d-DNNF), we can achieve linear-time preprocessing in the circuit, and the delay only features the logarithmic term.
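
The first answer of a ranked enumeration is the best-ranked one, and on a smooth, decomposable circuit it can be found bottom-up: an AND node adds the best values of its variable-disjoint children and an OR node keeps the best alternative. The Python sketch below (a simplified illustration with additive literal weights, a special case of subset-monotone ranking, and not the paper's enumeration algorithm) computes that top-ranked satisfying assignment; ranked enumeration then continues from it with a priority queue of next-best deviations.

def best_model(node, weight):
    """Minimum-weight satisfying assignment of a smooth, decomposable NNF circuit.
    node is ('lit', v, polarity), ('and', children) or ('or', children);
    weight maps (variable, polarity) to a cost. Returns (cost, partial assignment)."""
    kind = node[0]
    if kind == "lit":
        _, v, pol = node
        return weight[(v, pol)], {v: pol}
    if kind == "and":                       # children mention disjoint variables
        total, assignment = 0, {}
        for child in node[1]:
            c, a = best_model(child, weight)
            total += c
            assignment.update(a)
        return total, assignment
    if kind == "or":                        # keep the best alternative
        return min((best_model(child, weight) for child in node[1]),
                   key=lambda pair: pair[0])
    raise ValueError(kind)

# circuit for (x1 AND x2) OR (NOT x1 AND x2): smooth (both branches mention x1, x2)
circuit = ("or", [("and", [("lit", 1, True), ("lit", 2, True)]),
                  ("and", [("lit", 1, False), ("lit", 2, True)])])
weight = {(1, True): 5, (1, False): 1, (2, True): 2, (2, False): 7}
print(best_model(circuit, weight))   # (3, {1: False, 2: True})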

Cite as

Antoine Amarilli, Pierre Bourhis, Florent Capelli, and Mikaël Monet. Ranked Enumeration for MSO on Trees via Knowledge Compilation. In 27th International Conference on Database Theory (ICDT 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 290, pp. 25:1-25:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{amarilli_et_al:LIPIcs.ICDT.2024.25,
  author =	{Amarilli, Antoine and Bourhis, Pierre and Capelli, Florent and Monet, Mika\"{e}l},
  title =	{{Ranked Enumeration for MSO on Trees via Knowledge Compilation}},
  booktitle =	{27th International Conference on Database Theory (ICDT 2024)},
  pages =	{25:1--25:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-312-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{290},
  editor =	{Cormode, Graham and Shekelyan, Michael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICDT.2024.25},
  URN =		{urn:nbn:de:0030-drops-198079},
  doi =		{10.4230/LIPIcs.ICDT.2024.25},
  annote =	{Keywords: Enumeration, knowledge compilation, monadic second-order logic}
}
Document
Enumerating Regular Languages with Bounded Delay

Authors: Antoine Amarilli and Mikaël Monet

Published in: LIPIcs, Volume 254, 40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023)


Abstract
We study the task, for a given language L, of enumerating the (generally infinite) sequence of its words, without repetitions, while bounding the delay between two consecutive words. To allow for delay bounds that do not depend on the current word length, we assume a model where we produce each word by editing the preceding word with a small edit script, rather than writing out the word from scratch. In particular, this witnesses that the language is orderable, i.e., we can write its words as an infinite sequence such that the Levenshtein edit distance between any two consecutive words is bounded by a value that depends only on the language. For instance, (a+b)^* is orderable (with a variant of the Gray code), but a^* + b^* is not. We characterize which regular languages are enumerable in this sense, and show that this can be decided in PTIME in an input deterministic finite automaton (DFA) for the language. In fact, we show that, given a DFA A, we can compute in PTIME automata A₁, …, A_t such that L(A) is partitioned as L(A₁) ⊔ … ⊔ L(A_t) and every L(A_i) is orderable in this sense. Further, we show that the value of t obtained is optimal, i.e., we cannot partition L(A) into less than t orderable languages. In the case where L(A) is orderable (i.e., t = 1), we show that the ordering can be produced by a bounded-delay algorithm: specifically, the algorithm runs in a suitable pointer machine model, and produces a sequence of bounded-length edit scripts to visit the words of L(A) without repetitions, with bounded delay - exponential in |A| - between each script. In fact, we show that we can achieve this while only allowing the edit operations push and pop at the beginning and end of the word, which implies that the word can in fact be maintained in a double-ended queue. By contrast, when fixing the distance bound d between consecutive words and the number of classes of the partition, it is NP-hard in the input DFA A to decide if L(A) is orderable in this sense, already for finite languages. Last, we study the model where push-pop edits are only allowed at the end of the word, corresponding to a case where the word is maintained on a stack. We show that these operations are strictly weaker and that the slender languages are precisely those that can be partitioned into finitely many languages that are orderable in this sense. For the slender languages, we can again characterize the minimal number of languages in the partition, and achieve bounded-delay enumeration.
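
The abstract's example that (a+b)^* is orderable via a variant of the Gray code can be made concrete: enumerate words length by length, using the reflected Gray code within each length so that consecutive words differ in one position, and pay a bounded extra edit cost when switching lengths. The Python sketch below (illustrative only) produces such an ordering and checks that consecutive words stay within Levenshtein distance 2.

def gray_words(n, letters=("a", "b")):
    """Reflected Gray code over a binary alphabet: consecutive words of the
    same length differ in exactly one position."""
    if n == 0:
        return [""]
    shorter = gray_words(n - 1, letters)
    return [letters[0] + w for w in shorter] + [letters[1] + w for w in reversed(shorter)]

def enumerate_ab_star(max_len):
    """Order (a+b)^* (up to max_len) so consecutive words are a bounded edit apart."""
    for n in range(max_len + 1):
        yield from gray_words(n)

def levenshtein(u, v):
    prev = list(range(len(v) + 1))
    for i, cu in enumerate(u, 1):
        cur = [i]
        for j, cv in enumerate(v, 1):
            cur.append(min(prev[j] + 1, cur[-1] + 1, prev[j - 1] + (cu != cv)))
        prev = cur
    return prev[-1]

words = list(enumerate_ab_star(3))
print(words[:8])   # ['', 'a', 'b', 'aa', 'ab', 'bb', 'ba', 'aaa']
assert all(levenshtein(u, v) <= 2 for u, v in zip(words, words[1:]))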

Cite as

Antoine Amarilli and Mikaël Monet. Enumerating Regular Languages with Bounded Delay. In 40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 254, pp. 8:1-8:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{amarilli_et_al:LIPIcs.STACS.2023.8,
  author =	{Amarilli, Antoine and Monet, Mika\"{e}l},
  title =	{{Enumerating Regular Languages with Bounded Delay}},
  booktitle =	{40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023)},
  pages =	{8:1--8:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-266-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{254},
  editor =	{Berenbrink, Petra and Bouyer, Patricia and Dawar, Anuj and Kant\'{e}, Mamadou Moustapha},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2023.8},
  URN =		{urn:nbn:de:0030-drops-176609},
  doi =		{10.4230/LIPIcs.STACS.2023.8},
  annote =	{Keywords: Regular language, constant-delay enumeration, edit distance}
}
Document
Geometric Amortization of Enumeration Algorithms

Authors: Florent Capelli and Yann Strozecki

Published in: LIPIcs, Volume 254, 40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023)


Abstract
In this paper, we introduce a technique we call geometric amortization for enumeration algorithms, which can be used to make the delay of enumeration algorithms more regular with little overhead on the space it uses. More precisely, we consider enumeration algorithms having incremental linear delay, that is, algorithms enumerating, on input x, a set A(x) such that for every t ≤ ♯ A(x), it outputs at least t solutions in time O(t⋅p(|x|)), where p is a polynomial. We call p the incremental delay of the algorithm. While it is folklore that one can transform such an algorithm into an algorithm with maximal delay O(p(|x|)), the naive transformation may use exponential space. We show that, using geometric amortization, such an algorithm can be transformed into an algorithm with delay O(p(|x|)log(♯A(x))) and space O(s log(♯A(x))) where s is the space used by the original algorithm. In terms of complexity, we prove that classes DelayP and IncP₁ with polynomial space coincide. We apply geometric amortization to show that one can trade the delay of flashlight search algorithms for their average delay up to a factor of O(log(♯A(x))). We illustrate how this tradeoff is advantageous for the enumeration of solutions of DNF formulas.
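
The folklore transformation mentioned in the abstract can be written in a few lines: run the underlying enumerator, release at most one solution every p(|x|) units of work, and buffer the rest; the buffer is exactly the potentially exponential space that geometric amortization avoids. The Python sketch below (illustrative only, with the enumerator's work modelled as explicit 'tick' events) shows that folklore wrapper.

from collections import deque

def regularize(trace, p):
    """Folklore regularization of an enumerator's delay: 'trace' interleaves
    'tick' events (one unit of work) with solutions. Release at most one
    solution per p ticks and buffer the rest. The buffer can grow as large as
    the number of pending solutions, which is the space blow-up that
    geometric amortization avoids."""
    buffer, ticks = deque(), 0
    for event in trace:
        if event == "tick":
            ticks += 1
            if ticks == p:
                ticks = 0
                if buffer:
                    yield buffer.popleft()
        else:
            buffer.append(event)
    yield from buffer            # flush the remaining buffered solutions

# bursty trace: a quiet phase, then four solutions found back to back
trace = ["tick"] * 6 + ["s1", "s2", "s3", "s4"] + ["tick"] * 6
print(list(regularize(trace, p=3)))   # ['s1', 's2', 's3', 's4'], spread over the run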

Cite as

Florent Capelli and Yann Strozecki. Geometric Amortization of Enumeration Algorithms. In 40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 254, pp. 18:1-18:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{capelli_et_al:LIPIcs.STACS.2023.18,
  author =	{Capelli, Florent and Strozecki, Yann},
  title =	{{Geometric Amortization of Enumeration Algorithms}},
  booktitle =	{40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023)},
  pages =	{18:1--18:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-266-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{254},
  editor =	{Berenbrink, Petra and Bouyer, Patricia and Dawar, Anuj and Kant\'{e}, Mamadou Moustapha},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2023.18},
  URN =		{urn:nbn:de:0030-drops-176703},
  doi =		{10.4230/LIPIcs.STACS.2023.18},
  annote =	{Keywords: Enumeration, Polynomial Delay, Incremental Delay, Amortization}
}
Document
Weighted Counting of Matchings in Unbounded-Treewidth Graph Families

Authors: Antoine Amarilli and Mikaël Monet

Published in: LIPIcs, Volume 241, 47th International Symposium on Mathematical Foundations of Computer Science (MFCS 2022)


Abstract
We consider a weighted counting problem on matchings, denoted PrMatching(𝒢), on an arbitrary fixed graph family 𝒢. The input consists of a graph G ∈ 𝒢 and of rational probabilities of existence on every edge of G, assuming independence. The output is the probability of obtaining a matching of G in the resulting distribution, i.e., a set of edges that are pairwise disjoint. It is known that, if 𝒢 has bounded treewidth, then PrMatching(𝒢) can be solved in polynomial time. In this paper we show that, under some assumptions, bounded treewidth in fact characterizes the tractable graph families for this problem. More precisely, we show intractability for all graph families 𝒢 satisfying the following treewidth-constructibility requirement: given an integer k in unary, we can construct in polynomial time a graph G ∈ 𝒢 with treewidth at least k. Our hardness result is then the following: for any treewidth-constructible graph family 𝒢, the problem PrMatching(𝒢) is intractable. This generalizes known hardness results for weighted matching counting under some restrictions that do not bound treewidth, e.g., being planar, 3-regular, or bipartite; it also answers a question left open in [Amarilli et al., 2016]. We also obtain a similar lower bound for the weighted counting of edge covers.
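
The tractable side of the statement (bounded treewidth makes PrMatching polynomial) is easy to see on trees: root the tree and keep, for each vertex, the probability that the edges sampled so far in its subtree form a matching with the vertex unmatched or matched. The Python sketch below (a standard treewidth-1 dynamic program, illustrative only) computes the probability that an independently sampled edge set of a tree is a matching.

def matching_probability(tree, probs, root):
    """tree: adjacency lists of an undirected tree; probs[(u, v)] = existence
    probability of edge {u, v} (stored for both orientations).
    Returns the probability that the sampled edge set is a matching."""
    def visit(v, parent):
        # unmatched: subtree edges form a matching and v is not matched
        # matched:   subtree edges form a matching and v is matched to a child
        unmatched, matched = 1.0, 0.0
        for c in tree[v]:
            if c == parent:
                continue
            cu, cm = visit(c, v)
            p = probs[(v, c)]
            keep_out = (1 - p) * (cu + cm)      # edge {v,c} absent
            matched = matched * keep_out + unmatched * p * cu
            unmatched *= keep_out
        return unmatched, matched

    u, m = visit(root, None)
    return u + m

# star with three leaves; every edge present independently with probability 1/2
tree = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
probs = {(u, v): 0.5 for u in tree for v in tree[u]}
print(matching_probability(tree, probs, root=0))   # 0.5 (at most one of 3 edges drawn)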

Cite as

Antoine Amarilli and Mikaël Monet. Weighted Counting of Matchings in Unbounded-Treewidth Graph Families. In 47th International Symposium on Mathematical Foundations of Computer Science (MFCS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 241, pp. 9:1-9:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{amarilli_et_al:LIPIcs.MFCS.2022.9,
  author =	{Amarilli, Antoine and Monet, Mika\"{e}l},
  title =	{{Weighted Counting of Matchings in Unbounded-Treewidth Graph Families}},
  booktitle =	{47th International Symposium on Mathematical Foundations of Computer Science (MFCS 2022)},
  pages =	{9:1--9:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-256-3},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{241},
  editor =	{Szeider, Stefan and Ganian, Robert and Silva, Alexandra},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2022.9},
  URN =		{urn:nbn:de:0030-drops-168078},
  doi =		{10.4230/LIPIcs.MFCS.2022.9},
  annote =	{Keywords: Treewidth, counting complexity, matchings, Fibonacci sequence}
}
  • Refine by Type
  • 19 Document/PDF
  • 8 Document/HTML

  • Refine by Publication Year
  • 9 2025
  • 3 2024
  • 2 2023
  • 2 2022
  • 1 2019
  • Show More...

  • Refine by Author
  • 9 Capelli, Florent
  • 4 Mengel, Stefan
  • 3 Amarilli, Antoine
  • 3 Monet, Mikaël
  • 2 Bourhis, Pierre
  • Show More...

  • Refine by Series/Journal
  • 18 LIPIcs
  • 1 OASIcs

  • Refine by Classification
  • 3 Information systems → Relational database model
  • 2 Theory of computation → Database theory
  • 1 Information systems → Database management system engines
  • 1 Mathematics of computing → Combinatorial optimization
  • 1 Mathematics of computing → Graph algorithms
  • Show More...

  • Refine by Keyword
  • 2 Enumeration
  • 2 Propositional model counting
  • 2 direct access
  • 2 knowledge compilation
  • 2 monadic second-order logic
  • Show More...
