15 Search Results for "Kautz, Henry"


Document
Parameterized Complexity of MinCSP over the Point Algebra

Authors: George Osipov, Marcin Pilipczuk, and Magnus Wahlström

Published in: LIPIcs, Volume 308, 32nd Annual European Symposium on Algorithms (ESA 2024)


Abstract
The input in the Minimum-Cost Constraint Satisfaction Problem (MinCSP) over the Point Algebra contains a set of variables, a collection of constraints of the form x < y, x = y, x ≤ y and x ≠ y, and a budget k. The goal is to check whether it is possible to assign rational values to the variables while breaking constraints of total cost at most k. This problem generalizes several prominent graph separation and transversal problems:
- MinCSP({<}) is equivalent to Directed Feedback Arc Set,
- MinCSP({<, ≤}) is equivalent to Directed Subset Feedback Arc Set,
- MinCSP({=, ≠}) is equivalent to Edge Multicut, and
- MinCSP({≤, ≠}) is equivalent to Directed Symmetric Multicut.
Apart from trivial cases, MinCSP(Γ) for Γ ⊆ {<, =, ≤, ≠} is NP-hard even to approximate within any constant factor under the Unique Games Conjecture. Hence, we study the parameterized complexity of this problem under a natural parameterization by the solution cost k. We obtain a complete classification: if Γ ⊆ {<, =, ≤, ≠} contains both ≤ and ≠, then MinCSP(Γ) is W[1]-hard; otherwise it is fixed-parameter tractable. For the positive cases, we solve MinCSP({<, =, ≠}), generalizing the FPT results for Directed Feedback Arc Set and Edge Multicut as well as their weighted versions. Our algorithm works by reducing the problem to a Boolean MinCSP, which is in turn solved by flow augmentation. For the lower bounds, we prove that Directed Symmetric Multicut is W[1]-hard, solving an open problem.
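
To make the problem definition concrete, here is a minimal Python sketch (ours, not the paper's) that evaluates a candidate rational assignment against a Point Algebra MinCSP instance; the constraint representation and names are illustrative.

from fractions import Fraction

# A constraint is (relation, x, y, cost) with relation in {"<", "=", "<=", "!="}.
CHECKS = {
    "<":  lambda a, b: a < b,
    "=":  lambda a, b: a == b,
    "<=": lambda a, b: a <= b,
    "!=": lambda a, b: a != b,
}

def violation_cost(assignment, constraints):
    """Total cost of the constraints broken by a rational-valued assignment."""
    return sum(cost for rel, x, y, cost in constraints
               if not CHECKS[rel](assignment[x], assignment[y]))

# Tiny instance: MinCSP({<}) is Directed Feedback Arc Set, so a directed
# cycle a < b, b < c, c < a (unit costs) forces breaking one constraint.
constraints = [("<", "a", "b", 1), ("<", "b", "c", 1), ("<", "c", "a", 1)]
assignment = {"a": Fraction(0), "b": Fraction(1), "c": Fraction(2)}
k = 1
print(violation_cost(assignment, constraints) <= k)   # True: only c < a is broken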

Cite as

George Osipov, Marcin Pilipczuk, and Magnus Wahlström. Parameterized Complexity of MinCSP over the Point Algebra. In 32nd Annual European Symposium on Algorithms (ESA 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 308, pp. 93:1-93:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{osipov_et_al:LIPIcs.ESA.2024.93,
  author =	{Osipov, George and Pilipczuk, Marcin and Wahlstr\"{o}m, Magnus},
  title =	{{Parameterized Complexity of MinCSP over the Point Algebra}},
  booktitle =	{32nd Annual European Symposium on Algorithms (ESA 2024)},
  pages =	{93:1--93:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-338-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{308},
  editor =	{Chan, Timothy and Fischer, Johannes and Iacono, John and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2024.93},
  URN =		{urn:nbn:de:0030-drops-211640},
  doi =		{10.4230/LIPIcs.ESA.2024.93},
  annote =	{Keywords: parameterized complexity, constraint satisfaction, point algebra, multicut, feedback arc set}
}
Document
Short Paper
A Logic of East and West for Intervals (Short Paper)

Authors: Zekai Li, Amin Farjudian, and Heshan Du

Published in: LIPIcs, Volume 315, 16th International Conference on Spatial Information Theory (COSIT 2024)


Abstract
This paper proposes a logic of east and west for intervals (LEWI), which extends the logic of east and west for points. For intervals in 1D Euclidean space, the logic LEWI formalises the qualitative direction relations "east", "west", "definitely east", "definitely west", "partially east", "partially west", etc. To cope with imprecision in geometry representations, the logic LEWI is parameterized by a margin of error σ ∈ ℝ_{> 0} and a level of indeterminacy in directions τ ∈ ℕ_{> 1}. For every τ, we provide an axiomatisation of the logic LEWI, and prove that it is sound and complete with respect to 1D Euclidean space.
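
The abstract does not spell out the interval semantics, so the Python sketch below is for intuition only: the thresholds used for "definitely east/west" and "partially east/west" under a margin σ are our own guesses at LEWI-style relations, not the paper's definitions.

def interval_relations(a, b, sigma):
    """Toy direction relations for 1D intervals a = (a1, a2), b = (b1, b2),
    with margin of error sigma > 0. Illustrative semantics only."""
    a1, a2 = a
    b1, b2 = b
    rels = set()
    if a1 > b2 + sigma:
        rels.add("definitely east")     # all of a lies east of all of b
    if a2 < b1 - sigma:
        rels.add("definitely west")
    if a2 > b2 + sigma and a1 <= b2 + sigma:
        rels.add("partially east")      # a sticks out east but overlaps b
    if a1 < b1 - sigma and a2 >= b1 - sigma:
        rels.add("partially west")
    return rels or {"indeterminate"}

print(interval_relations((3.0, 5.0), (0.0, 2.5), sigma=0.2))   # {'definitely east'}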

Cite as

Zekai Li, Amin Farjudian, and Heshan Du. A Logic of East and West for Intervals (Short Paper). In 16th International Conference on Spatial Information Theory (COSIT 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 315, pp. 17:1-17:8, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{li_et_al:LIPIcs.COSIT.2024.17,
  author =	{Li, Zekai and Farjudian, Amin and Du, Heshan},
  title =	{{A Logic of East and West for Intervals}},
  booktitle =	{16th International Conference on Spatial Information Theory (COSIT 2024)},
  pages =	{17:1--17:8},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-330-0},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{315},
  editor =	{Adams, Benjamin and Griffin, Amy L. and Scheider, Simon and McKenzie, Grant},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.COSIT.2024.17},
  URN =		{urn:nbn:de:0030-drops-208320},
  doi =		{10.4230/LIPIcs.COSIT.2024.17},
  annote =	{Keywords: Qualitative Spatial Logic, Soundness, Completeness}
}
Document
Anytime Weighted Model Counting with Approximation Guarantees for Probabilistic Inference

Authors: Alexandre Dubray, Pierre Schaus, and Siegfried Nijssen

Published in: LIPIcs, Volume 307, 30th International Conference on Principles and Practice of Constraint Programming (CP 2024)


Abstract
Weighted model counting (WMC) plays a central role in probabilistic reasoning. Given that this problem is #P-hard, harder instances can generally only be addressed using approximate techniques based on sampling, which provide statistical convergence guarantees: the longer a sampling process runs, the more accurate the WMC is likely to be. In this work, we propose a deterministic search-based approach that can also be stopped at any time and provides hard lower- and upper-bound guarantees on the true WMC. This approach uses a value heuristic that guides exploration first towards models with a high weight and leverages Limited Discrepancy Search to make the bounds converge faster. The validity, scalability, and convergence of our approach are tested and compared with state-of-the-art baseline methods on the problem of computing marginal probabilities in Bayesian networks and reliability estimation in probabilistic graphs.
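
As a minimal illustration of anytime counting with hard bounds, the Python sketch below explores assignments by plain DFS, a stand-in for the paper's value heuristic and Limited Discrepancy Search: fully satisfied subtrees raise the lower bound, refuted subtrees lower the upper bound, and subtrees abandoned at the node budget stay between the two, so the returned bounds are valid whenever the search stops.

def wmc_bounds(clauses, weights, n_vars, max_nodes):
    """Anytime DFS weighted model counter. weights[(v, b)] is the weight of
    setting variable v to b; a model's weight is the product over its
    variables. Returns (lower, upper) with lower <= true WMC <= upper,
    no matter when the search is cut off."""
    state = {"lo": 0.0, "unknown": 0.0, "nodes": 0}

    def subtree_mass(asg):
        # Total weight mass of all full assignments extending `asg`.
        m = 1.0
        for v in range(1, n_vars + 1):
            m *= (weights[(v, asg[v])] if v in asg
                  else weights[(v, True)] + weights[(v, False)])
        return m

    def lit_true(l, asg):
        return abs(l) in asg and asg[abs(l)] == (l > 0)

    def lit_false(l, asg):
        return abs(l) in asg and asg[abs(l)] != (l > 0)

    def dfs(asg):
        state["nodes"] += 1
        if any(all(lit_false(l, asg) for l in c) for c in clauses):
            return                                # refuted: holds no models
        if all(any(lit_true(l, asg) for l in c) for c in clauses):
            state["lo"] += subtree_mass(asg)      # every extension is a model
            return
        if state["nodes"] >= max_nodes:
            state["unknown"] += subtree_mass(asg) # abandoned, still bounded
            return
        v = next(u for u in range(1, n_vars + 1) if u not in asg)
        for b in (True, False):
            dfs({**asg, v: b})

    dfs({})
    return state["lo"], state["lo"] + state["unknown"]

# (x1 OR x2) AND (NOT x1 OR x3), uniform weights 0.5: true WMC = 4/8 = 0.5.
clauses = [[1, 2], [-1, 3]]
w = {(v, b): 0.5 for v in (1, 2, 3) for b in (True, False)}
print(wmc_bounds(clauses, w, 3, max_nodes=4))       # (0.125, 0.875): hard but loose
print(wmc_bounds(clauses, w, 3, max_nodes=10**6))   # (0.5, 0.5): converged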

Cite as

Alexandre Dubray, Pierre Schaus, and Siegfried Nijssen. Anytime Weighted Model Counting with Approximation Guarantees for Probabilistic Inference. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 10:1-10:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{dubray_et_al:LIPIcs.CP.2024.10,
  author =	{Dubray, Alexandre and Schaus, Pierre and Nijssen, Siegfried},
  title =	{{Anytime Weighted Model Counting with Approximation Guarantees for Probabilistic Inference}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{10:1--10:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.10},
  URN =		{urn:nbn:de:0030-drops-206956},
  doi =		{10.4230/LIPIcs.CP.2024.10},
  annote =	{Keywords: Projected Weighted Model Counting, Limited Discrepancy Search, Approximate Method, Probabilistic Inference}
}
Document
Using Constraint Programming for Disjunctive Scheduling in Temporal AI Planning

Authors: Adam Francis Green, J. Christopher Beck, and Amanda Coles

Published in: LIPIcs, Volume 307, 30th International Conference on Principles and Practice of Constraint Programming (CP 2024)


Abstract
We present a novel scheduling model that leverages Constraint Programming (CP) to enhance problem-solving performance in Temporal Planning. Building on the established strategy of decomposing causal and temporal reasoning, our approach abstracts two common fact structures present in many Temporal Planning problems, Semaphores and Envelopes, and performs temporal reasoning in a CP-based scheduler. At each search node in a heuristic search for a temporal plan, we construct and solve a Constraint Satisfaction Problem (CSP) and integrate feedback from the CP-based scheduler to guide the causal planning search towards a solution. Through experimental analysis, we validate the impact of these advances, demonstrating a significant reduction in both the number of states searched and the search time, alongside an increase in problem-solving coverage.
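
As a minimal illustration of per-node temporal reasoning (a deliberately simpler stand-in for the paper's CP scheduler, which also handles the disjunctive Semaphore/Envelope structures), the Python sketch below checks a search node's simple temporal constraints for feasibility via negative-cycle detection.

def stn_consistent(n, edges):
    """Feasibility of difference constraints t_v - t_u <= w, given as edges
    (u, v, w) over nodes 0..n-1. Bellman-Ford from a virtual source: the
    constraints are satisfiable iff there is no negative cycle."""
    dist = [0] * n
    for _ in range(n + 1):
        changed = False
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
                changed = True
        if not changed:
            return True        # fixpoint reached: a consistent schedule exists
    return False               # still relaxing after n+1 rounds: negative cycle

# t1 - t0 <= 10 together with t1 - t0 >= 12 (i.e. t0 - t1 <= -12): infeasible,
# so a planner would prune this search node.
print(stn_consistent(2, [(0, 1, 10), (1, 0, -12)]))   # False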

Cite as

Adam Francis Green, J. Christopher Beck, and Amanda Coles. Using Constraint Programming for Disjunctive Scheduling in Temporal AI Planning. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 12:1-12:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{francisgreen_et_al:LIPIcs.CP.2024.12,
  author =	{Francis Green, Adam and Beck, J. Christopher and Coles, Amanda},
  title =	{{Using Constraint Programming for Disjunctive Scheduling in Temporal AI Planning}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{12:1--12:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.12},
  URN =		{urn:nbn:de:0030-drops-206974},
  doi =		{10.4230/LIPIcs.CP.2024.12},
  annote =	{Keywords: AI Planning, Temporal-Numeric Planning, Constraint Programming, Scheduling}
}
Document
Polynomial Calculus for Quantified Boolean Logic: Lower Bounds Through Circuits and Degree

Authors: Olaf Beyersdorff, Tim Hoffmann, Kaspar Kasche, and Luc Nicolas Spachmann

Published in: LIPIcs, Volume 306, 49th International Symposium on Mathematical Foundations of Computer Science (MFCS 2024)


Abstract
We initiate an in-depth proof-complexity analysis of polynomial calculus (𝒬-PC) for Quantified Boolean Formulas (QBF). In the course of this we establish a tight proof-size characterisation of 𝒬-PC in terms of a suitable circuit model (polynomial decision lists). Using this correspondence we show a size-degree relation for 𝒬-PC, similar in spirit, yet different from the classic size-degree formula for propositional PC by Impagliazzo, Pudlák and Sgall (1999). We use the circuit characterisation together with the size-degree relation to obtain various new lower bounds on proof size in 𝒬-PC. This leads to incomparability results for 𝒬-PC systems over different fields.

Cite as

Olaf Beyersdorff, Tim Hoffmann, Kaspar Kasche, and Luc Nicolas Spachmann. Polynomial Calculus for Quantified Boolean Logic: Lower Bounds Through Circuits and Degree. In 49th International Symposium on Mathematical Foundations of Computer Science (MFCS 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 306, pp. 27:1-27:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{beyersdorff_et_al:LIPIcs.MFCS.2024.27,
  author =	{Beyersdorff, Olaf and Hoffmann, Tim and Kasche, Kaspar and Spachmann, Luc Nicolas},
  title =	{{Polynomial Calculus for Quantified Boolean Logic: Lower Bounds Through Circuits and Degree}},
  booktitle =	{49th International Symposium on Mathematical Foundations of Computer Science (MFCS 2024)},
  pages =	{27:1--27:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-335-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{306},
  editor =	{Kr\'{a}lovi\v{c}, Rastislav and Ku\v{c}era, Anton{\'\i}n},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2024.27},
  URN =		{urn:nbn:de:0030-drops-205834},
  doi =		{10.4230/LIPIcs.MFCS.2024.27},
  annote =	{Keywords: proof complexity, QBF, polynomial calculus, circuits, lower bounds}
}
Document
The Relative Strength of #SAT Proof Systems

Authors: Olaf Beyersdorff, Johannes K. Fichte, Markus Hecher, Tim Hoffmann, and Kaspar Kasche

Published in: LIPIcs, Volume 305, 27th International Conference on Theory and Applications of Satisfiability Testing (SAT 2024)


Abstract
The propositional model counting problem #SAT asks to compute the number of satisfying assignments for a given propositional formula. Recently, three #SAT proof systems kcps (knowledge compilation proof system), MICE (model counting induction by claim extension), and CPOG (certified partitioned-operation graphs) have been introduced with the aim of modelling #SAT solving and enabling proof logging for solvers. Prior to this paper, the relations between these proof systems were unclear and very few proof complexity results were known. We completely determine the simulation order of the three systems, establishing that CPOG simulates both MICE and kcps, while MICE and kcps are exponentially incomparable. This implies that CPOG is strictly stronger than the other two systems.

Cite as

Olaf Beyersdorff, Johannes K. Fichte, Markus Hecher, Tim Hoffmann, and Kaspar Kasche. The Relative Strength of #SAT Proof Systems. In 27th International Conference on Theory and Applications of Satisfiability Testing (SAT 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 305, pp. 5:1-5:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{beyersdorff_et_al:LIPIcs.SAT.2024.5,
  author =	{Beyersdorff, Olaf and Fichte, Johannes K. and Hecher, Markus and Hoffmann, Tim and Kasche, Kaspar},
  title =	{{The Relative Strength of #SAT Proof Systems}},
  booktitle =	{27th International Conference on Theory and Applications of Satisfiability Testing (SAT 2024)},
  pages =	{5:1--5:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-334-8},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{305},
  editor =	{Chakraborty, Supratik and Jiang, Jie-Hong Roland},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SAT.2024.5},
  URN =		{urn:nbn:de:0030-drops-205276},
  doi =		{10.4230/LIPIcs.SAT.2024.5},
  annote =	{Keywords: Model Counting, #SAT, Proof Complexity, Proof Systems, Lower Bounds, Knowledge Compilation}
}
Document
Clausal Congruence Closure

Authors: Armin Biere, Katalin Fazekas, Mathias Fleury, and Nils Froleyks

Published in: LIPIcs, Volume 305, 27th International Conference on Theory and Applications of Satisfiability Testing (SAT 2024)


Abstract
Many practical applications of satisfiability solving employ multiple steps to encode an original problem formulation into conjunctive normal form. Often circuits are used as an intermediate representation before encoding those circuits into clausal form. These circuits, however, might contain redundant isomorphic sub-circuits. If the circuits are blindly translated into clausal form, this redundancy is retained and increases solving time unless specific preprocessing algorithms are used. Furthermore, such redundant sub-formula structure might only emerge during solving and needs to be addressed by inprocessing. This paper presents a new approach which extracts gate information from the formula and applies congruence closure to match and eliminate redundant gates. Besides new algorithms for gate extraction, we also describe earlier unpublished attempts to tackle this problem. Experiments focus on the important problem of combinational equivalence checking for hardware designs and show that our new approach yields a substantial gain in CNF solver performance.
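
The paper's contribution is CNF-level gate extraction plus congruence closure; extraction is beyond a short sketch, but the closure step can be illustrated on an already-extracted gate list. The Python sketch below is a generic structural-hashing fixpoint (our simplification, not the paper's algorithm): gates with identical (op, canonical inputs) keys are merged, and merging children can make parents identical, hence the iteration.

def congruence_close(gates):
    """Merge structurally identical gates until a fixpoint. `gates` maps a
    gate name to (op, tuple of input names); inputs are sorted under the
    assumption that ops are commutative (AND/OR/XOR)."""
    rep = {}

    def find(g):                        # follow merge links to the canonical gate
        while g in rep:
            g = rep[g]
        return g

    changed = True
    while changed:                      # merging children can equate parents,
        changed = False                 # so iterate until nothing merges
        seen = {}
        for name, (op, ins) in gates.items():
            key = (op, tuple(sorted(find(i) for i in ins)))
            canon = find(name)
            if key in seen and seen[key] != canon:
                rep[canon] = seen[key]
                changed = True
            else:
                seen.setdefault(key, canon)
    return {g: find(g) for g in gates}

gates = {
    "g1": ("AND", ("a", "b")),
    "g2": ("AND", ("b", "a")),          # same gate up to input order
    "g3": ("AND", ("g1", "c")),
    "g4": ("AND", ("g2", "c")),         # identical once g2 is merged into g1
}
print(congruence_close(gates))          # {'g1': 'g1', 'g2': 'g1', 'g3': 'g3', 'g4': 'g3'}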

Cite as

Armin Biere, Katalin Fazekas, Mathias Fleury, and Nils Froleyks. Clausal Congruence Closure. In 27th International Conference on Theory and Applications of Satisfiability Testing (SAT 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 305, pp. 6:1-6:25, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{biere_et_al:LIPIcs.SAT.2024.6,
  author =	{Biere, Armin and Fazekas, Katalin and Fleury, Mathias and Froleyks, Nils},
  title =	{{Clausal Congruence Closure}},
  booktitle =	{27th International Conference on Theory and Applications of Satisfiability Testing (SAT 2024)},
  pages =	{6:1--6:25},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-334-8},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{305},
  editor =	{Chakraborty, Supratik and Jiang, Jie-Hong Roland},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SAT.2024.6},
  URN =		{urn:nbn:de:0030-drops-205287},
  doi =		{10.4230/LIPIcs.SAT.2024.6},
  annote =	{Keywords: Satisfiability Solving, Congruence Closure, Structural Hashing, SAT Sweeping, Conjunctive Normal Form, Combinational Equivalence Checking, Hardware Equivalence Checking}
}
Document
Enhancing MaxSAT Local Search via a Unified Soft Clause Weighting Scheme

Authors: Yi Chu, Chu-Min Li, Furong Ye, and Shaowei Cai

Published in: LIPIcs, Volume 305, 27th International Conference on Theory and Applications of Satisfiability Testing (SAT 2024)


Abstract
Local search has been widely applied to solve the well-known (weighted) partial MaxSAT problem, significantly influencing many real-world applications. The main difficulty to overcome when designing a local search algorithm is that it can easily fall into local optima. Clause weighting is a beneficial technique that dynamically adjusts the landscape of the search space to help the algorithm escape from local optima. Existing works tend to increase the weights of falsified clauses, and such strategies may result in an unpredictable landscape of the search space during the optimization process. Therefore, in this paper, we propose a Unified Soft Clause Weighting Scheme called Unified-SW, which increases the weights of all soft clauses in feasible local optima, whether they are satisfied or not, while preserving the hierarchy among them. We implemented Unified-SW in a new local search solver called USW-LS. Experimental results demonstrate that USW-LS outperforms the state-of-the-art local search solvers across benchmarks from anytime tracks of recent MaxSAT Evaluations. More promisingly, a hybrid solver combining USW-LS and TT-Open-WBO-Inc won all four categories in the anytime track of MaxSAT Evaluation 2023.
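
The abstract gives only the shape of the update rule; the Python sketch below is our reading of that shape, not the paper's pseudocode, and the scaling choice is an assumption.

def unified_sw_bump(dyn, soft, inc=1.0):
    """Unified-SW-style update as we read the abstract: at a *feasible* local
    optimum, bump the dynamic weight of every soft clause, satisfied or not,
    scaled by its original cost so the weight hierarchy is preserved.
    `soft` maps clause id -> original cost; `dyn` holds dynamic weights."""
    for cid, cost in soft.items():
        dyn[cid] = dyn.get(cid, cost) + inc * cost
    return dyn

print(unified_sw_bump({}, {"c1": 1, "c2": 5}))   # hierarchy c2 > c1 is kept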

Cite as

Yi Chu, Chu-Min Li, Furong Ye, and Shaowei Cai. Enhancing MaxSAT Local Search via a Unified Soft Clause Weighting Scheme. In 27th International Conference on Theory and Applications of Satisfiability Testing (SAT 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 305, pp. 8:1-8:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{chu_et_al:LIPIcs.SAT.2024.8,
  author =	{Chu, Yi and Li, Chu-Min and Ye, Furong and Cai, Shaowei},
  title =	{{Enhancing MaxSAT Local Search via a Unified Soft Clause Weighting Scheme}},
  booktitle =	{27th International Conference on Theory and Applications of Satisfiability Testing (SAT 2024)},
  pages =	{8:1--8:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-334-8},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{305},
  editor =	{Chakraborty, Supratik and Jiang, Jie-Hong Roland},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SAT.2024.8},
  URN =		{urn:nbn:de:0030-drops-205301},
  doi =		{10.4230/LIPIcs.SAT.2024.8},
  annote =	{Keywords: Weighted Partial MaxSAT, Local Search Method, Weighting Scheme}
}
Document
On the Relative Efficiency of Dynamic and Static Top-Down Compilation to Decision-DNNF

Authors: Alexis de Colnet

Published in: LIPIcs, Volume 305, 27th International Conference on Theory and Applications of Satisfiability Testing (SAT 2024)


Abstract
Top-down compilers of CNF formulas to circuits in decision-DNNF (Decomposable Negation Normal Form) have proved to be useful for model counting. These compilers rely on a common set of techniques including DPLL-style exploration of the set of models, caching of residual formulas, and connected components detection. Differences between compilers lie in the variable selection heuristics and in the additional processing techniques they may use. We investigate, from a theoretical perspective, the ability of top-down compilation algorithms to find small decision-DNNF circuits for two different variable selection strategies. Both strategies are guided by a graph of the CNF formula and are inspired by what is done in practice. The first uses a dynamic graph-partitioning approach while the second works with a static tree decomposition. We show that the dynamic approach performs significantly better than the static approach for some formulas, and that the opposite also holds for other formulas. Our lower bounds are proved despite loose settings where the compilation algorithm is only forced to follow its designed variable selection strategy and where everything else, including the many opportunities for tie-breaking, can be handled non-deterministically.
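
For concreteness, here is a toy top-down model counter in Python (counting rather than compiling; the trace of such a search is what yields a decision-DNNF) built from the common toolbox the abstract lists: DPLL-style branching, caching of residual formulas, and connected-component detection. The "first free variable" selection is a stand-in for the dynamic graph-partitioning and static tree-decomposition strategies the paper actually analyzes.

from functools import lru_cache

def cnf_vars(clauses):
    return {abs(l) for c in clauses for l in c}

def residual(clauses, lit):
    """Condition a CNF (tuple of int-tuples) on literal lit."""
    out = []
    for c in clauses:
        if lit in c:
            continue                     # satisfied clauses drop out
        r = tuple(l for l in c if l != -lit)
        if not r:
            return None                  # empty clause: conflict
        out.append(r)
    return tuple(sorted(set(out)))

def split_components(clauses):
    """Partition clauses into variable-disjoint connected components."""
    comps, pool = [], list(clauses)
    while pool:
        comp = [pool.pop()]
        vs = {abs(l) for l in comp[0]}
        grew = True
        while grew:
            grew = False
            for c in pool[:]:
                if vs & {abs(l) for l in c}:
                    pool.remove(c)
                    comp.append(c)
                    vs |= {abs(l) for l in c}
                    grew = True
        comps.append(tuple(sorted(comp)))
    return comps

@lru_cache(maxsize=None)                 # caching of residual formulas
def count(clauses):
    """Number of models over exactly the variables `clauses` mentions."""
    if not clauses:
        return 1
    comps = split_components(clauses)
    if len(comps) > 1:                   # disjoint components multiply
        prod = 1
        for comp in comps:
            prod *= count(comp)
        return prod
    nv = len(cnf_vars(clauses))
    v = min(cnf_vars(clauses))           # stand-in variable selection heuristic
    total = 0
    for lit in (v, -v):
        r = residual(clauses, lit)
        if r is not None:                # vars silenced without branching are free
            total += count(r) * 2 ** (nv - 1 - len(cnf_vars(r)))
    return total

print(count(((1, 2), (-1, 3))))          # 4 models over variables {1, 2, 3}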

Cite as

Alexis de Colnet. On the Relative Efficiency of Dynamic and Static Top-Down Compilation to Decision-DNNF. In 27th International Conference on Theory and Applications of Satisfiability Testing (SAT 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 305, pp. 11:1-11:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{decolnet:LIPIcs.SAT.2024.11,
  author =	{de Colnet, Alexis},
  title =	{{On the Relative Efficiency of Dynamic and Static Top-Down Compilation to Decision-DNNF}},
  booktitle =	{27th International Conference on Theory and Applications of Satisfiability Testing (SAT 2024)},
  pages =	{11:1--11:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-334-8},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{305},
  editor =	{Chakraborty, Supratik and Jiang, Jie-Hong Roland},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SAT.2024.11},
  URN =		{urn:nbn:de:0030-drops-205339},
  doi =		{10.4230/LIPIcs.SAT.2024.11},
  annote =	{Keywords: Knowledge compilation, top-down compilation, decision-DNNF Circuits}
}
Document
Parallel Clause Sharing Strategy Based on Graph Structure of SAT Problem

Authors: Yoichiro Iida, Tomohiro Sonobe, and Mary Inaba

Published in: LIPIcs, Volume 305, 27th International Conference on Theory and Applications of Satisfiability Testing (SAT 2024)


Abstract
Parallelization of SAT solvers is an important technique for improving solver performance. The selection of the learnt clauses to share among parallel workers is crucial for its efficiency. Literal block distance (LBD) is often used to evaluate the quality of clauses to select. We propose a new method, Parallel Clause sharing based on graph Structure (PaCS), to select good clauses for sharing. First, we conducted three preliminary experiments to assess the performance of LBD in parallel clause sharing: a performance comparison between LBD and clause size, an analysis of the utilization of shared clauses, and a comparison of the LBD values of shared clauses at originating and receiving workers. These experiments indicate that LBD may not be optimal for learnt clause sharing. We attribute this to LBD’s inherent dependency on decision trees: each parallel worker has a unique decision tree, so a shared clause that is good for its originating worker may not be good for others. Therefore, we propose PaCS, a search-independent method that uses the graph structure derived from the input CNF of SAT problems. PaCS evaluates clauses using the weight of their edges in the variable incidence graph. Using the input CNF’s graph is effective for parallel clause sharing because it is the common input for all parallel workers. Furthermore, using edge weights favours clauses whose variables' Boolean values are more likely to be determined. Performance evaluation experiments demonstrate that our strategy outperforms LBD by 4% in the number of solved instances and by 12% in PAR-2. This study opens avenues for further improvements in parallel-solving strategies using the structure of SAT problems and reinterpretations of the quality of learnt clauses.
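
The abstract does not give the exact edge weighting, so the Python sketch below assumes the common 1/C(|c|,2) clause-to-edge weighting to show the core idea: scores are derived from the input CNF, so every parallel worker computes the same score for a clause, unlike the search-dependent LBD.

from collections import defaultdict
from itertools import combinations

def build_vig(cnf):
    """Variable incidence graph of the input CNF: vertices are variables and
    each clause adds weight 1/C(|c|,2) to every pair of its variables (a
    common weighting; the paper's exact scheme may differ)."""
    w = defaultdict(float)
    for clause in cnf:
        vs = sorted({abs(l) for l in clause})
        if len(vs) < 2:
            continue
        share = 1.0 / (len(vs) * (len(vs) - 1) / 2)
        for u, v in combinations(vs, 2):
            w[(u, v)] += share
    return w

def clause_score(clause, vig):
    """Score a learnt clause by the total VIG weight among its variables."""
    vs = sorted({abs(l) for l in clause})
    return sum(vig.get((u, v), 0.0) for u, v in combinations(vs, 2))

vig = build_vig([(1, 2, 3), (-2, 4), (3, -4)])
learnt = [(1, -3), (2, -4, 1)]
learnt.sort(key=lambda c: clause_score(c, vig), reverse=True)  # share best first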

Cite as

Yoichiro Iida, Tomohiro Sonobe, and Mary Inaba. Parallel Clause Sharing Strategy Based on Graph Structure of SAT Problem. In 27th International Conference on Theory and Applications of Satisfiability Testing (SAT 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 305, pp. 17:1-17:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{iida_et_al:LIPIcs.SAT.2024.17,
  author =	{Iida, Yoichiro and Sonobe, Tomohiro and Inaba, Mary},
  title =	{{Parallel Clause Sharing Strategy Based on Graph Structure of SAT Problem}},
  booktitle =	{27th International Conference on Theory and Applications of Satisfiability Testing (SAT 2024)},
  pages =	{17:1--17:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-334-8},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{305},
  editor =	{Chakraborty, Supratik and Jiang, Jie-Hong Roland},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SAT.2024.17},
  URN =		{urn:nbn:de:0030-drops-205392},
  doi =		{10.4230/LIPIcs.SAT.2024.17},
  annote =	{Keywords: SAT Solver, Structure of SAT, Parallel application, Clause Learning}
}
Document
Optimal Layout Synthesis for Deep Quantum Circuits on NISQ Processors with 100+ Qubits

Authors: Irfansha Shaik and Jaco van de Pol

Published in: LIPIcs, Volume 305, 27th International Conference on Theory and Applications of Satisfiability Testing (SAT 2024)


Abstract
Layout synthesis maps a quantum circuit onto a quantum processor. SWAP gate insertions are needed so that 2-qubit gates are scheduled only on connected physical qubits. With the ever-increasing number of qubits in NISQ processors, scalable layout synthesis is of utmost importance. With large optimality gaps observed in heuristic approaches, scalable exact methods are needed. While recent exact and near-optimal approaches scale to moderate circuits, large deep circuits are still out of scope. In this work, we propose a SAT encoding based on parallel plans that apply one SWAP and a group of CNOTs at each time step. Using domain-specific information, we maintain optimality in parallel plans while scaling to large and deep circuits. Our results show the scalability of our approach, which significantly outperforms leading exact and near-optimal approaches (by up to 100x). For the first time, we can optimally map several 8-, 14-, and 16-qubit circuits onto 54-, 80-, and 127-qubit platforms with up to 17 SWAPs. While adding optimal SWAPs, we also report near-optimal depth in our mapped circuits.
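
The paper's method is a SAT encoding over parallel plans, which is too large to sketch here; as a contrast baseline (ours, not the paper's), the greedy Python router below shows why SWAP insertion arises at all: whenever a CNOT's logical qubits are mapped to non-adjacent physical qubits, one of them is walked over, paying one SWAP per step. Optimal layout synthesis minimizes exactly this cost.

from collections import deque

def greedy_route(cnots, edges, mapping):
    """Baseline greedy router, NOT the paper's SAT encoding. Assumes every
    physical qubit on the chosen path hosts some logical qubit."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)

    def shortest_path(s, t):            # breadth-first search on the device graph
        prev, queue = {s: None}, deque([s])
        while queue:
            u = queue.popleft()
            if u == t:
                break
            for w in adj[u]:
                if w not in prev:
                    prev[w] = u
                    queue.append(w)
        path = [t]
        while prev[path[-1]] is not None:
            path.append(prev[path[-1]])
        return path[::-1]

    pos = dict(mapping)                 # logical qubit -> physical qubit
    swaps = 0
    for a, b in cnots:
        path = shortest_path(pos[a], pos[b])
        for p, q in zip(path, path[1:-1]):     # drag qubit a next to qubit b
            inv = {phys: log for log, phys in pos.items()}
            pos[inv[p]], pos[inv[q]] = q, p    # one inserted SWAP gate
            swaps += 1
    return swaps

# Line-shaped 4-qubit device; a CNOT on logical (0, 2) needs exactly one SWAP.
print(greedy_route([(0, 2)], [(0, 1), (1, 2), (2, 3)], {0: 0, 1: 1, 2: 2}))  # 1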

Cite as

Irfansha Shaik and Jaco van de Pol. Optimal Layout Synthesis for Deep Quantum Circuits on NISQ Processors with 100+ Qubits. In 27th International Conference on Theory and Applications of Satisfiability Testing (SAT 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 305, pp. 26:1-26:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{shaik_et_al:LIPIcs.SAT.2024.26,
  author =	{Shaik, Irfansha and van de Pol, Jaco},
  title =	{{Optimal Layout Synthesis for Deep Quantum Circuits on NISQ Processors with 100+ Qubits}},
  booktitle =	{27th International Conference on Theory and Applications of Satisfiability Testing (SAT 2024)},
  pages =	{26:1--26:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-334-8},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{305},
  editor =	{Chakraborty, Supratik and Jiang, Jie-Hong Roland},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SAT.2024.26},
  URN =		{urn:nbn:de:0030-drops-205487},
  doi =		{10.4230/LIPIcs.SAT.2024.26},
  annote =	{Keywords: Layout Synthesis, Transpiling, Qubit Mapping and Routing, Quantum Circuits, Propositional Satisfiability, Parallel Plans}
}
Document
Position
Standardizing Knowledge Engineering Practices with a Reference Architecture

Authors: Bradley P. Allen and Filip Ilievski

Published in: TGDK, Volume 2, Issue 1 (2024): Special Issue on Trends in Graph Data and Knowledge - Part 2. Transactions on Graph Data and Knowledge, Volume 2, Issue 1


Abstract
Knowledge engineering is the process of creating and maintaining knowledge-producing systems. Throughout the history of computer science and AI, knowledge engineering workflows have been widely used given the importance of high-quality knowledge for reliable intelligent agents. Meanwhile, the scope of knowledge engineering, as apparent from its target tasks and use cases, has been shifting, together with its paradigms such as expert systems, semantic web, and language modeling. The intended use cases and supported user requirements between these paradigms have not been analyzed globally, as new paradigms often satisfy prior pain points while possibly introducing new ones. The recent abstraction of systemic patterns into a boxology provides an opening for aligning the requirements and use cases of knowledge engineering with the systems, components, and software that can satisfy them best; however, this direction has not been explored to date. This paper proposes a vision of harmonizing the best practices in the field of knowledge engineering by leveraging the software engineering methodology of creating reference architectures. We describe how a reference architecture can be iteratively designed and implemented to associate user needs with recurring systemic patterns, building on top of existing knowledge engineering workflows and boxologies. We provide a six-step roadmap that can enable the development of such an architecture, consisting of scope definition, selection of information sources, architectural analysis, synthesis of an architecture based on the information source analysis, evaluation through instantiation, and, ultimately, instantiation into a concrete software architecture. We provide an initial design and outcome of the definition of architectural scope, selection of information sources, and analysis. As the remaining steps of design, evaluation, and instantiation of the architecture are largely use-case specific, we provide a detailed description of their procedures and point to relevant examples. We expect that following through on this vision will lead to well-grounded reference architectures for knowledge engineering, will advance the ongoing initiatives of organizing the neurosymbolic knowledge engineering space, and will build new links to the software architectures and data science communities.

Cite as

Bradley P. Allen and Filip Ilievski. Standardizing Knowledge Engineering Practices with a Reference Architecture. In Special Issue on Trends in Graph Data and Knowledge - Part 2. Transactions on Graph Data and Knowledge (TGDK), Volume 2, Issue 1, pp. 5:1-5:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@Article{allen_et_al:TGDK.2.1.5,
  author =	{Allen, Bradley P. and Ilievski, Filip},
  title =	{{Standardizing Knowledge Engineering Practices with a Reference Architecture}},
  journal =	{Transactions on Graph Data and Knowledge},
  pages =	{5:1--5:23},
  ISSN =	{2942-7517},
  year =	{2024},
  volume =	{2},
  number =	{1},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/TGDK.2.1.5},
  URN =		{urn:nbn:de:0030-drops-198623},
  doi =		{10.4230/TGDK.2.1.5},
  annote =	{Keywords: knowledge engineering, knowledge graphs, quality attributes, software architectures, sociotechnical systems}
}
Document
Plan Recognition (Dagstuhl Seminar 11141)

Authors: Robert P. Goldman, Christopher W. Geib, Henry Kautz, and Tamim Asfour

Published in: Dagstuhl Reports, Volume 1, Issue 4 (2011)


Abstract
This Dagstuhl seminar brought together researchers with a wide range of interests and backgrounds related to plan and activity recognition. It featured a substantial set of longer tutorials on aspects of plan and activity recognition, related topics, and useful methods, as a way of establishing a common vocabulary and a shared basis of understanding. Building on this shared understanding, individual researchers presented talks about their work in the area. There were also panel discussions that addressed how best to foster progress in the field (specifically, how to improve our ability to compare different plan and activity recognition algorithms) and whether to assume rationality in the modeled agents (a question of great concern in many fields at this time). This report presents a summary of the talks and discussions at the seminar.

Cite as

Robert P. Goldman, Christopher W. Geib, Henry Kautz, and Tamim Asfour. Plan Recognition (Dagstuhl Seminar 11141). In Dagstuhl Reports, Volume 1, Issue 4, pp. 1-22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2011)


BibTeX

@Article{goldman_et_al:DagRep.1.4.1,
  author =	{Goldman, Robert P. and Geib, Christopher W. and Kautz, Henry and Asfour, Tamim},
  title =	{{Plan Recognition (Dagstuhl Seminar 11141)}},
  pages =	{1--22},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2011},
  volume =	{1},
  number =	{4},
  editor =	{Goldman, Robert P. and Geib, Christopher W. and Kautz, Henry and Asfour, Tamim},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagRep.1.4.1},
  URN =		{urn:nbn:de:0030-drops-31958},
  doi =		{10.4230/DagRep.1.4.1},
  annote =	{Keywords: Artificial intelligence, plan recognition, intent recognition, activity recognition}
}
Document
05241 Abstracts Collection – Synthesis and Planning

Authors: Henry Kautz, Wolfgang Thomas, and Moshe Y. Vardi

Published in: Dagstuhl Seminar Proceedings, Volume 5241, Synthesis and Planning (2006)


Abstract
From 12.06.2005 to 17.06.2005, the Dagstuhl Seminar 05241 "Synthesis and Planning" was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar, as well as abstracts of seminar results and ideas, are put together in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided, if available.

Cite as

Henry Kautz, Wolfgang Thomas, and Moshe Y. Vardi. 05241 Abstracts Collection – Synthesis and Planning. In Synthesis and Planning. Dagstuhl Seminar Proceedings, Volume 5241, pp. 1-13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


BibTeX

@InProceedings{kautz_et_al:DagSemProc.05241.1,
  author =	{Kautz, Henry and Thomas, Wolfgang and Vardi, Moshe Y.},
  title =	{{05241 Abstracts Collection – Synthesis and Planning}},
  booktitle =	{Synthesis and Planning},
  pages =	{1--13},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{5241},
  editor =	{Kautz, Henry and Thomas, Wolfgang and Vardi, Moshe Y.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.05241.1},
  URN =		{urn:nbn:de:0030-drops-4531},
  doi =		{10.4230/DagSemProc.05241.1},
  annote =	{Keywords: AI planning, controller synthesis, partially observed domains, reactive computation, program analysis, games, model checking, satisfiability, Markov decision processes}
}
Document
05241 Executive Summary – Synthesis and Planning

Authors: Henry Kautz, Wolfgang Thomas, and Moshe Y. Vardi

Published in: Dagstuhl Seminar Proceedings, Volume 5241, Synthesis and Planning (2006)


Abstract
This seminar brought together researchers working in two complementary fields: automatic synthesis of (control) programs, and methods for devising planning algorithms in artificial intelligence (AI). This combines a strong thread of current research in automata theory with an area of possible but so far unexplored applications.

Cite as

Henry Kautz, Wolfgang Thomas, and Moshe Y. Vardi. 05241 Executive Summary – Synthesis and Planning. In Synthesis and Planning. Dagstuhl Seminar Proceedings, Volume 5241, pp. 1-4, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


BibTeX

@InProceedings{kautz_et_al:DagSemProc.05241.2,
  author =	{Kautz, Henry and Thomas, Wolfgang and Vardi, Moshe Y.},
  title =	{{05241 Executive Summary – Synthesis and Planning}},
  booktitle =	{Synthesis and Planning},
  pages =	{1--4},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{5241},
  editor =	{Kautz, Henry and Thomas, Wolfgang and Vardi, Moshe Y.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.05241.2},
  URN =		{urn:nbn:de:0030-drops-4527},
  doi =		{10.4230/DagSemProc.05241.2},
  annote =	{Keywords: Synthesis, planning}
}
  • Refine by Author
  • 3 Kautz, Henry
  • 2 Beyersdorff, Olaf
  • 2 Hoffmann, Tim
  • 2 Kasche, Kaspar
  • 2 Thomas, Wolfgang

  • Refine by Classification
  • 3 Theory of computation → Automated reasoning
  • 2 Theory of computation → Proof complexity
  • 1 Computing methodologies → Heuristic function construction
  • 1 Computing methodologies → Knowledge representation and reasoning
  • 1 Computing methodologies → Planning and scheduling

  • Refine by Keyword
  • 1 #SAT
  • 1 AI Planning
  • 1 AI planning
  • 1 Approximate Method
  • 1 Artificial intelligence

  • Refine by Type
  • 15 document

  • Refine by Publication Year
  • 12 2024
  • 2 2006
  • 1 2011
