LIPIcs, Volume 307

30th International Conference on Principles and Practice of Constraint Programming (CP 2024)




Event

CP 2024, September 2-6, 2024, Girona, Spain

Editor

Paul Shaw
  • IBM, Biot, France

Publication Details

  • published at: 2024-08-29
  • Publisher: Schloss Dagstuhl – Leibniz-Zentrum für Informatik
  • ISBN: 978-3-95977-336-2
  • DBLP: db/conf/cp/cp2024

Documents

Document
Complete Volume
LIPIcs, Volume 307, CP 2024, Complete Volume

Authors: Paul Shaw


Abstract
LIPIcs, Volume 307, CP 2024, Complete Volume

Cite as

30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 1-728, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@Proceedings{shaw:LIPIcs.CP.2024,
  title =	{{LIPIcs, Volume 307, CP 2024, Complete Volume}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{1--728},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024},
  URN =		{urn:nbn:de:0030-drops-206842},
  doi =		{10.4230/LIPIcs.CP.2024},
  annote =	{Keywords: LIPIcs, Volume 307, CP 2024, Complete Volume}
}
Document
Front Matter
Front Matter, Table of Contents, Preface, Conference Organization

Authors: Paul Shaw


Abstract
Front Matter, Table of Contents, Preface, Conference Organization

Cite as

30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 0:i-0:xxii, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{shaw:LIPIcs.CP.2024.0,
  author =	{Shaw, Paul},
  title =	{{Front Matter, Table of Contents, Preface, Conference Organization}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{0:i--0:xxii},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.0},
  URN =		{urn:nbn:de:0030-drops-206854},
  doi =		{10.4230/LIPIcs.CP.2024.0},
  annote =	{Keywords: Front Matter, Table of Contents, Preface, Conference Organization}
}
Document
Invited Talk
Solving Patience and Solitaire Games with Good Old Fashioned AI (Invited Talk)

Authors: Ian P. Gent


Abstract
While games like Chess, Checkers and Go have been the subject of extensive research in AI for decades, there has been comparatively little study of single-player card games. These games are generally called "Patience" in British English and "Solitaire" in US English, and have been popular for hundreds of years and remain so today. In fact, our ignorance of the winnability percentage of just one such game - "Klondike" - has been described as "one of the embarrassments of applied mathematics" by the distinguished statistician Persi Diaconis. I will talk about "Solvitaire", a program to solve patience games given a simple JSON description of the rules of the game and the initial layout. We have used Solvitaire to determine the winnability percentage of dozens of different single-player card games with a 95% confidence interval of ± 0.1% or better. For example, we now know the winnability of Klondike to be 81.945% ± 0.084% (in the "thoughtful" variant where the player knows the rank and suit of all cards), a 30-fold reduction in confidence interval over the best previous result. The vast majority of results we obtained with Solvitaire are either entirely new or represent significant improvements on previous knowledge. Solvitaire is very much a "Good Old Fashioned AI" approach to solving patience games, without using Machine Learning or Neural Networks. It uses exhaustive depth-first search to explore all possible ways that a game could be won, ensuring that games reported unwinnable really are so. This can involve searching extraordinary search spaces with depths in the millions, even in cases where unwinnability is proven. Numerous techniques imported from AI search play an important role in making this search practicable. Particularly important ones are: the use of transposition tables; the exploitation of symmetry in search; the use of dominances to force certain moves to be made when it is safe to do so; and the use of streamliners. Solvitaire does have some games it performs poorly on, where exhaustive search is unable to prove that no win is possible but an alternative simple proof is in fact available. I will also talk about using constraint models to do this, leading to slight improvements in some variants of Klondike but dramatic improvements in others. This talk will include personal anecdotes, explaining for example why it is dedicated to my mother Margaret Gent (1923-2021) for her patience in teaching me to love the game of patience.
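
As an illustration of the exhaustive depth-first search with transposition tables described above, here is a minimal Python sketch; the is_won, moves and apply_move interfaces are hypothetical placeholders, not Solvitaire's actual code.

def solvable(state, is_won, moves, apply_move, seen=None):
    """Exhaustive DFS: return True iff some move sequence wins from `state`.

    `seen` acts as a transposition table of states already proven unwinnable,
    so positions reached again by a different move order are not re-searched.
    """
    if seen is None:
        seen = set()
    if is_won(state):
        return True
    if state in seen:
        return False
    for move in moves(state):
        if solvable(apply_move(state, move), is_won, moves, apply_move, seen):
            return True
    seen.add(state)  # every continuation failed: remember the dead end
    return False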

Cite as

Ian P. Gent. Solving Patience and Solitaire Games with Good Old Fashioned AI (Invited Talk). In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, p. 1:1, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{gent:LIPIcs.CP.2024.1,
  author =	{Gent, Ian P.},
  title =	{{Solving Patience and Solitaire Games with Good Old Fashioned AI}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{1:1--1:1},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.1},
  URN =		{urn:nbn:de:0030-drops-206863},
  doi =		{10.4230/LIPIcs.CP.2024.1},
  annote =	{Keywords: AI Search, Solitaire and Patience Games}
}
Document
Invited Talk
Thinking Fast and Slow in AI: A Cognitive Architecture to Augment Both AI and Human Reasoning (Invited Talk)

Authors: Francesca Rossi


Abstract
AI systems are very useful in practically every sector, but they also have several limitations, mostly related to the lack of reasoning capabilities. According to the fast and slow thinking theory of human decision making, we can say that data-driven AI, including generative AI, provides fast thinking capabilities, but it does not have slow thinking ones. Existing cognitive theories of human decision making, such as the thinking fast and slow theory, can provide insights on how to advance AI systems towards some of these capabilities. In this talk I will present a general architecture, called SOFAI, that is based on fast/slow solvers and a meta-cognitive component that provides centralized governance of the solvers. I will describe two instances of this architecture, for constrained grid navigation and planning, showing experimentally that SOFAI generates better decisions than each of the individual solvers. Emergent behaviors related to adaptability, skill learning, and cognitive control are also shown in the analysis of SOFAI’s behavior. I will also describe how the thinking fast and slow theory can help design a value-based human-machine collaborative decision environment.

Cite as

Francesca Rossi. Thinking Fast and Slow in AI: A Cognitive Architecture to Augment Both AI and Human Reasoning (Invited Talk). In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, p. 2:1, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{rossi:LIPIcs.CP.2024.2,
  author =	{Rossi, Francesca},
  title =	{{Thinking Fast and Slow in AI: A Cognitive Architecture to Augment Both AI and Human Reasoning}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{2:1--2:1},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.2},
  URN =		{urn:nbn:de:0030-drops-206874},
  doi =		{10.4230/LIPIcs.CP.2024.2},
  annote =	{Keywords: Artificial Intelligence, Meta-reasoning}
}
Document
The Complexity of Symmetry Breaking Beyond Lex-Leader

Authors: Markus Anders, Sofia Brenner, and Gaurav Rattan


Abstract
Symmetry breaking is a widely popular approach to enhance solvers in constraint programming, such as those for SAT or MIP. Symmetry breaking predicates (SBPs) typically impose an order on variables and single out the lexicographic leader (lex-leader) in each orbit of assignments. Although it is NP-hard to find complete lex-leader SBPs, incomplete lex-leader SBPs are widely used in practice. In this paper, we investigate the complexity of computing complete SBPs, lex-leader or otherwise, for SAT. Our main result proves a natural barrier for efficiently computing SBPs: efficient certification of graph non-isomorphism. Our results explain the difficulty of obtaining short SBPs for important CP problems, such as matrix models with row-column symmetries and graph generation problems. Our results hold even when SBPs are allowed to introduce additional variables. We show polynomial upper bounds for breaking certain symmetry groups, namely automorphism groups of trees and wreath products of groups with efficient SBPs.
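
For readers unfamiliar with the lex-leader concept, the brute-force sketch below (illustrative only, taking the lexicographically smallest representative; the function names are ours, not the paper's) keeps exactly one assignment per orbit of a set of variable permutations.

from itertools import product

def is_lex_leader(assignment, symmetries):
    """True iff the 0/1 tuple `assignment` is lexicographically minimal
    among its images under the given variable permutations."""
    return all(assignment <= tuple(assignment[p[i]] for i in range(len(assignment)))
               for p in symmetries)

# Under the swap of two Boolean variables, (1, 0) is not the lex-leader of its orbit.
swap = [(1, 0)]
print([a for a in product((0, 1), repeat=2) if is_lex_leader(a, swap)])
# -> [(0, 0), (0, 1), (1, 1)]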

Cite as

Markus Anders, Sofia Brenner, and Gaurav Rattan. The Complexity of Symmetry Breaking Beyond Lex-Leader. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 3:1-3:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{anders_et_al:LIPIcs.CP.2024.3,
  author =	{Anders, Markus and Brenner, Sofia and Rattan, Gaurav},
  title =	{{The Complexity of Symmetry Breaking Beyond Lex-Leader}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{3:1--3:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.3},
  URN =		{urn:nbn:de:0030-drops-206881},
  doi =		{10.4230/LIPIcs.CP.2024.3},
  annote =	{Keywords: symmetry breaking, boolean satisfiability, matrix models, graph isomorphism}
}
Document
Certifying Without Loss of Generality Reasoning in Solution-Improving Maximum Satisfiability

Authors: Jeremias Berg, Bart Bogaerts, Jakob Nordström, Andy Oertel, Tobias Paxian, and Dieter Vandesande


Abstract
Proof logging has long been the established method to certify correctness of Boolean satisfiability (SAT) solvers, but has only recently been introduced for SAT-based optimization (MaxSAT). The focus of this paper is solution-improving search (SIS), in which a SAT solver is iteratively queried for increasingly better solutions until an optimal one is found. A challenging aspect of modern SIS solvers is that they make use of complex "without loss of generality" arguments that are quite involved to understand even at a human meta-level, let alone to express in a simple, machine-verifiable proof. In this work, we develop pseudo-Boolean proof logging methods for solution-improving MaxSAT solving, and use them to produce a certifying version of the state-of-the-art solver Pacose with VeriPB proofs. Our experimental evaluation demonstrates that this approach works in practice. We hope that this is yet another step towards general adoption of proof logging in MaxSAT solving.

Cite as

Jeremias Berg, Bart Bogaerts, Jakob Nordström, Andy Oertel, Tobias Paxian, and Dieter Vandesande. Certifying Without Loss of Generality Reasoning in Solution-Improving Maximum Satisfiability. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 4:1-4:28, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{berg_et_al:LIPIcs.CP.2024.4,
  author =	{Berg, Jeremias and Bogaerts, Bart and Nordstr\"{o}m, Jakob and Oertel, Andy and Paxian, Tobias and Vandesande, Dieter},
  title =	{{Certifying Without Loss of Generality Reasoning in Solution-Improving Maximum Satisfiability}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{4:1--4:28},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.4},
  URN =		{urn:nbn:de:0030-drops-206895},
  doi =		{10.4230/LIPIcs.CP.2024.4},
  annote =	{Keywords: proof logging, certifying algorithms, MaxSAT, solution-improving search, SAT-UNSAT, maximum satisfiability, combinatorial optimization, certification, pseudo-Boolean}
}
Document
ParLS-PBO: A Parallel Local Search Solver for Pseudo Boolean Optimization

Authors: Zhihan Chen, Peng Lin, Hao Hu, and Shaowei Cai


Abstract
As a broadly applied technique in numerous optimization problems, local search has recently been employed to solve the Pseudo-Boolean Optimization (PBO) problem. A representative local search solver for PBO is LS-PBO. In this paper, we first improve LS-PBO with a dynamic scoring mechanism, which dynamically strikes a balance between the score on hard constraints and the score on the objective function. Moreover, on top of this improved LS-PBO, we develop the first parallel local search PBO solver. The main idea is to share good solutions among different threads to guide the search, by maintaining a pool of feasible solutions. For evaluating solutions when updating the pool, we propose a function that considers both the solution quality and the diversity of the pool. Furthermore, we calculate the polarity density in the pool to enhance the scoring function of local search. Our empirical experiments show clear benefits of the proposed parallel approach, making it competitive with the parallel version of the well-known commercial solver Gurobi.

Cite as

Zhihan Chen, Peng Lin, Hao Hu, and Shaowei Cai. ParLS-PBO: A Parallel Local Search Solver for Pseudo Boolean Optimization. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 5:1-5:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{chen_et_al:LIPIcs.CP.2024.5,
  author =	{Chen, Zhihan and Lin, Peng and Hu, Hao and Cai, Shaowei},
  title =	{{ParLS-PBO: A Parallel Local Search Solver for Pseudo Boolean Optimization}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{5:1--5:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.5},
  URN =		{urn:nbn:de:0030-drops-206900},
  doi =		{10.4230/LIPIcs.CP.2024.5},
  annote =	{Keywords: Pseudo-Boolean Optimization, Parallel Solving, Local Search, Scoring Function, Solution Pool}
}
Document
Deep Cooperation of Local Search and Unit Propagation Techniques

Authors: Xiamin Chen, Zhendong Lei, and Pinyan Lu


Abstract
Local search (LS) is an efficient method for solving combinatorial optimization problems such as MaxSAT and Pseudo-Boolean Optimization (PBO). However, due to a lack of reasoning power and global information, LS methods easily get stuck at local optima. In contrast to LS, systematic search utilizes unit propagation and clause learning, techniques with strong reasoning capabilities, to avoid falling into local optima. Nevertheless, complete search is generally time-consuming when a globally optimal solution is required. This work proposes a deep cooperation framework combining local search and unit propagation to address their inherent disadvantages. First, we design a mechanism to detect when LS gets stuck, and then a well-designed unit propagation procedure is called upon to help escape the local optima. To the best of our knowledge, we are the first to integrate the unit propagation technique within LS to overcome local optima. Experiments based on a broad range of benchmarks from MaxSAT Evaluations, PBO competitions, the Mixed Integer Programming Library, and three real-life cases validate that our method significantly improves three state-of-the-art MaxSAT and PBO local search solvers.

Cite as

Xiamin Chen, Zhendong Lei, and Pinyan Lu. Deep Cooperation of Local Search and Unit Propagation Techniques. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 6:1-6:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{chen_et_al:LIPIcs.CP.2024.6,
  author =	{Chen, Xiamin and Lei, Zhendong and Lu, Pinyan},
  title =	{{Deep Cooperation of Local Search and Unit Propagation Techniques}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{6:1--6:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.6},
  URN =		{urn:nbn:de:0030-drops-206918},
  doi =		{10.4230/LIPIcs.CP.2024.6},
  annote =	{Keywords: PBO, Partial MaxSAT, LS, CDCL}
}
Document
Cumulative Scheduling with Calendars and Overtime

Authors: Samuel Cloutier and Claude-Guy Quimper


Abstract
In project scheduling, calendar considerations can increase the duration of a task when its execution overlaps with holidays. On the other hand, the use of overtime may decrease the task’s duration. We introduce the CalendarOvertime constraint which verifies that a task follows a calendar with overtime and holidays. We also introduce the CumulativeOvertime constraint, a variant of the Cumulative constraint, that also reasons with the calendars when propagating according to the resource consumption, the overtime, and the holidays. Experimental results of an RCPSP model on the PSPLIB, BL, and PACK instances augmented with calendars and overtime show that the use of the CalendarOvertime constraint offers a speedup greater than 2.9 on the instances optimally solved and finds better solutions on more than 79% of the remaining instances when compared to a decomposition of the constraint. We also show that the use of our CumulativeOvertime constraint further improves these results.
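
To make the calendar effect concrete, here is a small sketch (whole-day granularity, hypothetical helper, unrelated to the paper's actual propagators) of how a task's finish day is pushed back when execution cannot progress on holidays.

def finish_day(start, work_days_needed, holidays):
    """Day on which a task ends if it needs `work_days_needed` working days,
    starts on day `start`, and cannot progress on holiday days."""
    day, done = start, 0
    while done < work_days_needed:
        if day not in holidays:
            done += 1
        day += 1
    return day - 1

# A 3-day task starting on day 0 with day 1 as a holiday finishes on day 3 instead of day 2.
print(finish_day(0, 3, holidays={1}))  # -> 3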

Cite as

Samuel Cloutier and Claude-Guy Quimper. Cumulative Scheduling with Calendars and Overtime. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 7:1-7:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{cloutier_et_al:LIPIcs.CP.2024.7,
  author =	{Cloutier, Samuel and Quimper, Claude-Guy},
  title =	{{Cumulative Scheduling with Calendars and Overtime}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{7:1--7:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.7},
  URN =		{urn:nbn:de:0030-drops-206927},
  doi =		{10.4230/LIPIcs.CP.2024.7},
  annote =	{Keywords: Constraint programming, Scheduling, Global constraints, Calendars, Overtime, Cumulative constraint, Time-Tabling}
}
Document
Slide&Drill, a New Approach for Multi-Objective Combinatorial Optimization

Authors: João Cortes, Inês Lynce, and Vasco Manquinho


Abstract
Following the successful use of Propositional Satisfiability (SAT) algorithms in Boolean optimization (e.g., Maximum Satisfiability), several SAT-based algorithms have been proposed for Multi-Objective Combinatorial Optimization (MOCO). However, these new algorithms either provide a small subset of the Pareto front or follow a more exploratory search procedure and the solutions found are usually distant from the Pareto front. We extend the state of the art with a new SAT-based MOCO solver, Slide and Drill (Slide&Drill), that hones an upper bound set of the exact solution. Moreover, we show that Slide&Drill neatly complements proposed UNSAT-SAT algorithms for MOCO. These algorithms can work in tandem over the same shared "blackboard" formula, in order to enable a faster convergence. Experimental results in several sets of benchmark instances show that Slide&Drill can outperform other SAT-based algorithms for MOCO, in particular when paired with previously proposed UNSAT-SAT algorithms.
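
For reference, the Pareto front mentioned above is the set of mutually non-dominated solutions; a minimal definitional sketch (minimisation assumed, illustrative only, not the solver's algorithm):

def pareto_front(points):
    """Keep the points not dominated by any other (all objectives minimised)."""
    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
    return [p for p in points if not any(dominates(q, p) for q in points)]

print(pareto_front([(1, 5), (2, 2), (4, 1), (3, 3)]))  # (3, 3) is dominated by (2, 2)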

Cite as

João Cortes, Inês Lynce, and Vasco Manquinho. Slide&Drill, a New Approach for Multi-Objective Combinatorial Optimization. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 8:1-8:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{cortes_et_al:LIPIcs.CP.2024.8,
  author =	{Cortes, Jo\~{a}o and Lynce, In\^{e}s and Manquinho, Vasco},
  title =	{{Slide\&Drill, a New Approach for Multi-Objective Combinatorial Optimization}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{8:1--8:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.8},
  URN =		{urn:nbn:de:0030-drops-206932},
  doi =		{10.4230/LIPIcs.CP.2024.8},
  annote =	{Keywords: Multi-Objective Combinatorial Optimization, Satisfiability Algorithms}
}
Document
Pseudo-Boolean Reasoning About States and Transitions to Certify Dynamic Programming and Decision Diagram Algorithms

Authors: Emir Demirović, Ciaran McCreesh, Matthew J. McIlree, Jakob Nordström, Andy Oertel, and Konstantin Sidorov


Abstract
Pseudo-Boolean proof logging has been used successfully to provide certificates of optimality from a variety of constraint- and satisfiability-style solvers that combine reasoning with a backtracking or clause-learning search. Another paradigm, occurring in dynamic programming and decision diagram solving, instead reasons about partial states and possible transitions between them. We describe a framework for generating clean and efficient pseudo-Boolean proofs for these kinds of algorithms, and use it to produce certifying algorithms for knapsack, longest path, and interval scheduling. Because we use a common proof system, we can also reason about hybrid solving algorithms: we demonstrate this by providing proof logging for a dynamic programming based knapsack propagator inside a constraint programming solver.

Cite as

Emir Demirović, Ciaran McCreesh, Matthew J. McIlree, Jakob Nordström, Andy Oertel, and Konstantin Sidorov. Pseudo-Boolean Reasoning About States and Transitions to Certify Dynamic Programming and Decision Diagram Algorithms. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 9:1-9:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{demirovic_et_al:LIPIcs.CP.2024.9,
  author =	{Demirovi\'{c}, Emir and McCreesh, Ciaran and McIlree, Matthew J. and Nordstr\"{o}m, Jakob and Oertel, Andy and Sidorov, Konstantin},
  title =	{{Pseudo-Boolean Reasoning About States and Transitions to Certify Dynamic Programming and Decision Diagram Algorithms}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{9:1--9:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.9},
  URN =		{urn:nbn:de:0030-drops-206948},
  doi =		{10.4230/LIPIcs.CP.2024.9},
  annote =	{Keywords: Proof logging, dynamic programming, decision diagrams}
}
Document
Anytime Weighted Model Counting with Approximation Guarantees for Probabilistic Inference

Authors: Alexandre Dubray, Pierre Schaus, and Siegfried Nijssen


Abstract
Weighted model counting (WMC) plays a central role in probabilistic reasoning. Given that this problem is #P-hard, harder instances can generally only be addressed using approximate techniques based on sampling, which provide statistical convergence guarantees: the longer a sampling process runs, the more accurate the WMC is likely to be. In this work, we propose a deterministic search-based approach that can also be stopped at any time and provides hard lower- and upper-bound guarantees on the true WMC. This approach uses a value heuristic that guides exploration first towards models with a high weight and leverages Limited Discrepancy Search to make the bounds converge faster. The validity, scalability, and convergence of our approach are tested and compared with state-of-the-art baseline methods on the problem of computing marginal probabilities in Bayesian networks and reliability estimation in probabilistic graphs.
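
For readers new to the problem, weighted model counting sums, over all satisfying assignments, the product of per-literal weights; the brute-force sketch below illustrates the definition only and is not the anytime bounding algorithm of the paper.

from itertools import product

def weighted_model_count(n_vars, clauses, weight):
    """Exact WMC by enumeration, with clauses given as lists of 1-based DIMACS literals."""
    total = 0.0
    for bits in product((False, True), repeat=n_vars):
        if all(any(bits[abs(l) - 1] == (l > 0) for l in clause) for clause in clauses):
            w = 1.0
            for v, b in enumerate(bits, start=1):
                w *= weight[v] if b else weight[-v]
            total += w
    return total

# (x1 or x2) with Pr[x1] = 0.3 and Pr[x2] = 0.6  ->  1 - 0.7 * 0.4 = 0.72
print(weighted_model_count(2, [[1, 2]], {1: 0.3, -1: 0.7, 2: 0.6, -2: 0.4}))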

Cite as

Alexandre Dubray, Pierre Schaus, and Siegfried Nijssen. Anytime Weighted Model Counting with Approximation Guarantees for Probabilistic Inference. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 10:1-10:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{dubray_et_al:LIPIcs.CP.2024.10,
  author =	{Dubray, Alexandre and Schaus, Pierre and Nijssen, Siegfried},
  title =	{{Anytime Weighted Model Counting with Approximation Guarantees for Probabilistic Inference}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{10:1--10:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.10},
  URN =		{urn:nbn:de:0030-drops-206956},
  doi =		{10.4230/LIPIcs.CP.2024.10},
  annote =	{Keywords: Projected Weighted Model Counting, Limited Discrepancy Search, Approximate Method, Probabilistic Inference}
}
Document
A Multi-Stage Proof Logging Framework to Certify the Correctness of CP Solvers

Authors: Maarten Flippo, Konstantin Sidorov, Imko Marijnissen, Jeff Smits, and Emir Demirović


Abstract
Proof logging is used to increase trust in the optimality and unsatisfiability claims of solvers. However, to date, no constraint programming solver can practically produce proofs without significantly impacting performance, which hinders mainstream adoption. We address this issue by introducing a novel proof generation framework, together with a CP proof format and proof checker. Our approach is to divide the proof generation into three steps. At runtime, we require the CP solver to only produce a proof sketch, which we call a scaffold. After the solving is done, our proof processor trims and expands the scaffold into a full CP proof, which is subsequently verified. Our framework is agnostic to the solver and the verification approach. Through MiniZinc benchmarks, we demonstrate that with our framework, the overhead of logging during solving is often less than 10%, significantly lower than other approaches, and that our proof processing step can reduce the overall size of the proof by orders of magnitude and, by extension, the proof checking time. Our results demonstrate that proof logging has the potential to become an integral part of the CP community.

Cite as

Maarten Flippo, Konstantin Sidorov, Imko Marijnissen, Jeff Smits, and Emir Demirović. A Multi-Stage Proof Logging Framework to Certify the Correctness of CP Solvers. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 11:1-11:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{flippo_et_al:LIPIcs.CP.2024.11,
  author =	{Flippo, Maarten and Sidorov, Konstantin and Marijnissen, Imko and Smits, Jeff and Demirovi\'{c}, Emir},
  title =	{{A Multi-Stage Proof Logging Framework to Certify the Correctness of CP Solvers}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{11:1--11:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.11},
  URN =		{urn:nbn:de:0030-drops-206969},
  doi =		{10.4230/LIPIcs.CP.2024.11},
  annote =	{Keywords: proof logging, formal verification, constraint programming}
}
Document
Using Constraint Programming for Disjunctive Scheduling in Temporal AI Planning

Authors: Adam Francis Green, J. Christopher Beck, and Amanda Coles


Abstract
We present a novel scheduling model that leverages Constraint Programming (CP) to enhance problem solving performance in Temporal Planning. Building on the established strategy of decomposing causal and temporal reasoning, our approach abstracts two common fact structures present in many Temporal Planning problems - Semaphores and Envelopes - and performs temporal reasoning in a CP-based scheduler. At each search node in a heuristic search for a temporal plan, we construct and solve a Constraint Satisfaction Problem (CSP) and integrate feedback from the CP-based scheduler to guide the causal planning search towards a solution. Through experimental analysis, we validate the impact of these advances, demonstrating a significant reduction in both the number of states searched and in search time alongside an increase in problem-solving coverage.

Cite as

Adam Francis Green, J. Christopher Beck, and Amanda Coles. Using Constraint Programming for Disjunctive Scheduling in Temporal AI Planning. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 12:1-12:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{francisgreen_et_al:LIPIcs.CP.2024.12,
  author =	{Francis Green, Adam and Beck, J. Christopher and Coles, Amanda},
  title =	{{Using Constraint Programming for Disjunctive Scheduling in Temporal AI Planning}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{12:1--12:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.12},
  URN =		{urn:nbn:de:0030-drops-206974},
  doi =		{10.4230/LIPIcs.CP.2024.12},
  annote =	{Keywords: AI Planning, Temporal-Numeric Planning, Constraint Programming, Scheduling}
}
Document
Improved Bounds of Integer Solution Counts via Volume and Extending to Mixed-Integer Linear Constraints

Authors: Cunjing Ge and Armin Biere


Abstract
Solution counting and solution space integration over linear constraints are important problems with many applications. Previous works addressed either only counting integer points in polytopes (integer counting) with integer variables, or only computing the volume of polytopes (solution space integration) on variables over the reals, including approximating the integer count via a polytope’s volume. We are not aware of a non-trivial algorithm which addresses the mixed case, where linear constraints are over mixed integer and real variables. In this paper, we propose a new randomized algorithm to compute approximation guarantees (bounds) on integer solution counts. Then we present upper and lower bounds for solution space integration over mixed-integer linear constraints. Thus, the proposed algorithms can be extended to mixed-integer cases as well. The experiments show that approximations are often very close to exact results in practice, and that the bounds computed by our algorithm are often tight and useful.
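
To make the gap between integer counting and volume concrete, here is a brute-force sketch over a small 2-D polytope (purely illustrative; it has nothing to do with the paper's randomized algorithm):

import random
from itertools import product

# Polytope {(x, y) : x >= 0, y >= 0, x + y <= 10}, one row (a1, a2, b) per constraint a1*x + a2*y <= b.
constraints = [(-1, 0, 0), (0, -1, 0), (1, 1, 10)]

def inside(x, y):
    return all(a1 * x + a2 * y <= b for a1, a2, b in constraints)

integer_count = sum(inside(x, y) for x, y in product(range(0, 11), repeat=2))
# Monte-Carlo volume estimate over the bounding box [0, 10]^2 (area 100).
samples = 100_000
volume = 100.0 * sum(inside(random.uniform(0, 10), random.uniform(0, 10))
                     for _ in range(samples)) / samples
print(integer_count, volume)  # 66 integer points, true volume 50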

Cite as

Cunjing Ge and Armin Biere. Improved Bounds of Integer Solution Counts via Volume and Extending to Mixed-Integer Linear Constraints. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 13:1-13:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{ge_et_al:LIPIcs.CP.2024.13,
  author =	{Ge, Cunjing and Biere, Armin},
  title =	{{Improved Bounds of Integer Solution Counts via Volume and Extending to Mixed-Integer Linear Constraints}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{13:1--13:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.13},
  URN =		{urn:nbn:de:0030-drops-206983},
  doi =		{10.4230/LIPIcs.CP.2024.13},
  annote =	{Keywords: Integer Solution Counting, Mixed-Integer Linear Constraints, #SMT(LA) Problems, Volume of Polytopes}
}
Document
A CP/LS Heuristic Method for Maxmin and Minmax Location Problems with Distance Constraints

Authors: Panteleimon Iosif, Nikolaos Ploskas, Kostas Stergiou, and Dimosthenis C. Tsouros


Abstract
In facility location problems we seek to locate a set of facilities in an area, where clients may be present, so that some criterion is optimized. For instance, in the p-center problem we seek to minimize the maximum distance between any client and its closest facility, whereas in the p-dispersion problem we seek to maximize the minimum distance between any two facilities. Hence, in the former we have a minmax objective, whereas in the latter we have a maxmin objective. Recently, a variant of p-dispersion where distance constraints exist between facilities was studied from a CP and ILP perspective. An incomplete CP solver that uses a greedy heuristic to prune branches was shown to significantly outperform Gurobi and OR-Tools in terms of execution time, although it failed to discover optimal or near-optimal solutions in many instances. We enhance this work in two directions, regarding the effectiveness and the applicability of the approach. We first show how local search can be used to obtain better estimations of the bound at each node, resulting in more focused pruning, which allows for optimal or near-optimal solutions to be discovered in many more instances. Then, we demonstrate how the framework can be applied on the p-center problem with distance constraints, comparing it to ILP and CP models implemented in Gurobi and OR-Tools, respectively.
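
To recall the two objectives being contrasted, here is a small definitional sketch that evaluates a fixed set of facility locations under both criteria (Euclidean distances assumed; illustrative only):

from itertools import combinations
from math import dist

clients = [(0, 0), (4, 0), (0, 3)]
facilities = [(1, 1), (5, 1)]

# p-center value (minmax): worst client-to-nearest-facility distance, to be minimised.
p_center = max(min(dist(c, f) for f in facilities) for c in clients)
# p-dispersion value (maxmin): smallest pairwise facility distance, to be maximised.
p_dispersion = min(dist(f, g) for f, g in combinations(facilities, 2))
print(round(p_center, 3), round(p_dispersion, 3))  # -> 2.236 4.0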

Cite as

Panteleimon Iosif, Nikolaos Ploskas, Kostas Stergiou, and Dimosthenis C. Tsouros. A CP/LS Heuristic Method for Maxmin and Minmax Location Problems with Distance Constraints. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 14:1-14:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{iosif_et_al:LIPIcs.CP.2024.14,
  author =	{Iosif, Panteleimon and Ploskas, Nikolaos and Stergiou, Kostas and Tsouros, Dimosthenis C.},
  title =	{{A CP/LS Heuristic Method for Maxmin and Minmax Location Problems with Distance Constraints}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{14:1--14:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.14},
  URN =		{urn:nbn:de:0030-drops-206995},
  doi =		{10.4230/LIPIcs.CP.2024.14},
  annote =	{Keywords: Constraint Programming, Local Search, facility location, distance constraints, optimization}
}
Document
CSPs with Few Alien Constraints

Authors: Peter Jonsson, Victor Lagerkvist, and George Osipov


Abstract
The constraint satisfaction problem asks to decide if a set of constraints over a relational structure 𝒜 is satisfiable (CSP(𝒜)). We consider CSP(𝒜 ∪ ℬ) where 𝒜 is a structure and ℬ is an alien structure, and analyse its (parameterized) complexity when at most k alien constraints are allowed. We establish connections and obtain transferable complexity results to several well-studied problems that previously escaped classification attempts. Our novel approach, utilizing logical and algebraic methods, yields an FPT versus pNP dichotomy for arbitrary finite structures and sharper dichotomies for Boolean structures and first-order reducts of (ℕ, =) (equality CSPs), together with many partial results for general ω-categorical structures.

Cite as

Peter Jonsson, Victor Lagerkvist, and George Osipov. CSPs with Few Alien Constraints. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 15:1-15:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{jonsson_et_al:LIPIcs.CP.2024.15,
  author =	{Jonsson, Peter and Lagerkvist, Victor and Osipov, George},
  title =	{{CSPs with Few Alien Constraints}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{15:1--15:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.15},
  URN =		{urn:nbn:de:0030-drops-207005},
  doi =		{10.4230/LIPIcs.CP.2024.15},
  annote =	{Keywords: Constraint satisfaction, parameterized complexity, hybrid theories}
}
Document
A New Optimization Model for Multiple-Control Toffoli Quantum Circuit Design

Authors: Jihye Jung, Kevin Dalmeijer, and Pascal Van Hentenryck


Abstract
As quantum technology is advancing, the efficient design of quantum circuits has become an important area of research. This paper provides an introduction to the MCT quantum circuit design problem for reversible Boolean functions without assuming a prior background in quantum computing. While this is a well-studied problem, optimization models that minimize the true objective have only been explored recently. This paper introduces a new optimization model and symmetry-breaking constraints that improve solving time by up to two orders of magnitude compared to earlier work when a Constraint Programming solver is used. Experiments with up to seven qubits and using up to 15 quantum gates result in several new best-known circuits, obtained by any method, for well-known benchmarks. Finally, an extensive comparison with other approaches shows that optimization models may require more time but can provide superior circuits with optimality guarantees.
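
As background, a multiple-control Toffoli (MCT) gate flips its target bit exactly when all of its control bits are 1; a tiny sketch of that reversible semantics on classical bits (illustrative only, not the paper's optimization model):

def mct(bits, controls, target):
    """Apply an MCT gate to a tuple of classical bits:
    the target bit is flipped iff every control bit is 1."""
    bits = list(bits)
    if all(bits[c] for c in controls):
        bits[target] ^= 1
    return tuple(bits)

print(mct((1, 1, 0), controls=[0, 1], target=2))  # -> (1, 1, 1), a standard Toffoli gate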

Cite as

Jihye Jung, Kevin Dalmeijer, and Pascal Van Hentenryck. A New Optimization Model for Multiple-Control Toffoli Quantum Circuit Design. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 16:1-16:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{jung_et_al:LIPIcs.CP.2024.16,
  author =	{Jung, Jihye and Dalmeijer, Kevin and Van Hentenryck, Pascal},
  title =	{{A New Optimization Model for Multiple-Control Toffoli Quantum Circuit Design}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{16:1--16:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.16},
  URN =		{urn:nbn:de:0030-drops-207010},
  doi =		{10.4230/LIPIcs.CP.2024.16},
  annote =	{Keywords: Constraint Programming, Quantum Circuit Design, Reversible Circuits}
}
Document
Exponential Steepest Ascent from Valued Constraint Graphs of Pathwidth Four

Authors: Artem Kaznatcheev and Melle van Marle


Abstract
We examine the complexity of maximising fitness via local search on valued constraint satisfaction problems (VCSPs). We consider two kinds of local ascents: (1) steepest ascents, where each step changes the domain that produces a maximal increase in fitness; and (2) ≺-ordered ascents, where each step changes the ≺-minimal domain among those with an available fitness-increasing change. We provide a general padding argument to simulate any ordered ascent by a steepest ascent. We construct a VCSP that is a path of binary constraints between alternating 2-state and 3-state domains with exponentially long ordered ascents. We apply our padding argument to this VCSP to obtain a Boolean VCSP that has a constraint (hyper)graph of arity 5 and pathwidth 4 with exponential steepest ascents. This is an improvement on the previous best known construction for long steepest ascents, which had arity 8 and pathwidth 7.
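
A minimal sketch of the two ascent rules being compared, on a generic Boolean fitness function (a toy landscape, not the authors' construction):

def steepest_ascent(x, fitness):
    """Repeatedly flip the single bit that gives the largest fitness increase."""
    x = list(x)
    while True:
        best_gain, best_i = 0, None
        for i in range(len(x)):
            y = x.copy()
            y[i] ^= 1
            if fitness(y) - fitness(x) > best_gain:
                best_gain, best_i = fitness(y) - fitness(x), i
        if best_i is None:
            return tuple(x)  # local optimum reached
        x[best_i] ^= 1

def ordered_ascent(x, fitness):
    """Flip the smallest-index bit that gives any fitness increase (an ordered ascent)."""
    x = list(x)
    while True:
        for i in range(len(x)):
            y = x.copy()
            y[i] ^= 1
            if fitness(y) > fitness(x):
                x = y
                break
        else:
            return tuple(x)

onemax = lambda v: sum(v)
print(steepest_ascent((0, 0, 0), onemax), ordered_ascent((0, 0, 0), onemax))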

Cite as

Artem Kaznatcheev and Melle van Marle. Exponential Steepest Ascent from Valued Constraint Graphs of Pathwidth Four. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 17:1-17:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{kaznatcheev_et_al:LIPIcs.CP.2024.17,
  author =	{Kaznatcheev, Artem and van Marle, Melle},
  title =	{{Exponential Steepest Ascent from Valued Constraint Graphs of Pathwidth Four}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{17:1--17:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.17},
  URN =		{urn:nbn:de:0030-drops-207021},
  doi =		{10.4230/LIPIcs.CP.2024.17},
  annote =	{Keywords: valued constraint satisfaction problem, steepest ascent, local search, bounded treewidth, intractability}
}
Document
Learning Effect and Compound Activities in High Multiplicity RCPSP: Application to Satellite Production

Authors: Duc Anh Le, Stéphanie Roussel, Christophe Lecoutre, and Anouck Chan


Abstract
This paper addresses the High Multiplicity Resource-Constrained Project Scheduling Problem (HM-RCPSP), in which multiple projects are performed iteratively while sharing limited resources. We extend this problem by integrating the learning effect, which makes the duration of some activities decrease when they are repeated. The learning effect can be represented by any decreasing function, giving us flexibility in modeling various scenarios. Additionally, we take the composition of activities into consideration for reasoning about precedence and resources in a more abstract way. A Constraint Programming model is proposed for this richer problem, including a symmetry-breaking technique applied to some activities. We also present a heuristic-based search strategy. The effectiveness of these solving approaches is evaluated through experiments conducted on data from real-world satellite assembly lines, as well as on some adapted literature benchmarks. The obtained results demonstrate that our methods serve as robust baselines for addressing this novel problem (denoted by HM-RCPSP/L-C).
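
As an illustration of a learning-effect duration function (the model admits any decreasing function; this geometric form and its parameters are just an example, not taken from the paper):

def duration(base_duration, repetition, learning_rate=0.9):
    """Duration of the `repetition`-th execution of an activity (0-based),
    shrinking geometrically towards a floor of one time unit."""
    return max(1.0, base_duration * learning_rate ** repetition)

print([round(duration(10.0, r), 2) for r in range(5)])  # [10.0, 9.0, 8.1, 7.29, 6.56]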

Cite as

Duc Anh Le, Stéphanie Roussel, Christophe Lecoutre, and Anouck Chan. Learning Effect and Compound Activities in High Multiplicity RCPSP: Application to Satellite Production. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 18:1-18:25, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{le_et_al:LIPIcs.CP.2024.18,
  author =	{Le, Duc Anh and Roussel, St\'{e}phanie and Lecoutre, Christophe and Chan, Anouck},
  title =	{{Learning Effect and Compound Activities in High Multiplicity RCPSP: Application to Satellite Production}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{18:1--18:25},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.18},
  URN =		{urn:nbn:de:0030-drops-207037},
  doi =		{10.4230/LIPIcs.CP.2024.18},
  annote =	{Keywords: High-multiplicity Project Scheduling, Learning Effect, Compound Activities, Satellite Assembly Line, Constraint Programming, Symmetry Breaking}
}
Document
An Efficient Local Search Solver for Mixed Integer Programming

Authors: Peng Lin, Mengchuan Zou, and Shaowei Cai


Abstract
Mixed integer programming (MIP) is a fundamental model in operations research. Local search is a powerful method for solving hard problems, but the development of local search solvers for MIP remains underexplored. This work develops an efficient local search solver for MIP, called Local-MIP. We propose two new operators for MIP that adaptively modify variables for optimizing the objective function and satisfying constraints, respectively. Furthermore, we design a new weighting scheme to dynamically balance the priority between the objective function and each constraint, and propose a two-level scoring function structure to hierarchically guide the search for high-quality feasible solutions. Experiments are conducted on seven public benchmarks to compare Local-MIP with state-of-the-art MIP solvers, which demonstrate that Local-MIP significantly outperforms CPLEX, HiGHS, SCIP and Feasibility Jump, and is competitive with the most powerful commercial solver Gurobi. Moreover, Local-MIP establishes 4 new records for MIPLIB open instances.
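
To illustrate the general idea of hierarchically weighing constraint violation against the objective in local search, here is a generic sketch (not Local-MIP's actual scoring function; all names and numbers are ours):

def score(delta_violation, delta_objective, constraint_weight):
    """Generic two-level score of a candidate move: weighted feasibility progress
    dominates, and objective improvement breaks ties."""
    return (-constraint_weight * delta_violation, -delta_objective)

# Candidate moves as (change in total violation, change in objective value).
moves = {"fix_constraint": (-1.0, +2.0), "improve_objective": (0.0, -3.0)}
best = max(moves, key=lambda m: score(*moves[m], constraint_weight=10.0))
print(best)  # -> "fix_constraint": feasibility progress wins despite a worse objective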

Cite as

Peng Lin, Mengchuan Zou, and Shaowei Cai. An Efficient Local Search Solver for Mixed Integer Programming. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 19:1-19:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{lin_et_al:LIPIcs.CP.2024.19,
  author =	{Lin, Peng and Zou, Mengchuan and Cai, Shaowei},
  title =	{{An Efficient Local Search Solver for Mixed Integer Programming}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{19:1--19:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.19},
  URN =		{urn:nbn:de:0030-drops-207041},
  doi =		{10.4230/LIPIcs.CP.2024.19},
  annote =	{Keywords: Mixed Integer Programming, Local Search, Operator, Scoring Function}
}
Document
Constraint Modelling with LLMs Using In-Context Learning

Authors: Kostis Michailidis, Dimos Tsouros, and Tias Guns


Abstract
Constraint Programming (CP) allows for the modelling and solving of a wide range of combinatorial problems. However, modelling such problems using constraints over decision variables still requires significant expertise, both in conceptual thinking and syntactic use of modelling languages. In this work, we explore the potential of using pre-trained Large Language Models (LLMs) as coding assistants, to transform textual problem descriptions into concrete and executable CP specifications. We present different transformation pipelines with explicit intermediate representations, and we investigate the potential benefit of various retrieval-augmented example selection strategies for in-context learning. We evaluate our approach on two datasets from the literature, namely NL4Opt (optimisation) and Logic Grid Puzzles (satisfaction), and a heterogeneous set of exercises from a CP course. The results show that pre-trained LLMs have promising potential for initialising the modelling process, with retrieval-augmented in-context learning significantly enhancing their modelling capabilities.
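
A rough sketch of retrieval-augmented example selection for in-context learning, using a naive word-overlap similarity (a stand-in for the retrieval strategies studied; the data and names here are made up):

def select_examples(problem_text, example_bank, k=2):
    """Pick the k stored (description, model) pairs whose descriptions
    share the most words with the new problem description."""
    words = set(problem_text.lower().split())
    ranked = sorted(example_bank,
                    key=lambda ex: len(words & set(ex[0].lower().split())),
                    reverse=True)
    return ranked[:k]

bank = [("schedule n jobs on m machines", "jobshop_model"),
        ("seat guests so that no two rivals share a table", "seating_model"),
        ("pack items into bins of capacity c", "binpacking_model")]
print([name for _, name in select_examples("assign jobs to machines to minimise makespan", bank)])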

Cite as

Kostis Michailidis, Dimos Tsouros, and Tias Guns. Constraint Modelling with LLMs Using In-Context Learning. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 20:1-20:27, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{michailidis_et_al:LIPIcs.CP.2024.20,
  author =	{Michailidis, Kostis and Tsouros, Dimos and Guns, Tias},
  title =	{{Constraint Modelling with LLMs Using In-Context Learning}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{20:1--20:27},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.20},
  URN =		{urn:nbn:de:0030-drops-207053},
  doi =		{10.4230/LIPIcs.CP.2024.20},
  annote =	{Keywords: Constraint Modelling, Constraint Acquisition, Constraint Programming, Large Language Models, In-Context Learning, Natural Language Processing, Named Entity Recognition, Retrieval-Augmented Generation, Optimisation}
}
Document
Strengthening Relaxed Decision Diagrams for Maximum Independent Set Problem: Novel Variable Ordering and Merge Heuristics

Authors: Mohsen Nafar and Michael Römer


Abstract
Finding high-quality bounds is key to devising efficient exact solution approaches for Discrete Optimization (DO) problems. To this end, Decision Diagrams (DDs) provide strong and generic bounding mechanisms. This paper focuses on so-called relaxed DDs which, by merging nodes, over-approximate the solution space of DO problems and provide dual bounds the quality of which hinges upon the ordering of the variables in the DD compilation and on the selection of the nodes to merge. Addressing the Maximum Independent Set Problem, we present a novel dynamic variable ordering strategy relying on induced subgraphs of the original graph, and a new tie-based merge heuristic. In a set of computational experiments, we show that our strategies yield much stronger bounds than the standard state-of-the-art approaches. Furthermore, implementing our heuristics in a DD-based branch-and-bound, we reduce the solution times by around 33 % on average and by more than 50 % on hard instances.

Cite as

Mohsen Nafar and Michael Römer. Strengthening Relaxed Decision Diagrams for Maximum Independent Set Problem: Novel Variable Ordering and Merge Heuristics. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 21:1-21:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{nafar_et_al:LIPIcs.CP.2024.21,
  author =	{Nafar, Mohsen and R\"{o}mer, Michael},
  title =	{{Strengthening Relaxed Decision Diagrams for Maximum Independent Set Problem: Novel Variable Ordering and Merge Heuristics}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{21:1--21:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.21},
  URN =		{urn:nbn:de:0030-drops-207069},
  doi =		{10.4230/LIPIcs.CP.2024.21},
  annote =	{Keywords: Decision Diagram, Dynamic Programming, Maximum Independent Set Problem, Dual Bound}
}
Document
Learning Lagrangian Multipliers for the Travelling Salesman Problem

Authors: Augustin Parjadis, Quentin Cappart, Bistra Dilkina, Aaron Ferber, and Louis-Martin Rousseau


Abstract
Lagrangian relaxation is a versatile mathematical technique employed to relax constraints in an optimization problem, enabling the generation of dual bounds to prove the optimality of feasible solutions and the design of efficient propagators in constraint programming (such as the weighted circuit constraint). However, the conventional process of deriving Lagrangian multipliers (e.g., using subgradient methods) is often computationally intensive, limiting its practicality for large-scale or time-sensitive problems. To address this challenge, we propose an innovative unsupervised learning approach that harnesses the capabilities of graph neural networks to exploit the problem structure, aiming to generate accurate Lagrangian multipliers efficiently. We apply this technique to the well-known Held-Karp Lagrangian relaxation for the traveling salesman problem. The core idea is to predict accurate Lagrangian multipliers and to employ them as a warm start for generating Held-Karp relaxation bounds. These bounds are subsequently utilized to enhance the filtering process carried out by branch-and-bound algorithms. In contrast to much of the existing literature, which primarily focuses on finding feasible solutions, our approach operates on the dual side, demonstrating that learning can also accelerate the proof of optimality. We conduct experiments across various distributions of the metric traveling salesman problem, considering instances with up to 200 cities. The results illustrate that our approach can improve the filtering level of the weighted circuit global constraint, reduce the optimality gap by a factor of two for instances left unsolved at the timeout, and reduce the execution time for solved instances by 10%.
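
To make the Held-Karp relaxation concrete, here is a minimal pure-Python sketch of the classical 1-tree bound with a plain subgradient update on the node multipliers; in the paper's setting, the GNN prediction would replace the all-zero initialisation of the multipliers as a warm start. This is an illustrative sketch under those assumptions, not the authors' code.

# Minimal sketch: Held-Karp 1-tree lower bound for symmetric TSP (illustrative only).
import math

def one_tree_bound(dist, pi):
    """Lower bound for one vector of Lagrangian multipliers `pi`.

    A 1-tree is a spanning tree on nodes 1..n-1 plus the two cheapest edges
    incident to node 0, computed on penalised costs d[i][j] + pi[i] + pi[j].
    """
    n = len(dist)
    cost = lambda i, j: dist[i][j] + pi[i] + pi[j]
    # Prim's algorithm on nodes 1..n-1.
    in_tree, frontier, parent = {1}, {}, {}
    for v in range(2, n):
        frontier[v], parent[v] = cost(1, v), 1
    deg = [0] * n
    total = 0.0
    while len(in_tree) < n - 1:
        v = min(frontier, key=frontier.get)
        total += frontier.pop(v)
        deg[v] += 1; deg[parent[v]] += 1
        in_tree.add(v)
        for u in frontier:
            if cost(v, u) < frontier[u]:
                frontier[u], parent[u] = cost(v, u), v
    # Attach node 0 with its two cheapest edges.
    e = sorted(range(1, n), key=lambda j: cost(0, j))[:2]
    total += cost(0, e[0]) + cost(0, e[1])
    deg[0] = 2; deg[e[0]] += 1; deg[e[1]] += 1
    return total - 2.0 * sum(pi), deg

def held_karp_bound(dist, iters=100, step=1.0):
    """Subgradient ascent on the multipliers; a learned warm start would
    simply replace the all-zero initialisation of `pi`."""
    n = len(dist)
    pi, best = [0.0] * n, -math.inf
    for _ in range(iters):
        lb, deg = one_tree_bound(dist, pi)
        best = max(best, lb)
        if all(d == 2 for d in deg):        # the 1-tree is a tour: bound is tight
            break
        pi = [p + step * (d - 2) for p, d in zip(pi, deg)]
        step *= 0.95
    return best

# 4 cities on a square with unit sides and diagonal 2; the optimal tour costs 4.
D = [[0, 1, 2, 1], [1, 0, 1, 2], [2, 1, 0, 1], [1, 2, 1, 0]]
print(held_karp_bound(D))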

Cite as

Augustin Parjadis, Quentin Cappart, Bistra Dilkina, Aaron Ferber, and Louis-Martin Rousseau. Learning Lagrangian Multipliers for the Travelling Salesman Problem. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 22:1-22:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{parjadis_et_al:LIPIcs.CP.2024.22,
  author =	{Parjadis, Augustin and Cappart, Quentin and Dilkina, Bistra and Ferber, Aaron and Rousseau, Louis-Martin},
  title =	{{Learning Lagrangian Multipliers for the Travelling Salesman Problem}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{22:1--22:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.22},
  URN =		{urn:nbn:de:0030-drops-207076},
  doi =		{10.4230/LIPIcs.CP.2024.22},
  annote =	{Keywords: Lagrangian relaxation, unsupervised learning, graph neural network}
}
Document
Constraint Programming Model for Assembly Line Balancing and Scheduling with Walking Workers and Parallel Stations

Authors: Xavier Pucel and Stéphanie Roussel


Abstract
In the context of aircraft assembly lines, increasing the production rate and decreasing the operating costs are two important, and sometimes contradictory, objectives. In small assembly lines, sharing production resources across workstations is a simple and efficient way to reduce operating costs. Therefore, workers are not assigned to a unique workstation but can walk between them. On the other hand, parallelizing workstations is an efficient way to increase the production rate. However, the combination of both strategies creates complex conditions for tasks to access the production resources. This paper addresses the problem of allocating tasks to workstations and scheduling them in an assembly line where workers can freely walk across workstations, and where workstations can be organized in parallel. We model this novel problem with Constraint Programming. We evaluate it on real-world industrial use cases coming from aircraft manufacturers, as well as synthetic use cases adapted from the literature.

Cite as

Xavier Pucel and Stéphanie Roussel. Constraint Programming Model for Assembly Line Balancing and Scheduling with Walking Workers and Parallel Stations. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 23:1-23:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{pucel_et_al:LIPIcs.CP.2024.23,
  author =	{Pucel, Xavier and Roussel, St\'{e}phanie},
  title =	{{Constraint Programming Model for Assembly Line Balancing and Scheduling with Walking Workers and Parallel Stations}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{23:1--23:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.23},
  URN =		{urn:nbn:de:0030-drops-207087},
  doi =		{10.4230/LIPIcs.CP.2024.23},
  annote =	{Keywords: Constraint Programming, Assembly Line, Balancing and Scheduling, Parallel Workstations, Walking Workers}
}
Document
Latency-Aware 2-Opt Monotonic Local Search for Distributed Constraint Optimization

Authors: Ben Rachmut, Roie Zivan, and William Yeoh


Abstract
Researchers recently extended Distributed Constraint Optimization Problems (DCOPs) to Communication-Aware DCOPs (CA-DCOPs) so that they are applicable in scenarios in which messages can be arbitrarily delayed. Distributed asynchronous local search and inference algorithms designed for CA-DCOPs are less vulnerable to message latency than their counterparts for regular DCOPs. However, unlike local search algorithms for (regular) DCOPs that converge to k-opt solutions (with k > 1), that is, to solutions that cannot be improved by a group of k agents, local search CA-DCOP algorithms are limited to 1-opt solutions only. In this paper, we introduce Latency-Aware Monotonic Distributed Local Search-2 (LAMDLS-2), where agents form pairs and coordinate bilateral assignment replacements. LAMDLS-2 is monotonic, converges to a 2-opt solution, and is also robust to message latency, making it suitable for CA-DCOPs. Our results indicate that LAMDLS-2 converges faster than MGM-2, a benchmark algorithm, to a similar 2-opt solution, in various message latency scenarios.
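
A centralized, synchronous caricature of the pairwise coordination step is sketched below; the real LAMDLS-2 and MGM-2 are distributed, message-based algorithms that additionally cope with latency, none of which this toy loop reproduces.

# Minimal sketch: monotonic pairwise improvement on a binary-cost DCOP
# (a centralized caricature of the coordination step; the real algorithms
# are distributed and message-based).
import itertools

def pairwise_local_search(domains, costs, assignment, pairs, rounds=10):
    """costs[(i, j)][(vi, vj)] gives the cost of the constraint between i and j.

    In each round, every pre-formed pair (i, j) looks for the joint replacement
    of (x_i, x_j) that most reduces the cost of the constraints they touch;
    only strictly improving moves are applied, so the global cost is monotone.
    """
    def local_cost(i, j, vi, vj):
        total = 0
        for (a, b), table in costs.items():
            if a not in (i, j) and b not in (i, j):
                continue
            va = vi if a == i else (vj if a == j else assignment[a])
            vb = vi if b == i else (vj if b == j else assignment[b])
            total += table[(va, vb)]
        return total

    for _ in range(rounds):
        improved = False
        for i, j in pairs:
            base = local_cost(i, j, assignment[i], assignment[j])
            best = min(itertools.product(domains[i], domains[j]),
                       key=lambda vv: local_cost(i, j, *vv))
            if local_cost(i, j, *best) < base:
                assignment[i], assignment[j] = best
                improved = True
        if not improved:
            break
    return assignment

# Two "not equal"-style constraints (cost 1 when the values coincide).
doms = {1: [0, 1], 2: [0, 1], 3: [0, 1]}
tables = {(1, 2): {(a, b): int(a == b) for a in (0, 1) for b in (0, 1)},
          (2, 3): {(a, b): int(a == b) for a in (0, 1) for b in (0, 1)}}
print(pairwise_local_search(doms, tables, {1: 0, 2: 0, 3: 1}, pairs=[(1, 2), (2, 3)]))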

Cite as

Ben Rachmut, Roie Zivan, and William Yeoh. Latency-Aware 2-Opt Monotonic Local Search for Distributed Constraint Optimization. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 24:1-24:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{rachmut_et_al:LIPIcs.CP.2024.24,
  author =	{Rachmut, Ben and Zivan, Roie and Yeoh, William},
  title =	{{Latency-Aware 2-Opt Monotonic Local Search for Distributed Constraint Optimization}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{24:1--24:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.24},
  URN =		{urn:nbn:de:0030-drops-207096},
  doi =		{10.4230/LIPIcs.CP.2024.24},
  annote =	{Keywords: Distributed Constraint Optimization Problems, Distributed Local Search Algorithms, Latency Awareness, Multi-Agent Optimization}
}
Document
Combining Constraint Programming Reasoning with Large Language Model Predictions

Authors: Florian Régin, Elisabetta De Maria, and Alexandre Bonlarron


Abstract
Constraint Programming (CP) and Machine Learning (ML) face challenges in text generation due to CP’s struggle with implementing "meaning" and ML’s difficulty with structural constraints. This paper proposes a solution by combining both approaches and embedding a Large Language Model (LLM) in CP. The LLM handles word generation and meaning, while CP manages structural constraints. This approach builds on GenCP, an improved version of On-the-fly Constraint Programming Search (OTFS) using LLM-generated domains. Compared to Beam Search (BS), a standard NLP method, this combined approach (GenCP with LLM) is faster and produces better results, ensuring all constraints are satisfied. This fusion of CP and ML presents new possibilities for enhancing text generation under constraints.
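
The division of labour can be sketched as follows; propose_next_words is a hypothetical stand-in for an LLM call (not the paper's interface), and the hard structural constraints stand for the CP side of the combination.

# Minimal sketch: depth-first text generation where a (stubbed) language model
# supplies variable domains and hard structural constraints prune them.
def propose_next_words(prefix):
    # Hypothetical stand-in for an LLM call returning likely continuations,
    # ordered from most to least probable.
    vocabulary = {
        (): ["the", "a"],
        ("the",): ["cat", "dog", "the"],
        ("a",): ["cat", "dog"],
    }
    return vocabulary.get(tuple(prefix), ["sat", "slept", "cat"])

def satisfies(sentence, length):
    # Structural constraints handled on the CP side: exact length, no repeats.
    return len(sentence) == length and len(set(sentence)) == length

def generate(length, prefix=None):
    prefix = prefix or []
    if len(prefix) == length:
        return prefix if satisfies(prefix, length) else None
    for word in propose_next_words(prefix):
        if word in prefix:                 # constraint propagation: prune repeats
            continue
        result = generate(length, prefix + [word])
        if result is not None:
            return result
    return None                            # backtrack

print(" ".join(generate(3)))               # e.g. "the cat sat"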

Cite as

Florian Régin, Elisabetta De Maria, and Alexandre Bonlarron. Combining Constraint Programming Reasoning with Large Language Model Predictions. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 25:1-25:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{regin_et_al:LIPIcs.CP.2024.25,
  author =	{R\'{e}gin, Florian and De Maria, Elisabetta and Bonlarron, Alexandre},
  title =	{{Combining Constraint Programming Reasoning with Large Language Model Predictions}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{25:1--25:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.25},
  URN =		{urn:nbn:de:0030-drops-207109},
  doi =		{10.4230/LIPIcs.CP.2024.25},
  annote =	{Keywords: Solver and Tools, ML-augmented CP, Constrained Text Generation, ML alongside CO}
}
Document
Structure-Guided Local Improvement for Maximum Satisfiability

Authors: André Schidler and Stefan Szeider


Abstract
The enhanced performance of today’s MaxSAT solvers has elevated their appeal for many large-scale applications, notably in software analysis and computer-aided design. Our research delves into refining anytime MaxSAT solving by repeatedly identifying smaller subinstances, chosen based on the graphical structure of the instance, and solving them with an exact solver. We investigate various strategies to pinpoint these subinstances. This structure-guided selection of subinstances provides an exact solver with a high potential for improving the current solution. Our exhaustive experimental analyses contrast our methodology, as instantiated in our tool MaxSLIM, with previous studies and benchmark it against leading-edge MaxSAT solvers.
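
The local-improvement principle can be sketched as follows; the subinstance here is chosen at random and re-optimized by enumeration, whereas MaxSLIM selects it from the graphical structure of the instance and hands it to a complete MaxSAT solver. Illustrative only.

# Minimal sketch: anytime MaxSAT improvement by exactly re-optimizing small
# variable subsets while the rest of the assignment stays fixed.
import itertools, random

def cost(clauses, assignment):
    """Number of falsified clauses; literals are +v / -v, variables 1-based."""
    return sum(
        all((lit > 0) != assignment[abs(lit)] for lit in clause)
        for clause in clauses
    )

def local_improve(clauses, n_vars, subset_size=3, rounds=50, seed=0):
    rng = random.Random(seed)
    assignment = {v: rng.choice([False, True]) for v in range(1, n_vars + 1)}
    best = cost(clauses, assignment)
    for _ in range(rounds):
        subset = rng.sample(range(1, n_vars + 1), subset_size)
        # Exact re-optimization of the subinstance: enumerate all assignments
        # of the selected variables, keeping the remaining variables fixed.
        for values in itertools.product([False, True], repeat=subset_size):
            trial = dict(assignment)
            trial.update(zip(subset, values))
            c = cost(clauses, trial)
            if c < best:                    # keep only strict improvements
                assignment, best = trial, c
    return assignment, best

clauses = [[1, 2], [-1, 3], [-2, -3], [1, -3], [-1, 2, 3]]
print(local_improve(clauses, n_vars=3)[1])  # 0: all clauses can be satisfied here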

Cite as

André Schidler and Stefan Szeider. Structure-Guided Local Improvement for Maximum Satisfiability. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 26:1-26:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{schidler_et_al:LIPIcs.CP.2024.26,
  author =	{Schidler, Andr\'{e} and Szeider, Stefan},
  title =	{{Structure-Guided Local Improvement for Maximum Satisfiability}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{26:1--26:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.26},
  URN =		{urn:nbn:de:0030-drops-207112},
  doi =		{10.4230/LIPIcs.CP.2024.26},
  annote =	{Keywords: maximum satisfiability, large neighborhood search (LNS), SAT-based local improvement (SLIM), incomplete MaxSAT, graphical structure, metaheuristic}
}
Document
Efficient Implementation of the Global Cardinality Constraint with Costs

Authors: Margaux Schmied and Jean-Charles Régin


Abstract
The success of Constraint Programming relies partly on the global constraints and implementation of the associated filtering algorithms. Recently, new ideas emerged to improve these implementations in practice, especially regarding the all different constraint. In this paper, we consider the cardinality constraint with costs. The cardinality constraint is a generalization of the all different constraint that specifies the number of times each value must be taken by a given set of variables in a solution. The version with costs introduces an assignment cost and bounds the total sum of assignment costs. The arc consistency filtering algorithm of this constraint is difficult to use in practice, as it systematically searches for many shortest paths. We propose a new approach that works with upper bounds on shortest paths based on landmarks. This approach can be seen as a preprocessing step. It is fast and avoids, in practice, a large number of explicit computations of shortest paths.
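
The landmark idea in isolation can be sketched as follows: exact distances from a few landmark nodes are precomputed, and any shortest-path length is then bounded from above via the triangle inequality. How such bounds are plugged into the arc consistency filtering of the cost-GCC is the subject of the paper and is not shown here.

# Minimal sketch: landmark-based upper bounds on shortest-path lengths.
import heapq

def dijkstra(graph, source):
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u]:
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist

def landmark_upper_bound(from_landmarks, to_landmarks, u, v):
    """d(u, v) <= d(u, L) + d(L, v) for every landmark L (triangle inequality).

    `from_landmarks[L]` holds distances from L, `to_landmarks[L]` distances
    to L (computed on the reversed graph); for undirected graphs both coincide.
    """
    return min(to_landmarks[L].get(u, float("inf"))
               + from_landmarks[L].get(v, float("inf"))
               for L in from_landmarks)

# Undirected example graph as adjacency lists of (neighbor, weight).
G = {0: [(1, 2), (2, 5)], 1: [(0, 2), (2, 1), (3, 4)],
     2: [(0, 5), (1, 1), (3, 1)], 3: [(1, 4), (2, 1)]}
from_L = {L: dijkstra(G, L) for L in [2]}            # one landmark: node 2
print(landmark_upper_bound(from_L, from_L, 0, 3))    # prints 4; exact value is 4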

Cite as

Margaux Schmied and Jean-Charles Régin. Efficient Implementation of the Global Cardinality Constraint with Costs. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 27:1-27:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{schmied_et_al:LIPIcs.CP.2024.27,
  author =	{Schmied, Margaux and R\'{e}gin, Jean-Charles},
  title =	{{Efficient Implementation of the Global Cardinality Constraint with Costs}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{27:1--27:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.27},
  URN =		{urn:nbn:de:0030-drops-207126},
  doi =		{10.4230/LIPIcs.CP.2024.27},
  annote =	{Keywords: global constraint, filtering algorithm, cardinality constraints with costs, arc consistency}
}
Document
CP for Bin Packing with Multi-Core and GPUs

Authors: Fabio Tardivo, Laurent Michel, and Enrico Pontelli


Abstract
The BinPacking constraint models the requirements of many logistics, resource allocation, and production scheduling applications. This paper explores new avenues based on the impressive computational power of modern GPUs to propagate the BinPacking constraint. This work showcases how the perspective of massive parallelization can lead to novel approaches, such as the use of a portfolio of lower bounds, to enhance the pruning of the BinPacking constraints. It delivers insights into the design choices and challenges presented by the GPU platform for constraint propagation. The paper evaluates a GPU-accelerated propagator against both sequential and parallel CPU versions, as well as state-of-the-art approaches. Comparisons across various benchmarks from the literature show strong performance with respect to both CPU versions and the standard pruning approach. When compared to techniques based on Linear Programming, our approach proves valuable for large instances or when spending extensive time to obtain the best possible bound is not worthwhile.
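
As a rough illustration of the portfolio-of-lower-bounds idea, stripped of the GPU parallelism and of the constraint-propagation context, the sketch below evaluates two classic bin packing bounds, the continuous bound L1 and Martello and Toth's L2, and keeps the best. Illustrative only, not the paper's propagator.

# Minimal sketch: a tiny "portfolio" of bin packing lower bounds, evaluated
# sequentially on the CPU.
import math

def l1_bound(items, capacity):
    return math.ceil(sum(items) / capacity)

def l2_bound(items, capacity):
    """Martello-Toth bound L2: maximise a counting argument over a threshold alpha."""
    best = 0
    for alpha in range(0, capacity // 2 + 1):
        big    = [w for w in items if w > capacity - alpha]   # cannot share with any item >= alpha
        medium = [w for w in items if capacity - alpha >= w > capacity / 2]
        small  = [w for w in items if capacity / 2 >= w >= alpha]
        slack_in_medium_bins = len(medium) * capacity - sum(medium)
        overflow = max(0, math.ceil((sum(small) - slack_in_medium_bins) / capacity))
        best = max(best, len(big) + len(medium) + overflow)
    return best

def portfolio_bound(items, capacity):
    return max(l1_bound(items, capacity), l2_bound(items, capacity))

items, capacity = [7, 7, 6, 4, 4, 4, 3, 3, 2], 10
print(l1_bound(items, capacity), l2_bound(items, capacity),
      portfolio_bound(items, capacity))                      # 4 4 4 (optimum is 4)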

Cite as

Fabio Tardivo, Laurent Michel, and Enrico Pontelli. CP for Bin Packing with Multi-Core and GPUs. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 28:1-28:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{tardivo_et_al:LIPIcs.CP.2024.28,
  author =	{Tardivo, Fabio and Michel, Laurent and Pontelli, Enrico},
  title =	{{CP for Bin Packing with Multi-Core and GPUs}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{28:1--28:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.28},
  URN =		{urn:nbn:de:0030-drops-207138},
  doi =		{10.4230/LIPIcs.CP.2024.28},
  annote =	{Keywords: Constraint Propagation, Bin Packing, Parallelism, GPU, Lower Bounds}
}
Document
Mutational Fuzz Testing for Constraint Modeling Systems

Authors: Wout Vanroose, Ignace Bleukx, Jo Devriendt, Dimos Tsouros, Hélène Verhaeghe, and Tias Guns


Abstract
Constraint programming (CP) modeling languages, like MiniZinc, Essence and CPMpy, play a crucial role in making CP technology accessible to non-experts. Both solver-independent modeling frameworks and solvers themselves are complex pieces of software that can contain bugs, which undermines their usefulness. Mutational fuzz testing is a way to test complex systems by stochastically mutating input and verifying preserved properties of the mutated output. We investigate different mutations and verification methods that can be used on the constraint specifications directly. This includes methods proposed in the context of SMT problem specifications, as well as new methods related to global constraints, optimization, and solution counting/preservation. Our results show that such a fuzz testing approach improves the overall code coverage of a modeling system compared to only unit testing, and is able to find bugs in the whole toolchain, from the modeling language transformations themselves to the underlying solvers.
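
The metamorphic principle behind mutational fuzz testing can be sketched with a toy model and a brute-force oracle: apply a mutation that must preserve the solution count, re-solve, and flag any discrepancy as a bug. Real systems mutate MiniZinc/CPMpy expression trees and compare actual solvers; the model representation below is an illustrative assumption.

# Minimal sketch: mutate-and-verify fuzz testing against a brute-force oracle.
# A model is a list of predicates over a dict of variable assignments.
import itertools, random

def count_solutions(domains, constraints):
    names = list(domains)
    return sum(
        all(c(dict(zip(names, values))) for c in constraints)
        for values in itertools.product(*(domains[n] for n in names))
    )

def shuffle_constraints(constraints, rng):
    # Solution-count-preserving mutation: reorder the constraint list.
    mutated = constraints[:]
    rng.shuffle(mutated)
    return mutated

def add_equivalent_constraint(constraints, rng):
    # Solution-count-preserving mutation: add a rewritten copy of an existing
    # constraint (c or False is logically equivalent to c).
    c = rng.choice(constraints)
    return constraints + [lambda a, c=c: c(a) or False]

def fuzz(domains, constraints, mutations, trials=20, seed=0):
    rng = random.Random(seed)
    reference = count_solutions(domains, constraints)
    for _ in range(trials):
        mutated = rng.choice(mutations)(constraints, rng)
        if count_solutions(domains, mutated) != reference:
            return "bug found"              # the preserved property was violated
    return "no discrepancy"

domains = {"x": range(3), "y": range(3)}
constraints = [lambda a: a["x"] != a["y"], lambda a: a["x"] + a["y"] <= 3]
print(fuzz(domains, constraints, [shuffle_constraints, add_equivalent_constraint]))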

Cite as

Wout Vanroose, Ignace Bleukx, Jo Devriendt, Dimos Tsouros, Hélène Verhaeghe, and Tias Guns. Mutational Fuzz Testing for Constraint Modeling Systems. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 29:1-29:25, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{vanroose_et_al:LIPIcs.CP.2024.29,
  author =	{Vanroose, Wout and Bleukx, Ignace and Devriendt, Jo and Tsouros, Dimos and Verhaeghe, H\'{e}l\`{e}ne and Guns, Tias},
  title =	{{Mutational Fuzz Testing for Constraint Modeling Systems}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{29:1--29:25},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.29},
  URN =		{urn:nbn:de:0030-drops-207149},
  doi =		{10.4230/LIPIcs.CP.2024.29},
  annote =	{Keywords: fuzz testing, Constraint modeling language, bugs, mutational testing, modeling, constraint reformulation}
}
Document
Learning Precedences for Scheduling Problems with Graph Neural Networks

Authors: Hélène Verhaeghe, Quentin Cappart, Gilles Pesant, and Claude-Guy Quimper


Abstract
The resource constrained project scheduling problem (RCPSP) consists of scheduling a finite set of resource-consuming tasks within a temporal horizon subject to resource capacities and precedence relations between pairs of tasks. It is NP-hard and many techniques have been introduced to improve the efficiency of CP solvers to solve it. The problem is naturally represented as a directed graph, commonly referred to as the precedence graph, by linking pairs of tasks subject to a precedence. In this paper, we propose to leverage the ability of graph neural networks to extract knowledge from precedence graphs. This is carried out by learning new precedences that can be used either to add new constraints or to design a dedicated variable-selection heuristic. Experiments carried out on RCPSP instances from PSPLIB show the potential of learning to predict precedences and how they can help speed up the search for solutions by a CP solver.

Cite as

Hélène Verhaeghe, Quentin Cappart, Gilles Pesant, and Claude-Guy Quimper. Learning Precedences for Scheduling Problems with Graph Neural Networks. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 30:1-30:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{verhaeghe_et_al:LIPIcs.CP.2024.30,
  author =	{Verhaeghe, H\'{e}l\`{e}ne and Cappart, Quentin and Pesant, Gilles and Quimper, Claude-Guy},
  title =	{{Learning Precedences for Scheduling Problems with Graph Neural Networks}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{30:1--30:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.30},
  URN =		{urn:nbn:de:0030-drops-207150},
  doi =		{10.4230/LIPIcs.CP.2024.30},
  annote =	{Keywords: Scheduling, Precedence graph, Graph neural network}
}
Document
Inverting Step-Reduced SHA-1 and MD5 by Parameterized SAT Solvers

Authors: Oleg Zaikin


Abstract
MD5 and SHA-1 are fundamental cryptographic hash functions proposed in the 1990s. Given a message of arbitrary finite size, MD5 produces a 128-bit hash in 64 steps, while SHA-1 produces a 160-bit hash in 80 steps. It is computationally infeasible to invert MD5 and SHA-1, i.e., to find a message given a hash. In 2012, 28-step MD5 and 23-step SHA-1 were inverted by CDCL solvers, yet no progress has been made since then. The present paper proposes to construct 31 intermediate inverse problems for any pair of MD5 or SHA-1 steps (i,i+1), such that the first problem is very close to inverting i steps, while the 31st one is almost inverting i+1 steps. We constructed SAT encodings of intermediate problems for MD5 and SHA-1, and tuned a CDCL solver on the simplest of them. Then the tuned solver was used to design a parallel Cube-and-Conquer solver which for the first time inverted 29-step MD5 and 24-step SHA-1.

Cite as

Oleg Zaikin. Inverting Step-Reduced SHA-1 and MD5 by Parameterized SAT Solvers. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 31:1-31:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{zaikin:LIPIcs.CP.2024.31,
  author =	{Zaikin, Oleg},
  title =	{{Inverting Step-Reduced SHA-1 and MD5 by Parameterized SAT Solvers}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{31:1--31:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.31},
  URN =		{urn:nbn:de:0030-drops-207165},
  doi =		{10.4230/LIPIcs.CP.2024.31},
  annote =	{Keywords: cryptographic hash function, MD5, SHA-1, preimage attack, SAT, Cube-and-Conquer}
}
Document
Solving LBBD Master Problems with Constraint Programming and Domain-Independent Dynamic Programming

Authors: Jiachen Zhang and J. Christopher Beck


Abstract
We investigate using Constraint Programming (CP) and Domain-Independent Dynamic Programming (DIDP) to solve the master problem in Logic-based Benders Decomposition (LBBD) models, in particular addressing the challenge of feasibility cut formulation. For CP, we exploit key variable manipulation, constraint-based expressions, and global constraints to construct three combinatorial cut encodings. For the state-based DIDP model, we propose two cut encoding approaches: using additional preconditions of state transitions or adding state constraints. Each of these approaches can be modeled using integer numeric variables or set variables, resulting in four novel encodings. We apply the three CP variants and four DIDP variants to simple assembly line balancing problems with sequence-dependent setup times type-1 (SUALBP-1). Experimental results show all approaches outperform a mixed-integer programming (MIP) based master problem and the state-of-the-art monolithic MIP model, with the three CP variants being superior to all of the DIDP approaches.
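
The overall LBBD loop, independent of how the master problem is encoded, can be sketched as follows; a brute-force master stands in for the CP and DIDP models studied in the paper, the subproblem is a simple capacity check, and the instance data and combinatorial feasibility-cut form are illustrative assumptions.

# Minimal sketch: a logic-based Benders decomposition loop with combinatorial
# feasibility cuts (illustrative only).
import itertools

durations = {"t1": 4, "t2": 3, "t3": 3, "t4": 2}
assign_cost = {("t1", "m1"): 1, ("t1", "m2"): 3, ("t2", "m1"): 2, ("t2", "m2"): 1,
               ("t3", "m1"): 2, ("t3", "m2"): 2, ("t4", "m1"): 3, ("t4", "m2"): 1}
machines, horizon = ["m1", "m2"], 6

def solve_master(cuts):
    """Cheapest task-to-machine assignment violating no recorded cut (brute force)."""
    best = None
    for values in itertools.product(machines, repeat=len(durations)):
        assignment = dict(zip(durations, values))
        if any(all(assignment[t] == m for t in tasks) for m, tasks in cuts):
            continue
        c = sum(assign_cost[t, m] for t, m in assignment.items())
        if best is None or c < best[0]:
            best = (c, assignment)
    return best

def subproblem_cut(assignment):
    """Feasibility check: return a cut for an overloaded machine, if any."""
    for m in machines:
        tasks = [t for t, mm in assignment.items() if mm == m]
        if sum(durations[t] for t in tasks) > horizon:
            return (m, tasks)               # cut: "not all of these tasks on m"
    return None

cuts = []
while True:
    cost, assignment = solve_master(cuts)
    cut = subproblem_cut(assignment)
    if cut is None:
        print("optimal:", cost, assignment)
        break
    cuts.append(cut)                        # master is re-solved with the new cut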

Cite as

Jiachen Zhang and J. Christopher Beck. Solving LBBD Master Problems with Constraint Programming and Domain-Independent Dynamic Programming. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 32:1-32:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{zhang_et_al:LIPIcs.CP.2024.32,
  author =	{Zhang, Jiachen and Beck, J. Christopher},
  title =	{{Solving LBBD Master Problems with Constraint Programming and Domain-Independent Dynamic Programming}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{32:1--32:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.32},
  URN =		{urn:nbn:de:0030-drops-207171},
  doi =		{10.4230/LIPIcs.CP.2024.32},
  annote =	{Keywords: constraint programming, domain-independent dynamic programming, logic-based Benders decomposition, assembly line balancing, sequence-dependent setup}
}
Document
Ex-Ante Constraint Elicitation in Incomplete DCOPs

Authors: Roie Zivan, Shiraz Regev, and William Yeoh


Abstract
Distributed Constraint Optimization Problems (DCOPs) are a framework for representing and solving distributed combinatorial problems, where agents exchange messages to assign variables they own, such that the sum of constraint costs is minimized. When agents represent people (e.g., in meeting scheduling problems), the constraint information that the agents hold may be incomplete. For such scenarios, researchers proposed Incomplete DCOPs (I-DCOPs), which allow agents to elicit from their human users some of the missing information. Existing I-DCOP approaches evaluate solutions not only by their quality, but also by the elicitation costs spent to find them (ex-post). Unfortunately, this may result in the agents spending a lot of effort (in terms of elicitation costs) to find high-quality solutions, and then ignoring them because previous lower-quality solutions were found with less effort. Therefore, we propose a different approach for solving I-DCOPs by evaluating solutions based on their quality and considering the elicitation cost beforehand (ex-ante). Agents are limited in the amount of information that they can elicit and, therefore, need to make smart decisions on choosing which missing information to elicit. We propose several heuristics for making these decisions. Our results indicate that some of the proposed heuristics produce high-quality solutions, which significantly outperform the previously proposed ex-post heuristics.

Cite as

Roie Zivan, Shiraz Regev, and William Yeoh. Ex-Ante Constraint Elicitation in Incomplete DCOPs. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 33:1-33:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{zivan_et_al:LIPIcs.CP.2024.33,
  author =	{Zivan, Roie and Regev, Shiraz and Yeoh, William},
  title =	{{Ex-Ante Constraint Elicitation in Incomplete DCOPs}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{33:1--33:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.33},
  URN =		{urn:nbn:de:0030-drops-207182},
  doi =		{10.4230/LIPIcs.CP.2024.33},
  annote =	{Keywords: Distributed Constraint Optimization Problems, Preference Elicitation, Multi-Agent Optimization}
}
Document
Short Paper
Minimizing Working-Group Conflicts in Conference Session Scheduling Through Maximum Satisfiability (Short Paper)

Authors: Sami Cherif, Heythem Sattoutah, Chu-Min Li, Corinne Lucet, and Laure Brisoux-Devendeville


Abstract
This paper explores the application of Maximum Satisfiability (Max-SAT) to the complex problem of conference session scheduling, with a particular focus on minimizing working-group conflicts within the context of the ROADEF conference, the largest French-speaking event aimed at bringing together researchers from various fields such as combinatorial optimization and operational research. A Max-SAT model is introduced, then enhanced with new variables, and solved using state-of-the-art solvers. The results of applying our formulation to data from ROADEF demonstrate its ability to effectively compute session schedules, while reducing the number of conflicts and the maximum number of parallel sessions compared to the handmade solutions proposed by the organizing committees. These findings underscore the potential of Max-SAT as a valuable tool for optimizing conference scheduling processes, offering a systematic and efficient solution that ensures a smoother and more productive experience for attendees and organizers alike.
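
A toy version of such a formulation can be sketched as follows, assuming the PySAT library and its RC2 MaxSAT solver as tooling (an assumption about tooling; the paper's model and the ROADEF data are richer): hard clauses assign each session to exactly one slot, and soft clauses penalize placing conflicting sessions in the same slot.

# Minimal sketch: session-to-slot scheduling as weighted partial MaxSAT,
# solved with PySAT's RC2 (assumed tooling; illustrative data).
from pysat.formula import WCNF
from pysat.examples.rc2 import RC2

sessions, slots = ["S1", "S2", "S3"], ["T1", "T2"]
# Pairs of sessions whose working groups overlap: scheduling them in the same
# slot incurs the given penalty.
conflicts = {("S1", "S2"): 3, ("S2", "S3"): 1}

var = {}
def x(s, t):                                 # Boolean "session s is in slot t"
    return var.setdefault((s, t), len(var) + 1)

wcnf = WCNF()
for s in sessions:
    wcnf.append([x(s, t) for t in slots])                    # at least one slot (hard)
    for i, t in enumerate(slots):
        for u in slots[i + 1:]:
            wcnf.append([-x(s, t), -x(s, u)])                # at most one slot (hard)
for (s1, s2), weight in conflicts.items():
    for t in slots:                                          # soft: avoid the clash
        wcnf.append([-x(s1, t), -x(s2, t)], weight=weight)

solver = RC2(wcnf)
model = set(solver.compute())
print("total conflict cost:", solver.cost)
for (s, t), v in var.items():
    if v in model:
        print(s, "->", t)
solver.delete()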

Cite as

Sami Cherif, Heythem Sattoutah, Chu-Min Li, Corinne Lucet, and Laure Brisoux-Devendeville. Minimizing Working-Group Conflicts in Conference Session Scheduling Through Maximum Satisfiability (Short Paper). In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 34:1-34:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{cherif_et_al:LIPIcs.CP.2024.34,
  author =	{Cherif, Sami and Sattoutah, Heythem and Li, Chu-Min and Lucet, Corinne and Brisoux-Devendeville, Laure},
  title =	{{Minimizing Working-Group Conflicts in Conference Session Scheduling Through Maximum Satisfiability}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{34:1--34:11},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.34},
  URN =		{urn:nbn:de:0030-drops-207190},
  doi =		{10.4230/LIPIcs.CP.2024.34},
  annote =	{Keywords: Maximum Satisfiability, Scheduling, Modeling}
}
Document
Short Paper
On the Complexity of Integer Programming with Fixed-Coefficient Scaling (Short Paper)

Authors: Jorke M. de Vlas


Abstract
We give a polynomial-time algorithm that solves a CSP over 𝐙 with linear inequalities of the form c^{a₁} x - c^{a₂} y ≤ b where x and y are variables, a₁, a₂ and b are parameters, and c is a fixed constant. This is a step in classifying the complexity of CSP(Γ) for first-order reducts Γ from (𝐙, <, +, 1). The algorithm works by first reducing the infinite domain to a finite domain by inferring an upper bound on the size of the smallest solution, then repeatedly merging consecutive constraints into new constraints, and finally solving the problem using arc consistency.

Cite as

Jorke M. de Vlas. On the Complexity of Integer Programming with Fixed-Coefficient Scaling (Short Paper). In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 35:1-35:9, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{devlas:LIPIcs.CP.2024.35,
  author =	{de Vlas, Jorke M.},
  title =	{{On the Complexity of Integer Programming with Fixed-Coefficient Scaling}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{35:1--35:9},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.35},
  URN =		{urn:nbn:de:0030-drops-207203},
  doi =		{10.4230/LIPIcs.CP.2024.35},
  annote =	{Keywords: constraint satisfaction problems, integer programming, CSP dichotomy}
}
Document
Short Paper
Black-Box Value Heuristics for Solving Optimization Problems with Constraint Programming (Short Paper)

Authors: Augustin Delecluse and Pierre Schaus


Abstract
Significant research efforts have focused on black-box variable selection, with less attention given to value heuristics. An ideal value heuristic enables depth-first search to prioritize high-quality solutions first. The Bound-Impact Value Selection achieves this goal through a look-ahead strategy, trying every value of the selected variable and ranking them based on their impact on the objective. However, this method is generally too computationally intensive for the entire search tree. We introduce two simple yet powerful modifications to improve its scalability. First, a lighter fixpoint computation involves only the constraints on the shortest path in the constraint graph between the variable and the objective. Second, a reverse look-ahead strategy optimistically fixes the objective variable to its minimum in order to prioritize the remaining values. These two ideas have been empirically validated on a range of academic problems and in the XCSP³ competition, demonstrating significant improvements in scalability.
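
The look-ahead ranking at the core of Bound-Impact Value Selection can be sketched generically as follows; propagate and objective_lb are stand-ins for the solver's own fixpoint and bounding machinery, and neither of the paper's two modifications (the lighter fixpoint and the reverse look-ahead) is reproduced here.

# Minimal sketch: bound-impact value ordering (generic look-ahead ranking).
def rank_values(variable, domains, propagate, objective_lb):
    """Return the values of `variable` ordered by the objective lower bound
    obtained after tentatively assigning them and propagating.

    `propagate(domains)` must return pruned domains or None on failure;
    `objective_lb(domains)` must return a lower bound on the objective.
    """
    scored = []
    for value in domains[variable]:
        trial = dict(domains)
        trial[variable] = {value}           # tentative assignment
        trial = propagate(trial)            # look-ahead: run the fixpoint
        if trial is not None:               # failed values are simply dropped
            scored.append((objective_lb(trial), value))
    scored.sort()                           # most promising bound first
    return [value for _, value in scored]

# Toy model: minimize x + y subject to x != y.
def propagate(domains):
    if len(domains["x"]) == 1:
        domains = {**domains, "y": domains["y"] - domains["x"]}
    return domains if all(domains.values()) else None

def objective_lb(domains):
    return min(domains["x"]) + min(domains["y"])

print(rank_values("x", {"x": {0, 1, 2}, "y": {0, 1, 2}}, propagate, objective_lb))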

Cite as

Augustin Delecluse and Pierre Schaus. Black-Box Value Heuristics for Solving Optimization Problems with Constraint Programming (Short Paper). In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 36:1-36:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{delecluse_et_al:LIPIcs.CP.2024.36,
  author =	{Delecluse, Augustin and Schaus, Pierre},
  title =	{{Black-Box Value Heuristics for Solving Optimization Problems with Constraint Programming}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{36:1--36:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.36},
  URN =		{urn:nbn:de:0030-drops-207214},
  doi =		{10.4230/LIPIcs.CP.2024.36},
  annote =	{Keywords: Constraint Programming, Value Selection, Look-Ahead, Optimization}
}
Document
Short Paper
Computing Small Rainbow Cycle Numbers with SAT Modulo Symmetries (Short Paper)

Authors: Markus Kirchweger and Stefan Szeider


Abstract
Envy-freeness up to any good (EFX) is a key concept in Computational Social Choice for the fair division of indivisible goods, where no agent envies another’s allocation after removing any single item. A deeper understanding of EFX allocations is facilitated by exploring the rainbow cycle number (R_f(d)), the largest number of independent sets in a certain class of directed graphs. Upper bounds on R_f(d) provide guarantees to the feasibility of EFX allocations (Chaudhury et al., EC 2021). In this work, we precisely compute the numbers R_f(d) for small values of d, employing the SAT modulo Symmetries framework (Kirchweger and Szeider, CP 2021). SAT modulo Symmetries is tailored specifically for the constraint-based isomorph-free generation of combinatorial structures. We provide an efficient encoding for the rainbow cycle number, comparing eager and lazy approaches. To cope with the huge search space, we extend the encoding with invariant pruning, a new method that significantly speeds up computation.

Cite as

Markus Kirchweger and Stefan Szeider. Computing Small Rainbow Cycle Numbers with SAT Modulo Symmetries (Short Paper). In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 37:1-37:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{kirchweger_et_al:LIPIcs.CP.2024.37,
  author =	{Kirchweger, Markus and Szeider, Stefan},
  title =	{{Computing Small Rainbow Cycle Numbers with SAT Modulo Symmetries}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{37:1--37:11},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.37},
  URN =		{urn:nbn:de:0030-drops-207221},
  doi =		{10.4230/LIPIcs.CP.2024.37},
  annote =	{Keywords: EFX, rainbow cycle number, SAT modulo Symmetries, combinatorial search}
}
Document
Short Paper
Frugal Algorithm Selection (Short Paper)

Authors: Erdem Kuş, Özgür Akgün, Nguyen Dang, and Ian Miguel


Abstract
When solving decision and optimisation problems, many competing algorithms (model and solver choices) have complementary strengths. Typically, there is no single algorithm that works well for all instances of a problem. Automated algorithm selection has been shown to work very well for choosing a suitable algorithm for a given instance. However, the cost of training can be prohibitively large due to running candidate algorithms on a representative set of training instances. In this work, we explore reducing this cost by choosing a subset of the training instances on which to train. We approach this problem in three ways: using active learning to decide based on prediction uncertainty, augmenting the algorithm predictors with a timeout predictor, and collecting training data using a progressively increasing timeout. We evaluate combinations of these approaches on six datasets from ASLib and present the reduction in labelling cost achieved by each option.

Cite as

Erdem Kuş, Özgür Akgün, Nguyen Dang, and Ian Miguel. Frugal Algorithm Selection (Short Paper). In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 38:1-38:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{kus_et_al:LIPIcs.CP.2024.38,
  author =	{Ku\c{s}, Erdem and Akg\"{u}n, \"{O}zg\"{u}r and Dang, Nguyen and Miguel, Ian},
  title =	{{Frugal Algorithm Selection}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{38:1--38:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.38},
  URN =		{urn:nbn:de:0030-drops-207239},
  doi =		{10.4230/LIPIcs.CP.2024.38},
  annote =	{Keywords: Algorithm Selection, Active Learning}
}
Document
Short Paper
An Investigation of Generic Approaches to Large Neighbourhood Search (Short Paper)

Authors: Filipe Souza, Diarmuid Grimes, and Barry O'Sullivan


Abstract
A bottleneck in the more widespread use of approaches such as Large Neighborhood Search is the need for domain-specific knowledge. To this end, a number of generic LNS methods have previously been proposed that automate the selection of variables in the neighborhood with the aim of reducing the expertise requirement. Recently a new generic approach, Improved Variable-Relationship Guided LNS (iVRG), was proposed that showed promising initial results. This method combines static information regarding problem structure and dynamic information from search performance in its neighborhood selection. In this work, we first show the generalisability of the approach by comparing it on two widely studied problems, car sequencing and steel mill slab, where it outperformed existing generic approaches. We then provide a detailed examination of iVRG, investigating its key components (static/dynamic information, the use of a Tournament Selection operator) to assess their individual impact and provide insight into iVRG's overall behavior.
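
A generic destroy-and-repair LNS skeleton is sketched below to show where static (structural) and dynamic (search-performance) information can bias the choice of variables to relax, which is the design space iVRG explores; the scoring scheme and the stubbed re-optimization step are illustrative assumptions, not the paper's method.

# Minimal sketch: a generic LNS loop whose neighborhood selection mixes
# static (structural) and dynamic (search-performance) scores.
import random

def lns(variables, initial, evaluate, reoptimize, static_score,
        relax_size=3, iterations=100, seed=0):
    rng = random.Random(seed)
    best, best_cost = dict(initial), evaluate(initial)
    dynamic_score = {v: 1.0 for v in variables}          # updated from successes
    for _ in range(iterations):
        weights = [static_score[v] * dynamic_score[v] for v in variables]
        relaxed = set()
        while len(relaxed) < relax_size:                 # weighted neighborhood pick
            relaxed.add(rng.choices(variables, weights=weights, k=1)[0])
        candidate = reoptimize(best, relaxed)            # stub for a CP solve over
        cost = evaluate(candidate)                       # the relaxed variables only
        for v in relaxed:                                # reward/penalize the choice
            dynamic_score[v] *= 1.2 if cost < best_cost else 0.95
        if cost < best_cost:
            best, best_cost = dict(candidate), cost
    return best, best_cost

# Toy usage: the stub "solver" sets relaxed variables to their (hidden) ideal value.
vars_ = ["a", "b", "c", "d"]
target = {"a": 3, "b": 1, "c": 4, "d": 1}
evaluate = lambda sol: sum(abs(sol[v] - target[v]) for v in vars_)
reoptimize = lambda sol, relaxed: {v: (target[v] if v in relaxed else sol[v])
                                   for v in vars_}
print(lns(vars_, {v: 0 for v in vars_}, evaluate, reoptimize,
          static_score={v: 1.0 for v in vars_})[1])      # typically reaches cost 0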

Cite as

Filipe Souza, Diarmuid Grimes, and Barry O'Sullivan. An Investigation of Generic Approaches to Large Neighbourhood Search (Short Paper). In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 39:1-39:10, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{souza_et_al:LIPIcs.CP.2024.39,
  author =	{Souza, Filipe and Grimes, Diarmuid and O'Sullivan, Barry},
  title =	{{An Investigation of Generic Approaches to Large Neighbourhood Search}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{39:1--39:10},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.39},
  URN =		{urn:nbn:de:0030-drops-207248},
  doi =		{10.4230/LIPIcs.CP.2024.39},
  annote =	{Keywords: Combinatorial Optimization, Metaheuristics, Large Neighborhood Search (LNS), Machine Reassignment Problem, Car Sequencing Problem, Steel Mill Slab Problem}
}
Document
Short Paper
Encoding the Hamiltonian Cycle Problem into SAT Based on Vertex Elimination (Short Paper)

Authors: Neng-Fa Zhou


Abstract
This paper presents a SAT encoding, called vertex elimination encoding (VEE), for the Hamiltonian Cycle Problem (HCP). The encoding maps a Hamiltonian cycle in the reduced graph after vertex elimination to a Hamiltonian cycle in the original graph. While VEE is not competitive for large dense graphs due to its large encoding sizes, it can be utilized to reduce graphs when they are sparse. This paper compares VEE with the distance encoding, and shows that the hybridization of these two encodings is effective for the benchmarks. For the knight’s tour problem, in particular, the hybrid encoding solves some middle-sized instances that were beyond the reach of previous eager SAT encodings.
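
The graph reduction behind the encoding can be illustrated on a single eliminated vertex: its neighbours are connected pairwise, each fill edge remembers the detour through the eliminated vertex, and a cycle of the reduced graph maps back to the original graph provided each eliminated vertex is crossed by exactly one of its fill edges (a condition the SAT encoding, not this sketch, must enforce). The sketch below assumes fill edges do not already exist in the graph and handles only one elimination; the paper's encoding is more general.

# Minimal sketch: eliminating one vertex for the Hamiltonian Cycle Problem and
# mapping a cycle of the reduced graph back to the original graph.
def eliminate(edges, v):
    """Remove v; connect its neighbours pairwise, remembering the detour via v."""
    nbrs = sorted({b for a, b in edges if a == v} | {a for a, b in edges if b == v})
    reduced = {e for e in edges if v not in e}
    via = {}
    for i, a in enumerate(nbrs):
        for b in nbrs[i + 1:]:
            if (a, b) not in reduced and (b, a) not in reduced:
                reduced.add((a, b))
                via[(a, b)] = v              # edge (a, b) stands for the path a-v-b
    return reduced, via

def expand(cycle, via):
    """Reinsert eliminated vertices into a cycle given as a vertex list."""
    out, crossings = [], {}
    for a, b in zip(cycle, cycle[1:] + cycle[:1]):
        out.append(a)
        w = via.get((a, b), via.get((b, a)))
        if w is not None:                    # the reduced edge was a detour via w
            out.append(w)
            crossings[w] = crossings.get(w, 0) + 1
    ok = all(crossings.get(w, 0) == 1 for w in set(via.values()))
    return out if ok else None

edges = {(0, 1), (1, 2), (2, 3), (3, 0)}     # a 4-cycle
reduced, via = eliminate(edges, 2)           # neighbours of vertex 2 are 1 and 3
print(sorted(reduced))                       # [(0, 1), (1, 3), (3, 0)]
print(expand([0, 1, 3], via))                # [0, 1, 2, 3]: cycle of the original graph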

Cite as

Neng-Fa Zhou. Encoding the Hamiltonian Cycle Problem into SAT Based on Vertex Elimination (Short Paper). In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 40:1-40:8, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{zhou:LIPIcs.CP.2024.40,
  author =	{Zhou, Neng-Fa},
  title =	{{Encoding the Hamiltonian Cycle Problem into SAT Based on Vertex Elimination}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{40:1--40:8},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.40},
  URN =		{urn:nbn:de:0030-drops-207258},
  doi =		{10.4230/LIPIcs.CP.2024.40},
  annote =	{Keywords: Graph constraints, the Hamiltonian cycle problem, SAT encoding, Vertex elimination, Graph synthesis}
}
