5 Search Results for "Guns, Tias"


Document
Simplifying Step-Wise Explanation Sequences

Authors: Ignace Bleukx, Jo Devriendt, Emilio Gamba, Bart Bogaerts, and Tias Guns

Published in: LIPIcs, Volume 280, 29th International Conference on Principles and Practice of Constraint Programming (CP 2023)


Abstract
Explaining constraint programs is useful for debugging an unsatisfiable program, understanding why a given solution is optimal, or understanding how a unique solution can be found. A recently proposed framework for explaining constraint programs works well to explain the unique solution to a problem step by step. It can also be used to step-wise explain why a model is unsatisfiable, but this may create redundant steps and introduce superfluous information into the explanation sequence. This paper proposes methods to simplify a (step-wise) explanation sequence, generating simple steps that together form a short, interpretable sequence. We propose an algorithm to greedily construct an initial sequence and two filtering algorithms that eliminate redundant steps and unnecessarily complex parts of explanation sequences. Experiments on diverse benchmark instances show that our techniques can significantly simplify step-wise explanation sequences.
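
The greedy construction can be pictured with a short, hypothetical Python sketch (the step representation, cost function, and candidate generation below are illustrative placeholders, not the paper's data structures): at each iteration, the cheapest step whose preconditions are already derived is appended to the sequence, until all target facts are derived.

# Hypothetical sketch of greedy sequence construction.
# A "step" is a tuple (used_constraints, used_facts, new_fact); cost is any
# function scoring how complex a step is (e.g., number of constraints used).
def greedy_sequence(candidate_steps, cost, target_facts):
    derived, sequence = set(), []
    while not target_facts <= derived:
        # keep only steps whose required facts have already been derived
        applicable = [s for s in candidate_steps
                      if set(s[1]) <= derived and s[2] not in derived]
        if not applicable:
            break                                # nothing left to derive
        step = min(applicable, key=cost)         # greedily take the cheapest step
        sequence.append(step)
        derived.add(step[2])
    return sequence

The paper's two filtering algorithms would then post-process such a sequence; they are not sketched here.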

Cite as

Ignace Bleukx, Jo Devriendt, Emilio Gamba, Bart Bogaerts, and Tias Guns. Simplifying Step-Wise Explanation Sequences. In 29th International Conference on Principles and Practice of Constraint Programming (CP 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 280, pp. 11:1-11:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{bleukx_et_al:LIPIcs.CP.2023.11,
  author =	{Bleukx, Ignace and Devriendt, Jo and Gamba, Emilio and Bogaerts, Bart and Guns, Tias},
  title =	{{Simplifying Step-Wise Explanation Sequences}},
  booktitle =	{29th International Conference on Principles and Practice of Constraint Programming (CP 2023)},
  pages =	{11:1--11:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-300-3},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{280},
  editor =	{Yap, Roland H. C.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2023.11},
  URN =		{urn:nbn:de:0030-drops-190489},
  doi =		{10.4230/LIPIcs.CP.2023.11},
  annote =	{Keywords: explanation, deduction, constraint programming, propagation}
}
Document
Guided Bottom-Up Interactive Constraint Acquisition

Authors: Dimosthenis C. Tsouros, Senne Berden, and Tias Guns

Published in: LIPIcs, Volume 280, 29th International Conference on Principles and Practice of Constraint Programming (CP 2023)


Abstract
Constraint Acquisition (CA) systems can be used to assist in the modeling of constraint satisfaction problems. In (inter)active CA, the system is given a set of candidate constraints and posts queries to the user with the goal of finding the right constraints among the candidates. Current interactive CA algorithms suffer from at least two major bottlenecks. First, in order to converge, they require a large number of queries to the user. Second, they cannot handle large sets of candidate constraints, since these lead to long waiting times for the user. For this reason, the user must have fairly precise knowledge about what constraints the system should consider. In this paper, we alleviate these bottlenecks by presenting two novel methods that improve the efficiency of CA. First, we introduce a bottom-up approach named GrowAcq that reduces the maximum waiting time for the user and allows the system to handle much larger sets of candidate constraints. It also reduces the total number of queries for problems in which the target constraint network is not sparse. Second, we propose a probability-based method to guide query generation and show that it can significantly reduce the number of queries required to converge. We also propose a new technique that allows the use of openly accessible CP solvers in query generation, removing the dependency of existing methods on less well-maintained custom solvers that are not publicly available. Experimental results show that our proposed methods outperform state-of-the-art CA methods, reducing the number of queries by up to 60%. Our methods work well even in cases where the set of candidate constraints is 50 times larger than those commonly used in the literature.
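
The bottom-up idea can be illustrated with a hypothetical Python sketch (inner_acquire and ask_user stand in for an existing interactive CA routine and the human oracle; they are not the paper's actual API): candidates are fed to the inner acquisition loop in small batches, so the active candidate set, and hence the waiting time per query, stays small.

def bottom_up_acquire(all_candidates, inner_acquire, ask_user, batch_size=100):
    # Hypothetical sketch in the spirit of a bottom-up approach: the learned
    # network is refined against one small batch of candidates at a time,
    # instead of handing the full candidate set to the inner algorithm.
    learned = []                                   # constraints confirmed so far
    for i in range(0, len(all_candidates), batch_size):
        batch = all_candidates[i:i + batch_size]
        learned = inner_acquire(learned, batch, ask_user)
    return learned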

Cite as

Dimosthenis C. Tsouros, Senne Berden, and Tias Guns. Guided Bottom-Up Interactive Constraint Acquisition. In 29th International Conference on Principles and Practice of Constraint Programming (CP 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 280, pp. 36:1-36:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{tsouros_et_al:LIPIcs.CP.2023.36,
  author =	{Tsouros, Dimosthenis C. and Berden, Senne and Guns, Tias},
  title =	{{Guided Bottom-Up Interactive Constraint Acquisition}},
  booktitle =	{29th International Conference on Principles and Practice of Constraint Programming (CP 2023)},
  pages =	{36:1--36:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-300-3},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{280},
  editor =	{Yap, Roland H. C.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2023.36},
  URN =		{urn:nbn:de:0030-drops-190734},
  doi =		{10.4230/LIPIcs.CP.2023.36},
  annote =	{Keywords: Constraint acquisition, Constraint learning, Active learning, Modelling}
}
Document
Learning MAX-SAT Models from Examples Using Genetic Algorithms and Knowledge Compilation

Authors: Senne Berden, Mohit Kumar, Samuel Kolb, and Tias Guns

Published in: LIPIcs, Volume 235, 28th International Conference on Principles and Practice of Constraint Programming (CP 2022)


Abstract
Many real-world problems can be effectively solved by means of combinatorial optimization. However, appropriate models to give to a solver are not always available, and sometimes must be learned from historical data. Although some research has been done in this area, the task of learning (weighted partial) MAX-SAT models has not received much attention thus far, even though such models can be used in many real-world applications. Furthermore, most existing work is limited to learning models from non-contextual data, where instances are labeled as solutions and non-solutions, but without any specification of the contexts in which those labels apply. A recent approach named hassle-sls has addressed these limitations: it can jointly learn hard constraints and weighted soft constraints from labeled contextual examples. However, it is hindered by long runtimes, as evaluating even a single candidate MAX-SAT model requires solving as many models as there are contexts in the training data, which quickly becomes highly expensive when the size of the model increases. In this work, we address these runtime issues. To this end, we make two contributions. First, we propose a faster model evaluation procedure that makes use of knowledge compilation. Second, we propose a genetic algorithm named hassle-gen that decreases the number of evaluations needed to find good models. We experimentally show that both contributions improve on the state of the art by speeding up learning, which in turn allows higher-quality MAX-SAT models to be found within a given learning time budget.
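
The genetic-algorithm component can be pictured with a hypothetical Python sketch (evaluate, crossover, and mutate are placeholders; in the paper, model evaluation is the step that is sped up with knowledge compilation):

import random

def evolve(init_population, evaluate, crossover, mutate, generations=50, keep=10):
    # Hypothetical sketch: evolve a population of candidate MAX-SAT models,
    # keeping the best-scoring individuals each generation (elitist selection).
    population = list(init_population)
    for _ in range(generations):
        parents = sorted(population, key=evaluate, reverse=True)[:keep]
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(len(population) - keep)]
        population = parents + children
    return max(population, key=evaluate)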

Cite as

Senne Berden, Mohit Kumar, Samuel Kolb, and Tias Guns. Learning MAX-SAT Models from Examples Using Genetic Algorithms and Knowledge Compilation. In 28th International Conference on Principles and Practice of Constraint Programming (CP 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 235, pp. 8:1-8:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{berden_et_al:LIPIcs.CP.2022.8,
  author =	{Berden, Senne and Kumar, Mohit and Kolb, Samuel and Guns, Tias},
  title =	{{Learning MAX-SAT Models from Examples Using Genetic Algorithms and Knowledge Compilation}},
  booktitle =	{28th International Conference on Principles and Practice of Constraint Programming (CP 2022)},
  pages =	{8:1--8:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-240-2},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{235},
  editor =	{Solnon, Christine},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2022.8},
  URN =		{urn:nbn:de:0030-drops-166373},
  doi =		{10.4230/LIPIcs.CP.2022.8},
  annote =	{Keywords: Machine learning, constraint learning, MAX-SAT}
}
Document
Learning Constraint Programming Models from Data Using Generate-And-Aggregate

Authors: Mohit Kumar, Samuel Kolb, and Tias Guns

Published in: LIPIcs, Volume 235, 28th International Conference on Principles and Practice of Constraint Programming (CP 2022)


Abstract
Constraint programming (CP) is used widely for solving real-world problems. However, designing these models requires substantial expertise. In this paper, we tackle this problem by synthesizing models automatically from past solutions. We introduce COUNT-CP, which uses simple grammars and a generate-and-aggregate approach to learn expressive first-order constraints typically used in CP, as well as their parameters, from data. The learned constraints generalize across instances of different sizes and can be used to solve unseen instances - e.g., learning constraints from a 4×4 Sudoku to solve a 9×9 Sudoku, or learning nurse staffing requirements across hospitals. COUNT-CP is implemented using the CPMpy constraint programming and modelling environment to produce constraints with nested mathematical expressions. The method is empirically evaluated on a set of suitable benchmark problems and is shown to learn accurate and compact models quickly.
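
As a toy illustration of the generate-and-aggregate idea (not COUNT-CP itself), the following CPMpy snippet aggregates a per-row bound from example solutions and posts the learned constraint, lifted over all rows, on a new instance; the data and the learned constraint are invented for the example.

import numpy as np
from cpmpy import intvar, Model

examples = [np.array([[1, 2], [3, 1]]),
            np.array([[2, 2], [1, 3]])]            # toy past solutions

# aggregate: the largest row sum observed across all examples
max_row_sum = max(int(ex.sum(axis=1).max()) for ex in examples)

x = intvar(1, 3, shape=(2, 2), name="x")           # variables of a new instance
m = Model([sum(row) <= max_row_sum for row in x])  # learned constraint on every row
print(m.solve(), x.value())

COUNT-CP itself generates candidate constraints from a grammar and learns their parameters from the data, as described in the abstract above.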

Cite as

Mohit Kumar, Samuel Kolb, and Tias Guns. Learning Constraint Programming Models from Data Using Generate-And-Aggregate. In 28th International Conference on Principles and Practice of Constraint Programming (CP 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 235, pp. 29:1-29:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{kumar_et_al:LIPIcs.CP.2022.29,
  author =	{Kumar, Mohit and Kolb, Samuel and Guns, Tias},
  title =	{{Learning Constraint Programming Models from Data Using Generate-And-Aggregate}},
  booktitle =	{28th International Conference on Principles and Practice of Constraint Programming (CP 2022)},
  pages =	{29:1--29:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-240-2},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{235},
  editor =	{Solnon, Christine},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2022.29},
  URN =		{urn:nbn:de:0030-drops-166580},
  doi =		{10.4230/LIPIcs.CP.2022.29},
  annote =	{Keywords: Constraint Learning, Constraint Programming, Model Synthesis}
}
Document
Data Driven VRP: A Neural Network Model to Learn Hidden Preferences for VRP

Authors: Jayanta Mandi, Rocsildes Canoy, Víctor Bucarey, and Tias Guns

Published in: LIPIcs, Volume 210, 27th International Conference on Principles and Practice of Constraint Programming (CP 2021)


Abstract
The traditional Capacitated Vehicle Routing Problem (CVRP) minimizes the total distance of the routes under the capacity constraints of the vehicles. In practice, however, the objective involves multiple criteria, including not only the total distance of the tour but also other factors such as travel costs, travel time, and fuel consumption. Moreover, in reality, there are numerous implicit preferences ingrained in the minds of the route planners and the drivers. Drivers, for instance, have familiarity with certain neighborhoods and knowledge of the state of the roads, and often consider the best places for rest and lunch breaks. This knowledge is difficult to formulate and balance when operational routing decisions have to be made. This motivates us to learn the implicit preferences from past solutions and to incorporate these learned preferences in the optimization process. These preferences take the form of arc probabilities, i.e., the more preferred a route is, the higher its joint probability. The novelty of this work is the use of a neural network model to estimate the arc probabilities, which allows for additional features and automatic parameter estimation. This first requires identifying suitable features, neural architectures, and loss functions, taking into account that typically only little data is available. We investigate the difference with a prior weighted Markov counting approach, and study the applicability of neural networks in this setting.
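
As a hypothetical sketch (not the paper's architecture), a small PyTorch model could map per-arc features to scores and turn the scores for the outgoing arcs of each stop into a probability distribution; the feature count and network size below are invented for the example.

import torch
import torch.nn as nn

class ArcScorer(nn.Module):
    # Illustrative only: scores every arc from its features, then a softmax
    # over each row gives P(next stop | current stop).
    def __init__(self, n_features, hidden=32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(n_features, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 1))

    def forward(self, arc_features):               # shape (n_stops, n_stops, n_features)
        scores = self.net(arc_features).squeeze(-1)
        return torch.softmax(scores, dim=1)        # each row sums to 1

probs = ArcScorer(n_features=4)(torch.randn(5, 5, 4))   # toy: 5 stops, 4 arc features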

Cite as

Jayanta Mandi, Rocsildes Canoy, Víctor Bucarey, and Tias Guns. Data Driven VRP: A Neural Network Model to Learn Hidden Preferences for VRP. In 27th International Conference on Principles and Practice of Constraint Programming (CP 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 210, pp. 42:1-42:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{mandi_et_al:LIPIcs.CP.2021.42,
  author =	{Mandi, Jayanta and Canoy, Rocsildes and Bucarey, V{\'\i}ctor and Guns, Tias},
  title =	{{Data Driven VRP: A Neural Network Model to Learn Hidden Preferences for VRP}},
  booktitle =	{27th International Conference on Principles and Practice of Constraint Programming (CP 2021)},
  pages =	{42:1--42:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-211-2},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{210},
  editor =	{Michel, Laurent D.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2021.42},
  URN =		{urn:nbn:de:0030-drops-153339},
  doi =		{10.4230/LIPIcs.CP.2021.42},
  annote =	{Keywords: Vehicle routing, Neural network, Preference learning}
}
  • Refine by Author
  • 5 Guns, Tias
  • 2 Berden, Senne
  • 2 Kolb, Samuel
  • 2 Kumar, Mohit
  • 1 Bleukx, Ignace

  • Refine by Classification
  • 2 Theory of computation → Constraint and logic programming
  • 1 Applied computing → Operations research
  • 1 Computing methodologies → Active learning settings
  • 1 Computing methodologies → Discrete space search
  • 1 Computing methodologies → Machine learning

  • Refine by Keyword
  • 1 Active learning
  • 1 Constraint Learning
  • 1 Constraint Programming
  • 1 Constraint acquisition
  • 1 Constraint learning

  • Refine by Type
  • 5 document

  • Refine by Publication Year
  • 2 2022
  • 2 2023
  • 1 2021
