Search Results

Documents authored by Stuckey, Peter J.


Found 2 Possible Name Variants:

Stuckey, Peter J.

Document
Anytime Approximate Formal Feature Attribution

Authors: Jinqiang Yu, Graham Farr, Alexey Ignatiev, and Peter J. Stuckey

Published in: LIPIcs, Volume 305, 27th International Conference on Theory and Applications of Satisfiability Testing (SAT 2024)


Abstract
Widespread use of artificial intelligence (AI) algorithms and machine learning (ML) models on the one hand, and a number of crucial issues pertaining to them on the other, warrant the need for explainable artificial intelligence (XAI). A key explainability question is: given this decision was made, what are the input features which contributed to the decision? Although a range of XAI approaches exist to tackle this problem, most of them have significant limitations. Heuristic XAI approaches suffer from the lack of quality guarantees, and often try to approximate Shapley values, which is not the same as explaining which features contribute to a decision. A recent alternative is so-called formal feature attribution (FFA), which defines feature importance as the fraction of formal abductive explanations (AXp’s) containing the given feature. This measures feature importance from the view of formally reasoning about the model’s behavior. Namely, given a system of constraints logically representing the ML model of interest, computing an AXp requires finding a minimal unsatisfiable subset (MUS) of the system. Computing FFA directly from its definition is challenging because it involves counting over all AXp’s (equivalently, counting over MUSes), although one can approximate it. Based on these results, this paper makes several contributions. First, it gives compelling evidence that computing FFA is intractable, even if the set of contrastive formal explanations (CXp’s), which correspond to minimal correction subsets (MCSes) of the logical system, is provided, by proving that the problem is #P-hard. Second, by using the duality between MUSes and MCSes, it proposes an efficient heuristic to switch from MCS enumeration to MUS enumeration on the fly, resulting in an adaptive explanation enumeration algorithm that effectively approximates FFA in an anytime fashion. Finally, experimental results obtained on a range of widely used datasets demonstrate the effectiveness of the proposed FFA approximation approach in terms of the error of FFA approximation as well as the number of explanations computed and their diversity given a fixed time limit.
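
To make the definition above concrete: if A denotes the set of all AXp’s of a given prediction, then FFA(f) = |{E in A : f in E}| / |A|. The following minimal Python sketch illustrates the definition only (it is not the paper’s anytime algorithm, and representing AXp’s as sets of feature indices is an assumption); because it works over whatever AXp’s have been enumerated so far, the same fraction can be refined in an anytime fashion as enumeration proceeds.

def approximate_ffa(axps, num_features):
    # axps: AXp's enumerated so far, each given as a set of feature indices.
    # Returns, per feature, the fraction of enumerated AXp's containing it.
    if not axps:
        return {f: 0.0 for f in range(num_features)}
    return {f: sum(1 for e in axps if f in e) / len(axps)
            for f in range(num_features)}

# Toy example: three AXp's over four features.
print(approximate_ffa([{0, 2}, {0, 3}, {2}], num_features=4))
# -> {0: 0.67, 1: 0.0, 2: 0.67, 3: 0.33} (values rounded)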

Cite as

Jinqiang Yu, Graham Farr, Alexey Ignatiev, and Peter J. Stuckey. Anytime Approximate Formal Feature Attribution. In 27th International Conference on Theory and Applications of Satisfiability Testing (SAT 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 305, pp. 30:1-30:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{yu_et_al:LIPIcs.SAT.2024.30,
  author =	{Yu, Jinqiang and Farr, Graham and Ignatiev, Alexey and Stuckey, Peter J.},
  title =	{{Anytime Approximate Formal Feature Attribution}},
  booktitle =	{27th International Conference on Theory and Applications of Satisfiability Testing (SAT 2024)},
  pages =	{30:1--30:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-334-8},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{305},
  editor =	{Chakraborty, Supratik and Jiang, Jie-Hong Roland},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SAT.2024.30},
  URN =		{urn:nbn:de:0030-drops-205526},
  doi =		{10.4230/LIPIcs.SAT.2024.30},
  annote =	{Keywords: Explainable AI, Formal Feature Attribution, Minimal Unsatisfiable Subsets, MUS Enumeration}
}
Document
From Formal Boosted Tree Explanations to Interpretable Rule Sets

Authors: Jinqiang Yu, Alexey Ignatiev, and Peter J. Stuckey

Published in: LIPIcs, Volume 280, 29th International Conference on Principles and Practice of Constraint Programming (CP 2023)


Abstract
The rapid rise of Artificial Intelligence (AI) and Machine Learning (ML) has created the need for explainable AI (XAI). One of the most prominent approaches to XAI is to train rule-based ML models, e.g. decision trees, lists and sets, that are deemed interpretable due to their transparent nature. Recent years have witnessed a large body of work in the area of constraints- and reasoning-based approaches to the inference of interpretable models, in particular decision sets (DSes). Despite being shown to outperform heuristic approaches in terms of accuracy, most of them suffer from scalability issues and often fail to handle large training data, in which case no solution is offered. Motivated by this limitation and the success of gradient boosted trees, we propose a novel anytime approach to producing DSes that are both accurate and interpretable. The approach makes use of the concept of a generalized formal explanation and builds on the recent advances in formal explainability of gradient boosted trees. Experimental results obtained on a wide range of datasets demonstrate that our approach produces DSes that are more accurate than those of the state-of-the-art algorithms and comparable with them in terms of explanation size.

Cite as

Jinqiang Yu, Alexey Ignatiev, and Peter J. Stuckey. From Formal Boosted Tree Explanations to Interpretable Rule Sets. In 29th International Conference on Principles and Practice of Constraint Programming (CP 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 280, pp. 38:1-38:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{yu_et_al:LIPIcs.CP.2023.38,
  author =	{Yu, Jinqiang and Ignatiev, Alexey and Stuckey, Peter J.},
  title =	{{From Formal Boosted Tree Explanations to Interpretable Rule Sets}},
  booktitle =	{29th International Conference on Principles and Practice of Constraint Programming (CP 2023)},
  pages =	{38:1--38:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-300-3},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{280},
  editor =	{Yap, Roland H. C.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2023.38},
  URN =		{urn:nbn:de:0030-drops-190758},
  doi =		{10.4230/LIPIcs.CP.2023.38},
  annote =	{Keywords: Decision set, interpretable model, gradient boosted tree, BT compilation}
}
Document
Explaining Propagation for Gini and Spread with Variable Mean

Authors: Alexander Ek, Andreas Schutt, Peter J. Stuckey, and Guido Tack

Published in: LIPIcs, Volume 235, 28th International Conference on Principles and Practice of Constraint Programming (CP 2022)


Abstract
In optimisation problems involving multiple agents (stakeholders) we often want to make sure that the solution is balanced and fair. That is, we want to maximise total utility subject to an upper bound on the statistical dispersion (e.g., spread or the Gini coefficient) of the utility given to different agents, or minimise dispersion subject to some lower bounds on utility. These needs arise in, for example, balancing tardiness in scheduling, unwanted shifts in rostering, and desired resources in resource allocation, or minimising deviation from a baseline in schedule repair, to name a few. These problems are often quite challenging. To solve them efficiently we want to effectively reason about dispersion. Previous work has studied the case where the mean is fixed, but this may not be possible for many problems, e.g., scheduling where total utility depends on the final schedule. In this paper we introduce two log-linear-time dispersion propagators - (a) spread (variance, and indirectly standard deviation) and (b) the Gini coefficient - capable of explaining their propagations, thus allowing effective clause learning solvers to be applied to these problems. Propagators for (a) exist in the literature but do not explain themselves, while propagators for (b) have not been previously studied. We avoid introducing floating-point variables, which are usually not supported by learning solvers, by reasoning about scaled, integer versions of the constraints. We show through experimentation that clause learning can substantially improve the solving of problems where we want to bound dispersion and optimise total utility and vice versa.
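
For intuition about the scaled, integer versions mentioned above, both dispersion measures can be kept integral by clearing denominators; the sketch below (a plain Python illustration under the assumption of integer utilities, not the propagators from the paper) uses n^2 * Var(x) = n * sum(x_i^2) - (sum(x_i))^2 for spread and the pairwise-difference numerator sum_{i,j} |x_i - x_j| for the Gini coefficient, so no floating-point arithmetic is needed.

def scaled_spread(utils):
    # n^2 * variance of the integer utilities:
    # n^2 * Var = n * sum(x_i^2) - (sum(x_i))^2, so no division is required.
    n, s = len(utils), sum(utils)
    return n * sum(x * x for x in utils) - s * s

def scaled_gini(utils):
    # Numerator of the Gini coefficient, sum_{i,j} |x_i - x_j|; dividing by
    # 2 * n * sum(utils) would recover the usual coefficient in [0, 1].
    return sum(abs(a - b) for a in utils for b in utils)

print(scaled_spread([3, 3, 3]), scaled_gini([3, 3, 3]))  # 0 0   (perfectly balanced)
print(scaled_spread([1, 3, 5]), scaled_gini([1, 3, 5]))  # 24 16 (some dispersion)

Bounding these scaled quantities involves only integer variables, which is what makes them usable in learning solvers that do not support floating-point variables.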

Cite as

Alexander Ek, Andreas Schutt, Peter J. Stuckey, and Guido Tack. Explaining Propagation for Gini and Spread with Variable Mean. In 28th International Conference on Principles and Practice of Constraint Programming (CP 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 235, pp. 21:1-21:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{ek_et_al:LIPIcs.CP.2022.21,
  author =	{Ek, Alexander and Schutt, Andreas and Stuckey, Peter J. and Tack, Guido},
  title =	{{Explaining Propagation for Gini and Spread with Variable Mean}},
  booktitle =	{28th International Conference on Principles and Practice of Constraint Programming (CP 2022)},
  pages =	{21:1--21:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-240-2},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{235},
  editor =	{Solnon, Christine},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2022.21},
  URN =		{urn:nbn:de:0030-drops-166503},
  doi =		{10.4230/LIPIcs.CP.2022.21},
  annote =	{Keywords: Spread constraint, Gini index, Filtering algorithm, Constraint programming, Lazy clause generation}
}
Document
Optimising Training for Service Delivery

Authors: Ilankaikone Senthooran, Pierre Le Bodic, and Peter J. Stuckey

Published in: LIPIcs, Volume 210, 27th International Conference on Principles and Practice of Constraint Programming (CP 2021)


Abstract
We study the problem of training a roster of engineers, who are scheduled to respond to service calls that require a set of skills, and where engineers and calls have different locations. Both training an engineer in a skill and sending an engineer to respond to a non-local service call incur a cost. Alternatively, a local contractor can be hired. The problem consists in training engineers in skills so that the quality of service (i.e. response time) is maximised and costs are minimised. The problem is hard to solve in practice partly because (1) the value of training an engineer in one skill depends on other training decisions, (2) evaluating training decisions means evaluating the schedules that are now made possible by the new skills, and (3) these schedules must be computed over a long time horizon, otherwise training may not pay off. We show that a monolithic approach to this problem is not practical. Instead, we decompose it into three subproblems, modelled with MiniZinc. This allows us to pick the approach that works best for each subproblem (MIP or CP) and provide good solutions to the problem. Data is provided by a multinational company.

Cite as

Ilankaikone Senthooran, Pierre Le Bodic, and Peter J. Stuckey. Optimising Training for Service Delivery. In 27th International Conference on Principles and Practice of Constraint Programming (CP 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 210, pp. 48:1-48:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{senthooran_et_al:LIPIcs.CP.2021.48,
  author =	{Senthooran, Ilankaikone and Le Bodic, Pierre and Stuckey, Peter J.},
  title =	{{Optimising Training for Service Delivery}},
  booktitle =	{27th International Conference on Principles and Practice of Constraint Programming (CP 2021)},
  pages =	{48:1--48:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-211-2},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{210},
  editor =	{Michel, Laurent D.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2021.48},
  URN =		{urn:nbn:de:0030-drops-153395},
  doi =		{10.4230/LIPIcs.CP.2021.48},
  annote =	{Keywords: Scheduling, Task Allocation, Training Optimisation}
}
Document
Abstract Interpretation, Symbolic Execution and Constraints

Authors: Roberto Amadini, Graeme Gange, Peter Schachte, Harald Søndergaard, and Peter J. Stuckey

Published in: OASIcs, Volume 86, Recent Developments in the Design and Implementation of Programming Languages (2020)


Abstract
Abstract interpretation is a static analysis framework for sound over-approximation of all possible runtime states of a program. Symbolic execution is a framework for reachability analysis which tries to explore all possible execution paths of a program. A shared feature between abstract interpretation and symbolic execution is that each - implicitly or explicitly - maintains constraints during execution, in the form of invariants or path conditions. We investigate the relations between the worlds of abstract interpretation, symbolic execution and constraint solving, to expose potential synergies.

Cite as

Roberto Amadini, Graeme Gange, Peter Schachte, Harald Søndergaard, and Peter J. Stuckey. Abstract Interpretation, Symbolic Execution and Constraints. In Recent Developments in the Design and Implementation of Programming Languages. Open Access Series in Informatics (OASIcs), Volume 86, pp. 7:1-7:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{amadini_et_al:OASIcs.Gabbrielli.7,
  author =	{Amadini, Roberto and Gange, Graeme and Schachte, Peter and S{\o}ndergaard, Harald and Stuckey, Peter J.},
  title =	{{Abstract Interpretation, Symbolic Execution and Constraints}},
  booktitle =	{Recent Developments in the Design and Implementation of Programming Languages},
  pages =	{7:1--7:19},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-171-9},
  ISSN =	{2190-6807},
  year =	{2020},
  volume =	{86},
  editor =	{de Boer, Frank S. and Mauro, Jacopo},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.Gabbrielli.7},
  URN =		{urn:nbn:de:0030-drops-132294},
  doi =		{10.4230/OASIcs.Gabbrielli.7},
  annote =	{Keywords: Abstract interpretation, symbolic execution, constraint solving, dynamic analysis, static analysis}
}
Document
Constraint Propagation and Explanation over Novel Types by Abstract Compilation

Authors: Graeme Gange and Peter J. Stuckey

Published in: OASIcs, Volume 52, Technical Communications of the 32nd International Conference on Logic Programming (ICLP 2016)


Abstract
The appeal of constraint programming (CP) lies in compositionality – the ability to mix and match constraints as needed. However, this flexibility typically does not extend to the types of variables. Solvers usually support only a small set of pre-defined variable types, and extending this is not typically a simple exercise: not only must the solver engine be updated, but then the library of supported constraints must be re-implemented to support the new type. In this paper, we attempt to ease this second step. We describe a system for automatically deriving a native-code implementation of a global constraint (over novel variable types) from a declarative specification, complete with the ability to explain its propagation, a requirement if we want to make use of modern lazy clause generation CP solvers. We demonstrate this approach by adding support for wrapped-integer variables to chuffed, a lazy clause generation CP solver.

Cite as

Graeme Gange and Peter J. Stuckey. Constraint Propagation and Explanation over Novel Types by Abstract Compilation. In Technical Communications of the 32nd International Conference on Logic Programming (ICLP 2016). Open Access Series in Informatics (OASIcs), Volume 52, pp. 13:1-13:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{gange_et_al:OASIcs.ICLP.2016.13,
  author =	{Gange, Graeme and Stuckey, Peter J.},
  title =	{{Constraint Propagation and Explanation over Novel Types by Abstract Compilation}},
  booktitle =	{Technical Communications of the 32nd International Conference on Logic Programming (ICLP 2016)},
  pages =	{13:1--13:14},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-007-1},
  ISSN =	{2190-6807},
  year =	{2016},
  volume =	{52},
  editor =	{Carro, Manuel and King, Andy and Saeedloei, Neda and De Vos, Marina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.ICLP.2016.13},
  URN =		{urn:nbn:de:0030-drops-67437},
  doi =		{10.4230/OASIcs.ICLP.2016.13},
  annote =	{Keywords: constraint programming, program synthesis, program analysis}
}
Document
Constraints in Non-Boolean Contexts

Authors: Leslie De Koninck, Sebastian Brand, and Peter J. Stuckey

Published in: LIPIcs, Volume 11, Technical Communications of the 27th International Conference on Logic Programming (ICLP'11) (2011)


Abstract
In high-level constraint modelling languages, constraints can occur in non-Boolean contexts: implicitly, in the form of partial functions, or more explicitly, in the form of constraints on local variables in non-Boolean expressions. Specifications using these facilities are often more succinct. However, these specifications are typically executed on solvers that only support questions of the form of existentially quantified conjunctions of constraints. We show how we can translate expressions with constraints appearing in non-Boolean contexts into conjunctions of ordinary constraints. The translation is clearly structured into constrained type elimination, local variable lifting and partial function elimination. We explain our approach in the context of the modelling language Zinc. An implementation of it is an integral part of our Zinc compiler.

Cite as

Leslie De Koninck, Sebastian Brand, and Peter J. Stuckey. Constraints in Non-Boolean Contexts. In Technical Communications of the 27th International Conference on Logic Programming (ICLP'11). Leibniz International Proceedings in Informatics (LIPIcs), Volume 11, pp. 117-127, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2011)


BibTeX

@InProceedings{dekoninck_et_al:LIPIcs.ICLP.2011.117,
  author =	{De Koninck, Leslie and Brand, Sebastian and Stuckey, Peter J.},
  title =	{{Constraints in Non-Boolean Contexts}},
  booktitle =	{Technical Communications of the 27th International Conference on Logic Programming (ICLP'11)},
  pages =	{117--127},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-31-6},
  ISSN =	{1868-8969},
  year =	{2011},
  volume =	{11},
  editor =	{Gallagher, John P. and Gelfond, Michael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICLP.2011.117},
  URN =		{urn:nbn:de:0030-drops-31685},
  doi =		{10.4230/LIPIcs.ICLP.2011.117},
  annote =	{Keywords: Constraint modelling languages, model transformation, partial functions}
}

Stuckey, Peter

Document
Lazy Model Expansion by Incremental Grounding

Authors: Broes De Cat, Marc Denecker, and Peter Stuckey

Published in: LIPIcs, Volume 17, Technical Communications of the 28th International Conference on Logic Programming (ICLP'12) (2012)


Abstract
Ground-and-solve methods used in state-of-the-art Answer Set Programming and model expansion systems proceed by rewriting the problem specification into a ground format and afterwards applying search. A disadvantage of such approaches is that the rewriting step blows up the original specification for large input domains and is infeasible in the case of infinite domains. In this paper we describe a lazy approach to model expansion in the context of first-order logic that can cope with large and infinite problem domains. The method interleaves grounding and search, incrementally extending the current partial grounding only when necessary. It often makes it possible to solve the original problem without creating the full grounding and is hence more widely applicable than ground-and-solve. We report on an existing implementation within the IDP system and on experiments that show the promise of the method.
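
As a purely schematic picture of this interleaving (not the IDP implementation; the solver and instantiation oracles below are hypothetical callables), the loop solves the current partial grounding, checks the candidate model against constraints that have not been grounded yet, and grounds only the instances the model violates:

def lazy_model_expansion(solve, violated_instances):
    # solve(clauses): a model of the ground clauses, or None if unsatisfiable.
    # violated_instances(model): ground instances of not-yet-grounded
    # constraints that the candidate model violates.
    grounding = []
    while True:
        model = solve(grounding)
        if model is None:
            return None          # the partial grounding is already unsatisfiable
        new = list(violated_instances(model))
        if not new:
            return model         # everything satisfied; the full grounding was never built
        grounding.extend(new)    # extend the partial grounding only where needed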

Cite as

Broes De Cat, Marc Denecker, and Peter Stuckey. Lazy Model Expansion by Incremental Grounding. In Technical Communications of the 28th International Conference on Logic Programming (ICLP'12). Leibniz International Proceedings in Informatics (LIPIcs), Volume 17, pp. 201-211, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2012)


BibTeX

@InProceedings{decat_et_al:LIPIcs.ICLP.2012.201,
  author =	{De Cat, Broes and Denecker, Marc and Stuckey, Peter},
  title =	{{Lazy Model Expansion by Incremental Grounding}},
  booktitle =	{Technical Communications of the 28th International Conference on Logic Programming (ICLP'12)},
  pages =	{201--211},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-43-9},
  ISSN =	{1868-8969},
  year =	{2012},
  volume =	{17},
  editor =	{Dovier, Agostino and Santos Costa, V{\'\i}tor},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICLP.2012.201},
  URN =		{urn:nbn:de:0030-drops-36222},
  doi =		{10.4230/LIPIcs.ICLP.2012.201},
  annote =	{Keywords: Knowledge representation and reasoning, model generation, grounding, IDP framework, first-order logic}
}
Document
Branch-and-Price Solving in G12

Authors: Jakob Puchinger, Peter Stuckey, Mark Wallace, and Sebastian Brand

Published in: Dagstuhl Seminar Proceedings, Volume 9261, Models and Algorithms for Optimization in Logistics (2009)


Abstract
The G12 project is developing a software environment for stating and solving combinatorial problems by mapping a high-level model of the problem to an efficient combination of solving methods. Model annotations are used to control this process. In this paper we explain the mapping to branch-and-price solving. G12 supports the selection of specialised subproblem solvers, the aggregation of identical subproblems, automatic disaggregation when required by search, and the use of specialised branching rules. We demonstrate the benefits of the G12 framework on three examples: a trucking problem, cutting stock, and two-dimensional bin packing.

Cite as

Jakob Puchinger, Peter Stuckey, Mark Wallace, and Sebastian Brand. Branch-and-Price Solving in G12. In Models and Algorithms for Optimization in Logistics. Dagstuhl Seminar Proceedings, Volume 9261, pp. 1-3, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2009)


BibTeX

@InProceedings{puchinger_et_al:DagSemProc.09261.6,
  author =	{Puchinger, Jakob and Stuckey, Peter and Wallace, Mark and Brand, Sebastian},
  title =	{{Branch-and-Price Solving in G12}},
  booktitle =	{Models and Algorithms for Optimization in Logistics},
  pages =	{1--3},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2009},
  volume =	{9261},
  editor =	{Cynthia Barnhart and Uwe Clausen and Ulrich Lauther and Rolf H. M\"{o}hring},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.09261.6},
  URN =		{urn:nbn:de:0030-drops-21641},
  doi =		{10.4230/DagSemProc.09261.6},
  annote =	{Keywords: Combinatorial optimization, branch-and-price, software}
}