13 Search Results for "Jensen, Thomas"


Document
Accelerating ILP Solvers for Minimum Flow Decompositions Through Search Space and Dimensionality Reductions

Authors: Andreas Grigorjew, Fernando H. C. Dias, Andrea Cracco, Romeo Rizzi, and Alexandru I. Tomescu

Published in: LIPIcs, Volume 301, 22nd International Symposium on Experimental Algorithms (SEA 2024)


Abstract
Given a flow network, the Minimum Flow Decomposition (MFD) problem asks for the smallest possible set of weighted paths whose superposition equals the flow. It is a classical, strongly NP-hard problem that has proven useful in RNA transcript assembly and in applications outside of bioinformatics. We improve an existing ILP (Integer Linear Programming) model by Dias et al. [RECOMB 2022] for DAGs by decreasing the solver’s search space using solution safety and several other optimizations. This yields a significant speedup over the original ILP, of up to 34× on average on the hardest instances. Moreover, we show that our optimizations also apply to MFD problem variants, with speedups of up to 219× on the hardest instances. We also develop an ILP model of reduced dimensionality for an MFD variant in which the solution path weights are restricted to a given set. This model can find an optimal MFD solution for most instances and, overall, its accuracy significantly outperforms that of previous greedy algorithms while being up to an order of magnitude faster than our optimized ILP.
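As a point of reference for the greedy algorithms mentioned above, the following minimal Python sketch (illustrative only, not code from the paper) implements the classic greedy-width heuristic: repeatedly extract a maximum-bottleneck source-to-sink path and subtract its bottleneck weight from the flow. The function name and the flow encoding (a dict from edges to integer flow values) are assumptions made for the example.

from collections import defaultdict

def greedy_flow_decomposition(flow, source, sink):
    """Greedy 'widest path' heuristic for flow decomposition on a DAG.

    flow: dict mapping (u, v) edges to non-negative integer flow values that
          satisfy flow conservation at every node except source and sink.
    Returns a list of (path, weight) pairs whose superposition equals the flow.
    This is a heuristic: it need not find a minimum decomposition, which is the
    gap the ILP-based approaches above are designed to close."""
    flow = dict(flow)
    paths = []
    while any(f > 0 for f in flow.values()):
        # adjacency restricted to edges that still carry flow
        adj = defaultdict(list)
        for (u, v), f in flow.items():
            if f > 0:
                adj[u].append(v)
        # find a maximum-bottleneck source-to-sink path by exhaustive DFS (toy sizes only)
        best = (0, None)
        def dfs(node, path, bottleneck):
            nonlocal best
            if node == sink:
                if bottleneck > best[0]:
                    best = (bottleneck, list(path))
                return
            for nxt in adj[node]:
                dfs(nxt, path + [nxt], min(bottleneck, flow[(node, nxt)]))
        dfs(source, [source], float("inf"))
        weight, path = best
        if path is None:
            break  # no flow-carrying source-sink path left
        paths.append((path, weight))
        for u, v in zip(path, path[1:]):
            flow[(u, v)] -= weight
    return paths

# Example: two source-to-sink paths of weights 3 and 1.
flow = {("s", "a"): 3, ("s", "b"): 1, ("a", "t"): 3, ("b", "t"): 1}
print(greedy_flow_decomposition(flow, "s", "t"))
# [(['s', 'a', 't'], 3), (['s', 'b', 't'], 1)]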

Cite as

Andreas Grigorjew, Fernando H. C. Dias, Andrea Cracco, Romeo Rizzi, and Alexandru I. Tomescu. Accelerating ILP Solvers for Minimum Flow Decompositions Through Search Space and Dimensionality Reductions. In 22nd International Symposium on Experimental Algorithms (SEA 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 301, pp. 14:1-14:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{grigorjew_et_al:LIPIcs.SEA.2024.14,
  author =	{Grigorjew, Andreas and Dias, Fernando H. C. and Cracco, Andrea and Rizzi, Romeo and Tomescu, Alexandru I.},
  title =	{{Accelerating ILP Solvers for Minimum Flow Decompositions Through Search Space and Dimensionality Reductions}},
  booktitle =	{22nd International Symposium on Experimental Algorithms (SEA 2024)},
  pages =	{14:1--14:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-325-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{301},
  editor =	{Liberti, Leo},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SEA.2024.14},
  URN =		{urn:nbn:de:0030-drops-203792},
  doi =		{10.4230/LIPIcs.SEA.2024.14},
  annote =	{Keywords: Flow decomposition, Integer Linear Programming, Safety, RNA-seq, RNA transcript assembly, isoform}
}
Document
Optimizing a Non-Deterministic Abstract Machine with Environments

Authors: Małgorzata Biernacka, Dariusz Biernacki, Sergueï Lenglet, and Alan Schmitt

Published in: LIPIcs, Volume 299, 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)


Abstract
A non-deterministic abstract machine (NDAM) is a recent implementation model for programming languages in which one must choose among several redexes at each reduction step, such as process calculi. These machines can be derived from a zipper semantics, a mix between structural operational semantics and context-based reduction semantics. Such a machine has also been generated for the λ-calculus without a fixed reduction strategy, i.e., with full non-deterministic β-reduction. In that machine, substitution is an external operation that replaces all the occurrences of a variable at once. Implementing substitution with environments is lower-level and more efficient, as variables are replaced only when needed. In this paper, we define an NDAM with environments for the λ-calculus without a fixed reduction strategy. We also introduce other optimizations, including a form of refocusing, and we show that we can restrict our optimized NDAM to recover some of the usual λ-calculus machines, e.g., the Krivine Abstract Machine. Most of the improvements we propose in this work could be applied to other NDAMs as well.
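To make "choosing among several redexes" concrete, here is a minimal Python sketch, independent of the paper's NDAM and zipper semantics, that enumerates all one-step β-reductions of a λ-term. The term representation is an assumption made for the example, substitution is performed eagerly rather than via environments, and we assume the Barendregt convention so that naive substitution is capture-avoiding.

from dataclasses import dataclass

# Lambda terms.  We assume the Barendregt convention: bound variables are named
# apart from each other and from free variables, so subst below avoids capture.

@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class Lam:
    param: str
    body: object

@dataclass(frozen=True)
class App:
    fun: object
    arg: object

def subst(term, x, value):
    """Replace all free occurrences of variable x in term by value."""
    if isinstance(term, Var):
        return value if term.name == x else term
    if isinstance(term, Lam):
        return term if term.param == x else Lam(term.param, subst(term.body, x, value))
    return App(subst(term.fun, x, value), subst(term.arg, x, value))

def reductions(term):
    """Enumerate all one-step beta-reductions of term (full non-deterministic beta)."""
    if isinstance(term, App):
        if isinstance(term.fun, Lam):          # redex at the root
            yield subst(term.fun.body, term.fun.param, term.arg)
        for f in reductions(term.fun):         # redexes inside the function position
            yield App(f, term.arg)
        for a in reductions(term.arg):         # redexes inside the argument position
            yield App(term.fun, a)
    elif isinstance(term, Lam):
        for b in reductions(term.body):        # redexes under the binder
            yield Lam(term.param, b)

# Example: (\x. x x) ((\y. y) z) has two redexes, hence two possible next terms.
t = App(Lam("x", App(Var("x"), Var("x"))), App(Lam("y", Var("y")), Var("z")))
print(len(list(reductions(t))))   # -> 2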

Cite as

Małgorzata Biernacka, Dariusz Biernacki, Sergueï Lenglet, and Alan Schmitt. Optimizing a Non-Deterministic Abstract Machine with Environments. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 11:1-11:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{biernacka_et_al:LIPIcs.FSCD.2024.11,
  author =	{Biernacka, Ma{\l}gorzata and Biernacki, Dariusz and Lenglet, Sergue\"{i} and Schmitt, Alan},
  title =	{{Optimizing a Non-Deterministic Abstract Machine with Environments}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{11:1--11:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.11},
  URN =		{urn:nbn:de:0030-drops-203409},
  doi =		{10.4230/LIPIcs.FSCD.2024.11},
  annote =	{Keywords: Abstract machine, Explicit substitutions, Refocusing}
}
Document
Track A: Algorithms, Complexity and Games
Fundamental Problems on Bounded-Treewidth Graphs: The Real Source of Hardness

Authors: Barış Can Esmer, Jacob Focke, Dániel Marx, and Paweł Rzążewski

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
It is known for many algorithmic problems that if a tree decomposition of width t is given in the input, then the problem can be solved with exponential dependence on t. A line of research initiated by Lokshtanov, Marx, and Saurabh [SODA 2011] produced lower bounds showing that in many cases known algorithms already achieve the best possible exponential dependence on t, assuming the Strong Exponential-Time Hypothesis (SETH). The main message of this paper is that the same lower bounds can already be obtained in a much more restricted setting: informally, a graph consisting of a block of t vertices connected to components of constant size already has the same hardness as a general tree decomposition of width t. Formally, a (σ,δ)-hub is a set Q of vertices such that every component of G-Q has size at most σ and is adjacent to at most δ vertices of Q. We explore whether the known tight lower bounds parameterized by the width of the given tree decomposition remain valid if we parameterize by the size of a given hub.
- For every ε > 0, there are σ,δ > 0 such that Independent Set (equivalently Vertex Cover) cannot be solved in time (2-ε)^p ⋅ n, even if a (σ,δ)-hub of size p is given in the input, assuming the SETH. This matches the earlier tight lower bounds parameterized by the width of the tree decomposition. Similar tight bounds are obtained for Odd Cycle Transversal, Max Cut, q-Coloring, and the edge/vertex-deletion versions of q-Coloring.
- For every ε > 0, there are σ,δ > 0 such that △-Partition cannot be solved in time (2-ε)^p ⋅ n, even if a (σ,δ)-hub of size p is given in the input, assuming the Set Cover Conjecture (SCC). In fact, we prove that this statement is equivalent to the SCC, so it is unlikely that it could be proved assuming the SETH.
- For Dominating Set, we can prove a non-tight lower bound ruling out (2-ε)^p ⋅ n^{𝒪(1)} algorithms, assuming either the SETH or the SCC, but this does not match the 3^p ⋅ n^{𝒪(1)} upper bound.
Thus our results reveal that, for many problems, the research on lower bounds on the dependence on treewidth was never really about tree decompositions; the real source of hardness is a much simpler structure. Additionally, we study whether the same lower bounds can be obtained if σ and δ are fixed universal constants (not depending on ε). We show that lower bounds of this form are possible for Max Cut and the edge-deletion version of q-Coloring, under the Max 3-Sat Hypothesis (M3SH). However, no such lower bounds are possible for Independent Set, Odd Cycle Transversal, and the vertex-deletion version of q-Coloring: better-than-brute-force algorithms are possible for every fixed (σ,δ).
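The hub definition above is easy to check directly. The following Python sketch (illustrative only, not from the paper) tests whether a vertex set Q is a (σ,δ)-hub of a graph given as an adjacency dictionary; the encoding and example graph are assumptions made for the illustration.

from collections import deque

def is_hub(adj, Q, sigma, delta):
    """Check whether Q is a (sigma, delta)-hub of the graph given by adjacency
    dict adj: every connected component of G - Q has at most sigma vertices and
    is adjacent to at most delta vertices of Q (the definition quoted above)."""
    Q = set(Q)
    seen = set(Q)
    for start in adj:
        if start in seen:
            continue
        # BFS inside G - Q to collect one component and its neighbours in Q
        comp, q_neighbours = [], set()
        queue = deque([start])
        seen.add(start)
        while queue:
            v = queue.popleft()
            comp.append(v)
            for w in adj[v]:
                if w in Q:
                    q_neighbours.add(w)
                elif w not in seen:
                    seen.add(w)
                    queue.append(w)
        if len(comp) > sigma or len(q_neighbours) > delta:
            return False
    return True

# Example: two pendant paths of length two hanging off a single hub vertex "h".
adj = {
    "h": ["a1", "b1"],
    "a1": ["h", "a2"], "a2": ["a1"],
    "b1": ["h", "b2"], "b2": ["b1"],
}
print(is_hub(adj, {"h"}, sigma=2, delta=1))   # -> True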

Cite as

Barış Can Esmer, Jacob Focke, Dániel Marx, and Paweł Rzążewski. Fundamental Problems on Bounded-Treewidth Graphs: The Real Source of Hardness. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 34:1-34:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{canesmer_et_al:LIPIcs.ICALP.2024.34,
  author =	{Can Esmer, Bar{\i}\c{s} and Focke, Jacob and Marx, D\'{a}niel and Rz\k{a}\.{z}ewski, Pawe{\l}},
  title =	{{Fundamental Problems on Bounded-Treewidth Graphs: The Real Source of Hardness}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{34:1--34:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.34},
  URN =		{urn:nbn:de:0030-drops-201772},
  doi =		{10.4230/LIPIcs.ICALP.2024.34},
  annote =	{Keywords: Parameterized Complexity, Tight Bounds, Hub, Treewidth, Strong Exponential Time Hypothesis, Vertex Coloring, Vertex Deletion, Edge Deletion, Triangle Packing, Triangle Partition, Set Cover Hypothesis, Dominating Set}
}
Document
Track A: Algorithms, Complexity and Games
Lower Bounds for Matroid Optimization Problems with a Linear Constraint

Authors: Ilan Doron-Arad, Ariel Kulik, and Hadas Shachnai

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
We study a family of matroid optimization problems with a linear constraint (MOL). In these problems, we seek a subset of elements that optimizes (i.e., maximizes or minimizes) a linear objective function subject to (i) a matroid independent set or matroid basis constraint, and (ii) an additional linear constraint. A notable member of this family is budgeted matroid independent set (BM), which can be viewed as classic 0/1-Knapsack with a matroid constraint. While special cases of BM, such as knapsack with a cardinality constraint and multiple-choice knapsack, admit a fully polynomial-time approximation scheme (Fully PTAS), the best known result for BM on a general matroid is an Efficient PTAS. Prior to this work, the existence of a Fully PTAS for BM, and more generally for any problem in the family of MOL problems, was open. In this paper, we answer this question negatively by showing that none of the (non-trivial) problems in this family admits a Fully PTAS. This resolves the complexity status of several well-studied problems. Our main result is obtained by first showing that exact weight matroid basis (EMB) does not admit a pseudo-polynomial time algorithm. This distinguishes EMB from the special cases of k-subset sum and EMB on a linear matroid, which are solvable in pseudo-polynomial time. We then obtain unconditional hardness results for the family of MOL problems in the oracle model (even if randomization is allowed), and show that the same results hold when the matroids are encoded as part of the input, assuming P ≠ NP. For the hardness proof of EMB, we introduce the Π-matroid family. This intricate subclass of matroids, which exploits the interaction between a weight function and the matroid constraint, may find use in tackling other matroid optimization problems.
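For concreteness, the budgeted matroid independent set (BM) problem mentioned above can be stated as a toy exhaustive search: maximize profit over independent sets within a budget, with the matroid supplied as an independence oracle. The Python sketch below only illustrates the problem definition (it runs in exponential time) and is not an algorithm from the paper; all names and the example instance are assumptions.

from itertools import combinations

def budgeted_matroid_independent_set(elements, profit, cost, budget, is_independent):
    """Exhaustive solver for Budgeted Matroid Independent Set (BM): maximise total
    profit over subsets S that are independent in the matroid (given by the
    independence oracle is_independent) and satisfy total cost <= budget.
    Exponential-time toy illustration of the problem only."""
    best_value, best_set = 0, frozenset()
    for r in range(len(elements) + 1):
        for S in combinations(elements, r):
            S = frozenset(S)
            if is_independent(S) and sum(cost[e] for e in S) <= budget:
                value = sum(profit[e] for e in S)
                if value > best_value:
                    best_value, best_set = value, S
    return best_value, best_set

# Toy instance: a uniform matroid of rank 2 (at most two elements may be picked),
# which makes BM exactly "knapsack with a cardinality constraint".
elements = ["a", "b", "c"]
profit = {"a": 6, "b": 5, "c": 4}
cost   = {"a": 5, "b": 3, "c": 2}
print(budgeted_matroid_independent_set(
    elements, profit, cost, budget=6,
    is_independent=lambda S: len(S) <= 2))   # -> (9, frozenset({'b', 'c'}))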

Cite as

Ilan Doron-Arad, Ariel Kulik, and Hadas Shachnai. Lower Bounds for Matroid Optimization Problems with a Linear Constraint. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 56:1-56:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{doronarad_et_al:LIPIcs.ICALP.2024.56,
  author =	{Doron-Arad, Ilan and Kulik, Ariel and Shachnai, Hadas},
  title =	{{Lower Bounds for Matroid Optimization Problems with a Linear Constraint}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{56:1--56:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.56},
  URN =		{urn:nbn:de:0030-drops-201990},
  doi =		{10.4230/LIPIcs.ICALP.2024.56},
  annote =	{Keywords: matroid optimization, budgeted problems, knapsack, approximation schemes}
}
Document
Track B: Automata, Logic, Semantics, and Theory of Programming
Smoothed Analysis of Deterministic Discounted and Mean-Payoff Games

Authors: Bruno Loff and Mateusz Skomra

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
We devise a policy-iteration algorithm for deterministic two-player discounted and mean-payoff games that runs in polynomial time with high probability on any input where each payoff is chosen independently from a sufficiently random distribution and the underlying graph of the game is ergodic. This includes the case where an arbitrary set of payoffs has been perturbed by a Gaussian, showing for the first time that deterministic two-player games can be solved efficiently in the sense of smoothed analysis. More generally, we devise a condition number for deterministic discounted and mean-payoff games played on ergodic graphs, and show that our algorithm runs in time polynomial in this condition number. Our result confirms a previous conjecture of Boros et al., which was claimed as a theorem [Boros et al., 2011] and later retracted [Boros et al., 2018]. It stands in contrast with a recent counter-example by Christ and Yannakakis [Christ and Yannakakis, 2023], showing that Howard’s policy-iteration algorithm does not run in smoothed polynomial time on stochastic single-player mean-payoff games. Our approach is inspired by the analysis of random optimal assignment instances by Frieze and Sorkin [Frieze and Sorkin, 2007], and the analysis of bias-induced policies for mean-payoff games by Akian, Gaubert and Hochart [Akian et al., 2018].
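As background for the game model only (the paper's contribution is a policy-iteration analysis, which is not reproduced here), the following Python sketch runs plain value iteration on a deterministic two-player discounted game. The graph encoding, the particular discount normalization, and the iteration count are assumptions made for the illustration; several equivalent conventions exist in the literature.

def value_iteration(edges, owner, discount, iterations=1000):
    """Value iteration for a deterministic two-player discounted game.

    edges:    dict mapping vertex v to a list of (successor, payoff) pairs
    owner:    dict mapping vertex v to +1 (maximiser) or -1 (minimiser)
    discount: lambda in (0, 1)

    Uses the normalisation V(v) = opt over edges (v,u) of
    (1 - lambda) * payoff + lambda * V(u), one of several conventions in the
    literature; the fixed point is unique because the update is a lambda-contraction."""
    V = {v: 0.0 for v in edges}
    for _ in range(iterations):
        new_V = {}
        for v, succs in edges.items():
            values = [(1 - discount) * payoff + discount * V[u] for u, payoff in succs]
            new_V[v] = max(values) if owner[v] == +1 else min(values)
        V = new_V
    return V

# Two-vertex example: Max owns "s", Min owns "t"; each may stay put or switch vertex.
edges = {"s": [("s", 1.0), ("t", 0.0)], "t": [("t", -1.0), ("s", 0.0)]}
owner = {"s": +1, "t": -1}
print(value_iteration(edges, owner, discount=0.9))
# converges to approximately {'s': 1.0, 't': -1.0}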

Cite as

Bruno Loff and Mateusz Skomra. Smoothed Analysis of Deterministic Discounted and Mean-Payoff Games. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 147:1-147:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{loff_et_al:LIPIcs.ICALP.2024.147,
  author =	{Loff, Bruno and Skomra, Mateusz},
  title =	{{Smoothed Analysis of Deterministic Discounted and Mean-Payoff Games}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{147:1--147:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.147},
  URN =		{urn:nbn:de:0030-drops-202908},
  doi =		{10.4230/LIPIcs.ICALP.2024.147},
  annote =	{Keywords: Mean-payoff games, discounted games, policy iteration, smoothed analysis}
}
Document
Track A: Algorithms, Complexity and Games
Two Choices Are Enough for P-LCPs, USOs, and Colorful Tangents

Authors: Michaela Borzechowski, John Fearnley, Spencer Gordon, Rahul Savani, Patrick Schnider, and Simon Weber

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
We provide polynomial-time reductions between three search problems from three distinct areas: the P-matrix linear complementarity problem (P-LCP), finding the sink of a unique sink orientation (USO), and a variant of the α-Ham Sandwich problem. For all three settings, we show that "two choices are enough", meaning that the general non-binary version of the problem can be reduced in polynomial time to the binary version. This specifically means that generalized P-LCPs are equivalent to P-LCPs, and grid USOs are equivalent to cube USOs. These results are obtained by showing that both the P-LCP and our α-Ham Sandwich variant are equivalent to a new problem we introduce, P-Lin-Bellman. This problem can be seen as a new tool for formulating problems as P-LCPs.
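To unpack what a P-LCP instance looks like, here is a toy Python/NumPy sketch, unrelated to the reductions in the paper, that solves a small linear complementarity problem by enumerating all 2^n complementary bases. For a P-matrix M the solution exists and is unique for every q, which is the classical fact underlying the P-LCP search problem; the function name, tolerances, and example instance are assumptions made for the illustration.

import itertools
import numpy as np

def solve_lcp_bruteforce(M, q):
    """Solve the linear complementarity problem LCP(M, q):
       find z >= 0 with w = M z + q >= 0 and z . w = 0,
    by enumerating the 2^n complementary bases.  Exponential-time toy only;
    the P-LCP problem asks for this solution (or a certificate that M is not
    a P-matrix)."""
    n = len(q)
    for alpha in itertools.chain.from_iterable(
            itertools.combinations(range(n), r) for r in range(n + 1)):
        alpha = list(alpha)
        z = np.zeros(n)
        if alpha:
            try:
                z[alpha] = np.linalg.solve(M[np.ix_(alpha, alpha)], -q[alpha])
            except np.linalg.LinAlgError:
                continue
        w = M @ z + q
        if (z >= -1e-9).all() and (w >= -1e-9).all():
            return z, w
    return None

# A small P-matrix instance (all principal minors positive).
M = np.array([[2.0, 1.0], [1.0, 2.0]])
q = np.array([-1.0, -1.0])
z, w = solve_lcp_bruteforce(M, q)
print(z, w)   # z = [1/3, 1/3] is the unique solution; w = M z + q = [0, 0]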

Cite as

Michaela Borzechowski, John Fearnley, Spencer Gordon, Rahul Savani, Patrick Schnider, and Simon Weber. Two Choices Are Enough for P-LCPs, USOs, and Colorful Tangents. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 32:1-32:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{borzechowski_et_al:LIPIcs.ICALP.2024.32,
  author =	{Borzechowski, Michaela and Fearnley, John and Gordon, Spencer and Savani, Rahul and Schnider, Patrick and Weber, Simon},
  title =	{{Two Choices Are Enough for P-LCPs, USOs, and Colorful Tangents}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{32:1--32:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.32},
  URN =		{urn:nbn:de:0030-drops-201751},
  doi =		{10.4230/LIPIcs.ICALP.2024.32},
  annote =	{Keywords: P-LCP, Unique Sink Orientation, \alpha-Ham Sandwich, search complexity, TFNP, UEOPL}
}
Document
Current and Future Challenges in Knowledge Representation and Reasoning (Dagstuhl Perspectives Workshop 22282)

Authors: James P. Delgrande, Birte Glimm, Thomas Meyer, Miroslaw Truszczynski, and Frank Wolter

Published in: Dagstuhl Manifestos, Volume 10, Issue 1 (2024)


Abstract
Knowledge Representation and Reasoning is a central, longstanding, and active area of Artificial Intelligence. Over the years it has evolved significantly; more recently it has been challenged and complemented by research in areas such as machine learning and reasoning under uncertainty. In July 2022, a Dagstuhl Perspectives workshop was held on Knowledge Representation and Reasoning. The goal of the workshop was to describe the state of the art in the field, including its relation with other areas, its shortcomings and strengths, together with recommendations for future progress. We developed this manifesto based on the presentations, panels, working groups, and discussions that took place at the Dagstuhl Workshop. It is a declaration of our views on Knowledge Representation: its origins, goals, milestones, and current foci; its relation to other disciplines, especially to Artificial Intelligence; and on its challenges, along with key priorities for the next decade.

Cite as

James P. Delgrande, Birte Glimm, Thomas Meyer, Miroslaw Truszczynski, and Frank Wolter. Current and Future Challenges in Knowledge Representation and Reasoning (Dagstuhl Perspectives Workshop 22282). In Dagstuhl Manifestos, Volume 10, Issue 1, pp. 1-61, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@Article{delgrande_et_al:DagMan.10.1.1,
  author =	{Delgrande, James P. and Glimm, Birte and Meyer, Thomas and Truszczynski, Miroslaw and Wolter, Frank},
  title =	{{Current and Future Challenges in Knowledge Representation and Reasoning (Dagstuhl Perspectives Workshop 22282)}},
  pages =	{1--61},
  journal =	{Dagstuhl Manifestos},
  ISSN =	{2193-2433},
  year =	{2024},
  volume =	{10},
  number =	{1},
  editor =	{Delgrande, James P. and Glimm, Birte and Meyer, Thomas and Truszczynski, Miroslaw and Wolter, Frank},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagMan.10.1.1},
  URN =		{urn:nbn:de:0030-drops-201403},
  doi =		{10.4230/DagMan.10.1.1},
  annote =	{Keywords: Knowledge representation and reasoning, Applications of logics, Declarative representations, Formal logic}
}
Document
Survey
Logics for Conceptual Data Modelling: A Review

Authors: Pablo R. Fillottrani and C. Maria Keet

Published in: Transactions on Graph Data and Knowledge (TGDK), Volume 2, Issue 1 (2024): Special Issue on Trends in Graph Data and Knowledge - Part 2


Abstract
Information modelling for databases and object-oriented information systems avails of conceptual data modelling languages such as EER and UML Class Diagrams. Many attempts exist to add logical rigour to them, for various reasons and with disparate strengths. In this paper we aim to provide a structured overview of the many efforts. We focus on aims, approaches to the formalisation, including key dimensions of choice points, popular logics used, and the main relevant reasoning services. We close with current challenges and research directions.

Cite as

Pablo R. Fillottrani and C. Maria Keet. Logics for Conceptual Data Modelling: A Review. In Special Issue on Trends in Graph Data and Knowledge - Part 2. Transactions on Graph Data and Knowledge (TGDK), Volume 2, Issue 1, pp. 4:1-4:30, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@Article{fillottrani_et_al:TGDK.2.1.4,
  author =	{Fillottrani, Pablo R. and Keet, C. Maria},
  title =	{{Logics for Conceptual Data Modelling: A Review}},
  journal =	{Transactions on Graph Data and Knowledge},
  pages =	{4:1--4:30},
  ISSN =	{2942-7517},
  year =	{2024},
  volume =	{2},
  number =	{1},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/TGDK.2.1.4},
  URN =		{urn:nbn:de:0030-drops-198616},
  doi =		{10.4230/TGDK.2.1.4},
  annote =	{Keywords: Conceptual Data Modelling, EER, UML, Description Logics, OWL}
}
Document
Automata-Based Verification of Relational Properties of Functions over Algebraic Data Structures

Authors: Théo Losekoot, Thomas Genet, and Thomas Jensen

Published in: LIPIcs, Volume 260, 8th International Conference on Formal Structures for Computation and Deduction (FSCD 2023)


Abstract
This paper is concerned with automatically proving properties about the input-output relation of functional programs operating over algebraic data types. Recent results show how to approximate the image of a functional program using a regular tree language. Though expressive, those techniques cannot prove properties relating the input and the output of a function, e.g., proving that the output of a function reversing a list has the same length as the input list. In this paper, we build upon those results and define a procedure to compute or over-approximate such a relation. Instead of representing the image of a function by a regular set of terms, we represent (an approximation of) the input-output relation by a regular set of tuples of terms. Regular languages of tuples of terms are recognized using a tree automaton recognizing convolutions of terms, where a convolution transforms a tuple of terms into a term built on tuples of symbols. Both the program and the properties are transformed into predicates and Constrained Horn Clauses (CHCs). Then, using an Implication Counter Example (ICE) procedure, we infer a model of the clauses, associating a regular relation to each predicate. In this ICE procedure, checking whether a given model satisfies the clauses is undecidable in general. We overcome undecidability by proposing an incomplete but sound inference procedure for such relational regular properties. Though the procedure is incomplete, its implementation performs well on 120 examples: it efficiently proves non-trivial relational properties or finds counter-examples.
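The convolution operation described above can be illustrated directly. The Python sketch below (illustrative only; the paper studies several convolution variants and runs tree automata over the convolved alphabet) converts a pair of first-order terms into a single term over pairs of symbols, padding with a dummy symbol where the two shapes differ. The term encoding and the padding symbol are assumptions made for the example.

PAD = "_"   # padding symbol used when one term has fewer arguments or less depth

def convolution(s, t):
    """Convolution of two first-order terms, each written as a pair
    (symbol, list_of_subterms).  The result is a single term whose symbols are
    pairs of symbols, with PAD filling positions where the two terms differ in
    shape, so that a relation on terms can be recognised by an ordinary tree
    automaton over the paired alphabet."""
    s_sym, s_args = s if s is not None else (PAD, [])
    t_sym, t_args = t if t is not None else (PAD, [])
    arity = max(len(s_args), len(t_args))
    args = [convolution(s_args[i] if i < len(s_args) else None,
                        t_args[i] if i < len(t_args) else None)
            for i in range(arity)]
    return ((s_sym, t_sym), args)

# Example from the "length" relation: convolve the list cons(a, cons(b, nil))
# with the numeral s(s(zero)).
lst = ("cons", [("a", []), ("cons", [("b", []), ("nil", [])])])
num = ("s", [("s", [("zero", [])])])
print(convolution(lst, num))
# root symbol of the result is ('cons', 's'); positions missing on one side are padded with '_'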

Cite as

Théo Losekoot, Thomas Genet, and Thomas Jensen. Automata-Based Verification of Relational Properties of Functions over Algebraic Data Structures. In 8th International Conference on Formal Structures for Computation and Deduction (FSCD 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 260, pp. 7:1-7:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{losekoot_et_al:LIPIcs.FSCD.2023.7,
  author =	{Losekoot, Th\'{e}o and Genet, Thomas and Jensen, Thomas},
  title =	{{Automata-Based Verification of Relational Properties of Functions over Algebraic Data Structures}},
  booktitle =	{8th International Conference on Formal Structures for Computation and Deduction (FSCD 2023)},
  pages =	{7:1--7:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-277-8},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{260},
  editor =	{Gaboardi, Marco and van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2023.7},
  URN =		{urn:nbn:de:0030-drops-179915},
  doi =		{10.4230/LIPIcs.FSCD.2023.7},
  annote =	{Keywords: Formal verification, Tree automata, Constrained Horn Clauses, Model inference, Relational properties, Algebraic datatypes}
}
Document
Invited Talk
An Updated Survey of Bidding Games on Graphs (Invited Talk)

Authors: Guy Avni and Thomas A. Henzinger

Published in: LIPIcs, Volume 241, 47th International Symposium on Mathematical Foundations of Computer Science (MFCS 2022)


Abstract
A graph game is a two-player zero-sum game in which the players move a token throughout a graph to produce an infinite path, which determines the winner or payoff of the game. In bidding games, both players have budgets, and in each turn, we hold an "auction" (bidding) to determine which player moves the token. In this survey, we consider several bidding mechanisms and their effect on the properties of the game. Specifically, bidding games, and in particular bidding games of infinite duration, have an intriguing equivalence with random-turn games in which in each turn, the player who moves is chosen randomly. We summarize how minor changes in the bidding mechanism lead to unexpected differences in the equivalence with random-turn games.
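To give a feel for the random-turn side of the equivalence, the following Python sketch estimates by simulation Player 1's winning probability in a uniform random-turn reachability game on a path; Richman-bidding games are the ones known to correspond to uniform random-turn games. The path graph, the reachability objective, and the Monte-Carlo set-up are assumptions made for the illustration, and the bidding mechanisms themselves are not modelled.

import random

def random_turn_reachability(n, start, trials=100_000):
    """Monte-Carlo estimate of Player 1's winning probability in a random-turn
    reachability game on the path 0 - 1 - ... - n: each turn a fair coin decides
    who moves the token one step; Player 1 wins on reaching n, Player 2 wins on
    reaching 0.  Toy illustration of the random-turn games mentioned above."""
    wins = 0
    for _ in range(trials):
        pos = start
        while 0 < pos < n:
            if random.random() < 0.5:
                pos += 1   # Player 1 moves toward n
            else:
                pos -= 1   # Player 2 moves toward 0
        wins += (pos == n)
    return wins / trials

print(random_turn_reachability(n=4, start=1))   # close to 1/4 (gambler's ruin)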

Cite as

Guy Avni and Thomas A. Henzinger. An Updated Survey of Bidding Games on Graphs (Invited Talk). In 47th International Symposium on Mathematical Foundations of Computer Science (MFCS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 241, pp. 3:1-3:6, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{avni_et_al:LIPIcs.MFCS.2022.3,
  author =	{Avni, Guy and Henzinger, Thomas A.},
  title =	{{An Updated Survey of Bidding Games on Graphs}},
  booktitle =	{47th International Symposium on Mathematical Foundations of Computer Science (MFCS 2022)},
  pages =	{3:1--3:6},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-256-3},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{241},
  editor =	{Szeider, Stefan and Ganian, Robert and Silva, Alexandra},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2022.3},
  URN =		{urn:nbn:de:0030-drops-168017},
  doi =		{10.4230/LIPIcs.MFCS.2022.3},
  annote =	{Keywords: Bidding games, Richman bidding, poorman bidding, mean-payoff, parity}
}
Document
Invited Paper
A Survey of Bidding Games on Graphs (Invited Paper)

Authors: Guy Avni and Thomas A. Henzinger

Published in: LIPIcs, Volume 171, 31st International Conference on Concurrency Theory (CONCUR 2020)


Abstract
A graph game is a two-player zero-sum game in which the players move a token throughout a graph to produce an infinite path, which determines the winner or payoff of the game. In bidding games, both players have budgets, and in each turn, we hold an "auction" (bidding) to determine which player moves the token. In this survey, we consider several bidding mechanisms and study their effect on the properties of the game. Specifically, bidding games, and in particular bidding games of infinite duration, have an intriguing equivalence with random-turn games in which in each turn, the player who moves is chosen randomly. We show how minor changes in the bidding mechanism lead to unexpected differences in the equivalence with random-turn games.

Cite as

Guy Avni and Thomas A. Henzinger. A Survey of Bidding Games on Graphs (Invited Paper). In 31st International Conference on Concurrency Theory (CONCUR 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 171, pp. 2:1-2:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{avni_et_al:LIPIcs.CONCUR.2020.2,
  author =	{Avni, Guy and Henzinger, Thomas A.},
  title =	{{A Survey of Bidding Games on Graphs}},
  booktitle =	{31st International Conference on Concurrency Theory (CONCUR 2020)},
  pages =	{2:1--2:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-160-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{171},
  editor =	{Konnov, Igor and Kov\'{a}cs, Laura},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CONCUR.2020.2},
  URN =		{urn:nbn:de:0030-drops-128147},
  doi =		{10.4230/LIPIcs.CONCUR.2020.2},
  annote =	{Keywords: Bidding games, Richman bidding, poorman bidding, mean-payoff, parity}
}
Document
Leveraging relational autocorrelation with latent group models

Authors: Jennifer Neville and David Jensen

Published in: Dagstuhl Seminar Proceedings, Volume 5051, Probabilistic, Logical and Relational Learning - Towards a Synthesis (2006)


Abstract
The presence of autocorrelation provides strong motivation for using relational techniques for learning and inference. Autocorrelation is a statistical dependency between the values of the same variable on related entities and is a nearly ubiquitous characteristic of relational data sets. Recent research has explored the use of collective inference techniques to exploit this phenomenon. These techniques achieve significant performance gains by modeling observed correlations among class labels of related instances, but the models fail to capture a frequent cause of autocorrelation---the presence of underlying groups that influence the attributes on a set of entities. We propose a latent group model (LGM) for relational data, which discovers and exploits the hidden structures responsible for the observed autocorrelation among class labels. Modeling the latent group structure improves model performance, increases inference efficiency, and enhances our understanding of the datasets. We evaluate performance on three relational classification tasks and show that LGM outperforms models that ignore latent group structure when there is little known information with which to seed inference.
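As a small illustration of the autocorrelation statistic that motivates the model, the Python sketch below computes a Pearson-style correlation of a numeric label across linked pairs of entities. The edge-list encoding, the symmetrization over edge directions, and the toy data are assumptions made for the example, not the exact measure used in the paper.

from statistics import mean

def relational_autocorrelation(edges, label):
    """Pearson-style autocorrelation of a numeric attribute across linked entities:
    the correlation between label(u) and label(v) over all related pairs (u, v),
    counting each undirected edge in both directions so the measure is symmetric.
    Assumes the labels are not all identical (otherwise the variance is zero)."""
    pairs = [(label[u], label[v]) for u, v in edges] + \
            [(label[v], label[u]) for u, v in edges]
    xs = [x for x, _ in pairs]
    ys = [y for _, y in pairs]
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Two tightly linked groups with identical labels inside each group give
# autocorrelation 1.0; mixing labels across the groups drives it toward 0.
edges = [("a", "b"), ("b", "c"), ("d", "e"), ("e", "f")]
label = {"a": 1, "b": 1, "c": 1, "d": 0, "e": 0, "f": 0}
print(relational_autocorrelation(edges, label))   # -> 1.0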

Cite as

Jennifer Neville and David Jensen. Leveraging relational autocorrelation with latent group models. In Probabilistic, Logical and Relational Learning - Towards a Synthesis. Dagstuhl Seminar Proceedings, Volume 5051, pp. 1-14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


BibTeX

@InProceedings{neville_et_al:DagSemProc.05051.10,
  author =	{Neville, Jennifer and Jensen, David},
  title =	{{Leveraging relational autocorrelation with latent group models}},
  booktitle =	{Probabilistic, Logical and Relational Learning - Towards a Synthesis},
  pages =	{1--14},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{5051},
  editor =	{Luc De Raedt and Thomas Dietterich and Lise Getoor and Stephen H. Muggleton},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.05051.10},
  URN =		{urn:nbn:de:0030-drops-4201},
  doi =		{10.4230/DagSemProc.05051.10},
  annote =	{Keywords: Statistical relational learning, probabilistic relational models, latent variable models, autocorrelation, collective inference}
}
Document
04441 Working Group – Towards a Handbook for User-Centred Mobile Application Design

Authors: Susanne Boll, Martin Breunig, Nigel Davies, Christian S. Jensen, Birgitta König-Ries, Rainer Malaka, Florian Matthes, Christoforos Panayiotou, Simonas Saltenis, and Thomas Schwarz

Published in: Dagstuhl Seminar Proceedings, Volume 4441, Mobile Information Management (2005)


Abstract
Why do we have difficulties designing mobile apps? Is there a "Mobile RUP"?

Cite as

Susanne Boll, Martin Breunig, Nigel Davies, Christian S. Jensen, Birgitta König-Ries, Rainer Malaka, Florian Matthes, Christoforos Panayiotou, Simonas Saltenis, and Thomas Schwarz. 04441 Working Group – Towards a Handbook for User-Centred Mobile Application Design. In Mobile Information Management. Dagstuhl Seminar Proceedings, Volume 4441, pp. 1-8, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2005)


BibTeX

@InProceedings{boll_et_al:DagSemProc.04441.8,
  author =	{Boll, Susanne and Breunig, Martin and Davies, Nigel and Jensen, Christian S. and K\"{o}nig-Ries, Birgitta and Malaka, Rainer and Matthes, Florian and Panayiotou, Christoforos and Saltenis, Simonas and Schwarz, Thomas},
  title =	{{04441 Working Group – Towards a Handbook for User-Centred Mobile Application Design}},
  booktitle =	{Mobile Information Management},
  pages =	{1--8},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2005},
  volume =	{4441},
  editor =	{Margaret H. Dunham and Birgitta K\"{o}nig-Ries and Evaggelia Pitoura and Peter Reiher and Can T\"{u}rker},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.04441.8},
  URN =		{urn:nbn:de:0030-drops-1662},
  doi =		{10.4230/DagSemProc.04441.8},
  annote =	{Keywords: User-Centred Mobile Application Design}
}