61 Search Results for "Lang, Richard"


Document
Survey
Resilience in Knowledge Graph Embeddings

Authors: Arnab Sharma, N'Dah Jean Kouagou, and Axel-Cyrille Ngonga Ngomo

Published in: Transactions on Graph Data and Knowledge (TGDK), Volume 3, Issue 2 (2025)


Abstract
In recent years, knowledge graphs have attracted growing interest and found widespread application in domains such as information retrieval, question answering, and recommendation systems. Large-scale knowledge graphs have demonstrated their utility in effectively representing structured knowledge. To further facilitate the application of machine learning techniques, knowledge graph embedding (KGE) models have been developed; such models transform the entities and relationships of a knowledge graph into vectors. However, these embedding models often face challenges related to noise, missing information, distribution shift, and adversarial attacks, which can lead to sub-optimal embeddings and incorrect inferences, thereby negatively impacting downstream applications. While the existing literature has so far focused on adversarial attacks on KGE models, the challenges related to the other critical aspects remain unexplored. In this paper, we first give a unified definition of resilience, encompassing several factors such as generalization, in-distribution generalization, distribution adaptation, and robustness. After formalizing these concepts for machine learning in general, we define them in the context of knowledge graphs. To identify the gaps in existing work on resilience in the context of knowledge graphs, we perform a systematic survey that takes all of the previously mentioned aspects into account. Our survey results show that most existing work focuses on a specific aspect of resilience, namely robustness. After categorizing such works based on their respective aspects of resilience, we discuss the challenges and future research directions.
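
For illustration only (not drawn from the survey): a KGE model assigns each entity and relation a vector and scores candidate triples with those vectors. The minimal sketch below uses a TransE-style translation score; the entity and relation names, the dimension, and the random embeddings are made-up placeholders.

# Minimal TransE-style scoring sketch (illustrative only, not from the paper).
# A triple (head, relation, tail) is considered plausible if head + relation ≈ tail.
import numpy as np

rng = np.random.default_rng(0)
dim = 4  # embedding dimension (hypothetical)

# Hypothetical entity and relation embeddings.
entities = {"Berlin": rng.normal(size=dim), "Germany": rng.normal(size=dim)}
relations = {"capitalOf": rng.normal(size=dim)}

def transe_score(head, relation, tail):
    """Negative L2 distance: higher means the triple is more plausible."""
    return -np.linalg.norm(entities[head] + relations[relation] - entities[tail])

print(transe_score("Berlin", "capitalOf", "Germany"))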

Cite as

Arnab Sharma, N'Dah Jean Kouagou, and Axel-Cyrille Ngonga Ngomo. Resilience in Knowledge Graph Embeddings. In Transactions on Graph Data and Knowledge (TGDK), Volume 3, Issue 2, pp. 1:1-1:38, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@Article{sharma_et_al:TGDK.3.2.1,
  author =	{Sharma, Arnab and Kouagou, N'Dah Jean and Ngomo, Axel-Cyrille Ngonga},
  title =	{{Resilience in Knowledge Graph Embeddings}},
  journal =	{Transactions on Graph Data and Knowledge},
  pages =	{1:1--1:38},
  ISSN =	{2942-7517},
  year =	{2025},
  volume =	{3},
  number =	{2},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/TGDK.3.2.1},
  URN =		{urn:nbn:de:0030-drops-248117},
  doi =		{10.4230/TGDK.3.2.1},
  annote =	{Keywords: Knowledge graphs, Resilience, Robustness}
}
Document
Improved Dominance Filtering for Unions and Minkowski Sums of Pareto Sets

Authors: Konstantinos Karathanasis, Spyros Kontogiannis, and Christos Zaroliagis

Published in: LIPIcs, Volume 351, 33rd Annual European Symposium on Algorithms (ESA 2025)


Abstract
A key task in multi-objective optimization is to compute the Pareto frontier (a.k.a. Pareto subset) P of a given d-dimensional objective space F; that is, the maximal subset P ⊆ F such that every element of P is non-dominated within F (i.e., compared with any other point, it is better in at least one criterion). This process, called dominance-filtering, often involves handling objective spaces derived from either the union or the Minkowski sum of two given partial objective spaces which are themselves Pareto sets, and it constitutes a major bottleneck in several multi-objective optimization techniques. In this work, we introduce three new data structures, ND^{+}-trees, QND^{+}-trees and TND^{+}-trees, which are designed for efficiently indexing non-dominated objective vectors and performing dominance checks. We also devise three new algorithms that efficiently filter out dominated objective vectors from the union or the Minkowski sum of two Pareto sets. An extensive experimental evaluation on both synthetically generated and real-world data sets reveals that our new algorithms outperform state-of-the-art techniques for dominance-filtering of unions and Minkowski sums of Pareto sets, and scale well with respect to the number d ≥ 3 of criteria and the sets' sizes.
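
As background for readers unfamiliar with dominance filtering, the following minimal sketch implements the naive quadratic filter under a minimization convention; it is the baseline that tree-based indexes such as the ND^{+}-trees above are designed to outperform, not the paper's algorithm.

# Naive O(|F|^2) dominance filter for a minimization problem (baseline sketch,
# not the ND^{+}-tree-based algorithms from the paper).

def dominates(a, b):
    """a dominates b if a is no worse in every criterion and strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_filter(points):
    """Return the non-dominated subset (Pareto frontier) of a list of d-dimensional vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

union = [(1, 5), (2, 2), (3, 1), (4, 4)] + [(2, 3), (1, 6)]
print(pareto_filter(union))   # filtering the union of two Pareto sets -> [(1, 5), (2, 2), (3, 1)]

Note that in a union of two Pareto sets, a point can only be dominated by points from the other set, which is one structural property that dedicated filters can exploit.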

Cite as

Konstantinos Karathanasis, Spyros Kontogiannis, and Christos Zaroliagis. Improved Dominance Filtering for Unions and Minkowski Sums of Pareto Sets. In 33rd Annual European Symposium on Algorithms (ESA 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 351, pp. 59:1-59:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{karathanasis_et_al:LIPIcs.ESA.2025.59,
  author =	{Karathanasis, Konstantinos and Kontogiannis, Spyros and Zaroliagis, Christos},
  title =	{{Improved Dominance Filtering for Unions and Minkowski Sums of Pareto Sets}},
  booktitle =	{33rd Annual European Symposium on Algorithms (ESA 2025)},
  pages =	{59:1--59:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-395-9},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{351},
  editor =	{Benoit, Anne and Kaplan, Haim and Wild, Sebastian and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2025.59},
  URN =		{urn:nbn:de:0030-drops-245277},
  doi =		{10.4230/LIPIcs.ESA.2025.59},
  annote =	{Keywords: Multi-Objective Optimization, Multi-Dimensional Data Structures, Pareto Sets, Algorithm Engineering}
}
Document
Hardness of Median and Center in the Ulam Metric

Authors: Nick Fischer, Elazar Goldenberg, Mursalin Habib, and Karthik C. S.

Published in: LIPIcs, Volume 351, 33rd Annual European Symposium on Algorithms (ESA 2025)


Abstract
The classical rank aggregation problem seeks to combine a set X of n permutations into a single representative "consensus" permutation. In this paper, we investigate two fundamental rank aggregation tasks under the well-studied Ulam metric, computing a median permutation (which minimizes the sum of Ulam distances to X) and computing a center permutation (which minimizes the maximum Ulam distance to X), in two settings:
- Continuous setting: the median/center is allowed to be any permutation. It is known that computing a center in the Ulam metric is NP-hard, and we add to this by showing that computing a median is NP-hard as well, via a simple reduction from the Max-Cut problem. While this result may not be unexpected, it had remained elusive until now and confirms a speculation by Chakraborty, Das, and Krauthgamer [SODA '21].
- Discrete setting: the median/center must be a permutation from the input set. We fully resolve the fine-grained complexity of the discrete median and discrete center problems under the Ulam metric, proving that the naive Õ(n² L)-time algorithm (where L is the length of the permutations) is conditionally optimal. This resolves an open problem raised by Abboud, Bateni, Cohen-Addad, Karthik C. S., and Seddighin [APPROX '23]. Our reductions are inspired by the known fine-grained lower bounds for similarity measures, but we face and overcome several new, highly technical challenges.
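
As background (a standard fact, not a contribution of the paper): the Ulam distance between two permutations of length L equals L minus the length of their longest common subsequence, which for permutations can be computed via a longest increasing subsequence. A minimal sketch:

# Ulam distance between two permutations of the same length (background sketch,
# not the reductions from the paper): L minus the longest common subsequence,
# computed here via a longest increasing subsequence in O(L log L).
from bisect import bisect_left

def ulam_distance(x, y):
    pos = {v: i for i, v in enumerate(x)}        # position of each value in x
    seq = [pos[v] for v in y]                    # y rewritten in x-coordinates
    tails = []                                   # patience-sorting piles for LIS
    for s in seq:
        i = bisect_left(tails, s)
        if i == len(tails):
            tails.append(s)
        else:
            tails[i] = s
    return len(x) - len(tails)                   # L - LCS(x, y)

print(ulam_distance([0, 1, 2, 3], [1, 2, 0, 3]))  # 1: one element needs to be moved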

Cite as

Nick Fischer, Elazar Goldenberg, Mursalin Habib, and Karthik C. S.. Hardness of Median and Center in the Ulam Metric. In 33rd Annual European Symposium on Algorithms (ESA 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 351, pp. 111:1-111:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{fischer_et_al:LIPIcs.ESA.2025.111,
  author =	{Fischer, Nick and Goldenberg, Elazar and Habib, Mursalin and Karthik C. S.},
  title =	{{Hardness of Median and Center in the Ulam Metric}},
  booktitle =	{33rd Annual European Symposium on Algorithms (ESA 2025)},
  pages =	{111:1--111:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-395-9},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{351},
  editor =	{Benoit, Anne and Kaplan, Haim and Wild, Sebastian and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2025.111},
  URN =		{urn:nbn:de:0030-drops-245809},
  doi =		{10.4230/LIPIcs.ESA.2025.111},
  annote =	{Keywords: Ulam distance, median, center, rank aggregation, fine-grained complexity}
}
Document
(Multivariate) k-SUM as Barrier to Succinct Computation

Authors: Geri Gokaj, Marvin Künnemann, Sabine Storandt, and Carina Truschel

Published in: LIPIcs, Volume 351, 33rd Annual European Symposium on Algorithms (ESA 2025)


Abstract
How does the time complexity of a problem change when the input is given succinctly rather than explicitly? We study this question for several geometric problems defined on a set X of N points in ℤ^d. As succinct representation, we choose a sumset (or Minkowski sum) representation: instead of receiving X explicitly, we are given sets A, B of n points that define X as A+B = {a+b ∣ a ∈ A, b ∈ B}. We investigate the fine-grained complexity of this succinct version for several Õ(N)-time computable geometric primitives. Remarkably, we can tie their complexity tightly to the complexity of corresponding k-SUM problems. Specifically, we introduce as All-ints 3-SUM(n,n,k) the following multivariate, multi-output variant of 3-SUM: given sets A, B of size n and set C of size k, determine for all c ∈ C whether there are a ∈ A and b ∈ B with a+b = c. We obtain the following results:
1) Succinct closest L_∞-pair requires time N^{1-o(1)} under the 3-SUM hypothesis, while succinct furthest L_∞-pair can be solved in time Õ(n).
2) Succinct bichromatic closest L_∞-pair requires time N^{1-o(1)} iff the 4-SUM hypothesis holds.
3) The following problems are fine-grained equivalent to All-ints 3-SUM(n,n,k): succinct skyline computation in 2D with output size k, and succinct batched orthogonal range search with k given ranges. This establishes conditionally tight Õ(min{nk, N})-time algorithms for these problems.
We obtain further connections with All-ints 3-SUM(n,n,k) for succinctly computing independent sets in unit interval graphs. Thus, (Multivariate) k-SUM problems precisely capture the barrier for enabling sumset-succinct computation for various geometric primitives.
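
For concreteness, All-ints 3-SUM(n,n,k) as defined above admits a simple O(nk)-time baseline by hashing one input set; the sketch below shows only this baseline and makes no attempt at the conditionally optimal Õ(min{nk, N}) bounds discussed in the paper.

# Naive O(n*k) solver for All-ints 3-SUM(n, n, k) (baseline sketch, not from the paper):
# for every c in C, decide whether some a in A and b in B satisfy a + b = c.

def all_ints_3sum(A, B, C):
    b_set = set(B)                                   # expected O(1) membership tests
    return {c: any((c - a) in b_set for a in A) for c in C}

A, B, C = [1, 4, 7], [2, 5], [3, 6, 100]
print(all_ints_3sum(A, B, C))   # {3: True, 6: True, 100: False}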

Cite as

Geri Gokaj, Marvin Künnemann, Sabine Storandt, and Carina Truschel. (Multivariate) k-SUM as Barrier to Succinct Computation. In 33rd Annual European Symposium on Algorithms (ESA 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 351, pp. 42:1-42:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{gokaj_et_al:LIPIcs.ESA.2025.42,
  author =	{Gokaj, Geri and K\"{u}nnemann, Marvin and Storandt, Sabine and Truschel, Carina},
  title =	{{(Multivariate) k-SUM as Barrier to Succinct Computation}},
  booktitle =	{33rd Annual European Symposium on Algorithms (ESA 2025)},
  pages =	{42:1--42:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-395-9},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{351},
  editor =	{Benoit, Anne and Kaplan, Haim and Wild, Sebastian and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2025.42},
  URN =		{urn:nbn:de:0030-drops-245101},
  doi =		{10.4230/LIPIcs.ESA.2025.42},
  annote =	{Keywords: Fine-grained complexity theory, sumsets, additive combinatorics, succinct inputs, computational geometry}
}
Document
A Verified Cost Model for Call-By-Push-Value

Authors: Zhuo Zoey Chen, Johannes Åman Pohjola, and Christine Rizkallah

Published in: LIPIcs, Volume 352, 16th International Conference on Interactive Theorem Proving (ITP 2025)


Abstract
The call-by-push-value λ-calculus allows for syntactically specifying the order of evaluation as part of the term language. Hence, it serves as a unifying language for embedding various evaluation strategies, including call-by-value and call-by-name. Given the impact of call-by-push-value, it is remarkable that its adequacy as a model for computational complexity theory has not yet been studied. In this paper, we show that the call-by-push-value λ-calculus is reasonable for both time and space complexity, meaning that its cost model can encode other reasonable cost models with polynomial overhead in time and constant-factor overhead in space. We achieve this by encoding the call-by-push-value λ-calculus into Turing machines, following a simulation strategy by Forster et al.; for the converse direction, we prove that Levy’s encoding of the call-by-value λ-calculus has reasonable complexity bounds. The main results have been formalised in the HOL4 theorem prover.

Cite as

Zhuo Zoey Chen, Johannes Åman Pohjola, and Christine Rizkallah. A Verified Cost Model for Call-By-Push-Value. In 16th International Conference on Interactive Theorem Proving (ITP 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 352, pp. 7:1-7:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{chen_et_al:LIPIcs.ITP.2025.7,
  author =	{Chen, Zhuo Zoey and \r{A}man Pohjola, Johannes and Rizkallah, Christine},
  title =	{{A Verified Cost Model for Call-By-Push-Value}},
  booktitle =	{16th International Conference on Interactive Theorem Proving (ITP 2025)},
  pages =	{7:1--7:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-396-6},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{352},
  editor =	{Forster, Yannick and Keller, Chantal},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITP.2025.7},
  URN =		{urn:nbn:de:0030-drops-246067},
  doi =		{10.4230/LIPIcs.ITP.2025.7},
  annote =	{Keywords: lambda calculus, formalizations of computational models, computability theory, HOL, call-by-push-value reduction, time and space complexity, abstract machines}
}
Document
Verifying Datalog Reasoning with Lean

Authors: Johannes Tantow, Lukas Gerlach, Stephan Mennicke, and Markus Krötzsch

Published in: LIPIcs, Volume 352, 16th International Conference on Interactive Theorem Proving (ITP 2025)


Abstract
Datalog is an essential logical rule language with many applications, and modern rule engines compute logical consequences for Datalog with high performance and scalability. While Datalog is rather simple and, in principle, explainable by design, such sophisticated implementations and optimizations are hard to verify. We therefore propose a certificate-based approach to validate results of Datalog reasoners in a formally verified checker for Datalog proofs. Using the proof assistant Lean, we implement such a checker and verify its correctness against direct formalizations of the Datalog semantics. We propose two JSON encodings for Datalog proofs: one using the widely supported Datalog proof trees, and one using directed acyclic graphs for succinctness. To evaluate the practical feasibility and performance of our approach, we validate proofs that we obtain by converting derivation traces of an existing Datalog reasoner into our tool-independent format.
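
To make the certificate idea concrete, here is a minimal Python sketch of checking a single node of a Datalog proof tree: the node is accepted if some substitution maps the rule head to the claimed fact and the rule body, atom by atom, to the children's facts. The data representation below is ad hoc and hypothetical; the paper's checker is written and verified in Lean and uses its own JSON proof formats.

# Minimal sketch of a certificate check for one Datalog proof-tree node
# (illustrative only; the paper's checker is implemented and verified in Lean).
# Atoms are (predicate, args); strings starting with "?" are variables.

def match(pattern, fact, subst):
    """Extend the substitution so the pattern atom equals the ground fact, or return None."""
    if pattern[0] != fact[0] or len(pattern[1]) != len(fact[1]):
        return None
    subst = dict(subst)
    for p, f in zip(pattern[1], fact[1]):
        if isinstance(p, str) and p.startswith("?"):
            if subst.setdefault(p, f) != f:
                return None
        elif p != f:
            return None
    return subst

def check_node(fact, rule, child_facts):
    """A node is valid if some substitution maps the rule head to `fact`
    and the rule body, atom by atom, to the children's facts."""
    head, body = rule
    if len(body) != len(child_facts):
        return False
    subst = match(head, fact, {})
    for pattern, child in zip(body, child_facts):
        if subst is None:
            return False
        subst = match(pattern, child, subst)
    return subst is not None

# ancestor(?x, ?z) :- parent(?x, ?y), ancestor(?y, ?z).
rule = (("ancestor", ("?x", "?z")),
        [("parent", ("?x", "?y")), ("ancestor", ("?y", "?z"))])
print(check_node(("ancestor", ("alice", "carol")), rule,
                 [("parent", ("alice", "bob")), ("ancestor", ("bob", "carol"))]))  # True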

Cite as

Johannes Tantow, Lukas Gerlach, Stephan Mennicke, and Markus Krötzsch. Verifying Datalog Reasoning with Lean. In 16th International Conference on Interactive Theorem Proving (ITP 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 352, pp. 36:1-36:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{tantow_et_al:LIPIcs.ITP.2025.36,
  author =	{Tantow, Johannes and Gerlach, Lukas and Mennicke, Stephan and Kr\"{o}tzsch, Markus},
  title =	{{Verifying Datalog Reasoning with Lean}},
  booktitle =	{16th International Conference on Interactive Theorem Proving (ITP 2025)},
  pages =	{36:1--36:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-396-6},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{352},
  editor =	{Forster, Yannick and Keller, Chantal},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITP.2025.36},
  URN =		{urn:nbn:de:0030-drops-246342},
  doi =		{10.4230/LIPIcs.ITP.2025.36},
  annote =	{Keywords: Certifying Algorithms, Datalog, Formal Verification}
}
Document
Barendregt’s Theory of the λ-Calculus, Refreshed and Formalized

Authors: Adrienne Lancelot, Beniamino Accattoli, and Maxime Vemclefs

Published in: LIPIcs, Volume 352, 16th International Conference on Interactive Theorem Proving (ITP 2025)


Abstract
Barendregt’s book on the untyped λ-calculus refines the inconsistent view of β-divergence as a representation of the undefined via the key concept of head reduction. In this paper, we put together recent revisitations of some key theorems laid out in Barendregt’s book, and we formalize them in the Abella proof assistant. Our work provides a compact and refreshed presentation of the core of the book. The formalization faithfully mimics pen-and-paper proofs. Two interesting aspects are the manipulation of contexts for the study of contextual equivalence and a formal alternative to the informal trick at work in Takahashi’s proof of the genericity lemma. As a by-product, we obtain an alternative definition of contextual equivalence that does not mention contexts.

Cite as

Adrienne Lancelot, Beniamino Accattoli, and Maxime Vemclefs. Barendregt’s Theory of the λ-Calculus, Refreshed and Formalized. In 16th International Conference on Interactive Theorem Proving (ITP 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 352, pp. 13:1-13:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{lancelot_et_al:LIPIcs.ITP.2025.13,
  author =	{Lancelot, Adrienne and Accattoli, Beniamino and Vemclefs, Maxime},
  title =	{{Barendregt’s Theory of the \lambda-Calculus, Refreshed and Formalized}},
  booktitle =	{16th International Conference on Interactive Theorem Proving (ITP 2025)},
  pages =	{13:1--13:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-396-6},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{352},
  editor =	{Forster, Yannick and Keller, Chantal},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITP.2025.13},
  URN =		{urn:nbn:de:0030-drops-246114},
  doi =		{10.4230/LIPIcs.ITP.2025.13},
  annote =	{Keywords: lambda-calculus, head reduction, equational theory}
}
Document
Animating MRBNFs: Truly Modular Binding-Aware Datatypes in Isabelle/HOL

Authors: Jan van Brügge, Andrei Popescu, and Dmitriy Traytel

Published in: LIPIcs, Volume 352, 16th International Conference on Interactive Theorem Proving (ITP 2025)


Abstract
Nominal Isabelle provides powerful tools for meta-theoretic reasoning about syntax of logics or programming languages, in which variables are bound. It has been instrumental to major verification successes, such as Gödel’s incompleteness theorems. However, the existing tooling is not compositional. In particular, it does not support nested recursion, linear binding patterns, or infinitely branching syntax. These limitations are fundamental in the way nominal datatypes and functions on them are constructed within Nominal Isabelle. Taking advantage of recent theoretical advancements that overcome these limitations through a modular approach using the concept of map-restricted bounded natural functor (MRBNF), we develop and implement a new definitional package for binding-aware datatypes in Isabelle/HOL, called MrBNF. We describe the journey from the user specification to the end-product types, constants and theorems the tool generates. We validate MrBNF in two formalization case studies that so far were out of reach of nominal approaches: (1) Mazza’s isomorphism between the finitary and the infinitary affine λ-calculus, and (2) the POPLmark 2B challenge, which involves non-free binders for linear pattern matching.

Cite as

Jan van Brügge, Andrei Popescu, and Dmitriy Traytel. Animating MRBNFs: Truly Modular Binding-Aware Datatypes in Isabelle/HOL. In 16th International Conference on Interactive Theorem Proving (ITP 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 352, pp. 11:1-11:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{vanbrugge_et_al:LIPIcs.ITP.2025.11,
  author =	{van Br\"{u}gge, Jan and Popescu, Andrei and Traytel, Dmitriy},
  title =	{{Animating MRBNFs: Truly Modular Binding-Aware Datatypes in Isabelle/HOL}},
  booktitle =	{16th International Conference on Interactive Theorem Proving (ITP 2025)},
  pages =	{11:1--11:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-396-6},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{352},
  editor =	{Forster, Yannick and Keller, Chantal},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITP.2025.11},
  URN =		{urn:nbn:de:0030-drops-246091},
  doi =		{10.4230/LIPIcs.ITP.2025.11},
  annote =	{Keywords: syntax with bindings, datatypes, inductive predicates, Isabelle/HOL}
}
Document
Im-C - A Memory-Safe C Interpreter Providing a Better Learning, Testing, and Debugging Experience

Authors: Masaki Kunii and Ian Piumarta

Published in: OASIcs, Volume 134, Companion Proceedings of the 9th International Conference on the Art, Science, and Engineering of Programming (Programming 2025)


Abstract
While it is a highly flexible and efficient language, C programming can be annoying for beginners who have difficulty finding, debugging, or understanding errors - especially those related to memory manipulation. This paper describes an initial phase of Im-C, a project whose goal is to develop a number of related environments that progress from easy-to-understand "C-with-objects" to a safe environment for both testing and learning to write "real" C programs, bringing more fun to the programming experience. In the current phase, an experimental C interpreter has been developed to provide a foundation for real-time validation of C program behavior. The interpreter identifies errors related to memory manipulation, pointer operations, and array access by storing all values as objects containing meta-data, while preserving the semantics expected by C programmers. We validate the effectiveness of our interpreter by comparing its ability to detect a range of both common and obscure errors against the two most popular and mature tools currently available.
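
The core idea described above, storing every runtime value as an object that carries metadata so that invalid memory operations are reported rather than silently corrupting state, can be sketched as follows. The sketch is written in Python purely for illustration and does not reflect Im-C's actual implementation.

# Illustrative sketch (in Python, not Im-C's implementation) of boxing pointer
# values with metadata so that out-of-bounds accesses are reported.

class CheckedPointer:
    def __init__(self, buffer, index=0):
        self.buffer = buffer          # the underlying array object
        self.index = index            # current offset, kept alongside the value

    def __add__(self, offset):
        return CheckedPointer(self.buffer, self.index + offset)

    def deref(self):
        if not 0 <= self.index < len(self.buffer):
            raise RuntimeError(f"out-of-bounds access at offset {self.index}")
        return self.buffer[self.index]

arr = [10, 20, 30]
p = CheckedPointer(arr)
print((p + 2).deref())               # 30
try:
    (p + 3).deref()                  # one past the end, a classic off-by-one bug
except RuntimeError as err:
    print("caught:", err)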

Cite as

Masaki Kunii and Ian Piumarta. Im-C - A Memory-Safe C Interpreter Providing a Better Learning, Testing, and Debugging Experience. In Companion Proceedings of the 9th International Conference on the Art, Science, and Engineering of Programming (Programming 2025). Open Access Series in Informatics (OASIcs), Volume 134, pp. 12:1-12:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{kunii_et_al:OASIcs.Programming.2025.12,
  author =	{Kunii, Masaki and Piumarta, Ian},
  title =	{{Im-C - A Memory-Safe C Interpreter Providing a Better Learning, Testing, and Debugging Experience}},
  booktitle =	{Companion Proceedings of the 9th International Conference on the Art, Science, and Engineering of Programming (Programming 2025)},
  pages =	{12:1--12:18},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-382-9},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{134},
  editor =	{Edwards, Jonathan and Perera, Roly and Petricek, Tomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.Programming.2025.12},
  URN =		{urn:nbn:de:0030-drops-242968},
  doi =		{10.4230/OASIcs.Programming.2025.12},
  annote =	{Keywords: C programming Language, debugging Tools, memory Safety}
}
Document
Word Structures and Their Automatic Presentations

Authors: Xiaoyang Gong, Bakh Khoussainov, and Yuyang Zhuge

Published in: LIPIcs, Volume 345, 50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025)


Abstract
We study automatic presentations of the structures (ℕ; S), (ℕ; E_S), (ℕ; ≤), and their expansions by a unary predicate U. Here S is the successor function, E_S is the undirected version of S, and ≤ is the natural order. We call these structures word structures. Our goal is three-fold. First, we study the isomorphism problem for automatic word structures by focusing on the following three problems. The first problem asks to design an algorithm that, given an automatic structure A, decides if A is isomorphic to (ℕ; S). The second asks to design an algorithm that, given two automatic presentations of (ℕ; S, U₁) and (ℕ; S, U₂), where U₁ and U₂ are unary predicates, decides if these structures are isomorphic. The third problem investigates if there is an algorithm that, given two automatic presentations of (ℕ; ≤, U₁) and (ℕ; ≤, U₂), decides whether U₁ ∩ U₂ ≠ ∅. We show that these problems are undecidable. Next, we study intrinsic regularity of the function S in the structure Path_ω = (ℕ; E_S). We build an automatic presentation of Path_ω in which S is not regular. This implies that S is not intrinsically regular in Path_ω. For U ⊆ ℕ, let d_U be the function that computes the distances between the consecutive elements of U. We build automatic presentations of (ℕ; ≤, U) where d_U can realise logarithmic, radical, intermediate, and exponential functions.

Cite as

Xiaoyang Gong, Bakh Khoussainov, and Yuyang Zhuge. Word Structures and Their Automatic Presentations. In 50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 345, pp. 51:1-51:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{gong_et_al:LIPIcs.MFCS.2025.51,
  author =	{Gong, Xiaoyang and Khoussainov, Bakh and Zhuge, Yuyang},
  title =	{{Word Structures and Their Automatic Presentations}},
  booktitle =	{50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025)},
  pages =	{51:1--51:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-388-1},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{345},
  editor =	{Gawrychowski, Pawe{\l} and Mazowiecki, Filip and Skrzypczak, Micha{\l}},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2025.51},
  URN =		{urn:nbn:de:0030-drops-241581},
  doi =		{10.4230/LIPIcs.MFCS.2025.51},
  annote =	{Keywords: Automatic structures, the isomorphism problem, decidability, undecidability, regular relations}
}
Document
Regular Model Checking for Systems with Effectively Regular Reachability Relation

Authors: Javier Esparza and Valentin Krasotin

Published in: LIPIcs, Volume 345, 50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025)


Abstract
Regular model checking is a well-established technique for the verification of regular transition systems (RTS): transition systems whose initial configurations and transition relation can be effectively encoded as regular languages. In 2008, To and Libkin studied RTSs in which the reachability relation (the reflexive and transitive closure of the transition relation) is also effectively regular, and showed that the recurrent reachability problem (whether a regular set L of configurations is reached infinitely often) is polynomial in the size of the RTS and the transducer for the reachability relation. We extend the work of To and Libkin by studying the decidability and complexity of verifying almost-sure reachability and recurrent reachability - that is, whether L is reachable or recurrently reachable with probability 1. We then apply our results to the more common case in which only a regular overapproximation of the reachability relation is available. In particular, we extend recent complexity results on verifying safety using regular abstraction frameworks - a technique recently introduced by Czerner, the authors, and Welzel-Mohr - to liveness and almost-sure properties.

Cite as

Javier Esparza and Valentin Krasotin. Regular Model Checking for Systems with Effectively Regular Reachability Relation. In 50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 345, pp. 45:1-45:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{esparza_et_al:LIPIcs.MFCS.2025.45,
  author =	{Esparza, Javier and Krasotin, Valentin},
  title =	{{Regular Model Checking for Systems with Effectively Regular Reachability Relation}},
  booktitle =	{50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025)},
  pages =	{45:1--45:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-388-1},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{345},
  editor =	{Gawrychowski, Pawe{\l} and Mazowiecki, Filip and Skrzypczak, Micha{\l}},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2025.45},
  URN =		{urn:nbn:de:0030-drops-241525},
  doi =		{10.4230/LIPIcs.MFCS.2025.45},
  annote =	{Keywords: Regular model checking, abstraction, inductive invariants}
}
Document
Cutoff Theorems for the Equivalence of Parameterized Quantum Circuits

Authors: Neil J. Ross and Scott Wesley

Published in: LIPIcs, Volume 345, 50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025)


Abstract
Many promising quantum algorithms in economics, medical science, and material science rely on circuits that are parameterized by a large number of angles. To ensure that these algorithms are efficient, these parameterized circuits must be heavily optimized. However, most quantum circuit optimizers are not verified, so this procedure is known to be error-prone. For this reason, there is growing interest in the design of equivalence checking algorithms for parameterized quantum circuits. In this paper, we define a generalized class of parameterized circuits with arbitrary rotations and show that this problem is decidable for cyclotomic gate sets. We propose a cutoff-based procedure which reduces the problem of verifying the equivalence of parameterized quantum circuits to the problem of verifying the equivalence of finitely many parameter-free quantum circuits. Because the number of parameter-free circuits grows exponentially with the number of parameters, we also propose a probabilistic variant of the algorithm for cases when the number of parameters is intractably large. We show that our techniques extend to equivalence modulo global phase, and describe an efficient angle sampling procedure for cyclotomic gate sets.
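
Once a cutoff reduces the question to finitely many parameter-free circuits, each remaining instance is an equivalence check of two concrete unitaries, possibly modulo global phase. The sketch below shows such a check numerically for single-qubit examples; it is illustrative only, since the paper works symbolically over cyclotomic gate sets rather than with floating-point matrices.

# Checking two parameter-free circuits for equivalence up to global phase by
# comparing their unitaries numerically (illustrative sketch, not the paper's
# symbolic procedure).
import numpy as np

def equivalent_up_to_phase(U, V, tol=1e-9):
    """U ~ V iff U = e^{i*phi} V for some phase phi; equivalently |tr(U^dagger V)| = dim."""
    d = U.shape[0]
    return abs(abs(np.trace(U.conj().T @ V)) - d) < tol

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
Z = np.diag([1, -1])
X = np.array([[0, 1], [1, 0]])
print(equivalent_up_to_phase(H @ Z @ H, X))        # True: HZH = X
print(equivalent_up_to_phase(H @ Z @ H, 1j * X))   # True: equal up to global phase i
print(equivalent_up_to_phase(X, Z))                # False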

Cite as

Neil J. Ross and Scott Wesley. Cutoff Theorems for the Equivalence of Parameterized Quantum Circuits. In 50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 345, pp. 85:1-85:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{ross_et_al:LIPIcs.MFCS.2025.85,
  author =	{Ross, Neil J. and Wesley, Scott},
  title =	{{Cutoff Theorems for the Equivalence of Parameterized Quantum Circuits}},
  booktitle =	{50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025)},
  pages =	{85:1--85:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-388-1},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{345},
  editor =	{Gawrychowski, Pawe{\l} and Mazowiecki, Filip and Skrzypczak, Micha{\l}},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2025.85},
  URN =		{urn:nbn:de:0030-drops-241921},
  doi =		{10.4230/LIPIcs.MFCS.2025.85},
  annote =	{Keywords: Quantum Circuits, Parameterized Equivalence Checking}
}
Document
Efficient Matching of Some Fundamental Regular Expressions with Backreferences

Authors: Taisei Nogami and Tachio Terauchi

Published in: LIPIcs, Volume 345, 50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025)


Abstract
Regular expression matching is of practical importance due to its widespread use in real-world applications. In practical use, regular expressions are often used with real-world extensions. Accordingly, the matching problem of regular expressions with real-world extensions has been actively studied in recent years, yielding steady progress. However, backreference, a popular extension supported by most modern programming languages such as Java, Python, JavaScript and others in their standard libraries for string processing, is an exception to this positive trend. In fact, it is known that the matching problem of regular expressions with backreferences (rewbs) is theoretically hard and the existence of an asymptotically fast matching algorithm for arbitrary rewbs seems unlikely. Even among currently known partial solutions, the balance between efficiency and generality remains unsatisfactory. To bridge this gap, we present an efficient matching algorithm for rewbs of the form e_0 (e)_1 e_1 \1 e_2 where e_0, e, e_1, e_2 are pure regular expressions, which are fundamental and frequently used in practical applications. It runs in quadratic time with respect to the input string length, substantially improving the best-known cubic time complexity for these rewbs. Our algorithm combines ideas from both stringology and automata theory in a novel way. We leverage two techniques from automata theory, injection and summarization, to simultaneously examine matches whose backreferenced substrings are either a fixed right-maximal repeat or its extendable prefixes, which are concepts from stringology. By further utilizing a subtle property of extendable prefixes, our algorithm correctly decides the matching problem while achieving the quadratic-time complexity.
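
For rewbs of the shape e_0 (e) e_1 \1 e_2, a brute-force matcher can simply enumerate the captured substring and the position of its second occurrence, checking the pure parts with an ordinary regex engine. The sketch below is this naive baseline (using Python's re for the pure sub-expressions and made-up example expressions), not the paper's quadratic algorithm.

# Naive matcher for rewbs of the shape e0 (e) e1 \1 e2 (brute-force baseline sketch,
# not the paper's quadratic algorithm): enumerate the captured substring and the
# position of its second occurrence, then check the pure parts with Python's re.
import re

def match_rewb(e0, e, e1, e2, s):
    m = len(s)
    for i in range(m + 1):               # capture starts at i
        for j in range(i, m + 1):        # capture is s[i:j]
            w = s[i:j]
            if not (re.fullmatch(e0, s[:i]) and re.fullmatch(e, w)):
                continue
            for p in range(j, m - len(w) + 1):   # second occurrence starts at p
                if (s[p:p + len(w)] == w
                        and re.fullmatch(e1, s[j:p])
                        and re.fullmatch(e2, s[p + len(w):])):
                    return True
    return False

# Hypothetical example with e0 = "a*", e = "b+", e1 = "c*", e2 = "d*".
print(match_rewb("a*", "b+", "c*", "d*", "aabbcbbdd"))   # True: capture "bb"
print(match_rewb("a*", "b+", "c*", "d*", "aabbcbdd"))    # False: "bb" never repeats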

Cite as

Taisei Nogami and Tachio Terauchi. Efficient Matching of Some Fundamental Regular Expressions with Backreferences. In 50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 345, pp. 81:1-81:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{nogami_et_al:LIPIcs.MFCS.2025.81,
  author =	{Nogami, Taisei and Terauchi, Tachio},
  title =	{{Efficient Matching of Some Fundamental Regular Expressions with Backreferences}},
  booktitle =	{50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025)},
  pages =	{81:1--81:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-388-1},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{345},
  editor =	{Gawrychowski, Pawe{\l} and Mazowiecki, Filip and Skrzypczak, Micha{\l}},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2025.81},
  URN =		{urn:nbn:de:0030-drops-241886},
  doi =		{10.4230/LIPIcs.MFCS.2025.81},
  annote =	{Keywords: Regular expressions, Backreferences, Regex matching, NFA simulation, Suffix arrays, Right-maximal repeats}
}
Document
On Expansions of Monadic Second-Order Logic with Dynamical Predicates

Authors: Joris Nieuwveld and Joël Ouaknine

Published in: LIPIcs, Volume 345, 50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025)


Abstract
Expansions of the monadic second-order (MSO) theory of the structure ⟨ℕ;<⟩ have been a fertile and active area of research ever since the publication of the seminal papers of Büchi and Elgot & Rabin on the subject in the 1960s. In the present paper, we establish decidability of the MSO theory of ⟨ℕ;<,P⟩, where P ranges over a large class of unary "dynamical" predicates, i.e., sets of non-negative values assumed by certain integer linear recurrence sequences. One of our key technical tools is the novel concept of (effective) prodisjunctivity, which we expect may also find independent applications further afield.

Cite as

Joris Nieuwveld and Joël Ouaknine. On Expansions of Monadic Second-Order Logic with Dynamical Predicates. In 50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 345, pp. 80:1-80:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{nieuwveld_et_al:LIPIcs.MFCS.2025.80,
  author =	{Nieuwveld, Joris and Ouaknine, Jo\"{e}l},
  title =	{{On Expansions of Monadic Second-Order Logic with Dynamical Predicates}},
  booktitle =	{50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025)},
  pages =	{80:1--80:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-388-1},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{345},
  editor =	{Gawrychowski, Pawe{\l} and Mazowiecki, Filip and Skrzypczak, Micha{\l}},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2025.80},
  URN =		{urn:nbn:de:0030-drops-241879},
  doi =		{10.4230/LIPIcs.MFCS.2025.80},
  annote =	{Keywords: Monadic second-order logic, linear recurrence sequences, decidability, Baker’s theorem}
}
Document
On the Reachability Problem for Two-Dimensional Branching VASS

Authors: Clotilde Bizière, Thibault Hilaire, Jérôme Leroux, and Grégoire Sutre

Published in: LIPIcs, Volume 345, 50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025)


Abstract
Vector addition systems with states (VASS), or equivalently Petri nets, are arguably one of the most studied formalisms for the modeling and analysis of concurrent systems. A central decision problem for VASS is reachability: whether there exists a run from an initial configuration to a final one. This problem has been known to be decidable for over forty years, and its complexity has recently been precisely characterized. Our work concerns the reachability problem for BVASS, a branching generalization of VASS. In dimension one, the exact complexity of this problem is known. In this paper, we prove that the reachability problem for 2-dimensional BVASS is decidable. In fact, we even show that the reachability set admits a computable semilinear presentation. The decidability status of the reachability problem for BVASS remains open in higher dimensions.

Cite as

Clotilde Bizière, Thibault Hilaire, Jérôme Leroux, and Grégoire Sutre. On the Reachability Problem for Two-Dimensional Branching VASS. In 50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 345, pp. 22:1-22:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{biziere_et_al:LIPIcs.MFCS.2025.22,
  author =	{Bizi\`{e}re, Clotilde and Hilaire, Thibault and Leroux, J\'{e}r\^{o}me and Sutre, Gr\'{e}goire},
  title =	{{On the Reachability Problem for Two-Dimensional Branching VASS}},
  booktitle =	{50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025)},
  pages =	{22:1--22:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-388-1},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{345},
  editor =	{Gawrychowski, Pawe{\l} and Mazowiecki, Filip and Skrzypczak, Micha{\l}},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2025.22},
  URN =		{urn:nbn:de:0030-drops-241294},
  doi =		{10.4230/LIPIcs.MFCS.2025.22},
  annote =	{Keywords: Vector addition systems, Reachability problem, Semilinear sets, Verification}
}