LIPIcs, Volume 299

9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)




Event

FSCD 2024, July 10-13, 2024, Tallinn, Estonia

Editor

Jakob Rehof
  • TU Dortmund University, Germany
  • Lamarr Institute for Machine Learning and Artificial Intelligence, Dortmund, Germany

Publication Details

  • published at: 2024-07-05
  • Publisher: Schloss Dagstuhl – Leibniz-Zentrum für Informatik
  • ISBN: 978-3-95977-323-2
  • DBLP: db/conf/rta/fscd2024

Documents

Document
Complete Volume
LIPIcs, Volume 299, FSCD 2024, Complete Volume

Authors: Jakob Rehof


Abstract
LIPIcs, Volume 299, FSCD 2024, Complete Volume

Cite as

9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 1-692, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@Proceedings{rehof:LIPIcs.FSCD.2024,
  title =	{{LIPIcs, Volume 299, FSCD 2024, Complete Volume}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{1--692},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024},
  URN =		{urn:nbn:de:0030-drops-203287},
  doi =		{10.4230/LIPIcs.FSCD.2024},
  annote =	{Keywords: LIPIcs, Volume 299, FSCD 2024, Complete Volume}
}
Document
Front Matter
Front Matter, Table of Contents, Preface, Conference Organization

Authors: Jakob Rehof


Abstract
Front Matter, Table of Contents, Preface, Conference Organization

Cite as

9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 0:i-0:xviii, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{rehof:LIPIcs.FSCD.2024.0,
  author =	{Rehof, Jakob},
  title =	{{Front Matter, Table of Contents, Preface, Conference Organization}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{0:i--0:xviii},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.0},
  URN =		{urn:nbn:de:0030-drops-203292},
  doi =		{10.4230/LIPIcs.FSCD.2024.0},
  annote =	{Keywords: Front Matter, Table of Contents, Preface, Conference Organization}
}
Document
Invited Talk
Meaningfulness and Genericity in a Subsuming Framework (Invited Talk)

Authors: Delia Kesner, Victor Arrial, and Giulio Guerrieri


Abstract
This paper studies the notion of meaningfulness for a unifying framework called dBang-calculus, which subsumes both call-by-name (dCBN) and call-by-value (dCBV). We first define meaningfulness in dBang and then characterize it by means of typability and inhabitation in an associated non-idempotent intersection type system previously appearing in the literature. We validate the proposed notion of meaningfulness by showing two properties: (1) consistency of the smallest theory, called ℋ, equating all meaningless terms, and (2) genericity, stating that meaningless subterms have no bearing on the significance of meaningful terms. The theory ℋ is also shown to have a unique consistent and maximal extension ℋ*, which coincides with a well-known notion of observational equivalence. Last but not least, we show that the notions of meaningfulness and genericity in the literature for dCBN and dCBV are subsumed by the corresponding ones proposed here for the dBang-calculus.

Cite as

Delia Kesner, Victor Arrial, and Giulio Guerrieri. Meaningfulness and Genericity in a Subsuming Framework (Invited Talk). In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 1:1-1:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{kesner_et_al:LIPIcs.FSCD.2024.1,
  author =	{Kesner, Delia and Arrial, Victor and Guerrieri, Giulio},
  title =	{{Meaningfulness and Genericity in a Subsuming Framework}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{1:1--1:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.1},
  URN =		{urn:nbn:de:0030-drops-203305},
  doi =		{10.4230/LIPIcs.FSCD.2024.1},
  annote =	{Keywords: Lambda calculus, Solvability, Meaningfulness, Inhabitation, Genericity}
}
Document
Invited Talk
Abstraction-Based Decision Making for Statistical Properties (Invited Talk)

Authors: Filip Cano, Thomas A. Henzinger, Bettina Könighofer, Konstantin Kueffner, and Kaushik Mallik


Abstract
Sequential decision-making in probabilistic environments is a fundamental problem with many applications in AI and economics. In this paper, we present an algorithm for synthesizing sequential decision-making agents that optimize statistical properties such as maximum and average response times. In the general setting of sequential decision-making, the environment is modeled as a random process that generates inputs. The agent responds to each input, aiming to maximize rewards and minimize costs within a specified time horizon. The corresponding synthesis problem is known to be PSPACE-hard. We consider the special case where the input distribution, reward, and cost depend on input-output statistics specified by counter automata. For such problems, this paper presents the first PTIME synthesis algorithms. We introduce the notion of statistical abstraction, which clusters statistically indistinguishable input-output sequences into equivalence classes. This abstraction allows for a dynamic programming algorithm whose complexity grows polynomially with the considered horizon, making the statistical case exponentially more efficient than the general case. We evaluate our algorithm on three different application scenarios of a client-server protocol, where multiple clients compete via bidding to gain access to the service offered by the server. The synthesized policies optimize profit while guaranteeing that none of the server’s clients is disproportionately starved of the service.
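
As a rough illustration of why fixing a horizon enables dynamic programming, here is a generic finite-horizon value iteration in Haskell. The state space, trans and reward below are invented placeholders; the paper's contribution is the statistical abstraction that keeps such a state space small (equivalence classes of input-output statistics), not this textbook recursion.

import qualified Data.Map as M

type State = Int
type Action = Int

states :: [State]
states = [0 .. 4]

actions :: [Action]
actions = [0, 1]

-- trans s a: a finite distribution over successor states (weights sum to 1).
trans :: State -> Action -> [(State, Double)]
trans s a = [((s + a + 1) `mod` 5, 0.7), (s, 0.3)]

-- Immediate reward for taking action a in state s.
reward :: State -> Action -> Double
reward s a = fromIntegral s - fromIntegral a

-- Optimal expected value with n steps to go, for every state; the cost
-- grows polynomially with the horizon, as in the abstract.
value :: Int -> M.Map State Double
value 0 = M.fromList [(s, 0) | s <- states]
value n =
  let v = value (n - 1)
      q s a = reward s a + sum [p * (v M.! s') | (s', p) <- trans s a]
  in M.fromList [(s, maximum [q s a | a <- actions]) | s <- states]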

Cite as

Filip Cano, Thomas A. Henzinger, Bettina Könighofer, Konstantin Kueffner, and Kaushik Mallik. Abstraction-Based Decision Making for Statistical Properties (Invited Talk). In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 2:1-2:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{cano_et_al:LIPIcs.FSCD.2024.2,
  author =	{Cano, Filip and Henzinger, Thomas A. and K\"{o}nighofer, Bettina and Kueffner, Konstantin and Mallik, Kaushik},
  title =	{{Abstraction-Based Decision Making for Statistical Properties}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{2:1--2:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.2},
  URN =		{urn:nbn:de:0030-drops-203310},
  doi =		{10.4230/LIPIcs.FSCD.2024.2},
  annote =	{Keywords: Abstract interpretation, Sequential decision making, Counter machines}
}
Document
Invited Talk
Lean: Past, Present, and Future (Invited Talk)

Authors: Sebastian Ullrich


Abstract
The Lean programming language and theorem prover project is celebrating its tenth birthday this year, having been started by Leonardo de Moura at Microsoft Research and first released as Lean 0.1 in 2014. In this invited talk, I will review Lean’s history and unique features and discuss our roadmap for its bright future. Corresponding to its major versions ranging from Lean 0.1 to the current version of Lean 4, the focus of the Lean project has evolved over the years. Initially intended as a platform for developing white-box automation, in contrast to the usual black-box approach of stand-alone SMT solvers [de Moura and Passmore, 2013], the system gathered more conventional features of dependently-typed interactive theorem provers as well as an initial crowd of interested mathematicians and computer scientists with its first official release as Lean 2 in 2015 [Leonardo de Moura et al., 2015]. Lean 3 in 2017 introduced user-extensible automation by extending Lean from a specification language to an accessible metaprogramming language [Gabriel Ebner et al., 2017], further accelerating the growth of its mathematical library, which was spun out into the separate Mathlib project [The mathlib Community, 2020]. Spurred by the success but also the limitations of this extensibility, we started work on the next version, Lean 4, in 2018 [Leonardo de Moura and Sebastian Ullrich, 2021] with the goal of turning Lean into a general-purpose programming language that would allow us to reimplement Lean in Lean itself and thereby make many more aspects of the system user-extensible, in a more efficient manner [Sebastian Ullrich, 2023]. This rework of Lean’s implementation, the largest to date, was completed in 2023 with the official release of Lean 4.0.0, further supporting Mathlib’s growth to more than 1.5 million lines of code at the time of writing as well as improving support for many other applications such as software verification. In 2023, Lean also saw its largest organizational change when Leo and I created the Lean Focused Research Organization (FRO) to bundle and support development of Lean in a dedicated organization for the first time. Thanks to gracious support from philanthropic sponsors, an unprecedented number of currently twelve people now work on the evolution of Lean at the Lean FRO. And there is much left to do: with our new team size, we can now support development on much more than only core features, such as documentation, a robust standard library, and user interfaces and experience, as well as a return to the original topic of advanced proof automation. The Lean FRO is committed to ensuring and extending Lean’s applicability in education, research, and industry and to leading it into the next decade of Lean development and beyond.

Cite as

Sebastian Ullrich. Lean: Past, Present, and Future (Invited Talk). In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 3:1-3:2, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{ullrich:LIPIcs.FSCD.2024.3,
  author =	{Ullrich, Sebastian},
  title =	{{Lean: Past, Present, and Future}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{3:1--3:2},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.3},
  URN =		{urn:nbn:de:0030-drops-203328},
  doi =		{10.4230/LIPIcs.FSCD.2024.3},
  annote =	{Keywords: Lean, interactive theorem proving, focused research organization, history}
}
Document
Univalent Enriched Categories and the Enriched Rezk Completion

Authors: Niels van der Weide


Abstract
Enriched categories are categories whose sets of morphisms are enriched with extra structure. Such categories play a prominent role in the study of higher categories, homotopy theory, and the semantics of programming languages. In this paper, we study univalent enriched categories. We prove that all essentially surjective and fully faithful functors between univalent enriched categories are equivalences, and we show that every enriched category admits a Rezk completion. Finally, we use the Rezk completion for enriched categories to construct univalent enriched Kleisli categories.

Cite as

Niels van der Weide. Univalent Enriched Categories and the Enriched Rezk Completion. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 4:1-4:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{vanderweide:LIPIcs.FSCD.2024.4,
  author =	{van der Weide, Niels},
  title =	{{Univalent Enriched Categories and the Enriched Rezk Completion}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{4:1--4:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.4},
  URN =		{urn:nbn:de:0030-drops-203337},
  doi =		{10.4230/LIPIcs.FSCD.2024.4},
  annote =	{Keywords: enriched categories, univalent categories, homotopy type theory, univalent foundations, Rezk completion}
}
Document
The Flower Calculus

Authors: Pablo Donato


Abstract
We introduce the flower calculus, a deep inference proof system for intuitionistic first-order logic inspired by Peirce’s existential graphs. It works as a rewriting system over inductive objects called "flowers", which enjoy both a graphical interpretation as topological diagrams and a textual presentation as nested sequents akin to coherent formulas. Importantly, the calculus dispenses completely with the traditional notion of symbolic connective, operating solely on nested flowers containing atomic predicates. We prove both the soundness of the full calculus and the completeness of an analytic fragment with respect to Kripke semantics. To our knowledge, this provides the first analyticity result for a proof system based on existential graphs, adapting semantic cut-elimination techniques to a deep inference setting. Furthermore, the kernel of rules targeted by completeness is fully invertible, a desirable property for both automated and interactive proof search.

Cite as

Pablo Donato. The Flower Calculus. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 5:1-5:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{donato:LIPIcs.FSCD.2024.5,
  author =	{Donato, Pablo},
  title =	{{The Flower Calculus}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{5:1--5:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.5},
  URN =		{urn:nbn:de:0030-drops-203343},
  doi =		{10.4230/LIPIcs.FSCD.2024.5},
  annote =	{Keywords: deep inference, graphical calculi, existential graphs, intuitionistic logic, Kripke semantics, cut-elimination}
}
Document
Delooping Generated Groups in Homotopy Type Theory

Authors: Camil Champin, Samuel Mimram, and Émile Oleon


Abstract
Homotopy type theory is a logical setting based on Martin-Löf type theory in which one can perform geometric constructions and proofs in a synthetic way. Namely, types can be interpreted as spaces (up to continuous deformation) and proofs as homotopy invariant constructions. In this context, loop spaces of pointed connected groupoids provide a natural representation of groups, and any group can be obtained as the loop space of such a type, which is then called a delooping of the group. There are two main methods to construct the delooping of an arbitrary group G. The first one consists in describing it as a pointed higher inductive type, whereas the second one consists in taking the connected component of the principal G-torsor in the type of sets equipped with an action of G. We show here that, when a presentation is known for the group, simpler variants of those constructions can be used to build deloopings. The resulting types are more amenable to computations and lead to simpler meta-theoretic reasoning. We also investigate, in this context, an abstract construction for the Cayley graph of a generated group and show that it encodes the relations of the group. Most of the developments performed in the article have been formalized using the cubical version of the Agda proof assistant.

Cite as

Camil Champin, Samuel Mimram, and Émile Oleon. Delooping Generated Groups in Homotopy Type Theory. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 6:1-6:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{champin_et_al:LIPIcs.FSCD.2024.6,
  author =	{Champin, Camil and Mimram, Samuel and Oleon, \'{E}mile},
  title =	{{Delooping Generated Groups in Homotopy Type Theory}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{6:1--6:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.6},
  URN =		{urn:nbn:de:0030-drops-203356},
  doi =		{10.4230/LIPIcs.FSCD.2024.6},
  annote =	{Keywords: homotopy type theory, delooping, group, generator, Cayley graph}
}
Document
Machine-Checked Categorical Diagrammatic Reasoning

Authors: Benoît Guillemet, Assia Mahboubi, and Matthieu Piquerez


Abstract
This paper describes a formal proof library, developed using the Coq proof assistant, designed to assist users in writing correct diagrammatic proofs for 1-categories. The library provides a deep-embedded, domain-specific formal language featuring dedicated proof commands to automate the synthesis and verification of the technical parts often elided in the literature.

Cite as

Benoît Guillemet, Assia Mahboubi, and Matthieu Piquerez. Machine-Checked Categorical Diagrammatic Reasoning. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 7:1-7:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{guillemet_et_al:LIPIcs.FSCD.2024.7,
  author =	{Guillemet, Beno\^{i}t and Mahboubi, Assia and Piquerez, Matthieu},
  title =	{{Machine-Checked Categorical Diagrammatic Reasoning}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{7:1--7:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.7},
  URN =		{urn:nbn:de:0030-drops-203363},
  doi =		{10.4230/LIPIcs.FSCD.2024.7},
  annote =	{Keywords: Interactive theorem proving, categories, diagrams, formal proof automation}
}
Document
Mechanized Subject Expansion in Uniform Intersection Types for Perpetual Reductions

Authors: Andrej Dudenhefner and Daniele Pautasso


Abstract
We provide a new, purely syntactical proof of strong normalization for the simply typed λ-calculus. The result relies on a novel proof of the equivalence between typability in the simple type system and typability in the uniform intersection type system (a restriction of the non-idempotent intersection type system). For formal verification, the equivalence is mechanized using the Coq proof assistant. In the present work, strong normalization of a given simply typed term M is shown in four steps. First, M is reduced to a normal form N via a suitable reduction strategy with a decreasing measure. Second, a uniform intersection type for the normal form N is inferred. Third, a uniform intersection type for M is constructed iteratively via subject expansion. Fourth, strong normalization of M is shown by induction on the size of the type derivation. A supplementary contribution is a family of perpetual reduction strategies, i.e., strategies that preserve infinite reduction paths. This family allows for subject expansion in the intersection type systems of interest, and contains a reduction strategy with a decreasing measure in the simple type system. A notable member of this family is Barendregt’s F_∞ reduction strategy.

Cite as

Andrej Dudenhefner and Daniele Pautasso. Mechanized Subject Expansion in Uniform Intersection Types for Perpetual Reductions. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 8:1-8:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{dudenhefner_et_al:LIPIcs.FSCD.2024.8,
  author =	{Dudenhefner, Andrej and Pautasso, Daniele},
  title =	{{Mechanized Subject Expansion in Uniform Intersection Types for Perpetual Reductions}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{8:1--8:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.8},
  URN =		{urn:nbn:de:0030-drops-203371},
  doi =		{10.4230/LIPIcs.FSCD.2024.8},
  annote =	{Keywords: lambda-calculus, simple types, intersection types, strong normalization, mechanization, perpetual reductions}
}
Document
Laplace Distributors and Laplace Transformations for Differential Categories

Authors: Marie Kerjean and Jean-Simon Pacaud Lemay


Abstract
In a differential category and in Differential Linear Logic, the exponential conjunction ! admits structural maps, characterizing quantitative operations, and symmetric co-structural maps, characterizing differentiation. In this paper, we introduce the notion of a Laplace distributor, an extra structural map which distributes the linear negation operation (_)^∗ over ! and transforms the co-structural rules into the structural rules. Laplace distributors are directly inspired by the well-known Laplace transform, which is all-important in numerical analysis. In the star-autonomous setting, a Laplace distributor induces a natural transformation from ! to the exponential disjunction ?, which we then call a Laplace transformation. Semantically, we show that Laplace distributors correspond precisely to the notion of a generalized exponential function e^x on the monoidal unit. We also show that many well-known and important examples admit a Laplace distributor/transformation, including (weighted) relations, finiteness spaces, Köthe spaces, and convenient vector spaces.
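
For orientation, the classical transform that the terminology alludes to sends a function f on [0, ∞) to

  ℒ{f}(s) = ∫_0^∞ e^{-st} f(t) dt,

and its characteristic interplay with differentiation, ℒ{f'}(s) = s · ℒ{f}(s) - f(0), is the kind of structure that the categorical axiomatization abstracts.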

Cite as

Marie Kerjean and Jean-Simon Pacaud Lemay. Laplace Distributors and Laplace Transformations for Differential Categories. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 9:1-9:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{kerjean_et_al:LIPIcs.FSCD.2024.9,
  author =	{Kerjean, Marie and Lemay, Jean-Simon Pacaud},
  title =	{{Laplace Distributors and Laplace Transformations for Differential Categories}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{9:1--9:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.9},
  URN =		{urn:nbn:de:0030-drops-203382},
  doi =		{10.4230/LIPIcs.FSCD.2024.9},
  annote =	{Keywords: Differential Categories, Differential Linear Logic, Laplace Distributor, Laplace Transformation, Exponential Function}
}
Document
Second-Order Generalised Algebraic Theories: Signatures and First-Order Semantics

Authors: Ambrus Kaposi and Szumi Xie


Abstract
Programming languages can be defined from the concrete to the abstract by abstract syntax trees, well-scoped syntax, well-typed (intrinsic) syntax, algebraic syntax (well-typed syntax quotiented by conversion). Another aspect is the representation of binding structure, for which nominal approaches, De Bruijn indices/levels and higher-order abstract syntax (HOAS) are available. In HOAS, binders are given by the function space of an internal language of presheaves. In this paper, we show how to combine the algebraic approach with the HOAS approach: following Uemura, we define languages as second-order generalised algebraic theories (SOGATs). Through a series of examples we show that non-substructural languages can be naturally defined as SOGATs. We give a formal definition of SOGAT signatures (using the syntax of a particular SOGAT) and define two translations from SOGAT signatures to GAT signatures (signatures for quotient inductive-inductive types), based on parallel and single substitutions, respectively.
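
The HOAS idea mentioned above, binders represented as function spaces, can be seen in a textbook Haskell fragment, with shallow Haskell functions standing in for the presheaf function space of the paper; Tm and whnf are illustrative names, not the paper's formalism.

data Tm = App Tm Tm | Lam (Tm -> Tm)

-- Weak head reduction: since binders are functions, beta-reduction is
-- just Haskell application, with no substitution machinery.
whnf :: Tm -> Tm
whnf (App f a) = case whnf f of
  Lam body -> whnf (body a)
  f'       -> App f' a
whnf t = t

-- (\x. x) (\y. y) reduces to \y. y.
example :: Tm
example = whnf (App (Lam (\x -> x)) (Lam (\y -> y)))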

Cite as

Ambrus Kaposi and Szumi Xie. Second-Order Generalised Algebraic Theories: Signatures and First-Order Semantics. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 10:1-10:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{kaposi_et_al:LIPIcs.FSCD.2024.10,
  author =	{Kaposi, Ambrus and Xie, Szumi},
  title =	{{Second-Order Generalised Algebraic Theories: Signatures and First-Order Semantics}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{10:1--10:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.10},
  URN =		{urn:nbn:de:0030-drops-203396},
  doi =		{10.4230/LIPIcs.FSCD.2024.10},
  annote =	{Keywords: Type theory, universal algebra, inductive types, quotient inductive types, higher-order abstract syntax, logical framework}
}
Document
Optimizing a Non-Deterministic Abstract Machine with Environments

Authors: Małgorzata Biernacka, Dariusz Biernacki, Sergueï Lenglet, and Alan Schmitt


Abstract
The non-deterministic abstract machine (NDAM) is a recent implementation model for programming languages in which one must choose among several redexes at each reduction step, as in process calculi. These machines can be derived from a zipper semantics, a mix between structural operational semantics and context-based reduction semantics. Such a machine has also been generated for the λ-calculus without a fixed reduction strategy, i.e., with the full non-deterministic β-reduction. In that machine, substitution is an external operation that replaces all the occurrences of a variable at once. Implementing substitution with environments is lower-level and more efficient, as variables are replaced only when needed. In this paper, we define an NDAM with environments for the λ-calculus without a fixed reduction strategy. We also introduce other optimizations, including a form of refocusing, and we show that we can restrict our optimized NDAM to recover some of the usual λ-calculus machines, e.g., the Krivine Abstract Machine. Most of the improvements we propose in this work could be applied to other NDAMs as well.
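
For reference, the Krivine Abstract Machine mentioned at the end of the abstract is easy to sketch in Haskell with de Bruijn indices and environments. This is the classical deterministic machine, shown only as a point of comparison, not the non-deterministic machine of the paper.

data Tm = Var Int | Lam Tm | App Tm Tm

data Closure = Closure Tm Env
type Env   = [Closure]   -- one closure per de Bruijn index in scope
type Stack = [Closure]   -- pending arguments

-- One transition of the machine; assumes the initial term is closed.
step :: (Closure, Stack) -> Maybe (Closure, Stack)
step (Closure (Var n) env, s)     = Just (env !! n, s)
step (Closure (App t u) env, s)   = Just (Closure t env, Closure u env : s)
step (Closure (Lam t) env, c : s) = Just (Closure t (c : env), s)
step _                            = Nothing   -- Lam with no argument: final

-- Iterate to the final state (weak head normal form).
run :: Tm -> (Closure, Stack)
run t = go (Closure t [], [])
  where go st = maybe st go (step st)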

Cite as

Małgorzata Biernacka, Dariusz Biernacki, Sergueï Lenglet, and Alan Schmitt. Optimizing a Non-Deterministic Abstract Machine with Environments. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 11:1-11:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{biernacka_et_al:LIPIcs.FSCD.2024.11,
  author =	{Biernacka, Ma{\l}gorzata and Biernacki, Dariusz and Lenglet, Sergue\"{i} and Schmitt, Alan},
  title =	{{Optimizing a Non-Deterministic Abstract Machine with Environments}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{11:1--11:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.11},
  URN =		{urn:nbn:de:0030-drops-203409},
  doi =		{10.4230/LIPIcs.FSCD.2024.11},
  annote =	{Keywords: Abstract machine, Explicit substitutions, Refocusing}
}
Document
A Linear Type System for L^p-Metric Sensitivity Analysis

Authors: Victor Sannier and Patrick Baillot


Abstract
When working in optimisation or privacy protection, one may need to estimate the sensitivity of computer programs, i.e., the maximum multiplicative increase in the distance between two inputs and the corresponding two outputs. In particular, differential privacy is a rigorous and widely used notion of privacy that is closely related to sensitivity. Several type systems for sensitivity and differential privacy based on linear logic have been proposed in the literature, starting with the functional language Fuzz. However, they are either limited to certain metrics (L¹ and L^∞), and thus to the associated privacy mechanisms, or they rely on a complex notion of type contexts that does not interact well with operational semantics. We therefore propose a graded linear type system - inspired by Bunched Fuzz [wunder et al., 2023] - called Plurimetric Fuzz that handles L^p vector metrics (for 1 ≤ p ≤ +∞), uses standard type contexts, gives reasonable bounds on sensitivity, and has good metatheoretical properties. We also provide a denotational semantics in terms of metric complete partial orders, and translation mappings from and to Fuzz.
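
The vector metrics in question are the usual L^p distances; a small Haskell helper makes the sensitivity statement concrete (lpDist is our illustrative name, not part of Fuzz or Plurimetric Fuzz). With p = 1 this is the Manhattan distance of the original Fuzz setting, and an infinite p recovers L^∞.

-- L^p distance between two vectors of equal length, 1 <= p <= infinity
-- (pass an infinite p, e.g. 1/0, for the L^infinity metric). A function
-- f is c-sensitive for this metric when
--   lpDist p (f x) (f y) <= c * lpDist p x y   for all x, y.
lpDist :: Double -> [Double] -> [Double] -> Double
lpDist p xs ys
  | isInfinite p = maximum (0 : diffs)
  | otherwise    = sum (map (** p) diffs) ** (1 / p)
  where diffs = map abs (zipWith (-) xs ys)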

Cite as

Victor Sannier and Patrick Baillot. A Linear Type System for L^p-Metric Sensitivity Analysis. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 12:1-12:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{sannier_et_al:LIPIcs.FSCD.2024.12,
  author =	{Sannier, Victor and Baillot, Patrick},
  title =	{{A Linear Type System for L^p-Metric Sensitivity Analysis}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{12:1--12:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.12},
  URN =		{urn:nbn:de:0030-drops-203412},
  doi =		{10.4230/LIPIcs.FSCD.2024.12},
  annote =	{Keywords: type system, linear logic, sensitivity, vector metrics, differential privacy, lambda-calculus, functional programming, denotational semantics}
}
Document
Simulating Dependency Pairs by Semantic Labeling

Authors: Teppei Saito and Nao Hirokawa


Abstract
We show that termination proofs by a version of the dependency pair method can be simulated by semantic labeling plus multiset path orders. By incorporating a flattening technique into multiset path orders, the simulation result can be extended to the dependency pair method for relative termination, introduced by Iborra et al. This result allows us to improve the applicability of their dependency pair method.
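
As background, the dependency pairs being simulated are computed from a rewrite system in a standard way: for every rule l → r, every subterm of r whose root is a defined symbol yields a pair. A small Haskell sketch of this textbook construction follows; the minus example is a classic from the dependency-pair literature, not taken from the paper, and in practice the root symbols of the pairs would be marked with #.

import Data.List (nub)

data Term = Var String | Fun String [Term] deriving (Eq, Show)
type Rule = (Term, Term)

root :: Term -> Maybe String
root (Fun f _) = Just f
root _         = Nothing

subterms :: Term -> [Term]
subterms t@(Var _)    = [t]
subterms t@(Fun _ ts) = t : concatMap subterms ts

-- Defined symbols: root symbols of left-hand sides.
defined :: [Rule] -> [String]
defined rs = nub [f | (l, _) <- rs, Just f <- [root l]]

-- Dependency pairs: (l, t) for each subterm t of r with defined root.
depPairs :: [Rule] -> [Rule]
depPairs rs =
  [ (l, t)
  | (l, r) <- rs
  , t <- subterms r
  , Just f <- [root t]
  , f `elem` defined rs ]

-- Example rules: minus(x, 0) -> x and minus(s(x), s(y)) -> minus(x, y),
-- whose only dependency pair is minus(s(x), s(y)) -> minus(x, y).
minusRules :: [Rule]
minusRules =
  [ (Fun "minus" [Var "x", Fun "0" []], Var "x")
  , ( Fun "minus" [Fun "s" [Var "x"], Fun "s" [Var "y"]]
    , Fun "minus" [Var "x", Var "y"] ) ]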

Cite as

Teppei Saito and Nao Hirokawa. Simulating Dependency Pairs by Semantic Labeling. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 13:1-13:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{saito_et_al:LIPIcs.FSCD.2024.13,
  author =	{Saito, Teppei and Hirokawa, Nao},
  title =	{{Simulating Dependency Pairs by Semantic Labeling}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{13:1--13:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.13},
  URN =		{urn:nbn:de:0030-drops-203423},
  doi =		{10.4230/LIPIcs.FSCD.2024.13},
  annote =	{Keywords: Term rewriting, Relative termination, Semantic labeling, Dependency pairs}
}
Document
Two-Dimensional Kripke Semantics I: Presheaves

Authors: G. A. Kavvos


Abstract
The study of modal logic has witnessed tremendous development following the introduction of Kripke semantics. However, recent developments in programming languages and type theory have led to a second way of studying modalities, namely through their categorical semantics. We show how the two correspond.

Cite as

G. A. Kavvos. Two-Dimensional Kripke Semantics I: Presheaves. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 14:1-14:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{kavvos:LIPIcs.FSCD.2024.14,
  author =	{Kavvos, G. A.},
  title =	{{Two-Dimensional Kripke Semantics I: Presheaves}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{14:1--14:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.14},
  URN =		{urn:nbn:de:0030-drops-203438},
  doi =		{10.4230/LIPIcs.FSCD.2024.14},
  annote =	{Keywords: modal logic, categorical semantics, Kripke semantics, duality, open maps}
}
Document
Adjoint Natural Deduction

Authors: Junyoung Jang, Sophia Roshal, Frank Pfenning, and Brigitte Pientka


Abstract
Adjoint logic is a general approach to combining multiple logics with different structural properties, including linear, affine, strict, and (ordinary) intuitionistic logics, where each proposition has an intrinsic mode of truth. It has been defined in the form of a sequent calculus because the central concept of independence is most clearly understood in this form, and because it permits a proof of cut elimination following standard techniques. In this paper we present a natural deduction formulation of adjoint logic and show how it is related to the sequent calculus. As a consequence, every provable proposition has a verification (sometimes called a long normal form). We also give a computational interpretation of adjoint logic in the form of a functional language and prove properties of computations that derive from the structure of modes, including freedom from garbage (for modes without weakening and contraction), strictness (for modes disallowing weakening), and erasure (based on a preorder between modes). Finally, we present a surprisingly subtle algorithm for type checking.

Cite as

Junyoung Jang, Sophia Roshal, Frank Pfenning, and Brigitte Pientka. Adjoint Natural Deduction. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 15:1-15:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{jang_et_al:LIPIcs.FSCD.2024.15,
  author =	{Jang, Junyoung and Roshal, Sophia and Pfenning, Frank and Pientka, Brigitte},
  title =	{{Adjoint Natural Deduction}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{15:1--15:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.15},
  URN =		{urn:nbn:de:0030-drops-203441},
  doi =		{10.4230/LIPIcs.FSCD.2024.15},
  annote =	{Keywords: Substructural Logic, Type Systems, Functional Programming}
}
Document
On the Complexity of the Small Term Reachability Problem for Terminating Term Rewriting Systems

Authors: Franz Baader and Jürgen Giesl


Abstract
Motivated by an application where we try to make proofs for Description Logic inferences smaller by rewriting, we consider the following decision problem, which we call the small term reachability problem: given a term rewriting system R, a term s, and a natural number n, decide whether there is a term t of size ≤ n reachable from s using the rules of R. We investigate the complexity of this problem depending on how termination of R can be established. We show that the problem is NP-complete for length-reducing term rewriting systems. Its complexity increases to N2ExpTime-complete (NExpTime-complete) if termination is proved using a (linear) polynomial order and to PSpace-complete for systems whose termination can be shown using a restricted class of Knuth-Bendix orders. Confluence reduces the complexity to P for the length-reducing case, but has no effect on the worst-case complexity in the other two cases.
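
To make the decision problem concrete, the sketch below searches for a small reachable object by breadth-first search, specialised from terms to strings for brevity (string rewriting being a special case of term rewriting). It is a naive illustration of the problem statement only; the paper's point is the complexity analysis, not this algorithm.

import Data.List (isPrefixOf)

type SRule = (String, String)

-- All one-step rewrites of s: apply some rule at some position.
rewrites :: [SRule] -> String -> [String]
rewrites rules s =
  [ pre ++ rhs ++ drop (length lhs) suf
  | i <- [0 .. length s]
  , let (pre, suf) = splitAt i s
  , (lhs, rhs) <- rules
  , lhs `isPrefixOf` suf ]

-- Is some string of length <= n reachable from s? Breadth-first search;
-- finiteness of the search space relies on the system being terminating.
smallReachable :: [SRule] -> String -> Int -> Bool
smallReachable rules s n = go [s] []
  where
    go [] _ = False
    go (x : xs) seen
      | length x <= n = True
      | x `elem` seen = go xs seen
      | otherwise     = go (xs ++ rewrites rules x) (x : seen)

-- Example: with the length-reducing rule "aa" -> "b", a string of
-- length <= 2 (e.g. "bb") is reachable from "aaaa".
demo :: Bool
demo = smallReachable [("aa", "b")] "aaaa" 2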

Cite as

Franz Baader and Jürgen Giesl. On the Complexity of the Small Term Reachability Problem for Terminating Term Rewriting Systems. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 16:1-16:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{baader_et_al:LIPIcs.FSCD.2024.16,
  author =	{Baader, Franz and Giesl, J\"{u}rgen},
  title =	{{On the Complexity of the Small Term Reachability Problem for Terminating Term Rewriting Systems}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{16:1--16:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.16},
  URN =		{urn:nbn:de:0030-drops-203454},
  doi =		{10.4230/LIPIcs.FSCD.2024.16},
  annote =	{Keywords: Rewriting, Termination, Confluence, Creating small terms, Derivational complexity, Description Logics, Proof rewriting}
}
Document
A Categorical Approach to DIBI Models

Authors: Tao Gu, Jialu Bao, Justin Hsu, Alexandra Silva, and Fabio Zanasi


Abstract
The logic of Dependence and Independence Bunched Implications (DIBI) is a logic for reasoning about conditional independence (CI); for instance, DIBI formulas can characterise CI in discrete probability distributions and in relational databases, using a probabilistic DIBI model and a similarly-constructed relational model. Despite the similarity of the two models, a uniform account has been lacking. As a result, the laborious case-by-case verification of the frame conditions required for constructing new models hinders the generalisation of these results to CI in other useful settings, such as continuous distributions. In this paper, we develop an abstract framework for systematically constructing DIBI models, using category theory as the unifying mathematical language. We show that DIBI models arise from arbitrary symmetric monoidal categories with copy-discard structure. In particular, we use string diagrams - a graphical presentation of monoidal categories - to give a uniform definition of the parallel composition and subkernel relation in DIBI models. Our approach not only generalises known models, but also yields new models of interest and reduces properties of DIBI models to structures in the underlying categories. Furthermore, our categorical framework enables a comparison between string diagrammatic approaches to CI in the literature and a logical notion of CI, defined in terms of the satisfaction of specific DIBI formulas. We show that the logical notion is an extension of string diagrammatic CI under reasonable conditions.

Cite as

Tao Gu, Jialu Bao, Justin Hsu, Alexandra Silva, and Fabio Zanasi. A Categorical Approach to DIBI Models. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 17:1-17:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{gu_et_al:LIPIcs.FSCD.2024.17,
  author =	{Gu, Tao and Bao, Jialu and Hsu, Justin and Silva, Alexandra and Zanasi, Fabio},
  title =	{{A Categorical Approach to DIBI Models}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{17:1--17:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.17},
  URN =		{urn:nbn:de:0030-drops-203469},
  doi =		{10.4230/LIPIcs.FSCD.2024.17},
  annote =	{Keywords: Conditional Independence, Dependence Independence Bunched Implications, String Diagrams, Markov Categories}
}
Document
Representation of Peano Arithmetic in Separation Logic

Authors: Sohei Ito and Makoto Tatsuta


Abstract
Separation logic has been successful for software verification of heap-manipulating programs. Numbers need to be added to separation logic in order to verify practical software in which numbers are important. However, properties of validity such as decidability and complexity for separation logic with numbers have not yet been fully studied. This paper presents a translation of Pi-0-1 formulas in Peano arithmetic to formulas in a small fragment of separation logic with numbers, which consists only of the intuitionistic points-to predicate, 0 and the successor function. This paper then proves that a formula in Peano arithmetic is valid in the standard model if and only if its translation in this fragment is valid in the standard interpretation. As a corollary, this paper also gives a perspicuous proof of the undecidability of validity in this fragment. Since Pi-0-1 formulas can describe the consistency of logical systems and the non-termination of computations, this result also shows that these properties, usually discussed in Peano arithmetic, can be discussed in such a small fragment of separation logic with numbers.
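
Read literally, the fragment is very small: arithmetic expressions are built from variables, 0 and the successor function, and the intuitionistic points-to predicate (written ↪ below) is the only atomic formula. A plausible grammar, reconstructed from this description rather than quoted from the paper, is

  e ::= x | 0 | s(e)
  φ ::= e ↪ e | φ ∧ φ | φ ∨ φ | φ → φ | ∀x. φ | ∃x. φ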

Cite as

Sohei Ito and Makoto Tatsuta. Representation of Peano Arithmetic in Separation Logic. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 18:1-18:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{ito_et_al:LIPIcs.FSCD.2024.18,
  author =	{Ito, Sohei and Tatsuta, Makoto},
  title =	{{Representation of Peano Arithmetic in Separation Logic}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{18:1--18:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.18},
  URN =		{urn:nbn:de:0030-drops-203476},
  doi =		{10.4230/LIPIcs.FSCD.2024.18},
  annote =	{Keywords: First order logic, Separation logic, Peano arithmetic, Presburger arithmetic}
}
Document
Semantics for a Turing-Complete Reversible Programming Language with Inductive Types

Authors: Kostia Chardonnet, Louis Lemonnier, and Benoît Valiron


Abstract
This paper is concerned with the expressivity and denotational semantics of a functional higher-order reversible programming language based on Theseus. In this language, pattern-matching is used to ensure the reversibility of functions. We show how one can encode any Reversible Turing Machine in this language. We then build a sound and adequate categorical semantics based on join inverse categories, with additional structures to capture pattern-matching and to interpret inductive types and recursion. We then derive a notion of completeness, in the sense that any computable, partial, first-order injective function is the image of a term in the language.
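
The guiding idea of Theseus-style reversible programming, that a program denotes a partial injection runnable in both directions, can be sketched in a few lines of Haskell. This is our illustration with hypothetical names (Iso, fwd, bwd); the paper's language instead guarantees reversibility syntactically, through exhaustive, non-overlapping pattern-matching.

-- Reversible programs as pairs of mutually inverse partial maps.
data Iso a b = Iso { fwd :: a -> Maybe b, bwd :: b -> Maybe a }

-- Running a program backwards is just swapping the two directions.
inverse :: Iso a b -> Iso b a
inverse (Iso f g) = Iso g f

-- A reversible successor on integers ...
succIso :: Iso Integer Integer
succIso = Iso (Just . (+ 1)) (Just . subtract 1)

-- ... and a reversible swap on pairs.
swapIso :: Iso (a, b) (b, a)
swapIso = Iso (\(x, y) -> Just (y, x)) (\(y, x) -> Just (x, y))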

Cite as

Kostia Chardonnet, Louis Lemonnier, and Benoît Valiron. Semantics for a Turing-Complete Reversible Programming Language with Inductive Types. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 19:1-19:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{chardonnet_et_al:LIPIcs.FSCD.2024.19,
  author =	{Chardonnet, Kostia and Lemonnier, Louis and Valiron, Beno\^{i}t},
  title =	{{Semantics for a Turing-Complete Reversible Programming Language with Inductive Types}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{19:1--19:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.19},
  URN =		{urn:nbn:de:0030-drops-203487},
  doi =		{10.4230/LIPIcs.FSCD.2024.19},
  annote =	{Keywords: Reversible programming, functional programming, Computability, Denotational Semantics}
}
Document
On Iteration in Discrete Probabilistic Programming

Authors: Mateo Torres-Ruiz, Robin Piedeleu, Alexandra Silva, and Fabio Zanasi


Abstract
Discrete probabilistic programming languages provide an expressive tool for representing and reasoning about probabilistic models. These languages typically define the semantics of a program through its posterior distribution, obtained through exact inference techniques. While the semantics of standard programming constructs in this context is well understood, these languages largely lack tools for reasoning about the asymptotic behaviour of programs. In this paper, we introduce unbounded iteration in the context of a discrete probabilistic programming language, give it a semantics, and show how to compute it exactly. This allows us to express the stationary distribution of a probabilistic function while preserving the efficiency of exact inference techniques. We discuss the advantages and limitations of our approach, showcasing its practical utility with examples where bounded iteration poses a challenge due to the inherent difficulty of assessing the proximity of a distribution to its stationary point.
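
For intuition, the stationary distribution that unbounded iteration lets one express is the fixed point π = πP of a transition matrix P. Below is a generic power-iteration approximation in Haskell, standard numerics shown only for contrast with the paper's exact computation; how many steps suffice is exactly the "proximity to the stationary point" question the abstract mentions.

type Dist   = [Double]
type Matrix = [[Double]]  -- row-stochastic: row i is the distribution
                          -- over successors of state i

-- One step of the chain: (stepDist p d) !! j = sum_i (d !! i) * p_i_j.
stepDist :: Matrix -> Dist -> Dist
stepDist p d = [ sum (zipWith (\di row -> di * row !! j) d p)
               | j <- [0 .. length p - 1] ]

-- k steps of power iteration from an initial distribution.
stationary :: Matrix -> Dist -> Int -> Dist
stationary p d0 k = iterate (stepDist p) d0 !! k

-- A two-state chain that flips with probability 0.3: the iterates
-- approach the stationary distribution [0.5, 0.5].
demo :: Dist
demo = stationary [[0.7, 0.3], [0.3, 0.7]] [1, 0] 100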

Cite as

Mateo Torres-Ruiz, Robin Piedeleu, Alexandra Silva, and Fabio Zanasi. On Iteration in Discrete Probabilistic Programming. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 20:1-20:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{torresruiz_et_al:LIPIcs.FSCD.2024.20,
  author =	{Torres-Ruiz, Mateo and Piedeleu, Robin and Silva, Alexandra and Zanasi, Fabio},
  title =	{{On Iteration in Discrete Probabilistic Programming}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{20:1--20:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.20},
  URN =		{urn:nbn:de:0030-drops-203490},
  doi =		{10.4230/LIPIcs.FSCD.2024.20},
  annote =	{Keywords: Probabilistic programming, Programming languages semantics, Unbounded iteration}
}
Document
Impredicativity, Cumulativity and Product Covariance in the Logical Framework Dedukti

Authors: Thiago Felicissimo and Théo Winterhalter


Abstract
Proof assistants such as Coq implement a type theory featuring three important features: impredicativity, cumulativity and product covariance. This combination has proven difficult to express in the logical framework Dedukti, and previous attempts have failed to provide an encoding that is provably confluent, sound and conservative. In this work we solve this longstanding open problem by providing an encoding of these three features that we prove to be confluent, sound and to satisfy a restricted (but, we argue, strong enough) form of conservativity. Our proof of confluence is a contribution by itself, and combines various criteria and proof techniques from rewriting theory. Our proof of soundness also contributes a new strategy in which the result is shown in terms of an inverse translation function, fixing a common flaw made in some previous encoding attempts.

Cite as

Thiago Felicissimo and Théo Winterhalter. Impredicativity, Cumulativity and Product Covariance in the Logical Framework Dedukti. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 21:1-21:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{felicissimo_et_al:LIPIcs.FSCD.2024.21,
  author =	{Felicissimo, Thiago and Winterhalter, Th\'{e}o},
  title =	{{Impredicativity, Cumulativity and Product Covariance in the Logical Framework Dedukti}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{21:1--21:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.21},
  URN =		{urn:nbn:de:0030-drops-203503},
  doi =		{10.4230/LIPIcs.FSCD.2024.21},
  annote =	{Keywords: Dedukti, Rewriting, Confluence, Dependent types, Cumulativity, Universes}
}
Document
Automating Boundary Filling in Cubical Agda

Authors: Maximilian Doré, Evan Cavallo, and Anders Mörtberg


Abstract
When working in a proof assistant, automation is key to discharging routine proof goals such as equations between algebraic expressions. Homotopy Type Theory allows the user to reason about higher structures, such as topological spaces, using higher inductive types (HITs) and univalence. Cubical Agda is an extension of Agda with computational support for HITs and univalence. A difficulty when working in Cubical Agda is dealing with the complex combinatorics of higher structures, an infinite-dimensional generalisation of equational reasoning. Solving these higher-dimensional equations amounts to constructing cubes with specified boundaries. We develop a simplified cubical language in which we isolate and study two automation problems: contortion solving, where we attempt to "contort" a cube to fit a given boundary, and the more general Kan solving, where we search for solutions that involve pasting multiple cubes together. Both problems are difficult in the general case - Kan solving is even undecidable - so we focus on heuristics that perform well on practical examples. We provide a solver for the contortion problem using a reformulation of contortions in terms of poset maps, while we solve Kan problems using constraint satisfaction programming. We have implemented our algorithms in an experimental Haskell solver that can be used to automatically solve goals presented by Cubical Agda. We illustrate this with a case study establishing the Eckmann-Hilton theorem using our solver, as well as various benchmarks - providing the ground for further study of proof automation in cubical type theories.

Cite as

Maximilian Doré, Evan Cavallo, and Anders Mörtberg. Automating Boundary Filling in Cubical Agda. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 22:1-22:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{dore_et_al:LIPIcs.FSCD.2024.22,
  author =	{Dor\'{e}, Maximilian and Cavallo, Evan and M\"{o}rtberg, Anders},
  title =	{{Automating Boundary Filling in Cubical Agda}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{22:1--22:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.22},
  URN =		{urn:nbn:de:0030-drops-203514},
  doi =		{10.4230/LIPIcs.FSCD.2024.22},
  annote =	{Keywords: Cubical Agda, Automated Reasoning, Constraint Satisfaction Programming}
}
Document
Mirroring Call-By-Need, or Values Acting Silly

Authors: Beniamino Accattoli and Adrienne Lancelot


Abstract
Call-by-need evaluation for the λ-calculus can be seen as merging the best of call-by-name and call-by-value, namely the wise erasing behaviour of the former and the wise duplicating behaviour of the latter. To better understand how duplication and erasure can be combined, we design a degenerate calculus, dubbed call-by-silly, that is symmetric to call-by-need in that it merges the worst of call-by-name and call-by-value, namely silly duplications by-name and silly erasures by-value. We validate the design of the call-by-silly calculus via rewriting properties and multi types. In particular, we mirror the main theorem about call-by-need (namely, its operational equivalence with call-by-name) by showing that call-by-silly and call-by-value induce the same contextual equivalence. This fact shows that call-by-value contextual equivalence is blind to efficiency. We also define a call-by-silly strategy and measure its length via tight multi types. Lastly, we prove that the call-by-silly strategy computes evaluation sequences of maximal length in the calculus.
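As a concrete reminder of the duplication/erasure trade-off the abstract refers to, here is a small step-counting evaluator for the pure λ-calculus (a standard textbook construction, unrelated to the paper's proofs): call-by-name re-evaluates duplicated arguments, while call-by-value evaluates arguments that are later erased.

-- Untyped lambda-terms with de Bruijn indices.
data Term = Var Int | Lam Term | App Term Term deriving (Eq, Show)

-- shift d c t: add d to every index >= c.
shift :: Int -> Int -> Term -> Term
shift d c (Var k)   = if k >= c then Var (k + d) else Var k
shift d c (Lam t)   = Lam (shift d (c + 1) t)
shift d c (App t u) = App (shift d c t) (shift d c u)

-- subst j s t: replace Var j by s in t.
subst :: Int -> Term -> Term -> Term
subst j s (Var k)   = if k == j then s else Var k
subst j s (Lam t)   = Lam (subst (j + 1) (shift 1 0 s) t)
subst j s (App t u) = App (subst j s t) (subst j s u)

beta :: Term -> Term -> Term
beta body arg = shift (-1) 0 (subst 0 (shift 1 0 arg) body)

-- Weak head reduction, counting beta steps.
cbn, cbv :: Term -> (Term, Int)
cbn (App t u) = case cbn t of
  (Lam b, n) -> let (r, m) = cbn (beta b u) in (r, n + m + 1)
  (t', n)    -> (App t' u, n)
cbn t = (t, 0)

cbv (App t u) = case cbv t of
  (Lam b, n) -> let (v, m) = cbv u
                    (r, k) = cbv (beta b v)
                in (r, n + m + k + 1)
  (t', n)    -> (App t' u, n)
cbv t = (t, 0)

-- With i = Lam (Var 0) and delta = Lam (App (Var 0) (Var 0)):
-- erasure:     cbn (App (Lam (Lam (Var 0))) (App i i)) takes 1 step, cbv takes 2;
-- duplication: cbn (App delta (App i i)) takes 4 steps, cbv takes 3.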

Cite as

Beniamino Accattoli and Adrienne Lancelot. Mirroring Call-By-Need, or Values Acting Silly. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 23:1-23:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{accattoli_et_al:LIPIcs.FSCD.2024.23,
  author =	{Accattoli, Beniamino and Lancelot, Adrienne},
  title =	{{Mirroring Call-By-Need, or Values Acting Silly}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{23:1--23:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.23},
  URN =		{urn:nbn:de:0030-drops-203527},
  doi =		{10.4230/LIPIcs.FSCD.2024.23},
  annote =	{Keywords: Lambda calculus, intersection types, call-by-value, call-by-need}
}
Document
IMELL Cut Elimination with Linear Overhead

Authors: Beniamino Accattoli and Claudio Sacerdoti Coen


Abstract
Recently, Accattoli introduced the Exponential Substitution Calculus (ESC) given by untyped proof terms for Intuitionistic Multiplicative Exponential Linear Logic (IMELL), endowed with rewriting rules at-a-distance for cut elimination. He also introduced a new cut elimination strategy, dubbed the good strategy, and showed that its number of steps is a time cost model with polynomial overhead for ESC/IMELL, the first such cost model. Here, we refine Accattoli's result by introducing an abstract machine for ESC and proving that it implements the good strategy and computes cut-free terms/proofs within a linear overhead.
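The paper's machine operates on ESC proof terms; as a generic illustration of what it means for an abstract machine to implement a strategy with small overhead, here is the classical Krivine machine for weak head call-by-name reduction of plain λ-terms (a standard construction, not the machine of the paper).

-- Untyped lambda-terms with de Bruijn indices.
data Term = Var Int | Lam Term | App Term Term

-- A closure is a term paired with its environment.
data Clo = Clo Term [Clo]

-- Machine transitions; apart from variable lookup, each step does
-- constant work, and environments are shared, never copied.
run :: Clo -> [Clo] -> Maybe Clo
run (Clo (Var n) env) stk =
  case drop n env of
    c : _ -> run c stk
    []    -> Nothing                       -- open term
run (Clo (Lam b) env) (c : stk) = run (Clo b (c : env)) stk
run c@(Clo (Lam _) _) []        = Just c   -- weak head normal form reached
run (Clo (App t u) env) stk     = run (Clo t env) (Clo u env : stk)

whnf :: Term -> Maybe Clo
whnf t = run (Clo t []) []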

Cite as

Beniamino Accattoli and Claudio Sacerdoti Coen. IMELL Cut Elimination with Linear Overhead. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 24:1-24:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{accattoli_et_al:LIPIcs.FSCD.2024.24,
  author =	{Accattoli, Beniamino and Sacerdoti Coen, Claudio},
  title =	{{IMELL Cut Elimination with Linear Overhead}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{24:1--24:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.24},
  URN =		{urn:nbn:de:0030-drops-203539},
  doi =		{10.4230/LIPIcs.FSCD.2024.24},
  annote =	{Keywords: Lambda calculus, linear logic, abstract machines}
}
Document
Substitution for Non-Wellfounded Syntax with Binders Through Monoidal Categories

Authors: Ralph Matthes, Kobe Wullaert, and Benedikt Ahrens


Abstract
We describe a generic construction of non-wellfounded syntax involving variable binding and its monadic substitution operation. Our construction of the syntax and its substitution takes place in category theory, notably by using monoidal categories and strong functors between them. A language is specified by a multi-sorted binding signature, say Σ. First, we provide sufficient criteria for Σ to generate a language of possibly infinite terms, through ω-continuity. Second, we construct a monadic substitution operation for the language generated by Σ. A cornerstone in this construction is a mild generalization of the notion of heterogeneous substitution systems developed by Matthes and Uustalu; such a system encapsulates the necessary corecursion scheme for implementing substitution. The results are formalized in the Coq proof assistant, through the UniMath library of univalent mathematics.
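In a lazy language one can see a shadow of this phenomenon directly: the same recursive datatype of λ-terms is inhabited by non-wellfounded values, and monadic substitution, written corecursively, remains productive on them. The following is a minimal single-sorted Haskell sketch of our own; the paper's construction is categorical, multi-sorted, and formalized in UniMath.

-- Lambda-terms with de Bruijn indices; in a lazy language this type
-- also contains non-wellfounded (infinitely deep) values.
data Term = Var Int | Lam Term | App Term Term

-- Monadic substitution: replace each variable by a term.
bind :: (Int -> Term) -> Term -> Term
bind f (Var n)   = f n
bind f (App t u) = App (bind f t) (bind f u)
bind f (Lam t)   = Lam (bind (lift f) t)
  where
    lift g 0 = Var 0                          -- the bound variable stays
    lift g n = bind (Var . (+ 1)) (g (n - 1)) -- weaken substituted terms

-- A non-wellfounded term: variable 0 applied to itself forever.
spine :: Term
spine = App (Var 0) spine

-- bind is productive on spine: any finite prefix of
-- bind (const (Lam (Var 0))) spine is computed in finite time.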

Cite as

Ralph Matthes, Kobe Wullaert, and Benedikt Ahrens. Substitution for Non-Wellfounded Syntax with Binders Through Monoidal Categories. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 25:1-25:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{matthes_et_al:LIPIcs.FSCD.2024.25,
  author =	{Matthes, Ralph and Wullaert, Kobe and Ahrens, Benedikt},
  title =	{{Substitution for Non-Wellfounded Syntax with Binders Through Monoidal Categories}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{25:1--25:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.25},
  URN =		{urn:nbn:de:0030-drops-203540},
  doi =		{10.4230/LIPIcs.FSCD.2024.25},
  annote =	{Keywords: Non-wellfounded syntax, Substitution, Monoidal categories, Actegories, Tensorial strength, Proof assistant Coq, UniMath library}
}
Document
On the Logical Structure of Some Maximality and Well-Foundedness Principles Equivalent to Choice Principles

Authors: Hugo Herbelin and Jad Koleilat


Abstract
We study the logical structure of the Teichmüller-Tukey lemma, a maximality principle equivalent to the axiom of choice, and show that it corresponds to the generalisation to arbitrary cardinals of update induction, a well-foundedness principle from constructive mathematics classically equivalent to the axiom of dependent choice. From there, we state general forms of maximality and well-foundedness principles equivalent to the axiom of choice, including a variant of Zorn's lemma. We also initiate a comparison with the general class of choice and bar induction principles given by Brede and the first author.
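For reference, the classical statement of the lemma (its standard set-theoretic form, not the paper's generalisation): a family \(\mathcal{A}\) of sets has finite character when membership is determined by finite subsets, and every nonempty family of finite character has a maximal element.

\[
A \in \mathcal{A} \iff \forall F \subseteq A,\ F \text{ finite} \Rightarrow F \in \mathcal{A}
\]
\[
\mathcal{A} \neq \emptyset \text{ of finite character} \;\Rightarrow\; \exists M \in \mathcal{A} \text{ which is } \subseteq\text{-maximal in } \mathcal{A}
\]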

Cite as

Hugo Herbelin and Jad Koleilat. On the Logical Structure of Some Maximality and Well-Foundedness Principles Equivalent to Choice Principles. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 26:1-26:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{herbelin_et_al:LIPIcs.FSCD.2024.26,
  author =	{Herbelin, Hugo and Koleilat, Jad},
  title =	{{On the Logical Structure of Some Maximality and Well-Foundedness Principles Equivalent to Choice Principles}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{26:1--26:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.26},
  URN =		{urn:nbn:de:0030-drops-203551},
  doi =		{10.4230/LIPIcs.FSCD.2024.26},
  annote =	{Keywords: axiom of choice, Teichm\"{u}ller-Tukey lemma, update induction, constructive reverse mathematics}
}
Document
A Verified Algorithm for Deciding Pattern Completeness

Authors: René Thiemann and Akihisa Yamada


Abstract
Pattern completeness is the property that the left-hand sides of a functional program cover all cases w.r.t. pattern matching. In the context of term rewriting, a related notion is quasi-reducibility, a prerequisite if one wants to perform ground confluence proofs by rewriting induction. In order to certify such confluence proofs, we develop a novel algorithm that decides pattern completeness and that can be used to ensure quasi-reducibility. One of the advantages of the proposed algorithm is its simple structure: it is similar to that of a regular matching algorithm and, unlike an existing decision procedure for quasi-reducibility, it avoids enumerating all terms up to a given depth. Despite this simple structure, proving the correctness of the algorithm is not immediate. Therefore we formalize the algorithm and verify its correctness using the proof assistant Isabelle/HOL. To this end, we not only verify some auxiliary algorithms, but also design an Isabelle library on sorted term rewriting. Moreover, we export the verified code to Haskell and experimentally evaluate its performance. We observe that our algorithm significantly outperforms existing algorithms, even including the pattern completeness check of the GHC Haskell compiler.
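To make the notion concrete, here is a naive, single-sorted completeness check in Haskell in the spirit of classical pattern-matrix algorithms (a toy sketch of our own; the paper's verified algorithm is different and far more refined).

data Pat = Wild | Con String [Pat]

-- All constructors of the (single) sort, with arities; e.g. Peano numbers.
sig :: [(String, Int)]
sig = [("Zero", 0), ("Succ", 1)]

isWild :: Pat -> Bool
isWild Wild = True
isWild _    = False

-- Is every ground term matched by at least one pattern?
complete :: [Pat] -> Bool
complete ps = covers (map (: []) ps)

-- Does some row match every ground instantiation of the columns?
covers :: [[Pat]] -> Bool
covers rows
  | null rows                = False
  | any null rows            = True            -- an empty row matches everything
  | all (isWild . head) rows = covers (map tail rows)
  | otherwise                = all branch sig  -- split on the head constructor
  where
    branch (c, n) = covers (concatMap (specialize c n) rows)
    specialize c n (Wild : rest) = [replicate n Wild ++ rest]
    specialize c n (Con c' args : rest)
      | c == c'   = [args ++ rest]
      | otherwise = []
    specialize _ _ [] = []

-- complete [Con "Zero" [], Con "Succ" [Wild]]          == True
-- complete [Con "Zero" [], Con "Succ" [Con "Zero" []]] == False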

Cite as

René Thiemann and Akihisa Yamada. A Verified Algorithm for Deciding Pattern Completeness. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 27:1-27:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{thiemann_et_al:LIPIcs.FSCD.2024.27,
  author =	{Thiemann, Ren\'{e} and Yamada, Akihisa},
  title =	{{A Verified Algorithm for Deciding Pattern Completeness}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{27:1--27:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.27},
  URN =		{urn:nbn:de:0030-drops-203566},
  doi =		{10.4230/LIPIcs.FSCD.2024.27},
  annote =	{Keywords: Isabelle/HOL, pattern matching, term rewriting}
}
Document
Commutation Groups and State-Independent Contextuality

Authors: Samson Abramsky, Şerban-Ion Cercelescu, and Carmen-Maria Constantin


Abstract
We introduce an algebraic structure for studying state-independent contextuality arguments, a key form of quantum non-classicality exemplified by the well-known Peres-Mermin magic square, and used as a source of quantum advantage. We introduce commutation groups presented by generators and relations, and analyse them in terms of a string rewriting system. There is also a linear algebraic construction, a directed version of the Heisenberg group. We introduce contextual words as a general form of contextuality witness. We characterise when contextual words can arise in commutation groups, and explicitly construct non-contextual value assignments in other cases. We give unitary representations of commutation groups as subgroups of generalized Pauli n-groups.
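The Peres-Mermin square mentioned above is the following 3x3 array of two-qubit Pauli observables; the three observables in each row and each column commute, every row and the first two columns multiply to \(+I \otimes I\), while the last column multiplies to \(-I \otimes I\), which is what rules out a non-contextual value assignment.

\[
\begin{array}{ccc}
X \otimes I & I \otimes X & X \otimes X \\
I \otimes Z & Z \otimes I & Z \otimes Z \\
X \otimes Z & Z \otimes X & Y \otimes Y
\end{array}
\]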

Cite as

Samson Abramsky, Şerban-Ion Cercelescu, and Carmen-Maria Constantin. Commutation Groups and State-Independent Contextuality. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 28:1-28:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{abramsky_et_al:LIPIcs.FSCD.2024.28,
  author =	{Abramsky, Samson and Cercelescu, \c{S}erban-Ion and Constantin, Carmen-Maria},
  title =	{{Commutation Groups and State-Independent Contextuality}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{28:1--28:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.28},
  URN =		{urn:nbn:de:0030-drops-203572},
  doi =		{10.4230/LIPIcs.FSCD.2024.28},
  annote =	{Keywords: Contextuality, state-independence, quantum mechanics, Pauli group, group presentations, unitary representations}
}
Document
Böhm and Taylor for All!

Authors: Aloÿs Dufour and Damiano Mazza


Abstract
Böhm approximations, used in the definition of Böhm trees, are a staple of the semantics of the lambda-calculus. Introduced more recently by Ehrhard and Regnier, Taylor approximations provide a quantitative account of the behavior of programs and are well-known to be connected to intersection types. The key relation between these two notions of approximations is a commutation theorem, roughly stating that Taylor approximations of Böhm trees are the same as Böhm trees of Taylor approximations. Böhm and Taylor approximations are available for several variants or extensions of the lambda-calculus and, in some cases, commutation theorems are known. In this paper, we define Böhm and Taylor approximations and prove the commutation theorem in a very general setting. We also introduce (non-idempotent) intersection types at this level of generality. From this, we show how the commutation theorem and intersection types may be applied to any calculus that embeds in a sufficiently nice way into our general calculus. All known Böhm-Taylor commutation theorems, as well as new ones, follow from this uniform construction.
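In Ehrhard and Regnier's original setting of the pure λ-calculus, the commutation theorem takes the following shape, where \(\mathcal{T}\) is Taylor expansion, \(\mathrm{BT}\) is the Böhm tree, and \(\mathrm{NF}\) normalises a set of resource terms pointwise (we state the classical version; the paper proves a far more general one):

\[
\mathrm{NF}(\mathcal{T}(M)) \;=\; \mathcal{T}(\mathrm{BT}(M))
\]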

Cite as

Aloÿs Dufour and Damiano Mazza. Böhm and Taylor for All!. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 29:1-29:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{dufour_et_al:LIPIcs.FSCD.2024.29,
  author =	{Dufour, Alo\"{y}s and Mazza, Damiano},
  title =	{{B\"{o}hm and Taylor for All!}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{29:1--29:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.29},
  URN =		{urn:nbn:de:0030-drops-203582},
  doi =		{10.4230/LIPIcs.FSCD.2024.29},
  annote =	{Keywords: Linear logic, Differential linear logic, Taylor expansion of lambda-terms, B\"{o}hm trees, Process calculi}
}
Document
homotopy.io: A Proof Assistant for Finitely-Presented Globular n-Categories

Authors: Nathan Corbyn, Lukas Heidemann, Nick Hu, Chiara Sarti, Calin Tataru, and Jamie Vicary


Abstract
We present the proof assistant homotopy.io for working with finitely-presented semistrict higher categories. The tool runs in the browser with a point-and-click interface, allowing direct manipulation of proof objects via a graphical representation. We describe the user interface and explain how the tool can be used in practice. We also describe the essential subsystems of the tool, including collapse, contraction, expansion, typechecking, and layout, as well as key implementation details including data structure encoding, memoisation, and rendering. These technical innovations have been essential for achieving good performance in a resource-constrained setting.

Cite as

Nathan Corbyn, Lukas Heidemann, Nick Hu, Chiara Sarti, Calin Tataru, and Jamie Vicary. homotopy.io: A Proof Assistant for Finitely-Presented Globular n-Categories. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 30:1-30:26, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{corbyn_et_al:LIPIcs.FSCD.2024.30,
  author =	{Corbyn, Nathan and Heidemann, Lukas and Hu, Nick and Sarti, Chiara and Tataru, Calin and Vicary, Jamie},
  title =	{{homotopy.io: A Proof Assistant for Finitely-Presented Globular n-Categories}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{30:1--30:26},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.30},
  URN =		{urn:nbn:de:0030-drops-203594},
  doi =		{10.4230/LIPIcs.FSCD.2024.30},
  annote =	{Keywords: Higher category theory, proof assistant, string diagrams}
}
Document
Equational Theories and Validity for Logically Constrained Term Rewriting

Authors: Takahito Aoto, Naoki Nishida, and Jonas Schöpf


Abstract
Logically constrained term rewriting is a relatively new formalism where rules are equipped with constraints over some arbitrary theory. Although there are many recent advances with respect to rewriting induction, completion, complexity analysis and confluence analysis for logically constrained term rewriting, these works focus solely on the syntactic side of the formalism and lack a detailed investigation of its semantics. In this paper, we investigate the semantic side of logically constrained term rewriting. To this end, we first define constrained equations, constrained equational theories and validity of the former based on the latter. After presenting the relationship between validity and the conversion induced by rewriting, we construct a sound inference system for proving validity of constrained equations in constrained equational theories. Finally, we give an algebraic semantics, which enables one to establish invalidity of constrained equations in constrained equational theories. This algebraic semantics gives rise to a new notion of consistency for constrained equational theories.
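As a tiny example of the objects involved (standard LCTRS notation, not drawn from the paper): over the theory of integers, one may write constrained rules for the maximum function, together with a constrained equation whose validity one would like to prove in the induced equational theory.

\[
\mathsf{max}(x, y) \to x\ [\,x \ge y\,] \qquad \mathsf{max}(x, y) \to y\ [\,y > x\,]
\]
\[
\mathsf{max}(x, y) \approx \mathsf{max}(y, x)\ [\,\mathsf{true}\,]
\]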

Cite as

Takahito Aoto, Naoki Nishida, and Jonas Schöpf. Equational Theories and Validity for Logically Constrained Term Rewriting. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 31:1-31:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{aoto_et_al:LIPIcs.FSCD.2024.31,
  author =	{Aoto, Takahito and Nishida, Naoki and Sch\"{o}pf, Jonas},
  title =	{{Equational Theories and Validity for Logically Constrained Term Rewriting}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{31:1--31:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.31},
  URN =		{urn:nbn:de:0030-drops-203607},
  doi =		{10.4230/LIPIcs.FSCD.2024.31},
  annote =	{Keywords: constrained equation, constrained equational theory, logically constrained term rewriting, algebraic semantics, consistency}
}
Document
Termination of Generalized Term Rewriting Systems

Authors: Salvador Lucas


Abstract
We investigate termination of Generalized Term Rewriting Systems (GTRS), which extend Conditional Term Rewriting Systems by considering replacement restrictions on selected arguments of function symbols, as in Context-Sensitive Rewriting, and conditional rewriting rules whose conditional part may include not only a mix of the usual (reachability, joinability,...) conditions, but also atoms defined by a set of definite Horn clauses. GTRS can be used to prove confluence and termination of Generalized Rewrite Theories and Maude programs. We have characterized confluence of terminating GTRS as the joinability of a finite set of conditional pairs. Since termination of GTRS is underexplored to date, this paper introduces a Dependency Pair Framework which is well-suited to automatically (dis)prove termination of GTRS.
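To recall the shape of dependency pairs in the classical, unconditional case (which the paper lifts to the much richer conditional setting of GTRS): from the usual rules for subtraction on Peano numbers one extracts a single dependency pair, whose chains capture all potential sources of non-termination.

\[
\mathsf{minus}(x, 0) \to x \qquad
\mathsf{minus}(\mathsf{s}(x), \mathsf{s}(y)) \to \mathsf{minus}(x, y)
\]
\[
\mathrm{DP}: \quad \mathsf{minus}^{\#}(\mathsf{s}(x), \mathsf{s}(y)) \to \mathsf{minus}^{\#}(x, y)
\]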

Cite as

Salvador Lucas. Termination of Generalized Term Rewriting Systems. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 32:1-32:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{lucas:LIPIcs.FSCD.2024.32,
  author =	{Lucas, Salvador},
  title =	{{Termination of Generalized Term Rewriting Systems}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{32:1--32:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.32},
  URN =		{urn:nbn:de:0030-drops-203616},
  doi =		{10.4230/LIPIcs.FSCD.2024.32},
  annote =	{Keywords: Program Analysis, Reduction-Based Systems, Termination}
}
Document
State Canonization and Early Pruning in Width-Based Automated Theorem Proving

Authors: Mateus de Oliveira Oliveira and Farhad Vadiee


Abstract
Width-based automated theorem proving is a framework where counter-examples for graph theoretic conjectures are searched width-wise relative to some graph width measure, such as treewidth or pathwidth. In recent work it has been shown that dynamic programming algorithms operating on tree decompositions can be combined for the purpose of width-based theorem proving. This approach can be used to show that several long-standing conjectures in graph theory can be tested in time 2^{2^{k^{O(1)}}} on the class of graphs of treewidth at most k. In this work, we take the first steps towards evaluating the viability of this framework from a practical standpoint. At the same time, we advance the framework in two directions. First, we introduce a state-canonization technique that significantly reduces the number of states evaluated during the search for a counter-example of the conjecture. Second, we introduce an early-pruning technique that can be applied in the study of conjectures of the form ℙ₁ → ℙ₂, for graph properties ℙ₁ and ℙ₂, where ℙ₁ is a property closed under subgraphs. As a concrete application, we use our framework in the study of graph theoretic conjectures related to coloring triangle free graphs. In particular, our algorithm is able to show that Reed’s conjecture for triangle free graphs is valid on the class of graphs of pathwidth at most 5, and on graphs of treewidth at most 3. Perhaps more interestingly, our algorithm is able to construct in a completely automated way counter-examples for non-valid strengthenings of Reed’s conjecture. These are the first results showing that width-based automated theorem proving is a promising avenue in the study of graph-theoretic conjectures.

Cite as

Mateus de Oliveira Oliveira and Farhad Vadiee. State Canonization and Early Pruning in Width-Based Automated Theorem Proving. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 33:1-33:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{deoliveiraoliveira_et_al:LIPIcs.FSCD.2024.33,
  author =	{de Oliveira Oliveira, Mateus and Vadiee, Farhad},
  title =	{{State Canonization and Early Pruning in Width-Based Automated Theorem Proving}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{33:1--33:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.33},
  URN =		{urn:nbn:de:0030-drops-203622},
  doi =		{10.4230/LIPIcs.FSCD.2024.33},
  annote =	{Keywords: Width-Based Automated Theorem Proving, Dynamic Programming, Parameterized Complexity}
}
