LIPIcs, Volume 131

4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)



Event

FSCD 2019, June 24-30, 2019, Dortmund, Germany

Editor

Herman Geuvers
  • Radboud University Nijmegen, The Netherlands
  • Technical University Eindhoven, The Netherlands

Publication Details

  • Published: 2019-06-18
  • Publisher: Schloss Dagstuhl – Leibniz-Zentrum für Informatik
  • ISBN: 978-3-95977-107-8
  • DBLP: db/conf/rta/fscd2019

Documents

Document
Complete Volume
LIPIcs, Volume 131, FSCD'19, Complete Volume

Authors: Herman Geuvers


Abstract
LIPIcs, Volume 131, FSCD'19, Complete Volume

Cite as

4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@Proceedings{geuvers:LIPIcs.FSCD.2019,
  title =	{{LIPIcs, Volume 131, FSCD'19, Complete Volume}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019},
  URN =		{urn:nbn:de:0030-drops-107734},
  doi =		{10.4230/LIPIcs.FSCD.2019},
  annote =	{Keywords: Theory of computation, Models of computation, Formal languages and automata theory, Logic, Semantics and reasoning}
}
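
For readers collecting these entries into a bibliography, the minimal LaTeX sketch below shows one way the @Proceedings record above could be used; the file name fscd2019.bib and the plain bibliography style are assumptions for illustration, not part of the DROPS metadata.

% Minimal sketch: assumes the entry above is saved as fscd2019.bib (file name is an assumption).
\documentclass{article}
\begin{document}
The full FSCD 2019 proceedings appear as \cite{geuvers:LIPIcs.FSCD.2019}.
\bibliographystyle{plain}
\bibliography{fscd2019}
\end{document}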
Document
Front Matter
Front Matter, Table of Contents, Preface, Conference Organization

Authors: Herman Geuvers


Abstract
Front Matter, Table of Contents, Preface, Conference Organization

Cite as

4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 0:i-0:xx, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{geuvers:LIPIcs.FSCD.2019.0,
  author =	{Geuvers, Herman},
  title =	{{Front Matter, Table of Contents, Preface, Conference Organization}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{0:i--0:xx},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.0},
  URN =		{urn:nbn:de:0030-drops-105070},
  doi =		{10.4230/LIPIcs.FSCD.2019.0},
  annote =	{Keywords: Front Matter, Table of Contents, Preface, Conference Organization}
}
Document
Invited Talk
A Fresh Look at the lambda-Calculus (Invited Talk)

Authors: Beniamino Accattoli


Abstract
The (untyped) lambda-calculus is almost 90 years old. And yet - we argue here - its study is far from being over. The paper is a bird’s eye view of the questions the author worked on in the last few years: how to measure the complexity of lambda-terms, how to decompose their evaluation, how to implement it, and how all this varies according to the evaluation strategy. The paper aims at inducing a new way of looking at an old topic, focussing on high-level issues and perspectives.

Cite as

Beniamino Accattoli. A Fresh Look at the lambda-Calculus (Invited Talk). In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 1:1-1:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{accattoli:LIPIcs.FSCD.2019.1,
  author =	{Accattoli, Beniamino},
  title =	{{A Fresh Look at the lambda-Calculus}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{1:1--1:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.1},
  URN =		{urn:nbn:de:0030-drops-105083},
  doi =		{10.4230/LIPIcs.FSCD.2019.1},
  annote =	{Keywords: lambda-calculus, sharing, abstract machines, type systems, rewriting}
}
Document
Invited Talk
A Linear Logical Framework in Hybrid (Invited Talk)

Authors: Amy P. Felty


Abstract
We present a linear logical framework implemented within the Hybrid system [Amy P. Felty and Alberto Momigliano, 2012]. Hybrid is designed to support the use of higher-order abstract syntax for representing and reasoning about formal systems, implemented in the Coq Proof Assistant. In this work, we extend the system with two linear specification logics, which provide infrastructure for reasoning directly about object languages with linear features. We originally developed this framework in order to address the challenges of reasoning about the type system of a quantum lambda calculus. In particular, we started by considering the Proto-Quipper language [Neil J. Ross, 2015], which contains the core of Quipper [Green et al., 2013; Peter Selinger and Benoît Valiron, 2006]. Quipper is a relatively new quantum programming language under active development with a linear type system. We have completed a formal proof of type soundness for Proto-Quipper [Mohamed Yousri Mahmoud and Amy P. Felty, 2018]. Our current work includes extending this work to other properties of Proto-Quipper, reasoning about other quantum programming languages [Mohamed Yousri Mahmoud and Amy P. Felty, 2018], and reasoning about other languages such as the meta-theory of low-level abstract machine code. We are also interested in applying this framework to applications outside the domain of meta-theory of programming languages and have focused on two areas - formal reasoning about the proof theory of focused linear sequent calculi and modeling biological processes as transition systems and proving properties about them. We found that a slight extension of the initial linear specification logic allowed us to provide succinct encodings and facilitate reasoning in these new domains. We illustrate by discussing a model of breast cancer progression as a set of transition rules and proving properties about this model [Joëlle Despeyroux et al., 2018]. Current work also includes modeling stem cells as they mature into different types of blood cells. This work illustrates the use of Hybrid as a meta-logical framework for fast prototyping of logical frameworks, which is achieved by defining inference rules of a specification logic inductively in Coq and building a library of definitions and lemmas used to reason about a class of object logics. Our focus here is on linear specification logics and their applications.

Cite as

Amy P. Felty. A Linear Logical Framework in Hybrid (Invited Talk). In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 2:1-2:2, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{felty:LIPIcs.FSCD.2019.2,
  author =	{Felty, Amy P.},
  title =	{{A Linear Logical Framework in Hybrid}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{2:1--2:2},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.2},
  URN =		{urn:nbn:de:0030-drops-105099},
  doi =		{10.4230/LIPIcs.FSCD.2019.2},
  annote =	{Keywords: Logical frameworks, proof assistants, linear logic}
}
Document
Invited Talk
Extending Maximal Completion (Invited Talk)

Authors: Sarah Winkler


Abstract
Maximal completion (Klein and Hirokawa 2011) is an elegantly simple yet powerful variant of Knuth-Bendix completion. This paper extends the approach to ordered completion and theorem proving as well as normalized completion. An implementation of the different procedures is described, and its practicality is demonstrated by various examples.

Cite as

Sarah Winkler. Extending Maximal Completion (Invited Talk). In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 3:1-3:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{winkler:LIPIcs.FSCD.2019.3,
  author =	{Winkler, Sarah},
  title =	{{Extending Maximal Completion}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{3:1--3:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.3},
  URN =		{urn:nbn:de:0030-drops-105102},
  doi =		{10.4230/LIPIcs.FSCD.2019.3},
  annote =	{Keywords: automated reasoning, completion, theorem proving}
}
Document
Invited Talk
Some Semantic Issues in Probabilistic Programming Languages (Invited Talk)

Authors: Hongseok Yang


Abstract
This is a slightly extended abstract of my talk at FSCD'19 about probabilistic programming and a few semantic issues around it. The main purpose of this abstract is to provide keywords and references for the work mentioned in my talk, and to help interested audience members do follow-up study.

Cite as

Hongseok Yang. Some Semantic Issues in Probabilistic Programming Languages (Invited Talk). In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 4:1-4:6, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{yang:LIPIcs.FSCD.2019.4,
  author =	{Yang, Hongseok},
  title =	{{Some Semantic Issues in Probabilistic Programming Languages}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{4:1--4:6},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.4},
  URN =		{urn:nbn:de:0030-drops-105118},
  doi =		{10.4230/LIPIcs.FSCD.2019.4},
  annote =	{Keywords: Probabilistic Programming, Denotational Semantics, Non-differentiable Models, Bayesian Nonparametrics, Exchangeability}
}
Document
Bicategories in Univalent Foundations

Authors: Benedikt Ahrens, Dan Frumin, Marco Maggesi, and Niels van der Weide


Abstract
We develop bicategory theory in univalent foundations. Guided by the notion of univalence for (1-)categories studied by Ahrens, Kapulkin, and Shulman, we define and study univalent bicategories. To construct examples of those, we develop the notion of "displayed bicategories", an analog of displayed 1-categories introduced by Ahrens and Lumsdaine. Displayed bicategories allow us to construct univalent bicategories in a modular fashion. To demonstrate the applicability of this notion, we prove several bicategories are univalent. Among these are the bicategory of univalent categories with families and the bicategory of pseudofunctors between univalent bicategories. Our work is formalized in the UniMath library of univalent mathematics.

Cite as

Benedikt Ahrens, Dan Frumin, Marco Maggesi, and Niels van der Weide. Bicategories in Univalent Foundations. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 5:1-5:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{ahrens_et_al:LIPIcs.FSCD.2019.5,
  author =	{Ahrens, Benedikt and Frumin, Dan and Maggesi, Marco and van der Weide, Niels},
  title =	{{Bicategories in Univalent Foundations}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{5:1--5:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.5},
  URN =		{urn:nbn:de:0030-drops-105124},
  doi =		{10.4230/LIPIcs.FSCD.2019.5},
  annote =	{Keywords: bicategory theory, univalent mathematics, dependent type theory, Coq}
}
Document
Modular Specification of Monads Through Higher-Order Presentations

Authors: Benedikt Ahrens, André Hirschowitz, Ambroise Lafont, and Marco Maggesi


Abstract
In their work on second-order equational logic, Fiore and Hur have studied presentations of simply typed languages by generating binding constructions and equations among them. To each pair consisting of a binding signature and a set of equations, they associate a category of "models", and they give a monadicity result which implies that this category has an initial object, which is the language presented by the pair. In the present work, we propose, for the untyped setting, a variant of their approach where monads and modules over them are the central notions. More precisely, we study, for monads over sets, presentations by generating ("higher-order") operations and equations among them. We consider a notion of 2-signature which allows to specify a monad with a family of binding operations subject to a family of equations, as is the case for the paradigmatic example of the lambda calculus, specified by its two standard constructions (application and abstraction) subject to beta- and eta-equalities. Such a 2-signature is hence a pair (Sigma,E) of a binding signature Sigma and a family E of equations for Sigma. This notion of 2-signature has been introduced earlier by Ahrens in a slightly different context. We associate, to each 2-signature (Sigma,E), a category of "models of (Sigma,E)"; and we say that a 2-signature is "effective" if this category has an initial object; the monad underlying this (essentially unique) object is the "monad specified by the 2-signature". Not every 2-signature is effective; we identify a class of 2-signatures, which we call "algebraic", that are effective. Importantly, our 2-signatures together with their models enjoy "modularity": when we glue (algebraic) 2-signatures together, their initial models are glued accordingly. We provide a computer formalization for our main results.
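
For concreteness, the abstract's paradigmatic example can be written out as follows; this is a schematic sketch in standard binding-signature notation, not necessarily the paper's exact formalization. The untyped lambda calculus is presented by a 2-signature (Sigma, E) with two operations and two equations:

  \Sigma_\Lambda = \{\, \mathsf{app} : (0,0),\ \ \mathsf{abs} : (1) \,\}   % app takes two arguments binding nothing; abs binds one variable in its single argument
  E_\Lambda = \{\, \beta : \mathsf{app}(\mathsf{abs}(x.t),\, u) \approx t[x := u],\ \ \eta : \mathsf{abs}(x.\, \mathsf{app}(t, x)) \approx t \ \ (x \text{ not free in } t) \,\}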

Cite as

Benedikt Ahrens, André Hirschowitz, Ambroise Lafont, and Marco Maggesi. Modular Specification of Monads Through Higher-Order Presentations. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 6:1-6:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{ahrens_et_al:LIPIcs.FSCD.2019.6,
  author =	{Ahrens, Benedikt and Hirschowitz, Andr\'{e} and Lafont, Ambroise and Maggesi, Marco},
  title =	{{Modular Specification of Monads Through Higher-Order Presentations}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{6:1--6:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.6},
  URN =		{urn:nbn:de:0030-drops-105136},
  doi =		{10.4230/LIPIcs.FSCD.2019.6},
  annote =	{Keywords: free monads, presentation of monads, initial semantics, signatures, syntax, monadic substitution, computer-checked proofs}
}
Document
Towards the Average-Case Analysis of Substitution Resolution in Lambda-Calculus

Authors: Maciej Bendkowski


Abstract
Substitution resolution supports the computational character of beta-reduction, complementing its execution with a capture-avoiding exchange of terms for bound variables. Alas, the meta-level definition of substitution, masking a non-trivial computation, turns beta-reduction into an atomic rewriting rule, despite its varying operational complexity. In the current paper we propose a somewhat indirect average-case analysis of substitution resolution in the classic lambda-calculus, based on the quantitative analysis of substitution in lambda-upsilon, an extension of lambda-calculus internalising the upsilon-calculus of explicit substitutions. Within this framework, we show that for any fixed n >= 0, the probability that a uniformly random, conditioned on size, lambda-upsilon-term upsilon-normalises in n normal-order (i.e. leftmost-outermost) reduction steps tends to a computable limit as the term size tends to infinity. For that purpose, we establish an effective hierarchy (G_n)_n of regular tree grammars partitioning upsilon-normalisable terms into classes of terms normalising in n normal-order rewriting steps. The main technical ingredient in our construction is an inductive approach to the construction of G_{n+1} out of G_n based, in turn, on the algorithmic construction of finite intersection partitions, inspired by Robinson’s unification algorithm. Finally, we briefly discuss applications of our approach to other term rewriting systems, focusing on two closely related formalisms, i.e. the full lambda-upsilon-calculus and combinatory logic.

Cite as

Maciej Bendkowski. Towards the Average-Case Analysis of Substitution Resolution in Lambda-Calculus. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 7:1-7:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{bendkowski:LIPIcs.FSCD.2019.7,
  author =	{Bendkowski, Maciej},
  title =	{{Towards the Average-Case Analysis of Substitution Resolution in Lambda-Calculus}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{7:1--7:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.7},
  URN =		{urn:nbn:de:0030-drops-105144},
  doi =		{10.4230/LIPIcs.FSCD.2019.7},
  annote =	{Keywords: lambda calculus, explicit substitutions, complexity, combinatorics}
}
Document
Deriving an Abstract Machine for Strong Call by Need

Authors: Małgorzata Biernacka and Witold Charatonik


Abstract
Strong call by need is a reduction strategy for computing strong normal forms in the lambda calculus, where terms are fully normalized inside the bodies of lambda abstractions and open terms are allowed. As typical for a call-by-need strategy, the arguments of a function call are evaluated at most once, only when they are needed. This strategy has been introduced recently by Balabonski et al., who proved it complete with respect to full beta-reduction and conservative over weak call by need. We show a novel reduction semantics and the first abstract machine for the strong call-by-need strategy. The reduction semantics incorporates syntactic distinction between strict and non-strict let constructs and is geared towards an efficient implementation. It has been defined within the framework of generalized refocusing, i.e., a generic method that allows to go from a reduction semantics instrumented with context kinds to the corresponding abstract machine; the machine is thus correct by construction. The format of the semantics that we use makes it explicit that strong call by need is an example of a hybrid strategy with an infinite number of substrategies.

Cite as

Małgorzata Biernacka and Witold Charatonik. Deriving an Abstract Machine for Strong Call by Need. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 8:1-8:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{biernacka_et_al:LIPIcs.FSCD.2019.8,
  author =	{Biernacka, Ma{\l}gorzata and Charatonik, Witold},
  title =	{{Deriving an Abstract Machine for Strong Call by Need}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{8:1--8:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.8},
  URN =		{urn:nbn:de:0030-drops-105159},
  doi =		{10.4230/LIPIcs.FSCD.2019.8},
  annote =	{Keywords: abstract machines, reduction semantics}
}
Document
Dependency Pairs Termination in Dependent Type Theory Modulo Rewriting

Authors: Frédéric Blanqui, Guillaume Genestier, and Olivier Hermant


Abstract
Dependency pairs are a key concept at the core of modern automated termination provers for first-order term rewriting systems. In this paper, we introduce an extension of this technique for a large class of dependently-typed higher-order rewriting systems. This extends previous results by Wahlstedt on the one hand and the first author on the other hand to strong normalization and non-orthogonal rewriting systems. This new criterion is implemented in the type-checker Dedukti.

Cite as

Frédéric Blanqui, Guillaume Genestier, and Olivier Hermant. Dependency Pairs Termination in Dependent Type Theory Modulo Rewriting. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 9:1-9:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{blanqui_et_al:LIPIcs.FSCD.2019.9,
  author =	{Blanqui, Fr\'{e}d\'{e}ric and Genestier, Guillaume and Hermant, Olivier},
  title =	{{Dependency Pairs Termination in Dependent Type Theory Modulo Rewriting}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{9:1--9:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.9},
  URN =		{urn:nbn:de:0030-drops-105167},
  doi =		{10.4230/LIPIcs.FSCD.2019.9},
  annote =	{Keywords: termination, higher-order rewriting, dependent types, dependency pairs}
}
Document
A Generic Framework for Higher-Order Generalizations

Authors: David M. Cerna and Temur Kutsia


Abstract
We consider a generic framework for anti-unification of simply typed lambda terms. It helps to compute generalizations which contain the maximally common top part of the input expressions, without nesting generalization variables. The rules of the corresponding anti-unification algorithm are formulated, and their soundness and termination are proved. The algorithm depends on a parameter which decides how to choose terms under generalization variables. By changing the particular values of the parameter, we obtain four new unitary variants of higher-order anti-unification and also show how the already known pattern generalization fits into the schema.
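
To illustrate the kind of output described above with a small, generic example (an illustration of anti-unification in general, not one taken from the paper): the two input terms below share the top part \lambda x.\, f(x, \cdot), and the position where they differ is abstracted by a generalization variable Y.

  s = \lambda x.\, f(x, a), \qquad t = \lambda x.\, f(x, b)
  r = \lambda x.\, f(x, Y), \qquad r\{Y \mapsto a\} = s, \quad r\{Y \mapsto b\} = t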

Cite as

David M. Cerna and Temur Kutsia. A Generic Framework for Higher-Order Generalizations. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 10:1-10:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{cerna_et_al:LIPIcs.FSCD.2019.10,
  author =	{Cerna, David M. and Kutsia, Temur},
  title =	{{A Generic Framework for Higher-Order Generalizations}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{10:1--10:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.10},
  URN =		{urn:nbn:de:0030-drops-105175},
  doi =		{10.4230/LIPIcs.FSCD.2019.10},
  annote =	{Keywords: anti-unification, typed lambda calculus, least general generalization}
}
Document
Homotopy Canonicity for Cubical Type Theory

Authors: Thierry Coquand, Simon Huber, and Christian Sattler


Abstract
Cubical type theory provides a constructive justification of homotopy type theory and satisfies canonicity: every natural number is convertible to a numeral. A crucial ingredient of cubical type theory is a path lifting operation which is explained computationally by induction on the type involving several non-canonical choices. In this paper we show by a sconing argument that if we remove these equations for the path lifting operation from the system, we still retain homotopy canonicity: every natural number is path equal to a numeral.

Cite as

Thierry Coquand, Simon Huber, and Christian Sattler. Homotopy Canonicity for Cubical Type Theory. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 11:1-11:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{coquand_et_al:LIPIcs.FSCD.2019.11,
  author =	{Coquand, Thierry and Huber, Simon and Sattler, Christian},
  title =	{{Homotopy Canonicity for Cubical Type Theory}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{11:1--11:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.11},
  URN =		{urn:nbn:de:0030-drops-105188},
  doi =		{10.4230/LIPIcs.FSCD.2019.11},
  annote =	{Keywords: cubical type theory, univalence, canonicity, sconing, Artin glueing}
}
Document
Polymorphic Higher-Order Termination

Authors: Łukasz Czajka and Cynthia Kop


Abstract
We generalise the termination method of higher-order polynomial interpretations to a setting with impredicative polymorphism. Instead of using weakly monotonic functionals, we interpret terms in a suitable extension of System F_omega. This enables a direct interpretation of rewrite rules which make essential use of impredicative polymorphism. In addition, our generalisation eases the applicability of the method in the non-polymorphic setting by allowing for the encoding of inductive data types. As an illustration of the potential of our method, we prove termination of a substantial fragment of full intuitionistic second-order propositional logic with permutative conversions.

Cite as

Łukasz Czajka and Cynthia Kop. Polymorphic Higher-Order Termination. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 12:1-12:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{czajka_et_al:LIPIcs.FSCD.2019.12,
  author =	{Czajka, {\L}ukasz and Kop, Cynthia},
  title =	{{Polymorphic Higher-Order Termination}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{12:1--12:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.12},
  URN =		{urn:nbn:de:0030-drops-105193},
  doi =		{10.4230/LIPIcs.FSCD.2019.12},
  annote =	{Keywords: termination, polymorphism, higher-order rewriting, permutative conversions}
}
Document
On the Taylor Expansion of Probabilistic lambda-terms

Authors: Ugo Dal Lago and Thomas Leventis


Abstract
We generalise Ehrhard and Regnier’s Taylor expansion from pure to probabilistic lambda-terms. We prove that the Taylor expansion is adequate when seen as a way to give semantics to probabilistic lambda-terms, and that there is a precise correspondence with probabilistic Böhm trees, as introduced by the second author. We prove this adequacy through notions of probabilistic resource terms and explicit Taylor expansion.

Cite as

Ugo Dal Lago and Thomas Leventis. On the Taylor Expansion of Probabilistic lambda-terms. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 13:1-13:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{dallago_et_al:LIPIcs.FSCD.2019.13,
  author =	{Dal Lago, Ugo and Leventis, Thomas},
  title =	{{On the Taylor Expansion of Probabilistic lambda-terms}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{13:1--13:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.13},
  URN =		{urn:nbn:de:0030-drops-105206},
  doi =		{10.4230/LIPIcs.FSCD.2019.13},
  annote =	{Keywords: Probabilistic Lambda-Calculi, Taylor Expansion, Linear Logic}
}
Document
Proof Normalisation in a Logic Identifying Isomorphic Propositions

Authors: Alejandro Díaz-Caro and Gilles Dowek


Abstract
We define a fragment of propositional logic where isomorphic propositions, such as A ∧ B and B ∧ A, or A ⇒ (B ∧ C) and (A ⇒ B) ∧ (A ⇒ C), are identified. We define System I, a proof language for this logic, and prove its normalisation and consistency.

Cite as

Alejandro Díaz-Caro and Gilles Dowek. Proof Normalisation in a Logic Identifying Isomorphic Propositions. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 14:1-14:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{diazcaro_et_al:LIPIcs.FSCD.2019.14,
  author =	{D{\'\i}az-Caro, Alejandro and Dowek, Gilles},
  title =	{{Proof Normalisation in a Logic Identifying Isomorphic Propositions}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{14:1--14:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.14},
  URN =		{urn:nbn:de:0030-drops-105210},
  doi =		{10.4230/LIPIcs.FSCD.2019.14},
  annote =	{Keywords: Simply typed lambda calculus, Isomorphisms, Logic, Cut-elimination, Proof-reduction}
}
Document
lambda!-calculus, Intersection Types, and Involutions

Authors: Alberto Ciaffaglione, Pietro Di Gianantonio, Furio Honsell, Marina Lenisa, and Ivan Scagnetto


Abstract
Abramsky’s affine combinatory algebras are models of affine combinatory logic, which refines standard combinatory logic in the direction of Linear Logic. Abramsky introduced various universal models of computation based on affine combinatory algebras, consisting of partial involutions over a suitable formal language of moves, in order to discuss reversible computation in a Geometry of Interaction setting. We investigate partial involutions from the point of view of the model theory of lambda!-calculus. The latter is a refinement of the standard lambda-calculus, corresponding to affine combinatory logic. We introduce intersection type systems for the lambda!-calculus, by extending standard intersection types with a !_u-operator. These induce affine combinatory algebras, and, via suitable quotients, models of the lambda!-calculus. In particular, we introduce an intersection type system for assigning principal types to lambda!-terms, and we state a correspondence between the partial involution interpreting a combinator and the principal type of the corresponding lambda!-term. This analogy allows the somewhat awkward linear application of involutions arising from Geometry of Interaction to be explained as unification between principal types.

Cite as

Alberto Ciaffaglione, Pietro Di Gianantonio, Furio Honsell, Marina Lenisa, and Ivan Scagnetto. lambda!-calculus, Intersection Types, and Involutions. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 15:1-15:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{ciaffaglione_et_al:LIPIcs.FSCD.2019.15,
  author =	{Ciaffaglione, Alberto and Di Gianantonio, Pietro and Honsell, Furio and Lenisa, Marina and Scagnetto, Ivan},
  title =	{{lambda!-calculus, Intersection Types, and Involutions}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{15:1--15:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.15},
  URN =		{urn:nbn:de:0030-drops-105228},
  doi =		{10.4230/LIPIcs.FSCD.2019.15},
  annote =	{Keywords: Affine Combinatory Algebra, Affine Lambda-calculus, Intersection Types, Geometry of Interaction}
}
Document
Template Games, Simple Games, and Day Convolution

Authors: Clovis Eberhart, Tom Hirschowitz, and Alexis Laouar


Abstract
Template games [P.-A. Melliès, 2019] unify various approaches to game semantics, by exhibiting them as instances of a double-categorical variant of the slice construction. However, in the particular case of simple games [R. Harmer et al., 2007; C. Jacq and P.-A. Melliès, 2018], template games do not quite yield the standard (bi)category. We refine the construction using factorisation systems, obtaining as an instance a slight generalisation of simple games and strategies. This proves that template games have the descriptive power to capture combinatorial constraints defining well-known classes of games. Another instance is Day’s convolution monoidal structure on the category of presheaves over a strict monoidal category [B. Day, 1970], which answers a question raised in [C. Eberhart, 2018].

Cite as

Clovis Eberhart, Tom Hirschowitz, and Alexis Laouar. Template Games, Simple Games, and Day Convolution. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 16:1-16:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{eberhart_et_al:LIPIcs.FSCD.2019.16,
  author =	{Eberhart, Clovis and Hirschowitz, Tom and Laouar, Alexis},
  title =	{{Template Games, Simple Games, and Day Convolution}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{16:1--16:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.16},
  URN =		{urn:nbn:de:0030-drops-105237},
  doi =		{10.4230/LIPIcs.FSCD.2019.16},
  annote =	{Keywords: Game semantics, Day convolution, Categorical semantics}
}
Document
Differentials and Distances in Probabilistic Coherence Spaces

Authors: Thomas Ehrhard


Abstract
In probabilistic coherence spaces, a denotational model of probabilistic functional languages, morphisms are analytic and therefore smooth. We explore two related applications of the corresponding derivatives. First we show how derivatives allow us to compute the expectation of execution time in the weak head reduction of probabilistic PCF (pPCF). Next we apply a general notion of "local" differential of morphisms to the proof of a Lipschitz property of these morphisms, which in turn allows us to relate the observational distance on pPCF terms to a distance the model is naturally equipped with. This suggests that extending probabilistic programming languages with derivatives, in the spirit of the differential lambda-calculus, could be quite meaningful.

Cite as

Thomas Ehrhard. Differentials and Distances in Probabilistic Coherence Spaces. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 17:1-17:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{ehrhard:LIPIcs.FSCD.2019.17,
  author =	{Ehrhard, Thomas},
  title =	{{Differentials and Distances in Probabilistic Coherence Spaces}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{17:1--17:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.17},
  URN =		{urn:nbn:de:0030-drops-105243},
  doi =		{10.4230/LIPIcs.FSCD.2019.17},
  annote =	{Keywords: Denotational semantics, probabilistic coherence spaces, differentials of programs}
}
Document
Modal Embeddings and Calling Paradigms

Authors: José Espírito Santo, Luís Pinto, and Tarmo Uustalu


Abstract
We study the computational interpretation of the two standard modal embeddings, usually named after Girard and Gödel, of intuitionistic logic into IS4. As source system we take either the call-by-name (cbn) or the call-by-value (cbv) lambda-calculus with simple types. The target system can be taken to be the arguably simplest fragment of IS4, here recast as a very simple lambda-calculus equipped with an indeterminate lax monoidal comonad. A slight refinement of the target and of the embeddings shows that: the target is a calculus indifferent to the calling paradigms cbn/cbv, obeying a new paradigm that we baptize call-by-box (cbb), and enjoying standardization; and that Girard’s (resp. Gödel’s) embedding is a translation of the cbn (resp. cbv) lambda-calculus into this calculus, using a compilation technique we call protecting-by-a-box, enjoying the preservation and reflection properties known for CPS translations - but in a stronger form that allows the extraction of standardization for cbn or cbv as a consequence of standardization for cbb. The modal target and embeddings thus achieve a unification of call-by-name and call-by-value as call-by-box.

Cite as

José Espírito Santo, Luís Pinto, and Tarmo Uustalu. Modal Embeddings and Calling Paradigms. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 18:1-18:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{espiritosanto_et_al:LIPIcs.FSCD.2019.18,
  author =	{Esp{\'\i}rito Santo, Jos\'{e} and Pinto, Lu{\'\i}s and Uustalu, Tarmo},
  title =	{{Modal Embeddings and Calling Paradigms}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{18:1--18:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.18},
  URN =		{urn:nbn:de:0030-drops-105256},
  doi =		{10.4230/LIPIcs.FSCD.2019.18},
  annote =	{Keywords: intuitionistic S4, call-by-name, call-by-value, comonadic lambda-calculus, standardization, indifference property}
}
Document
Probabilistic Rewriting: Normalization, Termination, and Unique Normal Forms

Authors: Claudia Faggian


Abstract
While a mature body of work supports the study of rewriting systems, abstract tools for Probabilistic Rewriting are still limited. We study in this setting questions such as uniqueness of the result (unique limit distribution) and normalizing strategies (is there a strategy to find a result with greatest probability?). The goal is to have tools to analyse the operational properties of probabilistic calculi (such as probabilistic lambda-calculi) whose evaluation is also non-deterministic, in the sense that different reductions are possible.
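
For intuition about limit distributions, the following textbook-style probabilistic abstract reduction system (a generic illustration, not taken from the paper) flips a fair coin at every step:

  a \;\to\; \{\, b^{1/2},\ a^{1/2} \,\}, \qquad b \text{ in normal form}

After n steps the probability of having reached b is 1 - 2^{-n}, so the (unique) limit distribution assigns probability 1 to b, even though no finite reduction sequence is guaranteed to reach it.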

Cite as

Claudia Faggian. Probabilistic Rewriting: Normalization, Termination, and Unique Normal Forms. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 19:1-19:25, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{faggian:LIPIcs.FSCD.2019.19,
  author =	{Faggian, Claudia},
  title =	{{Probabilistic Rewriting: Normalization, Termination, and Unique Normal Forms}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{19:1--19:25},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.19},
  URN =		{urn:nbn:de:0030-drops-105263},
  doi =		{10.4230/LIPIcs.FSCD.2019.19},
  annote =	{Keywords: probabilistic rewriting, PARS, abstract rewriting systems, confluence, probabilistic lambda calculus}
}
Document
A Linear-Logical Reconstruction of Intuitionistic Modal Logic S4

Authors: Yosuke Fukuda and Akira Yoshimizu


Abstract
We propose a modal linear logic to reformulate intuitionistic modal logic S4 (IS4) in terms of linear logic, establishing an S4-version of Girard translation from IS4 to it. While the Girard translation from intuitionistic logic to linear logic is well-known, its extension to modal logic is non-trivial since a naive combination of the S4 modality and the exponential modality causes an undesirable interaction between the two modalities. To solve the problem, we introduce an extension of intuitionistic multiplicative exponential linear logic with a modality combining the S4 modality and the exponential modality, and show that it admits a sound translation from IS4. Through the Curry-Howard correspondence we further obtain a Geometry of Interaction Machine semantics of the modal lambda-calculus by Pfenning and Davies for staged computation.

Cite as

Yosuke Fukuda and Akira Yoshimizu. A Linear-Logical Reconstruction of Intuitionistic Modal Logic S4. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 20:1-20:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{fukuda_et_al:LIPIcs.FSCD.2019.20,
  author =	{Fukuda, Yosuke and Yoshimizu, Akira},
  title =	{{A Linear-Logical Reconstruction of Intuitionistic Modal Logic S4}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{20:1--20:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.20},
  URN =		{urn:nbn:de:0030-drops-105271},
  doi =		{10.4230/LIPIcs.FSCD.2019.20},
  annote =	{Keywords: linear logic, modal logic, Girard translation, Curry-Howard correspondence, geometry of interaction, staged computation}
}
Document
Sparse Tiling Through Overlap Closures for Termination of String Rewriting

Authors: Alfons Geser, Dieter Hofbauer, and Johannes Waldmann


Abstract
A strictly locally testable language is characterized by its set of admissible factors, prefixes and suffixes, called tiles. We over-approximate reachability sets in string rewriting by languages defined by sparse sets of tiles, containing only those that are reachable in derivations. Using the partial algebra defined by a tiling for semantic labeling, we obtain a transformational method for proving local termination. These algebras can be represented efficiently as finite automata of a certain shape. Using a known result on forward closures, and a new characterisation of overlap closures, we can automatically prove termination and relative termination, respectively. We report on experiments showing the strength of the method.
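
As a reminder of the underlying notion (a standard textbook illustration, not an example from the paper), a strictly 2-testable language is fixed by its admissible tiles: one-letter prefixes and suffixes and two-letter factors. For instance:

  L = (ab)^+ : \quad \text{prefixes } \{a\}, \quad \text{suffixes } \{b\}, \quad \text{factors } \{ab,\, ba\}

Then abab \in L, whereas aab is rejected because the factor aa is not an admissible tile.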

Cite as

Alfons Geser, Dieter Hofbauer, and Johannes Waldmann. Sparse Tiling Through Overlap Closures for Termination of String Rewriting. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 21:1-21:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{geser_et_al:LIPIcs.FSCD.2019.21,
  author =	{Geser, Alfons and Hofbauer, Dieter and Waldmann, Johannes},
  title =	{{Sparse Tiling Through Overlap Closures for Termination of String Rewriting}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{21:1--21:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.21},
  URN =		{urn:nbn:de:0030-drops-105282},
  doi =		{10.4230/LIPIcs.FSCD.2019.21},
  annote =	{Keywords: relative termination, semantic labeling, locally testable language, overlap closure}
}
Document
Proof Nets for First-Order Additive Linear Logic

Authors: Willem B. Heijltjes, Dominic J. D. Hughes, and Lutz Straßburger


Abstract
We present canonical proof nets for first-order additive linear logic, the fragment of linear logic with sum, product, and first-order universal and existential quantification. We present two versions of our proof nets. One, witness nets, retains explicit witnessing information to existential quantification. For the other, unification nets, this information is absent but can be reconstructed through unification. Unification nets embody a central contribution of the paper: first-order witness information can be left implicit, and reconstructed as needed. Witness nets are canonical for first-order additive sequent calculus. Unification nets in addition factor out any inessential choice for existential witnesses. Both notions of proof net are defined through coalescence, an additive counterpart to multiplicative contractibility, and for witness nets an additional geometric correctness criterion is provided. Both capture sequent calculus cut-elimination as a one-step global composition operation.

Cite as

Willem B. Heijltjes, Dominic J. D. Hughes, and Lutz Straßburger. Proof Nets for First-Order Additive Linear Logic. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 22:1-22:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{heijltjes_et_al:LIPIcs.FSCD.2019.22,
  author =	{Heijltjes, Willem B. and Hughes, Dominic J. D. and Stra{\ss}burger, Lutz},
  title =	{{Proof Nets for First-Order Additive Linear Logic}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{22:1--22:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.22},
  URN =		{urn:nbn:de:0030-drops-105297},
  doi =		{10.4230/LIPIcs.FSCD.2019.22},
  annote =	{Keywords: linear logic, first-order logic, proof nets, Herbrand’s theorem}
}
Document
The Sub-Additives: A Proof Theory for Probabilistic Choice extending Linear Logic

Authors: Ross Horne


Abstract
Probabilistic choice, where each branch of a choice is weighted according to a probability distribution, is an established approach for modelling processes, quantifying uncertainty in the environment and other sources of randomness. This paper uncovers new insight, showing that probabilistic choice has a purely logical interpretation as an operator in an extension of linear logic. By forbidding projection and injection, we reveal additive operators between the standard with and plus operators of linear logic. We call these operators the sub-additives. The attention of the reader is drawn to two sub-additive operators: the first is sound with respect to probabilistic choice, while the second arises because probabilistic choice cannot be self-dual and hence has a de Morgan dual counterpart. The proof theoretic justification for the sub-additives is a cut elimination result, employing a technique called decomposition. The justification from the perspective of modelling probabilistic concurrent processes is that implication is sound with respect to established notions of probabilistic refinement, and is fully compositional.

Cite as

Ross Horne. The Sub-Additives: A Proof Theory for Probabilistic Choice extending Linear Logic. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 23:1-23:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{horne:LIPIcs.FSCD.2019.23,
  author =	{Horne, Ross},
  title =	{{The Sub-Additives: A Proof Theory for Probabilistic Choice extending Linear Logic}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{23:1--23:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.23},
  URN =		{urn:nbn:de:0030-drops-105300},
  doi =		{10.4230/LIPIcs.FSCD.2019.23},
  annote =	{Keywords: calculus of structures, probabilistic choice, probabilistic refinement}
}
Document
A Lower Bound of the Number of Rewrite Rules Obtained by Homological Methods

Authors: Mirai Ikebuchi


Abstract
It is well-known that some equational theories such as groups or boolean algebras can be defined by fewer equational axioms than the original axioms. However, it is not easy to determine whether a given set of axioms is the smallest. Malbos and Mimram investigated a general method to find a lower bound of the cardinality of the set of equational axioms (or rewrite rules) that is equivalent to a given equational theory (or term rewriting system), using homological algebra. Their method is an analog of Squier’s homology theory on string rewriting systems. In this paper, we develop the homology theory for term rewriting systems further and provide a better lower bound under a stronger notion of equivalence than their equivalence. The author also implemented a program to compute the lower bounds.

Cite as

Mirai Ikebuchi. A Lower Bound of the Number of Rewrite Rules Obtained by Homological Methods. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 24:1-24:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{ikebuchi:LIPIcs.FSCD.2019.24,
  author =	{Ikebuchi, Mirai},
  title =	{{A Lower Bound of the Number of Rewrite Rules Obtained by Homological Methods}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{24:1--24:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.24},
  URN =		{urn:nbn:de:0030-drops-105312},
  doi =		{10.4230/LIPIcs.FSCD.2019.24},
  annote =	{Keywords: Term rewriting systems, Equational logic, Homological algebra}
}
Document
Gluing for Type Theory

Authors: Ambrus Kaposi, Simon Huber, and Christian Sattler


Abstract
The relationship between categorical gluing and proofs using the logical relations technique is folklore. In this paper we work out this relationship for Martin-Löf type theory and show that parametricity and canonicity arise as special cases of gluing. The input of gluing is two models of type theory and a pseudomorphism between them; the output is a displayed model over the first model. A pseudomorphism preserves the categorical structure strictly and the empty context and context extension up to isomorphism, with no conditions on the preservation of type formers. We look at three examples of pseudomorphisms: the identity on the syntax, the interpretation into the set model, and the global section functor. Gluing along these results in syntactic parametricity, semantic parametricity, and canonicity, respectively.

Cite as

Ambrus Kaposi, Simon Huber, and Christian Sattler. Gluing for Type Theory. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 25:1-25:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


Copy BibTex To Clipboard

@InProceedings{kaposi_et_al:LIPIcs.FSCD.2019.25,
  author =	{Kaposi, Ambrus and Huber, Simon and Sattler, Christian},
  title =	{{Gluing for Type Theory}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{25:1--25:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.25},
  URN =		{urn:nbn:de:0030-drops-105323},
  doi =		{10.4230/LIPIcs.FSCD.2019.25},
  annote =	{Keywords: Martin-L\"{o}f type theory, logical relations, parametricity, canonicity, quotient inductive types}
}
Document
The Discriminating Power of the Let-In Operator in the Lazy Call-by-Name Probabilistic lambda-Calculus

Authors: Simona Kašterović and Michele Pagani


Abstract
We consider the notion of probabilistic applicative bisimilarity (PAB), recently introduced as a behavioural equivalence over a probabilistic extension of the untyped lambda-calculus. Alberti, Dal Lago and Sangiorgi have shown that PAB is not fully abstract with respect to the context equivalence induced by the lazy call-by-name evaluation strategy. We prove that extending this calculus with a let-in operator allows for achieving full abstraction. In particular, we recall Larsen and Skou's testing language, which is known to correspond to PAB, and we prove that every test is representable by a context of our calculus.
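To convey the intuition behind the discriminating power of let-in, here is a small Haskell sketch (an illustration only, not the paper's calculus; all names below are ours): with call-by-name copying, a probabilistic choice is re-sampled at every occurrence of the variable it was substituted for, whereas a let-in binding samples it once and shares the value, so a context that compares a variable with itself can tell the two apart.

```haskell
-- Minimal sketch of probabilistic evaluation as a distribution monad.
import qualified Data.Map.Strict as M

newtype Dist a = Dist [(a, Rational)]

instance Functor Dist where
  fmap f (Dist xs) = Dist [(f x, p) | (x, p) <- xs]
instance Applicative Dist where
  pure x = Dist [(x, 1)]
  Dist fs <*> Dist xs = Dist [(f x, p * q) | (f, p) <- fs, (x, q) <- xs]
instance Monad Dist where
  Dist xs >>= k = Dist [(y, p * q) | (x, p) <- xs, let Dist ys = k x, (y, q) <- ys]

coin :: Dist Bool                     -- a fair probabilistic choice
coin = Dist [(True, 1/2), (False, 1/2)]

-- Call-by-name: (\x -> x == x) applied to coin substitutes the choice,
-- so each occurrence is sampled independently.
cbn :: Dist Bool
cbn = (==) <$> coin <*> coin          -- True with probability 1/2

-- Let-in: the choice is sampled once and the value is shared.
letIn :: Dist Bool
letIn = do { x <- coin; return (x == x) }   -- True with probability 1

main :: IO ()
main = do
  let run (Dist xs) = M.toList (M.fromListWith (+) xs)
  print (run cbn)     -- [(False,1 % 2),(True,1 % 2)]
  print (run letIn)   -- [(True,1 % 1)]
```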

Cite as

Simona Kašterović and Michele Pagani. The Discriminating Power of the Let-In Operator in the Lazy Call-by-Name Probabilistic lambda-Calculus. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 26:1-26:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


Copy BibTex To Clipboard

@InProceedings{kasterovic_et_al:LIPIcs.FSCD.2019.26,
  author =	{Ka\v{s}terovi\'{c}, Simona and Pagani, Michele},
  title =	{{The Discriminating Power of the Let-In Operator in the Lazy Call-by-Name Probabilistic lambda-Calculus}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{26:1--26:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.26},
  URN =		{urn:nbn:de:0030-drops-105338},
  doi =		{10.4230/LIPIcs.FSCD.2019.26},
  annote =	{Keywords: probabilistic lambda calculus, bisimulation, Howe’s technique, context equivalence, testing}
}
Document
Hilbert’s Tenth Problem in Coq

Authors: Dominique Larchey-Wendling and Yannick Forster


Abstract
We formalise the undecidability of solvability of Diophantine equations, i.e. polynomial equations over natural numbers, in Coq's constructive type theory. To do so, we give the first full mechanisation of the Davis-Putnam-Robinson-Matiyasevich theorem, stating that every recursively enumerable problem - in our case, one given by a Minsky machine - is Diophantine. We obtain an elegant and comprehensible proof by using a synthetic approach to computability and by introducing Conway's FRACTRAN language as an intermediate layer.
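Since FRACTRAN may be unfamiliar, here is a minimal interpreter in Haskell (an illustration of Conway's language only; the paper's development is in Coq, and the names below are ours). A program is a list of fractions; at each step the current number n is replaced by n·f for the first fraction f such that n·f is an integer, and the run halts when no fraction applies.

```haskell
import Data.Ratio (numerator, denominator, (%))

type Program = [Rational]

-- One FRACTRAN step: multiply by the first applicable fraction, if any.
step :: Program -> Integer -> Maybe Integer
step prog n =
  case [ (numerator f * n) `div` denominator f
       | f <- prog, (numerator f * n) `mod` denominator f == 0 ] of
    (m:_) -> Just m
    []    -> Nothing

-- The full run as the list of successive values.
run :: Program -> Integer -> [Integer]
run prog n = n : maybe [] (run prog) (step prog n)

-- The one-instruction program [3/2] turns 2^a * 3^b into 3^(a+b):
-- each step trades one factor 2 for one factor 3, i.e. addition in exponents.
addition :: Program
addition = [3 % 2]

main :: IO ()
main = print (last (run addition (2^3 * 3^4)))   -- 3^7 = 2187
```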

Cite as

Dominique Larchey-Wendling and Yannick Forster. Hilbert’s Tenth Problem in Coq. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 27:1-27:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


Copy BibTex To Clipboard

@InProceedings{larcheywendling_et_al:LIPIcs.FSCD.2019.27,
  author =	{Larchey-Wendling, Dominique and Forster, Yannick},
  title =	{{Hilbert’s Tenth Problem in Coq}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{27:1--27:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.27},
  URN =		{urn:nbn:de:0030-drops-105342},
  doi =		{10.4230/LIPIcs.FSCD.2019.27},
  annote =	{Keywords: Hilbert’s tenth problem, Diophantine equations, undecidability, computability theory, reduction, Minsky machines, Fractran, Coq, type theory}
}
Document
The Delta-calculus: Syntax and Types

Authors: Luigi Liquori and Claude Stolze


Abstract
We present the Delta-calculus, an explicitly typed lambda-calculus with strong pairs, projections, and explicit type coercions. The calculus can be parametrized with different intersection type theories T, e.g. the Coppo-Dezani, Coppo-Dezani-Sallé, Coppo-Dezani-Venneri and Barendregt-Coppo-Dezani ones, producing a family of Delta-calculi with related intersection typed systems. We prove the main properties: Church-Rosser, unicity of type, subject reduction, strong normalization, and decidability of type checking and type reconstruction. We state the relationship between the intersection type assignment systems à la Curry and the corresponding intersection typed systems à la Church by means of an essence function translating an explicitly typed Delta-term into a pure lambda-term. Finally, we translate a Delta-term with type coercions into an equivalent one without them; the translation is proved to be coherent because its essence is the identity. The generic Delta-calculus can be parametrized to take into account other intersection type theories, such as those in the book by Barendregt et al.
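For readers unfamiliar with intersection types, a standard example (not specific to the Delta-calculus): à la Curry, the self-application λx. x x, which has no simple type, becomes typable once intersections are allowed, because the two occurrences of x may be used at different types.

\[
x : (\sigma \to \tau) \wedge \sigma \;\vdash\; x\,x : \tau
\qquad\text{hence}\qquad
\vdash\; \lambda x.\,x\,x \;:\; ((\sigma \to \tau) \wedge \sigma) \to \tau .
\]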

Cite as

Luigi Liquori and Claude Stolze. The Delta-calculus: Syntax and Types. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 28:1-28:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


Copy BibTex To Clipboard

@InProceedings{liquori_et_al:LIPIcs.FSCD.2019.28,
  author =	{Liquori, Luigi and Stolze, Claude},
  title =	{{The Delta-calculus: Syntax and Types}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{28:1--28:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.28},
  URN =		{urn:nbn:de:0030-drops-105354},
  doi =		{10.4230/LIPIcs.FSCD.2019.28},
  annote =	{Keywords: intersection types, lambda calculus \`{a} la Church and \`{a} la Curry, proof-functional logics}
}
Document
Pointers in Recursion: Exploring the Tropics

Authors: Paulin Jacobé de Naurois


Abstract
We translate the usual classes of partial/primitive recursive functions into a pointer recursion framework, in which actual input values are accessed via a unit-cost pointer-reading function. These pointer recursive function classes are proven equivalent to the usual partial/primitive recursive functions. Complexity-wise, this framework captures most of the relevant sub-polynomial classes in a streamlined way. Pointer recursion with the safe/normal tiering discipline of Bellantoni and Cook corresponds to polylogtime computation. We introduce a new, non-size-increasing tiering discipline, called tropical tiering. Tropical tiering and pointer recursion, used with some of the most common recursion schemes, capture the classes logspace, logspace/polylogtime, ptime, and NC. Finally, in a fashion reminiscent of the safe recursive functions, tropical tiering is expressed directly in the syntax of the function algebras, yielding the tropical recursive function algebras.
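The pointer-access idea can be conveyed with a small Haskell sketch (our illustration under stated assumptions, not the paper's function algebra): the computation never manipulates the input itself, only pointers of logarithmic size together with a unit-cost bit-reading function, which is what makes sub-polynomial classes expressible.

```haskell
import Data.Bits (testBit)

type Pointer = Int        -- indices: O(log n) bits
type Input   = Integer    -- the input word, accessed only through readBit

-- the unit-cost pointer-reading function
readBit :: Input -> Pointer -> Bool
readBit = testBit

-- Recursion manipulates only pointers and Booleans, never the input itself:
-- parity of the bits at positions 0 .. k-1.
parityUpTo :: Input -> Pointer -> Bool
parityUpTo _ 0 = False
parityUpTo x k = readBit x (k - 1) /= parityUpTo x (k - 1)

main :: IO ()
main = print (parityUpTo 11 4)   -- 11 = 1011 in binary, three set bits: True
```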

Cite as

Paulin Jacobé de Naurois. Pointers in Recursion: Exploring the Tropics. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 29:1-29:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


Copy BibTex To Clipboard

@InProceedings{jacobedenaurois:LIPIcs.FSCD.2019.29,
  author =	{Jacob\'{e} de Naurois, Paulin},
  title =	{{Pointers in Recursion: Exploring the Tropics}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{29:1--29:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.29},
  URN =		{urn:nbn:de:0030-drops-105360},
  doi =		{10.4230/LIPIcs.FSCD.2019.29},
  annote =	{Keywords: Implicit Complexity, Recursion Theory}
}
Document
Typed Equivalence of Effect Handlers and Delimited Control

Authors: Maciej Piróg, Piotr Polesiuk, and Filip Sieczkowski


Abstract
It is folklore that effect handlers and delimited control operators are closely related: recently, this relationship has been proved in an untyped setting for deep handlers and the shift_0 delimited control operator. We positively resolve the conjecture that, in an appropriately polymorphic type system, this relationship can be extended to the level of types, by identifying the necessary forms of polymorphism, thus extending the definability result to the typed setting. In the process, we identify a novel and potentially interesting type system feature for delimited control operators. Moreover, we extend these results to substantiate the folklore connection between shallow handlers and the control_0 flavour of delimited control, in both untyped and typed settings.

Cite as

Maciej Piróg, Piotr Polesiuk, and Filip Sieczkowski. Typed Equivalence of Effect Handlers and Delimited Control. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 30:1-30:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


Copy BibTex To Clipboard

@InProceedings{pirog_et_al:LIPIcs.FSCD.2019.30,
  author =	{Pir\'{o}g, Maciej and Polesiuk, Piotr and Sieczkowski, Filip},
  title =	{{Typed Equivalence of Effect Handlers and Delimited Control}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{30:1--30:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.30},
  URN =		{urn:nbn:de:0030-drops-105376},
  doi =		{10.4230/LIPIcs.FSCD.2019.30},
  annote =	{Keywords: type-and-effect systems, algebraic effects, delimited control, macro expressibility}
}
Document
Cubical Syntax for Reflection-Free Extensional Equality

Authors: Jonathan Sterling, Carlo Angiuli, and Daniel Gratzer


Abstract
We contribute XTT, a cubical reconstruction of Observational Type Theory [Altenkirch et al., 2007] which extends Martin-Löf’s intensional type theory with a dependent equality type that enjoys function extensionality and a judgmental version of the unicity of identity proofs principle (UIP): any two elements of the same equality type are judgmentally equal. Moreover, we conjecture that the typing relation can be decided in a practical way. In this paper, we establish an algebraic canonicity theorem using a novel extension of the logical families or categorical gluing argument inspired by Coquand and Shulman [Coquand, 2018; Shulman, 2015]: every closed element of boolean type is derivably equal to either true or false.

Cite as

Jonathan Sterling, Carlo Angiuli, and Daniel Gratzer. Cubical Syntax for Reflection-Free Extensional Equality. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 31:1-31:25, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


Copy BibTex To Clipboard

@InProceedings{sterling_et_al:LIPIcs.FSCD.2019.31,
  author =	{Sterling, Jonathan and Angiuli, Carlo and Gratzer, Daniel},
  title =	{{Cubical Syntax for Reflection-Free Extensional Equality}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{31:1--31:25},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.31},
  URN =		{urn:nbn:de:0030-drops-105387},
  doi =		{10.4230/LIPIcs.FSCD.2019.31},
  annote =	{Keywords: Dependent type theory, extensional equality, cubical type theory, categorical gluing, canonicity}
}
Document
Guarded Recursion in Agda via Sized Types

Authors: Niccolò Veltri and Niels van der Weide


Abstract
In type theory, programming and reasoning with possibly non-terminating programs and potentially infinite objects is achieved using coinductive types. Recursively defined programs of these types need to be productive to guarantee the consistency of the type system. Proof assistants such as Agda and Coq traditionally employ strict syntactic productivity checks, which often make programming with coinductive types convoluted. One way to overcome this issue is by encoding productivity at the level of types so that the type system forbids the implementation of non-productive corecursive programs. In this paper we compare two different approaches to type-based productivity: guarded recursion and sized types. More specifically, we show how to simulate guarded recursion in Agda using sized types. We formalize the syntax of a simple type theory for guarded recursion, which is a variant of Atkey and McBride’s calculus for productive coprogramming. Then we give a denotational semantics using presheaves over the preorder of sizes. Sized types are fundamentally used to interpret the characteristic features of guarded recursion, notably the fixpoint combinator.
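For readers who have not met the productivity problem, here is a Haskell sketch (illustrative only; Haskell performs no productivity check, and the Agda development is the actual subject of the paper): nats is syntactically guarded, because its corecursive call sits directly under the stream constructor, while fibs is productive but hides its corecursive calls under zipWithS. Definitions of the latter shape are typically what make purely syntactic guardedness checks awkward and motivate type-based approaches such as guarded recursion and sized types.

```haskell
data Stream a = Cons a (Stream a)

tailS :: Stream a -> Stream a
tailS (Cons _ xs) = xs

takeS :: Int -> Stream a -> [a]
takeS 0 _           = []
takeS n (Cons x xs) = x : takeS (n - 1) xs

zipWithS :: (a -> b -> c) -> Stream a -> Stream b -> Stream c
zipWithS f (Cons x xs) (Cons y ys) = Cons (f x y) (zipWithS f xs ys)

-- Guarded: the recursive call is directly under Cons.
nats :: Stream Integer
nats = go 0 where go n = Cons n (go (n + 1))

-- Productive, but the corecursive calls are hidden under zipWithS.
fibs :: Stream Integer
fibs = Cons 0 (Cons 1 (zipWithS (+) fibs (tailS fibs)))

main :: IO ()
main = do
  print (takeS 5 nats)   -- [0,1,2,3,4]
  print (takeS 7 fibs)   -- [0,1,1,2,3,5,8]
```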

Cite as

Niccolò Veltri and Niels van der Weide. Guarded Recursion in Agda via Sized Types. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 32:1-32:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


Copy BibTex To Clipboard

@InProceedings{veltri_et_al:LIPIcs.FSCD.2019.32,
  author =	{Veltri, Niccol\`{o} and van der Weide, Niels},
  title =	{{Guarded Recursion in Agda via Sized Types}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{32:1--32:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.32},
  URN =		{urn:nbn:de:0030-drops-105391},
  doi =		{10.4230/LIPIcs.FSCD.2019.32},
  annote =	{Keywords: guarded recursion, type theory, semantics, coinduction, sized types}
}
Document
Sequence Types for Hereditary Permutators

Authors: Pierre Vial


Abstract
The invertible terms in Scott's model D_infty are known as the hereditary permutators. Equivalently, they are the terms that are invertible, up to beta-eta-conversion, with respect to composition of lambda-terms. Finding a type-theoretic characterization of the set of hereditary permutators was Problem #20 on the TLCA list of open problems. In 2008, Tatsuta proved that this is not possible with an inductive type system. Building on previous work, we use an infinitary intersection type system based on sequences (i.e., families of types indexed by integers) to characterize hereditary permutators with a unique type. This gives a positive answer to the problem in the coinductive case.
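A concrete example, not taken from the paper: the argument-swapping combinator C = λx y z. x z y is a hereditary permutator, and it is its own inverse with respect to composition M ∘ N = λz. M (N z).

\[
(C \circ C)\,w
\;\to_\beta\; C\,(C\,w)
\;\to_\beta\; \lambda y\,z.\,(C\,w)\,z\,y
\;\twoheadrightarrow_\beta\; \lambda y\,z.\,w\,y\,z
\;=_\eta\; w,
\qquad\text{so } C \circ C =_{\beta\eta} \lambda w.\,w .
\]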

Cite as

Pierre Vial. Sequence Types for Hereditary Permutators. In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 33:1-33:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


Copy BibTex To Clipboard

@InProceedings{vial:LIPIcs.FSCD.2019.33,
  author =	{Vial, Pierre},
  title =	{{Sequence Types for Hereditary Permutators}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{33:1--33:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.33},
  URN =		{urn:nbn:de:0030-drops-105404},
  doi =		{10.4230/LIPIcs.FSCD.2019.33},
  annote =	{Keywords: hereditary permutators, B\"{o}hm trees, intersection types, coinduction, rigidity, sequence types, non-idempotent intersection}
}
Document
System Description
Model Checking Strategy-Controlled Rewriting Systems (System Description)

Authors: Rubén Rubio, Narciso Martí-Oliet, Isabel Pita, and Alberto Verdejo


Abstract
Strategies are widespread in computer science. In the domain of reduction and rewriting systems, strategies are studied as recipes for restricting and controlling, at the level of whole derivations, reduction steps and rule applications, which are intrinsically local. This idea has been exploited by various tools and rewriting-based specification languages, where strategies form an additional specification layer. Systems described in this way need to be analyzed too. This article discusses model checking of systems controlled by strategies and presents a working strategy-aware LTL model checker for the Maude specification language, based on rewriting logic, and its strategy language.

Cite as

Rubén Rubio, Narciso Martí-Oliet, Isabel Pita, and Alberto Verdejo. Model Checking Strategy-Controlled Rewriting Systems (System Description). In 4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 131, pp. 34:1-34:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


Copy BibTex To Clipboard

@InProceedings{rubio_et_al:LIPIcs.FSCD.2019.34,
  author =	{Rubio, Rub\'{e}n and Mart{\'\i}-Oliet, Narciso and Pita, Isabel and Verdejo, Alberto},
  title =	{{Model Checking Strategy-Controlled Rewriting Systems}},
  booktitle =	{4th International Conference on Formal Structures for Computation and Deduction (FSCD 2019)},
  pages =	{34:1--34:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-107-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{131},
  editor =	{Geuvers, Herman},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2019.34},
  URN =		{urn:nbn:de:0030-drops-105414},
  doi =		{10.4230/LIPIcs.FSCD.2019.34},
  annote =	{Keywords: Model checking, strategies, Maude, rewriting logic}
}
