LIPIcs, Volume 84

2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)




Publication Details

  • Published at: 2017-08-30
  • Publisher: Schloss Dagstuhl – Leibniz-Zentrum für Informatik
  • ISBN: 978-3-95977-047-7
  • DBLP: db/conf/rta/fscd2017


Documents

Document
Complete Volume
LIPIcs, Volume 84, FSCD'17, Complete Volume

Authors: Dale Miller


Abstract
LIPIcs, Volume 84, FSCD'17, Complete Volume

Cite as

2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@Proceedings{miller:LIPIcs.FSCD.2017,
  title =	{{LIPIcs, Volume 84, FSCD'17, Complete Volume}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017},
  URN =		{urn:nbn:de:0030-drops-79063},
  doi =		{10.4230/LIPIcs.FSCD.2017},
  annote =	{Keywords: Theory of Computation, Computation by Abstract Devices, Analysis of Algorithms and Problem Complexity, Logics and Meanings of Programs, Mathematical Logic and Formal Languages, Programming Techniques, Software/Program Verification, Programming Languages, Deduction and Theorem Proving}
}
Document
Front Matter
Front Matter, Table of Contents, Preface, Steering Committee, Program Committee, External Reviewers

Authors: Dale Miller


Abstract
Front Matter, Table of Contents, Preface, Steering Committee, Program Committee, External Reviewers

Cite as

2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 0:i-0:xiv, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{miller:LIPIcs.FSCD.2017.0,
  author =	{Miller, Dale},
  title =	{{Front Matter, Table of Contents, Preface, Steering Committee, Program Committee, External Reviewers}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{0:i--0:xiv},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.0},
  URN =		{urn:nbn:de:0030-drops-77126},
  doi =		{10.4230/LIPIcs.FSCD.2017.0},
  annote =	{Keywords: Front Matter, Table of Contents, Preface, Steering Committee, Program Committee, External Reviewers}
}
Document
Invited Talk
Type Systems for the Relational Verification of Higher Order Programs (Invited Talk)

Authors: Marco Gaboardi


Abstract
Relational program verification is a variant of program verification where one focuses on guaranteeing properties about the executions of two programs, and as a special case about two executions of a single program on different inputs. Relational verification becomes particularly interesting when non-functional aspects of a computation, like probabilities or resource cost, are considered. Several approaches to relational program verification have been developed, from relational program logics to relational abstract interpretation. In this talk, I will introduce two approaches to relational program verification for higher-order computations based on the use of type systems. The first approach consists in developing powerful type systems where a rich language of assertions can be used to express complex relations between two programs. The second approach consists in developing more restrictive type systems enriched with effects that express, in a lightweight way, relations between different runs of the same program. I will discuss the pros and cons of these two approaches on a concrete example: relational cost analysis, which aims at giving a bound on the difference in cost of running two programs, and as a special case the difference in cost of two executions of a single program on different inputs.
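
As a toy illustration of the relational cost analysis problem, here is a minimal sketch (ours, not the talk's type systems; the names isort and relCost are hypothetical): instrument a program with a step counter and ask for a bound on the difference of the counters across two runs.

-- Cost-instrumented insertion sort: returns the sorted list together
-- with the number of comparisons performed.
isort :: Ord a => [a] -> ([a], Int)
isort = foldr step ([], 0)
  where
    step x (ys, c) = let (zs, c') = ins x ys in (zs, c + c')
    ins x [] = ([x], 0)
    ins x (y:ys)
      | x <= y    = (x : y : ys, 1)
      | otherwise = let (zs, c) = ins x ys in (y : zs, c + 1)

-- The quantity a relational cost analysis bounds statically:
-- the cost difference between two runs on (typically related) inputs.
relCost :: Ord a => [a] -> [a] -> Int
relCost xs ys = snd (isort xs) - snd (isort ys)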

Cite as

Marco Gaboardi. Type Systems for the Relational Verification of Higher Order Programs (Invited Talk). In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, p. 1:1, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{gaboardi:LIPIcs.FSCD.2017.1,
  author =	{Gaboardi, Marco},
  title =	{{Type Systems for the Relational Verification of Higher Order Programs}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{1:1--1:1},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.1},
  URN =		{urn:nbn:de:0030-drops-77429},
  doi =		{10.4230/LIPIcs.FSCD.2017.1},
  annote =	{Keywords: Relational verification, refinement types, type and effect systems, complexity analysis}
}
Document
Invited Talk
Uniform Resource Analysis by Rewriting: Strengths and Weaknesses (Invited Talk)

Authors: Georg Moser


Abstract
In this talk, I'll describe how rewriting techniques can be successfully employed to build state-of-the-art automated resource analysis tools which compare favourably to other approaches. Furthermore, I'll sketch the genesis of a uniform framework for resource analysis, emphasising success stories without hiding intricate weaknesses. The talk ends with a discussion of open problems.

Cite as

Georg Moser. Uniform Resource Analysis by Rewriting: Strengths and Weaknesses (Invited Talk). In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 2:1-2:10, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{moser:LIPIcs.FSCD.2017.2,
  author =	{Moser, Georg},
  title =	{{Uniform Resource Analysis by Rewriting: Strengths and Weaknesses}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{2:1--2:10},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.2},
  URN =		{urn:nbn:de:0030-drops-77438},
  doi =		{10.4230/LIPIcs.FSCD.2017.2},
  annote =	{Keywords: resource analysis, term rewriting, automation}
}
Document
Invited Talk
Brzozowski Goes Concurrent - A Kleene Theorem for Pomset Languages (Invited Talk)

Authors: Alexandra Silva


Abstract
Concurrent Kleene Algebra (CKA) is a mathematical formalism to study programs that exhibit concurrent behaviour. As with previous extensions of Kleene Algebra, characterizing the free model is crucial in order to develop the foundations of the theory and potential applications. For CKA, this has been an open question for a few years and this talk will overview why the problem is so difficult. We will then pave the way towards a solution, by presenting a new automaton model and a Kleene-like theorem for CKA. More precisely, we connect a relaxed version of CKA to series-parallel pomset languages, which are a natural candidate for the free model. There are two substantial differences with previous work: from expressions to automata, we use Brzozowski derivatives, which enable a direct construction of the automaton; from automata to expressions, we provide a syntactic characterization of the automata that denote valid CKA behaviours. We also survey how the present work can be used to extend the network specification language NetKAT with primitives for concurrency so as to model and reason about concurrency within networks. This is joint work with Tobias Kappe, Paul Brunet, Bas Luttik, and Fabio Zanasi.
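
As background for the word-language case that this talk generalizes, the following is the classical Brzozowski derivative for ordinary regular expressions, rendered by us in Haskell (standard textbook material, not the pomset construction of the talk):

data Re = Empty | Eps | Chr Char | Alt Re Re | Seq Re Re | Star Re

-- nullable r: does r accept the empty word?
nullable :: Re -> Bool
nullable Empty     = False
nullable Eps       = True
nullable (Chr _)   = False
nullable (Alt r s) = nullable r || nullable s
nullable (Seq r s) = nullable r && nullable s
nullable (Star _)  = True

-- deriv a r: an expression for the language { w | a·w in L(r) }.
deriv :: Char -> Re -> Re
deriv _ Empty     = Empty
deriv _ Eps       = Empty
deriv a (Chr b)   = if a == b then Eps else Empty
deriv a (Alt r s) = Alt (deriv a r) (deriv a s)
deriv a (Seq r s)
  | nullable r    = Alt (Seq (deriv a r) s) (deriv a s)
  | otherwise     = Seq (deriv a r) s
deriv a (Star r)  = Seq (deriv a r) (Star r)

-- Matching by repeated differentiation: the derivatives themselves
-- serve as the states of the automaton.
matches :: Re -> String -> Bool
matches r = nullable . foldl (flip deriv) r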

Cite as

Alexandra Silva. Brzozowski Goes Concurrent - A Kleene Theorem for Pomset Languages (Invited Talk). In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, p. 3:1, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{silva:LIPIcs.FSCD.2017.3,
  author =	{Silva, Alexandra},
  title =	{{Brzozowski Goes Concurrent - A Kleene Theorem for Pomset Languages}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{3:1--3:1},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.3},
  URN =		{urn:nbn:de:0030-drops-77445},
  doi =		{10.4230/LIPIcs.FSCD.2017.3},
  annote =	{Keywords: Kleene algebras, Pomset languages, concurrency, NetKAT}
}
Document
Invited Talk
Quantitative Semantics for Probabilistic Programming (Invited Talk)

Authors: Christine Tasson


Abstract
Probabilistic programming has many applications in statistics, physics, and other fields, so that most programming languages have been equipped with probabilistic libraries. However, there is a need to develop semantic tools in order to formalize higher-order and recursive probabilistic languages. Indeed, it is well known that categories of measurable spaces are not Cartesian closed. We have been studying quantitative semantics of probabilistic spaces to fill this gap. A first step has been to focus on probabilistic programming languages with discrete types such as integers and booleans. In this setting, probabilistic programs can be seen as linear combinations of deterministic programs. Probabilistic Coherent Spaces constitute a Cartesian closed category that is fully abstract with respect to probabilistic Call-By-Push-Value. Moreover, this toy language is endowed with a memorization operator that allows most discrete probabilistic programs to be encoded. The second step is to move on to probabilistic programming with continuous types, representing for instance reals endowed with Lebesgue measurable sets. We introduce the category of cones and stable functions, which is Cartesian closed. The trick is to enlarge the category of measurable spaces to gain closedness while still embracing measurable spaces. Besides, the category of cones is a sound and adequate model of a higher-order and recursive probabilistic language in which most classical distributions and probabilistic tools can be encoded. This is joint work with Thomas Ehrhard and Michele Pagani.

Cite as

Christine Tasson. Quantitative Semantics for Probabilistic Programming (Invited Talk). In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, p. 4:1, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{tasson:LIPIcs.FSCD.2017.4,
  author =	{Tasson, Christine},
  title =	{{Quantitative Semantics for Probabilistic Programming}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{4:1--4:1},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.4},
  URN =		{urn:nbn:de:0030-drops-77456},
  doi =		{10.4230/LIPIcs.FSCD.2017.4},
  annote =	{Keywords: denotational semantics, probabilistic programming, programming language, probability}
}
Document
Displayed Categories

Authors: Benedikt Ahrens and Peter LeFanu Lumsdaine


Abstract
We introduce and develop the notion of displayed categories. A displayed category over a category C is equivalent to "a category D and functor F : D -> C", but instead of having a single collection of "objects of D" with a map to the objects of C, the objects are given as a family indexed by objects of C, and similarly for the morphisms. This encapsulates a common way of building categories in practice, by starting with an existing category and adding extra data/properties to the objects and morphisms. The interest of this seemingly trivial reformulation is that various properties of functors are more naturally defined as properties of the corresponding displayed categories. Grothendieck fibrations, for example, when defined as certain functors, use equality on objects in their definition. When defined instead as certain displayed categories, no reference to equality on objects is required. Moreover, almost all examples of fibrations in nature are, in fact, categories whose standard construction can be seen as going via displayed categories. We therefore propose displayed categories as a basis for the development of fibrations in the type-theoretic setting, and similarly for various other notions whose classical definitions involve equality on objects. Besides giving a conceptual clarification of such issues, displayed categories also provide a powerful tool in computer formalisation, unifying and abstracting common constructions and proof techniques of category theory, and enabling modular reasoning about categories of multi-component structures. As such, most of the material of this article has been formalised in Coq over the UniMath library, with the aim of providing a practical library for use in further developments.
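
In symbols, the data of a displayed category can be summarized as follows (our paraphrase of the definition sketched above, in LaTeX notation):

\begin{align*}
x \in \mathrm{ob}\,\mathcal{C} \;&\longmapsto\; \mathcal{D}_x
  && \text{(objects over each base object)}\\
f : x \to y,\; x' \in \mathcal{D}_x,\; y' \in \mathcal{D}_y \;&\longmapsto\; \mathrm{Hom}_f(x', y')
  && \text{(morphisms over each base morphism)}\\
\mathrm{ob}\,{\textstyle\int}\mathcal{D} \;&=\; {\textstyle\sum}_{x \in \mathrm{ob}\,\mathcal{C}}\, \mathcal{D}_x
  && \text{(total category, recovering } F : {\textstyle\int}\mathcal{D} \to \mathcal{C})
\end{align*}

together with identities and composition lying over those of the base category and satisfying the evident laws.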

Cite as

Benedikt Ahrens and Peter LeFanu Lumsdaine. Displayed Categories. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 5:1-5:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{ahrens_et_al:LIPIcs.FSCD.2017.5,
  author =	{Ahrens, Benedikt and Lumsdaine, Peter LeFanu},
  title =	{{Displayed Categories}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{5:1--5:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.5},
  URN =		{urn:nbn:de:0030-drops-77220},
  doi =		{10.4230/LIPIcs.FSCD.2017.5},
  annote =	{Keywords: Category theory, Dependent type theory, Computer proof assistants, Coq, Univalent mathematics}
}
Document
The Confluent Terminating Context-Free Substitutive Rewriting System for the lambda-Calculus with Surjective Pairing and Terminal Type

Authors: Yohji Akama


Abstract
For the lambda-calculus with surjective pairing and terminal type, Curien and Di Cosmo, inspired by Knuth-Bendix completion, introduced a confluent rewriting system that refines the naive rewriting system. Their system is a confluent (CR) rewriting system stable under contexts. They left the strong normalization (SN) of their rewriting system open. By Girard's reducibility method with a restricted reducibility theorem, we prove SN of their rewriting system, and SN of its extensions by polymorphism and by the terminal types arising from parametric polymorphism. We further extend their system with sum types and eta-like reductions, and prove SN. We compare their system to type-directed expansions.

Cite as

Yohji Akama. The Confluent Terminating Context-Free Substitutive Rewriting System for the lambda-Calculus with Surjective Pairing and Terminal Type. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 6:1-6:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{akama:LIPIcs.FSCD.2017.6,
  author =	{Akama, Yohji},
  title =	{{The Confluent Terminating Context-Free Substitutive Rewriting System for the lambda-Calculus with Surjective Pairing and Terminal Type}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{6:1--6:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.6},
  URN =		{urn:nbn:de:0030-drops-77346},
  doi =		{10.4230/LIPIcs.FSCD.2017.6},
  annote =	{Keywords: reducibility method, restricted reducibility theorem, sum type, type-directed expansion, strong normalization}
}
Document
Improving Rewriting Induction Approach for Proving Ground Confluence

Authors: Takahito Aoto, Yoshihito Toyama, and Yuta Kimura


Abstract
In (Aoto & Toyama, FSCD 2016), a method to prove ground confluence of many-sorted term rewriting systems based on rewriting induction is given. In this paper, we give several methods that add wider flexibility to the rewriting induction approach for proving ground confluence. Firstly, we give a method to deal with the case in which suitable rules are not present in the input system. Our idea is to construct additional rewrite rules that supplement or replace existing rules in order to obtain a set of rules that is adequate for applying rewriting induction. Secondly, we give a method to deal with non-orientable constructor rules. This is accomplished by extending the inference system of rewriting induction and giving a sufficient criterion for the correctness of the system. Thirdly, we give a method for disproving ground confluence. The presented methods are implemented in our ground confluence prover AGCP and experiments are reported. These experiments reveal that the presented methods are effective for problems that state-of-the-art ground confluence provers cannot handle.

Cite as

Takahito Aoto, Yoshihito Toyama, and Yuta Kimura. Improving Rewriting Induction Approach for Proving Ground Confluence. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 7:1-7:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{aoto_et_al:LIPIcs.FSCD.2017.7,
  author =	{Aoto, Takahito and Toyama, Yoshihito and Kimura, Yuta},
  title =	{{Improving Rewriting Induction Approach for Proving Ground Confluence}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{7:1--7:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.7},
  URN =		{urn:nbn:de:0030-drops-77310},
  doi =		{10.4230/LIPIcs.FSCD.2017.7},
  annote =	{Keywords: Ground Confluence, Rewriting Induction, Non-Orientable Equations, Term Rewriting Systems}
}
Document
Böhm Reduction in Infinitary Term Graph Rewriting Systems

Authors: Patrick Bahr


Abstract
The confluence properties of lambda calculus and orthogonal term rewriting do not generalise to the corresponding infinitary calculi. In order to recover the confluence property in a meaningful way, Kennaway et al. introduced Böhm reduction, which extends the ordinary reduction relation so that "meaningless terms" can be contracted to a fresh constant "bottom". In previous work, we have established that Böhm reduction can instead be characterised by a different mode of convergence for transfinite reductions, based on a partial order structure instead of a metric space. In this paper, we develop a corresponding theory of Böhm reduction for term graphs. Our main result is that partial order convergence in a term graph rewriting system can be truthfully and faithfully simulated by metric convergence in the Böhm extension of the system. To prove this result we generalise the notion of residuals and projections to the setting of infinitary term graph rewriting. As ancillary results we prove the infinitary strip lemma and the compression property, both for partial order and metric convergence.

Cite as

Patrick Bahr. Böhm Reduction in Infinitary Term Graph Rewriting Systems. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 8:1-8:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{bahr:LIPIcs.FSCD.2017.8,
  author =	{Bahr, Patrick},
  title =	{{B\"{o}hm Reduction in Infinitary Term Graph Rewriting Systems}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{8:1--8:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.8},
  URN =		{urn:nbn:de:0030-drops-77275},
  doi =		{10.4230/LIPIcs.FSCD.2017.8},
  annote =	{Keywords: infinitary rewriting, term graphs, B\"{o}hm trees}
}
Document
Optimality and the Linear Substitution Calculus

Authors: Pablo Barenbaum and Eduardo Bonelli


Abstract
We lift the theory of optimal reduction to a decomposition of the lambda calculus known as the Linear Substitution Calculus (LSC). LSC decomposes beta-reduction into finer steps that manipulate substitutions in two distinctive ways: it uses context rules that allow substitutions to act "at a distance" and rewrites modulo a set of equations that allow substitutions to "float" in a term. We propose a notion of redex family obtained by adapting Lévy labels to support these two distinctive features. This is followed by a proof of the finite family developments theorem (FFD). We then apply FFD to prove an optimal reduction theorem for LSC. We also apply FFD to deduce additional novel properties of LSC, namely an algorithm for standardisation by selection and normalisation of a linear call-by-need reduction strategy. All results are proved in the axiomatic setting of Glauert and Khasidashvili's Deterministic Residual Structures.
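
For orientation, the LSC's rewrite rules in their usual presentation (standard notation, where L is a possibly empty list of explicit substitutions and C a context; the rules themselves are prior work, not a contribution of this paper):

\begin{align*}
(\lambda x.\,t)L\; u \;&\to_{\mathsf{dB}}\; t[x{\backslash}u]L && \text{(beta at a distance)}\\
C\langle x\rangle[x{\backslash}u] \;&\to_{\mathsf{ls}}\; C\langle u\rangle[x{\backslash}u] && \text{(linear substitution)}\\
t[x{\backslash}u] \;&\to_{\mathsf{gc}}\; t \quad \text{if } x \notin \mathsf{fv}(t) && \text{(garbage collection)}
\end{align*}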

Cite as

Pablo Barenbaum and Eduardo Bonelli. Optimality and the Linear Substitution Calculus. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 9:1-9:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{barenbaum_et_al:LIPIcs.FSCD.2017.9,
  author =	{Barenbaum, Pablo and Bonelli, Eduardo},
  title =	{{Optimality and the Linear Substitution Calculus}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{9:1--9:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.9},
  URN =		{urn:nbn:de:0030-drops-77307},
  doi =		{10.4230/LIPIcs.FSCD.2017.9},
  annote =	{Keywords: Rewriting, Lambda Calculus, Explicit Substitutions, Optimal Reduction}
}
Document
Generalized Refocusing: From Hybrid Strategies to Abstract Machines

Authors: Malgorzata Biernacka, Witold Charatonik, and Klara Zielinska


Abstract
We present a generalization of the refocusing procedure that provides a generic method for deriving an abstract machine from a specification of a reduction semantics satisfying simple initial conditions. The proposed generalization is applicable to a class of reduction semantics encoding hybrid strategies as well as uniform strategies handled by the original refocusing method. The resulting machine is proved to correctly trace (i.e., bisimulate in smaller steps) the input reduction semantics. The procedure and the correctness proofs have been formalized in the Coq proof assistant.
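
To convey the flavour of the construction on a deliberately tiny example (ours; the paper handles arbitrary reduction semantics, formalized in Coq): for additive expressions, refocusing short-cuts the "plug, then decompose again" round trip of the reduction semantics, and the result is an eval/apply abstract machine.

data E = Lit Int | Add E E
data K = Top | AddL E K | AddR Int K  -- evaluation contexts, stored inside-out

-- 'eval' decomposes the focused expression; 'apply' consults the
-- context. Refocusing guarantees the whole term is never rebuilt
-- between contraction steps.
eval :: E -> K -> Int
eval (Lit n)   k = apply k n
eval (Add l r) k = eval l (AddL r k)

apply :: K -> Int -> Int
apply Top        n = n
apply (AddL r k) n = eval r (AddR n k)
apply (AddR m k) n = apply k (m + n)

-- eval (Add (Lit 1) (Add (Lit 2) (Lit 3))) Top == 6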

Cite as

Malgorzata Biernacka, Witold Charatonik, and Klara Zielinska. Generalized Refocusing: From Hybrid Strategies to Abstract Machines. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 10:1-10:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{biernacka_et_al:LIPIcs.FSCD.2017.10,
  author =	{Biernacka, Malgorzata and Charatonik, Witold and Zielinska, Klara},
  title =	{{Generalized Refocusing: From Hybrid Strategies to Abstract Machines}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{10:1--10:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.10},
  URN =		{urn:nbn:de:0030-drops-77188},
  doi =		{10.4230/LIPIcs.FSCD.2017.10},
  annote =	{Keywords: reduction semantics, abstract machines, formal verification, Coq}
}
Document
Nested Multisets, Hereditary Multisets, and Syntactic Ordinals in Isabelle/HOL

Authors: Jasmin Christian Blanchette, Mathias Fleury, and Dmitriy Traytel


Abstract
We present a collection of formalized results about finite nested multisets, developed using the Isabelle/HOL proof assistant. The nested multiset order is a generalization of the multiset order that can be used to prove termination of processes. Hereditary multisets, a variant of nested multisets, offer a convenient representation of ordinals below epsilon-0. In Isabelle/HOL, both nested and hereditary multisets can be comfortably defined as inductive datatypes. Our formal library also provides, somewhat nonstandardly, multisets with negative multiplicities and syntactic ordinals with negative coefficients. We present applications of the library to formalizations of Goodstein's theorem and the decidability of unary PCF (programming computable functions).
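
A Haskell rendering of the hereditary case (our sketch; Isabelle/HOL multisets are a quotient type, which the list representation below only approximates up to reordering):

-- A hereditary multiset is a finite multiset of hereditary multisets.
newtype HMSet = HMSet [HMSet]
  deriving Show

-- Via Cantor normal form, omega^a1 + ... + omega^an corresponds to the
-- multiset {a1, ..., an}, so HMSet represents the ordinals below epsilon-0.
omegaPow :: HMSet -> HMSet
omegaPow a = HMSet [a]                       -- omega^a as the singleton {a}

fromNat :: Int -> HMSet
fromNat n = HMSet (replicate n (HMSet []))   -- n copies of omega^0 = 1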

Cite as

Jasmin Christian Blanchette, Mathias Fleury, and Dmitriy Traytel. Nested Multisets, Hereditary Multisets, and Syntactic Ordinals in Isabelle/HOL. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 11:1-11:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{blanchette_et_al:LIPIcs.FSCD.2017.11,
  author =	{Blanchette, Jasmin Christian and Fleury, Mathias and Traytel, Dmitriy},
  title =	{{Nested Multisets, Hereditary Multisets, and Syntactic Ordinals in Isabelle/HOL}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{11:1--11:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.11},
  URN =		{urn:nbn:de:0030-drops-77155},
  doi =		{10.4230/LIPIcs.FSCD.2017.11},
  annote =	{Keywords: Multisets, ordinals, proof assistants}
}
Document
Observably Deterministic Concurrent Strategies and Intensional Full Abstraction for Parallel-or

Authors: Simon Castellan, Pierre Clairambault, and Glynn Winskel


Abstract
Although Plotkin’s parallel-or is inherently deterministic, it has a non-deterministic interpretation in games based on (prime) event structures - in which an event has a unique causal history - because they do not directly support disjunctive causality. General event structures can express disjunctive causality and have a more permissive notion of determinism, but do not support hiding. We show that (structures equivalent to) deterministic general event structures do support hiding, and construct a new category of games based on them with a deterministic interpretation of aPCFpor, an affine variant of PCF extended with parallel-or. We then exploit this deterministic interpretation to give a relaxed notion of determinism (observable determinism) on the plain event structures model. Putting this together with our previously introduced concurrent notions of well-bracketing and innocence, we obtain an intensionally fully abstract model of aPCFpor.
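
For reference, parallel-or is the function specified as follows (standard in the PCF literature); no sequential evaluation order of its two arguments realizes it, since either argument may have to be ignored when it diverges:

\begin{align*}
\mathsf{por}\;\mathsf{tt}\;\bot \;=\; \mathsf{tt} \qquad
\mathsf{por}\;\bot\;\mathsf{tt} \;=\; \mathsf{tt} \qquad
\mathsf{por}\;\mathsf{ff}\;\mathsf{ff} \;=\; \mathsf{ff}
\end{align*}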

Cite as

Simon Castellan, Pierre Clairambault, and Glynn Winskel. Observably Deterministic Concurrent Strategies and Intensional Full Abstraction for Parallel-or. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 12:1-12:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{castellan_et_al:LIPIcs.FSCD.2017.12,
  author =	{Castellan, Simon and Clairambault, Pierre and Winskel, Glynn},
  title =	{{Observably Deterministic Concurrent Strategies and Intensional Full Abstraction for Parallel-or}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{12:1--12:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.12},
  URN =		{urn:nbn:de:0030-drops-77219},
  doi =		{10.4230/LIPIcs.FSCD.2017.12},
  annote =	{Keywords: Game semantics, parallel-or, concurrent games, event structures, full abstraction}
}
Document
There Is Only One Notion of Differentiation

Authors: J. Robin B. Cockett and Jean-Simon Lemay


Abstract
Differential linear logic was introduced as a syntactic proof-theoretic approach to the analysis of differential calculus. Differential categories were subsequently introduced to provide a categorical model theory for differential linear logic. Differential categories use two different approaches for defining differentiation abstractly: a deriving transformation and a codereliction. While it was thought that these notions could give rise to distinct notions of differentiation, we show here that, in the presence of a monoidal coalgebra modality, they are completely equivalent.

Cite as

J. Robin B. Cockett and Jean-Simon Lemay. There Is Only One Notion of Differentiation. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 13:1-13:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{cockett_et_al:LIPIcs.FSCD.2017.13,
  author =	{Cockett, J. Robin B. and Lemay, Jean-Simon},
  title =	{{There Is Only One Notion of Differentiation}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{13:1--13:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.13},
  URN =		{urn:nbn:de:0030-drops-77166},
  doi =		{10.4230/LIPIcs.FSCD.2017.13},
  annote =	{Keywords: Differential Categories, Linear Logic, Coalgebra Modalities, Bialgebra Modalities}
}
Document
Confluence of an Extension of Combinatory Logic by Boolean Constants

Authors: Lukasz Czajka


Abstract
We show confluence of a conditional term rewriting system CL-pc^1, which is an extension of Combinatory Logic by Boolean constants. This solves problem 15 from the RTA list of open problems. The proof has been fully formalized in the Coq proof assistant.

Cite as

Lukasz Czajka. Confluence of an Extension of Combinatory Logic by Boolean Constants. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 14:1-14:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{czajka:LIPIcs.FSCD.2017.14,
  author =	{Czajka, Lukasz},
  title =	{{Confluence of an Extension of Combinatory Logic by Boolean Constants}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{14:1--14:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.14},
  URN =		{urn:nbn:de:0030-drops-77368},
  doi =		{10.4230/LIPIcs.FSCD.2017.14},
  annote =	{Keywords: combinatory logic, conditional linearization, unique normal form property, confluence}
}
Document
The Complexity of Principal Inhabitation

Authors: Andrej Dudenhefner and Jakob Rehof


Abstract
It is shown that in the simply typed lambda-calculus the following decision problem of principal inhabitation is Pspace-complete: Given a simple type tau, is there a lambda-term N in beta-normal form such that tau is the principal type of N? While a Ben-Yelles style algorithm was presented by Broda and Damas in 1999 to count normal principal inhabitants (thereby answering a question posed by Hindley), it does not induce a polynomial space upper bound for principal inhabitation. Further, the standard construction of the polynomial space lower bound for simple type inhabitation does not carry over immediately. We present a polynomial space bounded decision procedure based on a characterization of principal inhabitation using path derivation systems over subformulae of the input type, which does not require candidate inhabitants to be constructed explicitly. The lower bound is shown by reducing a restriction of simple type inhabitation to principal inhabitation.
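
A standard example separating the two problems (our illustration, in LaTeX notation):

\begin{align*}
\lambda x y.\,x \;:\; a \to b \to a && \text{(principal type, so } a \to b \to a \text{ is principally inhabited)}\\
\lambda x y.\,x,\ \lambda x y.\,y \;:\; a \to a \to a && \text{(inhabited, yet principal for neither term)}
\end{align*}

Thus a -> a -> a is inhabited but not principally inhabited: its beta-normal inhabitants already have the strictly more general principal types a -> b -> a and a -> b -> b.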

Cite as

Andrej Dudenhefner and Jakob Rehof. The Complexity of Principal Inhabitation. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 15:1-15:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{dudenhefner_et_al:LIPIcs.FSCD.2017.15,
  author =	{Dudenhefner, Andrej and Rehof, Jakob},
  title =	{{The Complexity of Principal Inhabitation}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{15:1--15:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.15},
  URN =		{urn:nbn:de:0030-drops-77359},
  doi =		{10.4230/LIPIcs.FSCD.2017.15},
  annote =	{Keywords: Lambda Calculus, Type Theory, Simple Types, Inhabitation, Principal Type, Complexity}
}
Document
List Objects with Algebraic Structure

Authors: Marcelo Fiore and Philip Saville


Abstract
We introduce and study the notion of list object with algebraic structure. The first key aspect of our development is that the notion of list object is considered in the context of monoidal structure; the second key aspect is that we further equip list objects with algebraic structure in this setting. Within our framework, we observe that list objects give rise to free monoids and moreover show that this remains so in the presence of algebraic structure. In addition, we provide a basic theory explicitly describing, as inductively defined objects, such free monoids with suitably compatible algebraic structure in common practical situations. This theory is accompanied by the study of two technical themes that, besides being of interest in their own right, are important for establishing applications. These themes are: parametrised initiality, central to the universal property defining list objects; and approaches to algebraic structure, in particular in the context of monoidal theories. The latter leads naturally to a notion of nsr (or near semiring) category of independent interest. With the theoretical development in place, we touch upon a variety of applications, considering Natural Numbers Objects in domain theory, giving a universal property for the monadic list transformer, providing free instances of algebraic extensions of the Haskell Monad type class, elucidating the algebraic character of the construction of opetopes in higher-dimensional algebra, and considering free models of second-order algebraic theories.
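
The simplest instance of the framework is the familiar one: in the cartesian setting, the list object on a is the free monoid on a. A Haskell sketch of that universal property (standard material, our rendering and names; the paper's general monoidal and algebraic setting goes well beyond this):

import Data.Monoid (Sum (..))

-- Universal property of [a] as the free monoid on a: every function
-- f :: a -> m into a monoid m extends to a unique monoid homomorphism.
extend :: Monoid m => (a -> m) -> [a] -> m
extend f = foldr (\x acc -> f x <> acc) mempty

-- e.g. the unique homomorphism extending Sum :: Int -> Sum Int:
total :: [Int] -> Sum Int
total = extend Sum
-- getSum (total [1, 2, 3]) == 6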

Cite as

Marcelo Fiore and Philip Saville. List Objects with Algebraic Structure. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 16:1-16:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{fiore_et_al:LIPIcs.FSCD.2017.16,
  author =	{Fiore, Marcelo and Saville, Philip},
  title =	{{List Objects with Algebraic Structure}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{16:1--16:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.16},
  URN =		{urn:nbn:de:0030-drops-77262},
  doi =		{10.4230/LIPIcs.FSCD.2017.16},
  annote =	{Keywords: list object, free monoid, strong monad, (cartesian, linear, and second-order) algebraic theory, near semiring, Haskell Monad type class, opetope}
}
Document
Is the Optimal Implementation Inefficient? Elementarily Not

Authors: Stefano Guerrini and Marco Solieri


Abstract
Sharing graphs are a local and asynchronous implementation of lambda-calculus beta-reduction (or linear logic proof-net cut-elimination) that avoids useless duplications. Empirical benchmarks suggest that they are one of the most efficient machineries when one wants to fully exploit the higher-order features of lambda-calculus. However, we still lack theoretically solid grounds to dispel uncertainties about the adoption of sharing graphs. Aiming at analysing in detail the worst-case overhead cost of sharing operators, we restrict to the case of elementary and light linear logic, two subsystems with bounded computational complexity of multiplicative exponential linear logic. In these two cases, the bookkeeping component is unnecessary, and sharing graphs are simplified to the so-called "abstract algorithm". By a modular cost comparison over a syntactical simulation, we prove that the overhead of shared reductions is quadratically bounded by the cost of the naive implementation, i.e. proof-net reduction. This result generalises and strengthens a previous complexity result, and implies that the price of sharing is negligible compared to the benefits obtainable on reductions requiring a large amount of duplication.

Cite as

Stefano Guerrini and Marco Solieri. Is the Optimal Implementation Inefficient? Elementarily Not. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 17:1-17:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{guerrini_et_al:LIPIcs.FSCD.2017.17,
  author =	{Guerrini, Stefano and Solieri, Marco},
  title =	{{Is the Optimal Implementation Inefficient? Elementarily Not}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{17:1--17:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.17},
  URN =		{urn:nbn:de:0030-drops-77337},
  doi =		{10.4230/LIPIcs.FSCD.2017.17},
  annote =	{Keywords: optimality, sharing graphs, lambda-calculus, complexity, linear logic, proof nets}
}
Document
Continuation Passing Style for Effect Handlers

Authors: Daniel Hillerström, Sam Lindley, Robert Atkey, and K. C. Sivaramakrishnan


Abstract
We present Continuation Passing Style (CPS) translations for Plotkin and Pretnar's effect handlers with Hillerström and Lindley's row-typed fine-grain call-by-value calculus of effect handlers as the source language. CPS translations of handlers are interesting theoretically, to explain the semantics of handlers, and also offer a practical implementation technique that does not require special support in the target language's runtime. We begin with a first-order CPS translation into untyped lambda calculus which manages a stack of continuations and handlers as a curried sequence of arguments. We then refine the initial CPS translation first by uncurrying it to yield a properly tail-recursive translation and second by making it higher-order in order to contract administrative redexes at translation time. We prove that the higher-order CPS translation simulates effect handler reduction. We have implemented the higher-order CPS translation as a JavaScript backend for the Links programming language.
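
A minimal sketch of the underlying idea (ours, far simpler than the paper's row-typed calculus and its uncurried, higher-order translations): represent a computation as a tree of operation requests and let the handler interpret each request, with the continuation passed explicitly.

-- One effect: 'Ask', requesting an Int from the handler.
data Comp a = Return a | Ask (Int -> Comp a)

-- A CPS-style handler: interpret 'Ask' by supplying a fixed value,
-- threading the continuation k explicitly.
handle :: Int -> Comp a -> (a -> r) -> r
handle _ (Return x)   k = k x
handle v (Ask resume) k = handle v (resume v) k

example :: Comp Int
example = Ask (\x -> Ask (\y -> Return (x + y)))

-- handle 21 example id == 42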

Cite as

Daniel Hillerström, Sam Lindley, Robert Atkey, and K. C. Sivaramakrishnan. Continuation Passing Style for Effect Handlers. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 18:1-18:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{hillerstrom_et_al:LIPIcs.FSCD.2017.18,
  author =	{Hillerstr\"{o}m, Daniel and Lindley, Sam and Atkey, Robert and Sivaramakrishnan, K. C.},
  title =	{{Continuation Passing Style for Effect Handlers}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{18:1--18:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.18},
  URN =		{urn:nbn:de:0030-drops-77394},
  doi =		{10.4230/LIPIcs.FSCD.2017.18},
  annote =	{Keywords: effect handlers, delimited control, continuation passing style}
}
Document
Infinite Runs in Abstract Completion

Authors: Nao Hirokawa, Aart Middeldorp, Christian Sternagel, and Sarah Winkler


Abstract
Completion is one of the first and most studied techniques in term rewriting and fundamental to automated reasoning with equalities. In an earlier paper we presented a new and formalized correctness proof of abstract completion for finite runs. In this paper we extend our analysis and our formalization to infinite runs, resulting in a new proof that fair infinite runs produce complete presentations of the initial equations. We further consider ordered completion - an important extension of completion that aims to produce ground-complete presentations of the initial equations. Moreover, we revisit and extend results of Métivier concerning canonicity of rewrite systems. All proofs presented in the paper have been formalized in Isabelle/HOL.
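
For readers meeting completion abstractly for the first time: completion procedures are runs of an inference system on pairs (E, R) of equations and rules. In the standard formulation (side conditions, notably the encompassment condition of Collapse, abbreviated; > is the fixed reduction order) the rules are:

\begin{align*}
\textsc{Deduce}\colon\; & (\mathcal{E}, \mathcal{R}) \vdash (\mathcal{E} \cup \{s \approx t\}, \mathcal{R}) && \text{if } s \leftarrow_{\mathcal{R}} u \rightarrow_{\mathcal{R}} t\\
\textsc{Orient}\colon\; & (\mathcal{E} \uplus \{s \approx t\}, \mathcal{R}) \vdash (\mathcal{E}, \mathcal{R} \cup \{s \to t\}) && \text{if } s > t\\
\textsc{Delete}\colon\; & (\mathcal{E} \uplus \{s \approx s\}, \mathcal{R}) \vdash (\mathcal{E}, \mathcal{R}) &&\\
\textsc{Simplify}\colon\; & (\mathcal{E} \uplus \{s \approx t\}, \mathcal{R}) \vdash (\mathcal{E} \cup \{s \approx u\}, \mathcal{R}) && \text{if } t \rightarrow_{\mathcal{R}} u\\
\textsc{Compose}\colon\; & (\mathcal{E}, \mathcal{R} \uplus \{s \to t\}) \vdash (\mathcal{E}, \mathcal{R} \cup \{s \to u\}) && \text{if } t \rightarrow_{\mathcal{R}} u\\
\textsc{Collapse}\colon\; & (\mathcal{E}, \mathcal{R} \uplus \{s \to t\}) \vdash (\mathcal{E} \cup \{u \approx t\}, \mathcal{R}) && \text{if } s \rightarrow_{\mathcal{R}} u
\end{align*}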

Cite as

Nao Hirokawa, Aart Middeldorp, Christian Sternagel, and Sarah Winkler. Infinite Runs in Abstract Completion. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 19:1-19:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{hirokawa_et_al:LIPIcs.FSCD.2017.19,
  author =	{Hirokawa, Nao and Middeldorp, Aart and Sternagel, Christian and Winkler, Sarah},
  title =	{{Infinite Runs in Abstract Completion}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{19:1--19:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.19},
  URN =		{urn:nbn:de:0030-drops-77252},
  doi =		{10.4230/LIPIcs.FSCD.2017.19},
  annote =	{Keywords: term rewriting, abstract completion, ordered completion, canonicity, Isabelle/HOL}
}
Document
Refutation of Sallé's Longstanding Conjecture

Authors: Benedetto Intrigila, Giulio Manzonetto, and Andrew Polonsky


Abstract
The lambda-calculus possesses a strong notion of extensionality, called "the omega-rule", which has been the subject of many investigations. It is a longstanding open problem whether the equivalence obtained by closing the theory of Böhm trees under the omega-rule is strictly included in Morris's original observational theory, as conjectured by Sallé in the seventies. In a recent work, Breuvart et al. have shown that Morris's theory satisfies the omega-rule. In this paper we demonstrate that the two aforementioned theories actually coincide, thus disproving Sallé's conjecture.

Cite as

Benedetto Intrigila, Giulio Manzonetto, and Andrew Polonsky. Refutation of Sallé's Longstanding Conjecture. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 20:1-20:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{intrigila_et_al:LIPIcs.FSCD.2017.20,
  author =	{Intrigila, Benedetto and Manzonetto, Giulio and Polonsky, Andrew},
  title =	{{Refutation of Sall\'{e}'s Longstanding Conjecture}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{20:1--20:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.20},
  URN =		{urn:nbn:de:0030-drops-77236},
  doi =		{10.4230/LIPIcs.FSCD.2017.20},
  annote =	{Keywords: lambda calculus, observational equivalence, B\"{o}hm trees, omega-rule}
}
Document
Relating System F and Lambda2: A Case Study in Coq, Abella and Beluga

Authors: Jonas Kaiser, Brigitte Pientka, and Gert Smolka


Abstract
We give three formalisations of a proof of the equivalence of the usual, two-sorted presentation of System F and its single-sorted pure type system (PTS) variant Lambda2. This is established by reducing the typability problem of F to Lambda2 and vice versa. A key challenge is the treatment of variable binding and contextual information. The formalisations all share the same high level proof structure using relations to connect the type systems. They do, however, differ significantly in their representation and manipulation of variables and contextual information. In Coq, we use pure de Bruijn indices and parallel substitutions. In Abella, we use higher-order abstract syntax (HOAS) and nominal constants of the ambient reasoning logic. In Beluga, we also use HOAS but within contextual modal type theory. Our contribution is twofold. First, we present and compare a collection of machine-checked solutions to a non-trivial theoretical result. Second, we propose our proof as a benchmark, complementing the POPLmark and ORBI challenges by testing how well a given proof assistant or framework handles complex contextual information involving multiple type systems.
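
For reference, Lambda2 is the pure type system with the following standard specification; the rule (*, *) types ordinary abstraction and (square, *) types polymorphic abstraction:

\begin{align*}
\mathcal{S} = \{\ast, \square\} \qquad
\mathcal{A} = \{\ast : \square\} \qquad
\mathcal{R} = \{(\ast, \ast),\ (\square, \ast)\}
\end{align*}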

Cite as

Jonas Kaiser, Brigitte Pientka, and Gert Smolka. Relating System F and Lambda2: A Case Study in Coq, Abella and Beluga. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 21:1-21:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{kaiser_et_al:LIPIcs.FSCD.2017.21,
  author =	{Kaiser, Jonas and Pientka, Brigitte and Smolka, Gert},
  title =	{{Relating System F and Lambda2: A Case Study in Coq, Abella and Beluga}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{21:1--21:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.21},
  URN =		{urn:nbn:de:0030-drops-77248},
  doi =		{10.4230/LIPIcs.FSCD.2017.21},
  annote =	{Keywords: Pure Type Systems, System F, de Bruijn Syntax, Higher-Order Abstract Syntax, Contextual Reasoning}
}
Document
A Polynomial-Time Algorithm for the Lambek Calculus with Brackets of Bounded Order

Authors: Max Kanovich, Stepan Kuznetsov, Glyn Morrill, and Andre Scedrov


Abstract
Lambek calculus is a logical foundation of categorial grammar, a linguistic paradigm of grammar as logic and parsing as deduction. Pentus (2010) gave a polynomial-time algorithm for determining provability of bounded depth formulas in L*, the Lambek calculus with empty antecedents allowed. Pentus' algorithm is based on tabularisation of proof nets. Lambek calculus with brackets is a conservative extension of Lambek calculus with bracket modalities, suitable for the modeling of syntactical domains. In this paper we give an algorithm for provability in Lb*, the Lambek calculus with brackets allowing empty antecedents. Our algorithm runs in polynomial time when both the formula depth and the bracket nesting depth are bounded. It combines a Pentus-style tabularisation of proof nets with an automata-theoretic treatment of bracketing.
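
As background, the algebraic heart of the Lambek calculus is the residuation correspondence between the product and the two directed implications (standard, and independent of the bracket extension studied here):

\begin{align*}
A \bullet B \to C
\quad\Longleftrightarrow\quad
A \to C\,/\,B
\quad\Longleftrightarrow\quad
B \to A \backslash C
\end{align*}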

Cite as

Max Kanovich, Stepan Kuznetsov, Glyn Morrill, and Andre Scedrov. A Polynomial-Time Algorithm for the Lambek Calculus with Brackets of Bounded Order. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 22:1-22:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{kanovich_et_al:LIPIcs.FSCD.2017.22,
  author =	{Kanovich, Max and Kuznetsov, Stepan and Morrill, Glyn and Scedrov, Andre},
  title =	{{A Polynomial-Time Algorithm for the Lambek Calculus with Brackets of Bounded Order}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{22:1--22:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.22},
  URN =		{urn:nbn:de:0030-drops-77387},
  doi =		{10.4230/LIPIcs.FSCD.2017.22},
  annote =	{Keywords: Lambek calculus, proof nets, Lambek calculus with brackets, categorial grammar, polynomial algorithm}
}
Document
Polynomial Running Times for Polynomial-Time Oracle Machines

Authors: Akitoshi Kawamura and Florian Steinberg


Abstract
This paper introduces a more restrictive notion of feasibility of functionals on Baire space than the established one from second-order complexity theory, thereby making it possible to consider functions on the natural numbers as running times of oracle Turing machines and avoiding second-order polynomials, which are notoriously difficult to handle. Furthermore, all machines that witness this stronger kind of feasibility can be clocked, and the different traditions of treating partial functionals from computable analysis and second-order complexity theory are equated in a precise sense. The new notion is named "strong polynomial-time computability" and proven to be a strictly stronger requirement than polynomial-time computability. It is proven that, within the framework for complexity of operators from analysis introduced by Kawamura and Cook, the classes of strongly polynomial-time computable functionals and polynomial-time computable functionals coincide.
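
For concreteness, a second-order polynomial maps a size bound for the oracle (a function on natural numbers) and an input size to an output size. A Haskell rendering of one such bound (our illustration of the notion the paper seeks to avoid):

type SOPoly = (Integer -> Integer) -> Integer -> Integer

-- P(l)(n) = l(l(n)) + l(n) * n + 3: the nesting of the oracle bound l
-- is what makes second-order polynomials hard to handle.
p :: SOPoly
p l n = l (l n) + l n * n + 3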

Cite as

Akitoshi Kawamura and Florian Steinberg. Polynomial Running Times for Polynomial-Time Oracle Machines. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 23:1-23:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{kawamura_et_al:LIPIcs.FSCD.2017.23,
  author =	{Kawamura, Akitoshi and Steinberg, Florian},
  title =	{{Polynomial Running Times for Polynomial-Time Oracle Machines}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{23:1--23:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.23},
  URN =		{urn:nbn:de:0030-drops-77370},
  doi =		{10.4230/LIPIcs.FSCD.2017.23},
  annote =	{Keywords: second-order complexity, oracle Turing machine, computable analysis, second-order polynomial, computational complexity of partial functionals}
}
Document
Types as Resources for Classical Natural Deduction

Authors: Delia Kesner and Pierre Vial


Abstract
We define two resource aware typing systems for the lambda-mu-calculus based on non-idempotent intersection and union types. The non-idempotent approach provides very simple combinatorial arguments - based on decreasing measures of type derivations - to characterize head and strongly normalizing terms. Moreover, typability provides upper bounds for the length of head-reduction sequences and maximal reduction sequences.
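
For readers unfamiliar with the non-idempotent setting, the key point is that intersection types are multisets rather than sets, so derivation sizes can strictly decrease under reduction. A minimal sketch with invented names, not the authors' system:

-- Non-idempotent intersection types: argument types form a multiset,
-- here represented as a list, so duplicates are counted.
data IType
  = TVar String
  | Arrow [IType] IType

-- A size measure of the kind from which decreasing measures of type
-- derivations are built, yielding upper bounds on reduction lengths.
sizeT :: IType -> Int
sizeT (TVar _)     = 1
sizeT (Arrow ms t) = 1 + sum (map sizeT ms) + sizeT t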

Cite as

Delia Kesner and Pierre Vial. Types as Resources for Classical Natural Deduction. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 24:1-24:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


Copy BibTex To Clipboard

@InProceedings{kesner_et_al:LIPIcs.FSCD.2017.24,
  author =	{Kesner, Delia and Vial, Pierre},
  title =	{{Types as Resources for Classical Natural Deduction}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{24:1--24:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.24},
  URN =		{urn:nbn:de:0030-drops-77135},
  doi =		{10.4230/LIPIcs.FSCD.2017.24},
  annote =	{Keywords: lambda-mu-calculus, classical logic, intersection types, normalization}
}
Document
A Fibrational Framework for Substructural and Modal Logics

Authors: Daniel R. Licata, Michael Shulman, and Mitchell Riley


Abstract
We define a general framework that abstracts the common features of many intuitionistic substructural and modal logics / type theories. The framework is a sequent calculus / normal-form type theory parametrized by a mode theory, which is used to describe the structure of contexts and the structural properties they obey. In this sequent calculus, the context itself obeys standard structural properties, while a term, drawn from the mode theory, constrains how the context can be used. Product types, implications, and modalities are defined as instances of two general connectives, one positive and one negative, that manipulate these terms. Specific mode theories can express a range of substructural and modal connectives, including non-associative, ordered, linear, affine, relevant, and cartesian products and implications; monoidal and non-monoidal functors, (co)monads and adjunctions; n-linear variables; and bunched implications. We prove cut (and identity) admissibility independently of the mode theory, obtaining it for many different logics at once. Further, we give a general equational theory on derivations / terms that, in addition to the usual beta/eta-rules, characterizes when two derivations differ only by the placement of structural rules. Additionally, we give an equivalent semantic presentation of these ideas, in which a mode theory corresponds to a 2-dimensional cartesian multicategory, the framework corresponds to another such multicategory with a functor to the mode theory, and the logical connectives make this into a bifibration. Finally, we show how the framework can be used both to encode existing logics / type theories and to design new ones.
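
As a crude rendering of one aspect of the framework (my own simplification; the paper's mode theories also cover ordered and non-associative structure, functors, and adjunctions): a mode can be thought of as recording which structural rules its contexts obey, which determines whether the generic product and implication behave linearly, affinely, relevantly, or cartesianly.

data Mode = Mode
  { allowsWeakening   :: Bool
  , allowsContraction :: Bool
  , allowsExchange    :: Bool
  }

-- The four classic regimes as instances of the same record.
linear, affine, relevant, cartesian :: Mode
linear    = Mode False False True
affine    = Mode True  False True
relevant  = Mode False True  True
cartesian = Mode True  True  True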

Cite as

Daniel R. Licata, Michael Shulman, and Mitchell Riley. A Fibrational Framework for Substructural and Modal Logics. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 25:1-25:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


Copy BibTex To Clipboard

@InProceedings{licata_et_al:LIPIcs.FSCD.2017.25,
  author =	{Licata, Daniel R. and Shulman, Michael and Riley, Mitchell},
  title =	{{A Fibrational Framework for Substructural and Modal Logics}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{25:1--25:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.25},
  URN =		{urn:nbn:de:0030-drops-77400},
  doi =		{10.4230/LIPIcs.FSCD.2017.25},
  annote =	{Keywords: type theory, modal logic, substructural logic, homotopy type theory}
}
Document
Arrays and References in Resource Aware ML

Authors: Benjamin Lichtman and Jan Hoffmann


Abstract
This article introduces a technique to accurately perform static prediction of resource usage for ML-like functional programs with references and arrays. Previous research successfully integrated the potential method of amortized analysis with a standard type system to automatically derive parametric resource bounds. The analysis is naturally compositional and the resource consumption of functions can be abstracted using potential-annotated types. The soundness theorem of the analysis guarantees that the derived bounds are correct with respect to the resource usage defined by a cost semantics. Type inference can be efficiently automated using off-the-shelf LP solvers, even if the derived bounds are polynomials. However, side effects and aliasing of heap references make it notoriously difficult to derive bounds that depend on mutable structures, such as arrays and references. As a result, existing automatic amortized analysis systems for ML-like programs cannot derive bounds for programs whose resource consumption depends on data in such structures. This article extends the potential method to handle mutable structures with minimal changes to the type rules while preserving the stated advantages of amortized analysis. To do so, we introduce a swap operation for references and arrays that users can employ to make programs suitable for automatic analysis. We prove the soundness of the analysis by introducing a potential-annotated memory typing, which gathers all unique locations reachable from a reference. Apart from the design of the system, the main contribution is the proof of soundness for the extended analysis system.
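
The swap idea itself is simple; here is a minimal illustration with Haskell IORefs (not RAML's actual primitive or syntax): exchanging the contents of a reference hands back the old value instead of duplicating it, so the potential stored in a heap cell is consumed exactly once.

import Data.IORef

-- Write a new value into the reference and return the old contents.
swapRef :: IORef a -> a -> IO a
swapRef r x = do
  old <- readIORef r
  writeIORef r x
  return old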

Cite as

Benjamin Lichtman and Jan Hoffmann. Arrays and References in Resource Aware ML. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 26:1-26:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


Copy BibTex To Clipboard

@InProceedings{lichtman_et_al:LIPIcs.FSCD.2017.26,
  author =	{Lichtman, Benjamin and Hoffmann, Jan},
  title =	{{Arrays and References in Resource Aware ML}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{26:1--26:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.26},
  URN =		{urn:nbn:de:0030-drops-77282},
  doi =		{10.4230/LIPIcs.FSCD.2017.26},
  annote =	{Keywords: Resource Analysis, Functional Programming, Static Analysis, OCaml, Amortized Analysis}
}
Document
Negative Translations and Normal Modality

Authors: Tadeusz Litak, Miriam Polzer, and Ulrich Rabenstein


Abstract
We discuss the behaviour of variants of the standard negative translations - Kolmogorov, Gödel-Gentzen, Kuroda and Glivenko - in propositional logics with a unary normal modality. More specifically, we address the question of whether negative translations, as a rule, faithfully embed a classical modal logic into its intuitionistic counterpart. As it turns out, even the Kolmogorov translation can go wrong with rather natural modal principles. Nevertheless, we isolate sufficient syntactic criteria ensuring adequacy of well-behaved (or, in our terminology, regular) translations for logics axiomatized with formulas satisfying these criteria, which we call enveloped implications. Furthermore, a large class of computationally relevant modal logics - namely, logics of type inhabitation for applicative functors (a.k.a. idioms) - turns out to validate the modal counterpart of the Double Negation Shift, thus ensuring adequacy of even the Glivenko translation. All our positive results are proved purely syntactically, using the minimal natural deduction system of Bellin, de Paiva and Ritter extended with additional axioms/combinators, and hence also allow a direct formalization in a proof assistant (in our case Coq).
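
For orientation, recall the Kolmogorov translation, which double-negates every subformula; the paper's question is when its natural modal extension is faithful. A sketch with an invented datatype (conventions for bottom and for the modal clause vary across presentations):

data Prop
  = PVar String
  | Bot
  | Imp Prop Prop
  | And Prop Prop
  | Or  Prop Prop
  | Box Prop          -- the unary normal modality

negP :: Prop -> Prop
negP a = Imp a Bot

-- Double negation at every subformula, pushed under the modality.
kolmogorov :: Prop -> Prop
kolmogorov p = case p of
  PVar x  -> nn (PVar x)
  Bot     -> Bot
  Imp a b -> nn (Imp (kolmogorov a) (kolmogorov b))
  And a b -> nn (And (kolmogorov a) (kolmogorov b))
  Or  a b -> nn (Or  (kolmogorov a) (kolmogorov b))
  Box a   -> nn (Box (kolmogorov a))
  where nn = negP . negP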

Cite as

Tadeusz Litak, Miriam Polzer, and Ulrich Rabenstein. Negative Translations and Normal Modality. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 27:1-27:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


Copy BibTex To Clipboard

@InProceedings{litak_et_al:LIPIcs.FSCD.2017.27,
  author =	{Litak, Tadeusz and Polzer, Miriam and Rabenstein, Ulrich},
  title =	{{Negative Translations and Normal Modality}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{27:1--27:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.27},
  URN =		{urn:nbn:de:0030-drops-77412},
  doi =		{10.4230/LIPIcs.FSCD.2017.27},
  annote =	{Keywords: negative translations, intuitionistic modal logic, normal modality, double negation}
}
Document
Models of Type Theory Based on Moore Paths

Authors: Ian Orton and Andrew M. Pitts


Abstract
This paper introduces a new family of models of intensional Martin-Löf type theory. We use constructive ordered algebra in toposes. Identity types in the models are given by a notion of Moore path. By considering a particular gros topos, we show that there is such a model that is non-truncated, i.e. contains non-trivial structure at all dimensions. In other words, in this model a type in a nested sequence of identity types can contain more than one element, no matter how great the degree of nesting. Although inspired by existing non-truncated models of type theory based on simplicial and on cubical sets, the notion of model presented here is notable for avoiding any form of Kan filling condition in the semantics of types.
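
The classical notion behind the models (the standard topological definition, not the paper's topos-theoretic construction): a Moore path carries an explicit length, which makes concatenation strictly associative and strictly unital, so identity types acquire strict structure without any Kan filling. A small sketch:

-- A Moore path in X: a length r >= 0 and a function on [0, r].
data MoorePath x = MoorePath
  { pathLen :: Double
  , pathFun :: Double -> x
  }

-- Concatenation adds lengths; it is strictly associative, and the
-- zero-length path is a strict unit (assuming endpoints match).
concatMP :: MoorePath x -> MoorePath x -> MoorePath x
concatMP p q = MoorePath (pathLen p + pathLen q) f
  where
    f t | t <= pathLen p = pathFun p t
        | otherwise      = pathFun q (t - pathLen p)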

Cite as

Ian Orton and Andrew M. Pitts. Models of Type Theory Based on Moore Paths. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 28:1-28:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


Copy BibTex To Clipboard

@InProceedings{orton_et_al:LIPIcs.FSCD.2017.28,
  author =	{Orton, Ian and Pitts, Andrew M.},
  title =	{{Models of Type Theory Based on Moore Paths}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{28:1--28:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.28},
  URN =		{urn:nbn:de:0030-drops-77149},
  doi =		{10.4230/LIPIcs.FSCD.2017.28},
  annote =	{Keywords: dependent type theory, homotopy theory, Moore path, topos}
}
Document
On Dinaturality, Typability and beta-eta-Stable Models

Authors: Paolo Pistone


Abstract
We present two results which relate dinaturality with a syntactic property (typability) and a semantic one (interpretability by beta-eta-stable sets). First, we prove that closed dinatural lambda-terms are simply typable, that is, the converse of the well-known fact that simply typable closed terms are dinatural. The argument exposes a syntactical aspect of dinaturality, as lambda-terms are type-checked by computing their associated dinaturality equation. Second, we prove that a closed lambda-term belonging to all beta-eta-stable interpretations of a simple type must be dinatural, that is, we prove dinaturality by semantic means. To do this, we show that such terms satisfy a suitable version of binary parametricity and we derive dinaturality from it. By combining the two results we obtain a new proof of the completeness of beta-eta-stable semantics with respect to simple types. While the completeness of this semantics is well-known in the literature, the technique developed here suggests that dinaturality might be exploited to prove completeness also for other, less manageable, semantics (like saturated families or reducibility candidates) for which a direct argument is not yet known.
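
A concrete instance of the dinaturality equation at the type (a -> a) -> (a -> a) (a standard example, not taken from the paper): whenever f . g = h . f, a dinatural t satisfies f . t g = t h . f.

-- A closed simply typed term at the relevant type (a Church numeral).
t :: (a -> a) -> (a -> a)
t g = g . g

-- Check the equation pointwise for f = (*2), g = (+1), h = (+2),
-- which satisfy f . g == h . f.
checkDinat :: Bool
checkDinat = all (\x -> (f . t g) x == (t h . f) x) [0 .. 100]
  where
    f = (* 2) :: Int -> Int
    g = (+ 1)
    h = (+ 2)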

Cite as

Paolo Pistone. On Dinaturality, Typability and beta-eta-Stable Models. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 29:1-29:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


Copy BibTex To Clipboard

@InProceedings{pistone:LIPIcs.FSCD.2017.29,
  author =	{Pistone, Paolo},
  title =	{{On Dinaturality, Typability and beta-eta-Stable Models}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{29:1--29:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.29},
  URN =		{urn:nbn:de:0030-drops-77291},
  doi =		{10.4230/LIPIcs.FSCD.2017.29},
  annote =	{Keywords: Dinaturality, simply-typed lambda-calculus, beta-eta-stable semantics, completeness}
}
Document
A Curry-Howard Approach to Church's Synthesis

Authors: Pierre Pradic and Colin Riba


Abstract
Church's synthesis problem asks whether there exists a finite-state stream transducer satisfying a given input-output specification. For specifications written in Monadic Second-Order Logic over infinite words, Church's synthesis can theoretically be solved algorithmically using automata and games. We revisit Church's synthesis via the Curry-Howard correspondence by introducing SMSO, a non-classical subsystem of MSO, which is shown to be sound and complete w.r.t. synthesis thanks to an automata-based realizability model.
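
The objects the synthesis problem asks for are finite-state stream transducers; a minimal Mealy-style sketch (a hypothetical encoding, not the paper's extraction procedure):

-- A finite-state stream transducer with state s, input i, output o.
data Mealy s i o = Mealy
  { start :: s
  , step  :: s -> i -> (s, o)
  }

runMealy :: Mealy s i o -> [i] -> [o]
runMealy m = go (start m)
  where
    go _ []       = []
    go s (x : xs) = let (s', y) = step m s x in y : go s' xs

-- Example specification: output True exactly when the current input bit
-- differs from the previous one.
edgeDetect :: Mealy Bool Bool Bool
edgeDetect = Mealy False (\prev b -> (b, b /= prev))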

Cite as

Pierre Pradic and Colin Riba. A Curry-Howard Approach to Church's Synthesis. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 30:1-30:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


Copy BibTex To Clipboard

@InProceedings{pradic_et_al:LIPIcs.FSCD.2017.30,
  author =	{Pradic, Pierre and Riba, Colin},
  title =	{{A Curry-Howard Approach to Church's Synthesis}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{30:1--30:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.30},
  URN =		{urn:nbn:de:0030-drops-77198},
  doi =		{10.4230/LIPIcs.FSCD.2017.30},
  annote =	{Keywords: Intuitionistic Arithmetic, Realizability, Monadic Second-Order Logic on Infinite Words}
}
Document
Combinatorial Flows and Their Normalisation

Authors: Lutz Straßburger


Abstract
This paper introduces combinatorial flows, a generalisation of combinatorial proofs that also incorporates cut and substitution as methods of proof compression. We give a normalisation procedure for combinatorial flows and show how syntactic proofs are translated into combinatorial flows and vice versa.

Cite as

Lutz Straßburger. Combinatorial Flows and Their Normalisation. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 31:1-31:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


Copy BibTex To Clipboard

@InProceedings{straburger:LIPIcs.FSCD.2017.31,
  author =	{Stra{\ss}burger, Lutz},
  title =	{{Combinatorial Flows and Their Normalisation}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{31:1--31:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.31},
  URN =		{urn:nbn:de:0030-drops-77204},
  doi =		{10.4230/LIPIcs.FSCD.2017.31},
  annote =	{Keywords: proof equivalence, cut elimination, substitution, deep inference}
}
Document
Streett Automata Model Checking of Higher-Order Recursion Schemes

Authors: Ryota Suzuki, Koichi Fujima, Naoki Kobayashi, and Takeshi Tsukada


Abstract
We propose a practical algorithm for Streett automata model checking of higher-order recursion schemes (HORS), which checks whether the tree generated by a given HORS is accepted by a given Streett automaton. The Streett automata model checking of HORS is useful in the context of liveness verification of higher-order functional programs. The previous approach to Streett automata model checking converted Streett automata to parity automata and then invoked a parity tree automata model checker. We show through experiments that our direct approach outperforms the previous approach. Besides being able to directly deal with Streett automata, our algorithm is the first practical Streett or parity automata model checking algorithm that runs in time polynomial in the size of HORS, assuming that the other parameters are fixed. Previous practical fixed-parameter polynomial time algorithms for HORS could only deal with the class of trivial tree automata. We have confirmed through experiments that (a parity automata version of) our model checker outperforms previous parity automata model checkers for HORS.
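
For reference, the acceptance condition involved (a standard definition, with a simplification for ultimately periodic runs; the encoding below is my own): a Streett condition is a finite set of pairs (E_i, F_i), and a run is accepting iff, for every pair, visiting E_i infinitely often implies visiting F_i infinitely often. For a lasso-shaped run, "infinitely often" reduces to membership in the loop.

import qualified Data.Set as Set

type StreettPair s = (Set.Set s, Set.Set s)

-- Check a Streett condition against the set of states on the loop of a lasso.
acceptsLasso :: Ord s => [StreettPair s] -> [s] -> Bool
acceptsLasso pairs loopStates = all ok pairs
  where
    inf = Set.fromList loopStates
    ok (e, f) =
      Set.null (Set.intersection inf e) || not (Set.null (Set.intersection inf f))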

Cite as

Ryota Suzuki, Koichi Fujima, Naoki Kobayashi, and Takeshi Tsukada. Streett Automata Model Checking of Higher-Order Recursion Schemes. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 32:1-32:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


Copy BibTex To Clipboard

@InProceedings{suzuki_et_al:LIPIcs.FSCD.2017.32,
  author =	{Suzuki, Ryota and Fujima, Koichi and Kobayashi, Naoki and Tsukada, Takeshi},
  title =	{{Streett Automata Model Checking of Higher-Order Recursion Schemes}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{32:1--32:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.32},
  URN =		{urn:nbn:de:0030-drops-77325},
  doi =		{10.4230/LIPIcs.FSCD.2017.32},
  annote =	{Keywords: Higher-order model checking, higher-order recursion schemes, Streett automata}
}
Document
A Sequent Calculus for a Semi-Associative Law

Authors: Noam Zeilberger


Abstract
We introduce a sequent calculus with a simple restriction of Lambek's product rules that precisely captures the classical Tamari order, i.e., the partial order on fully-bracketed words (equivalently, binary trees) induced by a semi-associative law (equivalently, tree rotation). We establish a focusing property for this sequent calculus (a strengthening of cut-elimination), which yields the following coherence theorem: every valid entailment in the Tamari order has exactly one focused derivation. One combinatorial application of this coherence theorem is a new proof of the Tutte-Chapoton formula for the number of intervals in the Tamari lattice Y_n. Elsewhere, we have also used the sequent calculus and the coherence theorem to build a surprising bijection between intervals of the Tamari order and a natural fragment of lambda calculus, consisting of the beta-normal planar lambda terms with no closed proper subterms.
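
The semi-associative law in question is the one-directional rewrite (A * B) * C => A * (B * C); on binary trees it is right rotation, and the Tamari order is its reflexive-transitive closure. (The Tutte-Chapoton formula mentioned above counts the intervals of the Tamari lattice Y_n as 2(4n+1)!/((n+1)!(3n+2)!).) A minimal sketch of the generating moves:

data Tree = Leaf | Node Tree Tree deriving (Eq, Show)

-- All trees reachable in one semi-associative step: a right rotation
-- applied at some node. These are exactly the covering moves that
-- generate the Tamari order.
covers :: Tree -> [Tree]
covers Leaf         = []
covers t@(Node l r) = rootStep ++ leftSteps ++ rightSteps
  where
    rootStep = case t of
      Node (Node a b) c -> [Node a (Node b c)]
      _                 -> []
    leftSteps  = [Node l' r | l' <- covers l]
    rightSteps = [Node l r' | r' <- covers r]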

Cite as

Noam Zeilberger. A Sequent Calculus for a Semi-Associative Law. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 33:1-33:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


Copy BibTex To Clipboard

@InProceedings{zeilberger:LIPIcs.FSCD.2017.33,
  author =	{Zeilberger, Noam},
  title =	{{A Sequent Calculus for a Semi-Associative Law}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{33:1--33:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.33},
  URN =		{urn:nbn:de:0030-drops-77179},
  doi =		{10.4230/LIPIcs.FSCD.2017.33},
  annote =	{Keywords: proof theory, combinatorics, coherence theorem, substructural logic, associativity}
}
