LIPIcs, Volume 108

3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)




Event

FSCD 2018, July 9-12, 2018, Oxford, UK

Editor

Hélène Kirchner

Publication Details

  • Published: 2018-07-04
  • Publisher: Schloss Dagstuhl – Leibniz-Zentrum für Informatik
  • ISBN: 978-3-95977-077-4
  • DBLP: db/conf/rta/fscd2018

Documents
Document
Complete Volume
LIPIcs, Volume 108, FSCD'18, Complete Volume

Authors: Hélène Kirchner


Abstract
LIPIcs, Volume 108, FSCD'18, Complete Volume

Cite as

3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@Proceedings{kirchner:LIPIcs.FSCD.2018,
  title =	{{LIPIcs, Volume 108, FSCD'18, Complete Volume}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018},
  URN =		{urn:nbn:de:0030-drops-92819},
  doi =		{10.4230/LIPIcs.FSCD.2018},
  annote =	{Keywords: Theory of computation, Models of computation, Formal languages and automata theory, Logic, Semantics and reasoning, Software and its engineering}
}
Document
Front Matter
Front Matter, Table of Contents, Preface, Conference Organization

Authors: Hélène Kirchner


Abstract
Front Matter, Table of Contents, Preface, Conference Organization

Cite as

3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 0:i-0:xx, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{kirchner:LIPIcs.FSCD.2018.0,
  author =	{Kirchner, H\'{e}l\`{e}ne},
  title =	{{Front Matter, Table of Contents, Preface, Conference Organization}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{0:i--0:xx},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.0},
  URN =		{urn:nbn:de:0030-drops-91706},
  doi =		{10.4230/LIPIcs.FSCD.2018.0},
  annote =	{Keywords: Front Matter, Table of Contents, Preface, Conference Organization}
}
Document
Invited Talk
Analysing Privacy-Type Properties in Cryptographic Protocols (Invited Talk)

Authors: Stéphanie Delaune


Abstract
Cryptographic protocols aim at securing communications over insecure networks such as the Internet, where dishonest users may listen to communications and interfere with them. For example, passports are no longer pure paper documents. Instead, they contain a chip that stores additional information such as pictures and fingerprints of their holder. In order to ensure privacy, these chips include a mechanism, i.e. a cryptographic protocol, that does not let the passport disclose private information to any external user other than the intended terminal. This is just a single example, but of course privacy appears in many other contexts, such as RFID technologies or electronic voting. Formal methods have been successfully applied to automatically analyse traditional protocols and discover their flaws. Privacy-type security properties (e.g. anonymity, unlinkability, vote secrecy, ...) are expressed relying on a notion of behavioural equivalence, and are actually more difficult to analyse than confidentiality and authentication properties. We will discuss some recent advances in automatically analysing equivalence-based security properties, and we will review some issues that remain to be solved in this field.

Cite as

Stéphanie Delaune. Analysing Privacy-Type Properties in Cryptographic Protocols (Invited Talk). In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 1:1-1:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{delaune:LIPIcs.FSCD.2018.1,
  author =	{Delaune, St\'{e}phanie},
  title =	{{Analysing Privacy-Type Properties in Cryptographic Protocols}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{1:1--1:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.1},
  URN =		{urn:nbn:de:0030-drops-91715},
  doi =		{10.4230/LIPIcs.FSCD.2018.1},
  annote =	{Keywords: cryptographic protocols, symbolic models, privacy-related properties, behavioural equivalence}
}
Document
Invited Talk
Formal Design, Implementation and Verification of Blockchain Languages (Invited Talk)

Authors: Grigore Rosu


Abstract
This invited paper describes recent, ongoing and planned work on the use of the rewrite-based semantic framework K to formally design, implement and verify blockchain languages and virtual machines. Both academic and commercial endeavors are discussed, as well as thoughts and directions for future research and development.

Cite as

Grigore Rosu. Formal Design, Implementation and Verification of Blockchain Languages (Invited Talk). In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 2:1-2:6, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{rosu:LIPIcs.FSCD.2018.2,
  author =	{Rosu, Grigore},
  title =	{{Formal Design, Implementation and Verification of Blockchain Languages}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{2:1--2:6},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.2},
  URN =		{urn:nbn:de:0030-drops-91722},
  doi =		{10.4230/LIPIcs.FSCD.2018.2},
  annote =	{Keywords: Formal semantics, Program verification, Blockchain}
}
Document
Invited Talk
Challenges in Quantum Programming Languages (Invited Talk)

Authors: Peter Selinger


Abstract
In this talk, I will give an overview of some recent progress and current challenges in the design of quantum programming languages. Unlike classical programs, which can in principle be debugged by stopping the program at critical moments and examining the contents of variables, quantum programs are not amenable to traditional debugging because the state of a quantum system cannot usually be examined in a meaningful way. Therefore, we need other methods for ensuring the correctness of quantum programs, such as formal verification. For this reason, I advocate the use of strongly typed, functional programming languages for quantum computing. As far as functional quantum programming languages are concerned, there is currently a relatively wide gap between theory and practice. On the one hand, we have languages with strong theoretical foundations, such as the quantum lambda calculus, which operate at a relatively low level of abstraction and lack many features that would be useful to practical quantum programmers. On the other hand, we have practical functional quantum programming languages such as Quipper, which is implemented as an embedded language in Haskell, has many high-level features, and has been used in large-scale projects, but lacks a theoretical basis and a strong type system [Green et al., 2013; Green et al., 2013; Green et al., 2013; Smith et al., 2014]. We have recently attempted to narrow this gap through a family of languages called Proto-Quipper, which are designed to offer Quipper-like features while having sound theoretical foundations [Ross, 2015; Rios and Selinger, 2018]. I will give an overview of Quipper and its most useful features, report on the progress we made with formalizing fragments of Quipper, and outline several of the challenges that still remain.

Cite as

Peter Selinger. Challenges in Quantum Programming Languages (Invited Talk). In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 3:1-3:2, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{selinger:LIPIcs.FSCD.2018.3,
  author =	{Selinger, Peter},
  title =	{{Challenges in Quantum Programming Languages}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{3:1--3:2},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.3},
  URN =		{urn:nbn:de:0030-drops-91730},
  doi =		{10.4230/LIPIcs.FSCD.2018.3},
  annote =	{Keywords: Quantum programming languages}
}
Document
Invited Talk
Proof Techniques for Program Equivalence in Probabilistic Higher-Order Languages (Invited Talk)

Authors: Valeria Vignudelli


Abstract
While the theory of functional higher-order languages, starting from lambda-calculi, is a well-established research field, it is only in recent years that the operational semantics of higher-order languages with probabilistic operators has started to be extensively studied. A fundamental notion in the semantics of programming languages is that of program equivalence. In higher-order languages, program equivalence is generally formalized as a contextual equivalence [Morris, 1968], which can be hard to prove directly. This has motivated the study of proof techniques for contextual equivalence, from inductive ones, such as logical relations [Andrew Pitts, 2005], to coinductive ones, mainly in the form of bisimulations [Abramsky, 1990]. In this talk I will discuss proof techniques for program equivalence in languages combining higher-order and probabilistic features. Several operational methods, traditionally developed in a deterministic setting, have been adapted to probabilistic higher-order languages [Ales Bizjak and Lars Birkedal, 2015; Dal Lago et al., 2014; Raphaëlle Crubillé and Ugo Dal Lago, 2014]. I will discuss these proof methods and focus on bisimulation-based techniques, showing how probabilities, combined with different language features, force a number of modifications to the definition of bisimulation [Crubillé et al., 2015; Sangiorgi and Vignudelli, 2016].

Cite as

Valeria Vignudelli. Proof Techniques for Program Equivalence in Probabilistic Higher-Order Languages (Invited Talk). In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 4:1-4:2, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{vignudelli:LIPIcs.FSCD.2018.4,
  author =	{Vignudelli, Valeria},
  title =	{{Proof Techniques for Program Equivalence in Probabilistic Higher-Order Languages}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{4:1--4:2},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.4},
  URN =		{urn:nbn:de:0030-drops-91749},
  doi =		{10.4230/LIPIcs.FSCD.2018.4},
  annote =	{Keywords: Lambda Calculus, Contextual Equivalence, Bisimulation, Probabilistic Programming Languages}
}
Document
A Unifying Framework for Type Inhabitation

Authors: Sandra Alves and Sabine Broda


Abstract
In this paper we define a framework to address different kinds of problems related to type inhabitation, such as type checking, the emptiness problem, generation of inhabitants and counting, in a uniform way. Our framework uses an alternative representation for types, called the pre-grammar of the type, on which different methods for these problems are based. Furthermore, we define a scheme for a decision algorithm that, for particular instantiations of the parameters, can be used to show different inhabitation related problems to be in PSPACE.

Cite as

Sandra Alves and Sabine Broda. A Unifying Framework for Type Inhabitation. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 5:1-5:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{alves_et_al:LIPIcs.FSCD.2018.5,
  author =	{Alves, Sandra and Broda, Sabine},
  title =	{{A Unifying Framework for Type Inhabitation}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{5:1--5:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.5},
  URN =		{urn:nbn:de:0030-drops-91758},
  doi =		{10.4230/LIPIcs.FSCD.2018.5},
  annote =	{Keywords: simple types, type inhabitation, rewriting, PSPACE}
}
Document
Confluence of Prefix-Constrained Rewrite Systems

Authors: Nirina Andrianarivelo and Pierre Réty


Abstract
Prefix-constrained rewriting is a strict extension of context-sensitive rewriting. We study the confluence of prefix-constrained rewrite systems, which are composed of rules of the form L: l -> r, where L is a regular string language that defines the allowed rewritable positions. The usual notion of Knuth-Bendix critical pair needs to be extended using regular string languages, and the convergence of all critical pairs is not enough to ensure local confluence. Thanks to an additional restriction we get local confluence, and hence confluence for terminating systems, which makes the word problem decidable. Moreover, we present an extended Knuth-Bendix completion procedure that transforms a non-confluent prefix-constrained rewrite system into a confluent one.
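
As an illustration of the mechanism, here is a minimal Haskell sketch, not the paper's formalism: a DFA stands for the regular language L, and a position is rewritable only if the word read on the path from the root to it is accepted. The term representation, the restriction to a single ground rule, and the choice of path alphabet (bare function symbols) are simplifying assumptions of mine.

-- Sketch: prefix-constrained rewriting, with a DFA for the prefix language L.
-- Assumptions: one ground rule lhs -> rhs; the "prefix" of a position is the
-- word of function symbols on the path from the root (the paper's exact
-- alphabet may differ).
data Term = T String [Term] deriving (Eq, Show)

data DFA q = DFA q (q -> String -> q) (q -> Bool)   -- start, step, accepting

-- All one-step rewrites of a term at positions allowed by the DFA.
rewrites :: DFA q -> (Term, Term) -> Term -> [Term]
rewrites (DFA q0 step acc) (lhs, rhs) = go q0
  where
    go q t@(T f ts) = here ++ deeper
      where
        here   = if acc q && t == lhs then [rhs] else []
        q'     = step q f
        deeper = [ T f (pre ++ ti' : post)
                 | (pre, ti, post) <- splits ts
                 , ti' <- go q' ti ]
    splits xs = [ (take i xs, xs !! i, drop (i + 1) xs)
                | i <- [0 .. length xs - 1] ]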

Cite as

Nirina Andrianarivelo and Pierre Réty. Confluence of Prefix-Constrained Rewrite Systems. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 6:1-6:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{andrianarivelo_et_al:LIPIcs.FSCD.2018.6,
  author =	{Andrianarivelo, Nirina and R\'{e}ty, Pierre},
  title =	{{Confluence of Prefix-Constrained Rewrite Systems}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{6:1--6:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.6},
  URN =		{urn:nbn:de:0030-drops-91762},
  doi =		{10.4230/LIPIcs.FSCD.2018.6},
  annote =	{Keywords: prefix-constrained term rewriting, confluence, critical pair}
}
Document
Fixed-Point Constraints for Nominal Equational Unification

Authors: Mauricio Ayala-Rincón, Maribel Fernández, and Daniele Nantes-Sobrinho


Abstract
We propose a new axiomatisation of the alpha-equivalence relation for nominal terms, based on a primitive notion of fixed-point constraint. We show that the standard freshness relation between atoms and terms can be derived from the more primitive notion of permutation fixed-point, and use this result to prove the correctness of the new alpha-equivalence axiomatisation. This gives rise to a new notion of nominal unification, where solutions for unification problems are pairs of a fixed-point context and a substitution. Although it may seem less natural than the standard notion of nominal unifier based on freshness constraints, the notion of unifier based on fixed-point constraints behaves better when equational theories are considered: for example, nominal unification remains finitary in the presence of commutativity, whereas it becomes infinitary when unifiers are expressed using freshness contexts.
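
The pivot, as the abstract describes it, is that freshness becomes a derived notion. Roughly, and in my own notation, the correspondence between freshness and permutation fixed-points reads:

\[
  a \mathrel{\#} t \iff (a\ c) \cdot t = t \qquad \text{for an atom } c \text{ that is new, i.e. occurs nowhere in the problem.}
\]

A unification algorithm can therefore trade freshness constraints for fixed-point constraints on swappings, and it is this reformulation that, per the abstract, keeps unification finitary in the presence of commutativity.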

Cite as

Mauricio Ayala-Rincón, Maribel Fernández, and Daniele Nantes-Sobrinho. Fixed-Point Constraints for Nominal Equational Unification. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 7:1-7:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{ayalarincon_et_al:LIPIcs.FSCD.2018.7,
  author =	{Ayala-Rinc\'{o}n, Mauricio and Fern\'{a}ndez, Maribel and Nantes-Sobrinho, Daniele},
  title =	{{Fixed-Point Constraints for Nominal Equational Unification}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{7:1--7:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.7},
  URN =		{urn:nbn:de:0030-drops-91777},
  doi =		{10.4230/LIPIcs.FSCD.2018.7},
  annote =	{Keywords: nominal terms, fixed-point equations, nominal unification, equational theories}
}
Document
Strict Ideal Completions of the Lambda Calculus

Authors: Patrick Bahr


Abstract
The infinitary lambda calculi pioneered by Kennaway et al. extend the basic lambda calculus by metric completion to infinite terms and reductions. Depending on the chosen metric, the resulting infinitary calculi exhibit different notions of strictness. To obtain infinitary normalisation and infinitary confluence properties for these calculi, Kennaway et al. extend beta-reduction with infinitely many "bot-rules", which contract meaningless terms directly to bot. Three of the resulting Böhm reduction calculi have unique infinitary normal forms corresponding to Böhm-like trees. In this paper we develop a corresponding theory of infinitary lambda calculi based on ideal completion instead of metric completion. We show that each of our calculi conservatively extends the corresponding metric-based calculus. Three of our calculi are infinitarily normalising and confluent; their unique infinitary normal forms are exactly the Böhm-like trees of the corresponding metric-based calculi. Our calculi dispense with the infinitely many bot-rules of the metric-based calculi. The fully non-strict calculus (called 111) consists of only beta-reduction, while the other two calculi (called 001 and 101) require two additional rules that precisely state their strictness properties: lambda x.bot -> bot (for 001) and bot M -> bot (for 001 and 101).

Cite as

Patrick Bahr. Strict Ideal Completions of the Lambda Calculus. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 8:1-8:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{bahr:LIPIcs.FSCD.2018.8,
  author =	{Bahr, Patrick},
  title =	{{Strict Ideal Completions of the Lambda Calculus}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{8:1--8:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.8},
  URN =		{urn:nbn:de:0030-drops-91787},
  doi =		{10.4230/LIPIcs.FSCD.2018.8},
  annote =	{Keywords: lambda calculus, infinitary rewriting, B\"{o}hm trees, meaningless terms, confluence}
}
Document
Term-Graph Anti-Unification

Authors: Alexander Baumgartner, Temur Kutsia, Jordi Levy, and Mateu Villaret


Abstract
We study anti-unification for possibly cyclic, unranked term-graphs and develop an algorithm which computes a minimal complete set of generalizations for them. For bisimilar graphs the algorithm computes the join in the lattice generated by a functional bisimulation. These results generalize anti-unification for ranked and unranked terms to the corresponding term-graphs, and also solve anti-unification problems for rational terms and dags. Our results open a way to widen anti-unification-based code clone detection techniques from a tree representation to a graph representation of the code.
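
For readers new to anti-unification, the tree (non-graph) special case already conveys the idea: the least general generalization keeps the common structure of two terms and abstracts disagreements by variables, reusing one variable per disagreement pair. A minimal first-order Haskell sketch of Plotkin-style lgg follows; the paper's term-graph algorithm handles cycles and bisimilarity on top of this baseline.

-- First-order least general generalization (lgg): the tree-only baseline
-- that the paper's term-graph algorithm generalizes.
import qualified Data.Map as M
import Control.Monad (zipWithM)
import Control.Monad.State

data Term = V String | F String [Term] deriving (Eq, Ord, Show)

lgg :: Term -> Term -> Term
lgg s t = evalState (go s t) (M.empty, 0 :: Int)
  where
    go (F f ss) (F g ts)
      | f == g && length ss == length ts = F f <$> zipWithM go ss ts
    go u v = do
      (seen, n) <- get
      case M.lookup (u, v) seen of          -- reuse variables per pair:
        Just x  -> pure (V x)               -- this makes the result *least*
        Nothing -> do                       -- general, e.g.
          let x = "x" ++ show n             --   lgg f(a,a) f(b,b) = f(x0,x0)
          put (M.insert (u, v) x seen, n + 1)
          pure (V x)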

Cite as

Alexander Baumgartner, Temur Kutsia, Jordi Levy, and Mateu Villaret. Term-Graph Anti-Unification. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 9:1-9:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{baumgartner_et_al:LIPIcs.FSCD.2018.9,
  author =	{Baumgartner, Alexander and Kutsia, Temur and Levy, Jordi and Villaret, Mateu},
  title =	{{Term-Graph Anti-Unification}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{9:1--9:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.9},
  URN =		{urn:nbn:de:0030-drops-91797},
  doi =		{10.4230/LIPIcs.FSCD.2018.9},
  annote =	{Keywords: Cyclic term-graphs, anti-unification, least general generalization}
}
Document
Proof Nets for Bi-Intuitionistic Linear Logic

Authors: Gianluigi Bellin and Willem B. Heijltjes


Abstract
Bi-Intuitionistic Linear Logic (BILL) is an extension of Intuitionistic Linear Logic with a par, dual to the tensor, and subtraction, dual to linear implication. It is the logic of categories with a monoidal closed and a monoidal co-closed structure that are related by linear distributivity, a strength of the tensor over the par. It conservatively extends Full Intuitionistic Linear Logic (FILL), which includes only the par. We give proof nets for the multiplicative, unit-free fragment MBILL-. Correctness is by local rewriting in the style of Danos contractibility, which yields sequentialization into a relational sequent calculus extending the existing one for FILL. We give a second, geometric correctness condition combining Danos-Regnier switching and Lamarche's Essential Net criterion, and demonstrate composition both inductively and as a one-off global operation.

Cite as

Gianluigi Bellin and Willem B. Heijltjes. Proof Nets for Bi-Intuitionistic Linear Logic. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 10:1-10:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{bellin_et_al:LIPIcs.FSCD.2018.10,
  author =	{Bellin, Gianluigi and Heijltjes, Willem B.},
  title =	{{Proof Nets for Bi-Intuitionistic Linear Logic}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{10:1--10:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.10},
  URN =		{urn:nbn:de:0030-drops-91800},
  doi =		{10.4230/LIPIcs.FSCD.2018.10},
  annote =	{Keywords: proof nets, intuitionistic linear logic, contractibility, linear logic}
}
Document
Counting Environments and Closures

Authors: Maciej Bendkowski and Pierre Lescanne


Abstract
Environments and closures are two of the main ingredients of evaluation in lambda-calculus. A closure is a pair consisting of a lambda-term and an environment, whereas an environment is a list of lambda-terms assigned to free variables. In this paper we investigate some dynamic aspects of evaluation in lambda-calculus by considering the quantitative, combinatorial properties of environments and closures. Focusing on two classes of environments and closures, namely the so-called plain and closed ones, we consider the problem of their asymptotic counting and effective random generation. We provide an asymptotic approximation of the number of both plain environments and closures of size n. Using the associated generating functions, we construct effective samplers for both classes of combinatorial structures. Finally, we discuss the related problem of asymptotic counting and random generation of closed environments and closures.
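
To make the counting problem concrete, here is a small memoized counter in Haskell. The size conventions are my own assumptions for illustration (|Var i| = i + 1, |lambda t| = |t| + 1, |t u| = |t| + |u| + 1, environments sized as the sum of their entries); the paper fixes its own notion of size, so these numbers are not the paper's.

-- Plain terms in de Bruijn form, plain environments (lists of terms) and
-- plain closures (term paired with an environment), counted by size under
-- the ASSUMED size conventions stated above.
termCount, envCount, closureCount :: Int -> Integer
termCount = (map t [0 ..] !!)            -- lazy-list memoization
  where
    t 0 = 0
    t n = 1                              -- the single variable of size n
        + termCount (n - 1)              -- abstractions
        + sum [ termCount k * termCount (n - 1 - k) | k <- [1 .. n - 2] ]

envCount = (map e [0 ..] !!)
  where
    e 0 = 1                              -- the empty environment
    e n = sum [ termCount k * envCount (n - k) | k <- [1 .. n] ]

closureCount n = sum [ termCount k * envCount (n - k) | k <- [1 .. n] ]
-- e.g. map closureCount [1 .. 8]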

Cite as

Maciej Bendkowski and Pierre Lescanne. Counting Environments and Closures. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 11:1-11:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{bendkowski_et_al:LIPIcs.FSCD.2018.11,
  author =	{Bendkowski, Maciej and Lescanne, Pierre},
  title =	{{Counting Environments and Closures}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{11:1--11:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.11},
  URN =		{urn:nbn:de:0030-drops-91817},
  doi =		{10.4230/LIPIcs.FSCD.2018.11},
  annote =	{Keywords: lambda-calculus, combinatorics, functional programming, mathematical analysis, complexity}
}
Document
Higher-Order Equational Pattern Anti-Unification

Authors: David M. Cerna and Temur Kutsia


Abstract
We consider anti-unification for simply typed lambda terms in associative, commutative, and associative-commutative theories and develop a sound and complete algorithm which takes two lambda terms and computes their generalizations in the form of higher-order patterns. The problem is finitary: the minimal complete set of generalizations contains finitely many elements. We define the notion of optimal solution and investigate special fragments of the problem for which the optimal solution can be computed in linear or polynomial time.

Cite as

David M. Cerna and Temur Kutsia. Higher-Order Equational Pattern Anti-Unification. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 12:1-12:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{cerna_et_al:LIPIcs.FSCD.2018.12,
  author =	{Cerna, David M. and Kutsia, Temur},
  title =	{{Higher-Order Equational Pattern Anti-Unification}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{12:1--12:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.12},
  URN =		{urn:nbn:de:0030-drops-91826},
  doi =		{10.4230/LIPIcs.FSCD.2018.12},
  annote =	{Keywords: Simply typed lambda calculus, anti-unification, equational theories}
}
Document
Term Rewriting Characterisation of LOGSPACE for Finite and Infinite Data

Authors: Lukasz Czajka


Abstract
We show that LOGSPACE is characterised by finite orthogonal tail-recursive cons-free constructor term rewriting systems, contributing to a line of research initiated by Neil Jones. We describe a LOGSPACE algorithm which computes constructor normal forms. This algorithm is used in the proof of our main result: that simple stream term rewriting systems characterise LOGSPACE-computable stream functions as defined by Ramyaa and Leivant. This result concerns characterising logarithmic-space computation on infinite streams by means of infinitary rewriting.
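
For intuition, a cons-free tail-recursive program only destructs its input and never allocates new constructor data, which is what allows an implementation that keeps only pointers into the input, hence logarithmic space. A Haskell-flavoured illustration (mine, not the paper's formal TRS format):

-- Cons-free and tail-recursive: the right-hand sides contain no list
-- constructor, only destructed input and a tail call, so evaluation needs
-- only a pointer into the input list.
member :: Eq a => a -> [a] -> Bool
member _ []       = False
member x (y : ys)
  | x == y    = True
  | otherwise = member x ys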

Cite as

Lukasz Czajka. Term Rewriting Characterisation of LOGSPACE for Finite and Infinite Data. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 13:1-13:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{czajka:LIPIcs.FSCD.2018.13,
  author =	{Czajka, Lukasz},
  title =	{{Term Rewriting Characterisation of LOGSPACE for Finite and Infinite Data}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{13:1--13:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.13},
  URN =		{urn:nbn:de:0030-drops-91831},
  doi =		{10.4230/LIPIcs.FSCD.2018.13},
  annote =	{Keywords: LOGSPACE, implicit complexity, term rewriting, infinitary rewriting, streams}
}
Document
Decreasing Diagrams with Two Labels Are Complete for Confluence of Countable Systems

Authors: Jörg Endrullis, Jan Willem Klop, and Roy Overbeek


Abstract
Like termination, confluence is a central property of rewrite systems. Unlike for termination, however, there exists no known complexity hierarchy for confluence. In this paper we investigate whether the decreasing diagrams technique can be used to obtain such a hierarchy. The decreasing diagrams technique is one of the strongest and most versatile methods for proving confluence of abstract reduction systems: it is complete for countable systems, and it has many well-known confluence criteria as corollaries. So what makes decreasing diagrams so powerful? In contrast to other confluence techniques, decreasing diagrams employ a labelling of the rewrite steps with labels from a well-founded order in order to conclude confluence of the underlying unlabelled relation. Hence it is natural to ask how the size of the label set influences the strength of the technique. In particular, what class of abstract reduction systems can be proven confluent using decreasing diagrams restricted to 1 label, 2 labels, 3 labels, and so on? Surprisingly, we find that two labels suffice for proving confluence for every abstract rewrite system having the cofinality property, thus in particular for every confluent, countable system. We also show that this result stands in sharp contrast to the situation for commutation of rewrite relations, where the hierarchy does not collapse. Finally, as a background theme, we discuss the logical issue of first-order definability of the notion of confluence.

Cite as

Jörg Endrullis, Jan Willem Klop, and Roy Overbeek. Decreasing Diagrams with Two Labels Are Complete for Confluence of Countable Systems. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 14:1-14:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{endrullis_et_al:LIPIcs.FSCD.2018.14,
  author =	{Endrullis, J\"{o}rg and Klop, Jan Willem and Overbeek, Roy},
  title =	{{Decreasing Diagrams with Two Labels Are Complete for Confluence of Countable Systems}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{14:1--14:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.14},
  URN =		{urn:nbn:de:0030-drops-91848},
  doi =		{10.4230/LIPIcs.FSCD.2018.14},
  annote =	{Keywords: confluence, decreasing diagrams, weak diamond property}
}
Document
Coherence of Gray Categories via Rewriting

Authors: Simon Forest and Samuel Mimram


Abstract
Over the recent years, the theory of rewriting has been extended in order to provide systematic techniques to show coherence results for strict higher categories. Here, we investigate a further generalization to low-dimensional weak categories, and consider in detail the first non-trivial case: presentations of tricategories. By a general result, those are equivalent to the stricter Gray categories, for which we introduce a notion of rewriting system, as well as associated tools: critical pairs, termination orders, etc. We show that a finite rewriting system admits a finite number of critical pairs and, as a variant of Newman's lemma in our context, that a convergent rewriting system is coherent, meaning that two parallel 3-cells are necessarily equal. This is illustrated on rewriting systems corresponding to various well-known structures in the context of Gray categories (monoids, adjunctions, Frobenius monoids). Finally, we discuss generalizations in arbitrary dimension.

Cite as

Simon Forest and Samuel Mimram. Coherence of Gray Categories via Rewriting. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 15:1-15:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{forest_et_al:LIPIcs.FSCD.2018.15,
  author =	{Forest, Simon and Mimram, Samuel},
  title =	{{Coherence of Gray Categories via Rewriting}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{15:1--15:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.15},
  URN =		{urn:nbn:de:0030-drops-91855},
  doi =		{10.4230/LIPIcs.FSCD.2018.15},
  annote =	{Keywords: rewriting, coherence, Gray category, polygraph, pseudomonoid, precategory}
}
Document
Completeness of Tree Automata Completion

Authors: Thomas Genet


Abstract
We consider rewriting of a regular language with a left-linear term rewriting system. We show a completeness theorem on equational tree automata completion stating that, if there exists a regular over-approximation of the set of reachable terms, then equational completion can compute it (or safely under-approximate it). A nice corollary of this theorem is that, if the set of reachable terms is regular, then equational completion can also compute it. This was known to be true for some term rewriting system classes preserving regularity, but was still an open question in the general case. The proof is not constructive because it depends on the regularity of the set of reachable terms, which is undecidable. To carry out those proofs we generalize and improve two results of completion: the Termination and the Upper-Bound theorems. Those theoretical results provide an algorithmic way to safely explore regular approximations with completion. This has been implemented in Timbuk and used to verify safety properties, automatically and efficiently, on first-order and higher-order functional programs.

Cite as

Thomas Genet. Completeness of Tree Automata Completion. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 16:1-16:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{genet:LIPIcs.FSCD.2018.16,
  author =	{Genet, Thomas},
  title =	{{Completeness of Tree Automata Completion}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{16:1--16:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.16},
  URN =		{urn:nbn:de:0030-drops-91868},
  doi =		{10.4230/LIPIcs.FSCD.2018.16},
  annote =	{Keywords: term rewriting systems, regularity preservation, over-approximation, completeness, tree automata, tree automata completion}
}
Document
A Diagrammatic Axiomatisation of Fermionic Quantum Circuits

Authors: Amar Hadzihasanovic, Giovanni de Felice, and Kang Feng Ng


Abstract
We introduce the fermionic ZW calculus, a string-diagrammatic language for fermionic quantum computing (FQC). After defining a fermionic circuit model, we present the basic components of the calculus, together with their interpretation, and show how the main physical gates of interest in FQC can be represented in the language. We then list our axioms, and derive some additional equations. We prove that the axioms provide a complete equational axiomatisation of the monoidal category whose objects are quantum systems of finitely many local fermionic modes, with operations that preserve or reverse the parity (number of particles mod 2) of states, and the tensor product, corresponding to the composition of two systems, as monoidal product. We achieve this through a procedure that rewrites any diagram in a normal form. We conclude by showing, as an example, how the statistics of a fermionic Mach-Zehnder interferometer can be calculated in the diagrammatic language.

Cite as

Amar Hadzihasanovic, Giovanni de Felice, and Kang Feng Ng. A Diagrammatic Axiomatisation of Fermionic Quantum Circuits. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 17:1-17:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{hadzihasanovic_et_al:LIPIcs.FSCD.2018.17,
  author =	{Hadzihasanovic, Amar and de Felice, Giovanni and Ng, Kang Feng},
  title =	{{A Diagrammatic Axiomatisation of Fermionic Quantum Circuits}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{17:1--17:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.17},
  URN =		{urn:nbn:de:0030-drops-91873},
  doi =		{10.4230/LIPIcs.FSCD.2018.17},
  annote =	{Keywords: Fermionic Quantum Computing, String Diagrams, Categorical Quantum Mechanics}
}
Document
On Repetitive Right Application of B-Terms

Authors: Mirai Ikebuchi and Keisuke Nakano


Abstract
B-terms are built from the B combinator alone, defined by B == lambda f. lambda g. lambda x. f (g x), which is well known as a function composition operator. This paper investigates an interesting property of B-terms, namely, whether repetitive right application of a B-term cycles or not. We discuss conditions for B-terms to have and not to have the property through a sound and complete equational axiomatization. Specifically, we give examples of B-terms which have the property and show that there are infinitely many B-terms which do not have it. Also, we introduce a canonical representation of B-terms that is useful to detect cycles, or equivalently, to prove the property, with an efficient algorithm.
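
The property is easy to experiment with. Below is a small Haskell sketch under my own assumptions: B-terms are compared via their normal forms for the single rule B f g x -> f (g x) (B is linear, so B-terms are normalizing), and the sequence in question is e, e e, (e e) e, and so on. The names nf, cyclic and bound are mine; the paper's canonical representation is a far more efficient way to detect cycles than this bounded search.

import qualified Data.Map as M

-- B-terms and the rule  B f g x -> f (g x).
data T = B | T :@ T deriving (Eq, Ord, Show)
infixl 9 :@

nf :: T -> T                       -- normal form (B-terms are normalizing)
nf B = B
nf (f :@ x) = red (nf f) (nf x)
  where
    red ((B :@ g) :@ h) y = nf (g :@ (h :@ y))
    red g               y = g :@ y

-- Bounded check: does  e, e e, (e e) e, ...  revisit a normal form within
-- `bound` steps?  Returns (index where the cycle is entered, first repeat).
cyclic :: Int -> T -> Maybe (Int, Int)
cyclic bound e = go M.empty (nf e) 0
  where
    go seen t i
      | i > bound = Nothing
      | otherwise = case M.lookup t seen of
          Just j  -> Just (j, i)
          Nothing -> go (M.insert t i seen) (nf (t :@ e)) (i + 1)
-- e.g. cyclic 100 B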

Cite as

Mirai Ikebuchi and Keisuke Nakano. On Repetitive Right Application of B-Terms. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 18:1-18:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{ikebuchi_et_al:LIPIcs.FSCD.2018.18,
  author =	{Ikebuchi, Mirai and Nakano, Keisuke},
  title =	{{On Repetitive Right Application of B-Terms}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{18:1--18:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.18},
  URN =		{urn:nbn:de:0030-drops-91882},
  doi =		{10.4230/LIPIcs.FSCD.2018.18},
  annote =	{Keywords: Combinatory logic, B combinator, Lambda calculus}
}
Document
Index-Stratified Types

Authors: Rohan Jacob-Rao, Brigitte Pientka, and David Thibodeau


Abstract
We present Tores, a core language for encoding metatheoretic proofs. The novel features we introduce are well-founded Mendler-style (co)recursion over indexed data types and a form of recursion over objects in the index language to build new types. The latter, which we call index-stratified types, are analogous to the concept of large elimination in dependently typed languages. These features combined allow us to encode sophisticated case studies such as normalization for lambda calculi and normalization by evaluation. We prove the soundness of Tores as a programming and proof language via the key theorems of subject reduction and termination.
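
For readers unfamiliar with the Mendler style, here is the classical, non-indexed version in Haskell; Tores's contribution is to make this well-founded over indexed data types and to pair it with index-stratified types, none of which this toy captures.

{-# LANGUAGE RankNTypes #-}
newtype Mu f = In (f (Mu f))

-- The recursive call is abstract (r -> a), so the body can only apply it to
-- the direct subcomponents exposed by f r: that is what makes the scheme
-- terminating without a separate positivity or termination check.
mcata :: (forall r. (r -> a) -> f r -> a) -> Mu f -> a
mcata phi (In x) = phi (mcata phi) x

-- Example: natural numbers as a fixed point.
data NatF r = Zero | Suc r
toInt :: Mu NatF -> Int
toInt = mcata (\rec n -> case n of Zero -> 0; Suc m -> 1 + rec m)
-- e.g. toInt (In (Suc (In Zero))) == 1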

Cite as

Rohan Jacob-Rao, Brigitte Pientka, and David Thibodeau. Index-Stratified Types. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 19:1-19:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{jacobrao_et_al:LIPIcs.FSCD.2018.19,
  author =	{Jacob-Rao, Rohan and Pientka, Brigitte and Thibodeau, David},
  title =	{{Index-Stratified Types}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{19:1--19:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.19},
  URN =		{urn:nbn:de:0030-drops-91891},
  doi =		{10.4230/LIPIcs.FSCD.2018.19},
  annote =	{Keywords: Indexed types, (co)recursive types, logical relations}
}
Document
A Syntax for Higher Inductive-Inductive Types

Authors: Ambrus Kaposi and András Kovács


Abstract
Higher inductive-inductive types (HIITs) generalise inductive types of dependent type theories in two directions. On the one hand they allow the simultaneous definition of multiple sorts that can be indexed over each other. On the other hand they support equality constructors, thus generalising higher inductive types of homotopy type theory. Examples that make use of both features are the Cauchy reals and the well-typed syntax of type theory where conversion rules are given as equality constructors. In this paper we propose a general definition of HIITs using a domain-specific type theory. A context in this small type theory encodes a HIIT by listing the type formation rules and constructors. The type of the elimination principle and its beta-rules are computed from the context using a variant of the syntactic logical relation translation. We show that for indexed W-types and various examples of HIITs the computed elimination principles are the expected ones. Showing that the HIITs thus specified exist is left as future work. The type theory specifying HIITs was formalised in Agda together with the syntactic translations. A Haskell implementation converts the types of sorts and constructors into valid Agda code which postulates the elimination principles and computation rules.

Cite as

Ambrus Kaposi and András Kovács. A Syntax for Higher Inductive-Inductive Types. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 20:1-20:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{kaposi_et_al:LIPIcs.FSCD.2018.20,
  author =	{Kaposi, Ambrus and Kov\'{a}cs, Andr\'{a}s},
  title =	{{A Syntax for Higher Inductive-Inductive Types}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{20:1--20:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.20},
  URN =		{urn:nbn:de:0030-drops-91906},
  doi =		{10.4230/LIPIcs.FSCD.2018.20},
  annote =	{Keywords: homotopy type theory, inductive-inductive types, higher inductive types, quotient inductive types, logical relations}
}
Document
Lifting Coalgebra Modalities and IMELL Model Structure to Eilenberg-Moore Categories

Authors: Jean-Simon Pacaud Lemay


Abstract
A categorical model of the multiplicative and exponential fragments of intuitionistic linear logic (IMELL), known as a linear category, is a symmetric monoidal closed category with a monoidal coalgebra modality (also known as a linear exponential comonad). Inspired by R. Blute and P. Scott's work on categories of modules of Hopf algebras as models of linear logic, we study Eilenberg-Moore categories of monads as models of IMELL. We define an IMELL lifting monad on a linear category as a Hopf monad - in the Bruguieres, Lack, and Virelizier sense - with a mixed distributive law over the monoidal coalgebra modality. As our main result, we show that the linear category structure lifts to Eilenberg-Moore categories of IMELL lifting monads. We explain how monoids in the Eilenberg-Moore category of the monoidal coalgebra modality can induce IMELL lifting monads and provide sources for such monoids. Along the way, we also define mixed distributive laws of bimonads over coalgebra modalities and lift differential category structure to Eilenberg-Moore categories of exponential lifting monads.

Cite as

Jean-Simon Pacaud Lemay. Lifting Coalgebra Modalities and IMELL Model Structure to Eilenberg-Moore Categories. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 21:1-21:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{lemay:LIPIcs.FSCD.2018.21,
  author =	{Lemay, Jean-Simon Pacaud},
  title =	{{Lifting Coalgebra Modalities and IMELL Model Structure to Eilenberg-Moore Categories}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{21:1--21:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.21},
  URN =		{urn:nbn:de:0030-drops-91911},
  doi =		{10.4230/LIPIcs.FSCD.2018.21},
  annote =	{Keywords: Mixed Distributive Laws, Coalgebra Modalities, Linear Categories, Bimonads, Differential Categories}
}
Document
Internal Universes in Models of Homotopy Type Theory

Authors: Daniel R. Licata, Ian Orton, Andrew M. Pitts, and Bas Spitters


Abstract
We begin by recalling the essentially global character of universes in various models of homotopy type theory, which prevents a straightforward axiomatization of their properties using the internal language of the presheaf toposes from which these models are constructed. We get around this problem by extending the internal language with a modal operator for expressing properties of global elements. In this setting we show how to construct a universe that classifies the Cohen-Coquand-Huber-Mörtberg (CCHM) notion of fibration from their cubical sets model, starting from the assumption that the interval is tiny - a property that the interval in cubical sets does indeed have. This leads to an elementary axiomatization of that and related models of homotopy type theory within what we call crisp type theory.

Cite as

Daniel R. Licata, Ian Orton, Andrew M. Pitts, and Bas Spitters. Internal Universes in Models of Homotopy Type Theory. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 22:1-22:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{licata_et_al:LIPIcs.FSCD.2018.22,
  author =	{Licata, Daniel R. and Orton, Ian and Pitts, Andrew M. and Spitters, Bas},
  title =	{{Internal Universes in Models of Homotopy Type Theory}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{22:1--22:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.22},
  URN =		{urn:nbn:de:0030-drops-91929},
  doi =		{10.4230/LIPIcs.FSCD.2018.22},
  annote =	{Keywords: cubical sets, dependent type theory, homotopy type theory, internal language, modalities, univalent foundations, universes}
}
Document
The Clocks They Are Adjunctions: Denotational Semantics for Clocked Type Theory

Authors: Bassel Mannaa and Rasmus Ejlers Møgelberg


Abstract
Clocked Type Theory (CloTT) is a type theory for guarded recursion useful for programming with coinductive types, allowing productivity to be encoded in types, and for reasoning about advanced programming language features using an abstract form of step-indexing. CloTT has previously been shown to enjoy a number of syntactic properties including strong normalisation, canonicity and decidability of type checking. In this paper we present a denotational semantics for CloTT, useful, e.g., for studying future extensions of CloTT with constructions such as path types. The main challenge for constructing this model is to model the notion of ticks used in CloTT for coinductive reasoning about coinductive types. We build on a category previously used to model guarded recursion, but in this category there is no object of ticks, so tick-assumptions in a context cannot be modelled using standard tools. Instead we show how ticks can be modelled using adjoint functors, and how to model the tick constant using a semantic substitution.

Cite as

Bassel Mannaa and Rasmus Ejlers Møgelberg. The Clocks They Are Adjunctions: Denotational Semantics for Clocked Type Theory. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 23:1-23:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{mannaa_et_al:LIPIcs.FSCD.2018.23,
  author =	{Mannaa, Bassel and M{\o}gelberg, Rasmus Ejlers},
  title =	{{The Clocks They Are Adjunctions: Denotational Semantics for Clocked Type Theory}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{23:1--23:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.23},
  URN =		{urn:nbn:de:0030-drops-91938},
  doi =		{10.4230/LIPIcs.FSCD.2018.23},
  annote =	{Keywords: Guarded type theory, Coinduction, Presheaf model, Clocked type theory, Dependent adjunction}
}
Document
Call-by-Name Gradual Type Theory

Authors: Max S. New and Daniel R. Licata


Abstract
We present gradual type theory, a logic and type theory for call-by-name gradual typing. We define the central constructions of gradual typing (the dynamic type, type casts and type error) in a novel way, by universal properties relative to new judgments for gradual type and term dynamism. These dynamism judgments build on prior work in blame calculi and on the "gradual guarantee" theorem of gradual typing. Combined with the ordinary extensionality (eta) principles that type theory provides, we show that most of the standard operational behavior of casts is uniquely determined by the gradual guarantee. This provides a semantic justification for the definitions of casts, and shows that non-standard definitions of casts must violate these principles. Our type theory is the internal language of a certain class of preorder categories called equipments. We give a general construction of an equipment interpreting gradual type theory from a 2-category representing non-gradual types and programs, which is a semantic analogue of the interpretation of gradual typing using contracts, and use it to build some concrete domain-theoretic models of gradual typing.
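
As a point of reference for the constructions being axiomatized, here is a tiny Haskell sketch of a dynamic type with casts and a type error. The names and representation are mine, and Haskell's laziness only loosely approximates call-by-name, so this illustrates the objects of study, not the paper's universal-property definitions.

-- A dynamic type, upcasts (injections), downcasts (projections), and the
-- type error, played here by `error`.
data Dyn = DInt Int | DFun (Dyn -> Dyn)

upInt :: Int -> Dyn                 -- upcast  Int => Dyn
upInt = DInt

downInt :: Dyn -> Int               -- downcast Dyn => Int, may fail
downInt (DInt n) = n
downInt _        = error "type error: expected Int"

upFun :: (Dyn -> Dyn) -> Dyn        -- upcast a function into Dyn
upFun = DFun

downFun :: Dyn -> (Dyn -> Dyn)      -- downcast to a function, may fail
downFun (DFun f) = f
downFun _        = error "type error: expected a function"

-- e.g. downInt (downFun (upFun (\d -> upInt (downInt d + 1))) (upInt 41)) == 42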

Cite as

Max S. New and Daniel R. Licata. Call-by-Name Gradual Type Theory. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 24:1-24:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


Copy BibTex To Clipboard

@InProceedings{new_et_al:LIPIcs.FSCD.2018.24,
  author =	{New, Max S. and Licata, Daniel R.},
  title =	{{Call-by-Name Gradual Type Theory}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{24:1--24:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.24},
  URN =		{urn:nbn:de:0030-drops-91944},
  doi =		{10.4230/LIPIcs.FSCD.2018.24},
  annote =	{Keywords: Gradual Typing, Type Systems, Program Logics, Category Theory, Denotational Semantics}
}
Document
Unique perfect matchings and proof nets

Authors: Lê Thành Dung Nguyen


Abstract
This paper establishes a bridge between linear logic and mainstream graph theory, building on previous work by Retoré (2003). We show that the problem of correctness for MLL+Mix proof nets is equivalent to the problem of uniqueness of a perfect matching. By applying matching theory, we obtain new results for MLL+Mix proof nets: a linear-time correctness criterion, a quasi-linear sequentialization algorithm, and a characterization of the sub-polynomial complexity of the correctness problem. We also use graph algorithms to compute the dependency relation of Bagnol et al. (2015) and the kingdom ordering of Bellin (1997), and relate them to the notion of blossom which is central to combinatorial maximum matching algorithms.
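
To pin down the decision problem on the matching side, here is a brute-force Haskell check for uniqueness of a perfect matching. It is exponential, whereas the point of the paper is that matching theory yields a linear-time correctness criterion; the code is only a specification-style sketch, and all names are illustrative.

module MatchingSketch where

import Data.List (delete)

type Vertex = Int
type Edge   = (Vertex, Vertex)

-- All perfect matchings of the graph restricted to the given vertices:
-- match the first unmatched vertex every way possible, then recurse.
perfectMatchings :: [Edge] -> [Vertex] -> [[Edge]]
perfectMatchings _ [] = [[]]
perfectMatchings es (v:vs) =
  [ (a, b) : m
  | (a, b) <- es
  , a == v || b == v
  , let w = if a == v then b else a
  , w `elem` vs
  , m <- perfectMatchings es (delete w vs)
  ]

-- Uniqueness: lazily ask for at most two matchings.
uniquePerfectMatching :: [Edge] -> [Vertex] -> Bool
uniquePerfectMatching es vs =
  case take 2 (perfectMatchings es vs) of
    [_] -> True
    _   -> False

-- Example: a path on four vertices has exactly one perfect matching.
-- uniquePerfectMatching [(1,2),(2,3),(3,4)] [1,2,3,4]  ==  True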

Cite as

Lê Thành Dung Nguyen. Unique perfect matchings and proof nets. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 25:1-25:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


Copy BibTex To Clipboard

@InProceedings{nguyen:LIPIcs.FSCD.2018.25,
  author =	{Nguyen, L\^{e} Th\`{a}nh Dung},
  title =	{{Unique perfect matchings and proof nets}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{25:1--25:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.25},
  URN =		{urn:nbn:de:0030-drops-91957},
  doi =		{10.4230/LIPIcs.FSCD.2018.25},
  annote =	{Keywords: correctness criteria, matching algorithms}
}
Document
Narrowing Trees for Syntactically Deterministic Conditional Term Rewriting Systems

Authors: Naoki Nishida and Yuya Maeda


Abstract
A narrowing tree for a constructor term rewriting system and a pair of terms is a finite representation for the space of all possible innermost-narrowing derivations that start with the pair and end with non-narrowable terms. Narrowing trees have grammar representations that can be considered regular tree grammars. Innermost narrowing is a counterpart of constructor-based rewriting, and thus narrowing trees can be used in analyzing constructor-based rewriting to normal forms. In this paper, using grammar representations, we extend narrowing trees to syntactically deterministic conditional term rewriting systems that are constructor systems. We show that narrowing trees are useful for proving two properties of a normal conditional term rewriting system: infeasibility of conditional critical pairs and quasi-reducibility.
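
As background on the representation involved, a regular tree grammar fits in a few lines of Haskell. This generic sketch (not the authors' implementation; all names are illustrative) maps nonterminals to productions and enumerates generated terms up to a depth bound.

module GrammarSketch where

import Data.List (intercalate)
import qualified Data.Map as Map

type NonTerminal = String

-- A production right-hand side: a function symbol applied to
-- nonterminals; alternatives are encoded as multiple productions.
data Rhs = App String [NonTerminal] deriving Show

newtype RTG = RTG (Map.Map NonTerminal [Rhs]) deriving Show

-- Enumerate the terms generated from a nonterminal, up to a depth bound.
generate :: RTG -> Int -> NonTerminal -> [String]
generate _ 0 _ = []
generate g@(RTG ps) d n =
  [ f ++ (if null ts then "" else "(" ++ intercalate "," ts ++ ")")
  | App f ns <- Map.findWithDefault [] n ps
  , ts <- mapM (generate g (d - 1)) ns
  ]

-- Example: N -> 0 | s(N) generates 0, s(0), s(s(0)), ...
-- generate (RTG (Map.fromList [("N", [App "0" [], App "s" ["N"]])])) 3 "N"
--   == ["0","s(0)","s(s(0))"]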

Cite as

Naoki Nishida and Yuya Maeda. Narrowing Trees for Syntactically Deterministic Conditional Term Rewriting Systems. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 26:1-26:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


Copy BibTex To Clipboard

@InProceedings{nishida_et_al:LIPIcs.FSCD.2018.26,
  author =	{Nishida, Naoki and Maeda, Yuya},
  title =	{{Narrowing Trees for Syntactically Deterministic Conditional Term Rewriting Systems}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{26:1--26:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.26},
  URN =		{urn:nbn:de:0030-drops-91964},
  doi =		{10.4230/LIPIcs.FSCD.2018.26},
  annote =	{Keywords: conditional term rewriting, innermost narrowing, regular tree grammar}
}
Document
Homogeneity Without Loss of Generality

Authors: Pawel Parys


Abstract
We consider higher-order recursion schemes as generators of infinite trees. A sort (simple type) is called homogeneous when all arguments of higher order are taken before any arguments of lower order. We prove that every scheme can be converted into an equivalent one (i.e., generating the same tree) that is homogeneous, that is, one that uses only homogeneous sorts. Then, we prove the same for safe schemes: every safe scheme can be converted into an equivalent safe homogeneous scheme. Furthermore, we compare two definitions of safe schemes: the original definition of Damm, and the modern one. Finally, we prove a lemma which illustrates the usefulness of the homogeneity assumption. The results are known, but we prove them in a novel way: by directly manipulating the schemes under consideration.
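
The homogeneity condition from the abstract is easy to state executably. In the Haskell sketch below (names are illustrative, definitions are the standard ones), the order of a sort is 0 for the base sort and max(order A + 1, order B) for A -> B, and a sort A1 -> ... -> An -> o is homogeneous when the orders of A1, ..., An are nonincreasing and each Ai is itself homogeneous.

module SortSketch where

data Sort = O | Arrow Sort Sort deriving Show

-- order(o) = 0, order(A -> B) = max(order A + 1, order B)
order :: Sort -> Int
order O           = 0
order (Arrow a b) = max (order a + 1) (order b)

-- The argument sorts A1, ..., An of A1 -> ... -> An -> o.
args :: Sort -> [Sort]
args O           = []
args (Arrow a b) = a : args b

homogeneous :: Sort -> Bool
homogeneous s = nonIncreasing (map order as) && all homogeneous as
  where
    as = args s
    nonIncreasing xs = and (zipWith (>=) xs (drop 1 xs))

-- homogeneous (Arrow (Arrow O O) (Arrow O O))  ==  True   ((o->o)->o->o)
-- homogeneous (Arrow O (Arrow (Arrow O O) O))  ==  False  (o->(o->o)->o)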

Cite as

Pawel Parys. Homogeneity Without Loss of Generality. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 27:1-27:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


Copy BibTex To Clipboard

@InProceedings{parys:LIPIcs.FSCD.2018.27,
  author =	{Parys, Pawel},
  title =	{{Homogeneity Without Loss of Generality}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{27:1--27:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.27},
  URN =		{urn:nbn:de:0030-drops-91972},
  doi =		{10.4230/LIPIcs.FSCD.2018.27},
  annote =	{Keywords: higher-order recursion schemes, lambda-calculus, homogeneous types, safe schemes}
}
Document
Nominal Unification with Atom and Context Variables

Authors: Manfred Schmidt-Schauß and David Sabel


Abstract
Automated deduction in higher-order program calculi, where properties of transformation rules are demanded, or confluence or other equational properties are requested, can often be done by syntactically computing overlaps (critical pairs) of reduction rules and transformation rules. Since higher-order calculi have alpha-equivalence as their fundamental equivalence, the reasoning procedure must deal with it. We define ASD1-unification problems, which are higher-order equational unification problems employing variables for atoms, expressions and contexts, with additional distinct-variable constraints, and which have to be solved w.r.t. alpha-equivalence. Our proposal is to extend nominal unification to solve these unification problems. We succeeded in constructing the nominal unification algorithm NomUnifyASD. We show that NomUnifyASD is sound and complete for this problem class, and outputs a set of unifiers with constraints in nondeterministic polynomial time if the final constraints are satisfiable. We also show that solvability of the output constraints can be decided in NEXPTIME, and, for a fixed number of context-variables, in NP. For terms without context-variables and atom-variables, NomUnifyASD runs in polynomial time, is unitary, and extends the classical problem by permitting distinct-variable constraints.
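
To fix intuitions about the extended syntax, the sketch below gives a Haskell datatype with the three kinds of variables the abstract mentions (atoms, expressions, contexts) and the basic nominal operation of swapping two concrete atoms. This is a heavily simplified guess at the shape of such terms, not the paper's ASD1 syntax; a real algorithm such as NomUnifyASD would suspend permutations on variables, which this toy version does not.

module NominalSketch where

type Atom = String

data AtomV = Conc Atom | AV String      -- concrete atom or atom variable
  deriving (Eq, Show)

data Expr
  = At AtomV                            -- an atom (possibly a variable)
  | Lam AtomV Expr                      -- binder
  | App Expr Expr
  | EV String                           -- expression variable
  | CV String Expr                      -- context variable applied to a term
  deriving (Eq, Show)

-- Swapping two concrete atoms, the basic nominal operation.
swap :: Atom -> Atom -> Expr -> Expr
swap a b = go
  where
    swapA (Conc c) | c == a = Conc b
                   | c == b = Conc a
    swapA v = v
    go (At v)    = At (swapA v)
    go (Lam v e) = Lam (swapA v) (go e)
    go (App l r) = App (go l) (go r)
    go (EV x)    = EV x      -- a full algorithm suspends the permutation here
    go (CV x e)  = CV x (go e)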

Cite as

Manfred Schmidt-Schauß and David Sabel. Nominal Unification with Atom and Context Variables. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 28:1-28:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


Copy BibTex To Clipboard

@InProceedings{schmidtschau_et_al:LIPIcs.FSCD.2018.28,
  author =	{Schmidt-Schau{\ss}, Manfred and Sabel, David},
  title =	{{Nominal Unification with Atom and Context Variables}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{28:1--28:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.28},
  URN =		{urn:nbn:de:0030-drops-91983},
  doi =		{10.4230/LIPIcs.FSCD.2018.28},
  annote =	{Keywords: automated deduction, nominal unification, lambda calculus, functional programming}
}
Document
Cumulative Inductive Types In Coq

Authors: Amin Timany and Matthieu Sozeau


Abstract
In order to avoid well-known paradoxes associated with self-referential definitions, higher-order dependent type theories stratify the theory using a countably infinite hierarchy of universes (also known as sorts), Type_0 : Type_1 : .... Such type systems are called cumulative if for any type A we have that A : Type_i implies A : Type_{i+1}. The Predicative Calculus of Inductive Constructions (pCIC), which forms the basis of the Coq proof assistant, is one such system. In this paper we present the Predicative Calculus of Cumulative Inductive Constructions (pCuIC), which extends the cumulativity relation to inductive types. We discuss cumulative inductive types as present in Coq 8.7 and their application to formalization and definitional translations.
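
In inference-rule form, the cumulativity described in the abstract reads as follows (a schematic LaTeX rendering; the pCuIC subtyping rule for inductive types in the paper is more refined, comparing the two instances of an inductive type universe by universe):

\[
\frac{\Gamma \vdash A : \mathsf{Type}_i}{\Gamma \vdash A : \mathsf{Type}_{i+1}}
\qquad\text{and, more generally,}\qquad
\frac{\Gamma \vdash A : \mathsf{Type}_i \qquad i \le j}{\Gamma \vdash A : \mathsf{Type}_j}
\]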

Cite as

Amin Timany and Matthieu Sozeau. Cumulative Inductive Types In Coq. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 29:1-29:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


Copy BibTex To Clipboard

@InProceedings{timany_et_al:LIPIcs.FSCD.2018.29,
  author =	{Timany, Amin and Sozeau, Matthieu},
  title =	{{Cumulative Inductive Types In Coq}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{29:1--29:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.29},
  URN =		{urn:nbn:de:0030-drops-91991},
  doi =		{10.4230/LIPIcs.FSCD.2018.29},
  annote =	{Keywords: Coq, Proof Assistants, Inductive Types, Universes, Cumulativity}
}
Document
Completion for Logically Constrained Rewriting

Authors: Sarah Winkler and Aart Middeldorp


Abstract
We propose an abstract completion procedure for logically constrained term rewrite systems (LCTRSs). This procedure can be instantiated to both standard Knuth-Bendix completion and ordered completion for LCTRSs, and we present a succinct and uniform correctness proof. A prototype implementation illustrates the viability of the new completion approach.
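
For readers unfamiliar with LCTRSs, each rule carries a logical constraint over a built-in theory. The classic example below (standard in the LCTRS literature, not taken from the paper) also shows the kind of constrained critical pair a completion procedure must handle:

\begin{align*}
  \mathsf{max}(x, y) &\to x \quad [\, x \ge y \,] \\
  \mathsf{max}(x, y) &\to y \quad [\, y \ge x \,]
\end{align*}

The root overlap of the two rules yields the critical pair $x \approx y$ with constraint $x \ge y \wedge y \ge x$, which over the integers entails $x = y$, so the pair is trivially joinable.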

Cite as

Sarah Winkler and Aart Middeldorp. Completion for Logically Constrained Rewriting. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 30:1-30:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


Copy BibTex To Clipboard

@InProceedings{winkler_et_al:LIPIcs.FSCD.2018.30,
  author =	{Winkler, Sarah and Middeldorp, Aart},
  title =	{{Completion for Logically Constrained Rewriting}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{30:1--30:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.30},
  URN =		{urn:nbn:de:0030-drops-92001},
  doi =		{10.4230/LIPIcs.FSCD.2018.30},
  annote =	{Keywords: Constrained rewriting, completion, automation, theorem proving}
}
Document
System Description
ProTeM: A Proof Term Manipulator (System Description)

Authors: Christina Kohl and Aart Middeldorp


Abstract
Proof terms are a useful concept for reasoning about computations in term rewriting. Human calculation with proof terms is tedious and error-prone. We present ProTeM, a new tool that offers support for manipulating proof terms that represent multisteps in left-linear rewrite systems.
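
For orientation, proof terms name rewrite rules and record where they are applied; the following standard example (textbook notation, not specific to ProTeM) shows the kind of objects such a tool manipulates. Given named rules

\[
\alpha \colon f(x) \to g(x) \qquad \beta \colon a \to b
\]

the proof term $\alpha(\beta)$ witnesses the multistep $f(a) \twoheadrightarrow g(b)$, contracting both redexes simultaneously, while the composition $\alpha(a) \mathbin{;} g(\beta)$ witnesses the sequence $f(a) \to g(a) \to g(b)$.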

Cite as

Christina Kohl and Aart Middeldorp. ProTeM: A Proof Term Manipulator (System Description). In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 31:1-31:8, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


Copy BibTex To Clipboard

@InProceedings{kohl_et_al:LIPIcs.FSCD.2018.31,
  author =	{Kohl, Christina and Middeldorp, Aart},
  title =	{{ProTeM: A Proof Term Manipulator}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{31:1--31:8},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.31},
  URN =		{urn:nbn:de:0030-drops-92013},
  doi =		{10.4230/LIPIcs.FSCD.2018.31},
  annote =	{Keywords: Proof terms, term rewriting, interactive tool}
}
Document
Confluence Competition 2018

Authors: Takahito Aoto, Makoto Hamana, Nao Hirokawa, Aart Middeldorp, Julian Nagele, Naoki Nishida, Kiraku Shintani, and Harald Zankl


Abstract
We report on the 2018 edition of the Confluence Competition, a competition of software tools that aim to (dis)prove confluence and related properties of rewrite systems automatically.

Cite as

Takahito Aoto, Makoto Hamana, Nao Hirokawa, Aart Middeldorp, Julian Nagele, Naoki Nishida, Kiraku Shintani, and Harald Zankl. Confluence Competition 2018. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 32:1-32:5, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


Copy BibTex To Clipboard

@InProceedings{aoto_et_al:LIPIcs.FSCD.2018.32,
  author =	{Aoto, Takahito and Hamana, Makoto and Hirokawa, Nao and Middeldorp, Aart and Nagele, Julian and Nishida, Naoki and Shintani, Kiraku and Zankl, Harald},
  title =	{{Confluence Competition 2018}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{32:1--32:5},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.32},
  URN =		{urn:nbn:de:0030-drops-92023},
  doi =		{10.4230/LIPIcs.FSCD.2018.32},
  annote =	{Keywords: Confluence, competition, rewrite systems}
}
