eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
0
0
10.4230/LIPIcs.FSCD.2016
article
LIPIcs, Volume 52, FSCD'16, Complete Volume
Kesner, Delia
Pientka, Brigitte
LIPIcs, Volume 52, FSCD'16, Complete Volume
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016/LIPIcs.FSCD.2016.pdf
Theory of Computation, Computation by abstract devices, Analysis of algorithms and problem complexity, Logics and meanings of programs, Mathematical logic and formal languages, Programming techniques, Software/Program Verification, Programming languages, Deduction and Theorem Proving
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
0:i
0:xviii
10.4230/LIPIcs.FSCD.2016.0
article
Front Matter, Table of Contents, Preface, Steering Committee, Program Committee, External Reviewers, Organising Committee
Kesner, Delia
Pientka, Brigitte
Front Matter, Table of Contents, Preface, Steering Committee, Program Committee, External Reviewers, Organising Committee
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.0/LIPIcs.FSCD.2016.0.pdf
Front Matter
Table of Contents
Preface
Steering Committee
Program Committee
External Reviewers
Organising Committee
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
1:1
1:1
10.4230/LIPIcs.FSCD.2016.1
article
Compositional Compiler Verification for a Multi-Language World
Ahmed, Amal
Verified compilers are typically proved correct under severe
restrictions on what the compiler's output may be linked with, from no
linking at all to linking only with code compiled from the same source
language. Such assumptions contradict the reality of how we use these
compilers, since most software systems today are composed of
components written in different languages compiled by different
compilers to a common target, as well as low-level libraries that may
be handwritten in the target language.
The key challenge in verifying compilers for today's world of
multi-language software is how to formally state a compiler
correctness theorem that is compositional along two dimensions.
First, the theorem must guarantee correct compilation of components
while allowing compiled code to be composed (linked) with
target-language components of arbitrary provenance, including those
compiled from other languages. Second, the theorem must support
verification of multi-pass compilers by composing correctness proofs
for individual passes. In this talk, I will describe a methodology
for verifying compositional compiler correctness for a higher-order
typed language and discuss the challenges that lie ahead. I will
argue that compositional compiler correctness is, in essence, a
language interoperability problem: for viable solutions in the long
term, high-level languages must be equipped with principled
foreign-function interfaces that specify safe interoperability
between high-level and low-level components, and between more
precisely and less precisely typed code.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.1/LIPIcs.FSCD.2016.1.pdf
verified compilation; compositional compiler correctness
multi-language semantics
typed low-level languages
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
2:1
2:2
10.4230/LIPIcs.FSCD.2016.2
article
Coalgebras and Higher-Order Computation: a GoI Approach
Hasuo, Ichiro
Girard's geometry of interaction (GoI) can be seen---in one practical
aspect of it---as a compositional compilation method from functional
programs to sequential machines. There, tokens move around and express
interactions between (parts of) programs. Intrigued by the combination
of abstract structures and concrete dynamics in GoI, our line of work
has aimed at exploiting, in GoI, results from the theory of
coalgebra---a categorical abstraction of state-based transition
systems that has found its use principally in concurrency theory. This
reinforced connection between higher-order computation and state-based
dynamics is made possible by an elegant categorical axiomatization of
GoI by Abramsky, Haghverdi and Scott, where traced monoidal categories
are identified as the essential structure behind it. In the talk I
shall lay out these basic ideas, together with
some of our results on: GoI semantics for a quantum programming
language; and our ``memoryful'' extension of GoI with algebraic
effects.
The talk is based on my joint work with my colleague Naohiko Hoshino (RIMS, Kyoto University) and my (former) students Koko Muroya (University of Birmingham) and Toshiki Kataoka
(University of Tokyo), to whom I owe special thanks.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.2/LIPIcs.FSCD.2016.2.pdf
functional programming
geometry of interaction
categorical semantics
coalgebra
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
3:1
3:2
10.4230/LIPIcs.FSCD.2016.3
article
Teaching Foundations of Computation and Deduction Through Literate Functional Programming and Type Theory Formalization
Huet, Gérard
We describe experiments in teaching fundamental informatics notions around mathematical structures for formal concepts, and effective algorithms to manipulate them. The major themes of lambda-calculus and type theory served as guides for the effective implementation of functional programming languages and higher-order proof assistants, appropriate for reflecting the theoretical material into effective tools to represent constructively the concepts and formally certify the proofs of their properties. Progressively, a literate programming and proving style replaced informal mathematics in the presentation of the material as executable course notes. The talk will evoke the various stages of (in)completion of the corresponding set of notes over the years, and tell how their elaboration proved to be essential to the discovery of fundamental results.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.3/LIPIcs.FSCD.2016.3.pdf
Foundations
Computation
Deduction
Programming
Proofs
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
4:1
4:2
10.4230/LIPIcs.FSCD.2016.4
article
Verified Analysis of Functional Data Structures
Nipkow, Tobias
In recent work the author has analyzed a number of classical
functional search tree and priority queue implementations with the
help of the theorem prover Isabelle/HOL. The functional correctness
proofs of AVL trees, red-black trees, 2-3 trees, 2-3-4 trees, 1-2
brother trees, AA trees and splay trees could be automated. The
amortized logarithmic complexity of skew heaps, splay trees, splay
heaps and pairing heaps had to be proved manually.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.4/LIPIcs.FSCD.2016.4.pdf
Program Verification
Algorithm Analysis
Functional Programming
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
5:1
5:15
10.4230/LIPIcs.FSCD.2016.5
article
Strong Normalization for the Parameter-Free Polymorphic Lambda Calculus Based on the Omega-Rule
Akiyoshi, Ryota
Terui, Kazushige
Following Aehlig, we consider a hierarchy F^p= { F^p_n }_{n in Nat} of
parameter-free subsystems of System F, where each F^p_n
corresponds to ID_n, the theory of n-times iterated inductive
definitions (thus our F^p_n corresponds to the (n+1)-th system of
Aehlig). We here present two proofs of strong normalization for
F^p_n, which are directly formalizable with inductive definitions.
The first one, based on the Joachimski-Matthes method, can be fully
formalized in ID_{n+1}. This provides a tight upper bound on the
complexity of the normalization theorem for System F^p_n. The
second one, based on the Godel-Tait method, can be locally
formalized in ID_n. This provides a direct proof to the known
result that the representable functions in F^p_n are provably
total in ID_n. In both cases, Buchholz' Omega-rule plays a
central role.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.5/LIPIcs.FSCD.2016.5.pdf
Polymorphic Lambda Calculus
Strong Normalization
Computability Predicate
Infinitary Proof Theory
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
6:1
6:16
10.4230/LIPIcs.FSCD.2016.6
article
Normalisation by Evaluation for Dependent Types
Altenkirch, Thorsten
Kaposi, Ambrus
We develop normalisation by evaluation (NBE) for dependent types based
on presheaf categories. Our construction is formulated in internal
type theory using quotient inductive types. We use a typed
presentation, hence there are no preterms or realizers in our
construction. NBE for simple types uses a logical relation between
the syntax and the presheaf interpretation. In our construction, we
merge the presheaf interpretation and the logical relation into a
proof-relevant logical predicate. We have formalized parts of the
construction in Agda.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.6/LIPIcs.FSCD.2016.6.pdf
normalisation by evaluation
dependent types
internal type theory
logical relations
Agda
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
7:1
7:15
10.4230/LIPIcs.FSCD.2016.7
article
Minimal Paradefinite Logics for Reasoning with Incompleteness and Inconsistency
Arieli, Ofer
Avron, Arnon
Paradefinite (`beyond the definite') logics are logics that can be
used for handling contradictory or partial information. As such,
paradefinite logics should be both paraconsistent and paracomplete. In
this paper we consider the simplest semantic framework for defining
paradefinite logics, consisting of four-valued matrices, and study the
most widely accepted logics that are induced by these matrices.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.7/LIPIcs.FSCD.2016.7.pdf
Paraconsistency
Paracompleteness
4-valued logics
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
8:1
8:18
10.4230/LIPIcs.FSCD.2016.8
article
Structural Interactions and Absorption of Structural Rules in BI Sequent Calculus
Arisaka, Ryuta
The development of a contraction-free BI sequent calculus, whether the
contraction-freeness is implicit or explicit, has not been successful
in the literature. We address this problem by presenting such a sequent
system. Our calculus involves no structural rules. It should provide
insight into non-formula contraction absorption in other non-classical
logics. Contraction absorption in sequent calculus is associated with
simpler cut elimination and with efficient proof search.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.8/LIPIcs.FSCD.2016.8.pdf
cut-elimination
contraction-free
sequent calculus
proof theory
BI
logic combination
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
9:1
9:17
10.4230/LIPIcs.FSCD.2016.9
article
Environmental Bisimulations for Delimited-Control Operators with Dynamic Prompt Generation
Aristizábal, Andrés
Biernacki, Dariusz
Lenglet, Sergueï
Polesiuk, Piotr
We present sound and complete environmental bisimilarities for a
variant of Dybvig et al.'s calculus of multi-prompted
delimited-control operators with dynamic prompt generation. The
reasoning principles that we obtain generalize and advance the
existing techniques for establishing program equivalence in calculi
with single-prompted delimited control.
The basic theory that we develop is presented using Madiot et al.'s
framework that allows for smooth integration and composition of up-to
techniques facilitating bisimulation proofs. We also generalize the
framework in order to express environmental bisimulations that support
equivalence proofs of evaluation contexts representing
continuations. This change leads to a novel and powerful up-to
technique enhancing bisimulation proofs in the presence of control
operators.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.9/LIPIcs.FSCD.2016.9.pdf
delimited continuation
dynamic prompt generation
contextual equivalence
environmental bisimulation
up-to technique
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
10:1
10:18
10.4230/LIPIcs.FSCD.2016.10
article
Complexity of Acyclic Term Graph Rewriting
Avanzini, Martin
Moser, Georg
Term rewriting has been used as a formal model to reason about the
complexity of logic, functional, and imperative programs. In contrast
to term rewriting, term graph rewriting permits sharing of
common sub-expressions, and consequently is able to capture more
closely reasonable implementations of rule based languages. However,
the automated complexity analysis of term graph rewriting has received
little to no attention.
With this work, we provide first steps towards overcoming this
situation. We present adaptations of two prominent complexity
techniques from term rewriting, viz., the interpretation method and
dependency tuples. Our adaptations are non-trivial, in the sense
that they can observe not only term but also graph structures, i.e.,
take sharing into account. In turn, the developed methods allow us to
more precisely estimate the runtime complexity of programs where
sharing of sub-expressions is essential.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.10/LIPIcs.FSCD.2016.10.pdf
Program Analysis
Graph Rewriting
Complexity Analysis
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
11:1
11:17
10.4230/LIPIcs.FSCD.2016.11
article
Nominal Narrowing
Ayala-Rincón, Mauricio
Fernández, Maribel
Nantes-Sobrinho, Daniele
Nominal unification is a generalisation of first-order unification
that takes alpha-equivalence into account. In this paper, we study
nominal unification in the context of equational theories. We
introduce nominal narrowing and design a general nominal E-unification
procedure, which is sound and complete for a wide class of equational
theories. We give examples of application.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.11/LIPIcs.FSCD.2016.11.pdf
Nominal Rewriting
Nominal Unification
Matching
Narrowing
Equational Theories
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
12:1
12:16
10.4230/LIPIcs.FSCD.2016.12
article
Synthesis of Functional Programs with Help of First-Order Intuitionistic Logic
Benke, Marcin
Schubert, Aleksy
Walukiewicz-Chrzaszcz, Daria
The Curry-Howard isomorphism makes it possible to obtain functional
programs from proofs in logic. We analyse the problem of program
synthesis for ML programs with algebraic types and relate it to the
proof search problems in appropriate logics. The problem of synthesis
for closed programs is easily seen to be equivalent to proof construction in
intuitionistic propositional logic and thus fits in the class of
PSPACE-complete problems. We focus further attention on the synthesis
problem relative to a given external library of functions. It turns
out that the problem is undecidable for unbounded instantiation in
ML. However, its restriction to instantiations with atomic types
only results in a case equivalent to proof search in a restricted
fragment of intuitionistic first-order logic, being the core of
Sigma_1 level of the logic in the Mints hierarchy. This results in
EXPSPACE-completeness for this special case of the ML program
synthesis problem.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.12/LIPIcs.FSCD.2016.12.pdf
ML
program synthesis
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
13:1
13:17
10.4230/LIPIcs.FSCD.2016.13
article
Classical Extraction in Continuation Models
Blot, Valentin
We use the control features of continuation models to interpret proofs
in first-order classical theories. This interpretation is suitable for
extracting algorithms from proofs of Pi^0_2 formulas. It is
fundamentally different from the usual direct interpretation, which is
shown to be equivalent to Friedman's trick. The main difference is
that atomic formulas and natural numbers are interpreted as distinct
objects. Nevertheless, the control features inherent to the
continuation models permit extraction using a special "channel" on
which the extracted value is transmitted at toplevel without unfolding
the recursive calls. We prove that the technique fails in Scott
domains, but succeeds in the refined setting of Laird's bistable
bicpos, as well as in game semantics.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.13/LIPIcs.FSCD.2016.13.pdf
Extraction
Classical Logic
Control Operators
Game Semantics
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
14:1
14:15
10.4230/LIPIcs.FSCD.2016.14
article
Proving Correctness of Logically Decorated Graph Rewriting Systems
Brenas, Jon Haël
Echahed, Rachid
Strecker, Martin
We first introduce the notion of logically decorated rewriting systems where the left-hand sides are endowed with logical formulas which help to express positive as well as negative application conditions, in addition to classical pattern-matching. These systems are defined using graph structures and an extension of combinatory propositional
dynamic logic, CPDL, with restricted universal programs, called C2PDL. In a second step, we tackle the problem of proving the correctness of logically decorated graph rewriting systems by using a Hoare-like calculus. We introduce a notion of specification defined as a tuple (Pre, Post, R, S) with Pre and Post being formulas of C2PDL, R a rewriting system and S a rewriting strategy. We provide a sound calculus which infers proof obligations of the considered specifications and establish the decidability of the verification problem of the (partial) correctness of the considered specifications.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.14/LIPIcs.FSCD.2016.14.pdf
Graph Rewriting
Hoare Logic
Combinatory PDL
Rewrite Strategies
Program Verification
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
15:1
15:18
10.4230/LIPIcs.FSCD.2016.15
article
New Results on Morris's Observational Theory: The Benefits of Separating the Inseparable
Breuvart, Flavien
Manzonetto, Giulio
Polonsky, Andrew
Ruoppolo, Domenico
Working in the untyped lambda calculus, we study Morris's
lambda-theory H+. Introduced in 1968, this is the original
extensional theory of contextual equivalence. On the syntactic side,
we show that this lambda-theory validates the omega-rule, thus
settling a long-standing open problem. On the semantic side, we
provide necessary and sufficient conditions for relational graph
models to be fully abstract for H+. We show that a relational graph
model captures Morris's observational preorder exactly when it is
extensional and lambda-Konig. Intuitively, a model is lambda-Konig
when every lambda-definable tree has an infinite path which is
witnessed by some element of the model.
Both results follow from a weak separability property enjoyed by
terms differing only because of some infinite eta-expansion,
which is proved through a refined version of the Böhm-out technique.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.15/LIPIcs.FSCD.2016.15.pdf
Lambda calculus
relational models
fully abstract
Böhm-out
omega-rule
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
16:1
16:18
10.4230/LIPIcs.FSCD.2016.16
article
Modular Focused Proof Systems for Intuitionistic Modal Logics
Chaudhuri, Kaustuv
Marin, Sonia
Straßburger, Lutz
Focusing is a general technique for syntactically compartmentalizing
the non-deterministic choices in a proof system, which not only
improves proof search but also has the representational benefit of
distilling sequent proofs into synthetic normal forms. However, since
focusing is usually specified as a restriction of the sequent
calculus, the technique has not been transferred to logics that lack a
(shallow) sequent presentation, as is the case for some of the logics
of the modal cube. We have recently extended the focusing technique
to classical nested sequents, a generalization of ordinary sequents.
In this work we further extend focusing to intuitionistic nested
sequents, which can capture all the logics of the intuitionistic S5
cube in a modular fashion. We present an internal cut-elimination
procedure for the focused system which in turn is used to show its
completeness.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.16/LIPIcs.FSCD.2016.16.pdf
intuitionistic modal logic
focusing
proof search
cut elimination
nested sequents
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
17:1
17:18
10.4230/LIPIcs.FSCD.2016.17
article
The Independence of Markov’s Principle in Type Theory
Coquand, Thierry
Mannaa, Bassel
In this paper, we show that Markov's principle is not derivable in
dependent type theory with natural numbers and one universe. One
tentative way to prove this would be to remark that Markov's principle
does not hold in a sheaf model of type theory over Cantor space, since
Markov's principle does not hold for the generic point of this model.
It is however not clear how to interpret the universe in a sheaf
model. Instead we design an extension of type theory, which
intuitively extends type theory by the addition of a generic point of
Cantor space. We then show the consistency of this extension by a
normalization argument. Markov's principle does not hold in this
extension, and it follows that it cannot be proved in type theory.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.17/LIPIcs.FSCD.2016.17.pdf
Forcing
Dependent type theory
Markov's Principle
Cantor Space
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
18:1
18:15
10.4230/LIPIcs.FSCD.2016.18
article
On Undefined and Meaningless in Lambda Definability
de Vries, Fer-Jan
We distinguish between undefined terms as used in lambda definability
of partial recursive functions and meaningless terms as used in
infinite lambda calculus for the infinitary term models that
generalise the Böhm model. While there are uncountably many known
sets of meaningless terms, there are only four known sets of undefined
terms. Two of these four are sets of meaningless terms.
In this paper we first present a set of sufficient conditions for a set
of lambda terms to serve as a set of undefined terms in lambda
definability of partial functions. The four known sets of undefined
terms satisfy these conditions.
Next we locate the smallest set of meaningless terms satisfying these
conditions. This set sits very low in the lattice of all sets of
meaningless terms. Any larger set of meaningless terms than this
smallest set is a set of undefined terms. Thus we find uncountably
many new sets of undefined terms.
As an unexpected bonus of our careful analysis of lambda definability
we obtain a natural modification, strict lambda-definability, which
allows for a Barendregt style of proof in which the representation of
composition is truly the composition of representations.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.18/LIPIcs.FSCD.2016.18.pdf
lambda calculus
lambda definability
partial recursive function
undefined term
meaningless term
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
19:1
19:16
10.4230/LIPIcs.FSCD.2016.19
article
The Intersection Type Unification Problem
Dudenhefner, Andrej
Martens, Moritz
Rehof, Jakob
The intersection type unification problem is an important component in
proof search related to several natural decision problems in
intersection type systems. It remains open whether the
unification problem is decidable. We give the first nontrivial lower
bound for the problem by showing (our main result) that it is
exponential time hard. Furthermore, we show that this holds even under
rank 1 solutions (substitutions whose codomains are restricted to
contain rank 1 types). In addition, we provide a fixed-parameter
intractability result for intersection type matching (one-sided
unification), which is known to be NP-complete.
We place the intersection type unification problem in the context of
unification theory. The equational theory of intersection types can
be presented as an algebraic theory with an ACI (associative,
commutative, and idempotent) operator (intersection type) combined
with distributivity properties with respect to a second operator
(function type). Although the problem is algebraically natural and
interesting, it appears to occupy a hitherto unstudied place in the
theory of unification, and our investigation of the problem suggests
that new methods are required to understand the problem. Thus, for the
lower bound proof, we were not able to reduce from known results in
ACI-unification theory; instead, we use game-theoretic methods for
two-player tiling games.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.19/LIPIcs.FSCD.2016.19.pdf
Intersection Type
Equational Theory
Unification
Tiling
Complexity
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
20:1
20:18
10.4230/LIPIcs.FSCD.2016.20
article
Computing Connected Proof(-Structure)s From Their Taylor Expansion
Guerrieri, Giulio
Pellissier, Luc
Tortora de Falco, Lorenzo
We show that every connected Multiplicative Exponential Linear Logic (MELL) proof-structure (with or without cuts) is uniquely determined by a well-chosen element of its Taylor expansion: the one obtained by taking two copies of the content of each box. As a consequence, the relational
model is injective with respect to connected MELL proof-structures.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.20/LIPIcs.FSCD.2016.20.pdf
proof-nets
(differential) linear logic
relational model
Taylor expansion
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
21:1
21:18
10.4230/LIPIcs.FSCD.2016.21
article
Strongly Normalising Cyclic Data Computation by Iteration Categories of Second-Order Algebraic Theories
Hamana, Makoto
Cyclic data structures, such as cyclic lists, in functional
programming are tricky to handle because of their cyclicity. This
paper presents an investigation of categorical, algebraic, and
computational foundations of cyclic datatypes. Our framework of
cyclic datatypes is based on second-order algebraic theories of Fiore
et al., which give a uniform setting for syntax, types, and
computation rules for describing and reasoning about cyclic datatypes.
We extract the ``fold'' computation rules from the categorical
semantics based on iteration categories of Bloom and Esik. Thereby,
the rules are correct by construction. Finally, we prove strong
normalisation using the General Schema criterion for second-order
computation rules. Rather than the fixed-point law, we particularly
choose Bekić's law for computation, which is key to obtaining strong
normalisation.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.21/LIPIcs.FSCD.2016.21.pdf
cyclic data structures
traced cartesian category
fixed point
functional programming
fold
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
22:1
22:17
10.4230/LIPIcs.FSCD.2016.22
article
Non-Omega-Overlapping TRSs are UN
Kahrs, Stefan
Smith, Connor
This paper solves problem #79 of RTA's list of open
problems --- in the positive. If the rules of a TRS do not overlap w.r.t.
substitutions of infinite terms, then the TRS has unique normal forms.
We solve the problem by reducing the problem to one of consistency for
"similar" constructor term rewriting systems. For this we introduce
a new proof technique. We define a relation ⇓ that is
consistent by construction, and which --- if transitive --- would
coincide with the rewrite system's equivalence relation =R.
We then prove the transitivity of ⇓ by coalgebraic
reasoning. Any concrete proof for instances of this relation only
refers to terms of some finite coalgebra, and we then construct an
equivalence relation on that coalgebra which coincides with ⇓.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.22/LIPIcs.FSCD.2016.22.pdf
consistency
omega-substitutions
uniqueness of normal forms
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
23:1
23:18
10.4230/LIPIcs.FSCD.2016.23
article
Complexity Hierarchies and Higher-Order Cons-Free Rewriting
Kop, Cynthia
Grue Simonsen, Jakob
Constructor rewriting systems are said to be cons-free if, roughly,
constructor terms in the right-hand sides of rules are subterms of
constructor terms in the left-hand side; the computational intuition
is that rules cannot build new data structures. It is well-known that
cons-free programming languages can be used to characterize
computational complexity classes, and that cons-free first-order term
rewriting can be used to characterize the set of polynomial-time
decidable sets.
We investigate cons-free higher-order term rewriting systems, the
complexity classes they characterize, and how these depend on the
order of the types used in the systems. We prove that, for every k >=
1, left-linear cons-free systems with type order k characterize
E^kTIME if arbitrary evaluation is used (i.e., the system does not
have a fixed reduction strategy).
The main difference with prior work in implicit complexity is that (i)
our results hold for non-orthogonal term rewriting systems with
possible rule overlaps with no assumptions about reduction strategy,
(ii) results for such term rewriting systems have previously only been
obtained for k = 1, and with additional syntactic restrictions on top
of cons-freeness and left-linearity.
Our results are apparently among the first implicit characterizations
of the hierarchy E^1TIME != E^2TIME != .... Our work confirms prior
results that having full non-determinism (via overlaps of rules) does
not directly allow for characterization of non-deterministic
complexity classes like NE. We also show that non-determinism makes
the classes characterized highly sensitive to minor syntactic changes
such as admitting product types or non-left-linear rules.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.23/LIPIcs.FSCD.2016.23.pdf
higher-order term rewriting
implicit complexity
cons-freeness
ETIME hierarchy
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
24:1
24:15
10.4230/LIPIcs.FSCD.2016.24
article
Weighted Relational Models for Mobility
Laird, James
We investigate operational and denotational semantics for
computational and concurrent systems with mobile names which capture
their computational properties. For example, various properties of
fixed networks, such as shortest or longest path, transition
probabilities, and secure data flows, correspond to the "sum" in a
semiring of the weights of paths through the network: we aim to model
networks with a dynamic topology in a similar way. Alongside rich
computational formalisms such as the lambda-calculus, these can be
represented as terms in a calculus of solos with weights from a
complete semiring R, so that reduction associates a weight in R to
each reduction path.
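The sum-of-paths idea for fixed networks can be sketched with a generic closure of a weight matrix, parameterised by a semiring. The function and parameter names are my assumptions, and the single Floyd-Warshall-shaped pass is only adequate for idempotent semirings such as the tropical one used here; it is not the paper's model:

```python
# Sketch: "sum over all paths" in a semiring for a fixed network.
# With the tropical semiring (min, +, inf, 0) the path-sum is the
# shortest-path distance; a Boolean semiring would give reachability.

from itertools import product

def path_sum(weights, plus, times, zero, one):
    """Closure of the weight matrix: semiring sum over paths."""
    n = len(weights)
    d = [[weights[i][j] for j in range(n)] for i in range(n)]
    for i in range(n):
        d[i][i] = plus(d[i][i], one)   # account for the empty path
    for k in range(n):
        for i, j in product(range(n), repeat=2):
            d[i][j] = plus(d[i][j], times(d[i][k], d[k][j]))
    return d

INF = float('inf')
# 3-node network; weights[i][j] is the edge weight (INF = no edge)
w = [[INF, 3, INF],
     [INF, INF, 4],
     [INF, INF, INF]]

shortest = path_sum(w, min, lambda a, b: a + b, INF, 0)
print(shortest[0][2])   # 7: the only path is 0 -> 1 -> 2
```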
Taking inspiration from differential nets, we develop a denotational
semantics for this calculus in the category of sets and R-weighted
relations, based on its differential and compact-closed structure, but
giving a simple, syntax-independent representation of terms as
matrices over R. We show that this corresponds to the sum in R of
the values associated to its independent reduction paths, and that our
semantics is fully abstract with respect to the observational
equivalence induced by sum-of-paths evaluation.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.24/LIPIcs.FSCD.2016.24.pdf
Concurrency
Mobility
Denotational Semantics
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
25:1
25:17
10.4230/LIPIcs.FSCD.2016.25
article
Focusing in Orthologic
Laurent, Olivier
We propose new sequent calculus systems for orthologic (also
known as minimal quantum logic) which satisfy the cut
elimination property. The first one is a very simple system relying on
the involutive status of negation. The second one incorporates the
notion of focusing (coming from linear logic) to add
constraints on proofs and thus to facilitate proof search. We
demonstrate the benefits of the new systems in automatic
proof search for orthologic.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.25/LIPIcs.FSCD.2016.25.pdf
orthologic
focusing
minimal quantum logic
linear logic
automatic proof search
cut elimination
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
26:1
26:17
10.4230/LIPIcs.FSCD.2016.26
article
Functions-as-Constructors Higher-Order Unification
Libal, Tomer
Miller, Dale
Unification is a central operation in the construction of a range of
computational logic systems based on first-order and higher-order
logics. First-order unification has a number of properties that
dominate the way it is incorporated within such systems. In
particular, first-order unification is decidable, unary, and can be
performed on untyped term structures. None of these three properties
hold for full higher-order unification: unification is undecidable,
unifiers can be incomparable, and term-level typing can dominate the
search for unifiers. The so-called pattern subset of
higher-order unification was designed to be a small extension to
first-order unification that respected the basic laws governing
lambda-binding (the equalities of alpha, beta, and
eta-conversion) but which also satisfied those three properties.
While the pattern fragment of higher-order unification has been
popular in various implemented systems and in various theoretical
considerations, it is too weak for a number of applications. In this
paper, we define an extension of pattern unification that is motivated
by some existing applications and which satisfies these three
properties. The main idea behind this extension is that the arguments
to a higher-order, free variable can be more than just distinct bound
variables: they can also be terms constructed from (sufficient numbers
of) such variables using term constructors and where no argument is a
subterm of any other argument. We show that this extension to pattern
unification satisfies the three properties mentioned above.
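The restriction on a free variable's arguments described above can be sketched as a syntactic check. The term encoding and names are my own illustration; the paper's definition is stated for typed lambda-terms:

```python
# Sketch of the functions-as-constructors condition on the argument
# list of one free-variable occurrence: each argument is built from
# bound variables and constructors, and no argument is a subterm of
# any other argument (which also rules out repeated arguments).

def subterms(t):
    yield t
    if isinstance(t, tuple):
        for a in t[1]:
            yield from subterms(a)

def built_from(t, bound_vars, constructors):
    if isinstance(t, str):              # variable
        return t in bound_vars
    name, args = t
    return name in constructors and all(
        built_from(a, bound_vars, constructors) for a in args)

def fc_pattern_args(args, bound_vars, constructors):
    """Check the argument list of one free-variable occurrence."""
    if not all(built_from(a, bound_vars, constructors) for a in args):
        return False
    for i, a in enumerate(args):
        for j, b in enumerate(args):
            if i != j and a in subterms(b):
                return False
    return True

bv = {'x', 'y'}
cs = {'pair', 's'}
print(fc_pattern_args(['x', ('s', ['y'])], bv, cs))          # True
print(fc_pattern_args(['x', ('pair', ['x', 'y'])], bv, cs))  # False: x is a subterm
```

The original pattern fragment corresponds to the special case where every argument is itself a distinct bound variable.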
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.26/LIPIcs.FSCD.2016.26.pdf
higher-order unification
pattern unification
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
27:1
27:17
10.4230/LIPIcs.FSCD.2016.27
article
Homological Computations for Term Rewriting Systems
Malbos, Philippe
Mimram, Samuel
An important problem in universal algebra consists in finding
presentations of algebraic theories by generators and relations, which
are as small as possible. Exhibiting lower bounds on the number of
those generators and relations for a given theory is a difficult task
because it a priori requires considering all possible sets of
generators for a theory and no general method exists. In this article,
we explain how homological computations can provide such lower bounds,
in a systematic way, and show how to actually compute those in the
case where a presentation of the theory by a convergent rewriting
system is known. We also introduce the notion of coherent presentation
of a theory in order to consider finer homotopical invariants. In some
aspects, this work generalizes, to term rewriting systems, Squier's
celebrated homological and homotopical invariants for string rewriting
systems.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.27/LIPIcs.FSCD.2016.27.pdf
term rewriting system
Lawvere theory
Tietze equivalence
resolution
homology
convergent presentation
coherent presentation
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
28:1
28:18
10.4230/LIPIcs.FSCD.2016.28
article
Reversible Term Rewriting
Nishida, Naoki
Palacios, Adrián
Vidal, Germán
Essentially, in a reversible programming language, for each forward
computation step from state S to state S', there exists a
constructive and deterministic method to go backwards from state S'
to state S. Besides its theoretical interest, reversible
computation is a fundamental concept which is relevant in many
different areas like cellular automata, bidirectional program
transformation, or quantum computing, to name a few. In this paper,
we focus on term rewriting, a computation model that underlies most
rule-based programming languages. In general, term rewriting is not
reversible, even for injective functions; namely, given a rewrite step
t1 -> t2, we do not always have a decidable and deterministic
method to get t1 from t2. Here, we introduce a conservative
extension of term rewriting that becomes reversible. Furthermore, we
also define a transformation to make a rewrite system reversible using
standard term rewriting.
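The irreversibility problem, and the flavour of a trace-based fix, can be illustrated on a two-rule toy system. This Landauer-style embedding (pairing each term with the list of fired rules) is my own simplification for intuition, not the transformation defined in the paper:

```python
# Two rules with the same result make plain rewriting irreversible:
#   r1: f(a) -> c      r2: f(b) -> c
# Given c alone, the source term cannot be recovered. Pairing the term
# with a trace of fired rules restores determinism in both directions.

RULES = {'r1': (('f', 'a'), 'c'),
         'r2': (('f', 'b'), 'c')}

def fwd(state):
    """One forward step on (term, trace); record which rule fired."""
    term, trace = state
    for name, (lhs, rhs) in RULES.items():
        if term == lhs:
            return rhs, trace + [name]
    return state  # normal form: no rule applies

def bwd(state):
    """Undo the last recorded step, deterministically."""
    term, trace = state
    name = trace[-1]
    lhs, rhs = RULES[name]
    assert term == rhs
    return lhs, trace[:-1]

s0 = (('f', 'a'), [])
s1 = fwd(s0)
print(s1)        # ('c', ['r1'])
print(bwd(s1))   # (('f', 'a'), [])
```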
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.28/LIPIcs.FSCD.2016.28.pdf
term rewriting
reversible computation
program transformation
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
29:1
29:16
10.4230/LIPIcs.FSCD.2016.29
article
Certifying Confluence of Almost Orthogonal CTRSs via Exact Tree Automata Completion
Sternagel, Christian
Sternagel, Thomas
Suzuki et al. showed that properly oriented, right-stable, orthogonal, and oriented conditional term rewrite systems with extra variables in right-hand sides are confluent. We present our Isabelle/HOL formalization of this result, including two generalizations. On the one hand, we relax proper orientedness and orthogonality to extended proper orientedness and almost orthogonality modulo infeasibility, as suggested by Suzuki et al. On the other hand, we further loosen the requirements of the latter, enabling more powerful methods for proving infeasibility of conditional critical pairs. Furthermore, we formalize a construction by Jacquemard that employs exact tree automata completion for non-reachability analysis and apply it to certify infeasibility of conditional critical pairs. Combining these two results and extending the conditional confluence checker ConCon accordingly, we are able to automatically prove and certify confluence of an important class of conditional term rewrite systems.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.29/LIPIcs.FSCD.2016.29.pdf
certification
conditional term rewriting
confluence
infeasible critical pairs
non-reachability
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
30:1
30:18
10.4230/LIPIcs.FSCD.2016.30
article
Category Theory in Coq 8.5
Timany, Amin
Jacobs, Bart
We report on our experience implementing category theory in Coq 8.5.
Our work formalizes most of basic category theory, including concepts
not covered by existing formalizations, in a library that is fit to be
used as a general-purpose category-theoretical foundation.
Our development particularly takes advantage of two features new to
Coq 8.5: primitive projections for records and universe polymorphism.
Primitive projections allow for well-behaved dualities while universe
polymorphism provides a relative notion of largeness and smallness.
The latter is one of the main contributions of this paper. It pushes
the limits of the new universe polymorphism and constraint inference
algorithm of Coq 8.5.
In this paper we present in detail smallness and largeness in
categories and the foundation on which they are built. We
furthermore explain how we have used the universe polymorphism of Coq 8.5
to represent smallness and largeness arguments by simply ignoring
them and entrusting them to the universe inference algorithm of Coq 8.5.
We also briefly discuss our experience throughout this
implementation, discuss concepts formalized in this development and
give a comparison with a few other developments of similar extent.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.30/LIPIcs.FSCD.2016.30.pdf
Category Theory
Coq 8.5
Universe Polymorphism
Homotopy Type Theory
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
31:1
31:17
10.4230/LIPIcs.FSCD.2016.31
article
Formal Languages, Formally and Coinductively
Traytel, Dmitriy
Traditionally, formal languages are defined as sets of words. More
recently, the alternative coalgebraic or coinductive representation as
infinite tries, i.e., prefix trees branching over the alphabet, has
been used to obtain compact and elegant proofs of classic results in
language theory. In this paper, we study this representation in the
Isabelle proof assistant. We define regular operations on infinite
tries and prove the axioms of Kleene algebra for those
operations. Thereby, we exercise corecursion and coinduction and
confirm that the coinductive view is profitable in formalizations, as
it improves over the set-of-words view with respect to proof automation.
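The infinite-trie view can be mimicked in Python with lazily unfolded nodes. This sketch uses my own encoding and names; the paper's development uses Isabelle codatatypes and proves the Kleene-algebra axioms, which this toy does not attempt:

```python
# A formal language as an infinite trie, built lazily: each node
# records whether the empty word is accepted and unfolds one child
# per letter on demand (the coinductive "observation" structure).

class Trie:
    def __init__(self, eps, children):
        self.eps = eps                # does the language contain ""?
        self._children = children     # letter -> child Trie, lazily
        self._cache = {}

    def child(self, a):
        if a not in self._cache:
            self._cache[a] = self._children(a)
        return self._cache[a]

    def accepts(self, word):
        node = self
        for a in word:
            node = node.child(a)
        return node.eps

def lang_of(pred):
    """Trie of the language {w | pred(w)}, unfolded corecursively."""
    def node(prefix):
        return Trie(pred(prefix), lambda a: node(prefix + a))
    return node("")

# words over {a, b} with an even number of a's
evens = lang_of(lambda w: w.count('a') % 2 == 0)
print(evens.accepts("abba"))   # True
print(evens.accepts("ab"))     # False
```

Each `child` call is the trie analogue of a word derivative, so membership testing is just a walk along the branches of the (never fully built) tree.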
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.31/LIPIcs.FSCD.2016.31.pdf
Formal languages
codatatypes
corecursion
coinduction
interactive theorem proving
Isabelle/HOL
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
32:1
32:18
10.4230/LIPIcs.FSCD.2016.32
article
Normalisation by Random Descent
van Oostrom, Vincent
Toyama, Yoshihito
We present abstract hyper-normalisation results for strategies. These
results are then applied to term rewriting systems, both first and
higher-order. For example, we show hyper-normalisation of the
left-outer strategy for what we call left-outer pattern rewrite
systems, a class comprising both Combinatory Logic and the
lambda-calculus but also systems with critical pairs. Our
results apply to strategies that need not be deterministic but do have
Newman's random descent property: all reductions to normal form have
the same length, with Huet and Lévy's external strategy being an
example. Technically, we base our development on supplementing the
usual notion of commutation diagram with a notion of order, expressing
that the measure of its right leg does not exceed that of its left
leg, where measure is an abstraction of the usual notion of length.
We give an exact characterisation of such global commutation diagrams,
for pairs of reductions, by means of local ones, for pairs of steps,
which we dub Dyck diagrams.
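Newman's random descent property mentioned above can be checked exhaustively on a small finite abstract rewrite system. The representation and helper names are my own; this brute-force enumeration is only for intuition and presupposes termination:

```python
# Random descent on a finite ARS: from a given term, every maximal
# reduction must reach the same normal form in the same number of steps.

def reductions(steps, t):
    """Yield (normal_form, length) for every maximal reduction from t."""
    succs = steps.get(t, [])
    if not succs:
        yield (t, 0)
        return
    for u in succs:
        for nf, n in reductions(steps, u):
            yield (nf, n + 1)

def random_descent(steps, t):
    return len(set(reductions(steps, t))) == 1

# a -> b, a -> c, b -> d, c -> d : all reductions from a have length 2
ars = {'a': ['b', 'c'], 'b': ['d'], 'c': ['d']}
print(random_descent(ars, 'a'))   # True

# a -> b, a -> d, b -> d : reductions of lengths 2 and 1 to the same nf
ars2 = {'a': ['b', 'd'], 'b': ['d']}
print(random_descent(ars2, 'a'))  # False
```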
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.32/LIPIcs.FSCD.2016.32.pdf
strategy
hyper-normalisation
commutation
random descent
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
33:1
33:12
10.4230/LIPIcs.FSCD.2016.33
article
Ground Confluence Prover based on Rewriting Induction
Aoto, Takahito
Toyama, Yoshihito
Ground confluence of term rewriting systems guarantees that all ground
terms are confluent. Recently, interest in proving confluence of
term rewriting systems automatically has grown, and confluence provers
have been developed. But they mainly focus on confluence and not
ground confluence. In fact, little attention has been paid to
developing tools for proving ground confluence automatically. We
report an implementation of a ground confluence prover based on
rewriting induction, which is a method originally developed for
proving inductive theorems.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.33/LIPIcs.FSCD.2016.33.pdf
Ground Confluence
Rewriting Induction
Non-Orientable Equations
Term Rewriting Systems
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
34:1
34:11
10.4230/LIPIcs.FSCD.2016.34
article
Globular: An Online Proof Assistant for Higher-Dimensional Rewriting
Bar, Krzysztof
Kissinger, Aleks
Vicary, Jamie
This article introduces Globular, an online proof assistant for the
formalization and verification of proofs in higher-dimensional
category theory. The tool produces graphical visualizations of
higher-dimensional proofs, assists in their construction with a
point-and-click interface, and performs type checking to prevent
incorrect rewrites. Hosted on the web, it has a low barrier to use,
and allows hyperlinking of formalized proofs directly from research
papers. It allows the formalization of proofs from logic, topology and
algebra which are not formalizable by other methods, and we give
several examples.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.34/LIPIcs.FSCD.2016.34.pdf
higher category theory
higher-dimensional rewriting
interactive theorem proving
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
35:1
35:11
10.4230/LIPIcs.FSCD.2016.35
article
Interaction Automata and the ia2d Interpreter
Gimenez, Stéphane
Obwaller, David
We introduce interaction automata as a topological model of
computation and present the conceptual plane interpreter ia2d.
Interaction automata form a refinement of both interaction nets and
cellular automata models that combine data deployment, memory
management and structured computation mechanisms. Their local
structure is inspired by pointer machines and allows an asynchronous
spatial distribution of the computation. Our tool can be considered
a proof-of-concept piece of abstract hardware on which functional
programs can be run in parallel.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.35/LIPIcs.FSCD.2016.35.pdf
Interaction nets
computation models
parallel computation
functional programming
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2016-06-17
52
36:1
36:12
10.4230/LIPIcs.FSCD.2016.36
article
Automating the First-Order Theory of Rewriting for Left-Linear Right-Ground Rewrite Systems
Rapp, Franziska
Middeldorp, Aart
The first-order theory of rewriting is decidable for finite left-linear
right-ground rewrite systems. We present a new tool that implements the
decision procedure for this theory. It is based on tree automata
techniques. The tool offers the possibility to synthesize rewrite systems
that satisfy properties that are expressible in the first-order theory of
rewriting.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol052-fscd2016/LIPIcs.FSCD.2016.36/LIPIcs.FSCD.2016.36.pdf
first-order theory
ground rewrite systems
automation
synthesis