LIPIcs, Volume 21

24th International Conference on Rewriting Techniques and Applications (RTA 2013)




Publication Details

  • published at: 2013-06-24
  • Publisher: Schloss Dagstuhl – Leibniz-Zentrum für Informatik
  • ISBN: 978-3-939897-53-8
  • DBLP: db/conf/rta/rta2013

Documents

Document
Complete Volume
LIPIcs, Volume 21, RTA'13, Complete Volume

Authors: Femke van Raamsdonk


Abstract
LIPIcs, Volume 21, RTA'13, Complete Volume

Cite as

24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@Proceedings{vanraamsdonk:LIPIcs.RTA.2013,
  title =	{{LIPIcs, Volume 21, RTA'13, Complete Volume}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013},
  URN =		{urn:nbn:de:0030-drops-41152},
  doi =		{10.4230/LIPIcs.RTA.2013},
  annote =	{Keywords: Programming Techniques, Software Engineering, Programming Languages, Computation by Abstract Devices, Analysis of Algorithms and Problem Complexity Logics and Meanings of Programs, Mathematical Logic and Formal Languages, Symbolic and Algebraic Manipulation, Artificial Intelligence}
}
Document
Front Matter
Frontmatter, Table of Contents, Preface, Conference Organization

Authors: Femke van Raamsdonk


Abstract
Frontmatter, Table of Contents, Preface, Conference Organization

Cite as

24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. i-xiii, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{vanraamsdonk:LIPIcs.RTA.2013.i,
  author =	{van Raamsdonk, Femke},
  title =	{{Frontmatter, Table of Contents, Preface, Conference Organization}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{i--xiii},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.i},
  URN =		{urn:nbn:de:0030-drops-40486},
  doi =		{10.4230/LIPIcs.RTA.2013.i},
  annote =	{Keywords: Frontmatter, Table of Contents, Preface, Conference Organization}
}
Document
Invited Talk
Pattern Generation by Cellular Automata (Invited Talk)

Authors: Jarkko Kari


Abstract
A one-dimensional cellular automaton is a discrete dynamical system where a sequence of symbols evolves synchronously according to a local update rule. We discuss simple update rules that make the automaton perform multiplication of numbers by a constant. If the constant and the number base are selected suitably, the automaton becomes a universal pattern generator: all finite strings over its state alphabet appear from a finite seed. In particular we consider the automata that multiply by constants 3 and 3/2 in base 6. We discuss the connections of these automata to some difficult open questions in number theory, and we pose several further questions concerning pattern generation in cellular automata.
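
The multiply-by-3 rule in base 6 admits a compact local update precisely because 3 divides 6, so each cell can compute its incoming carry from a single neighbour. Below is a minimal Python sketch of this classical construction; the encoding (most-significant digit first) and the function name are our own, not the paper's.

def step(cells, k=3, base=6):
    """One synchronous update: multiply the represented number by k.
    cells[i] is a base-6 digit; its right neighbour is the next less
    significant digit. Since k divides base, the incoming carry
    (k * right_digit) // base is determined locally."""
    c = [0] + cells + [0]              # room for a new leading digit
    return [(k * c[i]) % base + (k * c[i + 1]) // base
            for i in range(len(c) - 1)]

digits = [5]                           # the number 5 in base 6
for _ in range(3):
    digits = step(digits)
    print(digits)                      # [2, 3] = 15, [1, 1, 3] = 45, [0, 3, 4, 3] = 135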

Cite as

Jarkko Kari. Pattern Generation by Cellular Automata (Invited Talk). In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 1-3, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{kari:LIPIcs.RTA.2013.1,
  author =	{Kari, Jarkko},
  title =	{{Pattern Generation by Cellular Automata}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{1--3},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.1},
  URN =		{urn:nbn:de:0030-drops-40490},
  doi =		{10.4230/LIPIcs.RTA.2013.1},
  annote =	{Keywords: cellular automata, pattern generation, Z-numbers}
}
Document
Invited Talk
Husserl and Hilbert on Completeness and Husserl's Term Rewrite-based Theory of Multiplicity (Invited Talk)

Authors: Mitsuhiro Okada


Abstract
Hilbert and Husserl presented axiomatic arithmetic theories in different ways and proposed two different notions of 'completeness' for arithmetic at the turn of the 20th century (1900-1901). The former led to the completion axiom, the latter to completion of rewriting. We look into the latter in comparison with the former. The key notion for understanding the latter is the notion of definite multiplicity or manifold (Mannigfaltigkeit). We show that Husserl's notion of multiplicity can be understood by means of term rewrite theory in a very coherent manner, and that his notion of 'definite' multiplicity can be understood as the relational web (or tissue) structure whose core is a 'convergent' term rewrite proof structure. We examine how Husserl introduced his term rewrite theory in 1901 in the context of a controversy with Hilbert on the notion of completeness, and in the context of solving the justification problem of the use of imaginaries in mathematics, which was an important issue in the foundations of mathematics at the time.

Cite as

Mitsuhiro Okada. Husserl and Hilbert on Completeness and Husserl's Term Rewrite-based Theory of Multiplicity (Invited Talk). In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 4-19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{okada:LIPIcs.RTA.2013.4,
  author =	{Okada, Mitsuhiro},
  title =	{{Husserl and Hilbert on Completeness and Husserl's Term Rewrite-based Theory of Multiplicity}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{4--19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.4},
  URN =		{urn:nbn:de:0030-drops-40524},
  doi =		{10.4230/LIPIcs.RTA.2013.4},
  annote =	{Keywords: History of term rewrite theory, Husserl, Hilbert, proof theory, Knuth-Bendix completion}
}
Document
Invited Talk
Evidence Normalization in System FC (Invited Talk)

Authors: Dimitrios Vytiniotis and Simon Peyton Jones


Abstract
System FC is an explicitly typed language that serves as the target language for Haskell source programs. System FC is based on System F with the addition of erasable but explicit type equality proof witnesses. Equality proof witnesses are generated from type inference performed on source Haskell programs. Such witnesses may be very large objects, which causes performance degradation in later stages of compilation, and makes it hard to debug the results of type inference and subsequent program transformations. In this paper we present an equality proof simplification algorithm, implemented in GHC, which greatly reduces the size of the target System FC programs.
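
To make "proof simplification" concrete: evidence built from reflexivity, symmetry, and transitivity satisfies groupoid laws that can be oriented as size-reducing rewrite rules. The Python sketch below normalizes a toy coercion AST of our own invention; it illustrates the flavour of evidence rewriting, not GHC's actual algorithm or data types.

# Coercions: ('refl',), ('sym', c), ('trans', c1, c2).
# Rules: sym(refl) -> refl, sym(sym(c)) -> c,
#        trans(refl, c) -> c, trans(c, refl) -> c.
REFL = ('refl',)

def simplify(c):
    if c[0] == 'sym':
        a = simplify(c[1])
        if a == REFL:
            return REFL                # symmetry of reflexivity
        if a[0] == 'sym':
            return a[1]                # double symmetry cancels
        return ('sym', a)
    if c[0] == 'trans':
        a, b = simplify(c[1]), simplify(c[2])
        if a == REFL:
            return b                   # refl is a unit for trans
        if b == REFL:
            return a
        return ('trans', a, b)
    return c

print(simplify(('trans', ('sym', ('sym', REFL)), ('sym', REFL))))  # ('refl',)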

Cite as

Dimitrios Vytiniotis and Simon Peyton Jones. Evidence Normalization in System FC (Invited Talk). In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 20-38, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{vytiniotis_et_al:LIPIcs.RTA.2013.20,
  author =	{Vytiniotis, Dimitrios and Peyton Jones, Simon},
  title =	{{Evidence Normalization in System FC}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{20--38},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.20},
  URN =		{urn:nbn:de:0030-drops-40506},
  doi =		{10.4230/LIPIcs.RTA.2013.20},
  annote =	{Keywords: Haskell, type functions, system FC}
}
Document
Linear Logic and Strong Normalization

Authors: Beniamino Accattoli


Abstract
Strong normalization for linear logic requires elaborate rewriting techniques. In this paper we give a new presentation of MELL proof nets, without any commutative cut-elimination rule. We show how this feature induces a compact and simple proof of strong normalization, via reducibility candidates. It is the first proof of strong normalization for MELL which does not rely on any form of confluence, and so it smoothly scales up to full linear logic. Moreover, it is an axiomatic proof, as more generally it holds for every set of rewriting rules satisfying three very natural requirements with respect to substitution: commutation with promotion, full composition, and Kesner's IE property. The insight indeed comes from the theory of explicit substitutions, and from looking at the exponentials as a substitution device.

Cite as

Beniamino Accattoli. Linear Logic and Strong Normalization. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 39-54, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{accattoli:LIPIcs.RTA.2013.39,
  author =	{Accattoli, Beniamino},
  title =	{{Linear Logic and Strong Normalization}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{39--54},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.39},
  URN =		{urn:nbn:de:0030-drops-40515},
  doi =		{10.4230/LIPIcs.RTA.2013.39},
  annote =	{Keywords: linear logic, proof nets, strong normalization, explicit substitutions}
}
Document
A Combination Framework for Complexity

Authors: Martin Avanzini and Georg Moser


Abstract
In this paper we present a combination framework for the automated polynomial complexity analysis of term rewrite systems. The framework covers both derivational and runtime complexity analysis, and is employed as the theoretical foundation of the automated complexity tool TCT. We present generalisations of powerful complexity techniques, notably a generalisation of complexity pairs and (weak) dependency pairs. Finally, we also present a novel technique, called dependency graph decomposition, that greatly increases modularity in the dependency pair setting.

Cite as

Martin Avanzini and Georg Moser. A Combination Framework for Complexity. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 55-70, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{avanzini_et_al:LIPIcs.RTA.2013.55,
  author =	{Avanzini, Martin and Moser, Georg},
  title =	{{A Combination Framework for Complexity}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{55--70},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.55},
  URN =		{urn:nbn:de:0030-drops-40534},
  doi =		{10.4230/LIPIcs.RTA.2013.55},
  annote =	{Keywords: program analysis, term rewriting, complexity analysis, automation}
}
Document
Tyrolean Complexity Tool: Features and Usage

Authors: Martin Avanzini and Georg Moser


Abstract
The Tyrolean Complexity Tool, TCT for short, is an open source complexity analyser for term rewrite systems. TCT features a majority of the known techniques for the automated characterisation of polynomial complexity of rewrite systems and can investigate derivational and runtime complexity, for full and innermost rewriting. This system description outlines the features of TCT and provides a short introduction to its usage.

Cite as

Martin Avanzini and Georg Moser. Tyrolean Complexity Tool: Features and Usage. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 71-80, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{avanzini_et_al:LIPIcs.RTA.2013.71,
  author =	{Avanzini, Martin and Moser, Georg},
  title =	{{Tyrolean Complexity Tool: Features and Usage}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{71--80},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.71},
  URN =		{urn:nbn:de:0030-drops-40540},
  doi =		{10.4230/LIPIcs.RTA.2013.71},
  annote =	{Keywords: program analysis, term rewriting, complexity analysis, automation}
}
Document
Abstract Logical Model Checking of Infinite-State Systems Using Narrowing

Authors: Kyungmin Bae, Santiago Escobar, and José Meseguer


Abstract
A concurrent system can be naturally specified as a rewrite theory R = (Sigma, E, R) where states are elements of the initial algebra of terms modulo E and concurrent transitions are axiomatized by the rewrite rules R. Under simple conditions, narrowing with rules R modulo equations E can be used to symbolically represent the system's state space by means of terms with logical variables. We call this symbolic representation a "logical state space"; it can also be used for model checking of LTL properties. Since in general such a logical state space can be infinite, we propose several abstraction techniques for obtaining either an over-approximation or an under-approximation of it: (i) a folding abstraction that collapses patterns into more general ones, (ii) an easy-to-check method to define (bisimilar) equational abstractions, and (iii) an iterated bounded model checking method that can detect whether a logical state space within a given bound is complete. We also show that folding abstractions can be faithful for safety LTL properties, so that they do not generate any spurious counterexamples. These abstraction methods can be used in combination and, as we illustrate with examples, can be effective in making the logical state space finite. We have implemented these techniques in the Maude system, providing the first narrowing-based LTL model checker we are aware of.
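
A minimal sketch of the folding idea, under assumptions of our own (terms as nested tuples, variables as strings, a user-supplied successor function): a newly reached symbolic state is discarded when an already-visited state is more general, i.e. matches it.

def match(pat, term, sub=None):
    """Substitution instantiating pat's variables to yield term, or None."""
    sub = {} if sub is None else sub
    if isinstance(pat, str):                        # a variable
        if pat in sub:
            return sub if sub[pat] == term else None
        sub[pat] = term
        return sub
    if isinstance(term, str) or pat[0] != term[0] or len(pat) != len(term):
        return None
    for p, t in zip(pat[1:], term[1:]):
        sub = match(p, t, sub)
        if sub is None:
            return None
    return sub

def explore(init, successors, bound=1000):
    """Breadth-first logical state space with folding into general states."""
    visited, frontier = [], list(init)
    while frontier and bound > 0:
        s, bound = frontier.pop(0), bound - 1
        if any(match(v, s) is not None for v in visited):
            continue                                # folded: some v subsumes s
        visited.append(s)
        frontier.extend(successors(s))
    return visited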

Cite as

Kyungmin Bae, Santiago Escobar, and José Meseguer. Abstract Logical Model Checking of Infinite-State Systems Using Narrowing. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 81-96, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{bae_et_al:LIPIcs.RTA.2013.81,
  author =	{Bae, Kyungmin and Escobar, Santiago and Meseguer, Jos\'{e}},
  title =	{{Abstract Logical Model Checking of Infinite-State Systems Using Narrowing}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{81--96},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.81},
  URN =		{urn:nbn:de:0030-drops-40554},
  doi =		{10.4230/LIPIcs.RTA.2013.81},
  annote =	{Keywords: model checking, infinite states, rewrite theories, narrowing}
}
Document
Compression of Rewriting Systems for Termination Analysis

Authors: Alexander Bau, Markus Lohrey, Eric Nöth, and Johannes Waldmann


Abstract
We adapt the TreeRePair tree compression algorithm and use it as an intermediate step in proving termination of term rewriting systems. We introduce a cost function that approximates the size of constraint systems that specify compatibility of matrix interpretations. We show how to integrate the compression algorithm with the Dependency Pairs transformation. Experiments show that compression reduces running times of constraint solvers, and thus improves the power of automated termination provers.

Cite as

Alexander Bau, Markus Lohrey, Eric Nöth, and Johannes Waldmann. Compression of Rewriting Systems for Termination Analysis. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 97-112, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{bau_et_al:LIPIcs.RTA.2013.97,
  author =	{Bau, Alexander and Lohrey, Markus and N\"{o}th, Eric and Waldmann, Johannes},
  title =	{{Compression of Rewriting Systems for Termination Analysis}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{97--112},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.97},
  URN =		{urn:nbn:de:0030-drops-40561},
  doi =		{10.4230/LIPIcs.RTA.2013.97},
  annote =	{Keywords: termination of rewriting, matrix interpretations, constraint solving, tree compression}
}
Document
A Variant of Higher-Order Anti-Unification

Authors: Alexander Baumgartner, Temur Kutsia, Jordi Levy, and Mateu Villaret


Abstract
We present a rule-based anti-unification algorithm in the style of Huet for simply-typed lambda-terms in eta-long beta-normal form, which computes a least general higher-order pattern generalization. For a pair of arbitrary terms of the same type, such a generalization always exists and is unique modulo alpha-equivalence and variable renaming. The algorithm computes it in cubic time within linear space. It has been implemented and the code is freely available.
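
For intuition, the first-order analogue (Plotkin/Reynolds least general generalization) fits in a few lines: shared structure is copied, disagreeing subterms become variables, and a repeated disagreement pair reuses its variable, which is what makes the result least general. A sketch with an assumed term encoding (tuples for applications, 1-tuples for constants); the paper's algorithm handles the considerably subtler higher-order case.

def lgg(s, t, store=None):
    """Least general generalization of first-order terms s and t."""
    store = {} if store is None else store     # disagreement pair -> variable
    if isinstance(s, tuple) and isinstance(t, tuple) \
            and s[0] == t[0] and len(s) == len(t):
        return (s[0],) + tuple(lgg(a, b, store) for a, b in zip(s[1:], t[1:]))
    if (s, t) not in store:
        store[(s, t)] = "X%d" % len(store)
    return store[(s, t)]

# lgg(f(a, b, a), f(c, b, c)) = f(X0, b, X0): equal disagreements share X0.
print(lgg(('f', ('a',), ('b',), ('a',)),
          ('f', ('c',), ('b',), ('c',))))      # ('f', 'X0', ('b',), 'X0')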

Cite as

Alexander Baumgartner, Temur Kutsia, Jordi Levy, and Mateu Villaret. A Variant of Higher-Order Anti-Unification. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 113-127, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{baumgartner_et_al:LIPIcs.RTA.2013.113,
  author =	{Baumgartner, Alexander and Kutsia, Temur and Levy, Jordi and Villaret, Mateu},
  title =	{{A Variant of Higher-Order Anti-Unification}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{113--127},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.113},
  URN =		{urn:nbn:de:0030-drops-40579},
  doi =		{10.4230/LIPIcs.RTA.2013.113},
  annote =	{Keywords: higher-order anti-unification, higher-order patterns}
}
Document
Over-approximating Descendants by Synchronized Tree Languages

Authors: Yohan Boichut, Jacques Chabin, and Pierre Réty


Abstract
Over-approximating the descendants (successors) of an initial set of terms by a rewrite system is used in verification. The success of such verification methods depends on the quality of the approximation. To get better approximations, we use non-regular languages. We present a procedure that always terminates and that computes over-approximations of descendants, using synchronized tree-(tuple) languages expressed by logic programs.

Cite as

Yohan Boichut, Jacques Chabin, and Pierre Réty. Over-approximating Descendants by Synchronized Tree Languages. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 128-142, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{boichut_et_al:LIPIcs.RTA.2013.128,
  author =	{Boichut, Yohan and Chabin, Jacques and R\'{e}ty, Pierre},
  title =	{{Over-approximating Descendants by Synchronized Tree Languages}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{128--142},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.128},
  URN =		{urn:nbn:de:0030-drops-40580},
  doi =		{10.4230/LIPIcs.RTA.2013.128},
  annote =	{Keywords: rewriting systems, non-regular approximations, logic programming, tree languages, descendants}
}
Document
Unifying Nominal Unification

Authors: Christophe Calvès


Abstract
Nominal unification has been proven to be quadratic in time and space. This was achieved by two different approaches, both inspired by the Paterson-Wegman linear unification algorithm, but dramatically different in the way nominal and first-order constraints are dealt with. To handle nominal constraints, Levy and Villaret introduced the notion of replacing, while Calvès and Fernández use permutations and sets of atoms. To deal with structural constraints, the former use multi-equations in a way similar to the Martelli-Montanari algorithm, while the latter mimic Paterson-Wegman. In this paper we abstract over these two approaches and generalize them into the notion of modality, highlighting the general ideas behind nominal unification. We show that replacings and environments are in fact isomorphic. This isomorphism is of prime importance for proving intricate properties on both sides, and a step further towards the real complexity of nominal unification.

Cite as

Christophe Calvès. Unifying Nominal Unification. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 143-157, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{calves:LIPIcs.RTA.2013.143,
  author =	{Calv\`{e}s, Christophe},
  title =	{{Unifying Nominal Unification}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{143--157},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.143},
  URN =		{urn:nbn:de:0030-drops-40593},
  doi =		{10.4230/LIPIcs.RTA.2013.143},
  annote =	{Keywords: unification, nominal, alpha-equivalence, term-graph rewriting}
}
Document
Rewriting with Linear Inferences in Propositional Logic

Authors: Anupam Das


Abstract
Linear inferences are sound implications of propositional logic where each variable appears exactly once in the premiss and conclusion. We consider a specific set of these inferences, MS, first studied by Straßburger, corresponding to the logical rules in deep inference proof theory. Despite previous results characterising the individual rules of MS, we show that there is no polynomial-time characterisation of MS, assuming that integers cannot be factorised in polynomial time. We also examine the length of rewrite paths in an extended system MSU that also has unit equations, utilising a notion dubbed trivialisation to reduce the case with units to the case without, amongst other observations on MS-rewriting and the set of linear inferences in general.
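
MS is generated by two linear rules applied modulo associativity and commutativity: switch, x ∧ (y ∨ z) → (x ∧ y) ∨ z, and medial, (w ∧ x) ∨ (y ∧ z) → (w ∨ y) ∧ (x ∨ z). Each variable occurs exactly once in premiss and conclusion, and soundness can be checked by brute force over all valuations, as in this small Python sketch.

from itertools import product

def switch(x, y, z):        # premiss and conclusion of switch
    return x and (y or z), (x and y) or z

def medial(w, x, y, z):     # premiss and conclusion of medial
    return (w and x) or (y and z), (w or y) and (x or z)

def sound(rule, arity):
    """The premiss implies the conclusion on every valuation."""
    return all(not p or c
               for p, c in (rule(*v)
                            for v in product([False, True], repeat=arity)))

print(sound(switch, 3), sound(medial, 4))   # True True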

Cite as

Anupam Das. Rewriting with Linear Inferences in Propositional Logic. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 158-173, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{das:LIPIcs.RTA.2013.158,
  author =	{Das, Anupam},
  title =	{{Rewriting with Linear Inferences in Propositional Logic}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{158--173},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.158},
  URN =		{urn:nbn:de:0030-drops-40609},
  doi =		{10.4230/LIPIcs.RTA.2013.158},
  annote =	{Keywords: proof theory, propositional logic, complexity of proofs, deep inference}
}
Document
Proof Orders for Decreasing Diagrams

Authors: Bertram Felgenhauer and Vincent van Oostrom


Abstract
We present and compare some well-founded proof orders for decreasing diagrams. These proof orders order a conversion above another conversion if the latter is obtained by filling any peak in the former by a (locally) decreasing diagram. Therefore each such proof order entails the decreasing diagrams technique for proving confluence. The proof orders differ with respect to monotonicity and complexity. Our results are developed in the setting of involutive monoids. We extend these results to obtain a decreasing diagrams technique for confluence modulo.

Cite as

Bertram Felgenhauer and Vincent van Oostrom. Proof Orders for Decreasing Diagrams. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 174-189, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{felgenhauer_et_al:LIPIcs.RTA.2013.174,
  author =	{Felgenhauer, Bertram and van Oostrom, Vincent},
  title =	{{Proof Orders for Decreasing Diagrams}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{174--189},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.174},
  URN =		{urn:nbn:de:0030-drops-40616},
  doi =		{10.4230/LIPIcs.RTA.2013.174},
  annote =	{Keywords: involutive monoid, confluence modulo, decreasing diagram, proof order}
}
Document
Decidable structures between Church-style and Curry-style

Authors: Ken-etsu Fujita and Aleksy Schubert


Abstract
It is well-known that the type-checking and type-inference problems are undecidable for the second order lambda-calculus in Curry-style, although they are decidable in Church-style. What causes this difference in decidability between the two styles? We examine crucial conditions on terms for the (un)decidability of these problems from the viewpoint of partially typed terms, asking what kinds of type annotations are essential. It is revealed that there exists an intermediate structure of second order lambda-terms, called the style of hole-application, between Church-style and Curry-style, such that the type-related problems are decidable under this structure. We also extend this idea to the omega-order polymorphic calculus F-omega, and show that the type-checking and type-inference problems then become undecidable.
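
The decidability contrast stems from annotations: a Church-style term carries the type of every binder, so type-checking is a straightforward recursion, whereas Curry-style erases the annotations and the checker must guess them, which for second order polymorphism is undecidable. The sketch below shows the Church-style recursion for the simply-typed case only; it is our own miniature encoding, meant to make the role of annotations concrete, not the paper's second order setting.

def typecheck(term, env=None):
    """Church-style simply-typed terms: variables are strings,
    ('lam', x, ty, body) carries its binder annotation, ('app', f, a)."""
    env = {} if env is None else env
    if isinstance(term, str):
        return env[term]
    if term[0] == 'lam':
        _, x, ty, body = term
        return ('->', ty, typecheck(body, {**env, x: ty}))
    if term[0] == 'app':
        fty, aty = typecheck(term[1], env), typecheck(term[2], env)
        if fty[0] == '->' and fty[1] == aty:
            return fty[2]
    raise TypeError("ill-typed term")

print(typecheck(('lam', 'x', 'int', 'x')))   # ('->', 'int', 'int')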

Cite as

Ken-etsu Fujita and Aleksy Schubert. Decidable structures between Church-style and Curry-style. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 190-205, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{fujita_et_al:LIPIcs.RTA.2013.190,
  author =	{Fujita, Ken-etsu and Schubert, Aleksy},
  title =	{{Decidable structures between Church-style and Curry-style}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{190--205},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.190},
  URN =		{urn:nbn:de:0030-drops-40629},
  doi =		{10.4230/LIPIcs.RTA.2013.190},
  annote =	{Keywords: 2nd-order lambda-calculus, type-checking, type-inference, Church-style and Curry-style}
}
Document
Expressibility in the Lambda Calculus with Mu

Authors: Clemens Grabmayer and Jan Rochel


Abstract
We address a problem connected to the unfolding semantics of functional programming languages: give a useful characterization of those infinite lambda-terms that are lambda-letrec-expressible in the sense that they arise as infinite unfoldings of terms in lambda-letrec, the lambda-calculus with letrec. We provide two characterizations, using concepts we introduce for infinite lambda-terms: regularity, strong regularity, and binding–capturing chains. It turns out that lambda-letrec-expressible infinite lambda-terms form a proper subclass of the regular infinite lambda-terms. In this paper we establish these characterizations only for expressibility in lambda-mu, the lambda-calculus with explicit mu-recursion. We show that for all infinite lambda-terms T the following are equivalent: (i): T is lambda-mu-expressible; (ii): T is strongly regular; (iii): T is regular, and it only has finite binding–capturing chains. We define regularity and strong regularity for infinite lambda-terms as two different generalizations of regularity for infinite first-order terms: as the existence of only finitely many subterms that are defined as the reducts of two rewrite systems for decomposing lambda-terms. These rewrite systems act on infinite lambda-terms furnished with a bracketed prefix of abstractions for collecting decomposed lambda-abstractions and keeping the terms closed under decomposition. They differ in which vacuous abstractions in the prefix are removed.

Cite as

Clemens Grabmayer and Jan Rochel. Expressibility in the Lambda Calculus with Mu. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 206-222, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{grabmayer_et_al:LIPIcs.RTA.2013.206,
  author =	{Grabmayer, Clemens and Rochel, Jan},
  title =	{{Expressibility in the Lambda Calculus with Mu}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{206--222},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.206},
  URN =		{urn:nbn:de:0030-drops-40635},
  doi =		{10.4230/LIPIcs.RTA.2013.206},
  annote =	{Keywords: lambda-calculus, lambda-calculus with letrec, unfolding semantics, regularity for infinite lambda-terms, binding-capturing chain}
}
Document
A Homotopical Completion Procedure with Applications to Coherence of Monoids

Authors: Yves Guiraud, Philippe Malbos, and Samuel Mimram


Abstract
One of the most used algorithms in rewriting theory is the Knuth-Bendix completion procedure, which starts from a terminating rewriting system and iteratively adds rules to it, trying to produce an equivalent convergent rewriting system. It is in particular used to study presentations of monoids, since normal forms of the rewriting system provide canonical representatives of words modulo the congruence generated by the rules. Here, we are interested in extending this procedure in order to retrieve information about the low-dimensional homotopy properties of a monoid. We therefore consider the notion of coherent presentation, which is a generalization of rewriting systems that keeps track of the cells generated by confluence diagrams. We extend the Knuth-Bendix completion procedure to this setting, resulting in a homotopical completion procedure. It is based on a generalization of Tietze transformations, which are operations that can be iteratively applied to relate any two presentations of the same monoid. We also explain how these transformations can be used to remove useless generators, rules, or confluence diagrams in a coherent presentation, thus leading to a homotopical reduction procedure. Finally, we apply these techniques to the study of some examples coming from representation theory, computing minimal coherent presentations for the braid, plactic and Chinese monoids.
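
For readers who want the classical procedure being generalized in executable form, here is a naive Knuth-Bendix completion loop for string rewriting (i.e. monoid presentations), a sketch under simplifications of our own: shortlex orientation, no interreduction, and a round bound in place of a fairness argument.

from itertools import product

def shortlex_greater(u, v):
    return (len(u), u) > (len(v), v)

def normalize(w, rules):
    changed = True
    while changed:
        changed = False
        for l, r in rules:
            i = w.find(l)
            if i >= 0:
                w = w[:i] + r + w[i + len(l):]
                changed = True
    return w

def critical_pairs(rule1, rule2):
    (l1, r1), (l2, r2) = rule1, rule2
    for k in range(1, min(len(l1), len(l2))):   # suffix of l1 = prefix of l2
        if l1[-k:] == l2[:k]:
            yield r1 + l2[k:], l1[:-k] + r2
    if rule1 != rule2 and l2 in l1:             # l2 occurs inside l1
        i = l1.find(l2)
        yield r1, l1[:i] + r2 + l1[i + len(l2):]

def complete(rules, max_rounds=20):
    rules = list(rules)
    for _ in range(max_rounds):
        new = []
        for r1, r2 in product(rules, repeat=2):
            for p, q in critical_pairs(r1, r2):
                p, q = normalize(p, rules), normalize(q, rules)
                if p != q:
                    rule = (p, q) if shortlex_greater(p, q) else (q, p)
                    if rule not in rules and rule not in new:
                        new.append(rule)
        if not new:
            return rules                        # every critical pair joins
        rules += new
    raise RuntimeError("no convergent system within the bound")

# The monoid < a, b | aba = 1 >: completion adds ba -> ab and aab -> 1.
print(complete([("aba", "")]))   # [('aba', ''), ('ba', 'ab'), ('aab', '')]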

Cite as

Yves Guiraud, Philippe Malbos, and Samuel Mimram. A Homotopical Completion Procedure with Applications to Coherence of Monoids. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 223-238, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{guiraud_et_al:LIPIcs.RTA.2013.223,
  author =	{Guiraud, Yves and Malbos, Philippe and Mimram, Samuel},
  title =	{{A Homotopical Completion Procedure with Applications to Coherence of Monoids}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{223--238},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.223},
  URN =		{urn:nbn:de:0030-drops-40649},
  doi =		{10.4230/LIPIcs.RTA.2013.223},
  annote =	{Keywords: higher-dimensional rewriting, presentation of monoid, Knuth-Bendix completion, Tietze transformation, low-dimensional homotopy for monoids, coherence}
}
Document
Extending Abramsky's Lazy Lambda Calculus: (Non)-Conservativity of Embeddings

Authors: Manfred Schmidt-Schauß, Elena Machkasova, and David Sabel


Abstract
Our motivation is the question whether the lazy lambda calculus, a pure lambda calculus with the leftmost outermost rewriting strategy, considered under observational semantics, or extensions thereof, are an adequate model for semantic equivalences in real-world purely functional programming languages, in particular for a pure core language of Haskell. We explore several extensions of the lazy lambda calculus: addition of a seq-operator, addition of data constructors and case-expressions, and their combination, focusing on conservativity of these extensions. In addition to untyped calculi, we study their monomorphically and polymorphically typed versions. For most of the extensions we obtain non-conservativity which we prove by providing counterexamples. However, we prove conservativity of the extension by data constructors and case in the monomorphically typed scenario.

Cite as

Manfred Schmidt-Schauß, Elena Machkasova, and David Sabel. Extending Abramsky's Lazy Lambda Calculus: (Non)-Conservativity of Embeddings. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 239-254, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{schmidtschau_et_al:LIPIcs.RTA.2013.239,
  author =	{Schmidt-Schau{\ss}, Manfred and Machkasova, Elena and Sabel, David},
  title =	{{Extending Abramsky's Lazy Lambda Calculus: (Non)-Conservativity of Embeddings}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{239--254},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.239},
  URN =		{urn:nbn:de:0030-drops-40651},
  doi =		{10.4230/LIPIcs.RTA.2013.239},
  annote =	{Keywords: lazy lambda calculus, contextual semantics, conservativity}
}
Document
Algorithms for Extended Alpha-Equivalence and Complexity

Authors: Manfred Schmidt-Schauß, Conrad Rau, and David Sabel


Abstract
Equality of expressions in lambda-calculi, higher-order programming languages, higher-order programming calculi and process calculi is defined as alpha-equivalence. Permutability of bindings in let-constructs and structural congruence axioms extend alpha-equivalence. We analyse these extended alpha-equivalences and show that there are calculi with polynomial time algorithms, that a multiple-binding "let" may make alpha-equivalence as hard as finding graph-isomorphisms, and that the replication operator in the pi-calculus may lead to an EXPSPACE-hard alpha-equivalence problem.
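
Plain alpha-equivalence, the polynomial-time base case underlying the paper's extensions, can be decided by comparing binder depths (de Bruijn levels) rather than names. A small Python sketch with an assumed term encoding:

def alpha_eq(s, t, ds=None, dt=None, depth=0):
    """Variables are strings, ('lam', x, body) binds x, ('app', f, a)
    applies. Bound variables are compared by binder depth, free by name."""
    ds = {} if ds is None else ds
    dt = {} if dt is None else dt
    if isinstance(s, str) and isinstance(t, str):
        if (s in ds) != (t in dt):
            return False                 # one bound, one free
        return ds[s] == dt[t] if s in ds else s == t
    if isinstance(s, tuple) and isinstance(t, tuple) and s[0] == t[0]:
        if s[0] == 'lam':
            return alpha_eq(s[2], t[2], {**ds, s[1]: depth},
                            {**dt, t[1]: depth}, depth + 1)
        if s[0] == 'app':
            return (alpha_eq(s[1], t[1], ds, dt, depth) and
                    alpha_eq(s[2], t[2], ds, dt, depth))
    return False

print(alpha_eq(('lam', 'x', ('lam', 'y', 'x')),
               ('lam', 'a', ('lam', 'b', 'a'))))   # True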

Cite as

Manfred Schmidt-Schauß, Conrad Rau, and David Sabel. Algorithms for Extended Alpha-Equivalence and Complexity. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 255-270, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{schmidtschau_et_al:LIPIcs.RTA.2013.255,
  author =	{Schmidt-Schau{\ss}, Manfred and Rau, Conrad and Sabel, David},
  title =	{{Algorithms for Extended Alpha-Equivalence and Complexity}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{255--270},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.255},
  URN =		{urn:nbn:de:0030-drops-40667},
  doi =		{10.4230/LIPIcs.RTA.2013.255},
  annote =	{Keywords: alpha-equivalence, higher-order calculi, deduction, pi-calculus}
}
Document
Unification Modulo Nonnested Recursion Schemes via Anchored Semi-Unification

Authors: Gert Smolka and Tobias Tebbi


Abstract
A recursion scheme is an orthogonal rewriting system with rules of the form f(x1,...,xn) -> s. We consider terms to be equivalent if they rewrite to the same redex-free, possibly infinite term after infinitary rewriting. For the restriction to the nonnested case, where nested redexes are forbidden, we prove the existence of principal unifiers modulo scheme equivalence. We give an algorithm computing principal unifiers by reducing the problem to a novel fragment of semi-unification that we call anchored semi-unification. For anchored semi-unification, we develop a decision algorithm that returns a principal semi-unifier in the positive case.

Cite as

Gert Smolka and Tobias Tebbi. Unification Modulo Nonnested Recursion Schemes via Anchored Semi-Unification. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 271-286, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{smolka_et_al:LIPIcs.RTA.2013.271,
  author =	{Smolka, Gert and Tebbi, Tobias},
  title =	{{Unification Modulo Nonnested Recursion Schemes via Anchored Semi-Unification}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{271--286},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.271},
  URN =		{urn:nbn:de:0030-drops-40674},
  doi =		{10.4230/LIPIcs.RTA.2013.271},
  annote =	{Keywords: recursion schemes, semi-unification, infinitary rewriting}
}
Document
Formalizing Knuth-Bendix Orders and Knuth-Bendix Completion

Authors: Christian Sternagel and René Thiemann


Abstract
We present extensions of our Isabelle Formalization of Rewriting that cover two historically related concepts: the Knuth-Bendix order and the Knuth-Bendix completion procedure. The former, besides being the first development of its kind in a proof assistant, is based on a generalized version of the Knuth-Bendix order. We compare our version to variants from the literature and show all properties required to certify termination proofs of TRSs. The latter comprises the formalization of important facts that are related to completion, like Birkhoff's theorem, the critical pair theorem, and a soundness proof of completion, showing that the strict encompassment condition is superfluous for finite runs. As a result, we are able to certify completion proofs.
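
For reference, the core of the Knuth-Bendix order is short. The Python sketch below implements the textbook definition under a simplifying assumption of ours, namely that every symbol weight is at least 1, so the special case for unary symbols of weight zero (which the formalization does cover) cannot arise.

from collections import Counter

# Terms: variables are strings; applications are tuples ('f', arg1, ...).

def var_count(t, c=None):
    c = Counter() if c is None else c
    if isinstance(t, str):
        c[t] += 1
    else:
        for s in t[1:]:
            var_count(s, c)
    return c

def weight(t, w, w0=1):
    if isinstance(t, str):
        return w0
    return w[t[0]] + sum(weight(s, w, w0) for s in t[1:])

def kbo_greater(s, t, w, prec):
    """s >_KBO t for weights w (all >= 1) and precedence list prec."""
    vs, vt = var_count(s), var_count(t)
    if any(vs[x] < n for x, n in vt.items()):   # variable condition
        return False
    ws, wt = weight(s, w), weight(t, w)
    if ws != wt:
        return ws > wt
    if isinstance(s, str) or isinstance(t, str):
        return False                            # equal weight, some variable
    if s[0] != t[0]:
        return prec.index(s[0]) > prec.index(t[0])
    for si, ti in zip(s[1:], t[1:]):
        if si != ti:
            return kbo_greater(si, ti, w, prec) # lexicographic descent
    return False

# Associativity is oriented from left to right:
print(kbo_greater(('f', ('f', 'x', 'y'), 'z'),
                  ('f', 'x', ('f', 'y', 'z')), {'f': 1}, ['f']))   # True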

Cite as

Christian Sternagel and René Thiemann. Formalizing Knuth-Bendix Orders and Knuth-Bendix Completion. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 287-302, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{sternagel_et_al:LIPIcs.RTA.2013.287,
  author =	{Sternagel, Christian and Thiemann, Ren\'{e}},
  title =	{{Formalizing Knuth-Bendix Orders and Knuth-Bendix Completion}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{287--302},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.287},
  URN =		{urn:nbn:de:0030-drops-40685},
  doi =		{10.4230/LIPIcs.RTA.2013.287},
  annote =	{Keywords: certification, completion, confluence, termination}
}
Document
Automatic Decidability: A Schematic Calculus for Theories with Counting Operators

Authors: Elena Tushkanova, Christophe Ringeissen, Alain Giorgetti, and Olga Kouchnarenko


Abstract
Many verification problems can be reduced to a satisfiability problem modulo theories. For building satisfiability procedures, the rewriting-based approach uses a general calculus for equational reasoning named paramodulation. Schematic paramodulation, in turn, provides means to reason on the derivations computed by paramodulation. Until now, schematic paramodulation was only studied for standard paramodulation. We present a schematic paramodulation calculus modulo a fragment of arithmetic, namely the theory of Integer Offsets. This new schematic calculus is used to prove the decidability of the satisfiability problem for some theories equipped with counting operators. We illustrate our theoretical contribution on theories representing extensions of classical data structures, e.g., lists and records. An implementation within the rewriting-based Maude system constitutes a practical contribution. It enables automatic decidability proofs for theories of practical use.

Cite as

Elena Tushkanova, Christophe Ringeissen, Alain Giorgetti, and Olga Kouchnarenko. Automatic Decidability: A Schematic Calculus for Theories with Counting Operators. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 303-318, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{tushkanova_et_al:LIPIcs.RTA.2013.303,
  author =	{Tushkanova, Elena and Ringeissen, Christophe and Giorgetti, Alain and Kouchnarenko, Olga},
  title =	{{Automatic Decidability: A Schematic Calculus for Theories with Counting Operators}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{303--318},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.303},
  URN =		{urn:nbn:de:0030-drops-40696},
  doi =		{10.4230/LIPIcs.RTA.2013.303},
  annote =	{Keywords: decision procedures, superposition, schematic saturation}
}
Document
Normalized Completion Revisited

Authors: Sarah Winkler and Aart Middeldorp


Abstract
Normalized completion (Marché 1996) is a widely applicable and efficient technique for completion modulo theories. If successful, a normalized completion procedure computes a rewrite system that allows one to decide the validity problem using normalized rewriting. In this paper we consider a slightly simplified inference system for finite normalized completion runs. We prove correctness, show faithfulness of critical pair criteria in our setting, and propose a different notion of normalizing pairs. We then show how normalized completion procedures can benefit from AC-termination tools instead of relying on a fixed AC-compatible reduction order. We outline our implementation of this approach in the completion tool mkbtt and present experimental results, including new completions.

Cite as

Sarah Winkler and Aart Middeldorp. Normalized Completion Revisited. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 319-334, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{winkler_et_al:LIPIcs.RTA.2013.319,
  author =	{Winkler, Sarah and Middeldorp, Aart},
  title =	{{Normalized Completion Revisited}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{319--334},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.319},
  URN =		{urn:nbn:de:0030-drops-40702},
  doi =		{10.4230/LIPIcs.RTA.2013.319},
  annote =	{Keywords: term rewriting, completion}
}
Document
Beyond Peano Arithmetic – Automatically Proving Termination of the Goodstein Sequence

Authors: Sarah Winkler, Harald Zankl, and Aart Middeldorp


Abstract
Kirby and Paris (1982) proved in a celebrated paper that a theorem of Goodstein (1944) cannot be established in Peano (1889) arithmetic. We present an encoding of Goodstein's theorem as a termination problem of a finite rewrite system. Using a novel implementation of ordinal interpretations, we are able to automatically prove termination of this system, resulting in the first automatic termination proof for a system whose derivational complexity is not multiple recursive. Our method can also cope with the encoding by Touzet (1998) of the battle of Hercules and Hydra, yet another system which has been out of reach for automated tools, until now.
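
The encoded theorem is easy to state computationally: write n in hereditary base b, syntactically replace every b by b + 1, subtract one, and repeat with the next base; Goodstein's theorem says the sequence always reaches 0. A short Python sketch of the sequence itself (not of the paper's rewrite-system encoding):

def bump(n, b):
    """Value of n after rewriting its hereditary base-b representation
    with every occurrence of b (also in exponents) replaced by b + 1."""
    result, exp = 0, 0
    while n > 0:
        n, digit = divmod(n, b)
        if digit:
            result += digit * (b + 1) ** bump(exp, b)
        exp += 1
    return result

def goodstein(n, steps=6):
    seq, base = [n], 2
    while seq[-1] > 0 and len(seq) <= steps:
        seq.append(bump(seq[-1], base) - 1)
        base += 1
    return seq

print(goodstein(3))   # [3, 3, 3, 2, 1, 0]: reaches 0 quickly
print(goodstein(4))   # [4, 26, 41, 60, 83, 109, 139]: astronomically far from 0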

Cite as

Sarah Winkler, Harald Zankl, and Aart Middeldorp. Beyond Peano Arithmetic – Automatically Proving Termination of the Goodstein Sequence. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 335-351, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{winkler_et_al:LIPIcs.RTA.2013.335,
  author =	{Winkler, Sarah and Zankl, Harald and Middeldorp, Aart},
  title =	{{Beyond Peano Arithmetic – Automatically Proving Termination of the Goodstein Sequence}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{335--351},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.335},
  URN =		{urn:nbn:de:0030-drops-40718},
  doi =		{10.4230/LIPIcs.RTA.2013.335},
  annote =	{Keywords: term rewriting, termination, automation, ordinals}
}
Document
Confluence by Decreasing Diagrams – Formalized

Authors: Harald Zankl


Abstract
This paper presents a formalization of decreasing diagrams in the theorem prover Isabelle. It discusses mechanical proofs showing that any locally decreasing abstract rewrite system is confluent. The valley and the conversion version of decreasing diagrams are considered.

Cite as

Harald Zankl. Confluence by Decreasing Diagrams – Formalized. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 352-367, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{zankl:LIPIcs.RTA.2013.352,
  author =	{Zankl, Harald},
  title =	{{Confluence by Decreasing Diagrams – Formalized}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{352--367},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.352},
  URN =		{urn:nbn:de:0030-drops-40723},
  doi =		{10.4230/LIPIcs.RTA.2013.352},
  annote =	{Keywords: term rewriting, confluence, decreasing diagrams, formalization}
}
