eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
0
0
10.4230/LIPIcs.RTA.2010
article
LIPIcs, Volume 6, RTA'10, Complete Volume
Lynch, Christopher
LIPIcs, Volume 6, RTA'10, Complete Volume
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010/LIPIcs.RTA.2010.pdf
Programming Techniques, Software Engineering, Programming Languages, Computation by Abstract Devices, Analysis of Algorithms and Problem Complexity, Logics and Meanings of Programs, Mathematical Logic and Formal Languages, Symbolic and Algebraic Manipulation, Artificial Intelligence
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
i
xii
10.4230/LIPIcs.RTA.2010.i
article
Frontmatter (Titlepage, Table of Contents, Author List, PC List, Reviewer List)
Lynch, Christopher
Front matter including table of contents, author list, PC list, and reviewer list.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.i/LIPIcs.RTA.2010.i.pdf
Table of Contents
Author List
PC List
Reviewer List
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
13
14
10.4230/LIPIcs.RTA.2010.XIII
article
Preface
Lynch, Christopher
Preface
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.XIII/LIPIcs.RTA.2010.XIII.pdf
Preface
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
1
4
10.4230/LIPIcs.RTA.2010.1
article
Automata for Data Words and Data Trees
Bojanczyk, Mikolaj
Data words and data trees appear in verification and XML processing. The term "data" means that positions of the word, or tree, are decorated with elements of an infinite set of data values, such as natural numbers or ASCII strings. This talk is a survey of the various automaton models that have been developed for data words and data trees.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.1/LIPIcs.RTA.2010.1.pdf
Automata
Data Word
Data Tree
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
5
6
10.4230/LIPIcs.RTA.2010.5
article
Realising Optimal Sharing
van Oostrom, Vincent
Realising Optimal Sharing
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.5/LIPIcs.RTA.2010.5.pdf
Optimal Sharing
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
7
16
10.4230/LIPIcs.RTA.2010.7
article
Automated Confluence Proof by Decreasing Diagrams based on Rule-Labelling
Aoto, Takahito
The decreasing diagrams technique (van Oostrom, 1994) can be widely
applied to prove confluence of rewrite systems. To directly apply the
decreasing diagrams technique to prove confluence of rewrite systems,
the rule-labelling heuristic was proposed by van Oostrom (2008). We
show how the constraints for ensuring confluence of term rewriting
systems that arise from the rule-labelling heuristic can be encoded as
linear arithmetic constraints, whose satisfiability can then be
checked by external SMT solvers. We point out an additional
constraint, omitted in (van Oostrom, 2008), that is needed to
guarantee the soundness of confluence proofs based on the
rule-labelling heuristic extended to deal with non-right-linear rules.
We also present several extensions of the rule-labelling heuristic
that enlarge the applicability of the technique.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.7/LIPIcs.RTA.2010.7.pdf
Confluence
Decreasing Diagrams
Automation
Term Rewriting Systems
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
17
32
10.4230/LIPIcs.RTA.2010.17
article
Higher-Order (Non-)Modularity
Appel, Claus
van Oostrom, Vincent
Simonsen, Jakob Grue
We show that, contrary to the situation in first-order
term rewriting, almost none of the usual properties of
rewriting are modular for higher-order rewriting, irrespective
of the higher-order rewriting format.
However, we show that for the particular format of simply typed
applicative term rewriting systems, modularity of confluence,
normalization, and termination can be recovered
by imposing suitable linearity constraints.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.17/LIPIcs.RTA.2010.17.pdf
Higher-order rewriting
modularity
termination
normalization
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
33
48
10.4230/LIPIcs.RTA.2010.33
article
Closing the Gap Between Runtime Complexity and Polytime Computability
Avanzini, Martin
Moser, Georg
In earlier work, we have shown
that for confluent TRSs, innermost polynomial runtime complexity
induces polytime computability of the functions defined.
In this paper, we generalise this result to full rewriting; to this
end, we exploit graph rewriting. We give a new proof of the adequacy of
graph rewriting for full rewriting that allows for
precise control of the resources copied. In sum, we give a
complete description of an implementation of rewriting
on a Turing machine (TM for short). We show that the runtime complexity of
the TRS and the runtime complexity of the TM are polynomially
related.
Our result strengthens the evidence that the complexity of
a rewrite system is truthfully represented through the length
of derivations. Moreover, our result allows the classification of
nondeterministic polytime computation based on runtime
complexity analysis of rewrite systems.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.33/LIPIcs.RTA.2010.33.pdf
Term rewriting
graph rewriting
complexity analysis
polytime computability
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
49
66
10.4230/LIPIcs.RTA.2010.49
article
Abstract Models of Transfinite Reductions
Bahr, Patrick
We investigate transfinite reductions in abstract reduction
systems. To this end, we study two abstract models for transfinite
reductions: a metric model generalising the usual metric approach to
infinitary term rewriting and a novel partial order model. For both
models we distinguish between a weak and a strong variant of
convergence as known from infinitary term rewriting. Furthermore, we
introduce an axiomatic model of reductions that is general enough to
cover all of these models of transfinite reductions as well as the
ordinary model of finite reductions. It is shown that, in this
unifying axiomatic model, many basic relations between termination and
confluence properties known from finite reductions still hold. The
introduced models are applied not only to term rewriting but also to term graph
rewriting. We show that for both term rewriting and
term graph rewriting, the partial order model forms a conservative
extension of the metric model.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.49/LIPIcs.RTA.2010.49.pdf
Infinitary rewriting
metric
partial order
abstract reduction system
axiomatic
term rewriting
graph rewriting
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
67
84
10.4230/LIPIcs.RTA.2010.67
article
Partial Order Infinitary Term Rewriting and Böhm Trees
Bahr, Patrick
We investigate an alternative model of infinitary term
rewriting. Instead of a metric, a partial order on terms is employed
to formalise (strong) convergence. We compare this partial order
convergence of orthogonal term rewriting systems to the usual metric
convergence of the corresponding B{"o}hm extensions. The B{"o}hm extension
of a term rewriting system contains additional rules to equate
so-called root-active terms. The core result we present is that
reachability w.r.t. partial order convergence coincides with
reachability w.r.t. metric convergence in the Böhm extension. This
result is used to show that, unlike in the metric model, orthogonal
systems are infinitarily confluent and infinitarily normalising in the
partial order model. Moreover, we obtain, as in the metric model, a
compression lemma. A corollary of this lemma is that reachability
w.r.t. partial order convergence is a conservative extension of
reachability w.r.t. metric convergence.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.67/LIPIcs.RTA.2010.67.pdf
Infinitary term rewriting
Böhm trees
partial order
confluence
normalisation
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
85
102
10.4230/LIPIcs.RTA.2010.85
article
Unique Normal Forms in Infinitary Weakly Orthogonal Rewriting
Endrullis, Joerg
Grabmayer, Clemens
Hendriks, Dimitri
Klop, Jan Willem
van Oostrom, Vincent
We present some contributions to the theory of infinitary rewriting
for weakly orthogonal term rewrite systems, in which critical pairs
may occur provided they are trivial.
We show that the infinitary unique normal form property (UNinf)
fails, as witnessed by a simple example of a weakly orthogonal TRS with two
collapsing rules. By translating this example, we show that UNinf
also fails for the infinitary lambda-beta-eta-calculus.
As positive results we obtain the following: Infinitary confluence,
and hence UNinf, holds for weakly orthogonal TRSs that do not contain
collapsing rules. To this end we refine the compression lemma.
Furthermore, we consider the triangle and diamond properties
for infinitary developments in weakly orthogonal TRSs,
by refining an earlier cluster-analysis for the finite case.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.85/LIPIcs.RTA.2010.85.pdf
Weakly orthogonal term rewrite systems
unique normal form property
infinitary rewriting
infinitary lambda-beta-eta-calculus
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
103
118
10.4230/LIPIcs.RTA.2010.103
article
The Undecidability of Type Related Problems in Type-free Style System F
Fujita, Ken-Etsu
Schubert, Aleksy
We consider here a number of variations on System F, which are predicative second-order systems whose terms are intermediate between the Curry style and Church style. The terms here contain the information on where the universal quantifier elimination and introduction in the type inference process must take place, which is similar to Church forms. However, they omit the information on which types are involved in the rules, which is similar to Curry forms. In this paper we prove the undecidability of the type-checking, type inference and typability problems for the system. Moreover, the proof works for the predicative version of the system with finitely stratified polymorphic types. The result includes bounds on the Leivant level numbers for types used in the instances leading to the undecidability.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.103/LIPIcs.RTA.2010.103.pdf
Lambda calculus and related systems
type checking
typability
partial type inference
2nd order unification
undecidability
Curry style type system
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
119
134
10.4230/LIPIcs.RTA.2010.119
article
On (Un)Soundness of Unravelings
Gmeiner, Karl
Gramlich, Bernhard
Schernhammer, Felix
We revisit (un)soundness of transformations of conditional into
unconditional rewrite systems. The focus here is on so-called
unravelings, the most simple and natural kind of such
transformations, for the class of normal conditional systems without
extra variables. By a systematic and thorough study of existing
counterexamples and of the potential sources of unsoundness we
obtain several new positive and negative results. In particular, we
prove the following new results: Confluence, non-erasingness and
weak left-linearity (of a given conditional system) each guarantee
soundness of the unraveled version w.r.t. the original one. The
latter result substantially extends the only known sufficient
criterion for soundness, namely left-linearity. Furthermore, by
means of counterexamples we refute various other tempting
conjectures about sufficient conditions for soundness.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.119/LIPIcs.RTA.2010.119.pdf
Conditional rewriting
transformation into unconditional systems
unsoundness
unraveling
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
135
150
10.4230/LIPIcs.RTA.2010.135
article
A Proof Calculus Which Reduces Syntactic Bureaucracy
Guglielmi, Alessio
Gundersen, Tom
Parigot, Michel
In usual proof systems, like the sequent calculus, only a very limited way of combining proofs is available through the tree structure. We present in this paper a logic-independent proof calculus, where proofs can be freely composed by connectives, and prove its basic properties. The main advantage of this proof calculus is that it makes it possible to avoid certain types of syntactic bureaucracy inherent to all usual proof systems, in particular the sequent calculus. Proofs in this system closely reflect their atomic flow, which traces the behaviour of atoms through structural rules. The general definition is illustrated by the standard deep-inference system for propositional logic, for which there are known rewriting techniques that achieve cut elimination based only on the information in atomic flows.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.135/LIPIcs.RTA.2010.135.pdf
Logic
Proof theory
Deep Inference
Flow graphs
Proof Systems
Open Deduction
Rewriting
Confluence
Termination
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
151
160
10.4230/LIPIcs.RTA.2010.151
article
A Rewriting Logic Semantics Approach to Modular Program Analysis
Hills, Mark
Rosu, Grigore
The K framework, based on rewriting logic semantics, provides a powerful logic for defining the semantics of programming languages. While most work in this area has focused on defining an evaluation semantics for a language, it is also possible to define an abstract semantics that can be used for program analysis. Using the SILF language (Hills, Serbanuta and Rosu, 2007), this paper describes one technique for defining such a semantics: policy frameworks. In policy frameworks, an analysis-generic, modular framework is first defined for a language. Individual analyses, called policies, are then defined as extensions of this framework, with each policy defining analysis-specific semantic rules and an annotation language which, in combination with support in the language front-end, allows users to annotate program types and functions with information used during program analysis. Standard term rewriting techniques are used to analyze programs by evaluating them in the policy semantics.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.151/LIPIcs.RTA.2010.151.pdf
K
rewriting logic semantics
program analysis
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
161
176
10.4230/LIPIcs.RTA.2010.161
article
Infinitary Rewriting: Foundations Revisited
Kahrs, Stefan
Infinitary term rewriting allows one to express infinitary terms and infinitary reductions
that converge to them. As notions of transfinite reduction in general,
and as binary relations in particular, two concepts have been studied
in the past: strongly and weakly convergent reductions;
in the last decade, research has mostly focused on the former.
Finitary rewriting has a strong connection to the equational theory of its rule set:
if the rewrite system is confluent this (implies consistency of the theory and) gives rise to a semi-decision procedure for the theory,
and if the rewrite system is in addition terminating this becomes a decision procedure. This connection
is the original reason for the study of these properties in rewriting.
For infinitary rewriting there is barely an established notion of an equational theory.
The reason this issue is not trivial is that such a theory would need to include
some form of ``getting to limits'', and there are different options one can pursue.
These options are being looked at here, as well as several alternatives for the notion of reduction relation
and their relationships to these equational theories.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.161/LIPIcs.RTA.2010.161.pdf
Infinitary rewriting
equivalence
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
177
192
10.4230/LIPIcs.RTA.2010.177
article
Underspecified computation of normal forms
Koller, Alexander
Thater, Stefan
We show how to compute readings of ambiguous natural language sentences that are minimal in some way. Formally, we consider the problem of computing, out of a set C of trees and a rewrite system R, those trees in C that cannot be rewritten into a tree in C. We solve the problem for sets of trees that are described by semantic representations typically used in computational linguistics, and a certain class of rewrite systems that we use to approximate entailment, and show how to compute the irreducible trees efficiently by intersecting tree automata. Our algorithm solves the problem of computing weakest readings that has been open for 25 years in computational linguistics.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.177/LIPIcs.RTA.2010.177.pdf
Rewrite systems
tree automata
normal forms
computational linguistics
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
193
208
10.4230/LIPIcs.RTA.2010.193
article
Order-Sorted Unification with Regular Expression Sorts
Kutsia, Temur
Marin, Mircea
We extend first-order order-sorted unification by permitting regular expression sorts for variables and in the domains of function symbols. The set of basic sorts is finite. The obtained signature corresponds to a finite bottom-up hedge automaton. The unification problem in such a theory generalizes some known unification problems. Its unification type is infinitary. We give a complete unification procedure and prove decidability.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.193/LIPIcs.RTA.2010.193.pdf
Unification
sorts
regular expression
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
209
226
10.4230/LIPIcs.RTA.2010.209
article
An Efficient Nominal Unification Algorithm
Levy, Jordi
Villaret, Mateu
Nominal unification is an extension of first-order unification where terms can contain binders and unification is performed modulo alpha-equivalence. Here we prove that the existence of nominal unifiers can be decided in quadratic time. First, we reduce, in linear time, nominal unification problems to a sequence of freshness constraints and equalities between atoms, modulo a permutation, using ideas of Paterson and Wegman for first-order unification. Second, we prove that solvability of these reduced problems can be checked in quadratic time. Finally, we point out how, using ideas of Brown and Tarjan for unbalanced merging, these reduced problems could be solved more efficiently.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.209/LIPIcs.RTA.2010.209.pdf
Nominal logic
unification
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
227
242
10.4230/LIPIcs.RTA.2010.227
article
Computing Critical Pairs in 2-Dimensional Rewriting Systems
Mimram, Samuel
Rewriting systems on words are very useful in the study of monoids. In good cases, they give finite presentations of the monoids, allowing their manipulation by a computer. Even better, when the presentation is confluent and terminating, they provide one with a notion of canonical representative for the elements of the presented monoid. Polygraphs are a higher-dimensional generalization of this notion of presentation, from the setting of monoids to the much more general setting of n-categories. Here, we are interested in proving confluence for polygraphs presenting 2-categories, which can be seen as a generalization of term rewriting systems. For this purpose, we propose an adaptation of the usual algorithm for computing critical pairs. Interestingly, this framework is much richer than term rewriting systems and requires the elaboration of a new theoretical framework for representing critical pairs, based on contexts in compact 2-categories.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.227/LIPIcs.RTA.2010.227.pdf
Rewriting system
polygraph
presentation of a category
critical pair
unification
confluence
compact 2-category
context
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
243
258
10.4230/LIPIcs.RTA.2010.243
article
Polynomial Interpretations over the Reals do not Subsume Polynomial Interpretations over the Integers
Neurauter, Friedrich
Middeldorp, Aart
Polynomial interpretations are a useful technique for proving termination
of term rewrite systems. They come in various flavors:
polynomial interpretations with real, rational and integer coefficients.
In 2006, Lucas proved that there are rewrite systems that can be shown
polynomially terminating by polynomial interpretations with
real (algebraic)
coefficients, but cannot be shown polynomially terminating using
polynomials with rational coefficients only.
He also proved a similar theorem with respect to the use of
rational coefficients versus integer coefficients.
In this paper we show that polynomial interpretations with real or
rational coefficients do not subsume polynomial interpretations with
integer coefficients, contrary to what is commonly believed.
We further show that polynomial interpretations with real
coefficients subsume polynomial interpretations with rational
coefficients.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.243/LIPIcs.RTA.2010.243.pdf
Term rewriting
termination
polynomial interpretations
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
259
276
10.4230/LIPIcs.RTA.2010.259
article
Automated Termination Analysis of Java Bytecode by Term Rewriting
Otto, Carsten
Brockschmidt, Marc
von Essen, Christian
Giesl, Jürgen
We present an automated approach to prove termination of Java Bytecode (JBC) programs by automatically transforming them to term rewrite systems (TRSs). In this way, the numerous techniques and tools developed for TRS termination can now be used for imperative object-oriented languages like Java, which can be compiled into JBC.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.259/LIPIcs.RTA.2010.259.pdf
Java Bytecode
termination
term rewriting
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
277
294
10.4230/LIPIcs.RTA.2010.277
article
Declarative Debugging of Missing Answers for Maude
Riesco, Adrian
Verdejo, Alberto
Marti-Oliet, Narciso
Declarative debugging is a semi-automatic technique that starts
from an incorrect computation and locates a program fragment responsible for the error by building a tree representing this computation and guiding the user through it to find the error. Membership equational logic (MEL) is an equational
logic that, in addition to equations, allows the statement of membership axioms characterizing the elements of a sort. Rewriting logic is a logic of change that extends MEL by adding rewrite rules, which correspond to transitions between states and can be nondeterministic. In this paper we propose a calculus that allows us to
infer normal forms and least sorts with the equational part, and sets of reachable terms through rules. We use an abbreviation of the proof trees computed with this calculus to build appropriate debugging trees for missing answers (results that are erroneous because they are incomplete), whose adequacy for debugging is proved.
Using these trees we have implemented a declarative debugger for Maude, a high-performance system based on rewriting logic, whose use is illustrated with an example.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.277/LIPIcs.RTA.2010.277.pdf
Declarative debugging
Maude
Missing answers
Rewriting
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
295
310
10.4230/LIPIcs.RTA.2010.295
article
Simulation in the Call-by-Need Lambda-Calculus with letrec
Schmidt-Schauss, Manfred
Sabel, David
Machkasova, Elena
This paper shows the equivalence of applicative similarity and contextual approximation, and hence also of bisimilarity and contextual equivalence, in the deterministic call-by-need lambda calculus with letrec. Bisimilarity simplifies equivalence proofs in the calculus and opens a way for more convenient correctness proofs for program transformations. Although this property may be a natural one to expect, to the best of our knowledge, this paper is the first one providing a proof. The proof technique is to transfer the contextual approximation into Abramsky's lazy lambda calculus by a fully abstract and surjective translation. This also shows that the natural embedding of Abramsky's lazy lambda calculus into the call-by-need lambda calculus with letrec is an isomorphism between the respective term-models. We show that the equivalence property proven in this paper transfers to a call-by-need letrec calculus developed by Ariola and Felleisen.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.295/LIPIcs.RTA.2010.295.pdf
Lambda calculus
semantics
contextual equivalence
bisimulation
call-by-need
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
311
324
10.4230/LIPIcs.RTA.2010.311
article
Weak Convergence and Uniform Normalization in Infinitary Rewriting
Simonsen, Jakob Grue
We study infinitary term rewriting systems containing finitely many rules. For these, we show that if a weakly convergent reduction is not strongly convergent, it contains a term that reduces to itself in one step (but the step itself need not be part of the reduction). Using this result, we prove
the starkly surprising result
that for any orthogonal system with finitely many rules, the system is
weakly normalizing under weak convergence iff it is strongly normalizing under weak convergence iff it is weakly normalizing under strong convergence iff it is strongly normalizing under strong convergence.
As further corollaries, we derive a number of new results for weakly convergent rewriting: Systems with finitely many rules enjoy unique normal forms, and acyclic orthogonal systems are confluent. Our results suggest that it may be possible to recover some of the positive results for strongly convergent rewriting in the setting of weak convergence, if systems with finitely many rules are considered. Finally, we give a number of counterexamples showing failure of most of the results when infinite sets of rules are allowed.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.311/LIPIcs.RTA.2010.311.pdf
Infinitary rewriting
weak convergence
uniform normalization
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
325
340
10.4230/LIPIcs.RTA.2010.325
article
Certified Subterm Criterion and Certified Usable Rules
Sternagel, Christian
Thiemann, René
In this paper we present our formalization of two important
termination techniques for term rewrite systems: the subterm criterion and
the reduction pair processor in combination with usable rules. For both
techniques we developed executable check functions in the theorem prover
Isabelle/HOL which can certify the correct application of these techniques
in some given termination proof. As there are several variants of usable
rules we designed our check function in such a way that it accepts all
known variants, even those which are not explicitly spelled out in previous
papers.
We integrated our formalization in the publicly available IsaFoR-library.
This led to a significant increase in the power of CeTA, the corresponding
certified termination proof checker that is extracted from IsaFoR.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.325/LIPIcs.RTA.2010.325.pdf
Term Rewriting
Certification
Termination
Theorem Proving
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
341
356
10.4230/LIPIcs.RTA.2010.341
article
Termination of linear bounded term rewriting systems
Durand, Irène
Sénizergues, Géraud
Sylvestre, Marc
For the whole class of linear term rewriting systems and for each integer k, we define k-bounded rewriting as a restriction of the usual notion of rewriting.
We show that the k-bounded uniform termination, the k-bounded termination, the inverse k-bounded uniform termination, and the inverse k-bounded termination problems are decidable.
The k-bounded class (BO(k)) is, by definition, the set of linear systems for
which every derivation can be replaced by a k-bounded derivation. In general, for BO(k) systems, the uniform (respectively inverse uniform) k-bounded termination problem is not equivalent to the uniform (resp. inverse uniform) termination problem, and the k-bounded (respectively inverse k-bounded) termination problem is not equivalent to the termination (respectively inverse termination) problem.
This leads us to
define more restricted classes for which these problems are equivalent: the classes BOLP(k) of
k-bounded systems that have the length preservation property. By definition, a system is BOLP(k) if every derivation of length n can be replaced by a k-bounded derivation of length n.
We define the class BOLP of bounded systems that have the length preservation property as the union of all the BOLP(k) classes.
The class BOLP contains (strictly) several already known classes of systems:
the inverse left-basic semi-Thue systems,
the linear growing term rewriting systems, the inverse Linear-Finite-Path-Ordering systems, the strongly bottom-up systems.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.341/LIPIcs.RTA.2010.341.pdf
Term rewriting
termination
rewriting strategy
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
357
372
10.4230/LIPIcs.RTA.2010.357
article
Polynomially Bounded Matrix Interpretations
Waldmann, Johannes
Matrix interpretations can be used to bound the derivational complexity of rewrite systems. We present a criterion that completely characterizes matrix interpretations that are polynomially bounded. It includes the method of upper triangular interpretations as a special case, and we prove that the inclusion is strict. The criterion can be expressed as a finite domain constraint system.
It translates to a Boolean constraint system with a size that is polynomial in the dimension of the interpretation. We report on performance of an implementation.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.357/LIPIcs.RTA.2010.357.pdf
Derivational complexity of rewriting
matrix interpretation
weighted automata
ambiguity of automata
finite domain constraints
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
373
384
10.4230/LIPIcs.RTA.2010.373
article
Optimizing mkbTT
Winkler, Sarah
Sato, Haruhiko
Middeldorp, Aart
Kurihara, Masahito
We describe performance enhancements that have been added to mkbTT, a
modern completion tool combining multi-completion with the use of
termination tools.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.373/LIPIcs.RTA.2010.373.pdf
Knuth-Bendix completion
termination prover
automated deduction
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
385
400
10.4230/LIPIcs.RTA.2010.385
article
Modular Complexity Analysis via Relative Complexity
Zankl, Harald
Korp, Martin
In this paper we introduce a modular framework which allows one to infer
(feasible) upper bounds on the (derivational) complexity of term rewrite
systems by combining different criteria. All current investigations of
derivational complexity are based on a single termination
proof, possibly preceded by transformations. We prove that the modular
framework is strictly more powerful than the conventional setting.
Furthermore, the results have been implemented and experiments show
significant gains in power.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.385/LIPIcs.RTA.2010.385.pdf
Term rewriting
complexity analysis
relative complexity
derivation length
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-07-06
6
401
416
10.4230/LIPIcs.RTA.2010.401
article
Proving Productivity in Infinite Data Structures
Zantema, Hans
Raffelsieper, Matthias
For a general class of infinite data structures including streams, binary trees, and the combination of finite and infinite lists, we investigate the notion of productivity. This generalizes stream productivity. We develop a general technique to prove productivity
based on proving context-sensitive termination, by which the power of present termination tools can be exploited. In order to treat cases where the approach does not apply directly, we develop transformations extending the power of the basic approach. We present a tool combining these ingredients that can prove productivity of a wide range of examples fully automatically.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol006-rta2010/LIPIcs.RTA.2010.401/LIPIcs.RTA.2010.401.pdf
Productivity
infinite data structures
streams