LIPIcs, Volume 6

Proceedings of the 21st International Conference on Rewriting Techniques and Applications



Publication Details

  • published at: 2010-07-06
  • Publisher: Schloss Dagstuhl – Leibniz-Zentrum für Informatik
  • ISBN: 978-3-939897-18-7
  • DBLP: db/conf/rta/rta2010

Documents

Document
Complete Volume
LIPIcs, Volume 6, RTA'10, Complete Volume

Authors: Christopher Lynch


Abstract
LIPIcs, Volume 6, RTA'10, Complete Volume

Cite as

Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@Proceedings{lynch:LIPIcs.RTA.2010,
  title =	{{LIPIcs, Volume 6, RTA'10, Complete Volume}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010},
  URN =		{urn:nbn:de:0030-drops-41003},
  doi =		{10.4230/LIPIcs.RTA.2010},
  annote =	{Keywords: Programming Techniques, Software Engineering, Programming Languages, Computation by Abstract Devices, Analysis of Algorithms and Problem Complexity, Logics and Meanings of Programs, Mathematical Logic and Formal Languages, Symbolic and Algebraic Manipulation, Artificial Intelligence}
}
Document
Front Matter
Frontmatter (Titlepage, Table of Contents, Author List, PC List, Reviewer List)

Authors: Christopher Lynch


Abstract
Front matter including table of contents, author list, PC list, and reviewer list.

Cite as

Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. i-xii, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{lynch:LIPIcs.RTA.2010.i,
  author =	{Lynch, Christopher},
  title =	{{Frontmatter (Titlepage, Table of Contents, Author List, PC List, Reviewer List)}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{i--xii},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.i},
  URN =		{urn:nbn:de:0030-drops-26684},
  doi =		{10.4230/LIPIcs.RTA.2010.i},
  annote =	{Keywords: Table of Contents, Author List, PC List, Reviewer List}
}
Document
Preface

Authors: Christopher Lynch


Abstract
Preface

Cite as

Christopher Lynch. Preface. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. xiii-xiv, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{lynch:LIPIcs.RTA.2010.XIII,
  author =	{Lynch, Christopher},
  title =	{{Preface}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{xiii--xiv},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.XIII},
  URN =		{urn:nbn:de:0030-drops-26675},
  doi =		{10.4230/LIPIcs.RTA.2010.XIII},
  annote =	{Keywords: Preface}
}
Document
Automata for Data Words and Data Trees

Authors: Mikolaj Bojanczyk


Abstract
Data words and data trees appear in verification and XML processing. The term "data" means that positions of the word, or tree, are decorated with elements of an infinite set of data values, such as natural numbers or ASCII strings. This talk is a survey of the various automaton models that have been developed for data words and data trees.
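
As a hedged illustration of the simplest register-based model over data words, the following plain-Python sketch simulates a one-register acceptor for the language "some data value occurs at two positions"; the function name and the simulation of nondeterminism by enumerating guesses are choices made for this example, not constructs from the talk.

# Toy one-register automaton over data words: accept iff some data
# value occurs at two positions. The nondeterministic guess of where
# to load the register is simulated by trying every position.
def accepts_repeated_value(word):
    for i, stored in enumerate(word):     # guess: load the register here
        if stored in word[i + 1:]:        # a later position matches it
            return True                   # accept
    return False                          # no guess succeeds

assert accepts_repeated_value([4, 7, 4])        # value 4 repeats
assert not accepts_repeated_value(["a", "b"])   # all values distinct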

Cite as

Mikolaj Bojanczyk. Automata for Data Words and Data Trees. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 1-4, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{bojanczyk:LIPIcs.RTA.2010.1,
  author =	{Bojanczyk, Mikolaj},
  title =	{{Automata for Data Words and Data Trees}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{1--4},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.1},
  URN =		{urn:nbn:de:0030-drops-26397},
  doi =		{10.4230/LIPIcs.RTA.2010.1},
  annote =	{Keywords: Automata, Data Word, Data Tree}
}
Document
Realising Optimal Sharing

Authors: Vincent van Oostrom


Abstract
Realising Optimal Sharing

Cite as

Vincent van Oostrom. Realising Optimal Sharing. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 5-6, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{vanoostrom:LIPIcs.RTA.2010.5,
  author =	{van Oostrom, Vincent},
  title =	{{Realising Optimal Sharing}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{5--6},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.5},
  URN =		{urn:nbn:de:0030-drops-26403},
  doi =		{10.4230/LIPIcs.RTA.2010.5},
  annote =	{Keywords: Optimal Sharing}
}
Document
Automated Confluence Proof by Decreasing Diagrams based on Rule-Labelling

Authors: Takahito Aoto


Abstract
The decreasing diagrams technique (van Oostrom, 1994) can be applied widely to prove confluence of rewrite systems. To apply the technique directly to term rewriting systems, a rule-labelling heuristic was proposed by van Oostrom (2008). We show how the constraints for ensuring confluence of term rewriting systems constructed via the rule-labelling heuristic are encoded as linear arithmetic constraints whose satisfiability can be checked by external SMT solvers. We point out an additional constraint, omitted in (van Oostrom, 2008), that is needed to guarantee the soundness of confluence proofs based on the rule-labelling heuristic when it is extended to deal with non-right-linear rules. We also present several extensions of the rule-labelling heuristic that enlarge the applicability of the technique.
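
To convey the flavour of such an encoding, here is a small self-contained Python sketch; the rules, labels, and constraints below are invented for this illustration, and a brute-force search stands in for the external SMT solver that the paper uses.

from itertools import product

# Find natural-number labels for rules r0..r2 satisfying linear
# constraints of the kind generated from critical peaks (the concrete
# constraints are invented for this illustration).
rules = ["r0", "r1", "r2"]
constraints = [
    lambda l: l["r0"] > l["r1"],   # peak joinable only with smaller labels
    lambda l: l["r0"] > l["r2"],
]

def solve(bound=3):
    # Brute force over {0..bound}^3; an SMT solver would treat the
    # same linear arithmetic constraints symbolically.
    for values in product(range(bound + 1), repeat=len(rules)):
        labelling = dict(zip(rules, values))
        if all(c(labelling) for c in constraints):
            return labelling
    return None

print(solve())   # {'r0': 1, 'r1': 0, 'r2': 0}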

Cite as

Takahito Aoto. Automated Confluence Proof by Decreasing Diagrams based on Rule-Labelling. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 7-16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{aoto:LIPIcs.RTA.2010.7,
  author =	{Aoto, Takahito},
  title =	{{Automated Confluence Proof by Decreasing Diagrams based on Rule-Labelling}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{7--16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.7},
  URN =		{urn:nbn:de:0030-drops-26418},
  doi =		{10.4230/LIPIcs.RTA.2010.7},
  annote =	{Keywords: Confluence, Decreasing Diagrams, Automation, Term Rewriting Systems}
}
Document
Higher-Order (Non-)Modularity

Authors: Claus Appel, Vincent van Oostrom, and Jakob Grue Simonsen


Abstract
We show that, contrary to the situation in first-order term rewriting, almost none of the usual properties of rewriting are modular for higher-order rewriting, irrespective of the higher-order rewriting format. We show that for the particular format of simply typed applicative term rewriting systems modularity of confluence, normalization, and termination can be recovered by imposing suitable linearity constraints.

Cite as

Claus Appel, Vincent van Oostrom, and Jakob Grue Simonsen. Higher-Order (Non-)Modularity. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 17-32, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{appel_et_al:LIPIcs.RTA.2010.17,
  author =	{Appel, Claus and van Oostrom, Vincent and Simonsen, Jakob Grue},
  title =	{{Higher-Order (Non-)Modularity}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{17--32},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.17},
  URN =		{urn:nbn:de:0030-drops-26427},
  doi =		{10.4230/LIPIcs.RTA.2010.17},
  annote =	{Keywords: Higher-order rewriting, modularity, termination, normalization}
}
Document
Closing the Gap Between Runtime Complexity and Polytime Computability

Authors: Martin Avanzini and Georg Moser


Abstract
In earlier work, we have shown that for confluent TRSs, innermost polynomial runtime complexity induces polytime computability of the functions defined. In this paper, we generalise this result to full rewriting; for this, we exploit graph rewriting. We give a new proof of the adequacy of graph rewriting for full rewriting that allows for a precise control of the resources copied. In sum, we completely describe an implementation of rewriting on a Turing machine (TM for short). We show that the runtime complexity of the TRS and the runtime complexity of the TM are polynomially related. Our result strengthens the evidence that the complexity of a rewrite system is truthfully represented through the length of derivations. Moreover, our result allows the classification of nondeterministic polytime computation based on runtime complexity analysis of rewrite systems.
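
The resource control that graph rewriting buys can be pictured in a few lines of Python: duplicating a subterm copies a reference into the term graph, not the subterm itself (the encoding is invented for this note).

# Sharing via term graphs: a rule like d(x) -> pair(x, x) duplicates a
# pointer, not the (possibly huge) subterm it points to.
big_subterm = ("f", ("g", ("a",)))     # stands for a large term
pair = (big_subterm, big_subterm)      # both components share one node
assert pair[0] is pair[1]              # one shared node, not two copies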

Cite as

Martin Avanzini and Georg Moser. Closing the Gap Between Runtime Complexity and Polytime Computability. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 33-48, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{avanzini_et_al:LIPIcs.RTA.2010.33,
  author =	{Avanzini, Martin and Moser, Georg},
  title =	{{Closing the Gap Between Runtime Complexity and Polytime Computability}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{33--48},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.33},
  URN =		{urn:nbn:de:0030-drops-26436},
  doi =		{10.4230/LIPIcs.RTA.2010.33},
  annote =	{Keywords: Term rewriting, graph rewriting, complexity analysis, polytime computability}
}
Document
Abstract Models of Transfinite Reductions

Authors: Patrick Bahr


Abstract
We investigate transfinite reductions in abstract reduction systems. To this end, we study two abstract models for transfinite reductions: a metric model generalising the usual metric approach to infinitary term rewriting and a novel partial order model. For both models we distinguish between a weak and a strong variant of convergence as known from infinitary term rewriting. Furthermore, we introduce an axiomatic model of reductions that is general enough to cover all of these models of transfinite reductions as well as the ordinary model of finite reductions. It is shown that, in this unifying axiomatic model, many basic relations between termination and confluence properties known from finite reductions still hold. The introduced models are applied to term rewriting but also to term graph rewriting. We can show that for both term rewriting as well as for term graph rewriting the partial order model forms a conservative extension to the metric model.

Cite as

Patrick Bahr. Abstract Models of Transfinite Reductions. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 49-66, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{bahr:LIPIcs.RTA.2010.49,
  author =	{Bahr, Patrick},
  title =	{{Abstract Models of Transfinite Reductions}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{49--66},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.49},
  URN =		{urn:nbn:de:0030-drops-26445},
  doi =		{10.4230/LIPIcs.RTA.2010.49},
  annote =	{Keywords: Infinitary rewriting, metric, partial order, abstract reduction system, axiomatic, term rewriting, graph rewriting}
}
Document
Partial Order Infinitary Term Rewriting and Böhm Trees

Authors: Patrick Bahr


Abstract
We investigate an alternative model of infinitary term rewriting. Instead of a metric, a partial order on terms is employed to formalise (strong) convergence. We compare this partial order convergence of orthogonal term rewriting systems to the usual metric convergence of the corresponding Böhm extensions. The Böhm extension of a term rewriting system contains additional rules to equate so-called root-active terms. The core result we present is that reachability w.r.t. partial order convergence coincides with reachability w.r.t. metric convergence in the Böhm extension. This result is used to show that, unlike in the metric model, orthogonal systems are infinitarily confluent and infinitarily normalising in the partial order model. Moreover, we obtain, as in the metric model, a compression lemma. A corollary of this lemma is that reachability w.r.t. partial order convergence is a conservative extension of reachability w.r.t. metric convergence.

Cite as

Patrick Bahr. Partial Order Infinitary Term Rewriting and Böhm Trees. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 67-84, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{bahr:LIPIcs.RTA.2010.67,
  author =	{Bahr, Patrick},
  title =	{{Partial Order Infinitary Term Rewriting and B\"{o}hm Trees}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{67--84},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.67},
  URN =		{urn:nbn:de:0030-drops-26455},
  doi =		{10.4230/LIPIcs.RTA.2010.67},
  annote =	{Keywords: Infinitary term rewriting, B\"{o}hm trees, partial order, confluence, normalisation}
}
Document
Unique Normal Forms in Infinitary Weakly Orthogonal Rewriting

Authors: Joerg Endrullis, Clemens Grabmayer, Dimitri Hendriks, Jan Willem Klop, and Vincent van Oostrom


Abstract
We present some contributions to the theory of infinitary rewriting for weakly orthogonal term rewrite systems, in which critical pairs may occur provided they are trivial. We show that the infinitary unique normal form property (UNinf) fails by a simple example of a weakly orthogonal TRS with two collapsing rules. By translating this example, we show that UNinf also fails for the infinitary lambda-beta-eta-calculus. As positive results we obtain the following: Infinitary confluence, and hence UNinf, holds for weakly orthogonal TRSs that do not contain collapsing rules. To this end we refine the compression lemma. Furthermore, we consider the triangle and diamond properties for infinitary developments in weakly orthogonal TRSs, by refining an earlier cluster-analysis for the finite case.
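
To show what "trivial critical pairs" means here, the snippet below checks the overlap in a two-rule collapsing system of the kind the abstract describes; the concrete rules P(S(x)) -> x and S(P(x)) -> x are a reconstruction for illustration, not verified against the paper.

# Terms over unary symbols are written as strings, e.g. "PSPa" for
# P(S(P(a))). Both collapsing rules delete a two-letter block.
def one_step(t):
    for i in range(len(t) - 1):
        if t[i:i + 2] in ("PS", "SP"):   # P(S(x)) -> x  or  S(P(x)) -> x
            yield t[:i] + t[i + 2:]

peak = "PSPa"                    # both rules overlap in this term
print(sorted(one_step(peak)))    # ['Pa', 'Pa']: the critical pair is trivial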

Cite as

Joerg Endrullis, Clemens Grabmayer, Dimitri Hendriks, Jan Willem Klop, and Vincent van Oostrom. Unique Normal Forms in Infinitary Weakly Orthogonal Rewriting. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 85-102, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{endrullis_et_al:LIPIcs.RTA.2010.85,
  author =	{Endrullis, Joerg and Grabmayer, Clemens and Hendriks, Dimitri and Klop, Jan Willem and van Oostrom, Vincent},
  title =	{{Unique Normal Forms in Infinitary Weakly Orthogonal Rewriting}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{85--102},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.85},
  URN =		{urn:nbn:de:0030-drops-26469},
  doi =		{10.4230/LIPIcs.RTA.2010.85},
  annote =	{Keywords: Weakly orthogonal term rewrite systems, unique normal form property, infinitary rewriting, infinitary lambda-beta-eta-calculus}
}
Document
The Undecidability of Type Related Problems in Type-free Style System F

Authors: Ken-Etsu Fujita and Aleksy Schubert


Abstract
We consider a number of variations on System F that are predicative second-order systems whose terms are intermediate between the Curry style and the Church style. The terms contain the information on where universal quantifier elimination and introduction must take place in the type inference process, which is similar to Church forms. However, they omit the information on which types are involved in the rules, which is similar to Curry forms. In this paper we prove the undecidability of the type-checking, type inference and typability problems for the system. Moreover, the proof works for the predicative version of the system with finitely stratified polymorphic types. The result includes bounds on Leivant's level numbers for the types used in the instances leading to the undecidability.

Cite as

Ken-Etsu Fujita and Aleksy Schubert. The Undecidability of Type Related Problems in Type-free Style System F. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 103-118, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{fujita_et_al:LIPIcs.RTA.2010.103,
  author =	{Fujita, Ken-Etsu and Schubert, Aleksy},
  title =	{{The Undecidability of Type Related Problems in Type-free Style System F}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{103--118},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.103},
  URN =		{urn:nbn:de:0030-drops-26475},
  doi =		{10.4230/LIPIcs.RTA.2010.103},
  annote =	{Keywords: Lambda calculus and related systems, type checking, typability, partial type inference, 2nd order unification, undecidability, Curry style type system}
}
Document
On (Un)Soundness of Unravelings

Authors: Karl Gmeiner, Bernhard Gramlich, and Felix Schernhammer


Abstract
We revisit (un)soundness of transformations of conditional into unconditional rewrite systems. The focus here is on so-called unravelings, the most simple and natural kind of such transformations, for the class of normal conditional systems without extra variables. By a systematic and thorough study of existing counterexamples and of the potential sources of unsoundness we obtain several new positive and negative results. In particular, we prove the following new results: Confluence, non-erasingness and weak left-linearity (of a given conditional system) each guarantee soundness of the unraveled version w.r.t. the original one. The latter result substantially extends the only known sufficient criterion for soundness, namely left-linearity. Furthermore, by means of counterexamples we refute various other tempting conjectures about sufficient conditions for soundness.

Cite as

Karl Gmeiner, Bernhard Gramlich, and Felix Schernhammer. On (Un)Soundness of Unravelings. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 119-134, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{gmeiner_et_al:LIPIcs.RTA.2010.119,
  author =	{Gmeiner, Karl and Gramlich, Bernhard and Schernhammer, Felix},
  title =	{{On (Un)Soundness of Unravelings}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{119--134},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.119},
  URN =		{urn:nbn:de:0030-drops-26485},
  doi =		{10.4230/LIPIcs.RTA.2010.119},
  annote =	{Keywords: Conditional rewriting, transformation into unconditional systems, unsoundness, unraveling}
}
Document
A Proof Calculus Which Reduces Syntactic Bureaucracy

Authors: Alessio Guglielmi, Tom Gundersen, and Michel Parigot


Abstract
In usual proof systems, like the sequent calculus, only a very limited way of combining proofs is available through the tree structure. In this paper we present a logic-independent proof calculus in which proofs can be freely composed by connectives, and we prove its basic properties. The main advantage of this proof calculus is that it makes it possible to avoid certain kinds of syntactic bureaucracy inherent to all usual proof systems, in particular the sequent calculus. Proofs in this system closely reflect their atomic flow, which traces the behaviour of atoms through structural rules. The general definition is illustrated by the standard deep-inference system for propositional logic, for which there are known rewriting techniques that achieve cut elimination based only on the information in atomic flows.

Cite as

Alessio Guglielmi, Tom Gundersen, and Michel Parigot. A Proof Calculus Which Reduces Syntactic Bureaucracy. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 135-150, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{guglielmi_et_al:LIPIcs.RTA.2010.135,
  author =	{Guglielmi, Alessio and Gundersen, Tom and Parigot, Michel},
  title =	{{A Proof Calculus Which Reduces Syntactic Bureaucracy}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{135--150},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.135},
  URN =		{urn:nbn:de:0030-drops-26490},
  doi =		{10.4230/LIPIcs.RTA.2010.135},
  annote =	{Keywords: Logic, Proof theory, Deep Inference, Flow graphs, Proof Systems, Open Deduction, Rewriting, Confluence, Termination}
}
Document
A Rewriting Logic Semantics Approach to Modular Program Analysis

Authors: Mark Hills and Grigore Rosu


Abstract
The K framework, based on rewriting logic semantics, provides a powerful logic for defining the semantics of programming languages. While most work in this area has focused on defining an evaluation semantics for a language, it is also possible to define an abstract semantics that can be used for program analysis. Using the SILF language (Hills, Serbanuta and Rosu, 2007), this paper describes one technique for defining such a semantics: policy frameworks. In policy frameworks, an analysis-generic, modular framework is first defined for a language. Individual analyses, called policies, are then defined as extensions of this framework, with each policy defining analysis-specific semantic rules and an annotation language which, in combination with support in the language front-end, allows users to annotate program types and functions with information used during program analysis. Standard term rewriting techniques are used to analyze programs by evaluating them in the policy semantics.

Cite as

Mark Hills and Grigore Rosu. A Rewriting Logic Semantics Approach to Modular Program Analysis. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 151-160, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{hills_et_al:LIPIcs.RTA.2010.151,
  author =	{Hills, Mark and Rosu, Grigore},
  title =	{{A Rewriting Logic Semantics Approach to Modular Program Analysis}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{151--160},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.151},
  URN =		{urn:nbn:de:0030-drops-26506},
  doi =		{10.4230/LIPIcs.RTA.2010.151},
  annote =	{Keywords: K, rewriting logic semantics, program analysis}
}
Document
Infinitary Rewriting: Foundations Revisited

Authors: Stefan Kahrs


Abstract
Infinitary term rewriting makes it possible to express infinitary terms and the infinitary reductions that converge to them. For the underlying notion of transfinite reduction in general, and for the induced binary relations in particular, two concepts have been studied in the past: strongly and weakly convergent reductions; in the last decade, research has mostly focused on the former. Finitary rewriting has a strong connection to the equational theory of its rule set: if the rewrite system is confluent, this (implies consistency of the theory and) gives rise to a semi-decision procedure for the theory, and if the rewrite system is in addition terminating, this becomes a decision procedure. This connection is the original reason for the study of these properties in rewriting. For infinitary rewriting there is barely an established notion of an equational theory. The issue is not trivial because such a theory would need to include some form of "getting to limits", and there are different options one can pursue. These options are examined here, together with several alternatives for the notion of reduction relation and their relationships to these equational theories.

Cite as

Stefan Kahrs. Infinitary Rewriting: Foundations Revisited. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 161-176, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{kahrs:LIPIcs.RTA.2010.161,
  author =	{Kahrs, Stefan},
  title =	{{Infinitary Rewriting: Foundations Revisited}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{161--176},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.161},
  URN =		{urn:nbn:de:0030-drops-26510},
  doi =		{10.4230/LIPIcs.RTA.2010.161},
  annote =	{Keywords: Infinitary rewriting, equivalence}
}
Document
Underspecified computation of normal forms

Authors: Alexander Koller and Stefan Thater


Abstract
We show how to compute readings of ambiguous natural language sentences that are minimal in some way. Formally, we consider the problem of computing, from a set C of trees and a rewrite system R, those trees in C that cannot be rewritten into a tree in C. We solve the problem for sets of trees described by the semantic representations typically used in computational linguistics and for a certain class of rewrite systems that we use to approximate entailment, and we show how to compute the irreducible trees efficiently by intersecting tree automata. Our algorithm solves the problem of computing weakest readings, which had been open in computational linguistics for 25 years.
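
On finite inputs the core problem has a direct brute-force reading: keep those trees in C that no R-step maps back into C. The sketch below enumerates rather than intersecting tree automata as the paper does; its toy rule and names are invented for this illustration.

# Relative normal forms by enumeration: elements of C with no
# one-step R-successor inside C.
def relative_normal_forms(C, rewrite_step):
    C = set(C)
    return {t for t in C if not any(u in C for u in rewrite_step(t))}

# Toy rewrite system on strings standing in for trees: "ab" -> "ba".
def step(t):
    for i in range(len(t) - 1):
        if t[i:i + 2] == "ab":
            yield t[:i] + "ba" + t[i + 2:]

print(relative_normal_forms({"ab", "ba", "aab"}, step))
# {'aab', 'ba'} (set order may vary): "ab" is excluded, it rewrites to "ba"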

Cite as

Alexander Koller and Stefan Thater. Underspecified computation of normal forms. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 177-192, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{koller_et_al:LIPIcs.RTA.2010.177,
  author =	{Koller, Alexander and Thater, Stefan},
  title =	{{Underspecified computation of normal forms}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{177--192},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.177},
  URN =		{urn:nbn:de:0030-drops-26521},
  doi =		{10.4230/LIPIcs.RTA.2010.177},
  annote =	{Keywords: Rewrite systems, tree automata, normal forms, computational linguistics}
}
Document
Order-Sorted Unification with Regular Expression Sorts

Authors: Temur Kutsia and Mircea Marin


Abstract
We extend first-order order-sorted unification by permitting regular expression sorts for variables and in the domains of function symbols. The set of basic sorts is finite. The obtained signature corresponds to a finite bottom-up hedge automaton. The unification problem in such a theory generalizes some known unification problems. Its unification type is infinitary. We give a complete unification procedure and prove decidability.
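
Regular expression sorts can be pictured with ordinary regex matching over an alphabet of basic sorts. In the sketch below the signature and sort names are invented: a function symbol whose argument hedge must have sorts matching the regular expression sort Nat Int*.

import re

# Encode basic sorts as single letters: Nat -> N, Int -> I; the
# domain of the (invented) symbol f is the regular expression NI*.
DOMAIN = re.compile(r"NI*")

def admits(arg_sorts):
    encoded = "".join({"Nat": "N", "Int": "I"}[s] for s in arg_sorts)
    return DOMAIN.fullmatch(encoded) is not None

assert admits(["Nat"])                   # Nat matches Nat Int*
assert admits(["Nat", "Int", "Int"])     # Nat Int Int matches
assert not admits(["Int", "Nat"])        # wrong order is rejected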

Cite as

Temur Kutsia and Mircea Marin. Order-Sorted Unification with Regular Expression Sorts. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 193-208, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{kutsia_et_al:LIPIcs.RTA.2010.193,
  author =	{Kutsia, Temur and Marin, Mircea},
  title =	{{Order-Sorted Unification with Regular Expression Sorts}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{193--208},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.193},
  URN =		{urn:nbn:de:0030-drops-26537},
  doi =		{10.4230/LIPIcs.RTA.2010.193},
  annote =	{Keywords: Unification, sorts, regular expression}
}
Document
An Efficient Nominal Unification Algorithm

Authors: Jordi Levy and Mateu Villaret


Abstract
Nominal unification is an extension of first-order unification where terms can contain binders and unification is performed modulo alpha-equivalence. Here we prove that the existence of nominal unifiers can be decided in quadratic time. First, we linearly reduce nominal unification problems to a sequence of freshness constraints and equalities between atoms, modulo a permutation, using ideas of Paterson and Wegman for first-order unification. Second, we prove that solvability of these reduced problems can be checked in quadratic time. Finally, we point out how, using ideas of Brown and Tarjan for unbalanced merging, these reduced problems could be solved more efficiently.
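
A minimal sketch of the basic nominal ingredient, permutations of atoms represented as lists of swappings acting on atoms, may help fix notation; the representation is chosen for this illustration, and the paper's data structures are more refined in order to reach the quadratic bound.

# A permutation is a list of swappings (a b); the innermost (rightmost)
# swapping acts first, as usual in nominal syntax.
def apply_perm(perm, atom):
    for a, b in reversed(perm):
        if atom == a:
            atom = b
        elif atom == b:
            atom = a
    return atom

pi = [("a", "b"), ("b", "c")]      # pi = (a b)(b c): a -> b, b -> c, c -> a
assert apply_perm(pi, "a") == "b"
assert apply_perm(pi, "b") == "c"
assert apply_perm(pi, "c") == "a"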

Cite as

Jordi Levy and Mateu Villaret. An Efficient Nominal Unification Algorithm. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 209-226, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{levy_et_al:LIPIcs.RTA.2010.209,
  author =	{Levy, Jordi and Villaret, Mateu},
  title =	{{An Efficient Nominal Unification Algorithm}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{209--226},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.209},
  URN =		{urn:nbn:de:0030-drops-26544},
  doi =		{10.4230/LIPIcs.RTA.2010.209},
  annote =	{Keywords: Nominal logic, unification}
}
Document
Computing Critical Pairs in 2-Dimensional Rewriting Systems

Authors: Samuel Mimram


Abstract
Rewriting systems on words are very useful in the study of monoids. In good cases, they give finite presentations of the monoids, allowing their manipulation by a computer. Even better, when the presentation is confluent and terminating, they provide one with a notion of canonical representative for the elements of the presented monoid. Polygraphs are a higher-dimensional generalization of this notion of presentation, from the setting of monoids to the much more general setting of n-categories. Here, we are interested in proving confluence for polygraphs presenting 2-categories, which can be seen as a generalization of term rewriting systems. For this purpose, we propose an adaptation of the usual algorithm for computing critical pairs. Interestingly, this framework is much richer than term rewriting systems and requires the elaboration of a new theoretical framework for representing critical pairs, based on contexts in compact 2-categories.

Cite as

Samuel Mimram. Computing Critical Pairs in 2-Dimensional Rewriting Systems. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 227-242, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{mimram:LIPIcs.RTA.2010.227,
  author =	{Mimram, Samuel},
  title =	{{Computing Critical Pairs in 2-Dimensional Rewriting Systems}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{227--242},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.227},
  URN =		{urn:nbn:de:0030-drops-26552},
  doi =		{10.4230/LIPIcs.RTA.2010.227},
  annote =	{Keywords: Rewriting system, polygraph, presentation of a category, critical pair, unification, confluence, compact 2-category, context}
}
Document
Polynomial Interpretations over the Reals do not Subsume Polynomial Interpretations over the Integers

Authors: Friedrich Neurauter and Aart Middeldorp


Abstract
Polynomial interpretations are a useful technique for proving termination of term rewrite systems. They come in various flavors: polynomial interpretations with real, rational and integer coefficients. In 2006, Lucas proved that there are rewrite systems that can be shown polynomially terminating by polynomial interpretations with real (algebraic) coefficients, but cannot be shown polynomially terminating using polynomials with rational coefficients only. He also proved a similar theorem with respect to the use of rational coefficients versus integer coefficients. In this paper we show that polynomial interpretations with real or rational coefficients do not subsume polynomial interpretations with integer coefficients, contrary to what is commonly believed. We further show that polynomial interpretations with real coefficients subsume polynomial interpretations with rational coefficients.
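
As a reminder of the technique itself, here is a hedged toy check with an invented two-rule system and interpretation: each symbol gets an integer polynomial and every rule must strictly decrease. A real proof argues symbolically for all naturals; the sampling below is only a sanity check.

# Rules:  dbl(0) -> 0   and   dbl(s(x)) -> s(s(dbl(x)))
# Interpretation over the integers: [0] = 1, [s](x) = x + 1, [dbl](x) = 3x.
ZERO = lambda: 1
S    = lambda x: x + 1
DBL  = lambda x: 3 * x

def rule_decreases(lhs, rhs, samples=range(100)):
    return all(lhs(x) > rhs(x) for x in samples)

# dbl(s(x)) -> s(s(dbl(x))):  3(x+1) = 3x+3  >  3x+2
assert rule_decreases(lambda x: DBL(S(x)), lambda x: S(S(DBL(x))))
# dbl(0) -> 0:  3 > 1  (closed rule; the sample variable is ignored)
assert rule_decreases(lambda x: DBL(ZERO()), lambda x: ZERO())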

Cite as

Friedrich Neurauter and Aart Middeldorp. Polynomial Interpretations over the Reals do not Subsume Polynomial Interpretations over the Integers. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 243-258, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{neurauter_et_al:LIPIcs.RTA.2010.243,
  author =	{Neurauter, Friedrich and Middeldorp, Aart},
  title =	{{Polynomial Interpretations over the Reals do not Subsume Polynomial Interpretations over the Integers}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{243--258},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.243},
  URN =		{urn:nbn:de:0030-drops-26565},
  doi =		{10.4230/LIPIcs.RTA.2010.243},
  annote =	{Keywords: Term rewriting, termination, polynomial interpretations}
}
Document
Automated Termination Analysis of Java Bytecode by Term Rewriting

Authors: Carsten Otto, Marc Brockschmidt, Christian von Essen, and Jürgen Giesl


Abstract
We present an automated approach to prove termination of Java Bytecode (JBC) programs by automatically transforming them to term rewrite systems (TRSs). In this way, the numerous techniques and tools developed for TRS termination can now be used for imperative object-oriented languages like Java, which can be compiled into JBC.
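
The flavour of such a translation can be shown on a toy countdown loop; the encoding below is invented for this note, and the paper's actual translation is far more involved (it models objects and the heap, among other things).

# The loop "while (x > 0) x--;" as a TRS over Peano numerals:
#   f(s(x)) -> f(x)      loop iteration (x > 0, decrement)
#   f(z)    -> halt      loop exit (x == 0)
def peano(n):
    t = ("z",)
    for _ in range(n):
        t = ("s", t)
    return t

def run(term):
    steps = 0
    while term[0] == "f":
        arg = term[1]
        term = ("f", arg[1]) if arg[0] == "s" else ("halt",)
        steps += 1
    return term, steps

print(run(("f", peano(3))))   # (('halt',), 4): the toy TRS terminates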

Cite as

Carsten Otto, Marc Brockschmidt, Christian von Essen, and Jürgen Giesl. Automated Termination Analysis of Java Bytecode by Term Rewriting. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 259-276, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{otto_et_al:LIPIcs.RTA.2010.259,
  author =	{Otto, Carsten and Brockschmidt, Marc and von Essen, Christian and Giesl, J\"{u}rgen},
  title =	{{Automated Termination Analysis of Java Bytecode by Term Rewriting}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{259--276},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.259},
  URN =		{urn:nbn:de:0030-drops-26570},
  doi =		{10.4230/LIPIcs.RTA.2010.259},
  annote =	{Keywords: Java Bytecode, termination, term rewriting}
}
Document
Declarative Debugging of Missing Answers for Maude

Authors: Adrian Riesco, Alberto Verdejo, and Narciso Marti-Oliet


Abstract
Declarative debugging is a semi-automatic technique that starts from an incorrect computation and locates a program fragment responsible for the error by building a tree representing this computation and guiding the user through it to find the error. Membership equational logic (MEL) is an equational logic that, in addition to equations, allows the statement of membership axioms characterizing the elements of a sort. Rewriting logic is a logic of change that extends MEL by adding rewrite rules, which correspond to transitions between states and can be nondeterministic. In this paper we propose a calculus that allows us to infer normal forms and least sorts with the equational part, and sets of reachable terms through rules. We use an abbreviation of the proof trees computed with this calculus to build appropriate debugging trees for missing answers (results that are erroneous because they are incomplete), whose adequacy for debugging is proved. Using these trees we have implemented a declarative debugger for Maude, a high-performance system based on rewriting logic, whose use is illustrated with an example.

Cite as

Adrian Riesco, Alberto Verdejo, and Narciso Marti-Oliet. Declarative Debugging of Missing Answers for Maude. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 277-294, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{riesco_et_al:LIPIcs.RTA.2010.277,
  author =	{Riesco, Adrian and Verdejo, Alberto and Marti-Oliet, Narciso},
  title =	{{Declarative Debugging of Missing Answers for Maude}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{277--294},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.277},
  URN =		{urn:nbn:de:0030-drops-26589},
  doi =		{10.4230/LIPIcs.RTA.2010.277},
  annote =	{Keywords: Declarative debugging, Maude, Missing answers, Rewriting}
}
Document
Simulation in the Call-by-Need Lambda-Calculus with letrec

Authors: Manfred Schmidt-Schauss, David Sabel, and Elena Machkasova


Abstract
This paper shows the equivalence of applicative similarity and contextual approximation, and hence also of bisimilarity and contextual equivalence, in the deterministic call-by-need lambda calculus with letrec. Bisimilarity simplifies equivalence proofs in the calculus and opens a way for more convenient correctness proofs for program transformations. Although this property may be a natural one to expect, to the best of our knowledge, this paper is the first one providing a proof. The proof technique is to transfer the contextual approximation into Abramsky's lazy lambda calculus by a fully abstract and surjective translation. This also shows that the natural embedding of Abramsky's lazy lambda calculus into the call-by-need lambda calculus with letrec is an isomorphism between the respective term-models. We show that the equivalence property proven in this paper transfers to a call-by-need letrec calculus developed by Ariola and Felleisen.

Cite as

Manfred Schmidt-Schauss, David Sabel, and Elena Machkasova. Simulation in the Call-by-Need Lambda-Calculus with letrec. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 295-310, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{schmidtschauss_et_al:LIPIcs.RTA.2010.295,
  author =	{Schmidt-Schauss, Manfred and Sabel, David and Machkasova, Elena},
  title =	{{Simulation in the Call-by-Need Lambda-Calculus with letrec}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{295--310},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.295},
  URN =		{urn:nbn:de:0030-drops-26593},
  doi =		{10.4230/LIPIcs.RTA.2010.295},
  annote =	{Keywords: Lambda calculus, semantics, contextual equivalence, bisimulation, call-by-need}
}
Document
Weak Convergence and Uniform Normalization in Infinitary Rewriting

Authors: Jakob Grue Simonsen


Abstract
We study infinitary term rewriting systems containing finitely many rules. For these, we show that if a weakly convergent reduction is not strongly convergent, it contains a term that reduces to itself in one step (but the step itself need not be part of the reduction). Using this result, we prove the starkly surprising result that for any orthogonal system with finitely many rules, the system is weakly normalizing under weak convergence iff it is strongly normalizing under weak convergence iff it is weakly normalizing under strong convergence iff it is strongly normalizing under strong convergence. As further corollaries, we derive a number of new results for weakly convergent rewriting: Systems with finitely many rules enjoy unique normal forms, and acyclic orthogonal systems are confluent. Our results suggest that it may be possible to recover some of the positive results for strongly convergent rewriting in the setting of weak convergence, if systems with finitely many rules are considered. Finally, we give a number of counterexamples showing failure of most of the results when infinite sets of rules are allowed.

Cite as

Jakob Grue Simonsen. Weak Convergence and Uniform Normalization in Infinitary Rewriting. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 311-324, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{simonsen:LIPIcs.RTA.2010.311,
  author =	{Simonsen, Jakob Grue},
  title =	{{Weak Convergence and Uniform Normalization in Infinitary Rewriting}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{311--324},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.311},
  URN =		{urn:nbn:de:0030-drops-26609},
  doi =		{10.4230/LIPIcs.RTA.2010.311},
  annote =	{Keywords: Infinitary rewriting, weak convergence, uniform normalization}
}
Document
Certified Subterm Criterion and Certified Usable Rules

Authors: Christian Sternagel and René Thiemann


Abstract
In this paper we present our formalization of two important termination techniques for term rewrite systems: the subterm criterion and the reduction pair processor in combination with usable rules. For both techniques we developed executable check functions in the theorem prover Isabelle/HOL which can certify the correct application of these techniques in some given termination proof. As there are several variants of usable rules we designed our check function in such a way that it accepts all known variants, even those which are not explicitly spelled out in previous papers. We integrated our formalization in the publicly available IsaFoR-library. This led to a significant increase in the power of CeTA, the corresponding certified termination proof checker that is extracted from IsaFoR.
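
The subterm criterion itself fits in a few lines: project each marked symbol to one argument position and require the projected left-hand side to contain the projected right-hand side as a subterm, strictly for at least one pair in the cycle. The toy check below uses a single invented dependency pair and projection.

# Terms as nested tuples: ("s", ("x",)) is s(x), ("x",) is the variable x.
def subterms(t):
    yield t
    for arg in t[1:]:
        yield from subterms(arg)

def strictly_contains(s, t):
    return s != t and any(u == t for u in subterms(s))

# DP: minus#(s(x), s(y)) -> minus#(x, y), with projection pi(minus#) = 1.
lhs_proj = ("s", ("x",))    # first argument of the left-hand side
rhs_proj = ("x",)           # first argument of the right-hand side
assert strictly_contains(lhs_proj, rhs_proj)   # s(x) |> x: criterion applies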

Cite as

Christian Sternagel and René Thiemann. Certified Subterm Criterion and Certified Usable Rules. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 325-340, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{sternagel_et_al:LIPIcs.RTA.2010.325,
  author =	{Sternagel, Christian and Thiemann, Ren\'{e}},
  title =	{{Certified Subterm Criterion and Certified Usable Rules}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{325--340},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.325},
  URN =		{urn:nbn:de:0030-drops-26611},
  doi =		{10.4230/LIPIcs.RTA.2010.325},
  annote =	{Keywords: Term Rewriting, Certification, Termination, Theorem Proving}
}
Document
Termination of linear bounded term rewriting systems

Authors: Irène Durand, Géraud Sénizergues, and Marc Sylvestre


Abstract
For the whole class of linear term rewriting systems and for each integer k, we define k-bounded rewriting as a restriction of the usual notion of rewriting. We show that the k-bounded uniform termination, k-bounded termination, inverse k-bounded uniform termination, and inverse k-bounded termination problems are decidable. The k-bounded class (BO(k)) is, by definition, the set of linear systems for which every derivation can be replaced by a k-bounded derivation. In general, for BO(k) systems, the uniform (respectively inverse uniform) k-bounded termination problem is not equivalent to the uniform (respectively inverse uniform) termination problem, and the k-bounded (respectively inverse k-bounded) termination problem is not equivalent to the termination (respectively inverse termination) problem. This leads us to define more restricted classes for which these problems are equivalent: the classes BOLP(k) of k-bounded systems that have the length preservation property. By definition, a system is BOLP(k) if every derivation of length n can be replaced by a k-bounded derivation of length n. We define the class BOLP of bounded systems that have the length preservation property as the union of all the BOLP(k) classes. The class BOLP strictly contains several already known classes of systems: the inverse left-basic semi-Thue systems, the linear growing term rewriting systems, the inverse Linear-Finite-Path-Ordering systems, and the strongly bottom-up systems.

Cite as

Irène Durand, Géraud Sénizergues, and Marc Sylvestre. Termination of linear bounded term rewriting systems. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 341-356, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{durand_et_al:LIPIcs.RTA.2010.341,
  author =	{Durand, Ir\`{e}ne and S\'{e}nizergues, G\'{e}raud and Sylvestre, Marc},
  title =	{{Termination of linear bounded term rewriting systems}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{341--356},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.341},
  URN =		{urn:nbn:de:0030-drops-26625},
  doi =		{10.4230/LIPIcs.RTA.2010.341},
  annote =	{Keywords: Term rewriting, termination, rewriting strategy}
}
Document
Polynomially Bounded Matrix Interpretations

Authors: Johannes Waldmann


Abstract
Matrix interpretations can be used to bound the derivational complexity of rewrite systems. We present a criterion that completely characterizes matrix interpretations that are polynomially bounded. It includes the method of upper triangular interpretations as a special case, and we prove that the inclusion is strict. The criterion can be expressed as a finite domain constraint system. It translates to a Boolean constraint system with a size that is polynomial in the dimension of the interpretation. We report on performance of an implementation.
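
The upper triangular special case is easy to observe experimentally: with all diagonal entries at most 1, the entries of the n-th matrix power grow polynomially in n rather than exponentially. A minimal sketch, with the matrix chosen arbitrarily for this illustration:

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(A, n):
    P = A
    for _ in range(n - 1):
        P = mat_mul(P, A)
    return P

A = [[1, 2, 3],
     [0, 1, 4],
     [0, 0, 1]]   # upper triangular, diagonal entries 1

for n in (1, 2, 4, 8, 16):
    print(n, mat_pow(A, n)[0][2])   # 3, 14, 60, 248, 1008: grows as 4n^2 - n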

Cite as

Johannes Waldmann. Polynomially Bounded Matrix Interpretations. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 357-372, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{waldmann:LIPIcs.RTA.2010.357,
  author =	{Waldmann, Johannes},
  title =	{{Polynomially Bounded Matrix Interpretations}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{357--372},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.357},
  URN =		{urn:nbn:de:0030-drops-26637},
  doi =		{10.4230/LIPIcs.RTA.2010.357},
  annote =	{Keywords: Derivational complexity of rewriting, matrix interpretation, weighted automata, ambiguity of automata, finite domain constraints}
}
Document
Optimizing mkbTT

Authors: Sarah Winkler, Haruhiko Sato, Aart Middeldorp, and Masahito Kurihara


Abstract
We describe performance enhancements that have been added to mkbTT, a modern completion tool combining multi-completion with the use of termination tools.

Cite as

Sarah Winkler, Haruhiko Sato, Aart Middeldorp, and Masahito Kurihara. Optimizing mkbTT. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 373-384, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{winkler_et_al:LIPIcs.RTA.2010.373,
  author =	{Winkler, Sarah and Sato, Haruhiko and Middeldorp, Aart and Kurihara, Masahito},
  title =	{{Optimizing mkbTT}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{373--384},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.373},
  URN =		{urn:nbn:de:0030-drops-26643},
  doi =		{10.4230/LIPIcs.RTA.2010.373},
  annote =	{Keywords: Knuth-Bendix completion, termination prover, automated deduction}
}
Document
Modular Complexity Analysis via Relative Complexity

Authors: Harald Zankl and Martin Korp


Abstract
In this paper we introduce a modular framework that makes it possible to infer (feasible) upper bounds on the (derivational) complexity of term rewrite systems by combining different criteria. All current investigations into derivational complexity are based on a single termination proof, possibly preceded by transformations. We prove that the modular framework is strictly more powerful than the conventional setting. Furthermore, the results have been implemented and experiments show significant gains in power.

Cite as

Harald Zankl and Martin Korp. Modular Complexity Analysis via Relative Complexity. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 385-400, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{zankl_et_al:LIPIcs.RTA.2010.385,
  author =	{Zankl, Harald and Korp, Martin},
  title =	{{Modular Complexity Analysis via Relative Complexity}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{385--400},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.385},
  URN =		{urn:nbn:de:0030-drops-26659},
  doi =		{10.4230/LIPIcs.RTA.2010.385},
  annote =	{Keywords: Term rewriting, complexity analysis, relative complexity, derivation length}
}
Document
Proving Productivity in Infinite Data Structures

Authors: Hans Zantema and Matthias Raffelsieper


Abstract
For a general class of infinite data structures including streams, binary trees, and the combination of finite and infinite lists, we investigate the notion of productivity. This generalizes stream productivity. We develop a general technique to prove productivity based on proving context-sensitive termination, by which the power of present termination tools can be exploited. In order to treat cases where the approach does not apply directly, we develop transformations extending the power of the basic approach. We present a tool combining these ingredients that can prove productivity of a wide range of examples fully automatically.
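
Productivity means that every finite prefix of the specified structure can be computed in finitely many steps. A hedged toy illustration with an invented stream specification, read as a Python generator:

from itertools import islice

# Stream rule  alt -> 0 : 1 : alt  as a generator; it is productive
# because each unfolding emits concrete elements before recursing.
def alt():
    while True:
        yield 0
        yield 1

print(list(islice(alt(), 6)))   # [0, 1, 0, 1, 0, 1]

# A non-productive specification would recurse without ever emitting
# an element, e.g. the (invented) rule  bad -> tail(0 : bad).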

Cite as

Hans Zantema and Matthias Raffelsieper. Proving Productivity in Infinite Data Structures. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 401-416, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{zantema_et_al:LIPIcs.RTA.2010.401,
  author =	{Zantema, Hans and Raffelsieper, Matthias},
  title =	{{Proving Productivity in Infinite Data Structures}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{401--416},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.401},
  URN =		{urn:nbn:de:0030-drops-26661},
  doi =		{10.4230/LIPIcs.RTA.2010.401},
  annote =	{Keywords: Productivity, infinite data structures, streams}
}
