Search Results

Documents authored by Moser, Georg


Document
On Complexity of Confluence and Church-Rosser Proofs

Authors: Arnold Beckmann and Georg Moser

Published in: LIPIcs, Volume 306, 49th International Symposium on Mathematical Foundations of Computer Science (MFCS 2024)


Abstract
In this paper, we investigate confluence and the Church-Rosser property - two well-studied properties of rewriting and the λ-calculus - from the viewpoint of proof complexity. With respect to confluence, and focusing on orthogonal term rewrite systems, our main contribution is that the size, measured in number of symbols, of the smallest rewrite proof is polynomial in the size of the peak. For the Church-Rosser property we obtain exponential lower bounds on the size of the join in terms of the size of the equality proof. Finally, we study the complexity of proving confluence in the context of the λ-calculus. Here, we establish an exponential (worst-case) lower bound on the size of the join in terms of the size of the peak.
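
To make the terminology concrete (a minimal illustration of our own, not an example from the paper): in the orthogonal rewrite system with the two rules

  f(x) -> g(x)        a -> b

the term f(a) admits the peak g(a) <- f(a) -> f(b), which is closed by the valley g(a) -> g(b) <- f(b). The first result above guarantees that, for orthogonal systems, a joining valley whose size is polynomial in the size of the peak always exists.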

Cite as

Arnold Beckmann and Georg Moser. On Complexity of Confluence and Church-Rosser Proofs. In 49th International Symposium on Mathematical Foundations of Computer Science (MFCS 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 306, pp. 21:1-21:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{beckmann_et_al:LIPIcs.MFCS.2024.21,
  author =	{Beckmann, Arnold and Moser, Georg},
  title =	{{On Complexity of Confluence and Church-Rosser Proofs}},
  booktitle =	{49th International Symposium on Mathematical Foundations of Computer Science (MFCS 2024)},
  pages =	{21:1--21:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-335-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{306},
  editor =	{Kr\'{a}lovi\v{c}, Rastislav and Ku\v{c}era, Anton{\'\i}n},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2024.21},
  URN =		{urn:nbn:de:0030-drops-205774},
  doi =		{10.4230/LIPIcs.MFCS.2024.21},
  annote =	{Keywords: logic, bounded arithmetic, consistency, rewriting}
}
Document
α-Avoidance

Authors: Samuel Frontull, Georg Moser, and Vincent van Oostrom

Published in: LIPIcs, Volume 260, 8th International Conference on Formal Structures for Computation and Deduction (FSCD 2023)


Abstract
When substitutions and bindings interact, there is a risk of undesired side effects if the substitution is applied naïvely. The λ-calculus captures this phenomenon concretely, as β-reduction may require the renaming of bound variables to avoid variable capture. In this paper we introduce α-paths as an estimation for α-avoidance, roughly expressing that α-conversions are not required to prevent variable capture. These paths provide a novel method to analyse and predict the potential need for α-conversion in different calculi. In particular, we show how α-paths characterise α-avoidance for several sub-calculi of the λ-calculus, namely (i) developments, (ii) affine/linear λ-calculi, (iii) the weak λ-calculus, (iv) μ-unfolding, and (v) the safe λ-calculus. Furthermore, we study the unavoidability of α-conversions in untyped and simply-typed λ-calculi and prove undecidability of the need for α-conversions for leftmost-outermost reductions in the untyped λ-calculus. To ease the work with α-paths, we have implemented the method and the tool is publicly available.
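
A standard illustration of the phenomenon (not specific to this paper): naïvely β-reducing the term (λx.λy.x) y by textual substitution yields λy.y, capturing the free occurrence of y; capture-avoiding reduction first α-renames the binder, e.g. to λz.y. α-paths are meant to predict, for a given (sub)calculus and term, whether such renamings can ever become necessary.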

Cite as

Samuel Frontull, Georg Moser, and Vincent van Oostrom. α-Avoidance. In 8th International Conference on Formal Structures for Computation and Deduction (FSCD 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 260, pp. 22:1-22:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{frontull_et_al:LIPIcs.FSCD.2023.22,
  author =	{Frontull, Samuel and Moser, Georg and van Oostrom, Vincent},
  title =	{{\alpha-Avoidance}},
  booktitle =	{8th International Conference on Formal Structures for Computation and Deduction (FSCD 2023)},
  pages =	{22:1--22:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-277-8},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{260},
  editor =	{Gaboardi, Marco and van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2023.22},
  URN =		{urn:nbn:de:0030-drops-180068},
  doi =		{10.4230/LIPIcs.FSCD.2023.22},
  annote =	{Keywords: \lambda-calculus, variable capture, \alpha-conversion, developments, safe \lambda-calculus, undecidability}
}
Document
Invited Talk
Uniform Resource Analysis by Rewriting: Strenghts and Weaknesses (Invited Talk)

Authors: Georg Moser

Published in: LIPIcs, Volume 84, 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)


Abstract
In this talk, I'll describe how rewriting techniques can be successfully employed to build state-of-the-art automated resource analysis tools which compare favourably to other approaches. Furthermore, I'll sketch the genesis of a uniform framework for resource analysis, emphasising success stories without hiding intricate weaknesses. The talk ends with a discussion of open problems.

Cite as

Georg Moser. Uniform Resource Analysis by Rewriting: Strenghts and Weaknesses (Invited Talk). In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 2:1-2:10, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{moser:LIPIcs.FSCD.2017.2,
  author =	{Moser, Georg},
  title =	{{Uniform Resource Analysis by Rewriting: Strenghts and Weaknesses}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{2:1--2:10},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.2},
  URN =		{urn:nbn:de:0030-drops-77438},
  doi =		{10.4230/LIPIcs.FSCD.2017.2},
  annote =	{Keywords: resource analysis, term rewriting, automation}
}
Document
Complexity of Acyclic Term Graph Rewriting

Authors: Martin Avanzini and Georg Moser

Published in: LIPIcs, Volume 52, 1st International Conference on Formal Structures for Computation and Deduction (FSCD 2016)


Abstract
Term rewriting has been used as a formal model to reason about the complexity of logic, functional, and imperative programs. In contrast to term rewriting, term graph rewriting permits sharing of common sub-expressions, and consequently captures reasonable implementations of rule-based languages more closely. However, the automated complexity analysis of term graph rewriting has received little to no attention. With this work, we provide first steps towards overcoming this situation. We present adaptations of two prominent complexity techniques from term rewriting, viz. the interpretation method and dependency tuples. Our adaptations are non-trivial, in the sense that they can observe not only term but also graph structures, i.e. take sharing into account. In turn, the developed methods allow us to more precisely estimate the runtime complexity of programs where sharing of sub-expressions is essential.
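
A small illustration of why sharing matters (our example, not taken from the paper): with the rule dbl(x) -> plus(x, x), rewriting dbl(t) on terms duplicates the (possibly large) argument t, whereas on term graphs both argument positions of plus point to a single shared node for t, so any further rewriting of t is performed only once. Complexity methods that observe the graph structure can exploit exactly this to obtain tighter runtime bounds.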

Cite as

Martin Avanzini and Georg Moser. Complexity of Acyclic Term Graph Rewriting. In 1st International Conference on Formal Structures for Computation and Deduction (FSCD 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 52, pp. 10:1-10:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{avanzini_et_al:LIPIcs.FSCD.2016.10,
  author =	{Avanzini, Martin and Moser, Georg},
  title =	{{Complexity of Acyclic Term Graph Rewriting}},
  booktitle =	{1st International Conference on Formal Structures for Computation and Deduction (FSCD 2016)},
  pages =	{10:1--10:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-010-1},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{52},
  editor =	{Kesner, Delia and Pientka, Brigitte},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2016.10},
  URN =		{urn:nbn:de:0030-drops-59901},
  doi =		{10.4230/LIPIcs.FSCD.2016.10},
  annote =	{Keywords: Program Analysis, Graph Rewriting, Complexity Analysis}
}
Document
Leftmost Outermost Revisited

Authors: Nao Hirokawa, Aart Middeldorp, and Georg Moser

Published in: LIPIcs, Volume 36, 26th International Conference on Rewriting Techniques and Applications (RTA 2015)


Abstract
We present an elementary proof of the classical result that the leftmost outermost strategy is normalizing for left-normal orthogonal rewrite systems. Our proof is local and extends to hyper-normalization and weakly orthogonal systems. Based on the new proof, we study basic normalization, i.e., we study normalization if the set of considered starting terms is restricted to basic terms. This allows us to weaken the left-normality restriction. We show that the leftmost outermost strategy is hyper-normalizing for basically left-normal orthogonal rewrite systems. This shift of focus greatly extends the applicability of the classical result, as evidenced by the experimental data provided.
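
For intuition (a textbook-style example, not taken from the paper): in the orthogonal, left-normal system with the rules f(x) -> a and b -> b, the term f(b) is normalised by the leftmost outermost strategy in a single step, f(b) -> a, whereas any innermost strategy loops forever on the redex b. Normalisation of a strategy means precisely that it reaches a normal form whenever one exists.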

Cite as

Nao Hirokawa, Aart Middeldorp, and Georg Moser. Leftmost Outermost Revisited. In 26th International Conference on Rewriting Techniques and Applications (RTA 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 36, pp. 209-222, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


BibTeX

@InProceedings{hirokawa_et_al:LIPIcs.RTA.2015.209,
  author =	{Hirokawa, Nao and Middeldorp, Aart and Moser, Georg},
  title =	{{Leftmost Outermost Revisited}},
  booktitle =	{26th International Conference on Rewriting Techniques and Applications (RTA 2015)},
  pages =	{209--222},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-85-9},
  ISSN =	{1868-8969},
  year =	{2015},
  volume =	{36},
  editor =	{Fern\'{a}ndez, Maribel},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2015.209},
  URN =		{urn:nbn:de:0030-drops-51986},
  doi =		{10.4230/LIPIcs.RTA.2015.209},
  annote =	{Keywords: term rewriting, strategies, normalization}
}
Document
Multivariate Amortised Resource Analysis for Term Rewrite Systems

Authors: Martin Hofmann and Georg Moser

Published in: LIPIcs, Volume 38, 13th International Conference on Typed Lambda Calculi and Applications (TLCA 2015)


Abstract
We study amortised resource analysis in the context of term rewrite systems. We introduce a novel amortised analysis based on the potential method. The method is represented in an inference system akin to a type system and gives rise to polynomial bounds on the innermost runtime complexity of the analysed rewrite system. The crucial feature of the inference system is the admittance of multivariate bounds in the context of arbitrary data structures in a completely uniform way. This extends our earlier univariate resource analysis of typed term rewrite systems and continues our program of applying automated amortised resource analysis to rewriting.
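
To recall the potential method in its simplest form (a generic, self-contained sketch; this is not the type-based inference system of the paper, and the queue example is our own): assign each data structure a non-negative potential and charge each operation its actual cost plus the change in potential. In Haskell:

-- Amortised O(1) queue via the potential method (illustrative sketch).
-- Potential of a queue = length of the rear list: every enqueue pays
-- one unit of potential in advance, and the occasional reversal in
-- dequeue is paid for by the potential accumulated so far.
data Queue a = Queue [a] [a]   -- front list, rear list (rear stored reversed)

empty :: Queue a
empty = Queue [] []

enqueue :: a -> Queue a -> Queue a
enqueue x (Queue f r) = Queue f (x : r)          -- actual cost 1, potential +1

dequeue :: Queue a -> Maybe (a, Queue a)
dequeue (Queue [] [])     = Nothing
dequeue (Queue [] r)      = dequeue (Queue (reverse r) [])   -- cost |r|, potential -|r|
dequeue (Queue (x : f) r) = Just (x, Queue f r)              -- actual cost 1

The contribution of the paper lifts this style of accounting to term rewrite systems, with multivariate potentials attached to arbitrary data structures.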

Cite as

Martin Hofmann and Georg Moser. Multivariate Amortised Resource Analysis for Term Rewrite Systems. In 13th International Conference on Typed Lambda Calculi and Applications (TLCA 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 38, pp. 241-256, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


BibTeX

@InProceedings{hofmann_et_al:LIPIcs.TLCA.2015.241,
  author =	{Hofmann, Martin and Moser, Georg},
  title =	{{Multivariate Amortised Resource Analysis for Term Rewrite Systems}},
  booktitle =	{13th International Conference on Typed Lambda Calculi and Applications (TLCA 2015)},
  pages =	{241--256},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-87-3},
  ISSN =	{1868-8969},
  year =	{2015},
  volume =	{38},
  editor =	{Altenkirch, Thorsten},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.TLCA.2015.241},
  URN =		{urn:nbn:de:0030-drops-51675},
  doi =		{10.4230/LIPIcs.TLCA.2015.241},
  annote =	{Keywords: program analysis, amortised analysis, term rewriting, multivariate bounds}
}
Document
The Structure of Interaction

Authors: Stéphane Gimenez and Georg Moser

Published in: LIPIcs, Volume 23, Computer Science Logic 2013 (CSL 2013)


Abstract
Interaction nets form a local and strongly confluent model of computation that is per se parallel. We introduce a Curry–Howard correspondence between well-formed interaction nets and a deep-inference deduction system based on linear logic. In particular, linear logic itself is easily expressed in the system and its computational aspects materialise through the correspondence. The system of interaction nets obtained is a typed variant of the already well-known sharing graphs. Due to a strong confluence property, strong normalisation for this system follows from weak normalisation. The latter is obtained via an adaptation of Girard's reducibility method. The approach is modular, readily gives rise to generalisations (e.g. second order, known as polymorphism to the programmer) and could therefore be extended to various systems of interaction nets.

Cite as

Stéphane Gimenez and Georg Moser. The Structure of Interaction. In Computer Science Logic 2013 (CSL 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 23, pp. 316-331, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{gimenez_et_al:LIPIcs.CSL.2013.316,
  author =	{Gimenez, St\'{e}phane and Moser, Georg},
  title =	{{The Structure of Interaction}},
  booktitle =	{Computer Science Logic 2013 (CSL 2013)},
  pages =	{316--331},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-60-6},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{23},
  editor =	{Ronchi Della Rocca, Simona},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CSL.2013.316},
  URN =		{urn:nbn:de:0030-drops-42052},
  doi =		{10.4230/LIPIcs.CSL.2013.316},
  annote =	{Keywords: Interaction Nets, Linear Logic, Curry–Howard Correspondence, Deep Inference, Calculus of Structures, Strong Normalisation, Reducibility}
}
Document
A Combination Framework for Complexity

Authors: Martin Avanzini and Georg Moser

Published in: LIPIcs, Volume 21, 24th International Conference on Rewriting Techniques and Applications (RTA 2013)


Abstract
In this paper we present a combination framework for the automated polynomial complexity analysis of term rewrite systems. The framework covers both derivational and runtime complexity analysis, and is employed as theoretical foundation in the automated complexity tool TCT. We present generalisations of powerful complexity techniques, notably a generalisation of complexity pairs and (weak) dependency pairs. Finally, we also present a novel technique, called dependency graph decomposition, that in the dependency pair setting greatly increases modularity.

Cite as

Martin Avanzini and Georg Moser. A Combination Framework for Complexity. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 55-70, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{avanzini_et_al:LIPIcs.RTA.2013.55,
  author =	{Avanzini, Martin and Moser, Georg},
  title =	{{A Combination Framework for Complexity}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{55--70},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.55},
  URN =		{urn:nbn:de:0030-drops-40534},
  doi =		{10.4230/LIPIcs.RTA.2013.55},
  annote =	{Keywords: program analysis, term rewriting, complexity analysis, automation}
}
Document
Tyrolean Complexity Tool: Features and Usage

Authors: Martin Avanzini and Georg Moser

Published in: LIPIcs, Volume 21, 24th International Conference on Rewriting Techniques and Applications (RTA 2013)


Abstract
The Tyrolean Complexity Tool, TCT for short, is an open source complexity analyser for term rewrite systems. Our tool TCT features a majority of the known techniques for the automated characterisation of polynomial complexity of rewrite systems and can investigate derivational and runtime complexity, for full and innermost rewriting. This system description outlines features and provides a short introduction to the usage of TCT.
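
To give a flavour of the input (our own example; we do not reproduce the exact command-line invocation here): rewrite systems are commonly written in the plain textual format of the termination and complexity competitions, for instance

(VAR x y)
(RULES
  plus(0, y) -> y
  plus(s(x), y) -> s(plus(x, y))
)

For this system the innermost runtime complexity is linear, and a tool such as TCT is designed to find and certify a polynomial bound of this kind automatically.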

Cite as

Martin Avanzini and Georg Moser. Tyrolean Complexity Tool: Features and Usage. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 71-80, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{avanzini_et_al:LIPIcs.RTA.2013.71,
  author =	{Avanzini, Martin and Moser, Georg},
  title =	{{Tyrolean Complexity Tool: Features and Usage}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{71--80},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.71},
  URN =		{urn:nbn:de:0030-drops-40540},
  doi =		{10.4230/LIPIcs.RTA.2013.71},
  annote =	{Keywords: program analysis, term rewriting, complexity analysis, automation}
}
Document
A Path Order for Rewrite Systems that Compute Exponential Time Functions

Authors: Martin Avanzini, Naohi Eguchi, and Georg Moser

Published in: LIPIcs, Volume 10, 22nd International Conference on Rewriting Techniques and Applications (RTA'11) (2011)


Abstract
In this paper we present a new path order for rewrite systems, the exponential path order EPO*. If a term rewrite system is compatible with EPO*, then its runtime complexity is bounded from above by an exponential function. Furthermore, the class of functions computed by rewrite systems compatible with EPO* equals the class of functions computable in exponential time on a Turing machine.
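
As an illustration of the intended scope (a standard example, not claimed to be taken from the paper): the rewrite system

  exp(0) -> s(0)            exp(s(x)) -> d(exp(x))
  d(0) -> 0                 d(s(x)) -> s(s(d(x)))

computes n |-> 2^n on numerals and has exponential runtime complexity; termination orders such as EPO* are designed to certify exactly such exponential bounds.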

Cite as

Martin Avanzini, Naohi Eguchi, and Georg Moser. A Path Order for Rewrite Systems that Compute Exponential Time Functions. In 22nd International Conference on Rewriting Techniques and Applications (RTA'11). Leibniz International Proceedings in Informatics (LIPIcs), Volume 10, pp. 123-138, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2011)


BibTeX

@InProceedings{avanzini_et_al:LIPIcs.RTA.2011.123,
  author =	{Avanzini, Martin and Eguchi, Naohi and Moser, Georg},
  title =	{{A Path Order for Rewrite Systems that Compute Exponential Time Functions}},
  booktitle =	{22nd International Conference on Rewriting Techniques and Applications (RTA'11)},
  pages =	{123--138},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-30-9},
  ISSN =	{1868-8969},
  year =	{2011},
  volume =	{10},
  editor =	{Schmidt-Schauss, Manfred},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2011.123},
  URN =		{urn:nbn:de:0030-drops-31127},
  doi =		{10.4230/LIPIcs.RTA.2011.123},
  annote =	{Keywords: Runtime Complexity, Exponential Time Functions, Implicit Computational Complexity}
}
Document
Termination Proofs in the Dependency Pair Framework May Induce Multiple Recursive Derivational Complexity

Authors: Georg Moser and Andreas Schnabl

Published in: LIPIcs, Volume 10, 22nd International Conference on Rewriting Techniques and Applications (RTA'11) (2011)


Abstract
We study the complexity of rewrite systems shown terminating via the dependency pair framework using processors for reduction pairs, dependency graphs, or the subterm criterion. The complexity of such systems is bounded by a multiple recursive function, provided the complexity induced by the employed base techniques is at most multiple recursive. Moreover this upper bound is tight.

Cite as

Georg Moser and Andreas Schnabl. Termination Proofs in the Dependency Pair Framework May Induce Multiple Recursive Derivational Complexity. In 22nd International Conference on Rewriting Techniques and Applications (RTA'11). Leibniz International Proceedings in Informatics (LIPIcs), Volume 10, pp. 235-250, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2011)


BibTeX

@InProceedings{moser_et_al:LIPIcs.RTA.2011.235,
  author =	{Moser, Georg and Schnabl, Andreas},
  title =	{{Termination Proofs in the Dependency Pair Framework May Induce Multiple Recursive Derivational Complexity}},
  booktitle =	{22nd International Conference on Rewriting Techniques and Applications (RTA'11)},
  pages =	{235--250},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-30-9},
  ISSN =	{1868-8969},
  year =	{2011},
  volume =	{10},
  editor =	{Schmidt-Schauss, Manfred},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2011.235},
  URN =		{urn:nbn:de:0030-drops-31208},
  doi =		{10.4230/LIPIcs.RTA.2011.235},
  annote =	{Keywords: Complexity, DP Framework, Multiple Recursive Functions}
}
Document
Closing the Gap Between Runtime Complexity and Polytime Computability

Authors: Martin Avanzini and Georg Moser

Published in: LIPIcs, Volume 6, Proceedings of the 21st International Conference on Rewriting Techniques and Applications (2010)


Abstract
In earlier work, we have shown that for confluent TRSs, innermost polynomial runtime complexity induces polytime computability of the functions defined. In this paper, we generalise this result to full rewriting; to this end, we exploit graph rewriting. We give a new proof of the adequacy of graph rewriting for full rewriting that allows for a precise control of the resources copied. In sum, we completely describe an implementation of rewriting on a Turing machine (TM for short). We show that the runtime complexity of the TRS and the runtime complexity of the TM are polynomially related. Our result strengthens the evidence that the complexity of a rewrite system is truthfully represented through the length of derivations. Moreover, our result allows the classification of nondeterministic polytime computation based on runtime complexity analysis of rewrite systems.

Cite as

Martin Avanzini and Georg Moser. Closing the Gap Between Runtime Complexity and Polytime Computability. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 33-48, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{avanzini_et_al:LIPIcs.RTA.2010.33,
  author =	{Avanzini, Martin and Moser, Georg},
  title =	{{Closing the Gap Between Runtime Complexity and Polytime Computability}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{33--48},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.33},
  URN =		{urn:nbn:de:0030-drops-26436},
  doi =		{10.4230/LIPIcs.RTA.2010.33},
  annote =	{Keywords: Term rewriting, graph rewriting, complexity analysis, polytime computability}
}
Document
Complexity Analysis of Term Rewriting Based on Matrix and Context Dependent Interpretations

Authors: Georg Moser, Andreas Schnabl, and Johannes Waldmann

Published in: LIPIcs, Volume 2, IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (2008)


Abstract
For a given (terminating) term rewriting system one can often estimate its derivational complexity indirectly by looking at the proof method that established termination. In this spirit we investigate two instances of the interpretation method: matrix interpretations and context dependent interpretations. We introduce a subclass of matrix interpretations, denoted as triangular matrix interpretations, which induce polynomial derivational complexity, and establish tight correspondence results between a subclass of context dependent interpretations and restricted triangular matrix interpretations. The new results thus obtained are easy to implement and considerably extend the analytic power of existing results. We provide ample numerical data for assessing the viability of the method.
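
To illustrate triangular matrix interpretations (a standard example, not necessarily the one used in the paper): for the single rule a(b(x)) -> b(a(x)), interpret terms by pairs of naturals via

  [a](x1, x2) = (x1 + x2, x2)        [b](x1, x2) = (x1, x2 + 1)

Then [a(b(x))] = (x1 + x2 + 1, x2 + 1) exceeds [b(a(x))] = (x1 + x2, x2 + 1) strictly in the first component and weakly in the second, so the interpretation is compatible with the rule. Both linear maps arise from upper triangular matrices with all diagonal entries equal to 1; hence the interpretation is triangular and certifies polynomial (here at most quadratic) derivational complexity.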

Cite as

Georg Moser, Andreas Schnabl, and Johannes Waldmann. Complexity Analysis of Term Rewriting Based on Matrix and Context Dependent Interpretations. In IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science. Leibniz International Proceedings in Informatics (LIPIcs), Volume 2, pp. 304-315, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{moser_et_al:LIPIcs.FSTTCS.2008.1762,
  author =	{Moser, Georg and Schnabl, Andreas and Waldmann, Johannes},
  title =	{{Complexity Analysis of Term Rewriting Based on Matrix and Context Dependent Interpretations}},
  booktitle =	{IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science},
  pages =	{304--315},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-08-8},
  ISSN =	{1868-8969},
  year =	{2008},
  volume =	{2},
  editor =	{Hariharan, Ramesh and Mukund, Madhavan and Vinay, V},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2008.1762},
  URN =		{urn:nbn:de:0030-drops-17626},
  doi =		{10.4230/LIPIcs.FSTTCS.2008.1762},
  annote =	{Keywords: Term Rewriting, Derivational Complexity, Termination, Automation}
}