12 Search Results for "Avanzini, Martin"


Document
Cutoff Theorems for the Equivalence of Parameterized Quantum Circuits

Authors: Neil J. Ross and Scott Wesley

Published in: LIPIcs, Volume 345, 50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025)


Abstract
Many promising quantum algorithms in economics, medical science, and material science rely on circuits that are parameterized by a large number of angles. To ensure that these algorithms are efficient, these parameterized circuits must be heavily optimized. However, most quantum circuit optimizers are not verified, so this procedure is known to be error-prone. For this reason, there is growing interest in the design of equivalence checking algorithms for parameterized quantum circuits. In this paper, we define a generalized class of parameterized circuits with arbitrary rotations and show that this problem is decidable for cyclotomic gate sets. We propose a cutoff-based procedure which reduces the problem of verifying the equivalence of parameterized quantum circuits to the problem of verifying the equivalence of finitely many parameter-free quantum circuits. Because the number of parameter-free circuits grows exponentially with the number of parameters, we also propose a probabilistic variant of the algorithm for cases when the number of parameters is intractably large. We show that our techniques extend to equivalence modulo global phase, and describe an efficient angle sampling procedure for cyclotomic gate sets.
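
To make the reduction concrete, here is a minimal numpy sketch of the underlying idea: instantiate the parameters at finitely many sample angles and compare the resulting parameter-free circuits. The two example circuits, the Rz gate, and the 8-angle sample grid are illustrative assumptions; the paper establishes the actual cutoff sets for cyclotomic gate sets.

import numpy as np

def rz(theta):
    # single-qubit Z-rotation
    return np.array([[np.exp(-1j * theta / 2), 0],
                     [0, np.exp(1j * theta / 2)]])

def circuit_u(a, b):
    return rz(b) @ rz(a)      # Rz(a) followed by Rz(b)

def circuit_v(a, b):
    return rz(a + b)          # a single merged rotation

# Reduce the parameterized question "U(a, b) = V(a, b) for all angles?"
# to finitely many parameter-free checks over a sample set of angles.
samples = [k * np.pi / 4 for k in range(8)]
equivalent = all(np.allclose(circuit_u(a, b), circuit_v(a, b))
                 for a in samples for b in samples)
print(equivalent)  # True: the two parameterized circuits agree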

Cite as

Neil J. Ross and Scott Wesley. Cutoff Theorems for the Equivalence of Parameterized Quantum Circuits. In 50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 345, pp. 85:1-85:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@InProceedings{ross_et_al:LIPIcs.MFCS.2025.85,
  author =	{Ross, Neil J. and Wesley, Scott},
  title =	{{Cutoff Theorems for the Equivalence of Parameterized Quantum Circuits}},
  booktitle =	{50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025)},
  pages =	{85:1--85:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-388-1},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{345},
  editor =	{Gawrychowski, Pawe{\l} and Mazowiecki, Filip and Skrzypczak, Micha{\l}},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2025.85},
  URN =		{urn:nbn:de:0030-drops-241921},
  doi =		{10.4230/LIPIcs.MFCS.2025.85},
  annote =	{Keywords: Quantum Circuits, Parameterized Equivalence Checking}
}
Document
Deciding Termination of Simple Randomized Loops

Authors: Éléanore Meyer and Jürgen Giesl

Published in: LIPIcs, Volume 345, 50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025)


Abstract
We show that universal positive almost sure termination (UPAST) is decidable for a class of simple randomized programs, i.e., it is decidable whether the expected runtime of such a program is finite for all inputs. Our class contains all programs that consist of a single loop, with a linear loop guard and a loop body composed of two linear commuting and diagonalizable updates. In each iteration of the loop, the update to be carried out is picked at random, according to a fixed probability. We show the decidability of UPAST for this class of programs, where the program’s variables and inputs may range over various sub-semirings of the real numbers. In this way, we extend a line of research initiated by Tiwari in 2004 into the realm of randomized programs.
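
As a rough illustration of the program class and of the UPAST question, the following Python sketch simulates a single randomized loop with a linear guard and two affine updates chosen by a coin flip; the concrete guard, updates, and probability are made-up examples and are not checked against the paper's precise side conditions (commuting, diagonalizable updates).

import random

def randomized_loop(x, p=0.8, max_steps=100_000):
    # One run: while the linear guard holds, flip a biased coin and
    # apply one of the two updates.  max_steps only truncates the simulation.
    steps = 0
    while x > 0 and steps < max_steps:
        if random.random() < p:
            x = x - 1      # update A
        else:
            x = x + 2      # update B
        steps += 1
    return steps

# UPAST asks whether the *expected* number of steps is finite for every
# input; here we only estimate it for one input by sampling.
runs = [randomized_loop(10) for _ in range(10_000)]
print(sum(runs) / len(runs))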

Cite as

Éléanore Meyer and Jürgen Giesl. Deciding Termination of Simple Randomized Loops. In 50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 345, pp. 76:1-76:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@InProceedings{meyer_et_al:LIPIcs.MFCS.2025.76,
  author =	{Meyer, \'{E}l\'{e}anore and Giesl, J\"{u}rgen},
  title =	{{Deciding Termination of Simple Randomized Loops}},
  booktitle =	{50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025)},
  pages =	{76:1--76:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-388-1},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{345},
  editor =	{Gawrychowski, Pawe{\l} and Mazowiecki, Filip and Skrzypczak, Micha{\l}},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2025.76},
  URN =		{urn:nbn:de:0030-drops-241833},
  doi =		{10.4230/LIPIcs.MFCS.2025.76},
  annote =	{Keywords: decision procedures, randomized programs, linear loops, positive almost sure termination}
}
Document
Weighted Rewriting: Semiring Semantics for Abstract Reduction Systems

Authors: Emma Ahrens, Jan-Christoph Kassing, Jürgen Giesl, and Joost-Pieter Katoen

Published in: LIPIcs, Volume 337, 10th International Conference on Formal Structures for Computation and Deduction (FSCD 2025)


Abstract
We present novel semiring semantics for abstract reduction systems (ARSs). More precisely, we provide a weighted version of ARSs, where the reduction steps induce weights from a semiring. Inspired by provenance analysis in database theory and logic, we obtain a formalism that can be used for provenance analysis of arbitrary ARSs. Our semantics handle (possibly unbounded) non-determinism and possibly infinite reductions. Moreover, we develop several techniques to prove upper and lower bounds on the weights resulting from our semantics, and show that in this way one obtains a uniform approach to analyze several different properties, such as termination, derivational complexity, space complexity, and safety, as well as combinations of these properties.
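
The following toy sketch is one possible reading of the idea, under the assumption that step weights combine by semiring multiplication along a reduction and by semiring addition across alternative reductions; the ARS, the weights, and this combination are illustrative guesses, not the paper's definitions (which also cover infinite reductions and unbounded non-determinism).

# A finite, acyclic weighted ARS: each object maps to its one-step
# successors together with a weight per step.
steps = {
    "a": [("b", 1), ("c", 1)],
    "b": [("d", 1)],
    "c": [("d", 1), ("e", 1)],
    "d": [],
    "e": [],
}

def weight(obj, plus, times, one):
    # 'times' combines weights along a reduction, 'plus' across the
    # alternative reductions starting in obj; 'one' is the weight of
    # the empty reduction.
    if not steps[obj]:
        return one
    results = [times(w, weight(succ, plus, times, one))
               for succ, w in steps[obj]]
    acc = results[0]
    for r in results[1:]:
        acc = plus(acc, r)
    return acc

# Max-plus semiring (plus = max, times = +): the weight of "a" is the
# length of its longest reduction, a derivation-height-style quantity.
# Other semirings (e.g. the Boolean one for safety-style properties)
# slot into the same scheme.
print(weight("a", max, lambda x, y: x + y, 0))   # 2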

Cite as

Emma Ahrens, Jan-Christoph Kassing, Jürgen Giesl, and Joost-Pieter Katoen. Weighted Rewriting: Semiring Semantics for Abstract Reduction Systems. In 10th International Conference on Formal Structures for Computation and Deduction (FSCD 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 337, pp. 6:1-6:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@InProceedings{ahrens_et_al:LIPIcs.FSCD.2025.6,
  author =	{Ahrens, Emma and Kassing, Jan-Christoph and Giesl, J\"{u}rgen and Katoen, Joost-Pieter},
  title =	{{Weighted Rewriting: Semiring Semantics for Abstract Reduction Systems}},
  booktitle =	{10th International Conference on Formal Structures for Computation and Deduction (FSCD 2025)},
  pages =	{6:1--6:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-374-4},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{337},
  editor =	{Fern\'{a}ndez, Maribel},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2025.6},
  URN =		{urn:nbn:de:0030-drops-236215},
  doi =		{10.4230/LIPIcs.FSCD.2025.6},
  annote =	{Keywords: Rewriting, Semirings, Semantics, Termination, Verification}
}
Document
Simple Types for Probabilistic Termination

Authors: Willem Heijltjes and Georgina Majury

Published in: LIPIcs, Volume 326, 33rd EACSL Annual Conference on Computer Science Logic (CSL 2025)


Abstract
We present a new typing discipline to guarantee the probability of termination in probabilistic lambda-calculi. The main contribution is its particular naturality and simplicity: our probabilistic types are like simple types, but generated from probabilities as base types, representing a least probability of termination. Simple types are recovered by restricting probabilities to one. Our vehicle is the Probabilistic Event Lambda-Calculus by Dal Lago, Guerrieri, and Heijltjes, which presents a solution to the issue of confluence in probabilistic lambda-calculi. Our probabilistic type system provides an alternative to the solution using counting quantifiers given by Antonelli, Dal Lago, and Pistone for the same calculus. The problem that both type systems address is to give a lower bound on the probability that terms head-normalize. Following the recent Functional Machine Calculus by Heijltjes, our development takes the (simplified) Krivine machine as primary, and proceeds via an extension of the calculus with sequential composition and identity on the machine. Our type system then gives a natural account of termination probability on the Krivine machine, reflected back onto head-normalization for the original calculus. In this way we are able to avoid the use of counting quantifiers, while improving on the termination bounds given by Antonelli, Dal Lago, and Pistone.
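
As a toy illustration of "probability of termination" as a quantitative lower bound (not of the paper's calculus or type system), the sketch below estimates the termination probability of a run built from n independent probabilistic choices, each of which reaches a value with probability p; the exact value p ** n is what a compositional lower bound would give in this simple setting.

import random

def choice(p):
    # A toy probabilistic step: with probability p it reaches a value,
    # with probability 1 - p it diverges (modelled by returning None).
    return "value" if random.random() < p else None

def run_terminates(p, n):
    # A run of n such choices in sequence terminates only if all of them do.
    return all(choice(p) is not None for _ in range(n))

trials = 100_000
estimate = sum(run_terminates(0.9, 3) for _ in range(trials)) / trials
print(estimate, 0.9 ** 3)   # Monte Carlo estimate vs. exact bound 0.729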

Cite as

Willem Heijltjes and Georgina Majury. Simple Types for Probabilistic Termination. In 33rd EACSL Annual Conference on Computer Science Logic (CSL 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 326, pp. 31:1-31:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@InProceedings{heijltjes_et_al:LIPIcs.CSL.2025.31,
  author =	{Heijltjes, Willem and Majury, Georgina},
  title =	{{Simple Types for Probabilistic Termination}},
  booktitle =	{33rd EACSL Annual Conference on Computer Science Logic (CSL 2025)},
  pages =	{31:1--31:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-362-1},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{326},
  editor =	{Endrullis, J\"{o}rg and Schmitz, Sylvain},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CSL.2025.31},
  URN =		{urn:nbn:de:0030-drops-227885},
  doi =		{10.4230/LIPIcs.CSL.2025.31},
  annote =	{Keywords: lambda-calculus, probabilistic termination, simple types}
}
Document
A Rewriting Theory for Quantum λ-Calculus

Authors: Claudia Faggian, Gaetan Lopez, and Benoît Valiron

Published in: LIPIcs, Volume 326, 33rd EACSL Annual Conference on Computer Science Logic (CSL 2025)


Abstract
Quantum lambda calculus has been studied mainly as an idealized programming language: its evaluation essentially corresponds to a deterministic abstract machine. Very little work has been done to develop a rewriting theory for quantum lambda calculus. Recent advances in the theory of probabilistic rewriting give us a way to tackle this task with tools unavailable a decade ago. Our primary focus is on standardization and normalization results.

Cite as

Claudia Faggian, Gaetan Lopez, and Benoît Valiron. A Rewriting Theory for Quantum λ-Calculus. In 33rd EACSL Annual Conference on Computer Science Logic (CSL 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 326, pp. 47:1-47:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@InProceedings{faggian_et_al:LIPIcs.CSL.2025.47,
  author =	{Faggian, Claudia and Lopez, Gaetan and Valiron, Beno\^{i}t},
  title =	{{A Rewriting Theory for Quantum \lambda-Calculus}},
  booktitle =	{33rd EACSL Annual Conference on Computer Science Logic (CSL 2025)},
  pages =	{47:1--47:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-362-1},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{326},
  editor =	{Endrullis, J\"{o}rg and Schmitz, Sylvain},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CSL.2025.47},
  URN =		{urn:nbn:de:0030-drops-228046},
  doi =		{10.4230/LIPIcs.CSL.2025.47},
  annote =	{Keywords: quantum lambda-calculus, probabilistic rewriting, operational semantics, asymptotic normalization, principles of quantum programming languages}
}
Document
Complexity of Acyclic Term Graph Rewriting

Authors: Martin Avanzini and Georg Moser

Published in: LIPIcs, Volume 52, 1st International Conference on Formal Structures for Computation and Deduction (FSCD 2016)


Abstract
Term rewriting has been used as a formal model to reason about the complexity of logic, functional, and imperative programs. In contrast to term rewriting, term graph rewriting permits sharing of common sub-expressions, and consequently is able to capture more closely reasonable implementations of rule-based languages. However, the automated complexity analysis of term graph rewriting has received little to no attention. With this work, we provide first steps towards overcoming this situation. We present adaptations of two prominent complexity techniques from term rewriting, viz., the interpretation method and dependency tuples. Our adaptations are non-trivial, in the sense that they can observe not only term but also graph structures, i.e., they take sharing into account. In turn, the developed methods allow us to more precisely estimate the runtime complexity of programs where sharing of sub-expressions is essential.
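
The effect of sharing that the adapted techniques must observe can already be seen on a single duplicating rule, d(x) -> c(x, x): normalizing d^n(a) as a term tree copies the whole argument at every step, while as a term graph both argument edges point to one shared subgraph. The sketch below compares the two sizes; it is an illustration of the phenomenon, not of the paper's formalism.

def tree_size(n):
    # Normal form of d^n(a) under d(x) -> c(x, x), represented as a tree:
    # every rule application copies its whole argument.
    return 1 if n == 0 else 1 + 2 * tree_size(n - 1)

def dag_size(n):
    # The same normal form as a term graph: each c-node has both argument
    # edges pointing to one shared node, so each step adds a single node.
    return 1 + n

for n in (5, 10, 20):
    print(n, tree_size(n), dag_size(n))
# tree_size grows as 2^(n+1) - 1, dag_size as n + 1: sharing is what lets
# term graph rewriting model reasonable implementations more closely.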

Cite as

Martin Avanzini and Georg Moser. Complexity of Acyclic Term Graph Rewriting. In 1st International Conference on Formal Structures for Computation and Deduction (FSCD 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 52, pp. 10:1-10:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)



@InProceedings{avanzini_et_al:LIPIcs.FSCD.2016.10,
  author =	{Avanzini, Martin and Moser, Georg},
  title =	{{Complexity of Acyclic Term Graph Rewriting}},
  booktitle =	{1st International Conference on Formal Structures for Computation and Deduction (FSCD 2016)},
  pages =	{10:1--10:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-010-1},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{52},
  editor =	{Kesner, Delia and Pientka, Brigitte},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2016.10},
  URN =		{urn:nbn:de:0030-drops-59901},
  doi =		{10.4230/LIPIcs.FSCD.2016.10},
  annote =	{Keywords: Program Analysis, Graph Rewriting, Complexity Analysis}
}
Document
Certification of Complexity Proofs using CeTA

Authors: Martin Avanzini, Christian Sternagel, and René Thiemann

Published in: LIPIcs, Volume 36, 26th International Conference on Rewriting Techniques and Applications (RTA 2015)


Abstract
Nowadays certification is widely employed by automated termination tools for term rewriting, where certifiers support most available techniques. In complexity analysis, the situation is quite different. Although tools support certification in principle, current certifiers implement only the most basic technique, namely, suitably tamed versions of reduction orders. As a consequence, only a small fraction of the proofs generated by state-of-the-art complexity tools can be certified. To improve upon this situation, we formalized a framework for the certification of modular complexity proofs and incorporated it into CeTA. We report on this extension and present the newly supported techniques (match-bounds, weak dependency pairs, dependency tuples, usable rules, and usable replacement maps), resulting in a significant increase in the number of certifiable complexity proofs. During our work we detected conflicts in theoretical results as well as bugs in existing complexity tools.

Cite as

Martin Avanzini, Christian Sternagel, and René Thiemann. Certification of Complexity Proofs using CeTA. In 26th International Conference on Rewriting Techniques and Applications (RTA 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 36, pp. 23-39, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)



@InProceedings{avanzini_et_al:LIPIcs.RTA.2015.23,
  author =	{Avanzini, Martin and Sternagel, Christian and Thiemann, Ren\'{e}},
  title =	{{Certification of Complexity Proofs using CeTA}},
  booktitle =	{26th International Conference on Rewriting Techniques and Applications (RTA 2015)},
  pages =	{23--39},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-85-9},
  ISSN =	{1868-8969},
  year =	{2015},
  volume =	{36},
  editor =	{Fern\'{a}ndez, Maribel},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2015.23},
  URN =		{urn:nbn:de:0030-drops-51875},
  doi =		{10.4230/LIPIcs.RTA.2015.23},
  annote =	{Keywords: complexity analysis, certification, match-bounds, weak dependency pairs, dependency tuples, usable rules, usable replacement maps}
}
Document
On Sharing, Memoization, and Polynomial Time

Authors: Martin Avanzini and Ugo Dal Lago

Published in: LIPIcs, Volume 30, 32nd International Symposium on Theoretical Aspects of Computer Science (STACS 2015)


Abstract
We study how the adoption of an evaluation mechanism with sharing and memoization impacts the class of functions which can be computed in polynomial time. We first show how a natural cost model in which lookup for an already computed result has no cost is indeed invariant. As a corollary, we then prove that the most general notion of ramified recurrence is sound for polynomial time, this way settling an open problem in implicit computational complexity.
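
A minimal Python illustration of the phenomenon the paper analyses (not of its cost model or rewriting formalism): the same tree-recursive definition performs exponentially many calls when evaluated naively, but only linearly many once results are memoized and looked up instead of recomputed.

from functools import lru_cache

calls_naive = 0
def fib_naive(n):
    # No sharing of results: the recursion tree has exponentially many nodes.
    global calls_naive
    calls_naive += 1
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

calls_memo = 0
@lru_cache(maxsize=None)
def fib_memo(n):
    # With memoization, looking up an already computed result is cheap,
    # and only linearly many distinct calls are ever evaluated.
    global calls_memo
    calls_memo += 1
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

fib_naive(25)
fib_memo(25)
print(calls_naive, calls_memo)   # 242785 vs. 26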

Cite as

Martin Avanzini and Ugo Dal Lago. On Sharing, Memoization, and Polynomial Time. In 32nd International Symposium on Theoretical Aspects of Computer Science (STACS 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 30, pp. 62-75, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)



@InProceedings{avanzini_et_al:LIPIcs.STACS.2015.62,
  author =	{Avanzini, Martin and Dal Lago, Ugo},
  title =	{{On Sharing, Memoization, and Polynomial Time}},
  booktitle =	{32nd International Symposium on Theoretical Aspects of Computer Science (STACS 2015)},
  pages =	{62--75},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-78-1},
  ISSN =	{1868-8969},
  year =	{2015},
  volume =	{30},
  editor =	{Mayr, Ernst W. and Ollinger, Nicolas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2015.62},
  URN =		{urn:nbn:de:0030-drops-49042},
  doi =		{10.4230/LIPIcs.STACS.2015.62},
  annote =	{Keywords: implicit computational complexity, data-tiering, polynomial time}
}
Document
A Combination Framework for Complexity

Authors: Martin Avanzini and Georg Moser

Published in: LIPIcs, Volume 21, 24th International Conference on Rewriting Techniques and Applications (RTA 2013)


Abstract
In this paper we present a combination framework for the automated polynomial complexity analysis of term rewrite systems. The framework covers both derivational and runtime complexity analysis, and is employed as the theoretical foundation of the automated complexity tool TCT. We present generalisations of powerful complexity techniques, notably a generalisation of complexity pairs and (weak) dependency pairs. Finally, we also present a novel technique, called dependency graph decomposition, which greatly increases modularity in the dependency pair setting.
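
For readers unfamiliar with the measured quantity: the runtime complexity of a rewrite system bounds the number of rewrite steps needed to normalize basic terms, as a function of their size. The small interpreter below counts innermost steps for the usual addition rules and is purely an illustration of that quantity, not of the combination framework; the count is linear in the first argument.

# Terms over the signature {0, s, plus} as nested tuples.
def step(t):
    # One leftmost-innermost rewrite step for the rules
    #   plus(0, y) -> y      and      plus(s(x), y) -> s(plus(x, y)).
    # Returns (new_term, applied?).
    if t[0] == "0":
        return t, False
    if t[0] == "s":
        sub, applied = step(t[1])
        return ("s", sub), applied
    a, applied = step(t[1])            # t = ("plus", a, b)
    if applied:
        return ("plus", a, t[2]), True
    b, applied = step(t[2])
    if applied:
        return ("plus", t[1], b), True
    if t[1][0] == "0":
        return t[2], True
    return ("s", ("plus", t[1][1], t[2])), True

def runtime(t):
    # Number of rewrite steps until no rule applies any more.
    steps = 0
    t, applied = step(t)
    while applied:
        steps += 1
        t, applied = step(t)
    return steps

def num(n):
    t = ("0",)
    for _ in range(n):
        t = ("s", t)
    return t

print([runtime(("plus", num(m), num(3))) for m in range(5)])   # [1, 2, 3, 4, 5]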

Cite as

Martin Avanzini and Georg Moser. A Combination Framework for Complexity. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 55-70, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)



@InProceedings{avanzini_et_al:LIPIcs.RTA.2013.55,
  author =	{Avanzini, Martin and Moser, Georg},
  title =	{{A Combination Framework for Complexity}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{55--70},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.55},
  URN =		{urn:nbn:de:0030-drops-40534},
  doi =		{10.4230/LIPIcs.RTA.2013.55},
  annote =	{Keywords: program analysis, term rewriting, complexity analysis, automation}
}
Document
Tyrolean Complexity Tool: Features and Usage

Authors: Martin Avanzini and Georg Moser

Published in: LIPIcs, Volume 21, 24th International Conference on Rewriting Techniques and Applications (RTA 2013)


Abstract
The Tyrolean Complexity Tool, TCT for short, is an open source complexity analyser for term rewrite systems. Our tool TCT features a majority of the known techniques for the automated characterisation of polynomial complexity of rewrite systems and can investigate derivational and runtime complexity, for full and innermost rewriting. This system description outlines features and provides a short introduction to the usage of TCT.

Cite as

Martin Avanzini and Georg Moser. Tyrolean Complexity Tool: Features and Usage. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 71-80, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)



@InProceedings{avanzini_et_al:LIPIcs.RTA.2013.71,
  author =	{Avanzini, Martin and Moser, Georg},
  title =	{{Tyrolean Complexity Tool: Features and Usage}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{71--80},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.71},
  URN =		{urn:nbn:de:0030-drops-40540},
  doi =		{10.4230/LIPIcs.RTA.2013.71},
  annote =	{Keywords: program analysis, term rewriting, complexity analysis, automation}
}
Document
A Path Order for Rewrite Systems that Compute Exponential Time Functions

Authors: Martin Avanzini, Naohi Eguchi, and Georg Moser

Published in: LIPIcs, Volume 10, 22nd International Conference on Rewriting Techniques and Applications (RTA'11) (2011)


Abstract
In this paper we present a new path order for rewrite systems, the exponential path order EPO*. If a term rewrite system is compatible with EPO*, then its runtime complexity is bounded from above by an exponential function. Furthermore, the class of functions computed by rewrite systems compatible with EPO* equals the class of functions computable in exponential time on a Turing machine.
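
As a concrete (made-up) instance of the kind of bound involved, consider the rules e(0) -> s(0), e(s(x)) -> d(e(x)), d(0) -> 0, d(s(x)) -> s(s(d(x))), which compute 2^n on unary numbers. The sketch below models this system in Python and counts its rewrite steps; both the computed value and the step count grow exponentially in n, which is the shape of runtime bound that compatibility with an order like EPO* certifies.

def double_steps(v):
    # d(0) -> 0, d(s(x)) -> s(s(d(x))): doubling s^v(0) takes v + 1 steps.
    return 2 * v, v + 1

def exp_steps(n):
    # e(0) -> s(0), e(s(x)) -> d(e(x)): computes 2^n on unary numbers.
    if n == 0:
        return 1, 1
    v, steps = exp_steps(n - 1)
    w, extra = double_steps(v)
    return w, steps + 1 + extra

for n in range(6):
    value, steps = exp_steps(n)
    print(n, value, steps)
# value = 2^n, and the number of rewrite steps itself grows exponentially in n.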

Cite as

Martin Avanzini, Naohi Eguchi, and Georg Moser. A Path Order for Rewrite Systems that Compute Exponential Time Functions. In 22nd International Conference on Rewriting Techniques and Applications (RTA'11). Leibniz International Proceedings in Informatics (LIPIcs), Volume 10, pp. 123-138, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2011)



@InProceedings{avanzini_et_al:LIPIcs.RTA.2011.123,
  author =	{Avanzini, Martin and Eguchi, Naohi and Moser, Georg},
  title =	{{A Path Order for Rewrite Systems that Compute Exponential Time Functions}},
  booktitle =	{22nd International Conference on Rewriting Techniques and Applications (RTA'11)},
  pages =	{123--138},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-30-9},
  ISSN =	{1868-8969},
  year =	{2011},
  volume =	{10},
  editor =	{Schmidt-Schauss, Manfred},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2011.123},
  URN =		{urn:nbn:de:0030-drops-31127},
  doi =		{10.4230/LIPIcs.RTA.2011.123},
  annote =	{Keywords: Runtime Complexity, Exponential Time Functions, Implicit Computational Complexity}
}
Document
Closing the Gap Between Runtime Complexity and Polytime Computability

Authors: Martin Avanzini and Georg Moser

Published in: LIPIcs, Volume 6, Proceedings of the 21st International Conference on Rewriting Techniques and Applications (2010)


Abstract
In earlier work, we have shown that for confluent TRSs, innermost polynomial runtime complexity induces polytime computability of the functions defined. In this paper, we generalise this result to full rewriting; to this end, we exploit graph rewriting. We give a new proof of the adequacy of graph rewriting for full rewriting that allows for a precise control of the resources copied. In sum, we completely describe an implementation of rewriting on a Turing machine (TM for short). We show that the runtime complexity of the TRS and the runtime complexity of the TM are polynomially related. Our result strengthens the evidence that the complexity of a rewrite system is truthfully represented through the length of derivations. Moreover, our result allows the classification of nondeterministic polytime computation based on runtime complexity analysis of rewrite systems.

Cite as

Martin Avanzini and Georg Moser. Closing the Gap Between Runtime Complexity and Polytime Computability. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 33-48, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)



@InProceedings{avanzini_et_al:LIPIcs.RTA.2010.33,
  author =	{Avanzini, Martin and Moser, Georg},
  title =	{{Closing the Gap Between Runtime Complexity and Polytime Computability}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{33--48},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.33},
  URN =		{urn:nbn:de:0030-drops-26436},
  doi =		{10.4230/LIPIcs.RTA.2010.33},
  annote =	{Keywords: Term rewriting, graph rewriting, complexity analysis, polytime computability}
}
  • Refine by Type
  • 12 Document/PDF
  • 5 Document/HTML

  • Refine by Publication Year
  • 5 2025
  • 1 2016
  • 2 2015
  • 2 2013
  • 1 2011

  • Refine by Author
  • 7 Avanzini, Martin
  • 5 Moser, Georg
  • 2 Giesl, Jürgen
  • 1 Ahrens, Emma
  • 1 Dal Lago, Ugo

  • Refine by Series/Journal
  • 12 LIPIcs

  • Refine by Classification
  • 2 Theory of computation → Equational logic and rewriting
  • 2 Theory of computation → Probabilistic computation
  • 1 Hardware → Equivalence checking
  • 1 Hardware → Quantum computation
  • 1 Theory of computation → Lambda calculus

  • Refine by Keyword
  • 4 complexity analysis
  • 2 automation
  • 2 program analysis
  • 2 term rewriting
  • 1 Complexity Analysis
