9 Search Results for "Sternagel, Christian"


Document
Z; Syntax-Free Developments

Authors: Vincent van Oostrom

Published in: LIPIcs, Volume 195, 6th International Conference on Formal Structures for Computation and Deduction (FSCD 2021)


Abstract
We present the Z-property and instantiate it to various rewrite systems: associativity, positive braids, self-distributivity, the lambda-calculus, lambda-calculi with explicit substitutions, orthogonal TRSs, .... The Z-property is proven equivalent to Takahashi’s angle property by means of a syntax-free notion of development. We show that several classical consequences of having developments, such as confluence, normalisation, and recurrence, can be regained in a syntax-free way, and investigate how the notion corresponds to the classical syntactic notion of development in term rewriting.
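
For context, the Z-property itself admits a compact statement; the following formulation is the standard one from the literature (notation ours, not taken from the paper):

    A rewrite relation → on a set A has the Z-property if there is a map • : A → A
    (the "bullet" map) such that for all a, b ∈ A:

        a → b   implies   b →* a•   and   a• →* b•

So every one-step reduct of a can still reach a•, and the bullet map is monotone with respect to →*. The property owes its name to the Z-shaped diagram formed by a → b, b →* a•, and a• →* b•, and it immediately yields confluence.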

Cite as

Vincent van Oostrom. Z; Syntax-Free Developments. In 6th International Conference on Formal Structures for Computation and Deduction (FSCD 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 195, pp. 24:1-24:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{vanoostrom:LIPIcs.FSCD.2021.24,
  author =	{van Oostrom, Vincent},
  title =	{{Z; Syntax-Free Developments}},
  booktitle =	{6th International Conference on Formal Structures for Computation and Deduction (FSCD 2021)},
  pages =	{24:1--24:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-191-7},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{195},
  editor =	{Kobayashi, Naoki},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2021.24},
  URN =		{urn:nbn:de:0030-drops-142620},
  doi =		{10.4230/LIPIcs.FSCD.2021.24},
  annote =	{Keywords: rewrite system, confluence, normalisation, recurrence}
}
Document
Invited Talk
Certifying the Weighted Path Order (Invited Talk)

Authors: René Thiemann, Jonas Schöpf, Christian Sternagel, and Akihisa Yamada

Published in: LIPIcs, Volume 167, 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)


Abstract
The weighted path order (WPO) unifies and extends several termination proving techniques that are known in term rewriting. Consequently, the first tool implementing WPO could prove termination of rewrite systems for which all previous tools failed. However, we should not blindly trust such results, since there might be problems with the implementation or the paper proof of WPO. In this work, we increase the reliability of these automatically generated proofs. To this end, we first formally prove the properties of WPO in Isabelle/HOL, and then develop a verified algorithm to certify termination proofs that are generated by tools using WPO. We also include support for max-polynomial interpretations, an important ingredient in WPO. Here we establish a connection to an existing verified SMT solver. Moreover, we extend the termination tools NaTT and TTT2, so that they can now generate certifiable WPO proofs.
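
As a rough reminder (simplified from the literature; the formalized definition additionally involves status functions and further side conditions), WPO compares terms first by a weakly monotone algebra A and, on ties, recursively in the style of a path order:

    s ≻_WPO t  iff
      (1) s >_A t, or
      (2) s ≥_A t and either
          (a) some argument sᵢ of s satisfies sᵢ ≽_WPO t, or
          (b) t = g(t₁, …, tₘ), s ≻_WPO tⱼ for every j, and the root of s is greater
              than g in the precedence (or the roots are equivalent and the arguments
              decrease lexicographically).

Roughly, suitable choices of the algebra recover well-known instances such as LPO and KBO, which is the sense in which WPO "unifies and extends" earlier techniques.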

Cite as

René Thiemann, Jonas Schöpf, Christian Sternagel, and Akihisa Yamada. Certifying the Weighted Path Order (Invited Talk). In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 4:1-4:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{thiemann_et_al:LIPIcs.FSCD.2020.4,
  author =	{Thiemann, Ren\'{e} and Sch\"{o}pf, Jonas and Sternagel, Christian and Yamada, Akihisa},
  title =	{{Certifying the Weighted Path Order}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{4:1--4:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.4},
  URN =		{urn:nbn:de:0030-drops-123263},
  doi =		{10.4230/LIPIcs.FSCD.2020.4},
  annote =	{Keywords: certification, Isabelle/HOL, reduction order, termination analysis}
}
Document
Infinite Runs in Abstract Completion

Authors: Nao Hirokawa, Aart Middeldorp, Christian Sternagel, and Sarah Winkler

Published in: LIPIcs, Volume 84, 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)


Abstract
Completion is one of the first and most studied techniques in term rewriting and fundamental to automated reasoning with equalities. In an earlier paper we presented a new and formalized correctness proof of abstract completion for finite runs. In this paper we extend our analysis and our formalization to infinite runs, resulting in a new proof that fair infinite runs produce complete presentations of the initial equations. We further consider ordered completion - an important extension of completion that aims to produce ground-complete presentations of the initial equations. Moreover, we revisit and extend results of Métivier concerning canonicity of rewrite systems. All proofs presented in the paper have been formalized in Isabelle/HOL.
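
For orientation, abstract completion is usually presented as an inference system on pairs (E, R) of equations and rules, relative to a fixed reduction order >; the standard rules, with side conditions slightly simplified, are:

    Deduce:    (E, R) ⊢ (E ∪ {s ≈ t}, R)             if s ←_R u →_R t for some u
    Orient:    (E ∪ {s ≈ t}, R) ⊢ (E, R ∪ {s → t})   if s > t (either orientation)
    Delete:    (E ∪ {s ≈ s}, R) ⊢ (E, R)
    Simplify:  (E ∪ {s ≈ t}, R) ⊢ (E ∪ {u ≈ t}, R)   if s →_R u
    Compose:   (E, R ∪ {s → t}) ⊢ (E, R ∪ {s → u})   if t →_R u
    Collapse:  (E, R ∪ {s → t}) ⊢ (E ∪ {u ≈ t}, R)   if s →_R u (with an encompassment
                                                      side condition)

A run starts from (E₀, ∅) and is fair if, roughly, every critical pair of the rules that persist is eventually deduced; the result of the paper concerns exactly such fair infinite runs.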

Cite as

Nao Hirokawa, Aart Middeldorp, Christian Sternagel, and Sarah Winkler. Infinite Runs in Abstract Completion. In 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 19:1-19:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{hirokawa_et_al:LIPIcs.FSCD.2017.19,
  author =	{Hirokawa, Nao and Middeldorp, Aart and Sternagel, Christian and Winkler, Sarah},
  title =	{{Infinite Runs in Abstract Completion}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{19:1--19:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.19},
  URN =		{urn:nbn:de:0030-drops-77252},
  doi =		{10.4230/LIPIcs.FSCD.2017.19},
  annote =	{Keywords: term rewriting, abstract completion, ordered completion, canonicity, Isabelle/HOL}
}
Document
AC Dependency Pairs Revisited

Authors: Akihisa Yamada, Christian Sternagel, René Thiemann, and Keiichirou Kusakari

Published in: LIPIcs, Volume 62, 25th EACSL Annual Conference on Computer Science Logic (CSL 2016)


Abstract
Rewriting modulo AC, i.e., associativity and/or commutativity of certain symbols, is among the most frequently used extensions of term rewriting by equational theories. In this paper we present a generalization of the dependency pair framework for termination analysis to rewriting modulo AC. It subsumes existing variants of AC dependency pairs, admits standard dependency graph analyses, and in particular enjoys the minimality property in the standard sense. As a direct benefit, important termination techniques are easily extended; we describe usable rules and the subterm criterion for AC termination, which properly generalize the non-AC versions. We also perform these extensions within IsaFoR - the Isabelle formalization of rewriting - and thereby provide the first formalization of AC dependency pairs. Consequently, our certifier CeTA now supports checking proofs of AC termination.
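
To recall the basic, non-AC construction on a toy example (ours, not from the paper): for every rule l → r and every subterm of r whose root is a defined symbol, one obtains a dependency pair with marked root symbols.

    R:   plus(0, y) → y
         plus(s(x), y) → s(plus(x, y))

    DP(R):   PLUS(s(x), y) → PLUS(x, y)

Termination follows once no infinite chain of dependency pairs exists. When plus is additionally declared AC, the construction of the paper adds further pairs that account for the AC equations; this is where existing AC variants differ and where the minimality property becomes subtle.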

Cite as

Akihisa Yamada, Christian Sternagel, René Thiemann, and Keiichirou Kusakari. AC Dependency Pairs Revisited. In 25th EACSL Annual Conference on Computer Science Logic (CSL 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 62, pp. 8:1-8:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{yamada_et_al:LIPIcs.CSL.2016.8,
  author =	{Yamada, Akihisa and Sternagel, Christian and Thiemann, Ren\'{e} and Kusakari, Keiichirou},
  title =	{{AC Dependency Pairs Revisited}},
  booktitle =	{25th EACSL Annual Conference on Computer Science Logic (CSL 2016)},
  pages =	{8:1--8:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-022-4},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{62},
  editor =	{Talbot, Jean-Marc and Regnier, Laurent},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CSL.2016.8},
  URN =		{urn:nbn:de:0030-drops-65488},
  doi =		{10.4230/LIPIcs.CSL.2016.8},
  annote =	{Keywords: Equational Rewriting, Termination, Dependency Pairs, Certification}
}
Document
Certifying Confluence of Almost Orthogonal CTRSs via Exact Tree Automata Completion

Authors: Christian Sternagel and Thomas Sternagel

Published in: LIPIcs, Volume 52, 1st International Conference on Formal Structures for Computation and Deduction (FSCD 2016)


Abstract
Suzuki et al. showed that properly oriented, right-stable, orthogonal, and oriented conditional term rewrite systems with extra variables in right-hand sides are confluent. We present our Isabelle/HOL formalization of this result, including two generalizations. On the one hand, we relax proper orientedness and orthogonality to extended proper orientedness and almost orthogonality modulo infeasibility, as suggested by Suzuki et al. On the other hand, we further loosen the requirements of the latter, enabling more powerful methods for proving infeasibility of conditional critical pairs. Furthermore, we formalized a construction by Jacquemard that employs exact tree automata completion for non-reachability analysis and apply it to certify infeasibility of conditional critical pairs. Combining these two results and extending the conditional confluence checker ConCon accordingly, we are able to automatically prove and certify confluence of an important class of conditional term rewrite systems.
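
The role of tree automata completion here is non-reachability: to discharge a conditional critical pair one shows that its conditions cannot be satisfied. A minimal illustration of the idea (ours, not from the paper):

    R = { a → b }        condition to refute:  a →* c

    Start from a tree automaton accepting exactly {a} and complete it under R, i.e.,
    add transitions until the accepted language is closed under rewriting. For this R
    the completed automaton accepts exactly {a, b}. Since c is not accepted, a →* c is
    impossible, so any conditional critical pair whose condition requires a →* c is
    infeasible and can be discarded.

Exact completion guarantees, for suitable classes of systems, that the accepted language is exactly the set of reachable terms; for non-reachability alone even an over-approximation would suffice.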

Cite as

Christian Sternagel and Thomas Sternagel. Certifying Confluence of Almost Orthogonal CTRSs via Exact Tree Automata Completion. In 1st International Conference on Formal Structures for Computation and Deduction (FSCD 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 52, pp. 29:1-29:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{sternagel_et_al:LIPIcs.FSCD.2016.29,
  author =	{Sternagel, Christian and Sternagel, Thomas},
  title =	{{Certifying Confluence of Almost Orthogonal CTRSs via Exact Tree Automata Completion}},
  booktitle =	{1st International Conference on Formal Structures for Computation and Deduction (FSCD 2016)},
  pages =	{29:1--29:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-010-1},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{52},
  editor =	{Kesner, Delia and Pientka, Brigitte},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2016.29},
  URN =		{urn:nbn:de:0030-drops-59996},
  doi =		{10.4230/LIPIcs.FSCD.2016.29},
  annote =	{Keywords: certification, conditional term rewriting, confluence, infeasible critical pairs, non-reachability}
}
Document
Certification of Complexity Proofs using CeTA

Authors: Martin Avanzini, Christian Sternagel, and René Thiemann

Published in: LIPIcs, Volume 36, 26th International Conference on Rewriting Techniques and Applications (RTA 2015)


Abstract
Nowadays certification is widely employed by automated termination tools for term rewriting, where certifiers support most available techniques. In complexity analysis, the situation is quite different. Although tools support certification in principle, current certifiers implement only the most basic technique, namely, suitably tamed versions of reduction orders. As a consequence, only a small fraction of the proofs generated by state-of-the-art complexity tools can be certified. To improve upon this situation, we formalized a framework for the certification of modular complexity proofs and incorporated it into CeTA. We report on this extension and present the newly supported techniques (match-bounds, weak dependency pairs, dependency tuples, usable rules, and usable replacement maps), resulting in a significant increase in the number of certifiable complexity proofs. During our work we detected conflicts in theoretical results as well as bugs in existing complexity tools.
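
As a toy illustration (ours) of the kind of judgement such a certificate establishes, consider runtime complexity, i.e., the maximal derivation height of basic terms measured in their size:

    R:   plus(0, y) → y
         plus(s(x), y) → s(plus(x, y))

    plus(sⁿ(0), sᵐ(0)) reaches its normal form sⁿ⁺ᵐ(0) in exactly n + 1 steps,
    so the runtime complexity of R is linear: rc_R(k) ∈ Θ(k).

A complexity tool outputs such a bound together with the techniques used to obtain it (e.g., a suitably restricted reduction order, match-bounds, or dependency tuples), and the certifier replays each step against its formalized soundness theorems.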

Cite as

Martin Avanzini, Christian Sternagel, and René Thiemann. Certification of Complexity Proofs using CeTA. In 26th International Conference on Rewriting Techniques and Applications (RTA 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 36, pp. 23-39, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


BibTeX

@InProceedings{avanzini_et_al:LIPIcs.RTA.2015.23,
  author =	{Avanzini, Martin and Sternagel, Christian and Thiemann, Ren\'{e}},
  title =	{{Certification of Complexity Proofs using CeTA}},
  booktitle =	{26th International Conference on Rewriting Techniques and Applications (RTA 2015)},
  pages =	{23--39},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-85-9},
  ISSN =	{1868-8969},
  year =	{2015},
  volume =	{36},
  editor =	{Fern\'{a}ndez, Maribel},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2015.23},
  URN =		{urn:nbn:de:0030-drops-51875},
  doi =		{10.4230/LIPIcs.RTA.2015.23},
  annote =	{Keywords: complexity analysis, certification, match-bounds, weak dependency pairs, dependency tuples, usable rules, usable replacement maps}
}
Document
Formalizing Knuth-Bendix Orders and Knuth-Bendix Completion

Authors: Christian Sternagel and René Thiemann

Published in: LIPIcs, Volume 21, 24th International Conference on Rewriting Techniques and Applications (RTA 2013)


Abstract
We present extensions of our Isabelle Formalization of Rewriting that cover two historically related concepts: the Knuth-Bendix order and the Knuth-Bendix completion procedure. The former, besides being the first development of its kind in a proof assistant, is based on a generalized version of the Knuth-Bendix order. We compare our version to variants from the literature and show all properties required to certify termination proofs of TRSs. The latter comprises the formalization of important facts that are related to completion, like Birkhoff's theorem, the critical pair theorem, and a soundness proof of completion, showing that the strict encompassment condition is superfluous for finite runs. As a result, we are able to certify completion proofs.
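
For reference, the textbook version of the Knuth-Bendix order (the paper formalizes a generalization) is defined as follows, given a precedence > on function symbols and a weight function w with a positive minimal weight w₀ for variables and constants:

    s >_KBO t  iff  |s|_x ≥ |t|_x for every variable x, and
      (1) w(s) > w(t), or
      (2) w(s) = w(t) and one of
          (a) t is a variable and s is obtained from it by one or more applications
              of a unary symbol of weight 0,
          (b) s = f(s₁, …, sₙ), t = g(t₁, …, tₘ) and f > g,
          (c) s = f(s₁, …, sₙ), t = f(t₁, …, tₙ) and (s₁, …, sₙ) >_KBO^lex (t₁, …, tₙ).

Admissibility requires that at most one unary symbol has weight 0 and that this symbol is maximal in the precedence.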

Cite as

Christian Sternagel and René Thiemann. Formalizing Knuth-Bendix Orders and Knuth-Bendix Completion. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 287-302, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{sternagel_et_al:LIPIcs.RTA.2013.287,
  author =	{Sternagel, Christian and Thiemann, Ren\'{e}},
  title =	{{Formalizing Knuth-Bendix Orders and Knuth-Bendix Completion}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{287--302},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.287},
  URN =		{urn:nbn:de:0030-drops-40685},
  doi =		{10.4230/LIPIcs.RTA.2013.287},
  annote =	{Keywords: certification, completion, confluence, termination}
}
Document
Modular and Certified Semantic Labeling and Unlabeling

Authors: Christian Sternagel and René Thiemann

Published in: LIPIcs, Volume 10, 22nd International Conference on Rewriting Techniques and Applications (RTA'11) (2011)


Abstract
Semantic labeling is a powerful transformation technique to prove termination of term rewrite systems. The dual technique is unlabeling. For unlabeling it is essential to drop the so-called decreasing rules which sometimes have to be added when applying semantic labeling. We indicate two problems concerning unlabeling and present our solutions. The first problem is that currently unlabeling cannot be applied as a modular step, since the decreasing rules are determined by a semantic labeling step which may have taken place much earlier. To this end, we give an implicit definition of decreasing rules that does not depend on any knowledge about preceding labelings. The second problem is that unlabeling is in general unsound. To solve this issue, we introduce the notion of extended termination problems. Moreover, we show how existing termination techniques can be lifted to operate on extended termination problems. All our proofs have been formalized in Isabelle/HOL as part of the IsaFoR/CeTA project.
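
A classical textbook example (not taken from the paper) shows the flavour of the technique. Take the terminating TRS f(f(x)) → f(g(f(x))) and interpret it over {0, 1} with [f](a) = 1 and [g](a) = 0; this is a model, since both sides always evaluate to 1. Labeling every occurrence of f and g with the value of its argument turns the single rule into

    f₁(f₀(x)) → f₀(g₁(f₀(x)))
    f₁(f₁(x)) → f₀(g₁(f₁(x)))

whose termination is easily established, e.g. by a recursive path order with precedence f₁ > f₀ and f₁ > g₁. Since labeling was done with respect to a model (not merely a quasi-model), no decreasing rules need to be added; it is exactly the quasi-model case, where decreasing rules enter, that makes a modular unlabeling step delicate.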

Cite as

Christian Sternagel and René Thiemann. Modular and Certified Semantic Labeling and Unlabeling. In 22nd International Conference on Rewriting Techniques and Applications (RTA'11). Leibniz International Proceedings in Informatics (LIPIcs), Volume 10, pp. 329-344, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2011)


BibTeX

@InProceedings{sternagel_et_al:LIPIcs.RTA.2011.329,
  author =	{Sternagel, Christian and Thiemann, Ren\'{e}},
  title =	{{Modular and Certified Semantic Labeling and Unlabeling}},
  booktitle =	{22nd International Conference on Rewriting Techniques and Applications (RTA'11)},
  pages =	{329--344},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-30-9},
  ISSN =	{1868-8969},
  year =	{2011},
  volume =	{10},
  editor =	{Schmidt-Schauss, Manfred},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2011.329},
  URN =		{urn:nbn:de:0030-drops-31333},
  doi =		{10.4230/LIPIcs.RTA.2011.329},
  annote =	{Keywords: semantic labeling, certification, term rewriting, unlabeling}
}
Document
Certified Subterm Criterion and Certified Usable Rules

Authors: Christian Sternagel and René Thiemann

Published in: LIPIcs, Volume 6, Proceedings of the 21st International Conference on Rewriting Techniques and Applications (2010)


Abstract
In this paper we present our formalization of two important termination techniques for term rewrite systems: the subterm criterion and the reduction pair processor in combination with usable rules. For both techniques we developed executable check functions in the theorem prover Isabelle/HOL which can certify the correct application of these techniques in some given termination proof. As there are several variants of usable rules we designed our check function in such a way that it accepts all known variants, even those which are not explicitly spelled out in previous papers. We integrated our formalization in the publicly available IsaFoR-library. This led to a significant increase in the power of CeTA, the corresponding certified termination proof checker that is extracted from IsaFoR.
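
For illustration (example ours, not from the paper), the subterm criterion works as follows: choose a simple projection π that maps every marked symbol to one of its argument positions; if π(l) ⊵ π(r) holds for every dependency pair l → r, then all pairs with π(l) ⊳ π(r) can be removed.

    R:   minus(x, 0) → x
         minus(s(x), s(y)) → minus(x, y)

    DP:  MINUS(s(x), s(y)) → MINUS(x, y)

    With π(MINUS) = 1 we get π(lhs) = s(x) ⊳ x = π(rhs), so the only dependency pair
    is strictly decreasing and can be removed; no pairs remain, hence R is terminating.

A certifier therefore only has to check the given projection and the (proper) subterm relation for each pair, which keeps the corresponding check function simple.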

Cite as

Christian Sternagel and René Thiemann. Certified Subterm Criterion and Certified Usable Rules. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 325-340, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{sternagel_et_al:LIPIcs.RTA.2010.325,
  author =	{Sternagel, Christian and Thiemann, Ren\'{e}},
  title =	{{Certified Subterm Criterion and Certified Usable Rules}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{325--340},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.325},
  URN =		{urn:nbn:de:0030-drops-26611},
  doi =		{10.4230/LIPIcs.RTA.2010.325},
  annote =	{Keywords: Term Rewriting, Certification, Termination, Theorem Proving}
}
