14 Search Results for "van Oostrom, Vincent"


Document
Invited Talk
Meaningfulness and Genericity in a Subsuming Framework (Invited Talk)

Authors: Delia Kesner, Victor Arrial, and Giulio Guerrieri

Published in: LIPIcs, Volume 299, 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)


Abstract
This paper studies the notion of meaningfulness for a unifying framework called dBang-calculus, which subsumes both call-by-name (dCBN) and call-by-value (dCBV). We first define meaningfulness in dBang and then characterize it by means of typability and inhabitation in an associated non-idempotent intersection type system previously appearing in the literature. We validate the proposed notion of meaningfulness by showing two properties: (1) consistency of the smallest theory, called ℋ, equating all meaningless terms, and (2) genericity, stating that meaningless subterms have no bearing on the significance of meaningful terms. The theory ℋ is also shown to have a unique consistent and maximal extension ℋ*, which coincides with a well-known notion of observational equivalence. Last but not least, we show that the notions of meaningfulness and genericity in the literature for dCBN and dCBV are subsumed by the corresponding ones proposed here for the dBang-calculus.
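
A hedged aside for orientation: the classical genericity property in the ordinary call-by-name λ-calculus, which the abstract reports is subsumed by the dBang version, is usually stated as follows (textbook formulation, not the paper's statement).

  % Classical (call-by-name) genericity: a meaningless (unsolvable) subterm
  % cannot contribute to a meaningful, i.e. normalising, result.
  \[
    M \ \text{unsolvable} \ \wedge\ C[M] =_{\beta} N \ \text{with $N$ a $\beta$-normal form}
    \;\Longrightarrow\;
    C[P] =_{\beta} N \ \text{for every term } P.
  \]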

Cite as

Delia Kesner, Victor Arrial, and Giulio Guerrieri. Meaningfulness and Genericity in a Subsuming Framework (Invited Talk). In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 1:1-1:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{kesner_et_al:LIPIcs.FSCD.2024.1,
  author =	{Kesner, Delia and Arrial, Victor and Guerrieri, Giulio},
  title =	{{Meaningfulness and Genericity in a Subsuming Framework}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{1:1--1:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.1},
  URN =		{urn:nbn:de:0030-drops-203305},
  doi =		{10.4230/LIPIcs.FSCD.2024.1},
  annote =	{Keywords: Lambda calculus, Solvability, Meaningfulness, Inhabitation, Genericity}
}
Document
Simulating Dependency Pairs by Semantic Labeling

Authors: Teppei Saito and Nao Hirokawa

Published in: LIPIcs, Volume 299, 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)


Abstract
We show that termination proofs by a version of the dependency pair method can be simulated by semantic labeling plus multiset path orders. By incorporating a flattening technique into multiset path orders, the simulation result can be extended to the dependency pair method for relative termination, introduced by Iborra et al. This result allows us to improve the applicability of their dependency pair method.
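
As a hedged illustration of the objects involved (a standard textbook example, not taken from the paper): for the rewrite system defining addition on Peano numerals, the dependency pair method extracts the recursive call, and termination reduces to orienting it.

  % Rules for addition, and the single dependency pair they generate.
  \[
    \mathsf{add}(\mathsf{0},y) \to y, \qquad
    \mathsf{add}(\mathsf{s}(x),y) \to \mathsf{s}(\mathsf{add}(x,y))
    \qquad\leadsto\qquad
    \mathsf{add}^{\sharp}(\mathsf{s}(x),y) \to \mathsf{add}^{\sharp}(x,y)
  \]

A multiset path order compares the argument multisets {s(x), y} and {x, y} and orients the dependency pair strictly, so no infinite chain of dependency pairs exists and the system terminates; the paper shows how semantic labeling plus multiset path orders can simulate such dependency pair proofs in general, including for relative termination.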

Cite as

Teppei Saito and Nao Hirokawa. Simulating Dependency Pairs by Semantic Labeling. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 13:1-13:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{saito_et_al:LIPIcs.FSCD.2024.13,
  author =	{Saito, Teppei and Hirokawa, Nao},
  title =	{{Simulating Dependency Pairs by Semantic Labeling}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{13:1--13:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.13},
  URN =		{urn:nbn:de:0030-drops-203423},
  doi =		{10.4230/LIPIcs.FSCD.2024.13},
  annote =	{Keywords: Term rewriting, Relative termination, Semantic labeling, Dependency pairs}
}
Document
Impredicativity, Cumulativity and Product Covariance in the Logical Framework Dedukti

Authors: Thiago Felicissimo and Théo Winterhalter

Published in: LIPIcs, Volume 299, 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)


Abstract
Proof assistants such as Coq implement a type theory with three important features: impredicativity, cumulativity and product covariance. This combination has proven difficult to express in the logical framework Dedukti, and previous attempts have failed to provide an encoding that is proven confluent, sound and conservative. In this work we solve this longstanding open problem by providing an encoding of these three features that we prove to be confluent, sound and to satisfy a restricted (but, we argue, strong enough) form of conservativity. Our proof of confluence is a contribution in itself, combining various criteria and proof techniques from rewriting theory. Our proof of soundness also contributes a new strategy in which the result is shown in terms of an inverse translation function, fixing a flaw common to some previous encoding attempts.
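
For readers less familiar with the three features, here is a hedged sketch of how they are commonly presented in type theories such as Coq's; these are standard schematic rules, not the paper's Dedukti encoding.

  % Cumulativity, impredicativity of Prop, and covariance of products
  % in their codomain (schematic, standard presentation).
  \[
    \frac{\Gamma \vdash A : \mathsf{Type}_i}{\Gamma \vdash A : \mathsf{Type}_{i+1}}
    \qquad
    \frac{\Gamma \vdash A : s \qquad \Gamma, x{:}A \vdash B : \mathsf{Prop}}{\Gamma \vdash \Pi x{:}A.\,B : \mathsf{Prop}}
    \qquad
    \frac{\Gamma, x{:}A \vdash B \preceq B'}{\Gamma \vdash \Pi x{:}A.\,B \preceq \Pi x{:}A.\,B'}
  \]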

Cite as

Thiago Felicissimo and Théo Winterhalter. Impredicativity, Cumulativity and Product Covariance in the Logical Framework Dedukti. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 21:1-21:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{felicissimo_et_al:LIPIcs.FSCD.2024.21,
  author =	{Felicissimo, Thiago and Winterhalter, Th\'{e}o},
  title =	{{Impredicativity, Cumulativity and Product Covariance in the Logical Framework Dedukti}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{21:1--21:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.21},
  URN =		{urn:nbn:de:0030-drops-203503},
  doi =		{10.4230/LIPIcs.FSCD.2024.21},
  annote =	{Keywords: Dedukti, Rewriting, Confluence, Dependent types, Cumulativity, Universes}
}
Document
Termination of Generalized Term Rewriting Systems

Authors: Salvador Lucas

Published in: LIPIcs, Volume 299, 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)


Abstract
We investigate termination of Generalized Term Rewriting Systems (GTRS), which extend Conditional Term Rewriting Systems by considering replacement restrictions on selected arguments of function symbols, as in Context-Sensitive Rewriting, and conditional rewriting rules whose conditional part may include not only a mix of the usual (reachability, joinability,...) conditions, but also atoms defined by a set of definite Horn clauses. GTRS can be used to prove confluence and termination of Generalized Rewrite Theories and Maude programs. We have characterized confluence of terminating GTRS as the joinability of a finite set of conditional pairs. Since termination of GTRS is underexplored to date, this paper introduces a Dependency Pair Framework which is well-suited to automatically (dis)prove termination of GTRS.
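
Schematically, and only as a hedged reading of the rule format just described (not the paper's formal definition), a GTRS rule can be pictured as

  \[
    \ell \to r \;\Leftarrow\; s_1 \approx t_1, \ldots, s_n \approx t_n,\; A_1, \ldots, A_m
  \]

where the conditions s_i ≈ t_i are, for instance, reachability or joinability conditions, the atoms A_j are defined by a set of definite Horn clauses, and a replacement map assigns to each function symbol the argument positions where rewriting is allowed, as in context-sensitive rewriting.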

Cite as

Salvador Lucas. Termination of Generalized Term Rewriting Systems. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 32:1-32:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{lucas:LIPIcs.FSCD.2024.32,
  author =	{Lucas, Salvador},
  title =	{{Termination of Generalized Term Rewriting Systems}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{32:1--32:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.32},
  URN =		{urn:nbn:de:0030-drops-203616},
  doi =		{10.4230/LIPIcs.FSCD.2024.32},
  annote =	{Keywords: Program Analysis, Reduction-Based Systems, Termination}
}
Document
α-Avoidance

Authors: Samuel Frontull, Georg Moser, and Vincent van Oostrom

Published in: LIPIcs, Volume 260, 8th International Conference on Formal Structures for Computation and Deduction (FSCD 2023)


Abstract
When substitutions and bindings interact, there is a risk of undesired side effects if the substitution is applied naïvely. The λ-calculus captures this phenomenon concretely, as β-reduction may require the renaming of bound variables to avoid variable capture. In this paper we introduce α-paths as an estimation for α-avoidance, roughly expressing that α-conversions are not required to prevent variable capture. These paths provide a novel method to analyse and predict the potential need for α in different calculi. In particular, we show how α-paths characterise α-avoidance for several sub-calculi of the λ-calculus: (i) developments, (ii) affine/linear λ-calculi, (iii) the weak λ-calculus, (iv) μ-unfolding and (v) the safe λ-calculus. Furthermore, we study the unavoidability of α-conversions in untyped and simply-typed λ-calculi and prove undecidability of the need for α-conversions for leftmost-outermost reductions in the untyped λ-calculus. To ease the work with α-paths, we have implemented the method; the tool is publicly available.
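
The capture problem behind α-avoidance can be made concrete with a minimal sketch of capture-avoiding substitution; this is a generic Haskell illustration, not the authors' publicly available tool, and the names Term, fv, fresh and subst are ours.

-- A minimal sketch of capture-avoiding substitution in the untyped
-- lambda-calculus, illustrating why beta-reduction may force
-- alpha-conversion (renaming of bound variables).
module AlphaAvoidance where

import Data.List (delete, union)

data Term = Var String | App Term Term | Lam String Term
  deriving (Eq, Show)

-- free variables of a term
fv :: Term -> [String]
fv (Var x)   = [x]
fv (App s t) = fv s `union` fv t
fv (Lam x t) = delete x (fv t)

-- a variant of x (primes appended) not occurring in the given list
fresh :: String -> [String] -> String
fresh x used =
  head [ x' | n <- [1 ..], let x' = x ++ replicate n '\'', x' `notElem` used ]

-- capture-avoiding substitution  t[x := s]
subst :: String -> Term -> Term -> Term
subst x s (Var y)
  | x == y            = s
  | otherwise         = Var y
subst x s (App t u)   = App (subst x s t) (subst x s u)
subst x s (Lam y t)
  | y == x            = Lam y t                 -- x is shadowed: stop
  | y `notElem` fv s  = Lam y (subst x s t)     -- no capture possible
  | otherwise         =                         -- rename y to avoid capture
      let y' = fresh y (fv s `union` fv t)
      in  Lam y' (subst x s (subst y (Var y') t))

-- Beta-reducing (\x.\y.x) y naively would capture the free y;
-- here the bound y is renamed instead:
--   example  ==>  Lam "y'" (Var "y")
example :: Term
example = subst "x" (Var "y") (Lam "y" (Var "x"))

The renaming performed in the last clause is exactly the α-conversion whose (un)avoidability α-paths are designed to predict.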

Cite as

Samuel Frontull, Georg Moser, and Vincent van Oostrom. α-Avoidance. In 8th International Conference on Formal Structures for Computation and Deduction (FSCD 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 260, pp. 22:1-22:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{frontull_et_al:LIPIcs.FSCD.2023.22,
  author =	{Frontull, Samuel and Moser, Georg and van Oostrom, Vincent},
  title =	{{\alpha-Avoidance}},
  booktitle =	{8th International Conference on Formal Structures for Computation and Deduction (FSCD 2023)},
  pages =	{22:1--22:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-277-8},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{260},
  editor =	{Gaboardi, Marco and van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2023.22},
  URN =		{urn:nbn:de:0030-drops-180068},
  doi =		{10.4230/LIPIcs.FSCD.2023.22},
  annote =	{Keywords: \lambda-calculus, variable capture, \alpha-conversion, developments, safe \lambda-calculus, undecidability}
}
Document
The Functional Machine Calculus II: Semantics

Authors: Chris Barrett, Willem Heijltjes, and Guy McCusker

Published in: LIPIcs, Volume 252, 31st EACSL Annual Conference on Computer Science Logic (CSL 2023)


Abstract
The Functional Machine Calculus (FMC), recently introduced by the second author, is a generalization of the lambda-calculus which may faithfully encode the effects of higher-order mutable store, I/O and probabilistic/non-deterministic input. Significantly, it remains confluent and can be simply typed in the presence of these effects. In this paper, we explore the denotational semantics of the FMC. We have three main contributions: first, we argue that its syntax - in which both effects and lambda-calculus are realised using the same syntactic constructs - is semantically natural, corresponding closely to the structure of a Scott-style domain theoretic semantics. Second, we show that simple types confer strong normalization by extending Gandy’s proof for the lambda-calculus, including a small simplification of the technique. Finally, we show that the typed FMC (without considering the specifics of encoded effects), modulo an appropriate equational theory, is a complete language for Cartesian closed categories.

Cite as

Chris Barrett, Willem Heijltjes, and Guy McCusker. The Functional Machine Calculus II: Semantics. In 31st EACSL Annual Conference on Computer Science Logic (CSL 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 252, pp. 10:1-10:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{barrett_et_al:LIPIcs.CSL.2023.10,
  author =	{Barrett, Chris and Heijltjes, Willem and McCusker, Guy},
  title =	{{The Functional Machine Calculus II: Semantics}},
  booktitle =	{31st EACSL Annual Conference on Computer Science Logic (CSL 2023)},
  pages =	{10:1--10:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-264-8},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{252},
  editor =	{Klin, Bartek and Pimentel, Elaine},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CSL.2023.10},
  URN =		{urn:nbn:de:0030-drops-174716},
  doi =		{10.4230/LIPIcs.CSL.2023.10},
  annote =	{Keywords: lambda-calculus, computational effects, denotational semantics, strong normalization}
}
Document
Compositional Confluence Criteria

Authors: Kiraku Shintani and Nao Hirokawa

Published in: LIPIcs, Volume 228, 7th International Conference on Formal Structures for Computation and Deduction (FSCD 2022)


Abstract
We show how confluence criteria based on decreasing diagrams can be generalized to ones composable with other criteria. To demonstrate the method, the confluence criteria of orthogonality, rule labeling, and critical pair systems for term rewriting are recast into composable forms. In addition, we prove that Toyama’s parallel closedness result based on parallel critical pairs subsumes his almost parallel closedness theorem.
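
For context, the classical, non-compositional form of the first criterion mentioned above reads as follows (standard statement, not the paper's composable reformulation).

  % Orthogonality: left-linearity plus absence of critical pairs implies confluence.
  \[
    \mathcal{R}\ \text{left-linear} \;\wedge\; \mathrm{CP}(\mathcal{R}) = \varnothing
    \;\Longrightarrow\;
    \mathcal{R}\ \text{confluent}
  \]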

Cite as

Kiraku Shintani and Nao Hirokawa. Compositional Confluence Criteria. In 7th International Conference on Formal Structures for Computation and Deduction (FSCD 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 228, pp. 28:1-28:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{shintani_et_al:LIPIcs.FSCD.2022.28,
  author =	{Shintani, Kiraku and Hirokawa, Nao},
  title =	{{Compositional Confluence Criteria}},
  booktitle =	{7th International Conference on Formal Structures for Computation and Deduction (FSCD 2022)},
  pages =	{28:1--28:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-233-4},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{228},
  editor =	{Felty, Amy P.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2022.28},
  URN =		{urn:nbn:de:0030-drops-163095},
  doi =		{10.4230/LIPIcs.FSCD.2022.28},
  annote =	{Keywords: term rewriting, confluence, decreasing diagrams}
}
Document
Z; Syntax-Free Developments

Authors: Vincent van Oostrom

Published in: LIPIcs, Volume 195, 6th International Conference on Formal Structures for Computation and Deduction (FSCD 2021)


Abstract
We present the Z-property and instantiate it to various rewrite systems: associativity, positive braids, self-distributivity, the lambda-calculus, lambda-calculi with explicit substitutions, orthogonal TRSs, .... The Z-property is proven equivalent to Takahashi’s angle property by means of a syntax-free notion of development. We show that several classical consequences of having developments such as confluence, normalisation, and recurrence, can be regained in a syntax-free way, and investigate how the notion corresponds to the classical syntactic notion of development in term rewriting.
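
For reference, the standard formulation the abstract builds on (hedged; the paper develops it abstractly): a rewrite relation → has the Z-property if there is a map (·)• on objects such that every step is overtaken as below; the Z-property implies confluence.

  % The Z-property: a single map closes every step in a Z-shape.
  \[
    a \to b \;\Longrightarrow\; b \twoheadrightarrow a^{\bullet} \;\wedge\; a^{\bullet} \twoheadrightarrow b^{\bullet}
  \]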

Cite as

Vincent van Oostrom. Z; Syntax-Free Developments. In 6th International Conference on Formal Structures for Computation and Deduction (FSCD 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 195, pp. 24:1-24:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{vanoostrom:LIPIcs.FSCD.2021.24,
  author =	{van Oostrom, Vincent},
  title =	{{Z; Syntax-Free Developments}},
  booktitle =	{6th International Conference on Formal Structures for Computation and Deduction (FSCD 2021)},
  pages =	{24:1--24:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-191-7},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{195},
  editor =	{Kobayashi, Naoki},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2021.24},
  URN =		{urn:nbn:de:0030-drops-142620},
  doi =		{10.4230/LIPIcs.FSCD.2021.24},
  annote =	{Keywords: rewrite system, confluence, normalisation, recurrence}
}
Document
Normalisation by Random Descent

Authors: Vincent van Oostrom and Yoshihito Toyama

Published in: LIPIcs, Volume 52, 1st International Conference on Formal Structures for Computation and Deduction (FSCD 2016)


Abstract
We present abstract hyper-normalisation results for strategies. These results are then applied to term rewriting systems, both first- and higher-order. For example, we show hyper-normalisation of the left-outer strategy for what we call left-outer pattern rewrite systems, a class comprising both Combinatory Logic and the lambda-calculus, but also systems with critical pairs. Our results apply to strategies that need not be deterministic but do have Newman's random descent property: all reductions to normal form have the same length, with Huet and Lévy's external strategy being an example. Technically, we base our development on supplementing the usual notion of commutation diagram with a notion of order, expressing that the measure of its right leg does not exceed that of its left leg, where measure is an abstraction of the usual notion of length. We give an exact characterisation of such global commutation diagrams, for pairs of reductions, by means of local ones, for pairs of steps, which we dub Dyck diagrams.
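
A hedged rendering of the random descent property as described above (one common formulation; the paper works with an abstract measure instead of plain length):

  % Random descent: all reductions from a to normal form have the same
  % length and, in the usual formulation, the same final result.
  \[
    a \twoheadrightarrow^{m} b \;\wedge\; a \twoheadrightarrow^{n} c \;\wedge\; b, c\ \text{normal forms}
    \;\Longrightarrow\; m = n \;\wedge\; b = c
  \]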

Cite as

Vincent van Oostrom and Yoshihito Toyama. Normalisation by Random Descent. In 1st International Conference on Formal Structures for Computation and Deduction (FSCD 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 52, pp. 32:1-32:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{vanoostrom_et_al:LIPIcs.FSCD.2016.32,
  author =	{van Oostrom, Vincent and Toyama, Yoshihito},
  title =	{{Normalisation by Random Descent}},
  booktitle =	{1st International Conference on Formal Structures for Computation and Deduction (FSCD 2016)},
  pages =	{32:1--32:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-010-1},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{52},
  editor =	{Kesner, Delia and Pientka, Brigitte},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2016.32},
  URN =		{urn:nbn:de:0030-drops-59862},
  doi =		{10.4230/LIPIcs.FSCD.2016.32},
  annote =	{Keywords: strategy, hyper-normalisation, commutation, random descent}
}
Document
Proof Orders for Decreasing Diagrams

Authors: Bertram Felgenhauer and Vincent van Oostrom

Published in: LIPIcs, Volume 21, 24th International Conference on Rewriting Techniques and Applications (RTA 2013)


Abstract
We present and compare some well-founded proof orders for decreasing diagrams. These proof orders order a conversion above another conversion if the latter is obtained by filling any peak in the former by a (locally) decreasing diagram. Therefore each such proof order entails the decreasing diagrams technique for proving confluence. The proof orders differ with respect to monotonicity and complexity. Our results are developed in the setting of involutive monoids. We extend these results to obtain a decreasing diagrams technique for confluence modulo.
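
For reference, a hedged statement of the decreasing diagrams technique that each of these proof orders entails (the standard local condition, not the paper's order-theoretic development): in an abstract rewrite system whose steps carry labels from a well-founded order, confluence follows if every local peak can be completed as below, where <α stands for "with a label strictly below α" and <αβ for "strictly below α or β".

  \[
    c \;\leftarrow_{\alpha}\; a \;\to_{\beta}\; d
    \quad\Longrightarrow\quad
    c \;\twoheadrightarrow_{<\alpha}\; \cdot \;\to^{=}_{\beta}\; \cdot \;\twoheadrightarrow_{<\alpha\beta}\; e
    \quad\text{and}\quad
    d \;\twoheadrightarrow_{<\beta}\; \cdot \;\to^{=}_{\alpha}\; \cdot \;\twoheadrightarrow_{<\alpha\beta}\; e
  \]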

Cite as

Bertram Felgenhauer and Vincent van Oostrom. Proof Orders for Decreasing Diagrams. In 24th International Conference on Rewriting Techniques and Applications (RTA 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 21, pp. 174-189, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{felgenhauer_et_al:LIPIcs.RTA.2013.174,
  author =	{Felgenhauer, Bertram and van Oostrom, Vincent},
  title =	{{Proof Orders for Decreasing Diagrams}},
  booktitle =	{24th International Conference on Rewriting Techniques and Applications (RTA 2013)},
  pages =	{174--189},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-53-8},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{21},
  editor =	{van Raamsdonk, Femke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2013.174},
  URN =		{urn:nbn:de:0030-drops-40616},
  doi =		{10.4230/LIPIcs.RTA.2013.174},
  annote =	{Keywords: involutive monoid, confluence modulo, decreasing diagram, proof order}
}
Document
Triangulation in Rewriting

Authors: Vincent van Oostrom and Hans Zantema

Published in: LIPIcs, Volume 15, 23rd International Conference on Rewriting Techniques and Applications (RTA'12) (2012)


Abstract
We introduce a process, dubbed triangulation, turning any rewrite relation into a confluent one. It is more direct than usual completion, in the sense that objects connected by a peak are directly related rather than their normal forms. We investigate conditions under which this process preserves desirable properties such as termination.

Cite as

Vincent van Oostrom and Hans Zantema. Triangulation in Rewriting. In 23rd International Conference on Rewriting Techniques and Applications (RTA'12). Leibniz International Proceedings in Informatics (LIPIcs), Volume 15, pp. 240-255, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2012)


BibTeX

@InProceedings{vanoostrom_et_al:LIPIcs.RTA.2012.240,
  author =	{van Oostrom, Vincent and Zantema, Hans},
  title =	{{Triangulation in Rewriting}},
  booktitle =	{23rd International Conference on Rewriting Techniques and Applications (RTA'12)},
  pages =	{240--255},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-38-5},
  ISSN =	{1868-8969},
  year =	{2012},
  volume =	{15},
  editor =	{Tiwari, Ashish},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2012.240},
  URN =		{urn:nbn:de:0030-drops-34964},
  doi =		{10.4230/LIPIcs.RTA.2012.240},
  annote =	{Keywords: triangulation,codeterminism,completion,(co)confluence,(co)termination}
}
Document
Realising Optimal Sharing

Authors: Vincent van Oostrom

Published in: LIPIcs, Volume 6, Proceedings of the 21st International Conference on Rewriting Techniques and Applications (2010)


Abstract
Realising Optimal Sharing

Cite as

Vincent van Oostrom. Realising Optimal Sharing. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 5-6, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{vanoostrom:LIPIcs.RTA.2010.5,
  author =	{van Oostrom, Vincent},
  title =	{{Realising Optimal Sharing}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{5--6},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.5},
  URN =		{urn:nbn:de:0030-drops-26403},
  doi =		{10.4230/LIPIcs.RTA.2010.5},
  annote =	{Keywords: Optimal Sharing}
}
Document
Higher-Order (Non-)Modularity

Authors: Claus Appel, Vincent van Oostrom, and Jakob Grue Simonsen

Published in: LIPIcs, Volume 6, Proceedings of the 21st International Conference on Rewriting Techniques and Applications (2010)


Abstract
We show that, contrary to the situation in first-order term rewriting, almost none of the usual properties of rewriting are modular for higher-order rewriting, irrespective of the higher-order rewriting format. We show that for the particular format of simply typed applicative term rewriting systems modularity of confluence, normalization, and termination can be recovered by imposing suitable linearity constraints.
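
Recall the standard definition (hedged here, since its precise phrasing varies slightly across higher-order formats): a property P of rewrite systems is modular if it is both preserved and reflected by disjoint unions.

  % Modularity of a property P with respect to disjoint union.
  \[
    P(\mathcal{R} \uplus \mathcal{S}) \iff P(\mathcal{R}) \wedge P(\mathcal{S})
    \qquad \text{for all systems } \mathcal{R}, \mathcal{S}\ \text{over disjoint alphabets.}
  \]

For first-order TRSs confluence is modular (Toyama's theorem) while termination is not; the abstract above reports that in the higher-order setting almost none of the usual properties remain modular without additional linearity constraints.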

Cite as

Claus Appel, Vincent van Oostrom, and Jakob Grue Simonsen. Higher-Order (Non-)Modularity. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 17-32, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{appel_et_al:LIPIcs.RTA.2010.17,
  author =	{Appel, Claus and van Oostrom, Vincent and Simonsen, Jakob Grue},
  title =	{{Higher-Order (Non-)Modularity}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{17--32},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.17},
  URN =		{urn:nbn:de:0030-drops-26427},
  doi =		{10.4230/LIPIcs.RTA.2010.17},
  annote =	{Keywords: Higher-order rewriting, modularity, termination, normalization}
}
Document
Unique Normal Forms in Infinitary Weakly Orthogonal Rewriting

Authors: Joerg Endrullis, Clemens Grabmayer, Dimitri Hendriks, Jan Willem Klop, and Vincent van Oostrom

Published in: LIPIcs, Volume 6, Proceedings of the 21st International Conference on Rewriting Techniques and Applications (2010)


Abstract
We present some contributions to the theory of infinitary rewriting for weakly orthogonal term rewrite systems, in which critical pairs may occur provided they are trivial. We show that the infinitary unique normal form property (UN∞) fails by a simple example of a weakly orthogonal TRS with two collapsing rules. By translating this example, we show that UN∞ also fails for the infinitary lambda-beta-eta-calculus. As positive results we obtain the following: infinitary confluence, and hence UN∞, holds for weakly orthogonal TRSs that do not contain collapsing rules. To this end we refine the compression lemma. Furthermore, we consider the triangle and diamond properties for infinitary developments in weakly orthogonal TRSs, by refining an earlier cluster analysis for the finite case.
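
One standard way to state the property in question (hedged: the paper's precise infinitary notions of reduction and conversion matter for the counterexample):

  % UN∞: normal forms reached by strongly convergent, possibly infinite,
  % reductions from a common term coincide.
  \[
    s \twoheadrightarrow^{\infty} n_1 \;\wedge\; s \twoheadrightarrow^{\infty} n_2 \;\wedge\; n_1, n_2\ \text{normal forms}
    \;\Longrightarrow\; n_1 = n_2
  \]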

Cite as

Joerg Endrullis, Clemens Grabmayer, Dimitri Hendriks, Jan Willem Klop, and Vincent van Oostrom. Unique Normal Forms in Infinitary Weakly Orthogonal Rewriting. In Proceedings of the 21st International Conference on Rewriting Techniques and Applications. Leibniz International Proceedings in Informatics (LIPIcs), Volume 6, pp. 85-102, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{endrullis_et_al:LIPIcs.RTA.2010.85,
  author =	{Endrullis, Joerg and Grabmayer, Clemens and Hendriks, Dimitri and Klop, Jan Willem and van Oostrom, Vincent},
  title =	{{Unique Normal Forms in Infinitary Weakly Orthogonal Rewriting}},
  booktitle =	{Proceedings of the 21st International Conference on Rewriting Techniques and Applications},
  pages =	{85--102},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-18-7},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{6},
  editor =	{Lynch, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.RTA.2010.85},
  URN =		{urn:nbn:de:0030-drops-26469},
  doi =		{10.4230/LIPIcs.RTA.2010.85},
  annote =	{Keywords: Weakly orthogonal term rewrite systems, unique normal form property, infinitary rewriting, infinitary lambda-beta-eta-calculus}
}