7 Search Results for "de Lorenzo, Stefan"


Document
Behavioural Up/down Casting For Statically Typed Languages

Authors: Lorenzo Bacchiani, Mario Bravetti, Marco Giunti, João Mota, and António Ravara

Published in: LIPIcs, Volume 313, 38th European Conference on Object-Oriented Programming (ECOOP 2024)


Abstract
We provide support for polymorphism in static typestate analysis for object-oriented languages with upcasts and downcasts. Recent work has shown how typestate analysis can be embedded in the development of Java programs to obtain safer behaviour at runtime, e.g., absence of null pointer errors and protocol completion. In that approach, inheritance is supported at the price of limiting casts in source code, thus only allowing casts at the beginning of the protocol, i.e., immediately after object creation, or at the end, which in turn seriously affects the applicability of the analysis. In this paper, we provide a solution to this open problem in typestate analysis by introducing a theory based on a richer data structure, named typestate tree, which supports upcast and downcast operations at any point of the protocol by leveraging union and intersection types. The soundness of the typestate tree-based approach has been mechanised in Coq. The theory can be applied to most object-oriented languages statically analysable through typestates, thus opening new scenarios for acceptance of programs exploiting inheritance and casting. To defend this thesis, we show an application of the theory by embedding the typestate tree mechanism in a Java-like object-oriented language and proving its soundness.
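
As a rough illustration of the typestate discipline this work builds on, here is a minimal sketch in Python (class, state, and method names are hypothetical): it enforces an open → read* → close protocol with runtime checks, whereas the analysis above enforces such protocols statically, and the typestate-tree treatment of casts is not modelled here.

class ProtocolError(Exception):
    pass

class TrackedFile:
    # Dynamic analogue of a typestate protocol: INIT -> OPEN -> CLOSED.
    def __init__(self):
        self.state = "INIT"

    def open(self):
        if self.state != "INIT":
            raise ProtocolError("open() is only allowed in state INIT")
        self.state = "OPEN"

    def read(self):
        if self.state != "OPEN":
            raise ProtocolError("read() is only allowed in state OPEN")
        return "data"

    def close(self):
        if self.state != "OPEN":
            raise ProtocolError("close() is only allowed in state OPEN")
        self.state = "CLOSED"

f = TrackedFile()
f.open()
print(f.read())
f.close()
# f.read()  # would raise ProtocolError: the protocol has already been completed

A static typestate checker rejects the commented-out call at compile time; the contribution above is to retain such guarantees when upcasts and downcasts occur at arbitrary points of the protocol.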

Cite as

Lorenzo Bacchiani, Mario Bravetti, Marco Giunti, João Mota, and António Ravara. Behavioural Up/down Casting For Statically Typed Languages. In 38th European Conference on Object-Oriented Programming (ECOOP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 313, pp. 5:1-5:28, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{bacchiani_et_al:LIPIcs.ECOOP.2024.5,
  author =	{Bacchiani, Lorenzo and Bravetti, Mario and Giunti, Marco and Mota, Jo\~{a}o and Ravara, Ant\'{o}nio},
  title =	{{Behavioural Up/down Casting For Statically Typed Languages}},
  booktitle =	{38th European Conference on Object-Oriented Programming (ECOOP 2024)},
  pages =	{5:1--5:28},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-341-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{313},
  editor =	{Aldrich, Jonathan and Salvaneschi, Guido},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ECOOP.2024.5},
  URN =		{urn:nbn:de:0030-drops-208543},
  doi =		{10.4230/LIPIcs.ECOOP.2024.5},
  annote =	{Keywords: Behavioural types, object-oriented programming, subtyping, cast, typestates}
}
Document
Formalizing, Mechanizing, and Verifying Class-Based Refinement Types

Authors: Ke Sun, Di Wang, Sheng Chen, Meng Wang, and Dan Hao

Published in: LIPIcs, Volume 313, 38th European Conference on Object-Oriented Programming (ECOOP 2024)


Abstract
Refinement types have been extensively used in class-based languages to specify and verify fine-grained logical specifications. Despite the advances in practical aspects such as applicability and usability, two fundamental issues persist. First, the soundness of existing class-based refinement type systems is inadequately explored, casting doubt on their reliability. Second, the expressiveness of existing systems is limited, restricting the depiction of semantic properties related to object-oriented constructs. This work tackles these issues through a systematic framework. We formalize a declarative class-based refinement type calculus (named RFJ) that is expressive and concise. We rigorously develop the soundness meta-theory of this calculus, followed by its mechanization in Coq. Finally, to ensure the calculus’s verifiability, we propose an algorithmic verification approach based on a fragment of first-order logic (named LFJ), and implement this approach as a type checker.
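
To give a flavour of what a class-based refinement type expresses, the hypothetical Python sketch below reads a refinement such as {v: int | v >= 0} as a logical predicate on a field and merely asserts it at runtime; the calculus above discharges such predicates statically via first-order logic (RFJ and LFJ themselves are not modelled here).

from dataclasses import dataclass

def refine(predicate, value, description):
    # A static refinement-type checker would discharge this as a proof obligation
    # instead of testing it at runtime.
    assert predicate(value), f"refinement violated: {description}"
    return value

@dataclass
class Account:
    balance: int  # intended refinement: {v: int | v >= 0}

    def __post_init__(self):
        refine(lambda v: v >= 0, self.balance, "balance >= 0")

    def withdraw(self, amount: int) -> "Account":
        # intended refinement on the argument: {a: int | 0 < a <= balance}
        refine(lambda a: 0 < a <= self.balance, amount, "0 < amount <= balance")
        return Account(self.balance - amount)

print(Account(100).withdraw(30).balance)   # 70
# Account(100).withdraw(200)  # would violate the refinement on the argument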

Cite as

Ke Sun, Di Wang, Sheng Chen, Meng Wang, and Dan Hao. Formalizing, Mechanizing, and Verifying Class-Based Refinement Types. In 38th European Conference on Object-Oriented Programming (ECOOP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 313, pp. 39:1-39:30, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{sun_et_al:LIPIcs.ECOOP.2024.39,
  author =	{Sun, Ke and Wang, Di and Chen, Sheng and Wang, Meng and Hao, Dan},
  title =	{{Formalizing, Mechanizing, and Verifying Class-Based Refinement Types}},
  booktitle =	{38th European Conference on Object-Oriented Programming (ECOOP 2024)},
  pages =	{39:1--39:30},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-341-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{313},
  editor =	{Aldrich, Jonathan and Salvaneschi, Guido},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ECOOP.2024.39},
  URN =		{urn:nbn:de:0030-drops-208881},
  doi =		{10.4230/LIPIcs.ECOOP.2024.39},
  annote =	{Keywords: Refinement Types, Program Verification, Object-oriented Programming}
}
Document
Weighted Basic Parallel Processes and Combinatorial Enumeration

Authors: Lorenzo Clemente

Published in: LIPIcs, Volume 311, 35th International Conference on Concurrency Theory (CONCUR 2024)


Abstract
We study weighted basic parallel processes (WBPP), a nonlinear recursive generalisation of weighted finite automata inspired by process algebra and Petri net theory. Our main result is an algorithm of 2-EXPSPACE complexity for the WBPP equivalence problem. While (unweighted) BPP language equivalence is undecidable, we can use this algorithm to decide multiplicity equivalence of BPP and language equivalence of unambiguous BPP, with the same complexity. These are long-standing open problems for the related model of weighted context-free grammars. Our second contribution is a connection between WBPP, power series solutions of systems of polynomial differential equations, and combinatorial enumeration. To this end, we consider constructible differentially finite power series (CDF), a class of multivariate differentially algebraic series introduced by Bergeron and Reutenauer in order to provide a combinatorial interpretation to differential equations. CDF series generalise rational, algebraic, and a large class of D-finite (holonomic) series, for which no complexity upper bound for equivalence was known. We show that CDF series correspond to commutative WBPP series. As a consequence of our result on WBPP and commutativity, we show that equivalence of CDF power series can be decided with 2-EXPTIME complexity. In order to showcase the CDF equivalence algorithm, we show that CDF power series naturally arise from combinatorial enumeration, namely as the exponential generating series of constructible species of structures. Examples of such species include sequences, binary trees, ordered trees, Cayley trees, set partitions, series-parallel graphs, and many others. As a consequence of this connection, we obtain an algorithm to decide multiplicity equivalence of constructible species, whose decidability was not known before. The complexity analysis is based on effective bounds from algebraic geometry, namely on the length of chains of polynomial ideals constructed by repeated application of finitely many, not necessarily commuting derivations of a multivariate polynomial ring. This is obtained by generalising a result of Novikov and Yakovenko in the case of a single derivation, which is noteworthy since generic bounds on ideal chains are non-primitive recursive in general. On the way, we develop the theory of WBPP series and CDF power series, exposing several of their appealing properties.
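
As a small worked instance of the species/CDF connection, the sketch below (plain Python; not the paper's equivalence algorithm) computes the exponential generating series of Cayley trees (rooted labelled trees), one of the constructible species listed above, from the functional equation T = x·e^T by fixed-point iteration on truncated series, and checks the coefficients against the closed form n^(n-1).

from fractions import Fraction
from math import factorial

N = 10  # truncation order

def series_exp(s):
    # exp of a truncated power series s with s[0] = 0, via n*e_n = sum_k k*s_k*e_{n-k}
    e = [Fraction(0)] * (N + 1)
    e[0] = Fraction(1)
    for n in range(1, N + 1):
        e[n] = sum(Fraction(k) * s[k] * e[n - k] for k in range(1, n + 1)) / n
    return e

# iterate T <- x * exp(T); each pass fixes one further coefficient
T = [Fraction(0)] * (N + 1)
for _ in range(N):
    T = [Fraction(0)] + series_exp(T)[:N]   # multiply by x and re-truncate

for n in range(1, N + 1):
    count = T[n] * factorial(n)             # number of rooted labelled trees on n nodes
    assert count == n ** (n - 1)
    print(n, count)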

Cite as

Lorenzo Clemente. Weighted Basic Parallel Processes and Combinatorial Enumeration. In 35th International Conference on Concurrency Theory (CONCUR 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 311, pp. 18:1-18:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{clemente:LIPIcs.CONCUR.2024.18,
  author =	{Clemente, Lorenzo},
  title =	{{Weighted Basic Parallel Processes and Combinatorial Enumeration}},
  booktitle =	{35th International Conference on Concurrency Theory (CONCUR 2024)},
  pages =	{18:1--18:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-339-3},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{311},
  editor =	{Majumdar, Rupak and Silva, Alexandra},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CONCUR.2024.18},
  URN =		{urn:nbn:de:0030-drops-207903},
  doi =		{10.4230/LIPIcs.CONCUR.2024.18},
  annote =	{Keywords: weighted automata, combinatorial enumeration, shuffle, algebraic differential equations, process algebra, basic parallel processes, species of structures}
}
Document
Substitution for Non-Wellfounded Syntax with Binders Through Monoidal Categories

Authors: Ralph Matthes, Kobe Wullaert, and Benedikt Ahrens

Published in: LIPIcs, Volume 299, 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)


Abstract
We describe a generic construction of non-wellfounded syntax involving variable binding and its monadic substitution operation. Our construction of the syntax and its substitution takes place in category theory, notably by using monoidal categories and strong functors between them. A language is specified by a multi-sorted binding signature, say Σ. First, we provide sufficient criteria for Σ to generate a language of possibly infinite terms, through ω-continuity. Second, we construct a monadic substitution operation for the language generated by Σ. A cornerstone in this construction is a mild generalization of the notion of heterogeneous substitution systems developed by Matthes and Uustalu; such a system encapsulates the necessary corecursion scheme for implementing substitution. The results are formalized in the Coq proof assistant, through the UniMath library of univalent mathematics.

Cite as

Ralph Matthes, Kobe Wullaert, and Benedikt Ahrens. Substitution for Non-Wellfounded Syntax with Binders Through Monoidal Categories. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 25:1-25:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{matthes_et_al:LIPIcs.FSCD.2024.25,
  author =	{Matthes, Ralph and Wullaert, Kobe and Ahrens, Benedikt},
  title =	{{Substitution for Non-Wellfounded Syntax with Binders Through Monoidal Categories}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{25:1--25:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.25},
  URN =		{urn:nbn:de:0030-drops-203540},
  doi =		{10.4230/LIPIcs.FSCD.2024.25},
  annote =	{Keywords: Non-wellfounded syntax, Substitution, Monoidal categories, Actegories, Tensorial strength, Proof assistant Coq, UniMath library}
}
Document
Position
Grounding Stream Reasoning Research

Authors: Pieter Bonte, Jean-Paul Calbimonte, Daniel de Leng, Daniele Dell'Aglio, Emanuele Della Valle, Thomas Eiter, Federico Giannini, Fredrik Heintz, Konstantin Schekotihin, Danh Le-Phuoc, Alessandra Mileo, Patrik Schneider, Riccardo Tommasini, Jacopo Urbani, and Giacomo Ziffer

Published in: Transactions on Graph Data and Knowledge (TGDK), Volume 2, Issue 1 (2024): Special Issue on Trends in Graph Data and Knowledge - Part 2


Abstract
In the last decade, there has been a growing interest in applying AI technologies to implement complex data analytics over data streams. To this end, researchers in various fields have been organising a yearly event called the "Stream Reasoning Workshop" to share perspectives, challenges, and experiences around this topic. In this paper, the previous organisers of the workshops and other community members provide a summary of the main research results that have been discussed during the first six editions of the event. These results can be categorised into four main research areas: The first is concerned with the technological challenges related to handling large data streams. The second area aims at adapting and extending existing semantic technologies to data streams. The third and fourth areas focus on how to implement reasoning techniques, considering either deductive or inductive techniques, to extract new and valuable knowledge from the data in the stream. This summary is written not only to provide a crystallisation of the field, but also to point out distinctive traits of the stream reasoning community. Moreover, it provides a foundation for future research by enumerating a list of use cases and open challenges, to stimulate others to join this exciting research area.

Cite as

Pieter Bonte, Jean-Paul Calbimonte, Daniel de Leng, Daniele Dell'Aglio, Emanuele Della Valle, Thomas Eiter, Federico Giannini, Fredrik Heintz, Konstantin Schekotihin, Danh Le-Phuoc, Alessandra Mileo, Patrik Schneider, Riccardo Tommasini, Jacopo Urbani, and Giacomo Ziffer. Grounding Stream Reasoning Research. In Special Issue on Trends in Graph Data and Knowledge - Part 2. Transactions on Graph Data and Knowledge (TGDK), Volume 2, Issue 1, pp. 2:1-2:47, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@Article{bonte_et_al:TGDK.2.1.2,
  author =	{Bonte, Pieter and Calbimonte, Jean-Paul and de Leng, Daniel and Dell'Aglio, Daniele and Della Valle, Emanuele and Eiter, Thomas and Giannini, Federico and Heintz, Fredrik and Schekotihin, Konstantin and Le-Phuoc, Danh and Mileo, Alessandra and Schneider, Patrik and Tommasini, Riccardo and Urbani, Jacopo and Ziffer, Giacomo},
  title =	{{Grounding Stream Reasoning Research}},
  journal =	{Transactions on Graph Data and Knowledge},
  pages =	{2:1--2:47},
  ISSN =	{2942-7517},
  year =	{2024},
  volume =	{2},
  number =	{1},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/TGDK.2.1.2},
  URN =		{urn:nbn:de:0030-drops-198597},
  doi =		{10.4230/TGDK.2.1.2},
  annote =	{Keywords: Stream Reasoning, Stream Processing, RDF streams, Streaming Linked Data, Continuous query processing, Temporal Logics, High-performance computing, Databases}
}
Document
An Efficient, Practical Algorithm and Implementation for Computing Multiplicatively Weighted Voronoi Diagrams

Authors: Martin Held and Stefan de Lorenzo

Published in: LIPIcs, Volume 173, 28th Annual European Symposium on Algorithms (ESA 2020)


Abstract
We present a simple wavefront-like approach for computing multiplicatively weighted Voronoi diagrams of points and straight-line segments in the Euclidean plane. If the input sites can be assumed to be randomly weighted points, then the use of a so-called overlay arrangement [Har-Peled & Raichel, Discrete Comput. Geom. 53:547-568, 2015] allows us to achieve an expected runtime complexity of 𝒪(n log⁴ n), while still maintaining the simplicity of our approach. We implemented the full algorithm for weighted points as input sites, based on CGAL. The results of an experimental evaluation of our implementation suggest 𝒪(n log² n) as a practical bound on the runtime. Our algorithm can be extended to handle additive weights in addition to multiplicative weights, and it yields a truly simple 𝒪(n log n) solution for the one-dimensional version of this problem.
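
The defining distance of a multiplicatively weighted Voronoi diagram is the Euclidean distance to a site divided by the site's weight; the sketch below (a hypothetical brute-force rasterisation in Python, not the wavefront algorithm or the CGAL-based implementation described above) labels grid points by their nearest weighted site.

from math import hypot

# point sites with multiplicative weights: ((x, y), weight)
sites = [((2.0, 2.0), 1.0), ((7.0, 3.0), 2.5), ((4.0, 7.0), 1.5)]

def weighted_distance(p, site):
    (sx, sy), w = site
    return hypot(p[0] - sx, p[1] - sy) / w

def nearest_site(p):
    return min(range(len(sites)), key=lambda i: weighted_distance(p, sites[i]))

# label a coarse 10x10 grid; regions of heavily weighted sites grow, and
# regions may be disconnected and bounded by circular arcs
for y in range(10):
    print("".join(str(nearest_site((x + 0.5, y + 0.5))) for x in range(10)))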

Cite as

Martin Held and Stefan de Lorenzo. An Efficient, Practical Algorithm and Implementation for Computing Multiplicatively Weighted Voronoi Diagrams. In 28th Annual European Symposium on Algorithms (ESA 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 173, pp. 56:1-56:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{held_et_al:LIPIcs.ESA.2020.56,
  author =	{Held, Martin and de Lorenzo, Stefan},
  title =	{{An Efficient, Practical Algorithm and Implementation for Computing Multiplicatively Weighted Voronoi Diagrams}},
  booktitle =	{28th Annual European Symposium on Algorithms (ESA 2020)},
  pages =	{56:1--56:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-162-7},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{173},
  editor =	{Grandoni, Fabrizio and Herman, Grzegorz and Sanders, Peter},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2020.56},
  URN =		{urn:nbn:de:0030-drops-129224},
  doi =		{10.4230/LIPIcs.ESA.2020.56},
  annote =	{Keywords: Voronoi Diagram, multiplicative weight, additive weight, arc expansion, overlay arrangement, implementation, experiments, CGAL, exact arithmetic}
}
Document
CG Challenge
Computing Low-Cost Convex Partitions for Planar Point Sets Based on Tailored Decompositions (CG Challenge)

Authors: Günther Eder, Martin Held, Stefan de Lorenzo, and Peter Palfrader

Published in: LIPIcs, Volume 164, 36th International Symposium on Computational Geometry (SoCG 2020)


Abstract
Our work on minimum convex decompositions is based on two key components: (1) different strategies for computing initial decompositions, partly adapted to the characteristics of the input data, and (2) local optimizations for reducing the number of convex faces of a decomposition. We discuss our main heuristics and show how they helped to reduce the face count.
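
At its core, one such local optimization is a face-merging step: two faces of the decomposition that share an edge can be replaced by their union whenever that union is convex, reducing the face count by one. The sketch below (plain Python on toy data; helper names are hypothetical and this is not the authors' implementation) shows the basic convexity test behind such a merge.

def is_convex(poly):
    # True if the counter-clockwise polygon makes only left (or straight) turns.
    n = len(poly)
    for i in range(n):
        (ax, ay), (bx, by), (cx, cy) = poly[i], poly[(i + 1) % n], poly[(i + 2) % n]
        if (bx - ax) * (cy - by) - (by - ay) * (cx - bx) < 0:
            return False
    return True

def merge_along_edge(a, b, u, v):
    # Union of CCW polygons a and b sharing edge (u, v): u->v appears in a, v->u in b.
    i = a.index(v)
    a_part = a[i:] + a[:i]               # v, ..., u  (all of a's boundary)
    j = b.index(u)
    b_part = (b[j:] + b[:j])[1:-1]       # b's vertices strictly between u and v
    return a_part + b_part

# two triangles of a decomposition sharing the edge (0,0)-(2,0)
face_a = [(0.0, 0.0), (2.0, 0.0), (1.0, 1.0)]
face_b = [(0.0, 0.0), (1.0, -1.0), (2.0, 0.0)]
merged = merge_along_edge(face_a, face_b, (0.0, 0.0), (2.0, 0.0))
print(merged, "convex:", is_convex(merged))   # convex, so the merge saves one face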

Cite as

Günther Eder, Martin Held, Stefan de Lorenzo, and Peter Palfrader. Computing Low-Cost Convex Partitions for Planar Point Sets Based on Tailored Decompositions (CG Challenge). In 36th International Symposium on Computational Geometry (SoCG 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 164, pp. 85:1-85:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{eder_et_al:LIPIcs.SoCG.2020.85,
  author =	{Eder, G\"{u}nther and Held, Martin and de Lorenzo, Stefan and Palfrader, Peter},
  title =	{{Computing Low-Cost Convex Partitions for Planar Point Sets Based on Tailored Decompositions}},
  booktitle =	{36th International Symposium on Computational Geometry (SoCG 2020)},
  pages =	{85:1--85:11},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-143-6},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{164},
  editor =	{Cabello, Sergio and Chen, Danny Z.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SoCG.2020.85},
  URN =		{urn:nbn:de:0030-drops-122438},
  doi =		{10.4230/LIPIcs.SoCG.2020.85},
  annote =	{Keywords: Computational Geometry, geometric optimization, algorithm engineering, convex decomposition}
}
  • Refine by Author
  • 2 Held, Martin
  • 2 de Lorenzo, Stefan
  • 1 Ahrens, Benedikt
  • 1 Bacchiani, Lorenzo
  • 1 Bonte, Pieter
  • Show More...

  • Refine by Classification
  • 2 Theory of computation → Computational geometry
  • 2 Theory of computation → Type theory
  • 1 Computing methodologies → Description logics
  • 1 Computing methodologies → Temporal reasoning
  • 1 Information systems → Data streams
  • Show More...

  • Refine by Keyword
  • 1 Actegories
  • 1 Behavioural types
  • 1 CGAL
  • 1 Computational Geometry
  • 1 Continuous query processing
  • Show More...

  • Refine by Type
  • 7 document

  • Refine by Publication Year
  • 5 2024
  • 2 2020
