16 Search Results for "Chen, Zhuo"


Document
A Verified Cost Model for Call-By-Push-Value

Authors: Zhuo Zoey Chen, Johannes Åman Pohjola, and Christine Rizkallah

Published in: LIPIcs, Volume 352, 16th International Conference on Interactive Theorem Proving (ITP 2025)


Abstract
The call-by-push-value λ-calculus allows for syntactically specifying the order of evaluation as part of the term language. Hence, it serves as a unifying language for embedding various evaluation strategies, including call-by-value and call-by-name. Given the impact of call-by-push-value, it is remarkable that its adequacy as a model for computational complexity theory has not yet been studied. In this paper, we show that the call-by-push-value λ-calculus is reasonable for both time and space complexity: a cost model is reasonable if it can simulate other reasonable cost models with polynomial overhead in time and constant-factor overhead in space. We achieve this by encoding the call-by-push-value λ-calculus into Turing machines, following a simulation strategy by Forster et al.; for the converse direction, we prove that Levy’s encoding of the call-by-value λ-calculus has reasonable complexity bounds. The main results have been formalised in the HOL4 theorem prover.
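The term language described above makes evaluation order explicit in the syntax. The following toy Python evaluator for a call-by-push-value fragment, with a naïve step counter as a stand-in time cost, is purely illustrative: all names here are this sketch's inventions, the "one step per rule" convention is a simplification, and none of it is the paper's formalised HOL4 development.

```python
from dataclasses import dataclass

# Values (inert data): variables, thunks, and Python literals as base values.
@dataclass
class Var: name: str
@dataclass
class Thunk: body: object          # a suspended computation

# Computations (things that run).
@dataclass
class Return: val: object          # produce a value
@dataclass
class Bind: comp: object; name: str; body: object   # x <- comp; body
@dataclass
class Lam: name: str; body: object
@dataclass
class App: comp: object; arg: object                # run comp, feed it arg
@dataclass
class Force: val: object                            # run a thunk

def eval_value(v, env):
    if isinstance(v, Var):
        return env[v.name]
    if isinstance(v, Thunk):
        return ("thunk-clo", v.body, env)           # close over the environment
    return v                                        # base literal

def run(c, env=None, steps=0):
    """Big-step evaluation; returns (terminal, step count), one step per rule."""
    env = env or {}
    if isinstance(c, Return):
        return Return(eval_value(c.val, env)), steps + 1
    if isinstance(c, Bind):
        r, steps = run(c.comp, env, steps)
        return run(c.body, {**env, c.name: r.val}, steps + 1)
    if isinstance(c, Lam):
        return ("lam-clo", c, env), steps           # terminal: awaiting an argument
    if isinstance(c, App):
        f, steps = run(c.comp, env, steps)
        _, lam, fenv = f
        return run(lam.body, {**fenv, lam.name: eval_value(c.arg, env)}, steps + 1)
    if isinstance(c, Force):
        _, body, tenv = eval_value(c.val, env)
        return run(body, tenv, steps + 1)
    raise TypeError(c)

# The term itself fixes the order: first bind 42 to x, then apply the identity.
prog = Bind(Return(42), "x", App(Lam("y", Return(Var("y"))), Var("x")))
result, n = run(prog)              # result.val == 42
```

Evaluating `prog` takes four rule applications under this counting; the point is only that a cost measure can be attached directly to the operational semantics.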

Cite as

Zhuo Zoey Chen, Johannes Åman Pohjola, and Christine Rizkallah. A Verified Cost Model for Call-By-Push-Value. In 16th International Conference on Interactive Theorem Proving (ITP 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 352, pp. 7:1-7:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@InProceedings{chen_et_al:LIPIcs.ITP.2025.7,
  author =	{Chen, Zhuo Zoey and \r{A}man Pohjola, Johannes and Rizkallah, Christine},
  title =	{{A Verified Cost Model for Call-By-Push-Value}},
  booktitle =	{16th International Conference on Interactive Theorem Proving (ITP 2025)},
  pages =	{7:1--7:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-396-6},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{352},
  editor =	{Forster, Yannick and Keller, Chantal},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITP.2025.7},
  URN =		{urn:nbn:de:0030-drops-246067},
  doi =		{10.4230/LIPIcs.ITP.2025.7},
  annote =	{Keywords: lambda calculus, formalizations of computational models, computability theory, HOL, call-by-push-value reduction, time and space complexity, abstract machines}
}
Artifact
Software
cbpv-reasonable-HOL

Authors: Zhuo Zoey Chen, Johannes Åman Pohjola, and Christine Rizkallah



Cite as

Zhuo Zoey Chen, Johannes Åman Pohjola, and Christine Rizkallah. cbpv-reasonable-HOL (Software, Mechanised Proof). Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@misc{dagstuhl-artifact-24718,
   title = {{cbpv-reasonable-HOL}}, 
   author = {Chen, Zhuo Zoey and \r{A}man Pohjola, Johannes and Rizkallah, Christine},
   note = {Software, swhId: \href{https://archive.softwareheritage.org/swh:1:dir:df18377e9fa5e35255f2687ad66ddbc2f010b934;origin=https://github.com/ZhuoZoeyChen/cbpv-reasonable-HOL;visit=swh:1:snp:9fefd6f03db3694a8bdc7e5ea8ff0f4a1fbde680;anchor=swh:1:rev:4e4f4692c9e6e1e23c566ec4730f81eafde32f3c}{\texttt{swh:1:dir:df18377e9fa5e35255f2687ad66ddbc2f010b934}} (visited on 2025-09-22)},
   url = {https://github.com/ZhuoZoeyChen/cbpv-reasonable-HOL/},
   doi = {10.4230/artifacts.24718},
}
Document
Parallel MIP Solving with Dynamic Task Decomposition

Authors: Peng Lin, Shaowei Cai, Mengchuan Zou, and Shengqi Chen

Published in: LIPIcs, Volume 340, 31st International Conference on Principles and Practice of Constraint Programming (CP 2025)


Abstract
Mixed Integer Programming (MIP) is a foundational model in operations research. Although significant progress has been made in enhancing sequential MIP solvers through sophisticated techniques and heuristics, remarkable developments in computing resources have made parallel solving a promising direction for performance improvement. In this work, we propose a novel parallel MIP solving framework that employs dynamic task decomposition in a divide-and-conquer paradigm. Our framework incorporates a hardness estimate heuristic to identify challenging solving tasks and a reward decaying mechanism to reinforce the task decomposition decision. We apply our framework to two state-of-the-art open-source MIP solvers, SCIP and HiGHS, yielding efficient parallel solvers. Extensive experiments on the full MIPLIB benchmark, using up to 128 cores, demonstrate that our framework yields substantial performance improvements over modern divide-and-conquer parallel solvers. Moreover, our parallel solvers have established new best known solutions for 16 open MIPLIB instances.

Cite as

Peng Lin, Shaowei Cai, Mengchuan Zou, and Shengqi Chen. Parallel MIP Solving with Dynamic Task Decomposition. In 31st International Conference on Principles and Practice of Constraint Programming (CP 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 340, pp. 26:1-26:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@InProceedings{lin_et_al:LIPIcs.CP.2025.26,
  author =	{Lin, Peng and Cai, Shaowei and Zou, Mengchuan and Chen, Shengqi},
  title =	{{Parallel MIP Solving with Dynamic Task Decomposition}},
  booktitle =	{31st International Conference on Principles and Practice of Constraint Programming (CP 2025)},
  pages =	{26:1--26:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-380-5},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{340},
  editor =	{de la Banda, Maria Garcia},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2025.26},
  URN =		{urn:nbn:de:0030-drops-238871},
  doi =		{10.4230/LIPIcs.CP.2025.26},
  annote =	{Keywords: Mixed Integer Programming, Parallel Computing, Complete Search, Task Decomposition}
}
Document
SLS-Enhanced Core-Boosted Linear Search for Anytime Maximum Satisfiability

Authors: Ole Lübke and Jeremias Berg

Published in: LIPIcs, Volume 340, 31st International Conference on Principles and Practice of Constraint Programming (CP 2025)


Abstract
Maximum Satisfiability (MaxSAT), the constraint paradigm of minimizing a linear expression over Boolean (0-1) variables subject to a set of propositional clauses, is today used for solving NP-hard combinatorial optimization problems in various domains. Especially anytime MaxSAT solvers, which compute low-cost solutions within a limited available computational time, have significantly improved in recent years. Such solvers can be divided into SAT-based methods that use sophisticated reasoning and stochastic local search (SLS) methods that heuristically explore the search space. The two are complementary; roughly speaking, SLS struggles with finding feasible solutions, and SAT-based methods with minimizing cost. Consequently, most state-of-the-art anytime MaxSAT solvers run SLS before a SAT-based algorithm, with minimal communication between the two. In this paper, we aim to harness the complementary strengths of SAT-based and SLS approaches in the context of anytime MaxSAT. More precisely, we describe several ways to enhance the performance of the so-called core-boosted linear search algorithm for anytime MaxSAT with SLS techniques. Core-boosted linear search is a three-phase algorithm where each phase uses a different type of reasoning. Beyond MaxSAT, core-boosted search has also been successful in the related paradigms of pseudo-Boolean optimization and constraint programming. We describe how an SLS approach to MaxSAT can be tightly integrated with all three phases of the algorithm, resulting in non-trivial information exchange in both directions between the SLS algorithm and the reasoning methods. We evaluate our techniques on standard benchmarks from the latest MaxSAT Evaluation and demonstrate that they can noticeably improve on implementations of core-boosted search and SLS.
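The solution-improving ("linear SAT-UNSAT") search that core-boosted linear search builds on can be caricatured in a few lines. In this sketch the brute-force `sat_below` is a hypothetical stand-in for an incremental SAT call, and nothing here reflects the paper's SLS integration; it only shows why the loop is anytime: every satisfiable call yields a feasible solution and lowers the upper bound, and the final unsatisfiable call certifies optimality.

```python
from itertools import product

def violated(asg, clause):
    # A clause is a list of non-zero ints; literal l holds iff asg[|l|-1] == (l > 0).
    return not any(asg[abs(l) - 1] == (l > 0) for l in clause)

def cost(asg, soft):
    return sum(w for w, cl in soft if violated(asg, cl))

def sat_below(n, hard, soft, ub):
    """Brute-force stand-in for one incremental SAT call: return any assignment
    satisfying all hard clauses with soft cost strictly below ub, else None."""
    for asg in product([False, True], repeat=n):
        if all(not violated(asg, cl) for cl in hard) and cost(asg, soft) < ub:
            return asg
    return None

def linear_sat_unsat(n, hard, soft):
    """Solution-improving linear search: anytime, since each SAT answer is a
    feasible solution; the concluding UNSAT answer proves optimality."""
    best, ub = None, float("inf")
    while True:
        m = sat_below(n, hard, soft, ub)
        if m is None:
            return best, (None if best is None else ub)
        best, ub = m, cost(m, soft)

# Hard: x1.  Soft (weight 1 each): not-x1, x2, x3.  Optimum violates only not-x1.
best, opt = linear_sat_unsat(3, hard=[[1]], soft=[(1, [-1]), (1, [2]), (1, [3])])
```

A real implementation would replace `sat_below` with an incremental CDCL solver plus a cardinality or pseudo-Boolean encoding of the cost bound; the paper's contribution is feeding SLS information into phases of exactly this kind of loop.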

Cite as

Ole Lübke and Jeremias Berg. SLS-Enhanced Core-Boosted Linear Search for Anytime Maximum Satisfiability. In 31st International Conference on Principles and Practice of Constraint Programming (CP 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 340, pp. 28:1-28:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@InProceedings{lubke_et_al:LIPIcs.CP.2025.28,
  author =	{L\"{u}bke, Ole and Berg, Jeremias},
  title =	{{SLS-Enhanced Core-Boosted Linear Search for Anytime Maximum Satisfiability}},
  booktitle =	{31st International Conference on Principles and Practice of Constraint Programming (CP 2025)},
  pages =	{28:1--28:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-380-5},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{340},
  editor =	{de la Banda, Maria Garcia},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2025.28},
  URN =		{urn:nbn:de:0030-drops-238897},
  doi =		{10.4230/LIPIcs.CP.2025.28},
  annote =	{Keywords: Maximum Satisfiability, MaxSAT, SAT, SLS, Anytime Optimization}
}
Document
Streamlining Distributed SAT Solver Design

Authors: Dominik Schreiber, Niccolò Rigi-Luperti, and Armin Biere

Published in: LIPIcs, Volume 341, 28th International Conference on Theory and Applications of Satisfiability Testing (SAT 2025)


Abstract
Distributed clause-sharing SAT solvers have recently been established as powerful automated reasoning tools that can conquer previously infeasible instances. A common design of distributed SAT solvers is to run many off-the-shelf sequential solvers in parallel, employ some diversification (e.g., restart intervals or decision orders), and share conflict clauses among the solver threads. This approach naïvely adopts all best practices of sequential solver design for distributed solving, where these practices may be less useful or even actively detrimental. In this work we diagnose such shortcomings in the state-of-the-art system MallobSat and propose first effective mitigations. In particular, we replace the redundant pre- and inprocessing at all threads with single-core preprocessing that runs alongside the parallel search, remove LBD values from the clause-sharing operation, and slim down solver diversification to very few lightweight and uniform methods. Experimental evaluations on up to 3072 cores (64 nodes) confirm that our measures improve performance while also drastically simplifying the SAT solving program that is run in parallel.

Cite as

Dominik Schreiber, Niccolò Rigi-Luperti, and Armin Biere. Streamlining Distributed SAT Solver Design. In 28th International Conference on Theory and Applications of Satisfiability Testing (SAT 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 341, pp. 27:1-27:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@InProceedings{schreiber_et_al:LIPIcs.SAT.2025.27,
  author =	{Schreiber, Dominik and Rigi-Luperti, Niccol\`{o} and Biere, Armin},
  title =	{{Streamlining Distributed SAT Solver Design}},
  booktitle =	{28th International Conference on Theory and Applications of Satisfiability Testing (SAT 2025)},
  pages =	{27:1--27:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-381-2},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{341},
  editor =	{Berg, Jeremias and Nordstr\"{o}m, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SAT.2025.27},
  URN =		{urn:nbn:de:0030-drops-237615},
  doi =		{10.4230/LIPIcs.SAT.2025.27},
  annote =	{Keywords: Satisfiability, parallel SAT solving, distributed computing, preprocessing}
}
Document
Elements for Weighted Answer-Set Programming

Authors: Francisco Coelho, Bruno Dinis, Dietmar Seipel, and Salvador Abreu

Published in: OASIcs, Volume 135, 14th Symposium on Languages, Applications and Technologies (SLATE 2025)


Abstract
Logic programs, more specifically, answer-set programs, can be annotated with probabilities on facts to express uncertainty. We address the problem of propagating weight annotations on facts (e.g. probabilities) of an answer-set program to its stable models, and from there to events (defined as sets of atoms) in a dataset over the program’s domain. We propose a novel approach which is algebraic in the sense that it relies on an equivalence relation over the set of events. Uncertainty is then described as polynomial expressions over variables. We propagate the weight function in the space of models and events, rather than doing so within the syntax of the program. As evidence that our approach is sound, we show that certain facts behave as expected. Our approach allows us to investigate weight-annotated programs and to determine how suitable a given one is for modeling a given dataset containing events. Its core is illustrated by a running example and the encoding of a Bayesian network.
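The weight-propagation idea can be illustrated on a toy program of my own devising (two annotated facts, one deterministic rule), which is not from the paper: each total choice of facts yields a stable model, the choice's weight is pushed onto that model, and an event's weight sums over the models containing it. The paper works with polynomial expressions over variables; this sketch simplifies to concrete numbers.

```python
from itertools import product

# Probabilistically annotated facts (hypothetical mini-program).
facts = {"rain": 0.3, "sprinkler": 0.5}

def stable_model(chosen):
    """Deterministic rules:  wet :- rain.   wet :- sprinkler."""
    m = set(chosen)
    if "rain" in m or "sprinkler" in m:
        m.add("wet")
    return frozenset(m)

def model_weights():
    """Push the weight of each total choice of facts onto its stable model."""
    w = {}
    for bits in product([False, True], repeat=len(facts)):
        chosen = {f for f, b in zip(facts, bits) if b}
        weight = 1.0
        for f, p in facts.items():
            weight *= p if f in chosen else 1 - p
        m = stable_model(chosen)
        w[m] = w.get(m, 0.0) + weight
    return w

def event_weight(event, w):
    """An event is a set of atoms; its weight sums over models containing it."""
    return sum(weight for m, weight in w.items() if set(event) <= m)

weights = model_weights()
p_wet = event_weight({"wet"}, weights)   # 1 - 0.7 * 0.5 = 0.65
```

Replacing the numeric probabilities by symbolic variables turns each event weight into a polynomial, which is closer to the algebraic treatment the abstract describes.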

Cite as

Francisco Coelho, Bruno Dinis, Dietmar Seipel, and Salvador Abreu. Elements for Weighted Answer-Set Programming. In 14th Symposium on Languages, Applications and Technologies (SLATE 2025). Open Access Series in Informatics (OASIcs), Volume 135, pp. 3:1-3:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@InProceedings{coelho_et_al:OASIcs.SLATE.2025.3,
  author =	{Coelho, Francisco and Dinis, Bruno and Seipel, Dietmar and Abreu, Salvador},
  title =	{{Elements for Weighted Answer-Set Programming}},
  booktitle =	{14th Symposium on Languages, Applications and Technologies (SLATE 2025)},
  pages =	{3:1--3:16},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-387-4},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{135},
  editor =	{Baptista, Jorge and Barateiro, Jos\'{e}},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SLATE.2025.3},
  URN =		{urn:nbn:de:0030-drops-236836},
  doi =		{10.4230/OASIcs.SLATE.2025.3},
  annote =	{Keywords: Answer-Set Programming, Stable Models, Probabilistic Logic Programming}
}
Document
Formal Verification in Solidity and Move: Insights from a Comparative Analysis

Authors: Massimo Bartoletti, Silvia Crafa, and Enrico Lipparini

Published in: OASIcs, Volume 129, 6th International Workshop on Formal Methods for Blockchains (FMBC 2025)


Abstract
Formal verification plays a crucial role in making smart contracts safer, being able to find bugs or to guarantee their absence, as well as checking whether the business logic is correctly implemented. For Solidity, even though there already exist several mature verification tools, the semantical quirks of the language can make verification quite hard in practice. Move, on the other hand, has been designed with security and verification in mind, and it has been accompanied since its early stages by a formal verification tool, the Move Prover. In this paper, we investigate through a comparative analysis: 1) how the different designs of the two contract languages impact verification, and 2) what is the state-of-the-art of verification tools for the two languages, and how do they compare on three paradigmatic use cases. Our investigation is supported by an open dataset of verification tasks performed in Certora and in the Aptos Move Prover.

Cite as

Massimo Bartoletti, Silvia Crafa, and Enrico Lipparini. Formal Verification in Solidity and Move: Insights from a Comparative Analysis. In 6th International Workshop on Formal Methods for Blockchains (FMBC 2025). Open Access Series in Informatics (OASIcs), Volume 129, pp. 3:1-3:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@InProceedings{bartoletti_et_al:OASIcs.FMBC.2025.3,
  author =	{Bartoletti, Massimo and Crafa, Silvia and Lipparini, Enrico},
  title =	{{Formal Verification in Solidity and Move: Insights from a Comparative Analysis}},
  booktitle =	{6th International Workshop on Formal Methods for Blockchains (FMBC 2025)},
  pages =	{3:1--3:18},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-371-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{129},
  editor =	{Marmsoler, Diego and Xu, Meng},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.FMBC.2025.3},
  URN =		{urn:nbn:de:0030-drops-230302},
  doi =		{10.4230/OASIcs.FMBC.2025.3},
  annote =	{Keywords: Smart contracts, Solidity, Move, Verification, Blockchain}
}
Document
O(1)-Round MPC Algorithms for Multi-Dimensional Grid Graph Connectivity, Euclidean MST and DBSCAN

Authors: Junhao Gan, Anthony Wirth, and Zhuo Zhang

Published in: LIPIcs, Volume 328, 28th International Conference on Database Theory (ICDT 2025)


Abstract
In this paper, we investigate three fundamental problems in the Massively Parallel Computation (MPC) model: (i) grid graph connectivity, (ii) approximate Euclidean Minimum Spanning Tree (EMST), and (iii) approximate DBSCAN. Our first result is an O(1)-round Las Vegas (i.e., succeeding with high probability) MPC algorithm for computing the connected components of a d-dimensional c-penetration grid graph ((d,c)-grid graph), where both d and c are positive integer constants. In such a grid graph, each vertex is a point with integer coordinates in ℕ^d, and an edge can only exist between two distinct vertices whose 𝓁_∞-distance is at most c. To our knowledge, the best existing way to compute the connected components (CCs) of (d,c)-grid graphs in the MPC model is to run the state-of-the-art MPC CC algorithms designed for general graphs: they achieve O(log log n + log D) [Behnezhad et al., 2019] and O(log log n + log(1/λ)) [Assadi et al., 2019] rounds, respectively, where D is the diameter and λ is the spectral gap of the graph. With our grid graph connectivity technique, our second main result is an O(1)-round Las Vegas MPC algorithm for computing an approximate Euclidean MST. The existing state-of-the-art result on this problem is the O(1)-round MPC algorithm proposed by Andoni et al. [2014], which only guarantees an approximation of the overall weight in expectation. In contrast, our algorithm not only guarantees a deterministic overall-weight approximation, but also achieves a deterministic edge-wise weight approximation. The latter property is crucial for many applications, such as finding the Bichromatic Closest Pair and Single-Linkage Clustering. Last, but not least, our third main result is an O(1)-round Las Vegas MPC algorithm for computing an approximate DBSCAN clustering in O(1)-dimensional Euclidean space.
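For intuition about the graph class, here is a plain sequential union-find baseline for (d,c)-grid graph connectivity, under the simplifying assumption that every vertex pair at 𝓁_∞-distance at most c is actually an edge. The paper's contribution, computing these components in O(1) MPC rounds, is not attempted here; the function names are this sketch's own.

```python
from itertools import product

def grid_components(points, c):
    """Connected components of a (d, c)-grid graph: vertices are integer points,
    and (simplifying assumption) every pair at l-infinity distance <= c is an
    edge. Sequential union-find; not the paper's O(1)-round MPC algorithm."""
    pts = [tuple(p) for p in points]
    index = {p: i for i, p in enumerate(pts)}
    parent = list(range(len(pts)))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    d = len(pts[0])
    for i, p in enumerate(pts):
        # probe the l-infinity ball of radius c around p for other vertices
        for off in product(range(-c, c + 1), repeat=d):
            j = index.get(tuple(a + b for a, b in zip(p, off)))
            if j is not None and j != i:
                parent[find(i)] = find(j)
    return {p: find(i) for i, p in enumerate(pts)}

# (0,0) and (1,1) are within l-infinity distance 1; (5,5)-(5,6) likewise.
comps = grid_components([(0, 0), (1, 1), (5, 5), (5, 6)], c=1)
```

Because d and c are constants, each vertex probes only a constant-size neighbourhood, which is the structural property the MPC algorithm exploits.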

Cite as

Junhao Gan, Anthony Wirth, and Zhuo Zhang. O(1)-Round MPC Algorithms for Multi-Dimensional Grid Graph Connectivity, Euclidean MST and DBSCAN. In 28th International Conference on Database Theory (ICDT 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 328, pp. 7:1-7:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@InProceedings{gan_et_al:LIPIcs.ICDT.2025.7,
  author =	{Gan, Junhao and Wirth, Anthony and Zhang, Zhuo},
  title =	{{O(1)-Round MPC Algorithms for Multi-Dimensional Grid Graph Connectivity, Euclidean MST and DBSCAN}},
  booktitle =	{28th International Conference on Database Theory (ICDT 2025)},
  pages =	{7:1--7:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-364-5},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{328},
  editor =	{Roy, Sudeepa and Kara, Ahmet},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICDT.2025.7},
  URN =		{urn:nbn:de:0030-drops-229483},
  doi =		{10.4230/LIPIcs.ICDT.2025.7},
  annote =	{Keywords: Massively Parallel Computation, Graph Connectivity, Grid Graphs, Euclidean Minimum Spanning Tree, DBSCAN}
}
Document
Optimal Oblivious Algorithms for Multi-Way Joins

Authors: Xiao Hu and Zhiang Wu

Published in: LIPIcs, Volume 328, 28th International Conference on Database Theory (ICDT 2025)


Abstract
In cloud databases, cloud computation over sensitive data uploaded by clients inevitably causes concern about data security and privacy. Even if cryptographic primitives and trusted computing environments are integrated into query processing to safeguard the actual contents of the data, access patterns of algorithms can still leak private information about data. Oblivious RAM (ORAM) and circuits are two generic approaches to address this issue, ensuring that access patterns of algorithms remain oblivious to the data. However, deploying these methods on insecure algorithms, particularly for multi-way join processing, is computationally expensive and inherently challenging. In this paper, we propose a novel sorting-based algorithm for multi-way join processing that operates without relying on ORAM simulations or other security assumptions. Our algorithm is a non-trivial, provably oblivious composition of basic primitives, with time complexity matching the insecure worst-case optimal join algorithm, up to a logarithmic factor. Furthermore, it is cache-agnostic, with cache complexity matching the insecure lower bound, also up to a logarithmic factor. This clean and straightforward approach has the potential to be extended to other security settings and implemented in practical database systems.
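A minimal example of the obliviousness property discussed above is a sorting network such as bitonic sort: its compare-exchange schedule depends only on the input length, never on the data, so its memory access pattern leaks nothing about the contents. This classic primitive merely illustrates what "oblivious" and "sorting-based" mean; it is not the paper's join algorithm.

```python
def bitonic_sort(a):
    """In-place bitonic sort (length must be a power of two).
    The sequence of (i, l) index pairs compared below is a function of len(a)
    alone, so every input of a given length touches memory in exactly the same
    order: the algorithm is oblivious by construction."""
    n = len(a)
    assert n > 0 and n & (n - 1) == 0, "length must be a power of two"
    k = 2
    while k <= n:                      # size of bitonic runs being merged
        j = k // 2
        while j > 0:                   # distance between compared elements
            for i in range(n):
                l = i ^ j
                if l > i:              # compare-exchange; direction set by i & k
                    if ((i & k) == 0 and a[i] > a[l]) or \
                       ((i & k) != 0 and a[i] < a[l]):
                        a[i], a[l] = a[l], a[i]
            j //= 2
        k *= 2
    return a
```

The network performs O(n log² n) comparisons regardless of input, matching the "up to a logarithmic factor" flavour of overheads typical when obliviousness is bought with sorting primitives.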

Cite as

Xiao Hu and Zhiang Wu. Optimal Oblivious Algorithms for Multi-Way Joins. In 28th International Conference on Database Theory (ICDT 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 328, pp. 25:1-25:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@InProceedings{hu_et_al:LIPIcs.ICDT.2025.25,
  author =	{Hu, Xiao and Wu, Zhiang},
  title =	{{Optimal Oblivious Algorithms for Multi-Way Joins}},
  booktitle =	{28th International Conference on Database Theory (ICDT 2025)},
  pages =	{25:1--25:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-364-5},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{328},
  editor =	{Roy, Sudeepa and Kara, Ahmet},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICDT.2025.25},
  URN =		{urn:nbn:de:0030-drops-229662},
  doi =		{10.4230/LIPIcs.ICDT.2025.25},
  annote =	{Keywords: oblivious algorithms, multi-way joins, worst-case optimality}
}
Document
Online Balanced Allocation of Dynamic Components

Authors: Rajmohan Rajaraman and Omer Wasim

Published in: LIPIcs, Volume 325, 16th Innovations in Theoretical Computer Science Conference (ITCS 2025)


Abstract
We introduce Online Balanced Allocation of Dynamic Components (OBADC), a problem motivated by the practical challenge of dynamic resource allocation for large-scale distributed applications. In OBADC, we need to allocate a dynamic set of at most k𝓁 vertices (representing processes) in 𝓁 > 0 clusters. We consider an over-provisioned setup in which each cluster can hold at most k(1+ε) vertices, for an arbitrary constant ε > 0. The communication requirements among the vertices are modeled by the notion of a dynamically changing component, a subset of vertices that need to be co-located in the same cluster. At each time t, a request r_t of one of the following types arrives: 1) insertion of a vertex v, forming the singleton component {v}, at unit cost; 2) merge of (u,v), requiring that the components containing u and v be merged and co-located thereafter; 3) deletion of an existing vertex v, at zero cost. Before serving any request, an algorithm can migrate vertices from one cluster to another, at a unit migration cost per vertex. We seek an online algorithm that minimizes the total migration cost incurred for an arbitrary request sequence σ = (r_t)_{t > 0}, while simultaneously minimizing the number of clusters utilized. We analyze competitiveness with respect to an optimal clairvoyant offline algorithm with identical (over-provisioned) capacity constraints. We give an O(log k)-competitive algorithm for OBADC and a matching lower bound. The number of clusters utilized by our algorithm is always within a (2+ε) factor of the minimum. Furthermore, in a resource-augmented setting where the optimal offline algorithm is constrained to capacity k per cluster, our algorithm obtains O(log k) competitiveness and utilizes a number of clusters within a (1+ε) factor of the minimum.
We also consider OBADC in the context of machine-learned predictions, where for each newly inserted vertex v at time t: i) with probability η > 0, the set of vertices (that exist at time t) in the component of v is revealed; and ii) with probability 1-η, no information is revealed. For OBADC with predictions, we give an O(1)-consistent and O(min(log(1/η), log k))-robust algorithm.

Cite as

Rajmohan Rajaraman and Omer Wasim. Online Balanced Allocation of Dynamic Components. In 16th Innovations in Theoretical Computer Science Conference (ITCS 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 325, pp. 81:1-81:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@InProceedings{rajaraman_et_al:LIPIcs.ITCS.2025.81,
  author =	{Rajaraman, Rajmohan and Wasim, Omer},
  title =	{{Online Balanced Allocation of Dynamic Components}},
  booktitle =	{16th Innovations in Theoretical Computer Science Conference (ITCS 2025)},
  pages =	{81:1--81:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-361-4},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{325},
  editor =	{Meka, Raghu},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2025.81},
  URN =		{urn:nbn:de:0030-drops-227090},
  doi =		{10.4230/LIPIcs.ITCS.2025.81},
  annote =	{Keywords: online algorithms, competitive ratio, algorithms with predictions}
}
Document
Position
Knowledge Graphs for the Life Sciences: Recent Developments, Challenges and Opportunities

Authors: Jiaoyan Chen, Hang Dong, Janna Hastings, Ernesto Jiménez-Ruiz, Vanessa López, Pierre Monnin, Catia Pesquita, Petr Škoda, and Valentina Tamma

Published in: TGDK, Volume 1, Issue 1 (2023): Special Issue on Trends in Graph Data and Knowledge. Transactions on Graph Data and Knowledge, Volume 1, Issue 1


Abstract
The term life sciences refers to the disciplines that study living organisms and life processes, and include chemistry, biology, medicine, and a range of other related disciplines. Research efforts in life sciences are heavily data-driven, as they produce and consume vast amounts of scientific data, much of which is intrinsically relational and graph-structured. The volume of data and the complexity of scientific concepts and relations referred to therein promote the application of advanced knowledge-driven technologies for managing and interpreting data, with the ultimate aim to advance scientific discovery. In this survey and position paper, we discuss recent developments and advances in the use of graph-based technologies in life sciences and set out a vision for how these technologies will impact these fields into the future. We focus on three broad topics: the construction and management of Knowledge Graphs (KGs), the use of KGs and associated technologies in the discovery of new knowledge, and the use of KGs in artificial intelligence applications to support explanations (explainable AI). We select a few exemplary use cases for each topic, discuss the challenges and open research questions within these topics, and conclude with a perspective and outlook that summarizes the overarching challenges and their potential solutions as a guide for future research.

Cite as

Jiaoyan Chen, Hang Dong, Janna Hastings, Ernesto Jiménez-Ruiz, Vanessa López, Pierre Monnin, Catia Pesquita, Petr Škoda, and Valentina Tamma. Knowledge Graphs for the Life Sciences: Recent Developments, Challenges and Opportunities. In Special Issue on Trends in Graph Data and Knowledge. Transactions on Graph Data and Knowledge (TGDK), Volume 1, Issue 1, pp. 5:1-5:33, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)



@Article{chen_et_al:TGDK.1.1.5,
  author =	{Chen, Jiaoyan and Dong, Hang and Hastings, Janna and Jim\'{e}nez-Ruiz, Ernesto and L\'{o}pez, Vanessa and Monnin, Pierre and Pesquita, Catia and \v{S}koda, Petr and Tamma, Valentina},
  title =	{{Knowledge Graphs for the Life Sciences: Recent Developments, Challenges and Opportunities}},
  journal =	{Transactions on Graph Data and Knowledge},
  pages =	{5:1--5:33},
  year =	{2023},
  volume =	{1},
  number =	{1},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/TGDK.1.1.5},
  URN =		{urn:nbn:de:0030-drops-194791},
  doi =		{10.4230/TGDK.1.1.5},
  annote =	{Keywords: Knowledge graphs, Life science, Knowledge discovery, Explainable AI}
}
Document
Position
Large Language Models and Knowledge Graphs: Opportunities and Challenges

Authors: Jeff Z. Pan, Simon Razniewski, Jan-Christoph Kalo, Sneha Singhania, Jiaoyan Chen, Stefan Dietze, Hajira Jabeen, Janna Omeliyanenko, Wen Zhang, Matteo Lissandrini, Russa Biswas, Gerard de Melo, Angela Bonifati, Edlira Vakaj, Mauro Dragoni, and Damien Graux

Published in: TGDK, Volume 1, Issue 1 (2023): Special Issue on Trends in Graph Data and Knowledge. Transactions on Graph Data and Knowledge, Volume 1, Issue 1


Abstract
Large Language Models (LLMs) have taken Knowledge Representation - and the world - by storm. This inflection point marks a shift from explicit knowledge representation to a renewed focus on the hybrid representation of both explicit knowledge and parametric knowledge. In this position paper, we will discuss some of the common debate points within the community on LLMs (parametric knowledge) and Knowledge Graphs (explicit knowledge) and speculate on opportunities and visions that the renewed focus brings, as well as related research topics and challenges.

Cite as

Jeff Z. Pan, Simon Razniewski, Jan-Christoph Kalo, Sneha Singhania, Jiaoyan Chen, Stefan Dietze, Hajira Jabeen, Janna Omeliyanenko, Wen Zhang, Matteo Lissandrini, Russa Biswas, Gerard de Melo, Angela Bonifati, Edlira Vakaj, Mauro Dragoni, and Damien Graux. Large Language Models and Knowledge Graphs: Opportunities and Challenges. In Special Issue on Trends in Graph Data and Knowledge. Transactions on Graph Data and Knowledge (TGDK), Volume 1, Issue 1, pp. 2:1-2:38, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)



@Article{pan_et_al:TGDK.1.1.2,
  author =	{Pan, Jeff Z. and Razniewski, Simon and Kalo, Jan-Christoph and Singhania, Sneha and Chen, Jiaoyan and Dietze, Stefan and Jabeen, Hajira and Omeliyanenko, Janna and Zhang, Wen and Lissandrini, Matteo and Biswas, Russa and de Melo, Gerard and Bonifati, Angela and Vakaj, Edlira and Dragoni, Mauro and Graux, Damien},
  title =	{{Large Language Models and Knowledge Graphs: Opportunities and Challenges}},
  journal =	{Transactions on Graph Data and Knowledge},
  pages =	{2:1--2:38},
  year =	{2023},
  volume =	{1},
  number =	{1},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/TGDK.1.1.2},
  URN =		{urn:nbn:de:0030-drops-194766},
  doi =		{10.4230/TGDK.1.1.2},
  annote =	{Keywords: Large Language Models, Pre-trained Language Models, Knowledge Graphs, Ontology, Retrieval Augmented Language Models}
}
Document
Vision
Machine Learning and Knowledge Graphs: Existing Gaps and Future Research Challenges

Authors: Claudia d'Amato, Louis Mahon, Pierre Monnin, and Giorgos Stamou

Published in: Transactions on Graph Data and Knowledge (TGDK), Volume 1, Issue 1 (2023): Special Issue on Trends in Graph Data and Knowledge


Abstract
The graph model is now widely adopted to represent a broad range of knowledge and data, from social networks to knowledge graphs (KGs), demonstrating how symbolic and transparent AI can scale on the World Wide Web. However, due to their unprecedented volume, such graphs are generally tackled with Machine Learning (ML) and mostly numeric methods, such as knowledge graph embedding models (KGEs) and deep neural networks (DNNs). These methods have lately proved very effective, driving the current AI spring. In this vision paper, we introduce some of the main existing methods for combining KGs and ML, divided into two categories: those using ML to improve KGs, and those using KGs to improve results on ML tasks. From this introduction, we highlight research gaps and perspectives that we deem promising and currently under-explored for the involved research communities, ranging from KG support for LLM prompting and the integration of KG semantics into ML models, to symbol-based methods, the interpretability of ML models, and the need for improved benchmark datasets. In our opinion, such perspectives are stepping stones towards an ultimate view of KGs as central assets for neuro-symbolic and explainable AI.

Cite as

Claudia d'Amato, Louis Mahon, Pierre Monnin, and Giorgos Stamou. Machine Learning and Knowledge Graphs: Existing Gaps and Future Research Challenges. In Special Issue on Trends in Graph Data and Knowledge. Transactions on Graph Data and Knowledge (TGDK), Volume 1, Issue 1, pp. 8:1-8:35, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@Article{damato_et_al:TGDK.1.1.8,
  author =	{d'Amato, Claudia and Mahon, Louis and Monnin, Pierre and Stamou, Giorgos},
  title =	{{Machine Learning and Knowledge Graphs: Existing Gaps and Future Research Challenges}},
  journal =	{Transactions on Graph Data and Knowledge},
  pages =	{8:1--8:35},
  year =	{2023},
  volume =	{1},
  number =	{1},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/TGDK.1.1.8},
  URN =		{urn:nbn:de:0030-drops-194824},
  doi =		{10.4230/TGDK.1.1.8},
  annote =	{Keywords: Graph-based Learning, Knowledge Graph Embeddings, Large Language Models, Explainable AI, Knowledge Graph Completion \& Curation}
}
Document
Vision
Knowledge Engineering Using Large Language Models

Authors: Bradley P. Allen, Lise Stork, and Paul Groth

Published in: Transactions on Graph Data and Knowledge (TGDK), Volume 1, Issue 1 (2023): Special Issue on Trends in Graph Data and Knowledge


Abstract
Knowledge engineering is a discipline that focuses on the creation and maintenance of processes that generate and apply knowledge. Traditionally, knowledge engineering approaches have focused on knowledge expressed in formal languages. The emergence of large language models and their capabilities to effectively work with natural language, in its broadest sense, raises questions about the foundations and practice of knowledge engineering. Here, we outline the potential role of LLMs in knowledge engineering, identifying two central directions: 1) creating hybrid neuro-symbolic knowledge systems; and 2) enabling knowledge engineering in natural language. Additionally, we formulate key open research questions to tackle these directions.

Cite as

Bradley P. Allen, Lise Stork, and Paul Groth. Knowledge Engineering Using Large Language Models. In Special Issue on Trends in Graph Data and Knowledge. Transactions on Graph Data and Knowledge (TGDK), Volume 1, Issue 1, pp. 3:1-3:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@Article{allen_et_al:TGDK.1.1.3,
  author =	{Allen, Bradley P. and Stork, Lise and Groth, Paul},
  title =	{{Knowledge Engineering Using Large Language Models}},
  journal =	{Transactions on Graph Data and Knowledge},
  pages =	{3:1--3:19},
  ISSN =	{2942-7517},
  year =	{2023},
  volume =	{1},
  number =	{1},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/TGDK.1.1.3},
  URN =		{urn:nbn:de:0030-drops-194777},
  doi =		{10.4230/TGDK.1.1.3},
  annote =	{Keywords: knowledge engineering, large language models}
}
Document
Bayesian Hybrid Automata: A Formal Model of Justified Belief in Interacting Hybrid Systems Subject to Imprecise Observation

Authors: Paul Kröger and Martin Fränzle

Published in: Leibniz Transactions on Embedded Systems (LITES), Volume 8, Issue 2 (2022): Special Issue on Distributed Hybrid Systems


Abstract
Hybrid discrete-continuous system dynamics arises when discrete actions, e.g. by a decision algorithm, meet continuous behaviour, e.g. due to physical processes and continuous control. A natural domain of such systems are emerging smart technologies which add elements of intelligence, co-operation, and adaptivity to physical entities, enabling them to interact with each other and with humans as systems of (human-)cyber-physical systems or (H)CPSes. Various flavours of hybrid automata have been suggested as a means to formally analyse CPS dynamics. In a previous article, we demonstrated that all these variants of hybrid automata provide inaccurate verdicts, in the sense of being either overly pessimistic or overly optimistic, for engineered systems operating under imprecise observation of their environment due to, e.g., measurement error. We suggested a revised formal model, called Bayesian hybrid automata, that is able to represent state tracking and estimation in hybrid systems and thereby enhances the precision of verdicts obtained from the model in comparison to traditional model variants. In this article, we present an extended definition of Bayesian hybrid automata which incorporates a new class of guard and invariant functions that make it possible to evaluate traditional guards and invariants over probability distributions. The resulting framework allows modelling observers with knowledge about the control strategy of an observed agent but with imprecise estimates of the data on which the control decisions are based.

Cite as

Paul Kröger and Martin Fränzle. Bayesian Hybrid Automata: A Formal Model of Justified Belief in Interacting Hybrid Systems Subject to Imprecise Observation. In LITES, Volume 8, Issue 2 (2022): Special Issue on Distributed Hybrid Systems. Leibniz Transactions on Embedded Systems, Volume 8, Issue 2, pp. 05:1-05:27, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@Article{kroger_et_al:LITES.8.2.5,
  author =	{Kr\"{o}ger, Paul and Fr\"{a}nzle, Martin},
  title =	{{Bayesian Hybrid Automata: A Formal Model of Justified Belief in Interacting Hybrid Systems Subject to Imprecise Observation}},
  journal =	{Leibniz Transactions on Embedded Systems},
  pages =	{05:1--05:27},
  ISSN =	{2199-2002},
  year =	{2022},
  volume =	{8},
  number =	{2},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LITES.8.2.5},
  URN =		{urn:nbn:de:0030-drops-192970},
  doi =		{10.4230/LITES.8.2.5},
  annote =	{Keywords: }
}
  • Refine by Type
  • 15 Document/PDF
  • 13 Document/HTML
  • 1 Artifact

  • Refine by Publication Year
  • 10 2025
  • 4 2023
  • 1 2022
  • 1 2016

  • Refine by Author
  • 2 Chen, Jiaoyan
  • 2 Chen, Zhuo Zoey
  • 2 Monnin, Pierre
  • 2 Rizkallah, Christine
  • 2 Åman Pohjola, Johannes
  • Show More...

  • Refine by Series/Journal
  • 7 LIPIcs
  • 4 TGDK
  • 3 OASIcs
  • 1 LITES

  • Refine by Classification
  • 2 Computing methodologies → Knowledge representation and reasoning
  • 2 Computing methodologies → Natural language processing
  • 1 Applied computing → Life and medical sciences
  • 1 Applied computing → Operations research
  • 1 Computer systems organization → Embedded and cyber-physical systems
  • Show More...

  • Refine by Keyword
  • 2 Explainable AI
  • 2 Large Language Models
  • 1 Answer-Set Programming
  • 1 Anytime Optimization
  • 1 Blockchain
  • Show More...
