28 Search Results for "Rodriguez, Andrew"


Document
Climate Change: What is Computing’s Responsibility? (Dagstuhl Perspectives Workshop 25122)

Authors: Bran Knowles, Vicki L. Hanson, Christoph Becker, Mike Berners-Lee, Andrew A. Chien, Benoit Combemale, Vlad Coroamă, Koen De Bosschere, Yi Ding, Adrian Friday, Boris Gamazaychikov, Lynda Hardman, Simon Hinterholzer, Mattias Höjer, Lynn Kaack, Lenneke Kuijer, Anne-Laure Ligozat, Jan Tobias Muehlberg, Yunmook Nah, Thomas Olsson, Anne-Cécile Orgerie, Daniel Pargman, Birgit Penzenstadler, Tom Romanoff, Emma Strubell, Colin Venters, and Junhua Zhao

Published in: Dagstuhl Manifestos, Volume 11, Issue 1 (2025)


Abstract
This Manifesto was produced from the Perspectives Workshop 25122 entitled "Climate Change: What is Computing’s Responsibility?" held March 16-19, 2025 at Schloss Dagstuhl, Germany. The Workshop provided a forum for world-leading computer scientists and expert consultants on environmental policy and sustainable transition to engage in a critical and urgent conversation about computing’s responsibilities in addressing climate change - or more aptly, climate crisis. The resulting Manifesto outlines commitments and directions for future action that, if adopted as a basis for more responsible computing practices, will help ensure that these technologies do not threaten the long-term habitability of the planet. We preface our Manifesto with a recognition that humanity is on a path that is not aligned with international global warming targets and explore how computing technologies are currently hastening the overshoot of these boundaries. We critically assess the vaunted potential for harnessing computing technologies for the mitigation of global warming, agreeing that, under current circumstances, computing is contributing to negative environmental impacts in other sectors. Computing primarily improves efficiency and reduces costs, which leads to more consumption and more negative environmental impact. Relying solely on efficiency gains in computing has thus far proven to be insufficient to curb global greenhouse gas emissions. Therefore, computing’s purpose within a strategy for tackling climate change must be reimagined. Our recommendations cover changes that need to be urgently made to the design priorities of computing technologies, but also speak to the more systemic shift in mindset, with sustainability and human rights providing a necessary moral foundation for developing the kinds of computing technologies most needed by society. We also stress the importance of digital policy that accounts for both the direct material impacts of computing and the detrimental indirect impacts arising from computing-enabled efficiencies, and the role of computing professionals in informing policy making.

Cite as

Bran Knowles, Vicki L. Hanson, Christoph Becker, Mike Berners-Lee, Andrew A. Chien, Benoit Combemale, Vlad Coroamă, Koen De Bosschere, Yi Ding, Adrian Friday, Boris Gamazaychikov, Lynda Hardman, Simon Hinterholzer, Mattias Höjer, Lynn Kaack, Lenneke Kuijer, Anne-Laure Ligozat, Jan Tobias Muehlberg, Yunmook Nah, Thomas Olsson, Anne-Cécile Orgerie, Daniel Pargman, Birgit Penzenstadler, Tom Romanoff, Emma Strubell, Colin Venters, and Junhua Zhao. Climate Change: What is Computing’s Responsibility? (Dagstuhl Perspectives Workshop 25122). In Dagstuhl Manifestos, Volume 11, Issue 1, pp. 1-18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@Article{knowles_et_al:DagMan.11.1.1,
  author =	{Knowles, Bran and Hanson, Vicki L. and Becker, Christoph and Berners-Lee, Mike and Chien, Andrew A. and Combemale, Benoit and Coroam\u{a}, Vlad and De Bosschere, Koen and Ding, Yi and Friday, Adrian and Gamazaychikov, Boris and Hardman, Lynda and Hinterholzer, Simon and H\"{o}jer, Mattias and Kaack, Lynn and Kuijer, Lenneke and Ligozat, Anne-Laure and Muehlberg, Jan Tobias and Nah, Yunmook and Olsson, Thomas and Orgerie, Anne-C\'{e}cile and Pargman, Daniel and Penzenstadler, Birgit and Romanoff, Tom and Strubell, Emma and Venters, Colin and Zhao, Junhua},
  title =	{{Climate Change: What is Computing’s Responsibility? (Dagstuhl Perspectives Workshop 25122)}},
  pages =	{1--18},
  journal =	{Dagstuhl Manifestos},
  ISSN =	{2193-2433},
  year =	{2025},
  volume =	{11},
  number =	{1},
  editor =	{Knowles, Bran and Hanson, Vicki L. and Becker, Christoph and Berners-Lee, Mike and Chien, Andrew A. and Combemale, Benoit and Coroam\u{a}, Vlad and De Bosschere, Koen and Ding, Yi and Friday, Adrian and Gamazaychikov, Boris and Hardman, Lynda and Hinterholzer, Simon and H\"{o}jer, Mattias and Kaack, Lynn and Kuijer, Lenneke and Ligozat, Anne-Laure and Muehlberg, Jan Tobias and Nah, Yunmook and Olsson, Thomas and Orgerie, Anne-C\'{e}cile and Pargman, Daniel and Penzenstadler, Birgit and Romanoff, Tom and Strubell, Emma and Venters, Colin and Zhao, Junhua},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagMan.11.1.1},
  URN =		{urn:nbn:de:0030-drops-250724},
  doi =		{10.4230/DagMan.11.1.1},
  annote =	{Keywords: sustainability, climate change, efficiency, supply chain management, climate modelling}
}
Document
APPROX
Multipass Linear Sketches for Geometric LP-Type Problems

Authors: N. Efe Çekirge, William Gay, and David P. Woodruff

Published in: LIPIcs, Volume 353, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2025)


Abstract
LP-type problems such as the Minimum Enclosing Ball (MEB), Linear Support Vector Machine (SVM), Linear Programming (LP), and Semidefinite Programming (SDP) are fundamental combinatorial optimization problems with many important applications in machine learning, such as classification, bioinformatics, and noisy learning. We study LP-type problems in several streaming and distributed big data models, giving ε-approximation linear sketching algorithms, focusing on the high-accuracy regime with low dimensionality d, that is, when d < (1/ε)^0.999. Our main result is an O(ds) pass algorithm with O(s(√d/ε)^{3d/s}) ⋅ poly(d, log (1/ε)) space complexity in words, for any parameter s ∈ [1, d log (1/ε)], to solve ε-approximate LP-type problems of O(d) combinatorial and VC dimension. Notably, by taking s = d log (1/ε), we achieve space complexity polynomial in d and polylogarithmic in 1/ε, presenting exponential improvements in 1/ε over current algorithms. We complement our results by showing lower bounds of (1/ε)^Ω(d) for any 1-pass algorithm solving the (1 + ε)-approximation MEB and linear SVM problems, further motivating our multi-pass approach.
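
A quick sanity check of the claimed trade-off (our calculation, not taken from the paper): setting s = d log (1/ε) makes the exponent 3d/s = 3/log(1/ε), so (√d/ε)^{3d/s} = 2^{3(0.5 log d + log(1/ε))/log(1/ε)} ≤ 2^{4.5} = O(1) in the regime d < (1/ε)^0.999 (logarithms base 2). The space bound then collapses to s ⋅ O(1) ⋅ poly(d, log (1/ε)) = poly(d) ⋅ polylog(1/ε), while the pass count O(ds) becomes O(d² log (1/ε)), matching the "polynomial in d and polylogarithmic in 1/ε" statement above.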

Cite as

N. Efe Çekirge, William Gay, and David P. Woodruff. Multipass Linear Sketches for Geometric LP-Type Problems. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 353, pp. 8:1-8:25, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{cekirge_et_al:LIPIcs.APPROX/RANDOM.2025.8,
  author =	{\c{C}ekirge, N. Efe and Gay, William and Woodruff, David P.},
  title =	{{Multipass Linear Sketches for Geometric LP-Type Problems}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2025)},
  pages =	{8:1--8:25},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-397-3},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{353},
  editor =	{Ene, Alina and Chattopadhyay, Eshan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX/RANDOM.2025.8},
  URN =		{urn:nbn:de:0030-drops-243741},
  doi =		{10.4230/LIPIcs.APPROX/RANDOM.2025.8},
  annote =	{Keywords: Streaming, sketching, LP-type problems}
}
Document
Mixing Time of Quantum Gibbs Sampling for Random Sparse Hamiltonians

Authors: Akshar Ramkumar and Mehdi Soleimanifar

Published in: LIPIcs, Volume 350, 20th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2025)


Abstract
Providing evidence that quantum computers can efficiently prepare low-energy or thermal states of physically relevant interacting quantum systems is a major challenge in quantum information science. A newly developed quantum Gibbs sampling algorithm [Chen et al., 2023] provides an efficient simulation of the detailed-balanced dissipative dynamics of non-commutative quantum systems. The running time of this algorithm depends on the mixing time of the corresponding quantum Markov chain, which has not been rigorously bounded except in the high-temperature regime. In this work, we establish a polylog(n) upper bound on its mixing time for various families of random n × n sparse Hamiltonians at any constant temperature. We further analyze how the choice of the jump operators for the algorithm and the spectral properties of these sparse Hamiltonians influence the mixing time. Our result places this method for Gibbs sampling on par with other efficient algorithms for preparing low-energy states of quantumly easy Hamiltonians.
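
For orientation, the quantity being bounded can be defined as follows (a standard convention; the paper's exact normalization may differ): if ℒ is the Lindbladian implemented by the Gibbs sampler and σ_β its fixed point (the Gibbs state at inverse temperature β), the mixing time is t_mix(ε) = min{t ≥ 0 : ‖e^{tℒ}(ρ) − σ_β‖₁ ≤ ε for every input state ρ}. Since the cost of simulating the dissipative dynamics grows essentially linearly (up to polylogarithmic factors) with the evolution time, a polylog(n) bound on t_mix at constant temperature is what makes the overall preparation procedure efficient.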

Cite as

Akshar Ramkumar and Mehdi Soleimanifar. Mixing Time of Quantum Gibbs Sampling for Random Sparse Hamiltonians. In 20th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 350, pp. 3:1-3:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{ramkumar_et_al:LIPIcs.TQC.2025.3,
  author =	{Ramkumar, Akshar and Soleimanifar, Mehdi},
  title =	{{Mixing Time of Quantum Gibbs Sampling for Random Sparse Hamiltonians}},
  booktitle =	{20th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2025)},
  pages =	{3:1--3:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-392-8},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{350},
  editor =	{Fefferman, Bill},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.TQC.2025.3},
  URN =		{urn:nbn:de:0030-drops-240520},
  doi =		{10.4230/LIPIcs.TQC.2025.3},
  annote =	{Keywords: Quantum algorithms, quantum Gibbs sampling, mixing time analysis}
}
Document
Efficient Quantum Pseudorandomness from Hamiltonian Phase States

Authors: John Bostanci, Jonas Haferkamp, Dominik Hangleiter, and Alexander Poremba

Published in: LIPIcs, Volume 350, 20th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2025)


Abstract
Quantum pseudorandomness has found applications in many areas of quantum information, ranging from entanglement theory, to models of scrambling phenomena in chaotic quantum systems, and, more recently, in the foundations of quantum cryptography. Kretschmer (TQC '21) showed that both pseudorandom states and pseudorandom unitaries exist even in a world without classical one-way functions. To this day, however, all known constructions require classical cryptographic building blocks which are themselves synonymous with the existence of one-way functions, and which are also challenging to implement on realistic quantum hardware. In this work, we seek to make progress on both of these fronts simultaneously - by decoupling quantum pseudorandomness from classical cryptography altogether. We introduce a quantum hardness assumption called the Hamiltonian Phase State (HPS) problem, which is the task of decoding output states of a random instantaneous quantum polynomial-time (IQP) circuit. Hamiltonian phase states can be generated very efficiently using only Hadamard gates, single-qubit Z rotations and CNOT circuits. We show that the hardness of our problem reduces to a worst-case version of the problem, and we provide evidence that our assumption is plausibly fully quantum; meaning, it cannot be used to construct one-way functions. We also show information-theoretic hardness when only a few copies of HPS are available by proving an approximate t-design property of our ensemble. Finally, we show that our HPS assumption and its variants allow us to efficiently construct many pseudorandom quantum primitives, ranging from pseudorandom states, to quantum pseudoentanglement, to pseudorandom unitaries, and even primitives such as public-key encryption with quantum keys.
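
As a rough picture of the objects involved (a generic IQP-style phase state written in our own notation; the precise ensemble and parameterization used in the paper may differ): an n-qubit phase state has the form |ψ_θ⟩ = 2^{-n/2} ∑_{x ∈ {0,1}ⁿ} e^{iφ_θ(x)} |x⟩, where the phase function φ_θ is implemented by a diagonal circuit built from single-qubit Z rotations conjugated by CNOTs, applied after a layer of Hadamards. The HPS assumption posits, roughly, that recovering the hidden description θ from polynomially many copies of |ψ_θ⟩ is computationally hard.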

Cite as

John Bostanci, Jonas Haferkamp, Dominik Hangleiter, and Alexander Poremba. Efficient Quantum Pseudorandomness from Hamiltonian Phase States. In 20th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 350, pp. 9:1-9:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{bostanci_et_al:LIPIcs.TQC.2025.9,
  author =	{Bostanci, John and Haferkamp, Jonas and Hangleiter, Dominik and Poremba, Alexander},
  title =	{{Efficient Quantum Pseudorandomness from Hamiltonian Phase States}},
  booktitle =	{20th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2025)},
  pages =	{9:1--9:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-392-8},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{350},
  editor =	{Fefferman, Bill},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.TQC.2025.9},
  URN =		{urn:nbn:de:0030-drops-240586},
  doi =		{10.4230/LIPIcs.TQC.2025.9},
  annote =	{Keywords: Quantum pseudorandomness, quantum phase states, quantum cryptography}
}
Document
Reachability in Deletion-Only Chemical Reaction Networks

Authors: Bin Fu, Timothy Gomez, Ryan Knobel, Austin Luchsinger, Aiden Massie, Marco Rodriguez, Adrian Salinas, Robert Schweller, and Tim Wylie

Published in: LIPIcs, Volume 347, 31st International Conference on DNA Computing and Molecular Programming (DNA 31) (2025)


Abstract
For general discrete Chemical Reaction Networks (CRNs), the fundamental problem of reachability - the question of whether a target configuration can be produced from a given initial configuration - was recently shown to be Ackermann-complete. However, many open questions remain about which features of the CRN model drive this complexity. We study a restricted class of CRNs with void rules, reactions that only decrease species counts. We further examine this regime in the motivated model of step CRNs, which allow additional species to be introduced in discrete stages. With and without steps, we characterize the complexity of the reachability problem for CRNs with void rules. We show that, without steps, reachability remains polynomial-time solvable for bimolecular systems but becomes NP-complete for larger reactions. Conversely, with just a single step, reachability becomes NP-complete even for bimolecular systems. Our results provide a nearly complete classification of void-rule reachability problems into tractable and intractable cases, with only a single exception.
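
To fix the semantics of void rules, the following is a minimal, deliberately naive Python sketch of the reachability question: it brute-forces the (finite) configuration space and is meant only as a reference definition, not as a reflection of the polynomial-time algorithms or hardness results above. The species and rules in the example are our own.

from collections import deque
from typing import List, Optional, Tuple

Config = Tuple[int, ...]  # molecule counts, indexed by species id

def apply_rule(config: Config, consumed: Config, produced: Config) -> Optional[Config]:
    """Apply one void rule (its net effect only deletes molecules); None if not enabled."""
    if any(c < r for c, r in zip(config, consumed)):
        return None
    return tuple(c - r + p for c, r, p in zip(config, consumed, produced))

def reachable(initial: Config, target: Config, rules: List[Tuple[Config, Config]]) -> bool:
    """Naive breadth-first search over configurations; it terminates because counts
    never increase under void rules, so the reachable set is finite."""
    seen, queue = {initial}, deque([initial])
    while queue:
        cfg = queue.popleft()
        if cfg == target:
            return True
        for consumed, produced in rules:
            nxt = apply_rule(cfg, consumed, produced)
            if nxt is not None and nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

# Species (A, B, C) with two bimolecular void rules:
#   A + B -> A  (deletes one B):          consumed (1,1,0), produced (1,0,0)
#   B + C -> nothing (deletes B and C):   consumed (0,1,1), produced (0,0,0)
rules = [((1, 1, 0), (1, 0, 0)), ((0, 1, 1), (0, 0, 0))]
print(reachable((1, 3, 1), (1, 0, 0), rules))  # True: apply A + B -> A twice, then B + C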

Cite as

Bin Fu, Timothy Gomez, Ryan Knobel, Austin Luchsinger, Aiden Massie, Marco Rodriguez, Adrian Salinas, Robert Schweller, and Tim Wylie. Reachability in Deletion-Only Chemical Reaction Networks. In 31st International Conference on DNA Computing and Molecular Programming (DNA 31). Leibniz International Proceedings in Informatics (LIPIcs), Volume 347, pp. 3:1-3:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{fu_et_al:LIPIcs.DNA.31.3,
  author =	{Fu, Bin and Gomez, Timothy and Knobel, Ryan and Luchsinger, Austin and Massie, Aiden and Rodriguez, Marco and Salinas, Adrian and Schweller, Robert and Wylie, Tim},
  title =	{{Reachability in Deletion-Only Chemical Reaction Networks}},
  booktitle =	{31st International Conference on DNA Computing and Molecular Programming (DNA 31)},
  pages =	{3:1--3:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-399-7},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{347},
  editor =	{Schaeffer, Josie and Zhang, Fei},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.DNA.31.3},
  URN =		{urn:nbn:de:0030-drops-238521},
  doi =		{10.4230/LIPIcs.DNA.31.3},
  annote =	{Keywords: CRN, Chemical Reaction Network, Reachability, Void Reactions}
}
Document
Which Phylogenetic Networks Are Level-k Networks with Additional Arcs? Structure and Algorithms

Authors: Takatora Suzuki and Momoko Hayamizu

Published in: LIPIcs, Volume 344, 25th International Conference on Algorithms for Bioinformatics (WABI 2025)


Abstract
Reticulate evolution gives rise to complex phylogenetic networks, making their interpretation challenging. A typical approach is to extract trees within such networks. Since Francis and Steel’s seminal paper, "Which Phylogenetic Networks are Merely Trees with Additional Arcs?" (2015), tree-based phylogenetic networks and their support trees (spanning trees with the same root and leaf-set as a given network) have been extensively studied. However, not all phylogenetic networks are tree-based, and for the study of reticulate evolution, it is often more biologically relevant to identify support networks rather than trees. This study generalizes Hayamizu’s structure theorem, which yielded optimal algorithms for various computational problems on support trees of rooted almost-binary phylogenetic networks, extending the theoretical framework from support trees to support networks. This allows us to obtain a direct-product characterization of each of three sets: all, minimal, and minimum support networks, for a given network. Each characterization yields optimal algorithms for counting and generating the support networks of each type. Applications include a linear-time algorithm for finding a support network with the fewest reticulations (i.e., the minimum tier). We also provide exact and heuristic algorithms for finding a support network with the minimum level, both running in exponential time but practical across a reasonably wide range of reticulation numbers.

Cite as

Takatora Suzuki and Momoko Hayamizu. Which Phylogenetic Networks Are Level-k Networks with Additional Arcs? Structure and Algorithms. In 25th International Conference on Algorithms for Bioinformatics (WABI 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 344, pp. 19:1-19:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{suzuki_et_al:LIPIcs.WABI.2025.19,
  author =	{Suzuki, Takatora and Hayamizu, Momoko},
  title =	{{Which Phylogenetic Networks Are Level-k Networks with Additional Arcs? Structure and Algorithms}},
  booktitle =	{25th International Conference on Algorithms for Bioinformatics (WABI 2025)},
  pages =	{19:1--19:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-386-7},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{344},
  editor =	{Brejov\'{a}, Bro\v{n}a and Patro, Rob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.WABI.2025.19},
  URN =		{urn:nbn:de:0030-drops-239454},
  doi =		{10.4230/LIPIcs.WABI.2025.19},
  annote =	{Keywords: Phylogenetic networks, Support networks, Level-k networks, Tier-k networks, Structure theorem, Enumeration, Optimization}
}
Document
Assessing Map Reproducibility with Visual Question-Answering: An Empirical Evaluation

Authors: Eftychia Koukouraki, Auriol Degbelo, and Christian Kray

Published in: LIPIcs, Volume 346, 13th International Conference on Geographic Information Science (GIScience 2025)


Abstract
Reproducibility is a key principle of the modern scientific method. Maps, as an important means of communicating scientific results in GIScience and across disciplines, should be reproducible. Currently, map reproducibility assessment is done manually, which makes the assessment process tedious and time-consuming, ultimately limiting its efficiency. Hence, this work explores the extent to which Visual Question-Answering (VQA) can be used to automate some tasks relevant to map reproducibility assessment. We selected five state-of-the-art vision language models (VLMs) and followed a three-step approach to evaluate their ability to discriminate between maps and other images, interpret map content, and compare two map images using VQA. Our results show that current VLMs already possess map-reading capabilities and demonstrate understanding of spatial concepts, such as cardinal directions, geographic scope, and legend interpretation. Our paper demonstrates the potential of using VQA to support reproducibility assessment and highlights the outstanding issues that need to be addressed to achieve accurate, trustworthy map descriptions, thereby reducing the time and effort required by human evaluators.
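
A minimal sketch of the three evaluation steps described above, written against a hypothetical ask_vlm(images, question) helper; the actual VLMs, prompt wording, and evaluation protocol used in the paper are not reproduced here.

from typing import Dict, List

def ask_vlm(images: List[str], question: str) -> str:
    """Hypothetical helper: send one or more images plus a question to a
    vision-language model and return its textual answer (API or local model)."""
    raise NotImplementedError

def is_map(image: str) -> bool:
    # Step 1: discriminate maps from other images.
    return ask_vlm([image], "Is this image a map? Answer yes or no.").strip().lower().startswith("yes")

def describe_map(image: str) -> Dict[str, str]:
    # Step 2: interpret map content (geographic scope, legend, cardinal directions).
    return {
        "region": ask_vlm([image], "Which geographic region does this map show?"),
        "legend": ask_vlm([image], "Summarize the map legend."),
        "north": ask_vlm([image], "Where is north on this map?"),
    }

def compare_maps(original: str, reproduction: str) -> str:
    # Step 3: compare an original map image against a reproduced one.
    return ask_vlm([original, reproduction],
                   "Do these two maps show the same data for the same region, "
                   "with the same classification and legend? List any differences.")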

Cite as

Eftychia Koukouraki, Auriol Degbelo, and Christian Kray. Assessing Map Reproducibility with Visual Question-Answering: An Empirical Evaluation. In 13th International Conference on Geographic Information Science (GIScience 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 346, pp. 13:1-13:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{koukouraki_et_al:LIPIcs.GIScience.2025.13,
  author =	{Koukouraki, Eftychia and Degbelo, Auriol and Kray, Christian},
  title =	{{Assessing Map Reproducibility with Visual Question-Answering: An Empirical Evaluation}},
  booktitle =	{13th International Conference on Geographic Information Science (GIScience 2025)},
  pages =	{13:1--13:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-378-2},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{346},
  editor =	{Sila-Nowicka, Katarzyna and Moore, Antoni and O'Sullivan, David and Adams, Benjamin and Gahegan, Mark},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.GIScience.2025.13},
  URN =		{urn:nbn:de:0030-drops-238426},
  doi =		{10.4230/LIPIcs.GIScience.2025.13},
  annote =	{Keywords: map comparison, computational reproducibility, visual question answering, large language models, GeoAI}
}
Document
DiVerG: Scalable Distance Index for Validation of Paired-End Alignments in Sequence Graphs

Authors: Ali Ghaffaari, Alexander Schönhuth, and Tobias Marschall

Published in: LIPIcs, Volume 344, 25th International Conference on Algorithms for Bioinformatics (WABI 2025)


Abstract
Determining the distance between two loci within a genomic region is a recurrent operation in various tasks in computational genomics. A notable example of this task arises in paired-end read mapping as a form of validation of distances between multiple alignments. While straightforward for a single genome, graph-based reference structures render the operation considerably more involved. Given the sheer number of such queries in a typical read mapping experiment, an efficient algorithm for answering distance queries is crucial. In this paper, we introduce DiVerG, a compact data structure as well as a fast and scalable algorithm, for constructing distance indexes for general sequence graphs on multi-core CPU and many-core GPU architectures. DiVerG is based on PairG [Jain et al., 2019], but overcomes the limitations of PairG by exploiting the extensive potential for improvements in terms of scalability and space efficiency. As a consequence, DiVerG can process substantially larger datasets, such as whole human genomes, which are unmanageable by PairG. DiVerG offers faster index construction time and consistently faster query time with gains proportional to the size of the underlying compact data structure. We demonstrate that our method performs favorably on multiple real datasets at various scales. DiVerG achieves superior performance over PairG; e.g., it attains a 2.5-4x speed-up in query time, a 44-340x smaller index size, and a 3-50x faster construction time for the genome graph of the MHC region, a particularly variable region of the human genome. The implementation is available at: https://github.com/cartoonist/diverg
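
The flavour of query being indexed can be stated in a few lines of naive Python (edge-counting only; a character-level version would weight nodes by their sequence lengths, and a real index answers the question without traversing the graph at query time). This is a reference definition, not DiVerG's sparse-matrix construction.

from typing import Dict, List

def within_distance(adj: Dict[int, List[int]], u: int, v: int, d_min: int, d_max: int) -> bool:
    """Is there a path from u to v in the directed sequence graph whose length
    (number of edges) lies in [d_min, d_max]?  Bounded breadth-first search,
    O((|V| + |E|) * d_max) work."""
    frontier = {u}
    for dist in range(d_max + 1):
        if dist >= d_min and v in frontier:
            return True
        frontier = {w for x in frontier for w in adj.get(x, [])}
        if not frontier:
            break
    return False

# Toy graph: 0 -> 1 -> 2 -> 3 plus a shortcut 0 -> 2.
adj = {0: [1, 2], 1: [2], 2: [3]}
print(within_distance(adj, 0, 3, 2, 3))   # True: paths of length 2 and 3 exist
print(within_distance(adj, 0, 3, 4, 10))  # False: no path that long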

Cite as

Ali Ghaffaari, Alexander Schönhuth, and Tobias Marschall. DiVerG: Scalable Distance Index for Validation of Paired-End Alignments in Sequence Graphs. In 25th International Conference on Algorithms for Bioinformatics (WABI 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 344, pp. 10:1-10:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{ghaffaari_et_al:LIPIcs.WABI.2025.10,
  author =	{Ghaffaari, Ali and Sch\"{o}nhuth, Alexander and Marschall, Tobias},
  title =	{{DiVerG: Scalable Distance Index for Validation of Paired-End Alignments in Sequence Graphs}},
  booktitle =	{25th International Conference on Algorithms for Bioinformatics (WABI 2025)},
  pages =	{10:1--10:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-386-7},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{344},
  editor =	{Brejov\'{a}, Bro\v{n}a and Patro, Rob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.WABI.2025.10},
  URN =		{urn:nbn:de:0030-drops-239369},
  doi =		{10.4230/LIPIcs.WABI.2025.10},
  annote =	{Keywords: Sequence graph, distance index, read mapping, sparse matrix}
}
Document
Practically Feasible Proof Logging for Pseudo-Boolean Optimization

Authors: Wietze Koops, Daniel Le Berre, Magnus O. Myreen, Jakob Nordström, Andy Oertel, Yong Kiam Tan, and Marc Vinyals

Published in: LIPIcs, Volume 340, 31st International Conference on Principles and Practice of Constraint Programming (CP 2025)


Abstract
Certifying solvers have long been standard for decision problems in Boolean satisfiability (SAT), allowing for proof logging and checking with very limited overhead, but developing similar tools for combinatorial optimization has remained a challenge. A recent promising approach covering a wide range of solving paradigms is pseudo-Boolean proof logging, but this has mostly consisted of proof-of-concept works far from delivering the performance required for real-world deployment. In this work, we present an efficient toolchain based on VeriPB and CakePB for formally verified pseudo-Boolean optimization. We implement proof logging for the full range of techniques in the state-of-the-art solvers RoundingSat and Sat4j, including core-guided search and linear programming integration with Farkas certificates and cut generation. Our experimental evaluation shows that proof logging and checking performance in this much more expressive paradigm is now quite close to the level of SAT solving, and hence is clearly practically feasible.
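
To give a flavour of the pseudo-Boolean reasoning that such proofs certify (a textbook cutting-planes derivation, not an excerpt from an actual VeriPB log): from the constraints x₁ + x₂ ≥ 1, x₁ + x₃ ≥ 1 and x₂ + x₃ ≥ 1, the addition rule gives 2x₁ + 2x₂ + 2x₃ ≥ 3, and the division rule (divide by 2, rounding coefficients and the degree up) yields x₁ + x₂ + x₃ ≥ 2. Note that the division step recovers integrality information that the plain linear combination (which only gives x₁ + x₂ + x₃ ≥ 3/2) does not. A proof checker verifies that every logged constraint follows from earlier ones by rules of this kind, plus, for optimization, bookkeeping of objective-value bounds.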

Cite as

Wietze Koops, Daniel Le Berre, Magnus O. Myreen, Jakob Nordström, Andy Oertel, Yong Kiam Tan, and Marc Vinyals. Practically Feasible Proof Logging for Pseudo-Boolean Optimization. In 31st International Conference on Principles and Practice of Constraint Programming (CP 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 340, pp. 21:1-21:27, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{koops_et_al:LIPIcs.CP.2025.21,
  author =	{Koops, Wietze and Le Berre, Daniel and Myreen, Magnus O. and Nordstr\"{o}m, Jakob and Oertel, Andy and Tan, Yong Kiam and Vinyals, Marc},
  title =	{{Practically Feasible Proof Logging for Pseudo-Boolean Optimization}},
  booktitle =	{31st International Conference on Principles and Practice of Constraint Programming (CP 2025)},
  pages =	{21:1--21:27},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-380-5},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{340},
  editor =	{de la Banda, Maria Garcia},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2025.21},
  URN =		{urn:nbn:de:0030-drops-238825},
  doi =		{10.4230/LIPIcs.CP.2025.21},
  annote =	{Keywords: proof logging, certifying algorithms, combinatorial optimization, certification, pseudo-Boolean solving, 0-1 integer linear programming}
}
Document
Portuguese Far-Right Discourse on Social Media: Insights from Topic Modeling

Authors: Mauro Cardoso, Eugénio Ribeiro, and Fernando Batista

Published in: OASIcs, Volume 135, 14th Symposium on Languages, Applications and Technologies (SLATE 2025)


Abstract
This study analyzes the social media discourse of leading figures from Portugal’s far-right party CHEGA, examining 10,323 posts on X (formerly Twitter) published between late 2019 and mid-2024. Using BERTopic, we identified 59 latent topics clustered into two main discursive dynamics: (1) ideological and public, and (2) party-, electoral-, and parliament-related. Within the first dynamic, we conducted a focused sub-analysis of themes related to identity, immigration and security narratives - topics that display posting peaks around electoral cycles, suggesting the strategic use of emotionally charged, identitarian frames for political mobilization. The model exhibits strong topic coherence and lexical diversity, indicating its robustness in extracting thematic structures from politically polarized microtexts. Nevertheless, our findings are constrained by the data source, the absence of interaction metrics, and the unmet need to link online discourse to offline events. This study demonstrates how computational topic modeling can reveal strategic communication patterns in far-right political discourse and underscores the need for cross-platform and interaction-level research to assess broader societal impact.
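
A minimal sketch of the kind of BERTopic pipeline described above (our own default-ish configuration; the embedding model, hyperparameters, and preprocessing actually used by the authors are not specified here):

# pip install bertopic
from bertopic import BERTopic

posts = ["…"]  # placeholder: the collection of post texts would be loaded here

topic_model = BERTopic(language="multilingual")   # multilingual embeddings for Portuguese posts
topics, probs = topic_model.fit_transform(posts)  # one topic id (and probability) per post

print(topic_model.get_topic_info().head(10))      # the largest topics with their keywords
print(topic_model.get_topic(0))                   # top terms of one topic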

Cite as

Mauro Cardoso, Eugénio Ribeiro, and Fernando Batista. Portuguese Far-Right Discourse on Social Media: Insights from Topic Modeling. In 14th Symposium on Languages, Applications and Technologies (SLATE 2025). Open Access Series in Informatics (OASIcs), Volume 135, pp. 12:1-12:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{cardoso_et_al:OASIcs.SLATE.2025.12,
  author =	{Cardoso, Mauro and Ribeiro, Eug\'{e}nio and Batista, Fernando},
  title =	{{Portuguese Far-Right Discourse on Social Media: Insights from Topic Modeling}},
  booktitle =	{14th Symposium on Languages, Applications and Technologies (SLATE 2025)},
  pages =	{12:1--12:16},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-387-4},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{135},
  editor =	{Baptista, Jorge and Barateiro, Jos\'{e}},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SLATE.2025.12},
  URN =		{urn:nbn:de:0030-drops-236929},
  doi =		{10.4230/OASIcs.SLATE.2025.12},
  annote =	{Keywords: Political Discourse, Topic Modeling, Far-Right, CHEGA (Portugal), Social Media}
}
Document
Enabling Containerisation of Distributed Applications with Real-Time Constraints

Authors: Nasim Samimi, Luca Abeni, Daniel Casini, Mauro Marinoni, Twan Basten, Mitra Nasri, Marc Geilen, and Alessandro Biondi

Published in: LIPIcs, Volume 335, 37th Euromicro Conference on Real-Time Systems (ECRTS 2025)


Abstract
Containerisation is becoming a cornerstone of modern distributed systems, thanks to its lightweight virtualisation, high portability, and seamless integration with orchestration tools such as Kubernetes. The use of containers has also gained traction in real-time cyber-physical systems, such as software-defined vehicles, which are characterised by strict timing requirements to ensure safety and performance. Nevertheless, ensuring real-time execution of co-located containers is challenging because of mutual interference due to the sharing of the same processing hardware. Existing parallel computing frameworks such as Ray and its Kubernetes-enabled variant, KubeRay, excel in distributed computation but lack support for scheduling policies that allow guaranteeing real-time timing constraints and CPU resource isolation between containers, such as the SCHED_DEADLINE policy of Linux. To fill this gap, this paper extends Ray to support real-time containers that leverage SCHED_DEADLINE. To this end, we propose KubeDeadline, a novel, modular Kubernetes extension to support SCHED_DEADLINE. We evaluate our approach through extensive experiments, using synthetic workloads and a case study based on the MobileNet and EfficientNet deep neural networks. Our evaluation shows that KubeDeadline ensures deadline compliance in all synthetic workloads, adds minimal deployment overhead (in the order of milliseconds), and achieves worst-case response times up to 4 times lower than vanilla Kubernetes under background interference.
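
For readers unfamiliar with the policy involved: SCHED_DEADLINE gives each task (here, the threads of a container) a CPU reservation described by three parameters with runtime ≤ deadline ≤ period, guaranteeing up to runtime of CPU time in every window of length period, delivered before deadline. For example, runtime = 2 ms with deadline = period = 10 ms reserves 20% of a CPU at a 10 ms granularity, and the kernel's admission test refuses a new reservation if the summed runtime/period utilizations would exceed a configurable cap. How KubeDeadline surfaces these parameters through Kubernetes is described in the paper and not paraphrased here.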

Cite as

Nasim Samimi, Luca Abeni, Daniel Casini, Mauro Marinoni, Twan Basten, Mitra Nasri, Marc Geilen, and Alessandro Biondi. Enabling Containerisation of Distributed Applications with Real-Time Constraints. In 37th Euromicro Conference on Real-Time Systems (ECRTS 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 335, pp. 3:1-3:29, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{samimi_et_al:LIPIcs.ECRTS.2025.3,
  author =	{Samimi, Nasim and Abeni, Luca and Casini, Daniel and Marinoni, Mauro and Basten, Twan and Nasri, Mitra and Geilen, Marc and Biondi, Alessandro},
  title =	{{Enabling Containerisation of Distributed Applications with Real-Time Constraints}},
  booktitle =	{37th Euromicro Conference on Real-Time Systems (ECRTS 2025)},
  pages =	{3:1--3:29},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-377-5},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{335},
  editor =	{Mancuso, Renato},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ECRTS.2025.3},
  URN =		{urn:nbn:de:0030-drops-235816},
  doi =		{10.4230/LIPIcs.ECRTS.2025.3},
  annote =	{Keywords: Kubernetes, real-time containers, SCHED\_DEADLINE, KubeRay}
}
Document
Track A: Algorithms, Complexity and Games
Algorithms for the Diverse-k-SAT Problem: The Geometry of Satisfying Assignments

Authors: Per Austrin, Ioana O. Bercea, Mayank Goswami, Nutan Limaye, and Adarsh Srinivasan

Published in: LIPIcs, Volume 334, 52nd International Colloquium on Automata, Languages, and Programming (ICALP 2025)


Abstract
Given a k-CNF formula and an integer s ≥ 2, we study algorithms that obtain s solutions to the formula that are as dispersed as possible. For s = 2, the study of this problem, computing the diameter of a k-CNF formula, was initiated by Crescenzi and Rossi, who showed strong hardness results even for k = 2. The current best upper bound [Angelsmark and Thapper '04] goes to 4ⁿ as k → ∞. As our first result, we show that this quadratic blow-up is not necessary by utilizing the fast Fourier transform (FFT) to give an O^*(2ⁿ) time exact algorithm for computing the diameter of any k-CNF formula. For s > 2, the problem was raised in the SAT community (Nadel '11) and several heuristics have been proposed for it, but no algorithms with theoretical guarantees are known. We give exact algorithms using FFT and clique-finding that run in O^*(2^{(s-1)n}) and O^*(s² |Ω_{𝐅}|^{ω ⌈ s/3 ⌉}) respectively, where |Ω_{𝐅}| is the size of the solution space of the formula 𝐅 and ω is the matrix multiplication exponent. However, current SAT algorithms for finding one solution run in time O^*(2^{ε_{k}n}) for ε_{k} ≈ 1-Θ(1/k), which is much faster than all above run times. As our main result, we analyze two popular SAT algorithms, PPZ (Paturi, Pudlák, Zane '97) and Schöning’s ('02), and show that in time poly(s)O^*(2^{ε_{k}n}), they can be used to approximate the diameter as well as the dispersion (s > 2) problem. While we need to modify Schöning’s original algorithm for technical reasons, we show that the PPZ algorithm, without any modification, samples solutions in a geometric sense. We believe this geometric sampling property of PPZ may be of independent interest. Finally, we focus on diverse solutions to NP-complete optimization problems, and give bi-approximations running in time poly(s)O^*(2^{ε n}) with ε < 1 for several problems such as Maximum Independent Set, Minimum Vertex Cover, Minimum Hitting Set, Feedback Vertex Set, Multicut on Trees and Interval Vertex Deletion. For all of these problems, all existing exact methods for finding optimal diverse solutions have a runtime with at least an exponential dependence on the number of solutions s. Our methods show that by relaxing to bi-approximations, this dependence on s can be made polynomial.
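
As a concrete illustration of how an FFT-style transform exposes pairwise Hamming distances between satisfying assignments, here is a standard Walsh-Hadamard autocorrelation computation in Python; it runs in O(n ⋅ 2ⁿ) time given the solution set's indicator vector, but we do not claim it matches the paper's O^*(2ⁿ) diameter algorithm in detail.

import numpy as np

def xor_autocorrelation(indicator: np.ndarray) -> np.ndarray:
    """Given the 0/1 indicator vector f of the solution set over {0,1}^n
    (length 2^n), return c with c[z] = #{(x, y) : f[x] = f[y] = 1, x XOR y = z},
    via the fast Walsh-Hadamard transform."""
    a = indicator.astype(np.int64)
    n = len(a)
    h = 1
    while h < n:                            # in-place Walsh-Hadamard transform
        for i in range(0, n, 2 * h):
            x, y = a[i:i + h].copy(), a[i + h:i + 2 * h].copy()
            a[i:i + h], a[i + h:i + 2 * h] = x + y, x - y
        h *= 2
    a = a * a                               # pointwise square: WHT(c) = WHT(f)^2
    h = 1
    while h < n:                            # WHT is its own inverse up to a factor of 2^n
        for i in range(0, n, 2 * h):
            x, y = a[i:i + h].copy(), a[i + h:i + 2 * h].copy()
            a[i:i + h], a[i + h:i + 2 * h] = x + y, x - y
        h *= 2
    return a // n

def diameter(indicator: np.ndarray) -> int:
    """Maximum Hamming distance between two satisfying assignments (0 if at most one)."""
    c = xor_autocorrelation(indicator)
    return max((bin(z).count("1") for z in range(len(c)) if c[z] > 0), default=0)

# 3 variables, solution set {000, 011, 111}: the diameter is d(000, 111) = 3.
f = np.zeros(8, dtype=np.int64)
f[[0b000, 0b011, 0b111]] = 1
print(diameter(f))  # 3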

Cite as

Per Austrin, Ioana O. Bercea, Mayank Goswami, Nutan Limaye, and Adarsh Srinivasan. Algorithms for the Diverse-k-SAT Problem: The Geometry of Satisfying Assignments. In 52nd International Colloquium on Automata, Languages, and Programming (ICALP 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 334, pp. 14:1-14:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{austrin_et_al:LIPIcs.ICALP.2025.14,
  author =	{Austrin, Per and Bercea, Ioana O. and Goswami, Mayank and Limaye, Nutan and Srinivasan, Adarsh},
  title =	{{Algorithms for the Diverse-k-SAT Problem: The Geometry of Satisfying Assignments}},
  booktitle =	{52nd International Colloquium on Automata, Languages, and Programming (ICALP 2025)},
  pages =	{14:1--14:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-372-0},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{334},
  editor =	{Censor-Hillel, Keren and Grandoni, Fabrizio and Ouaknine, Jo\"{e}l and Puppis, Gabriele},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2025.14},
  URN =		{urn:nbn:de:0030-drops-233916},
  doi =		{10.4230/LIPIcs.ICALP.2025.14},
  annote =	{Keywords: Exponential time algorithms, Satisfiability, k-SAT, PPZ, Sch\"{o}ning, Dispersion, Diversity}
}
Document
Track A: Algorithms, Complexity and Games
Nearly Optimal Circuit Size for Sparse Quantum State Preparation

Authors: Lvzhou Li and Jingquan Luo

Published in: LIPIcs, Volume 334, 52nd International Colloquium on Automata, Languages, and Programming (ICALP 2025)


Abstract
Quantum state preparation is a fundamental and significant subroutine in quantum computing. In this paper, we conduct a systematic investigation of the circuit size (the total count of elementary gates in the circuit) for sparse quantum state preparation. A quantum state is said to be d-sparse if it has only d non-zero amplitudes. For the task of preparing an n-qubit d-sparse quantum state, we obtain the following results: - Without ancillary qubits: Any n-qubit d-sparse quantum state can be prepared by a quantum circuit of size O(nd/(log n) + n) without using ancillary qubits, which improves the previous best results. It is asymptotically optimal when d = poly(n), and this optimality holds for a broader scope under some reasonable assumptions. - With limited ancillary qubits: (i) Based on the first result, we prove for the first time a trade-off between the number of ancillary qubits and the circuit size: any n-qubit d-sparse quantum state can be prepared by a quantum circuit of size O((nd)/(log(n + m)) + n) using m ancillary qubits for any m ∈ O((nd)/(log nd) + n). (ii) We establish a matching lower bound Ω((nd)/(log(n+m))+n) under some reasonable assumptions, and obtain a slightly weaker lower bound Ω((nd)/(log(n+m)+log d) + n) without any assumptions. - With unlimited ancillary qubits: Given an arbitrary amount of ancillary qubits available, the circuit size for preparing n-qubit d-sparse quantum states is Θ((nd)/(log nd) + n).

Cite as

Lvzhou Li and Jingquan Luo. Nearly Optimal Circuit Size for Sparse Quantum State Preparation. In 52nd International Colloquium on Automata, Languages, and Programming (ICALP 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 334, pp. 113:1-113:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{li_et_al:LIPIcs.ICALP.2025.113,
  author =	{Li, Lvzhou and Luo, Jingquan},
  title =	{{Nearly Optimal Circuit Size for Sparse Quantum State Preparation}},
  booktitle =	{52nd International Colloquium on Automata, Languages, and Programming (ICALP 2025)},
  pages =	{113:1--113:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-372-0},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{334},
  editor =	{Censor-Hillel, Keren and Grandoni, Fabrizio and Ouaknine, Jo\"{e}l and Puppis, Gabriele},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2025.113},
  URN =		{urn:nbn:de:0030-drops-234900},
  doi =		{10.4230/LIPIcs.ICALP.2025.113},
  annote =	{Keywords: Quantum computing, quantum state preparation, circuit complexity}
}
Document
Chain of Grounded Objectives: Concise Goal-Oriented Prompting for Code Generation

Authors: Sangyeop Yeo, Seung-Won Hwang, and Yu-Seung Ma

Published in: LIPIcs, Volume 333, 39th European Conference on Object-Oriented Programming (ECOOP 2025)


Abstract
The use of Large Language Models (LLMs) for code generation has gained significant attention in recent years. Existing methods often aim to improve the quality of generated code by incorporating additional contextual information or guidance into input prompts. Many of these approaches adopt process-oriented reasoning strategies, mimicking human-like step-by-step thinking; however, they may not always align with the structured nature of programming languages. This paper introduces Chain of Grounded Objectives (CGO), a concise goal-oriented prompting approach that embeds functional objectives into prompts to enhance code generation. By focusing on precisely defined objectives rather than explicit procedural steps, CGO aligns more naturally with programming tasks while retaining flexibility. Empirical evaluations on HumanEval, MBPP, their extended versions, and LiveCodeBench show that CGO achieves accuracy comparable to or better than existing methods while using fewer tokens, making it a more efficient approach to LLM-based code generation.
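
A minimal sketch of the contrast drawn above, written against a hypothetical complete(prompt) helper for whichever LLM backend is in use; the objectives listed are our own illustration, not the paper's prompt template.

def complete(prompt: str) -> str:
    """Hypothetical call into an LLM backend (API or local model)."""
    raise NotImplementedError

task = 'def median(xs: list) -> float:\n    """Return the median of xs."""'

# Process-oriented prompting: ask the model to reason step by step first.
cot_prompt = "Think step by step about how to solve this, then write the code.\n\n" + task

# Goal-oriented prompting in the spirit of CGO: state concise functional objectives
# and ask directly for the completion.
cgo_prompt = (
    "Objectives:\n"
    "- return the middle element for odd-length input\n"
    "- return the mean of the two middle elements for even-length input\n"
    "- do not modify the input list\n\n"
    "Complete the function accordingly.\n\n" + task
)

print(complete(cgo_prompt))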

Cite as

Sangyeop Yeo, Seung-Won Hwang, and Yu-Seung Ma. Chain of Grounded Objectives: Concise Goal-Oriented Prompting for Code Generation. In 39th European Conference on Object-Oriented Programming (ECOOP 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 333, pp. 35:1-35:25, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{yeo_et_al:LIPIcs.ECOOP.2025.35,
  author =	{Yeo, Sangyeop and Hwang, Seung-Won and Ma, Yu-Seung},
  title =	{{Chain of Grounded Objectives: Concise Goal-Oriented Prompting for Code Generation}},
  booktitle =	{39th European Conference on Object-Oriented Programming (ECOOP 2025)},
  pages =	{35:1--35:25},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-373-7},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{333},
  editor =	{Aldrich, Jonathan and Silva, Alexandra},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ECOOP.2025.35},
  URN =		{urn:nbn:de:0030-drops-233271},
  doi =		{10.4230/LIPIcs.ECOOP.2025.35},
  annote =	{Keywords: Artificial Intelligence, Natural Language Processing, Prompt Design, Large Language Models, Code Generation}
}
Document
Practical Type-Based Taint Checking and Inference

Authors: Nima Karimipour, Kanak Das, Manu Sridharan, and Behnaz Hassanshahi

Published in: LIPIcs, Volume 333, 39th European Conference on Object-Oriented Programming (ECOOP 2025)


Abstract
Many important security properties can be formulated in terms of flows of tainted data, and improved taint analysis tools to prevent such flows are of critical need. Most existing taint analyses use whole-program static analysis, leading to scalability challenges. Type-based checking is a promising alternative, as it enables modular and incremental checking for fast performance. However, type-based approaches have not been widely adopted in practice, due to challenges with false positives and annotating existing codebases. In this paper, we present a new approach to type-based checking of taint properties that addresses these challenges, based on two key techniques. First, we present a new type-based tainting checker with significantly reduced false positives, via more practical handling of third-party libraries and other language constructs. Second, we present a novel technique to automatically infer tainting type qualifiers for existing code. Our technique supports inference of generic type argument annotations, crucial for tainting properties. We implemented our techniques in a tool TaintTyper and evaluated it on real-world benchmarks. TaintTyper exceeds the recall of a state-of-the-art whole-program taint analyzer, with comparable precision, and 2.93X-22.9X faster checking time. Further, TaintTyper infers annotations comparable to those written by hand, suitable for insertion into source code. TaintTyper is a promising new approach to efficient and practical taint checking.
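
TaintTyper itself targets Java and pluggable type systems, but the core idea of type-based taint checking is easy to illustrate with ordinary Python type annotations checked by a tool such as mypy; the sketch below is a conceptual analogy only, not the paper's system.

from typing import NewType

Untainted = NewType("Untainted", str)  # data that has passed through a sanitizer

def sanitize(user_input: str) -> Untainted:
    """Trusted boundary: validation/escaping happens here."""
    return Untainted(user_input.replace("'", "''"))

def run_query(fragment: Untainted) -> None:
    """Security-sensitive sink: its type only admits untainted data."""
    print("SELECT * FROM users WHERE name = '" + fragment + "'")

raw = input()             # plain str: potentially attacker-controlled
run_query(sanitize(raw))  # accepted
run_query(raw)            # rejected by the type checker: "str" is not "Untainted"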

Cite as

Nima Karimipour, Kanak Das, Manu Sridharan, and Behnaz Hassanshahi. Practical Type-Based Taint Checking and Inference. In 39th European Conference on Object-Oriented Programming (ECOOP 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 333, pp. 18:1-18:25, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{karimipour_et_al:LIPIcs.ECOOP.2025.18,
  author =	{Karimipour, Nima and Das, Kanak and Sridharan, Manu and Hassanshahi, Behnaz},
  title =	{{Practical Type-Based Taint Checking and Inference}},
  booktitle =	{39th European Conference on Object-Oriented Programming (ECOOP 2025)},
  pages =	{18:1--18:25},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-373-7},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{333},
  editor =	{Aldrich, Jonathan and Silva, Alexandra},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ECOOP.2025.18},
  URN =		{urn:nbn:de:0030-drops-233119},
  doi =		{10.4230/LIPIcs.ECOOP.2025.18},
  annote =	{Keywords: Static analysis, Taint Analysis, Pluggable type systems, Security, Inference}
}
