160 Search Results for "David, Nicolas"


Document
Conversational Agents: A Framework for Evaluation (CAFE) (Dagstuhl Perspectives Workshop 24352)

Authors: Christine Bauer, Li Chen, Nicola Ferro, Norbert Fuhr, Avishek Anand, Timo Breuer, Guglielmo Faggioli, Ophir Frieder, Hideo Joho, Jussi Karlgren, Johannes Kiesel, Bart P. Knijnenburg, Aldo Lipani, Lien Michiels, Andrea Papenmeier, Maria Soledad Pera, Mark Sanderson, Scott Sanner, Benno Stein, Johanne R. Trippas, Karin Verspoor, and Martijn C. Willemsen

Published in: Dagstuhl Manifestos, Volume 11, Issue 1 (2025)


Abstract
During the workshop, we discussed in depth what CONversational Information ACcess (CONIAC) is and what its unique features are, proposed a world model abstracting it, and defined the Conversational Agents Framework for Evaluation (CAFE) for the evaluation of CONIAC systems, consisting of six major components: 1) goals of the system’s stakeholders, 2) user tasks to be studied in the evaluation, 3) aspects of the users carrying out the tasks, 4) evaluation criteria to be considered, 5) evaluation methodology to be applied, and 6) measures for the quantitative criteria chosen.
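
The six CAFE components lend themselves to a structured record of an evaluation design. The following is a minimal illustrative sketch (not taken from the manifesto); all field names and example values are assumptions of this sketch.

from dataclasses import dataclass
from typing import List

@dataclass
class CAFEEvaluationDesign:
    """Illustrative container mirroring the six CAFE components."""
    stakeholder_goals: List[str]   # 1) goals of the system's stakeholders
    user_tasks: List[str]          # 2) user tasks to be studied in the evaluation
    user_aspects: List[str]        # 3) aspects of the users carrying out the tasks
    criteria: List[str]            # 4) evaluation criteria to be considered
    methodology: str               # 5) evaluation methodology to be applied
    measures: List[str]            # 6) measures for the quantitative criteria chosen

# Hypothetical example, for illustration only.
design = CAFEEvaluationDesign(
    stakeholder_goals=["user satisfaction", "provider revenue"],
    user_tasks=["find a suitable paper", "plan a trip"],
    user_aspects=["domain expertise", "patience"],
    criteria=["effectiveness", "efficiency", "trust"],
    methodology="controlled user study",
    measures=["task completion rate", "session length"],
)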

Cite as

Christine Bauer, Li Chen, Nicola Ferro, Norbert Fuhr, Avishek Anand, Timo Breuer, Guglielmo Faggioli, Ophir Frieder, Hideo Joho, Jussi Karlgren, Johannes Kiesel, Bart P. Knijnenburg, Aldo Lipani, Lien Michiels, Andrea Papenmeier, Maria Soledad Pera, Mark Sanderson, Scott Sanner, Benno Stein, Johanne R. Trippas, Karin Verspoor, and Martijn C. Willemsen. Conversational Agents: A Framework for Evaluation (CAFE) (Dagstuhl Perspectives Workshop 24352). In Dagstuhl Manifestos, Volume 11, Issue 1, pp. 19-67, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@Article{bauer_et_al:DagMan.11.1.19,
  author =	{Bauer, Christine and Chen, Li and Ferro, Nicola and Fuhr, Norbert and Anand, Avishek and Breuer, Timo and Faggioli, Guglielmo and Frieder, Ophir and Joho, Hideo and Karlgren, Jussi and Kiesel, Johannes and Knijnenburg, Bart P. and Lipani, Aldo and Michiels, Lien and Papenmeier, Andrea and Pera, Maria Soledad and Sanderson, Mark and Sanner, Scott and Stein, Benno and Trippas, Johanne R. and Verspoor, Karin and Willemsen, Martijn C.},
  title =	{{Conversational Agents: A Framework for Evaluation (CAFE) (Dagstuhl Perspectives Workshop 24352)}},
  pages =	{19--67},
  journal =	{Dagstuhl Manifestos},
  ISSN =	{2193-2433},
  year =	{2025},
  volume =	{11},
  number =	{1},
  editor =	{Bauer, Christine and Chen, Li and Ferro, Nicola and Fuhr, Norbert and Anand, Avishek and Breuer, Timo and Faggioli, Guglielmo and Frieder, Ophir and Joho, Hideo and Karlgren, Jussi and Kiesel, Johannes and Knijnenburg, Bart P. and Lipani, Aldo and Michiels, Lien and Papenmeier, Andrea and Pera, Maria Soledad and Sanderson, Mark and Sanner, Scott and Stein, Benno and Trippas, Johanne R. and Verspoor, Karin and Willemsen, Martijn C.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagMan.11.1.19},
  URN =		{urn:nbn:de:0030-drops-252722},
  doi =		{10.4230/DagMan.11.1.19},
  annote =	{Keywords: Conversational Agents, Evaluation, Information Access}
}
Document
Climate Change: What is Computing’s Responsibility? (Dagstuhl Perspectives Workshop 25122)

Authors: Bran Knowles, Vicki L. Hanson, Christoph Becker, Mike Berners-Lee, Andrew A. Chien, Benoit Combemale, Vlad Coroamă, Koen De Bosschere, Yi Ding, Adrian Friday, Boris Gamazaychikov, Lynda Hardman, Simon Hinterholzer, Mattias Höjer, Lynn Kaack, Lenneke Kuijer, Anne-Laure Ligozat, Jan Tobias Muehlberg, Yunmook Nah, Thomas Olsson, Anne-Cécile Orgerie, Daniel Pargman, Birgit Penzenstadler, Tom Romanoff, Emma Strubell, Colin Venters, and Junhua Zhao

Published in: Dagstuhl Manifestos, Volume 11, Issue 1 (2025)


Abstract
This Manifesto was produced from the Perspectives Workshop 25122 entitled "Climate Change: What is Computing’s Responsibility?" held March 16-19, 2025 at Schloss Dagstuhl, Germany. The Workshop provided a forum for world-leading computer scientists and expert consultants on environmental policy and sustainable transition to engage in a critical and urgent conversation about computing’s responsibilities in addressing climate change - or more aptly, climate crisis. The resulting Manifesto outlines commitments and directions for future action which, if adopted as a basis for more responsible computing practices, will help ensure that these technologies do not threaten the long-term habitability of the planet. We preface our Manifesto with a recognition that humanity is on a path that is not in agreement with international global warming targets and explore how computing technologies are currently hastening the overshoot of these boundaries. We critically assess the vaunted potential for harnessing computing technologies for the mitigation of global warming, agreeing that, under current circumstances, computing is contributing to negative environmental impacts in other sectors. Computing primarily improves efficiency and reduces costs which leads to more consumption and more negative environmental impact. Relying solely on efficiency gains in computing has thus far proven to be insufficient to curb global greenhouse gas emissions. Therefore, computing’s purpose within a strategy for tackling climate change must be reimagined. Our recommendations cover changes that need to be urgently made to the design priorities of computing technologies, but also speak to the more systemic shift in mindset, with sustainability and human rights providing a necessary moral foundation for developing the kinds of computing technologies most needed by society. We also stress the importance of digital policy that accounts for both the direct material impacts of computing and the detrimental indirect impacts arising from computing-enabled efficiencies, and the role of computing professionals in informing policy making.

Cite as

Bran Knowles, Vicki L. Hanson, Christoph Becker, Mike Berners-Lee, Andrew A. Chien, Benoit Combemale, Vlad Coroamă, Koen De Bosschere, Yi Ding, Adrian Friday, Boris Gamazaychikov, Lynda Hardman, Simon Hinterholzer, Mattias Höjer, Lynn Kaack, Lenneke Kuijer, Anne-Laure Ligozat, Jan Tobias Muehlberg, Yunmook Nah, Thomas Olsson, Anne-Cécile Orgerie, Daniel Pargman, Birgit Penzenstadler, Tom Romanoff, Emma Strubell, Colin Venters, and Junhua Zhao. Climate Change: What is Computing’s Responsibility? (Dagstuhl Perspectives Workshop 25122). In Dagstuhl Manifestos, Volume 11, Issue 1, pp. 1-18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@Article{knowles_et_al:DagMan.11.1.1,
  author =	{Knowles, Bran and Hanson, Vicki L. and Becker, Christoph and Berners-Lee, Mike and Chien, Andrew A. and Combemale, Benoit and Coroam\u{a}, Vlad and De Bosschere, Koen and Ding, Yi and Friday, Adrian and Gamazaychikov, Boris and Hardman, Lynda and Hinterholzer, Simon and H\"{o}jer, Mattias and Kaack, Lynn and Kuijer, Lenneke and Ligozat, Anne-Laure and Muehlberg, Jan Tobias and Nah, Yunmook and Olsson, Thomas and Orgerie, Anne-C\'{e}cile and Pargman, Daniel and Penzenstadler, Birgit and Romanoff, Tom and Strubell, Emma and Venters, Colin and Zhao, Junhua},
  title =	{{Climate Change: What is Computing’s Responsibility? (Dagstuhl Perspectives Workshop 25122)}},
  pages =	{1--18},
  journal =	{Dagstuhl Manifestos},
  ISSN =	{2193-2433},
  year =	{2025},
  volume =	{11},
  number =	{1},
  editor =	{Knowles, Bran and Hanson, Vicki L. and Becker, Christoph and Berners-Lee, Mike and Chien, Andrew A. and Combemale, Benoit and Coroam\u{a}, Vlad and De Bosschere, Koen and Ding, Yi and Friday, Adrian and Gamazaychikov, Boris and Hardman, Lynda and Hinterholzer, Simon and H\"{o}jer, Mattias and Kaack, Lynn and Kuijer, Lenneke and Ligozat, Anne-Laure and Muehlberg, Jan Tobias and Nah, Yunmook and Olsson, Thomas and Orgerie, Anne-C\'{e}cile and Pargman, Daniel and Penzenstadler, Birgit and Romanoff, Tom and Strubell, Emma and Venters, Colin and Zhao, Junhua},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagMan.11.1.1},
  URN =		{urn:nbn:de:0030-drops-250724},
  doi =		{10.4230/DagMan.11.1.1},
  annote =	{Keywords: sustainability, climate change, efficiency, supply chain management, climate modelling}
}
Document
Survey
Resilience in Knowledge Graph Embeddings

Authors: Arnab Sharma, N'Dah Jean Kouagou, and Axel-Cyrille Ngonga Ngomo

Published in: Transactions on Graph Data and Knowledge (TGDK), Volume 3, Issue 2 (2025)


Abstract
In recent years, knowledge graphs have attracted growing interest and found widespread application in various domains, such as information retrieval, question answering, and recommendation systems. Large-scale knowledge graphs have demonstrated their utility in effectively representing structured knowledge. To further facilitate the application of machine learning techniques, knowledge graph embedding (KGE) models have been developed. Such models transform entities and relationships within knowledge graphs into vectors. However, these embedding models often face challenges related to noise, missing information, distribution shift, adversarial attacks, and more. This can lead to sub-optimal embeddings and incorrect inferences, thereby negatively impacting downstream applications. While the existing literature has so far focused on adversarial attacks on KGE models, the challenges related to the other critical aspects remain unexplored. In this paper, we first give a unified definition of resilience, encompassing several factors such as generalization, in-distribution generalization, distribution adaptation, and robustness. After formalizing these concepts for machine learning in general, we define them in the context of knowledge graphs. To identify the gaps in existing work on resilience in the context of knowledge graphs, we perform a systematic survey taking into account all of the aspects mentioned above. Our survey results show that most existing works focus on a specific aspect of resilience, namely robustness. After categorizing these works based on their respective aspects of resilience, we discuss the challenges and future research directions.
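
As background for the embedding models the survey concerns, a minimal sketch of a translational (TransE-style) scoring function is shown below. The triples, dimensionality, and random vectors are made up for illustration and are not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)
dim = 8
entities = {"Berlin": rng.normal(size=dim), "Germany": rng.normal(size=dim)}
relations = {"capital_of": rng.normal(size=dim)}

def transe_score(head, rel, tail):
    """Lower is better: distance between the translated head and the tail."""
    return float(np.linalg.norm(entities[head] + relations[rel] - entities[tail]))

# Noise or an adversarial perturbation of an embedding changes downstream scores,
# which is the kind of fragility the survey's notion of resilience is about.
print(transe_score("Berlin", "capital_of", "Germany"))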

Cite as

Arnab Sharma, N'Dah Jean Kouagou, and Axel-Cyrille Ngonga Ngomo. Resilience in Knowledge Graph Embeddings. In Transactions on Graph Data and Knowledge (TGDK), Volume 3, Issue 2, pp. 1:1-1:38, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@Article{sharma_et_al:TGDK.3.2.1,
  author =	{Sharma, Arnab and Kouagou, N'Dah Jean and Ngomo, Axel-Cyrille Ngonga},
  title =	{{Resilience in Knowledge Graph Embeddings}},
  journal =	{Transactions on Graph Data and Knowledge},
  pages =	{1:1--1:38},
  ISSN =	{2942-7517},
  year =	{2025},
  volume =	{3},
  number =	{2},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/TGDK.3.2.1},
  URN =		{urn:nbn:de:0030-drops-248117},
  doi =		{10.4230/TGDK.3.2.1},
  annote =	{Keywords: Knowledge graphs, Resilience, Robustness}
}
Document
The Planted Orthogonal Vectors Problem

Authors: David Kühnemann, Adam Polak, and Alon Rosen

Published in: LIPIcs, Volume 351, 33rd Annual European Symposium on Algorithms (ESA 2025)


Abstract
In the k-Orthogonal Vectors (k-OV) problem we are given k sets, each containing n binary vectors of dimension d = n^o(1), and our goal is to pick one vector from each set so that at each coordinate at least one vector has a zero. It is a central problem in fine-grained complexity, conjectured to require n^{k-o(1)} time in the worst case. We propose a way to plant a solution among vectors with i.i.d. p-biased entries, for appropriately chosen p, so that the planted solution is the unique one. Our conjecture is that the resulting k-OV instances still require time n^{k-o(1)} to solve, on average. Our planted distribution has the property that any subset of strictly less than k vectors has the same marginal distribution as in the model distribution, consisting of i.i.d. p-biased random vectors. We use this property to give average-case search-to-decision reductions for k-OV.
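
For concreteness, here is a brute-force checker and an i.i.d. p-biased instance generator for k-OV in Python. This only illustrates the problem statement; it is not the planting procedure proposed in the paper.

import itertools, random

def is_orthogonal(vectors):
    """True iff at each coordinate at least one of the chosen vectors has a zero."""
    return all(any(v[i] == 0 for v in vectors) for i in range(len(vectors[0])))

def has_k_ov_solution(sets):
    """Brute force over all ways to pick one vector per set: n^k combinations."""
    return any(is_orthogonal(choice) for choice in itertools.product(*sets))

def random_instance(k, n, d, p, seed=0):
    """k sets of n binary vectors whose entries are 1 with probability p, independently."""
    rng = random.Random(seed)
    return [[[1 if rng.random() < p else 0 for _ in range(d)]
             for _ in range(n)] for _ in range(k)]

sets = random_instance(k=3, n=20, d=10, p=0.7)
print(has_k_ov_solution(sets))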

Cite as

David Kühnemann, Adam Polak, and Alon Rosen. The Planted Orthogonal Vectors Problem. In 33rd Annual European Symposium on Algorithms (ESA 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 351, pp. 95:1-95:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{kuhnemann_et_al:LIPIcs.ESA.2025.95,
  author =	{K\"{u}hnemann, David and Polak, Adam and Rosen, Alon},
  title =	{{The Planted Orthogonal Vectors Problem}},
  booktitle =	{33rd Annual European Symposium on Algorithms (ESA 2025)},
  pages =	{95:1--95:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-395-9},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{351},
  editor =	{Benoit, Anne and Kaplan, Haim and Wild, Sebastian and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2025.95},
  URN =		{urn:nbn:de:0030-drops-245640},
  doi =		{10.4230/LIPIcs.ESA.2025.95},
  annote =	{Keywords: Average-case complexity, fine-grained complexity, orthogonal vectors}
}
Document
Bandwidth vs BFS Width in Matrix Reordering, Graph Reconstruction, and Graph Drawing

Authors: David Eppstein, Michael T. Goodrich, and Songyu (Alfred) Liu

Published in: LIPIcs, Volume 351, 33rd Annual European Symposium on Algorithms (ESA 2025)


Abstract
We provide the first approximation quality guarantees for the Cuthill-McKee heuristic for reordering symmetric matrices to have low bandwidth, and we provide an algorithm for reconstructing bounded-bandwidth graphs from distance oracles with near-linear query complexity. To prove these results we introduce a new width parameter, BFS width, and we prove polylogarithmic upper and lower bounds on the BFS width of graphs of bounded bandwidth. Unlike other width parameters, such as bandwidth, pathwidth, and treewidth, BFS width can easily be computed in polynomial time. Bounded BFS width implies bounded bandwidth, pathwidth, and treewidth, which in turn imply fixed-parameter tractable algorithms for many problems that are NP-hard for general graphs. In addition to these applications to matrix reordering, we also provide applications of BFS width to graph reconstruction, to reconstruct graphs from distance queries, and to graph drawing, to construct arc diagrams of small height.
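
To illustrate the objects involved, the sketch below computes the bandwidth of a vertex ordering and produces a Cuthill-McKee-style BFS ordering (neighbors visited in order of increasing degree). It is a toy illustration, not the paper's analysis or its BFS width parameter.

from collections import deque

def bandwidth(adj, order):
    """Maximum |pos(u) - pos(v)| over edges, for the given vertex ordering."""
    pos = {v: i for i, v in enumerate(order)}
    return max(abs(pos[u] - pos[v]) for u in adj for v in adj[u])

def cuthill_mckee(adj, start):
    """BFS ordering that visits neighbors by increasing degree (connected graphs)."""
    order, seen, queue = [], {start}, deque([start])
    while queue:
        u = queue.popleft()
        order.append(u)
        for v in sorted(adj[u], key=lambda w: len(adj[w])):
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return order

# 4-cycle: edges {0-1, 1-2, 2-3, 3-0}
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
order = cuthill_mckee(adj, 0)
print(order, bandwidth(adj, order))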

Cite as

David Eppstein, Michael T. Goodrich, and Songyu (Alfred) Liu. Bandwidth vs BFS Width in Matrix Reordering, Graph Reconstruction, and Graph Drawing. In 33rd Annual European Symposium on Algorithms (ESA 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 351, pp. 69:1-69:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{eppstein_et_al:LIPIcs.ESA.2025.69,
  author =	{Eppstein, David and Goodrich, Michael T. and Liu, Songyu (Alfred)},
  title =	{{Bandwidth vs BFS Width in Matrix Reordering, Graph Reconstruction, and Graph Drawing}},
  booktitle =	{33rd Annual European Symposium on Algorithms (ESA 2025)},
  pages =	{69:1--69:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-395-9},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{351},
  editor =	{Benoit, Anne and Kaplan, Haim and Wild, Sebastian and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2025.69},
  URN =		{urn:nbn:de:0030-drops-245373},
  doi =		{10.4230/LIPIcs.ESA.2025.69},
  annote =	{Keywords: Graph algorithms, graph theory, graph width, bandwidth, treewidth}
}
Document
On Finding 𝓁-Th Smallest Perfect Matchings

Authors: Nicolas El Maalouly, Sebastian Haslebacher, Adrian Taubner, and Lasse Wulf

Published in: LIPIcs, Volume 351, 33rd Annual European Symposium on Algorithms (ESA 2025)


Abstract
Given an undirected weighted graph G and an integer k, Exact-Weight Perfect Matching (EWPM) is the problem of finding a perfect matching of weight exactly k in G. In this paper, we study EWPM and its variants. The EWPM problem is famous, since in the case of unary encoded weights, Mulmuley, Vazirani, and Vazirani showed almost 40 years ago that the problem can be solved in randomized polynomial time. However, to this date no derandomization is known. Our first result is a simple deterministic algorithm for EWPM that runs in time n^𝒪(𝓁), where 𝓁 is the number of distinct weights that perfect matchings in G can take. In fact, we show how to find an 𝓁-th smallest perfect matching in any weighted graph (even if the weights are encoded in binary, in which case EWPM in general is known to be NP-complete) in time n^𝒪(𝓁) for any integer 𝓁. Similar next-to-optimal variants have also been studied recently for the shortest path problem. For our second result, we extend the list of problems that are known to be equivalent to EWPM. We show that EWPM is equivalent under a weight-preserving reduction to the Exact Cycle Sum problem (ECS) in undirected graphs with a conservative (i.e. no negative cycles) weight function. To the best of our knowledge, we are the first to study this problem. As a consequence, the latter problem is contained in RP if the weights are encoded in unary. We also identify a special case of EWPM, called BCPM, which was recently studied by El Maalouly, Steiner and Wulf. We show that BCPM is equivalent under a weight-preserving transformation to another problem recently studied by Schlotter and Sebő as well as Geelen and Kapadia: the Shortest Odd Cycle problem (SOC) in undirected graphs with conservative weights. Finally, our n^𝒪(𝓁) algorithm works in this setting as well, identifying a tractable special case of SOC, BCPM, and ECS.
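
As a reference for the problem definition only, here is a brute-force Exact-Weight Perfect Matching routine for very small graphs in Python; the paper's deterministic n^𝒪(𝓁) algorithm is, of course, entirely different.

def perfect_matchings(vertices, edges):
    """Yield perfect matchings of a small graph; edges maps frozenset({u, v}) -> weight."""
    if not vertices:
        yield []
        return
    u = min(vertices)
    for v in vertices:
        if v != u and frozenset({u, v}) in edges:
            for m in perfect_matchings(vertices - {u, v}, edges):
                yield [(u, v)] + m

def exact_weight_pm(vertices, edges, k):
    """Return a perfect matching of total weight exactly k, or None if none exists."""
    for m in perfect_matchings(vertices, edges):
        if sum(edges[frozenset(e)] for e in m) == k:
            return m
    return None

edges = {frozenset({1, 2}): 3, frozenset({3, 4}): 1,
         frozenset({1, 3}): 2, frozenset({2, 4}): 2}
print(exact_weight_pm({1, 2, 3, 4}, edges, 4))  # [(1, 2), (3, 4)]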

Cite as

Nicolas El Maalouly, Sebastian Haslebacher, Adrian Taubner, and Lasse Wulf. On Finding 𝓁-Th Smallest Perfect Matchings. In 33rd Annual European Symposium on Algorithms (ESA 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 351, pp. 19:1-19:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{elmaalouly_et_al:LIPIcs.ESA.2025.19,
  author =	{El Maalouly, Nicolas and Haslebacher, Sebastian and Taubner, Adrian and Wulf, Lasse},
  title =	{{On Finding 𝓁-Th Smallest Perfect Matchings}},
  booktitle =	{33rd Annual European Symposium on Algorithms (ESA 2025)},
  pages =	{19:1--19:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-395-9},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{351},
  editor =	{Benoit, Anne and Kaplan, Haim and Wild, Sebastian and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2025.19},
  URN =		{urn:nbn:de:0030-drops-244875},
  doi =		{10.4230/LIPIcs.ESA.2025.19},
  annote =	{Keywords: Exact Matching, Perfect Matching, Exact-Weight Perfect Matching, Shortest Odd Cycle, Exact Cycle Sum, l-th Smallest Solution, l-th Largest Solution, k-th Best Solution, Derandomization}
}
Document
Hardness of Median and Center in the Ulam Metric

Authors: Nick Fischer, Elazar Goldenberg, Mursalin Habib, and Karthik C. S.

Published in: LIPIcs, Volume 351, 33rd Annual European Symposium on Algorithms (ESA 2025)


Abstract
The classical rank aggregation problem seeks to combine a set X of n permutations into a single representative "consensus" permutation. In this paper, we investigate two fundamental rank aggregation tasks under the well-studied Ulam metric: computing a median permutation (which minimizes the sum of Ulam distances to X) and computing a center permutation (which minimizes the maximum Ulam distance to X) in two settings. - Continuous Setting: In the continuous setting, the median/center is allowed to be any permutation. It is known that computing a center in the Ulam metric is NP-hard and we add to this by showing that computing a median is NP-hard as well via a simple reduction from the Max-Cut problem. While this result may not be unexpected, it had remained elusive until now and confirms a speculation by Chakraborty, Das, and Krauthgamer [SODA '21]. - Discrete Setting: In the discrete setting, the median/center must be a permutation from the input set. We fully resolve the fine-grained complexity of the discrete median and discrete center problems under the Ulam metric, proving that the naive Õ(n² L)-time algorithm (where L is the length of the permutation) is conditionally optimal. This resolves an open problem raised by Abboud, Bateni, Cohen-Addad, Karthik C. S., and Seddighin [APPROX '23]. Our reductions are inspired by the known fine-grained lower bounds for similarity measures, but we face and overcome several new highly technical challenges.
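
For reference, the Ulam distance between two permutations of length L equals L minus the length of their longest common subsequence, which for permutations reduces to a longest increasing subsequence computation. The sketch below implements this together with the naive discrete median/center; it illustrates the definitions only and makes no claim about the paper's reductions.

from bisect import bisect_left

def ulam(sigma, tau):
    """Ulam distance = L - LCS(sigma, tau); for permutations, LCS reduces to LIS."""
    pos = {v: i for i, v in enumerate(sigma)}
    seq = [pos[v] for v in tau]
    tails = []  # patience sorting: tails[i] = smallest tail of an increasing subsequence of length i+1
    for x in seq:
        i = bisect_left(tails, x)
        if i == len(tails):
            tails.append(x)
        else:
            tails[i] = x
    return len(sigma) - len(tails)

def discrete_median(perms):
    return min(perms, key=lambda p: sum(ulam(p, q) for q in perms))

def discrete_center(perms):
    return min(perms, key=lambda p: max(ulam(p, q) for q in perms))

X = [(1, 2, 3, 4), (2, 1, 3, 4), (4, 3, 2, 1)]
print(discrete_median(X), discrete_center(X))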

Cite as

Nick Fischer, Elazar Goldenberg, Mursalin Habib, and Karthik C. S.. Hardness of Median and Center in the Ulam Metric. In 33rd Annual European Symposium on Algorithms (ESA 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 351, pp. 111:1-111:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{fischer_et_al:LIPIcs.ESA.2025.111,
  author =	{Fischer, Nick and Goldenberg, Elazar and Habib, Mursalin and Karthik C. S.},
  title =	{{Hardness of Median and Center in the Ulam Metric}},
  booktitle =	{33rd Annual European Symposium on Algorithms (ESA 2025)},
  pages =	{111:1--111:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-395-9},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{351},
  editor =	{Benoit, Anne and Kaplan, Haim and Wild, Sebastian and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2025.111},
  URN =		{urn:nbn:de:0030-drops-245809},
  doi =		{10.4230/LIPIcs.ESA.2025.111},
  annote =	{Keywords: Ulam distance, median, center, rank aggregation, fine-grained complexity}
}
Document
Going Beyond Surfaces in Diameter Approximation

Authors: Michał Włodarczyk

Published in: LIPIcs, Volume 351, 33rd Annual European Symposium on Algorithms (ESA 2025)


Abstract
Calculating the diameter of an undirected graph requires quadratic running time under the Strong Exponential Time Hypothesis, and this barrier holds even for approximation factors better than 3/2. For planar graphs with positive edge weights, there are known (1+ε)-approximation algorithms with running time poly(1/ε, log n) ⋅ n. However, these algorithms rely on shortest-path separators, and this technique falls short of yielding efficient algorithms beyond graphs of bounded genus. In this work we depart from embedding-based arguments and obtain diameter approximations relying on VC set systems and the local treewidth property. We present two orthogonal extensions of the planar case by giving (1+ε)-approximation algorithms with the following running times: - an 𝒪_h((1/ε)^𝒪(h) ⋅ n log² n)-time algorithm for graphs excluding an apex graph of size h as a minor, - an 𝒪_d((1/ε)^𝒪(d) ⋅ n log² n)-time algorithm for the class of d-apex graphs. As a stepping stone, we obtain efficient (1+ε)-approximate distance oracles for graphs excluding an apex graph of size h as a minor. Our oracle has preprocessing time 𝒪_h((1/ε)⁸ ⋅ n log n log W) and query time 𝒪_h((1/ε)² ⋅ log n log W), where W is the metric stretch. Such oracles have so far been known only for bounded-genus graphs. All our algorithms are deterministic.
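
As a baseline for contrast with the paper's near-linear approximation schemes, here is the naive quadratic-style exact computation (Dijkstra from every vertex) for a weighted graph. This is only the reference everyone improves upon, not the paper's algorithm.

import heapq

def dijkstra(adj, src):
    """Single-source shortest paths; adj maps vertex -> list of (neighbor, weight)."""
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

def exact_diameter(adj):
    """Runs Dijkstra from every vertex: the quadratic barrier the abstract refers to."""
    return max(max(dijkstra(adj, s).values()) for s in adj)

adj = {0: [(1, 1.0)], 1: [(0, 1.0), (2, 2.0)], 2: [(1, 2.0)]}
print(exact_diameter(adj))  # 3.0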

Cite as

Michał Włodarczyk. Going Beyond Surfaces in Diameter Approximation. In 33rd Annual European Symposium on Algorithms (ESA 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 351, pp. 39:1-39:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{wlodarczyk:LIPIcs.ESA.2025.39,
  author =	{W{\l}odarczyk, Micha{\l}},
  title =	{{Going Beyond Surfaces in Diameter Approximation}},
  booktitle =	{33rd Annual European Symposium on Algorithms (ESA 2025)},
  pages =	{39:1--39:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-395-9},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{351},
  editor =	{Benoit, Anne and Kaplan, Haim and Wild, Sebastian and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2025.39},
  URN =		{urn:nbn:de:0030-drops-245076},
  doi =		{10.4230/LIPIcs.ESA.2025.39},
  annote =	{Keywords: diameter, approximation, distance oracles, graph minors, treewidth}
}
Document
Hardness of Computation of Quantum Invariants on 3-Manifolds with Restricted Topology

Authors: Henrique Ennes and Clément Maria

Published in: LIPIcs, Volume 351, 33rd Annual European Symposium on Algorithms (ESA 2025)


Abstract
Quantum invariants in low-dimensional topology offer a wide variety of valuable invariants about knots and 3-manifolds, presented by explicit formulas that are readily computable. Their computational complexity has been actively studied and is tightly connected to topological quantum computing. In this article, we prove that for any 3-manifold quantum invariant in the Reshetikhin-Turaev model, there is a deterministic polynomial time algorithm that, given as input an arbitrary closed 3-manifold M, outputs a closed 3-manifold M' with the same quantum invariant, such that M' is hyperbolic, contains no low genus embedded incompressible surface, and is presented by a strongly irreducible Heegaard diagram. Our construction relies on properties of Heegaard splittings and the Hempel distance. At the level of computational complexity, this proves that the hardness of computing a given quantum invariant of 3-manifolds is preserved even when severely restricting the topology and the combinatorics of the input. This positively answers a question raised by Samperton [Samperton, 2023].

Cite as

Henrique Ennes and Clément Maria. Hardness of Computation of Quantum Invariants on 3-Manifolds with Restricted Topology. In 33rd Annual European Symposium on Algorithms (ESA 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 351, pp. 37:1-37:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{ennes_et_al:LIPIcs.ESA.2025.37,
  author =	{Ennes, Henrique and Maria, Cl\'{e}ment},
  title =	{{Hardness of Computation of Quantum Invariants on 3-Manifolds with Restricted Topology}},
  booktitle =	{33rd Annual European Symposium on Algorithms (ESA 2025)},
  pages =	{37:1--37:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-395-9},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{351},
  editor =	{Benoit, Anne and Kaplan, Haim and Wild, Sebastian and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2025.37},
  URN =		{urn:nbn:de:0030-drops-245057},
  doi =		{10.4230/LIPIcs.ESA.2025.37},
  annote =	{Keywords: 3-manifold, Heegaard splitting, Hempel distance, Quantum invariant, polynomial time reduction}
}
Document
A Mechanized First-Order Theory of Algebraic Data Types with Pattern Matching

Authors: Joshua M. Cohen

Published in: LIPIcs, Volume 352, 16th International Conference on Interactive Theorem Proving (ITP 2025)


Abstract
Algebraic data types (ADTs) and pattern matching are widely used to write elegant functional programs and to specify program behavior. These constructs are critical to most general-purpose interactive theorem provers (e.g. Lean, Rocq/Coq), first-order SMT-based deductive verifiers (e.g. Dafny, VeriFast), and intermediate verification languages (e.g. Why3). Such features require layers of compilation - in Rocq, pattern matches are compiled to remove nesting, while SMT-based tools further axiomatize ADTs with a first-order specification. However, these critical steps have been omitted from prior formalizations of such toolchains (e.g. MetaRocq). We give the first proved-sound sophisticated pattern matching compiler (based on Maranget’s compilation to decision trees) and first-order axiomatization of ADTs, both based on Why3 implementations. We prove the soundness of exhaustiveness checking, extending pen-and-paper proofs from the literature, and formulate a robustness property with which we find an exhaustiveness-related bug in Why3. We show that many of our proofs could be useful for reasoning about any first-order program verifier supporting ADTs.
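
To give a flavor of what compiling pattern matches to decision trees means, the sketch below handles only the simplest case: a single scrutinee matched against flat constructor patterns and a wildcard, i.e. one step of a Maranget-style compiler. It is a toy written for this listing, not the Why3 implementation verified in the paper.

def compile_match(rows):
    """rows: list of (pattern, action), where pattern is a constructor name or '_' (wildcard).
    Produces a 'switch' decision tree: constructor -> action, plus an optional default."""
    switch, default = {}, None
    for pat, action in rows:
        if pat == "_":
            default = action            # rows after a wildcard are unreachable
            break
        switch.setdefault(pat, action)  # first matching row wins
    return {"switch": switch, "default": default}

def run(tree, constructor):
    if constructor in tree["switch"]:
        return tree["switch"][constructor]
    if tree["default"] is not None:
        return tree["default"]
    raise ValueError("non-exhaustive match")  # what an exhaustiveness check would rule out

tree = compile_match([("Nil", "empty"), ("Cons", "nonempty")])
print(run(tree, "Cons"))  # nonempty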

Cite as

Joshua M. Cohen. A Mechanized First-Order Theory of Algebraic Data Types with Pattern Matching. In 16th International Conference on Interactive Theorem Proving (ITP 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 352, pp. 5:1-5:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{cohen:LIPIcs.ITP.2025.5,
  author =	{Cohen, Joshua M.},
  title =	{{A Mechanized First-Order Theory of Algebraic Data Types with Pattern Matching}},
  booktitle =	{16th International Conference on Interactive Theorem Proving (ITP 2025)},
  pages =	{5:1--5:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-396-6},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{352},
  editor =	{Forster, Yannick and Keller, Chantal},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITP.2025.5},
  URN =		{urn:nbn:de:0030-drops-246046},
  doi =		{10.4230/LIPIcs.ITP.2025.5},
  annote =	{Keywords: Pattern Matching Compilation, Algebraic Data Types, First-Order Logic}
}
Document
A Certified Proof Checker for Deep Neural Network Verification in Imandra

Authors: Remi Desmartin, Omri Isac, Grant Passmore, Ekaterina Komendantskaya, Kathrin Stark, and Guy Katz

Published in: LIPIcs, Volume 352, 16th International Conference on Interactive Theorem Proving (ITP 2025)


Abstract
Recent advances in the verification of deep neural networks (DNNs) have opened the way for broader use of DNN verification technology in many application areas, including safety-critical ones. However, DNN verifiers are themselves complex programs that have been shown to be susceptible to errors and numerical imprecision; this, in turn, has raised the question of trust in DNN verifiers. One prominent attempt to address this issue is enhancing DNN verifiers with the capability of producing certificates of their results that are subject to independent algorithmic checking. While certificate checking has already been implemented on top of the state-of-the-art DNN verifier Marabou, that checker is written in C++, and the code itself raises the question of trust (e.g., in the precision of floating point calculations or guarantees for implementation soundness). Here, we present an alternative implementation of Marabou certificate checking in Imandra - an industrial functional programming language and an interactive theorem prover (ITP) - that allows us to obtain a full proof of certificate correctness. The significance of the result is two-fold. Firstly, it gives stronger independent guarantees for Marabou proofs. Secondly, it opens the way for the wider adoption of DNN verifiers in interactive theorem proving in the same way as many ITPs already incorporate SMT solvers.
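
The keywords mention Farkas' lemma, which underlies certificates of infeasibility for systems of linear constraints of the kind such checkers validate. Below is a minimal, purely illustrative check for a system Ax ≤ b; the actual Marabou certificate format and the Imandra development are considerably richer, and this sketch is an assumption-laden stand-in, not their API.

import numpy as np

def certifies_infeasibility(A, b, y, tol=1e-9):
    """Farkas-style check: y >= 0, y^T A = 0 and y^T b < 0 together certify that Ax <= b has no solution."""
    A, b, y = np.asarray(A, float), np.asarray(b, float), np.asarray(y, float)
    return bool(np.all(y >= -tol)
                and np.allclose(y @ A, 0.0, atol=tol)
                and y @ b < -tol)

# x <= 1 and -x <= -2 (i.e. x >= 2) are jointly infeasible; y = (1, 1) witnesses this.
A = [[1.0], [-1.0]]
b = [1.0, -2.0]
print(certifies_infeasibility(A, b, [1.0, 1.0]))  # True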

Cite as

Remi Desmartin, Omri Isac, Grant Passmore, Ekaterina Komendantskaya, Kathrin Stark, and Guy Katz. A Certified Proof Checker for Deep Neural Network Verification in Imandra. In 16th International Conference on Interactive Theorem Proving (ITP 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 352, pp. 1:1-1:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{desmartin_et_al:LIPIcs.ITP.2025.1,
  author =	{Desmartin, Remi and Isac, Omri and Passmore, Grant and Komendantskaya, Ekaterina and Stark, Kathrin and Katz, Guy},
  title =	{{A Certified Proof Checker for Deep Neural Network Verification in Imandra}},
  booktitle =	{16th International Conference on Interactive Theorem Proving (ITP 2025)},
  pages =	{1:1--1:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-396-6},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{352},
  editor =	{Forster, Yannick and Keller, Chantal},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITP.2025.1},
  URN =		{urn:nbn:de:0030-drops-246000},
  doi =		{10.4230/LIPIcs.ITP.2025.1},
  annote =	{Keywords: Neural Network Verification, Farkas Lemma, Proof Certification}
}
Document
A Formal Analysis of Algorithms for Matroids and Greedoids

Authors: Mohammad Abdulaziz, Thomas Ammer, Shriya Meenakshisundaram, and Adem Rimpapa

Published in: LIPIcs, Volume 352, 16th International Conference on Interactive Theorem Proving (ITP 2025)


Abstract
We present a formal analysis, in Isabelle/HOL, of optimisation algorithms for matroids, which are useful generalisations of combinatorial structures that occur in optimisation, and for greedoids, which are a generalisation of matroids. Although some formalisation work has been done earlier on matroids, our work presents the first formalisation of results on greedoids, and many of the results we formalise in relation to matroids are also formalised for the first time here. We formalise the analysis of a number of optimisation algorithms for matroids and greedoids. We also derive from those algorithms executable implementations of Kruskal’s algorithm for computing optimal spanning trees, an algorithm for maximum cardinality matching in bipartite graphs, and Prim’s algorithm for computing minimum weight spanning trees.
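
As a reminder of the flavor of the algorithms formalised, here is a standard (unverified) Python rendering of Kruskal's algorithm, i.e. the matroid greedy algorithm specialised to graphic matroids. It is illustration only and is unrelated to the Isabelle/HOL development.

def kruskal(n, edges):
    """Minimum spanning forest of a graph on vertices 0..n-1.
    edges: list of (weight, u, v). Greedy over the graphic matroid:
    add an edge iff it keeps the chosen edge set independent (acyclic)."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    forest = []
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            forest.append((u, v, w))
    return forest

print(kruskal(4, [(1, 0, 1), (2, 1, 2), (3, 2, 3), (4, 0, 3), (5, 0, 2)]))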

Cite as

Mohammad Abdulaziz, Thomas Ammer, Shriya Meenakshisundaram, and Adem Rimpapa. A Formal Analysis of Algorithms for Matroids and Greedoids. In 16th International Conference on Interactive Theorem Proving (ITP 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 352, pp. 2:1-2:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{abdulaziz_et_al:LIPIcs.ITP.2025.2,
  author =	{Abdulaziz, Mohammad and Ammer, Thomas and Meenakshisundaram, Shriya and Rimpapa, Adem},
  title =	{{A Formal Analysis of Algorithms for Matroids and Greedoids}},
  booktitle =	{16th International Conference on Interactive Theorem Proving (ITP 2025)},
  pages =	{2:1--2:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-396-6},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{352},
  editor =	{Forster, Yannick and Keller, Chantal},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITP.2025.2},
  URN =		{urn:nbn:de:0030-drops-246012},
  doi =		{10.4230/LIPIcs.ITP.2025.2},
  annote =	{Keywords: Matroids, Greedoids, Combinatorial Optimisation, Graph Algorithms, Isabelle/HOL, Formal Verification}
}
Document
Nondeterministic Asynchronous Dataflow in Isabelle/HOL

Authors: Rafael Castro Gonçalves Silva, Laouen Fernet, and Dmitriy Traytel

Published in: LIPIcs, Volume 352, 16th International Conference on Interactive Theorem Proving (ITP 2025)


Abstract
We formalize nondeterministic asynchronous dataflow networks in Isabelle/HOL. Dataflow networks are composed of operators that are capable of communicating with the network, performing silent computations, and making nondeterministic choices. We represent operators using a shallow embedding as codatatypes. Using this representation, we define standard asynchronous dataflow primitives, including sequential and parallel composition and a feedback operator. These primitives adhere to a number of laws from the literature, which we prove by coinduction, using weak bisimilarity as our notion of equality. Albeit coinductive and nondeterministic, our model is executable via code extraction to Haskell.
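
To make the composition primitives concrete, the sketch below models operators as functions from streams to streams using Python generators: sequential composition is function composition, and parallel composition runs two operators on the two halves of a stream of pairs. This is only an intuition aid; the paper's codatatype model, silent steps, feedback, and weak bisimilarity have no counterpart here.

from itertools import tee

def mapper(f):
    """Operator that applies f to every message on its input stream."""
    def op(stream):
        for x in stream:
            yield f(x)
    return op

def seq(op1, op2):
    """Sequential composition: feed op1's output stream into op2."""
    return lambda stream: op2(op1(stream))

def par(op1, op2):
    """Parallel composition on a stream of pairs: op1 on the left halves, op2 on the right."""
    def op(stream):
        left, right = tee(stream)
        yield from zip(op1(x for x, _ in left), op2(y for _, y in right))
    return op

doubler, negate = mapper(lambda x: 2 * x), mapper(lambda x: -x)
print(list(seq(doubler, negate)(iter([1, 2, 3]))))           # [-2, -4, -6]
print(list(par(doubler, negate)(iter([(1, 10), (2, 20)]))))  # [(2, -10), (4, -20)]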

Cite as

Rafael Castro Gonçalves Silva, Laouen Fernet, and Dmitriy Traytel. Nondeterministic Asynchronous Dataflow in Isabelle/HOL. In 16th International Conference on Interactive Theorem Proving (ITP 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 352, pp. 30:1-30:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{silva_et_al:LIPIcs.ITP.2025.30,
  author =	{Silva, Rafael Castro Gon\c{c}alves and Fernet, Laouen and Traytel, Dmitriy},
  title =	{{Nondeterministic Asynchronous Dataflow in Isabelle/HOL}},
  booktitle =	{16th International Conference on Interactive Theorem Proving (ITP 2025)},
  pages =	{30:1--30:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-396-6},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{352},
  editor =	{Forster, Yannick and Keller, Chantal},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITP.2025.30},
  URN =		{urn:nbn:de:0030-drops-246280},
  doi =		{10.4230/LIPIcs.ITP.2025.30},
  annote =	{Keywords: dataflow, verification, coinduction, Isabelle/HOL}
}
Document
Formally Verifying a Vertical Cell Decomposition Algorithm

Authors: Yves Bertot and Thomas Portet

Published in: LIPIcs, Volume 352, 16th International Conference on Interactive Theorem Proving (ITP 2025)


Abstract
The broad context of this work is the application of formal methods to geometry and robotics. We describe an algorithm to decompose a working area containing obstacles into a collection of safe cells, together with the formal proof that this algorithm is correct. We expect such an algorithm to be useful for computing safe trajectories. To our knowledge, this is one of the first formalizations of an algorithm that decomposes a working space into elementary cells suitable for later applications, with a proof of correctness guaranteeing that large parts of the working space are safe. The techniques used in this proof range from algebraic reasoning on coordinates and determinants to sorting. The main difficulty comes from the possible existence of degenerate cases, which are treated in a principled way.
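
The "algebraic reasoning on coordinates and determinants" mentioned in the abstract typically boils down to orientation predicates. A minimal sketch is below, using exact rational arithmetic to sidestep floating-point issues; it is a plain illustration, not a rendering of the formalized algorithm.

from fractions import Fraction

def orientation(p, q, r):
    """Sign of the determinant |q - p, r - p|: >0 left turn, <0 right turn, 0 collinear."""
    px, py = map(Fraction, p)
    qx, qy = map(Fraction, q)
    rx, ry = map(Fraction, r)
    det = (qx - px) * (ry - py) - (qy - py) * (rx - px)
    return (det > 0) - (det < 0)

def point_below_edge(p, a, b):
    """True if p lies strictly below the non-vertical segment a-b within its vertical slab."""
    if a[0] == b[0] or not min(a[0], b[0]) <= p[0] <= max(a[0], b[0]):
        return False
    left, right = (a, b) if a[0] < b[0] else (b, a)
    return orientation(left, right, p) < 0

print(orientation((0, 0), (1, 0), (0, 1)))        # 1: left turn
print(point_below_edge((1, -1), (0, 0), (2, 0)))  # True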

Cite as

Yves Bertot and Thomas Portet. Formally Verifying a Vertical Cell Decomposition Algorithm. In 16th International Conference on Interactive Theorem Proving (ITP 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 352, pp. 24:1-24:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{bertot_et_al:LIPIcs.ITP.2025.24,
  author =	{Bertot, Yves and Portet, Thomas},
  title =	{{Formally Verifying a Vertical Cell Decomposition Algorithm}},
  booktitle =	{16th International Conference on Interactive Theorem Proving (ITP 2025)},
  pages =	{24:1--24:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-396-6},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{352},
  editor =	{Forster, Yannick and Keller, Chantal},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITP.2025.24},
  URN =		{urn:nbn:de:0030-drops-246222},
  doi =		{10.4230/LIPIcs.ITP.2025.24},
  annote =	{Keywords: Formal Verification, Motion planning, algorithmic geometry}
}
Document
Automatically Generalizing Proofs and Statements

Authors: Anshula Gandhi, Anand Rao Tadipatri, and Timothy Gowers

Published in: LIPIcs, Volume 352, 16th International Conference on Interactive Theorem Proving (ITP 2025)


Abstract
We present an algorithm, developed in the Lean programming language, to automatically generalize mathematical proofs. The algorithm, which builds on work by Olivier Pons, advances state-of-the-art proof generalization by robustly generalizing repeated and related constants, as well as abstracting out hypotheses implicitly concerning them. We also discuss the role of proof generalization in conjecturing, learning from failure, and other aspects of mathematical proof discovery.
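
The core idea of generalizing repeated constants can be illustrated on plain expression trees: every occurrence of a chosen constant is replaced by a fresh variable, yielding a more general statement one can then attempt to reprove. The toy sketch below is written in Python for this listing and is unrelated to the authors' Lean implementation.

def generalize(expr, constant, var):
    """Replace every occurrence of `constant` in a nested-tuple expression with `var`."""
    if expr == constant:
        return var
    if isinstance(expr, tuple):
        return tuple(generalize(sub, constant, var) for sub in expr)
    return expr

# A statement mentioning the constant 2 twice, e.g. "2 + 2 = 4", as a tree:
stmt = ("eq", ("add", 2, 2), 4)
print(generalize(stmt, 2, "n"))  # ('eq', ('add', 'n', 'n'), 4)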

Cite as

Anshula Gandhi, Anand Rao Tadipatri, and Timothy Gowers. Automatically Generalizing Proofs and Statements. In 16th International Conference on Interactive Theorem Proving (ITP 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 352, pp. 12:1-12:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{gandhi_et_al:LIPIcs.ITP.2025.12,
  author =	{Gandhi, Anshula and Tadipatri, Anand Rao and Gowers, Timothy},
  title =	{{Automatically Generalizing Proofs and Statements}},
  booktitle =	{16th International Conference on Interactive Theorem Proving (ITP 2025)},
  pages =	{12:1--12:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-396-6},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{352},
  editor =	{Forster, Yannick and Keller, Chantal},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITP.2025.12},
  URN =		{urn:nbn:de:0030-drops-246104},
  doi =		{10.4230/LIPIcs.ITP.2025.12},
  annote =	{Keywords: automated reasoning, automated theorem proving, interactive theorem proving, formalization of mathematics, generalization, Lean theorem prover, Lean tactic}
}
