106 Search Results for "David, Laurent"


Document
Conversational Agents: A Framework for Evaluation (CAFE) (Dagstuhl Perspectives Workshop 24352)

Authors: Christine Bauer, Li Chen, Nicola Ferro, Norbert Fuhr, Avishek Anand, Timo Breuer, Guglielmo Faggioli, Ophir Frieder, Hideo Joho, Jussi Karlgren, Johannes Kiesel, Bart P. Knijnenburg, Aldo Lipani, Lien Michiels, Andrea Papenmeier, Maria Soledad Pera, Mark Sanderson, Scott Sanner, Benno Stein, Johanne R. Trippas, Karin Verspoor, and Martijn C. Willemsen

Published in: Dagstuhl Manifestos, Volume 11, Issue 1 (2025)


Abstract
During the workshop, we discussed in depth what CONversational Information ACcess (CONIAC) is and what its unique features are, proposed a world model abstracting it, and defined the Conversational Agents Framework for Evaluation (CAFE) for the evaluation of CONIAC systems, consisting of six major components: 1) goals of the system’s stakeholders, 2) user tasks to be studied in the evaluation, 3) aspects of the users carrying out the tasks, 4) evaluation criteria to be considered, 5) evaluation methodology to be applied, and 6) measures for the quantitative criteria chosen.

Cite as

Christine Bauer, Li Chen, Nicola Ferro, Norbert Fuhr, Avishek Anand, Timo Breuer, Guglielmo Faggioli, Ophir Frieder, Hideo Joho, Jussi Karlgren, Johannes Kiesel, Bart P. Knijnenburg, Aldo Lipani, Lien Michiels, Andrea Papenmeier, Maria Soledad Pera, Mark Sanderson, Scott Sanner, Benno Stein, Johanne R. Trippas, Karin Verspoor, and Martijn C. Willemsen. Conversational Agents: A Framework for Evaluation (CAFE) (Dagstuhl Perspectives Workshop 24352). In Dagstuhl Manifestos, Volume 11, Issue 1, pp. 19-67, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


@Article{bauer_et_al:DagMan.11.1.19,
  author =	{Bauer, Christine and Chen, Li and Ferro, Nicola and Fuhr, Norbert and Anand, Avishek and Breuer, Timo and Faggioli, Guglielmo and Frieder, Ophir and Joho, Hideo and Karlgren, Jussi and Kiesel, Johannes and Knijnenburg, Bart P. and Lipani, Aldo and Michiels, Lien and Papenmeier, Andrea and Pera, Maria Soledad and Sanderson, Mark and Sanner, Scott and Stein, Benno and Trippas, Johanne R. and Verspoor, Karin and Willemsen, Martijn C.},
  title =	{{Conversational Agents: A Framework for Evaluation (CAFE) (Dagstuhl Perspectives Workshop 24352)}},
  pages =	{19--67},
  journal =	{Dagstuhl Manifestos},
  ISSN =	{2193-2433},
  year =	{2025},
  volume =	{11},
  number =	{1},
  editor =	{Bauer, Christine and Chen, Li and Ferro, Nicola and Fuhr, Norbert and Anand, Avishek and Breuer, Timo and Faggioli, Guglielmo and Frieder, Ophir and Joho, Hideo and Karlgren, Jussi and Kiesel, Johannes and Knijnenburg, Bart P. and Lipani, Aldo and Michiels, Lien and Papenmeier, Andrea and Pera, Maria Soledad and Sanderson, Mark and Sanner, Scott and Stein, Benno and Trippas, Johanne R. and Verspoor, Karin and Willemsen, Martijn C.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagMan.11.1.19},
  URN =		{urn:nbn:de:0030-drops-252722},
  doi =		{10.4230/DagMan.11.1.19},
  annote =	{Keywords: Conversational Agents, Evaluation, Information Access}
}
Document
Testing Sumsets Is Hard

Authors: Xi Chen, Shivam Nadimpalli, Tim Randolph, Rocco A. Servedio, and Or Zamir

Published in: LIPIcs, Volume 351, 33rd Annual European Symposium on Algorithms (ESA 2025)


Abstract
A subset S of the Boolean hypercube 𝔽₂ⁿ is a sumset if S = {a + b : a, b ∈ A} for some A ⊆ 𝔽₂ⁿ. Sumsets are central objects of study in additive combinatorics, where they play a role in several of the field’s most important results. We prove a lower bound of Ω(2^{n/2}) for the number of queries needed to test whether a Boolean function f:𝔽₂ⁿ → {0,1} is the indicator function of a sumset, ruling out an efficient testing algorithm for sumsets. Our lower bound for testing sumsets follows from sharp bounds on the related problem of shift testing, which may be of independent interest. We also give a near-optimal 2^{n/2} ⋅ poly(n)-query algorithm for a smoothed analysis formulation of the sumset refutation problem. Finally, we include a simple proof that the number of different sumsets in 𝔽₂ⁿ is 2^{(1±o(1))2^{n-1}}.
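To make the definition concrete, here is a tiny brute-force sketch (ours, not the paper’s; `sumset` and `is_sumset` are illustrative names). Over 𝔽₂ⁿ, addition is bitwise XOR, so every nonempty sumset contains 0 (take a + a):

```python
from itertools import combinations

def sumset(A):
    # S = {a + b : a, b in A} over F_2^n; addition in F_2^n is bitwise XOR
    return {a ^ b for a in A for b in A}

def is_sumset(S, n):
    # Brute force over all A subseteq F_2^n -- doubly exponential, purely to
    # illustrate the definition on tiny n. (The paper is about *testing* with
    # query access; this is just the naive decision check.)
    universe = range(2 ** n)
    for r in range(2 ** n + 1):
        for A in combinations(universe, r):
            if sumset(A) == S:
                return True
    return False

print(sumset({1, 2}))              # {0, 3}
print(is_sumset({0, 1, 2, 3}, 2))  # True: take A = {0, 1, 2}
print(is_sumset({1}, 2))           # False: {1} omits 0, so it cannot be a sumset
```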

Cite as

Xi Chen, Shivam Nadimpalli, Tim Randolph, Rocco A. Servedio, and Or Zamir. Testing Sumsets Is Hard. In 33rd Annual European Symposium on Algorithms (ESA 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 351, pp. 14:1-14:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


@InProceedings{chen_et_al:LIPIcs.ESA.2025.14,
  author =	{Chen, Xi and Nadimpalli, Shivam and Randolph, Tim and Servedio, Rocco A. and Zamir, Or},
  title =	{{Testing Sumsets Is Hard}},
  booktitle =	{33rd Annual European Symposium on Algorithms (ESA 2025)},
  pages =	{14:1--14:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-395-9},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{351},
  editor =	{Benoit, Anne and Kaplan, Haim and Wild, Sebastian and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2025.14},
  URN =		{urn:nbn:de:0030-drops-244822},
  doi =		{10.4230/LIPIcs.ESA.2025.14},
  annote =	{Keywords: Sumsets, additive combinatorics, property testing, Boolean functions}
}
Document
Improved Hardness-Of-Approximation for Token-Swapping

Authors: Sam Hiken and Nicole Wein

Published in: LIPIcs, Volume 351, 33rd Annual European Symposium on Algorithms (ESA 2025)


Abstract
We study the token swapping problem, in which we are given a graph with an initial assignment of one distinct token to each vertex, and a final desired assignment (again with one token per vertex). The goal is to find the minimum length sequence of swaps of adjacent tokens required to get from the initial to the final assignment. The token swapping problem is known to be NP-complete. It is also known to have a polynomial-time 4-approximation algorithm. From the hardness-of-approximation side, it is known to be NP-hard to approximate with a ratio better than 1001/1000. Our main result is an improvement of the approximation ratio of the lower bound: We show that it is NP-hard to approximate with ratio better than 14/13. We then turn our attention to the 0/1-weighted version, in which every token has a weight of either 0 or 1, and the cost of a swap is the sum of the weights of the two participating tokens. Unlike standard token swapping, no constant-factor approximation is known for this version, and we provide an explanation. We prove that 0/1-weighted token swapping is NP-hard to approximate with ratio better than (1-ε) ln(n) for any constant ε > 0. Lastly, we prove two barrier results for the standard (unweighted) token swapping problem. We show that one cannot beat the current best known approximation ratio of 4 using a large class of algorithms which includes all known algorithms, nor can one beat it using a common analysis framework.
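For intuition about the problem being approximated, unweighted token swapping can be solved exactly on tiny instances by breadth-first search over token arrangements. A minimal sketch (our illustration, exponential in the number of vertices, not one of the paper’s algorithms):

```python
from collections import deque

def min_token_swaps(edges, start, goal):
    # Exact minimum swap count by BFS over token arrangements.
    # The state space is n!, so this only illustrates the problem on tiny graphs.
    start, goal = tuple(start), tuple(goal)
    dist = {start: 0}
    q = deque([start])
    while q:
        s = q.popleft()
        if s == goal:
            return dist[s]
        for u, v in edges:
            t = list(s)
            t[u], t[v] = t[v], t[u]   # swap the tokens on adjacent vertices
            t = tuple(t)
            if t not in dist:
                dist[t] = dist[s] + 1
                q.append(t)
    return None  # goal unreachable

# Path 0-1-2 with tokens (2, 0, 1): two swaps sort it.
print(min_token_swaps([(0, 1), (1, 2)], (2, 0, 1), (0, 1, 2)))  # 2
```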

Cite as

Sam Hiken and Nicole Wein. Improved Hardness-Of-Approximation for Token-Swapping. In 33rd Annual European Symposium on Algorithms (ESA 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 351, pp. 57:1-57:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


@InProceedings{hiken_et_al:LIPIcs.ESA.2025.57,
  author =	{Hiken, Sam and Wein, Nicole},
  title =	{{Improved Hardness-Of-Approximation for Token-Swapping}},
  booktitle =	{33rd Annual European Symposium on Algorithms (ESA 2025)},
  pages =	{57:1--57:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-395-9},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{351},
  editor =	{Benoit, Anne and Kaplan, Haim and Wild, Sebastian and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2025.57},
  URN =		{urn:nbn:de:0030-drops-245251},
  doi =		{10.4230/LIPIcs.ESA.2025.57},
  annote =	{Keywords: algorithms, token-swapping, hardness-of-approximation, lower-bounds}
}
Document
Going Beyond Surfaces in Diameter Approximation

Authors: Michał Włodarczyk

Published in: LIPIcs, Volume 351, 33rd Annual European Symposium on Algorithms (ESA 2025)


Abstract
Calculating the diameter of an undirected graph requires quadratic running time under the Strong Exponential Time Hypothesis, and this barrier holds even against any approximation better than 3/2. For planar graphs with positive edge weights, there are known (1+ε)-approximation algorithms with running time poly(1/ε, log n) ⋅ n. However, these algorithms rely on shortest path separators, and this technique falls short of yielding efficient algorithms beyond graphs of bounded genus. In this work we depart from embedding-based arguments and obtain diameter approximations relying on VC set systems and the local treewidth property. We present two orthogonal extensions of the planar case by giving (1+ε)-approximation algorithms with the following running times: - 𝒪_h((1/ε)^𝒪(h) ⋅ n log² n)-time algorithm for graphs excluding an apex graph of size h as a minor, - 𝒪_d((1/ε)^𝒪(d) ⋅ n log² n)-time algorithm for the class of d-apex graphs. As a stepping stone, we obtain efficient (1+ε)-approximate distance oracles for graphs excluding an apex graph of size h as a minor. Our oracle has preprocessing time 𝒪_h((1/ε)⁸ ⋅ n log n log W) and query time 𝒪_h((1/ε)² ⋅ log n log W), where W is the metric stretch. Such oracles have been so far only known for bounded genus graphs. All our algorithms are deterministic.
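For context, the quadratic-time exact baseline that the SETH lower bound refers to is, for unweighted graphs, simply a BFS from every vertex. A minimal sketch (our illustration of the baseline, not the paper’s algorithm):

```python
from collections import deque

def diameter(adj):
    # Exact diameter of a connected unweighted graph: BFS from every vertex,
    # O(n * m) total -- the quadratic baseline the abstract contrasts against.
    best = 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        best = max(best, max(dist.values()))
    return best

path4 = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(diameter(path4))  # 3
```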

Cite as

Michał Włodarczyk. Going Beyond Surfaces in Diameter Approximation. In 33rd Annual European Symposium on Algorithms (ESA 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 351, pp. 39:1-39:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


@InProceedings{wlodarczyk:LIPIcs.ESA.2025.39,
  author =	{W{\l}odarczyk, Micha{\l}},
  title =	{{Going Beyond Surfaces in Diameter Approximation}},
  booktitle =	{33rd Annual European Symposium on Algorithms (ESA 2025)},
  pages =	{39:1--39:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-395-9},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{351},
  editor =	{Benoit, Anne and Kaplan, Haim and Wild, Sebastian and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2025.39},
  URN =		{urn:nbn:de:0030-drops-245076},
  doi =		{10.4230/LIPIcs.ESA.2025.39},
  annote =	{Keywords: diameter, approximation, distance oracles, graph minors, treewidth}
}
Document
Formalising New Mathematics in Isabelle: Diagonal Ramsey

Authors: Lawrence C. Paulson

Published in: LIPIcs, Volume 352, 16th International Conference on Interactive Theorem Proving (ITP 2025)


Abstract
The formalisation of mathematics is becoming routine, but its value to research mathematicians remains unproven. There are few examples of using proof assistants to verify new work. This paper reports the formalisation - inspired by a Lean one by Bhavik Mehta - of a major new result [Marcelo Campos et al., 2023] about Ramsey numbers. One unexpected finding was a heavy role for computer algebra techniques.

Cite as

Lawrence C. Paulson. Formalising New Mathematics in Isabelle: Diagonal Ramsey. In 16th International Conference on Interactive Theorem Proving (ITP 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 352, pp. 18:1-18:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


@InProceedings{paulson:LIPIcs.ITP.2025.18,
  author =	{Paulson, Lawrence C.},
  title =	{{Formalising New Mathematics in Isabelle: Diagonal Ramsey}},
  booktitle =	{16th International Conference on Interactive Theorem Proving (ITP 2025)},
  pages =	{18:1--18:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-396-6},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{352},
  editor =	{Forster, Yannick and Keller, Chantal},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITP.2025.18},
  URN =		{urn:nbn:de:0030-drops-246163},
  doi =		{10.4230/LIPIcs.ITP.2025.18},
  annote =	{Keywords: Isabelle, formalisation of mathematics, Ramsey’s theorem, computer algebra}
}
Document
A Research Framework to Develop a Real-Time Synchrony Index to Monitor Team Cohesion and Performance in Long-Duration Space Exploration

Authors: Federico Nemmi, Emma Chabani, Laure Boyer, Charlie Madier, and Daniel Lewkowicz

Published in: OASIcs, Volume 130, Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)


Abstract
As humanity prepares for long-distance space exploration, optimizing group performance, that is, the ability of a group to achieve its goals efficiently, is critical. Astronaut crews will endure isolation, confinement, and operational stress, making group synchrony - the alignment of behaviors, emotions, and physiological states - a key factor in mission success. Synchrony influences team cohesion, performance, and resilience, necessitating effective crew management strategies. This paper proposes a framework for a real-time, unobtrusive index of group synchrony to support astronauts and mission control. Research indicates that team cohesion fluctuates in isolated environments, with reduced communication and interpersonal conflicts emerging over time. A system tracking synchrony could mitigate these issues, providing proactive support and improving remote management. Additionally, it could serve as a cognitive and physiological feedback tool for astronauts and a decision-making aid for mission control, enhancing well-being and efficiency. Our approach integrates behavioral and physiological synchrony measures to assess team cohesion and performance. We propose a multi-modal synchrony index combining movement coordination, communication patterns, and physiological signals such as heart rate, electrodermal activity, and EEG. This index will be validated across different tasks to ensure applicability across diverse mission scenarios. By developing a robust synchrony index, we address a fundamental challenge in space missions: sustaining team effectiveness under extreme conditions. Beyond space exploration, our findings could benefit high-risk, high-isolation teams in submarine crews, polar expeditions, and remote research groups. Our collaboration with the Centre National d'Etudes Spatiales, the Institut de Médecine et de Physiologie Spatiales, and the Toulouse University Hospital marks the first step, with experimental data collection starting this year. Ultimately, this research fosters more adaptive, responsive, and resilient teams for future space missions.

Cite as

Federico Nemmi, Emma Chabani, Laure Boyer, Charlie Madier, and Daniel Lewkowicz. A Research Framework to Develop a Real-Time Synchrony Index to Monitor Team Cohesion and Performance in Long-Duration Space Exploration. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 30:1-30:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


@InProceedings{nemmi_et_al:OASIcs.SpaceCHI.2025.30,
  author =	{Nemmi, Federico and Chabani, Emma and Boyer, Laure and Madier, Charlie and Lewkowicz, Daniel},
  title =	{{A Research Framework to Develop a Real-Time Synchrony Index to Monitor Team Cohesion and Performance in Long-Duration Space Exploration}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{30:1--30:16},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.30},
  URN =		{urn:nbn:de:0030-drops-240200},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.30},
  annote =	{Keywords: Performance, Synchrony, Crew monitoring, Cohesion}
}
Document
Understanding Time in Space: Improving Timeline Understandability for Uncrewed Space Systems

Authors: Elizabeth Sloan and Kristin Yvonne Rozier

Published in: OASIcs, Volume 130, Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)


Abstract
Timelines are critical in space exploration: they facilitate planning, resource management, and automation of uncrewed missions. As NASA and other space agencies increasingly rely on timelines for autonomous spacecraft operations, ensuring their understandability and verifiability is essential for mission success. However, interdisciplinary design teams face challenges in interpreting timelines due to variations in cultural and educational backgrounds, leading to communication barriers and potential system mismatches. This work-in-progress research explores time-oriented data visualizations to improve timeline comprehension in space systems. We contribute (1) a survey of visualization techniques, identifying patterns and gaps in historic time-oriented data visualizations and industry tools, (2) a focus group pilot study analyzing user interpretations of timeline visualizations, and (3) a novel method for visualizing aggregate runs of a timeline on a complex system, including identification of key features for usability of aggregate-data visuals. Our findings inform future visualization strategies for debugging and verifying timelines in uncrewed systems. While focused on space, this research has broader implications for aerospace, robotics, and emergency response systems.

Cite as

Elizabeth Sloan and Kristin Yvonne Rozier. Understanding Time in Space: Improving Timeline Understandability for Uncrewed Space Systems. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 24:1-24:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


@InProceedings{sloan_et_al:OASIcs.SpaceCHI.2025.24,
  author =	{Sloan, Elizabeth and Rozier, Kristin Yvonne},
  title =	{{Understanding Time in Space: Improving Timeline Understandability for Uncrewed Space Systems}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{24:1--24:12},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.24},
  URN =		{urn:nbn:de:0030-drops-240143},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.24},
  annote =	{Keywords: Human-Centered Design, Time-Oriented Data Visualization, Uncrewed Spacecraft Operations, Formal Methods}
}
Document
On the Effectiveness of Interpreter-Guided Compiler Testing

Authors: Federico Lochbaum and Guillermo Polito

Published in: OASIcs, Volume 134, Companion Proceedings of the 9th International Conference on the Art, Science, and Engineering of Programming (Programming 2025)


Abstract
Guaranteeing that a compiler behaves correctly is a complex task often approached through test generation and fuzzing. Compiler test generation must not only ensure that a compiler generates code that does not break, but also that it implements the programming language semantics. Recently, interpreter-guided test generation has been proposed to test JIT compilers: concolic execution on the interpreter yields test cases for the language semantics, which are then validated by differential testing between the interpreter and the compiler. In previous work, this solution has been shown to find interpreter/compiler differences. However, little has been said about the technique’s effectiveness and its limits. In this paper we study the behavior of this technique to shed light on future improvements and research. We experiment with it on the JIT compiler for the Pharo programming language, on two different backends: ARMv7 and x86. We explore how effective the solution is in terms of compiler coverage and its limitations, and we discuss how future research can overcome them. Moreover, we investigate how this technique, combined with random constraint mutations, increases backend compiler coverage.
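The differential-testing step at the core of the technique can be sketched generically. A minimal sketch, with `interp` and `compiled` as hypothetical stand-ins for an interpreter and a JIT-compiled version of the same program (not Pharo’s actual API):

```python
import random

def differential_test(interp, compiled, gen_input, trials=1000, seed=0):
    # Run both implementations on the same generated inputs and collect
    # every divergence -- each mismatch is a likely compiler bug.
    rng = random.Random(seed)
    mismatches = []
    for _ in range(trials):
        x = gen_input(rng)
        got_i, got_c = interp(x), compiled(x)
        if got_i != got_c:
            mismatches.append((x, got_i, got_c))
    return mismatches

# Toy stand-ins: a "compiled" abs() that mishandles negative inputs.
interp = abs
compiled = lambda x: x  # buggy: forgets to negate
bugs = differential_test(interp, compiled, lambda r: r.randint(-5, 5))
print(len(bugs) > 0)  # True: the divergence on negative inputs is caught
```

In the paper’s setting the inputs come from concolic execution on the interpreter rather than random generation, which is what steers the tests toward the language semantics.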

Cite as

Federico Lochbaum and Guillermo Polito. On the Effectiveness of Interpreter-Guided Compiler Testing. In Companion Proceedings of the 9th International Conference on the Art, Science, and Engineering of Programming (Programming 2025). Open Access Series in Informatics (OASIcs), Volume 134, pp. 20:1-20:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


@InProceedings{lochbaum_et_al:OASIcs.Programming.2025.20,
  author =	{Lochbaum, Federico and Polito, Guillermo},
  title =	{{On the Effectiveness of Interpreter-Guided Compiler Testing}},
  booktitle =	{Companion Proceedings of the 9th International Conference on the Art, Science, and Engineering of Programming (Programming 2025)},
  pages =	{20:1--20:15},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-382-9},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{134},
  editor =	{Edwards, Jonathan and Perera, Roly and Petricek, Tomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.Programming.2025.20},
  URN =		{urn:nbn:de:0030-drops-243040},
  doi =		{10.4230/OASIcs.Programming.2025.20},
  annote =	{Keywords: Virtual Machines, Concolic Testing, JIT compilers, interpreters, Differential Testing, Constraint Mutations, Compiler Coverage}
}
Document
RANDOM
Algorithmic Contiguity from Low-Degree Conjecture and Applications in Correlated Random Graphs

Authors: Zhangsong Li

Published in: LIPIcs, Volume 353, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2025)


Abstract
In this paper, assuming a natural strengthening of the low-degree conjecture, we provide evidence of computational hardness for two problems: (1) the (partial) matching recovery problem in the sparse correlated Erdős-Rényi graphs G(n,q;ρ) when the edge-density q = n^{-1+o(1)} and the correlation ρ < √α lies below Otter’s threshold; this resolves a remaining problem in [Jian Ding et al., 2023]; (2) the detection problem between a pair of correlated sparse stochastic block models S(n,λ/n;k,ε;s) and a pair of independent stochastic block models S(n,λs/n;k,ε) when ε² λ s < 1 lies below the Kesten-Stigum (KS) threshold and s < √α lies below Otter’s threshold; this resolves a remaining problem in [Guanyi Chen et al., 2024]. One of the main ingredients in our proof is to derive certain forms of algorithmic contiguity between two probability measures based on bounds on their low-degree advantage. To be more precise, consider the high-dimensional hypothesis testing problem between two probability measures ℙ and ℚ based on the sample Y. We show that if the low-degree advantage Adv_{≤D}(dℙ/dℚ) = O(1), then (assuming the low-degree conjecture) there is no efficient algorithm A such that ℚ(A(Y) = 0) = 1-o(1) and ℙ(A(Y) = 1) = Ω(1). This framework provides a useful tool for performing reductions between different inference tasks.
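For reference, the low-degree advantage appearing above is standardly defined as the best correlation achievable by a polynomial test of degree at most D (this is the usual definition from the low-degree literature, not spelled out in the abstract):

```latex
\[
  \mathrm{Adv}_{\le D}\!\left(\frac{d\mathbb{P}}{d\mathbb{Q}}\right)
  \;=\;
  \sup_{f:\,\deg f \le D}
  \frac{\mathbb{E}_{Y \sim \mathbb{P}}\,[f(Y)]}
       {\sqrt{\mathbb{E}_{Y \sim \mathbb{Q}}\,[f(Y)^{2}]}} ,
\]
```

equivalently, the ℚ-norm of the projection of the likelihood ratio dℙ/dℚ onto polynomials of degree at most D. The condition Adv_{≤D} = O(1) then says that no low-degree polynomial test separates ℙ from ℚ.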

Cite as

Zhangsong Li. Algorithmic Contiguity from Low-Degree Conjecture and Applications in Correlated Random Graphs. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 353, pp. 30:1-30:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


@InProceedings{li:LIPIcs.APPROX/RANDOM.2025.30,
  author =	{Li, Zhangsong},
  title =	{{Algorithmic Contiguity from Low-Degree Conjecture and Applications in Correlated Random Graphs}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2025)},
  pages =	{30:1--30:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-397-3},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{353},
  editor =	{Ene, Alina and Chattopadhyay, Eshan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX/RANDOM.2025.30},
  URN =		{urn:nbn:de:0030-drops-243965},
  doi =		{10.4230/LIPIcs.APPROX/RANDOM.2025.30},
  annote =	{Keywords: Algorithmic Contiguity, Low-degree Conjecture, Correlated Random Graphs}
}
Document
APPROX
Max-Cut with Multiple Cardinality Constraints

Authors: Yury Makarychev, Madhusudhan Reddy Pittu, and Ali Vakilian

Published in: LIPIcs, Volume 353, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2025)


Abstract
We study the classic Max-Cut problem under multiple cardinality constraints, which we refer to as the Constrained Max-Cut problem. Given a graph G = (V, E), a partition of the vertices into c disjoint parts V₁, …, V_c, and cardinality parameters k₁, …, k_c, the goal is to select a set S ⊆ V such that |S ∩ V_i| = k_i for each i ∈ [c], maximizing the total weight of edges crossing S (i.e., edges with exactly one endpoint in S). By designing an approximate kernel for Constrained Max-Cut and building on the correlation rounding technique of Raghavendra and Tan (2012), we present a (0.858 - ε)-approximation algorithm for the problem when c = O(1). The algorithm runs in time O(min{k/ε, n}^poly(c/ε) + poly(n)), where k = ∑_{i∈[c]} k_i and n = |V|. This improves upon the (1/2 + ε₀)-approximation of Feige and Langberg (2001) for Max-Cut_k (the special case when c = 1, k₁ = k), and generalizes the (0.858 - ε)-approximation of Raghavendra and Tan (2012), which only applies when min{k,n-k} = Ω(n) and does not handle multiple constraints. We also establish that, for general values of c, it is NP-hard to determine whether a feasible solution exists that cuts all edges. Finally, we present a 1/2-approximation algorithm for Max-Cut under an arbitrary matroid constraint.
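The problem definition can be illustrated with a tiny brute-force solver (ours, purely to demonstrate the constraints; the paper’s algorithm is far more sophisticated):

```python
from itertools import combinations, product

def constrained_max_cut(edges, parts, ks):
    # Brute-force Constrained Max-Cut: choose S with |S ∩ V_i| = k_i for
    # every part V_i, maximizing edges with exactly one endpoint in S.
    # Exponential; only to illustrate the problem definition.
    per_part = [list(combinations(p, k)) for p, k in zip(parts, ks)]
    best = 0
    for pick in product(*per_part):
        S = {v for group in pick for v in group}
        cut = sum(1 for u, v in edges if (u in S) != (v in S))
        best = max(best, cut)
    return best

# 4-cycle with parts {0,1} and {2,3}, one vertex from each:
# S = {0, 2} (or {1, 3}) cuts all four edges.
c4 = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(constrained_max_cut(c4, [[0, 1], [2, 3]], [1, 1]))  # 4
```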

Cite as

Yury Makarychev, Madhusudhan Reddy Pittu, and Ali Vakilian. Max-Cut with Multiple Cardinality Constraints. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 353, pp. 13:1-13:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


@InProceedings{makarychev_et_al:LIPIcs.APPROX/RANDOM.2025.13,
  author =	{Makarychev, Yury and Pittu, Madhusudhan Reddy and Vakilian, Ali},
  title =	{{Max-Cut with Multiple Cardinality Constraints}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2025)},
  pages =	{13:1--13:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-397-3},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{353},
  editor =	{Ene, Alina and Chattopadhyay, Eshan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX/RANDOM.2025.13},
  URN =		{urn:nbn:de:0030-drops-243790},
  doi =		{10.4230/LIPIcs.APPROX/RANDOM.2025.13},
  annote =	{Keywords: Maxcut, Semi-definite Programming, Sum of Squares Hierarchy}
}
Document
APPROX
Sparsest Cut and Eigenvalue Multiplicities on Low Degree Abelian Cayley Graphs

Authors: Tommaso d'Orsi, Chris Jones, Jake Ruotolo, Salil Vadhan, and Jiyu Zhang

Published in: LIPIcs, Volume 353, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2025)


Abstract
Whether or not the Sparsest Cut problem admits an efficient O(1)-approximation algorithm is a fundamental algorithmic question with connections to geometry and the Unique Games Conjecture. Revisiting spectral algorithms for Sparsest Cut, we present a novel, simple algorithm that combines eigenspace enumeration with a new algorithm for the Cut Improvement problem. The runtime of our algorithm is parametrized by a quantity that we call the solution dimension SD_ε(G): the smallest k such that the subspace spanned by the first k Laplacian eigenvectors contains all but ε fraction of a sparsest cut. Our algorithm matches the guarantees of prior methods based on the threshold-rank paradigm, while also extending beyond them. To illustrate this, we study its performance on low degree Cayley graphs over Abelian groups - canonical examples of graphs with poor expansion properties. We prove that low degree Abelian Cayley graphs have small solution dimension, yielding an algorithm that computes a (1+ε)-approximation to the uniform Sparsest Cut of a degree-d Cayley graph over an Abelian group of size n in time n^O(1) ⋅ exp{(d/ε)^O(d)}. Along the way to bounding the solution dimension of Abelian Cayley graphs, we analyze their sparse cuts and spectra, proving that the collection of O(1)-approximate sparsest cuts has an ε-net of size exp{(d/ε)^O(d)} and that the multiplicity of λ₂ is bounded by 2^O(d). The latter bound is tight and improves on a previous bound of 2^O(d²) by Lee and Makarychev.
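As a sanity check on the multiplicity bound, the simplest Abelian Cayley graphs have explicit spectra: for the n-cycle Cay(ℤ_n, {±1}), which has degree d = 2, the adjacency eigenvalues are 2cos(2πk/n), and λ₂ (equivalently, the second Laplacian eigenvalue, since the graph is regular) has multiplicity 2, consistent with the 2^{O(d)} bound. A quick numerical check (our illustration):

```python
import math

def cycle_spectrum(n):
    # Adjacency eigenvalues of the Cayley graph Cay(Z_n, {+1, -1}):
    # one eigenvalue 2*cos(2*pi*k/n) per character k of Z_n.
    return sorted((2 * math.cos(2 * math.pi * k / n) for k in range(n)),
                  reverse=True)

ev = cycle_spectrum(8)
lam2 = ev[1]
mult = sum(1 for x in ev if abs(x - lam2) < 1e-9)
print(mult)  # 2: the characters k = 1 and k = n - 1 give the same value
```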

Cite as

Tommaso d'Orsi, Chris Jones, Jake Ruotolo, Salil Vadhan, and Jiyu Zhang. Sparsest Cut and Eigenvalue Multiplicities on Low Degree Abelian Cayley Graphs. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 353, pp. 16:1-16:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


@InProceedings{dorsi_et_al:LIPIcs.APPROX/RANDOM.2025.16,
  author =	{d'Orsi, Tommaso and Jones, Chris and Ruotolo, Jake and Vadhan, Salil and Zhang, Jiyu},
  title =	{{Sparsest Cut and Eigenvalue Multiplicities on Low Degree Abelian Cayley Graphs}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2025)},
  pages =	{16:1--16:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-397-3},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{353},
  editor =	{Ene, Alina and Chattopadhyay, Eshan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX/RANDOM.2025.16},
  URN =		{urn:nbn:de:0030-drops-243827},
  doi =		{10.4230/LIPIcs.APPROX/RANDOM.2025.16},
  annote =	{Keywords: Sparsest Cut, Spectral Graph Theory, Cayley Graphs, Approximation Algorithms}
}
Document
Vantage Point Selection Algorithms for Bottleneck Capacity Estimation

Authors: Vikrant Ashvinkumar, Rezaul Chowdhury, Jie Gao, Mayank Goswami, Joseph S. B. Mitchell, and Valentin Polishchuk

Published in: LIPIcs, Volume 349, 19th International Symposium on Algorithms and Data Structures (WADS 2025)


Abstract
Motivated by the problem of estimating bottleneck capacities on the Internet, we formulate and study the problem of vantage point selection. We are given a graph G = (V, E) whose edges E have unknown capacity values that are to be discovered. Probes from a vantage point, i.e., a vertex v ∈ V, along shortest paths from v to all other vertices, reveal bottleneck edge capacities along each path. Our goal is to select k vantage points from V that reveal the maximum number of bottleneck edge capacities. We consider both a non-adaptive setting where all k vantage points are selected before any bottleneck capacity is revealed, and an adaptive setting where each vantage point selection instantly reveals bottleneck capacities along all shortest paths starting from that point. In the non-adaptive setting, by considering a relaxed model where edge capacities are drawn from a random permutation (which still leaves the problem of maximizing the expected number of revealed edges NP-hard), we are able to give a (1-1/e)-approximation algorithm. In the adaptive setting we work with the least permissive model where edge capacities are arbitrarily fixed but unknown. We compare with the best solution for the particular input instance (i.e., by enumerating all choices of k tuples), and provide both lower bounds on instance optimal approximation algorithms and upper bounds for trees and planar graphs.
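The (1-1/e) guarantee in the non-adaptive setting is the hallmark of greedy maximization of a coverage objective. As an illustration only (this is the textbook greedy on a toy instance with BFS shortest-path trees, not the paper's algorithm, which works in a random-permutation model), one can simulate which bottleneck edges a probe reveals and pick vantage points greedily:

```python
from collections import deque

def revealed_edges(adj, cap, v):
    """Edges whose capacity a probe from vantage point v reveals.

    BFS builds one shortest-path tree from v (shortest paths need not be
    unique; this sketch just takes the BFS tree). Along each path v -> u,
    the probe reveals the minimum-capacity (bottleneck) edge.
    """
    parent = {v: None}
    q = deque([v])
    while q:
        x = q.popleft()
        for y in adj[x]:
            if y not in parent:
                parent[y] = x
                q.append(y)
    revealed = set()
    for u in parent:
        if u == v:
            continue
        best_edge, best_cap = None, float("inf")
        x = u
        while parent[x] is not None:           # walk back to v
            e = frozenset((x, parent[x]))
            if cap[e] < best_cap:
                best_edge, best_cap = e, cap[e]
            x = parent[x]
        revealed.add(best_edge)
    return revealed

def greedy_vantage_points(adj, cap, k):
    """Classic greedy for maximum coverage (the source of 1 - 1/e bounds)."""
    chosen, covered = [], set()
    for _ in range(k):
        v = max(adj, key=lambda w: len(revealed_edges(adj, cap, w) - covered))
        chosen.append(v)
        covered |= revealed_edges(adj, cap, v)
    return chosen, covered
```

On a path 0-1-2-3 with capacities 5, 1, 7, two greedy vantage points already reveal all three edge capacities.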

Cite as

Vikrant Ashvinkumar, Rezaul Chowdhury, Jie Gao, Mayank Goswami, Joseph S. B. Mitchell, and Valentin Polishchuk. Vantage Point Selection Algorithms for Bottleneck Capacity Estimation. In 19th International Symposium on Algorithms and Data Structures (WADS 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 349, pp. 6:1-6:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{ashvinkumar_et_al:LIPIcs.WADS.2025.6,
  author =	{Ashvinkumar, Vikrant and Chowdhury, Rezaul and Gao, Jie and Goswami, Mayank and Mitchell, Joseph S. B. and Polishchuk, Valentin},
  title =	{{Vantage Point Selection Algorithms for Bottleneck Capacity Estimation}},
  booktitle =	{19th International Symposium on Algorithms and Data Structures (WADS 2025)},
  pages =	{6:1--6:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-398-0},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{349},
  editor =	{Morin, Pat and Oh, Eunjin},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.WADS.2025.6},
  URN =		{urn:nbn:de:0030-drops-242376},
  doi =		{10.4230/LIPIcs.WADS.2025.6},
  annote =	{Keywords: Bottleneck capacity, Approximation algorithms, Instance optimality}
}
Document
Testing Whether a Subgraph Is Convex or Isometric

Authors: Sergio Cabello

Published in: LIPIcs, Volume 349, 19th International Symposium on Algorithms and Data Structures (WADS 2025)


Abstract
We consider the following two algorithmic problems: given a graph G and a subgraph H ⊆ G, decide whether H is an isometric or a geodesically convex subgraph of G. It is relatively easy to see that the problems can be solved by computing the distances between all pairs of vertices. We provide a conditional lower bound showing that, for sparse graphs with n vertices and Θ(n) edges, we cannot expect to solve the problem in O(n^{2-ε}) time for any constant ε > 0. We also show that the problem can be solved in subquadratic time for planar graphs and in near-linear time for graphs of bounded treewidth. Finally, we provide a near-linear time algorithm for the setting where G is a plane graph and H is defined by a few cycles in G.
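The quadratic baseline the abstract alludes to is straightforward: H is isometric in G iff d_H(u, v) = d_G(u, v) for every pair u, v ∈ V(H), which BFS from each vertex of H checks in O(|V(H)| ⋅ (n + m)) time on unweighted graphs. A minimal sketch (adjacency lists as dicts; names hypothetical, and this is the easy baseline, not the paper's subquadratic algorithms):

```python
from collections import deque

def bfs_dist(adj, src):
    """Single-source shortest-path distances by BFS (unweighted graph)."""
    dist = {src: 0}
    q = deque([src])
    while q:
        x = q.popleft()
        for y in adj[x]:
            if y not in dist:
                dist[y] = dist[x] + 1
                q.append(y)
    return dist

def is_isometric(g_adj, h_adj):
    """H is isometric in G iff d_H(u, v) == d_G(u, v) for all u, v in V(H)."""
    for u in h_adj:
        dg = bfs_dist(g_adj, u)
        dh = bfs_dist(h_adj, u)
        for v in h_adj:
            if dh.get(v, float("inf")) != dg[v]:
                return False
    return True

# In the 4-cycle, the path 0-1-2 is isometric but the path 0-1-2-3 is not,
# since d_G(0, 3) = 1 while the path forces distance 3.
```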

Cite as

Sergio Cabello. Testing Whether a Subgraph Is Convex or Isometric. In 19th International Symposium on Algorithms and Data Structures (WADS 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 349, pp. 12:1-12:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{cabello:LIPIcs.WADS.2025.12,
  author =	{Cabello, Sergio},
  title =	{{Testing Whether a Subgraph Is Convex or Isometric}},
  booktitle =	{19th International Symposium on Algorithms and Data Structures (WADS 2025)},
  pages =	{12:1--12:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-398-0},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{349},
  editor =	{Morin, Pat and Oh, Eunjin},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.WADS.2025.12},
  URN =		{urn:nbn:de:0030-drops-242439},
  doi =		{10.4230/LIPIcs.WADS.2025.12},
  annote =	{Keywords: convex subgraph, isometric subgraph, plane graph}
}
Document
On the Enumeration of Signatures of XOR-CNF’s

Authors: Nadia Creignou, Oscar Defrain, Frédéric Olive, and Simon Vilmin

Published in: LIPIcs, Volume 349, 19th International Symposium on Algorithms and Data Structures (WADS 2025)


Abstract
Given a CNF formula φ with clauses C_1, … , C_m over a set of variables V, a truth assignment 𝐚: V → {0, 1} generates a binary sequence σ_φ(𝐚) = (C_1(𝐚), …, C_m(𝐚)), called a signature of φ, where C_i(𝐚) = 1 if clause C_i evaluates to 1 under assignment 𝐚, and C_i(𝐚) = 0 otherwise. Signatures and their associated generation problems have given rise to new and promising research questions in algorithmic enumeration. In a recent paper, Bérczi et al. proved the perhaps surprising result that generating signatures of a CNF is tractable even though verifying a solution is hard. They also showed that finding maximal signatures of an arbitrary CNF is hard, owing to the intractability of satisfiability in general. Their contribution leaves open the problem of efficiently generating maximal signatures for tractable classes of CNFs, i.e., those for which satisfiability can be solved in polynomial time. Taking a step in that direction, we completely characterize the complexity of generating all, minimal, and maximal signatures for XOR-CNF’s.
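The signature definition is easy to make concrete: enumerate assignments and record, per clause, whether it is satisfied. A brute-force sketch for small formulas (exponential in the number of variables; the paper's contribution is precisely to avoid such enumeration for XOR-CNF's):

```python
from itertools import product

def signatures(clauses, n_vars):
    """All signatures of a CNF, by enumerating the 2^n_vars assignments.

    A clause is a list of nonzero integers: literal i denotes variable
    |i|, negated if i < 0. For XOR-clauses the same encoding works with
    XOR in place of the disjunction (any) below.
    """
    sigs = set()
    for a in product((0, 1), repeat=n_vars):
        def val(lit):
            return a[abs(lit) - 1] ^ (lit < 0)   # literal value under a
        sigs.add(tuple(int(any(val(l) for l in c)) for c in clauses))
    return sigs

# phi = (x1 OR x2) AND (NOT x1): signature (0, 0) is never generated,
# since falsifying the first clause forces x1 = 0, satisfying the second.
print(signatures([[1, 2], [-1]], 2))
```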

Cite as

Nadia Creignou, Oscar Defrain, Frédéric Olive, and Simon Vilmin. On the Enumeration of Signatures of XOR-CNF’s. In 19th International Symposium on Algorithms and Data Structures (WADS 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 349, pp. 19:1-19:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{creignou_et_al:LIPIcs.WADS.2025.19,
  author =	{Creignou, Nadia and Defrain, Oscar and Olive, Fr\'{e}d\'{e}ric and Vilmin, Simon},
  title =	{{On the Enumeration of Signatures of XOR-CNF’s}},
  booktitle =	{19th International Symposium on Algorithms and Data Structures (WADS 2025)},
  pages =	{19:1--19:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-398-0},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{349},
  editor =	{Morin, Pat and Oh, Eunjin},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.WADS.2025.19},
  URN =		{urn:nbn:de:0030-drops-242508},
  doi =		{10.4230/LIPIcs.WADS.2025.19},
  annote =	{Keywords: Algorithmic enumeration, XOR-CNF, signatures, maximal bipartite subgraphs enumeration, extension, proximity search}
}
Document
On the Complexity of Recoverable Robust Optimization in the Polynomial Hierarchy

Authors: Christoph Grüne and Lasse Wulf

Published in: LIPIcs, Volume 345, 50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025)


Abstract
Recoverable robust optimization is a popular multi-stage approach, in which it is possible to adjust a first-stage solution after the uncertain cost scenario is revealed. We consider recoverable robust optimization in combination with discrete budgeted uncertainty. In this setting, it seems plausible that many problems become Σ^p₃-complete and therefore it is impossible to find compact IP formulations of them (unless the unlikely conjecture NP = Σ^p₃ holds). Yet few concrete results of this kind are known, and in this paper we fill that gap. We consider recoverable robust optimization for the nominal problems of Sat, 3Sat, vertex cover, dominating set, set cover, hitting set, feedback vertex set, feedback arc set, uncapacitated facility location, p-center, p-median, independent set, clique, subset sum, knapsack, partition, scheduling, Hamiltonian path/cycle (directed/undirected), TSP, k-directed disjoint path (k ≥ 2), and Steiner tree. We show that for each of these problems, and for each of three widely used distance measures, the recoverable robust problem becomes Σ^p₃-complete. Concretely, we show that all these problems share a certain abstract property and prove that this property implies that their recoverable robust counterpart is Σ^p₃-complete. This reveals the insight that all the above problems are Σ^p₃-complete "for the same reason". Our result extends a recent framework by Grüne and Wulf.

Cite as

Christoph Grüne and Lasse Wulf. On the Complexity of Recoverable Robust Optimization in the Polynomial Hierarchy. In 50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 345, pp. 52:1-52:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{grune_et_al:LIPIcs.MFCS.2025.52,
  author =	{Gr\"{u}ne, Christoph and Wulf, Lasse},
  title =	{{On the Complexity of Recoverable Robust Optimization in the Polynomial Hierarchy}},
  booktitle =	{50th International Symposium on Mathematical Foundations of Computer Science (MFCS 2025)},
  pages =	{52:1--52:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-388-1},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{345},
  editor =	{Gawrychowski, Pawe{\l} and Mazowiecki, Filip and Skrzypczak, Micha{\l}},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2025.52},
  URN =		{urn:nbn:de:0030-drops-241596},
  doi =		{10.4230/LIPIcs.MFCS.2025.52},
  annote =	{Keywords: Complexity, Robust Optimization, Recoverable Robust Optimization, Two-Stage Problems, Polynomial Hierarchy, Sigma 2, Sigma 3}
}
