44 Search Results for "Johnson, Timothy"


Document
Mind the Gap. Doubling Constant Parametrization of Weighted Problems: TSP, Max-Cut, and More

Authors: Mihail Stoian

Published in: LIPIcs, Volume 364, 43rd International Symposium on Theoretical Aspects of Computer Science (STACS 2026)


Abstract
Despite much research, hard weighted problems still resist super-polynomial improvements over their textbook solution. On the other hand, the unweighted versions of these problems have recently witnessed the sought-after speedups. Currently, the only way to repurpose the algorithm of the unweighted version for the weighted version is to employ a polynomial embedding of the input weights. This, however, introduces a pseudo-polynomial factor into the running time, which becomes impractical for arbitrarily weighted instances. In this paper, we introduce a new way to repurpose the algorithm of the unweighted problem. Specifically, we show that the time complexity of several well-known NP-hard problems operating over the (min, +) and (max, +) semirings, such as TSP, Weighted Max-Cut, and Edge-Weighted k-Clique, is proportional to that of their unweighted versions when the set of input weights has small doubling. We achieve this by a meta-algorithm that converts the input weights into polynomially bounded integers using the recent constructive Freiman’s theorem by Randolph and Węgrzycki [ESA 2024] before applying the polynomial embedding.
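The parameter in this abstract, the doubling constant of the weight set, can be made concrete: it measures how much a set grows under addition. A minimal sketch (illustrative only, not the paper's meta-algorithm; the function name is ours):

```python
# Illustrative sketch: the doubling constant of a weight set A is
# |A+A| / |A|, where A+A = {x + y : x, y in A} is the sumset.
def doubling_constant(weights):
    a = set(weights)
    sumset = {x + y for x in a for y in a}
    return len(sumset) / len(a)

# Arithmetic progressions achieve the minimum |A+A| = 2|A| - 1,
# i.e. doubling constant 2 - 1/|A|; powers of two grow much faster.
ap = range(10, 110, 10)            # 10, 20, ..., 100
geo = [2 ** i for i in range(10)]  # 1, 2, 4, ..., 512
print(doubling_constant(ap))       # 1.9  (19 distinct sums, 10 elements)
print(doubling_constant(geo))      # 5.5  (55 distinct sums)
```

Weight sets with small doubling (like the arithmetic progression above) are exactly the instances the paper's parametrization targets.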

Cite as

Mihail Stoian. Mind the Gap. Doubling Constant Parametrization of Weighted Problems: TSP, Max-Cut, and More. In 43rd International Symposium on Theoretical Aspects of Computer Science (STACS 2026). Leibniz International Proceedings in Informatics (LIPIcs), Volume 364, pp. 79:1-79:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2026)


BibTeX


@InProceedings{stoian:LIPIcs.STACS.2026.79,
  author =	{Stoian, Mihail},
  title =	{{Mind the Gap. Doubling Constant Parametrization of Weighted Problems: TSP, Max-Cut, and More}},
  booktitle =	{43rd International Symposium on Theoretical Aspects of Computer Science (STACS 2026)},
  pages =	{79:1--79:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-412-3},
  ISSN =	{1868-8969},
  year =	{2026},
  volume =	{364},
  editor =	{Mahajan, Meena and Manea, Florin and McIver, Annabelle and Thắng, Nguy\~{ê}n Kim},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2026.79},
  URN =		{urn:nbn:de:0030-drops-255680},
  doi =		{10.4230/LIPIcs.STACS.2026.79},
  annote =	{Keywords: doubling constant parametrization, weighted problems, traveling salesman, weighted max-cut, edge-weighted k-clique}
}
Document
On the PTAS Complexity of Multidimensional Knapsack

Authors: Ilan Doron-Arad, Ariel Kulik, and Pasin Manurangsi

Published in: LIPIcs, Volume 362, 17th Innovations in Theoretical Computer Science Conference (ITCS 2026)


Abstract
We study the d-dimensional knapsack problem. We are given a set of items, each with a d-dimensional cost vector and a profit, along with a d-dimensional budget vector. The goal is to select a set of items whose total cost does not exceed the budget in any dimension and whose total profit is maximized. A polynomial-time approximation scheme (PTAS) with running time n^{Θ(d/ε)} has long been known for this problem, where ε is the error parameter and n is the encoding size. Despite decades of active research, the best running time of a PTAS has remained O(n^{⌈d/ε⌉ - d}). Unfortunately, existing lower bounds only cover the special case of two dimensions (d = 2), and do not answer whether there is an n^{o(d/ε)}-time PTAS for larger values of d. In this work, we show that the running time of the best-known PTAS cannot be improved, up to a polylogarithmic factor, assuming the Exponential Time Hypothesis (ETH). Our techniques are based on a robust reduction from 2-CSP, which embeds 2-CSP constraints into a desired number of dimensions. Then, using a recent result of [Bafna, Karthik, and Minzer, STOC'25], we exhibit a tight trade-off between d and ε for all regimes of the parameters, assuming d is sufficiently large. Informally, our result also shows that under ETH, for any function f there is no f(d/ε) ⋅ n^{õ(d/ε)}-time (1-ε)-approximation for d-dimensional knapsack, where n is the number of items and õ hides polylogarithmic factors in d/ε.
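The problem definition above can be pinned down with a tiny exhaustive solver (exponential time, for illustrating the objective only; all names are ours, not the paper's):

```python
from itertools import combinations

def ddim_knapsack_bruteforce(costs, profits, budget):
    """Exact d-dimensional knapsack by exhaustive search: maximize total
    profit over item subsets whose summed cost vector fits the budget
    in every dimension."""
    n, d = len(costs), len(budget)
    best = 0
    for r in range(n + 1):
        for subset in combinations(range(n), r):
            total = [sum(costs[i][j] for i in subset) for j in range(d)]
            if all(total[j] <= budget[j] for j in range(d)):
                best = max(best, sum(profits[i] for i in subset))
    return best

# d = 2: each item has a (weight, volume) cost vector.
costs   = [(3, 2), (2, 4), (4, 1), (1, 1)]
profits = [5, 6, 4, 2]
print(ddim_knapsack_bruteforce(costs, profits, budget=(5, 5)))  # 8
```

Here the optimum picks items 1 and 3 (cost (3, 5), profit 8); the PTASes discussed in the abstract trade this exactness for polynomial running time.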

Cite as

Ilan Doron-Arad, Ariel Kulik, and Pasin Manurangsi. On the PTAS Complexity of Multidimensional Knapsack. In 17th Innovations in Theoretical Computer Science Conference (ITCS 2026). Leibniz International Proceedings in Informatics (LIPIcs), Volume 362, pp. 50:1-50:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2026)


BibTeX

@InProceedings{doronarad_et_al:LIPIcs.ITCS.2026.50,
  author =	{Doron-Arad, Ilan and Kulik, Ariel and Manurangsi, Pasin},
  title =	{{On the PTAS Complexity of Multidimensional Knapsack}},
  booktitle =	{17th Innovations in Theoretical Computer Science Conference (ITCS 2026)},
  pages =	{50:1--50:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-410-9},
  ISSN =	{1868-8969},
  year =	{2026},
  volume =	{362},
  editor =	{Saraf, Shubhangi},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2026.50},
  URN =		{urn:nbn:de:0030-drops-253377},
  doi =		{10.4230/LIPIcs.ITCS.2026.50},
  annote =	{Keywords: d-dimensional Knapsack, Multidimensional Knapsack, PTAS, CSP}
}
Document
A General Framework for Low Soundness Homomorphism Testing

Authors: Tushant Mittal and Sourya Roy

Published in: LIPIcs, Volume 362, 17th Innovations in Theoretical Computer Science Conference (ITCS 2026)


Abstract
We introduce a general framework to design and analyze algorithms for the problem of testing homomorphisms between finite groups in the low-soundness regime. In this regime, we give the first constant-query tests for various families of groups. These include tests for: (i) homomorphisms between arbitrary cyclic groups, (ii) homomorphisms between any finite group and ℤ_p, (iii) automorphisms of dihedral and symmetric groups, (iv) inner automorphisms of non-abelian finite simple groups and extraspecial groups, and (v) linear characters of GL_n(F_q) and of finite-dimensional Lie algebras over F_q. We also recover the result of Kiwi [TCS'03] for testing homomorphisms between F_qⁿ and F_q. Prior to this work, such tests were only known for abelian groups with a constant maximal order (such as F_qⁿ). No tests were known for non-abelian groups. As an additional corollary, our framework gives combinatorial list decoding bounds for cyclic groups with list size dependence of O(ε^{-2}) (for agreement parameter ε). This improves upon the currently best-known bound of O(ε^{-105}) due to Dinur, Grigorescu, Kopparty, and Sudan [STOC'08], and Guo and Sudan [RANDOM'14].
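For context, the textbook 3-query homomorphism test for cyclic groups (the high-soundness baseline these results generalize) can be sketched as follows; the function names and parameters are ours:

```python
import random

def homomorphism_test(f, n, trials=200, seed=0):
    """Classic 3-query test over Z_n: accept iff
    f(x) + f(y) == f(x + y) (mod n) for random x, y in all trials."""
    rng = random.Random(seed)
    for _ in range(trials):
        x, y = rng.randrange(n), rng.randrange(n)
        if (f(x) + f(y)) % n != f((x + y) % n):
            return False
    return True

n = 101
hom = lambda x: (7 * x) % n       # a genuine homomorphism of Z_101
not_hom = lambda x: (x * x) % n   # squaring is not additive
print(homomorphism_test(hom, n))      # True
print(homomorphism_test(not_hom, n))  # False
```

The low-soundness regime studied in the paper asks what can be concluded when such a test passes only on a small fraction of queries, which is far harder to analyze.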

Cite as

Tushant Mittal and Sourya Roy. A General Framework for Low Soundness Homomorphism Testing. In 17th Innovations in Theoretical Computer Science Conference (ITCS 2026). Leibniz International Proceedings in Informatics (LIPIcs), Volume 362, pp. 103:1-103:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2026)


BibTeX

@InProceedings{mittal_et_al:LIPIcs.ITCS.2026.103,
  author =	{Mittal, Tushant and Roy, Sourya},
  title =	{{A General Framework for Low Soundness Homomorphism Testing}},
  booktitle =	{17th Innovations in Theoretical Computer Science Conference (ITCS 2026)},
  pages =	{103:1--103:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-410-9},
  ISSN =	{1868-8969},
  year =	{2026},
  volume =	{362},
  editor =	{Saraf, Shubhangi},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2026.103},
  URN =		{urn:nbn:de:0030-drops-253901},
  doi =		{10.4230/LIPIcs.ITCS.2026.103},
  annote =	{Keywords: Property Testing, Coding Theory}
}
Document
Triangle Detection in H-Free Graphs

Authors: Amir Abboud, Ron Safier, and Nathan Wallheimer

Published in: LIPIcs, Volume 362, 17th Innovations in Theoretical Computer Science Conference (ITCS 2026)


Abstract
We initiate the study of combinatorial algorithms for Triangle Detection in H-free graphs. The goal is to decide if a graph that forbids a fixed pattern H as a subgraph contains a triangle, using only "combinatorial" methods that notably exclude fast matrix multiplication. Our work aims to classify which patterns admit a subcubic speedup, working towards a dichotomy theorem. On the lower bound side, we show that if H is not 3-colorable or contains more than one triangle, the complexity of the problem remains unchanged, and no combinatorial speedup is likely possible. On the upper bound side, we develop an embedding approach that results in a strongly subcubic, combinatorial algorithm for a rich class of "embeddable" patterns. Specifically, for an embeddable pattern of size k, our algorithm runs in Õ(n^{3-1/2^{k-3}}) time, where Õ(⋅) hides poly-logarithmic factors. This algorithm also extends to listing all the triangles within the same time bound. We supplement this main result with two generalizations:
- A generalization to patterns that are embeddable up to a single obstacle that arises from a triangle in the pattern. This completes our classification for small patterns, yielding a dichotomy theorem for all patterns of size up to eight.
- An H-sensitive algorithm for embeddable patterns, which runs faster when the number of copies of H is significantly smaller than the maximum possible Ω(n^k).
Finally, we focus on the special case of odd cycles. We present specialized Triangle Detection algorithms that are very efficient:
- A combinatorial algorithm for C_{2k+1}-free graphs that runs in Õ(m + n^{1+2/k}) time for every k ≥ 2, where m is the number of edges in the graph.
- A combinatorial C₅-sensitive algorithm that runs in Õ(n² + n^{4/3} t^{1/3}) time, where t is the number of 5-cycles in the graph.
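The combinatorial baseline these algorithms improve upon is neighbor-set intersection; a minimal sketch (ours, not the paper's algorithm):

```python
def has_triangle(adj):
    """Naive combinatorial triangle detection: a triangle exists iff
    some edge (u, v) has a common neighbor in adj[u] & adj[v].
    adj maps each vertex to its set of neighbors (no self-loops)."""
    for u in adj:
        for v in adj[u]:
            if u < v and adj[u] & adj[v]:
                return True
    return False

# C_5 (a 5-cycle) is triangle-free; adding one chord creates a triangle.
c5 = {i: {(i - 1) % 5, (i + 1) % 5} for i in range(5)}
print(has_triangle(c5))            # False
c5_chord = {i: set(c5[i]) for i in range(5)}
c5_chord[0].add(2); c5_chord[2].add(0)
print(has_triangle(c5_chord))      # True
```

On dense graphs this takes cubic time in the worst case; the point of the paper is that forbidding certain patterns H allows a strongly subcubic combinatorial bound.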

Cite as

Amir Abboud, Ron Safier, and Nathan Wallheimer. Triangle Detection in H-Free Graphs. In 17th Innovations in Theoretical Computer Science Conference (ITCS 2026). Leibniz International Proceedings in Informatics (LIPIcs), Volume 362, pp. 1:1-1:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2026)


BibTeX

@InProceedings{abboud_et_al:LIPIcs.ITCS.2026.1,
  author =	{Abboud, Amir and Safier, Ron and Wallheimer, Nathan},
  title =	{{Triangle Detection in H-Free Graphs}},
  booktitle =	{17th Innovations in Theoretical Computer Science Conference (ITCS 2026)},
  pages =	{1:1--1:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-410-9},
  ISSN =	{1868-8969},
  year =	{2026},
  volume =	{362},
  editor =	{Saraf, Shubhangi},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2026.1},
  URN =		{urn:nbn:de:0030-drops-252885},
  doi =		{10.4230/LIPIcs.ITCS.2026.1},
  annote =	{Keywords: fine-grained complexity, triangle detection, H-free graphs}
}
Document
Conversational Agents: A Framework for Evaluation (CAFE) (Dagstuhl Perspectives Workshop 24352)

Authors: Christine Bauer, Li Chen, Nicola Ferro, Norbert Fuhr, Avishek Anand, Timo Breuer, Guglielmo Faggioli, Ophir Frieder, Hideo Joho, Jussi Karlgren, Johannes Kiesel, Bart P. Knijnenburg, Aldo Lipani, Lien Michiels, Andrea Papenmeier, Maria Soledad Pera, Mark Sanderson, Scott Sanner, Benno Stein, Johanne R. Trippas, Karin Verspoor, and Martijn C. Willemsen

Published in: Dagstuhl Manifestos, Volume 11, Issue 1 (2025)


Abstract
During the workshop, we discussed in depth what CONversational Information ACcess (CONIAC) is and what its unique features are, proposed a world model abstracting it, and defined the Conversational Agents Framework for Evaluation (CAFE) for the evaluation of CONIAC systems. The framework consists of six major components: 1) goals of the system’s stakeholders, 2) user tasks to be studied in the evaluation, 3) aspects of the users carrying out the tasks, 4) evaluation criteria to be considered, 5) evaluation methodology to be applied, and 6) measures for the quantitative criteria chosen.

Cite as

Christine Bauer, Li Chen, Nicola Ferro, Norbert Fuhr, Avishek Anand, Timo Breuer, Guglielmo Faggioli, Ophir Frieder, Hideo Joho, Jussi Karlgren, Johannes Kiesel, Bart P. Knijnenburg, Aldo Lipani, Lien Michiels, Andrea Papenmeier, Maria Soledad Pera, Mark Sanderson, Scott Sanner, Benno Stein, Johanne R. Trippas, Karin Verspoor, and Martijn C. Willemsen. Conversational Agents: A Framework for Evaluation (CAFE) (Dagstuhl Perspectives Workshop 24352). In Dagstuhl Manifestos, Volume 11, Issue 1, pp. 19-67, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@Article{bauer_et_al:DagMan.11.1.19,
  author =	{Bauer, Christine and Chen, Li and Ferro, Nicola and Fuhr, Norbert and Anand, Avishek and Breuer, Timo and Faggioli, Guglielmo and Frieder, Ophir and Joho, Hideo and Karlgren, Jussi and Kiesel, Johannes and Knijnenburg, Bart P. and Lipani, Aldo and Michiels, Lien and Papenmeier, Andrea and Pera, Maria Soledad and Sanderson, Mark and Sanner, Scott and Stein, Benno and Trippas, Johanne R. and Verspoor, Karin and Willemsen, Martijn C.},
  title =	{{Conversational Agents: A Framework for Evaluation (CAFE) (Dagstuhl Perspectives Workshop 24352)}},
  pages =	{19--67},
  journal =	{Dagstuhl Manifestos},
  ISSN =	{2193-2433},
  year =	{2025},
  volume =	{11},
  number =	{1},
  editor =	{Bauer, Christine and Chen, Li and Ferro, Nicola and Fuhr, Norbert and Anand, Avishek and Breuer, Timo and Faggioli, Guglielmo and Frieder, Ophir and Joho, Hideo and Karlgren, Jussi and Kiesel, Johannes and Knijnenburg, Bart P. and Lipani, Aldo and Michiels, Lien and Papenmeier, Andrea and Pera, Maria Soledad and Sanderson, Mark and Sanner, Scott and Stein, Benno and Trippas, Johanne R. and Verspoor, Karin and Willemsen, Martijn C.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagMan.11.1.19},
  URN =		{urn:nbn:de:0030-drops-252722},
  doi =		{10.4230/DagMan.11.1.19},
  annote =	{Keywords: Conversational Agents, Evaluation, Information Access}
}
Document
Polynomial Equivalence of Extended Chemical Reaction Models

Authors: Divya Bajaj, Jose-Luis Castellanos, Ryan Knobel, Austin Luchsinger, Aiden Massie, Adrian Salinas, Pablo Santos, Ramiro Santos, Robert Schweller, and Tim Wylie

Published in: LIPIcs, Volume 359, 36th International Symposium on Algorithms and Computation (ISAAC 2025)


Abstract
The ability to detect whether a species (or dimension) is zero in Chemical Reaction Networks (CRNs), Vector Addition Systems, or Petri Nets is known to increase the power of these models, making them capable of universal computation. While this ability may appear in many forms, such as extending the models to allow transitions to be inhibited, prioritized, or synchronized, we present an extension that performs this zero checking directly. We introduce a new void genesis CRN variant with a simple design that merely increments the count of a specific species when any other species' count goes to zero. As with previous extensions, we show that the model is Turing-universal. We then analyze several other studied CRN variants and show that they are all equivalent to the void genesis model through polynomial simulations, which does not merely follow from Turing-universality. Thus, inhibitor species, reactions that occur at different rates, being allowed to run reactions in parallel, or even being allowed to continually add more volume to the CRN add no simulation power beyond simply detecting whether a species count becomes zero.
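One way to picture the zero-detection rule is a toy reaction step (our illustrative reading of the abstract, not the paper's formal model; all names are ours):

```python
def apply_reaction(counts, reactants, products, signal="Z"):
    """Apply one CRN reaction to a species-count dict. Void-genesis-style
    rule: whenever consuming reactants drives some species count to zero,
    increment the count of a designated signal species."""
    if any(counts.get(s, 0) < k for s, k in reactants.items()):
        return False                      # reaction not applicable
    for s, k in reactants.items():
        counts[s] -= k
        if counts[s] == 0:                # a count just hit zero
            counts[signal] = counts.get(signal, 0) + 1
    for s, k in products.items():
        counts[s] = counts.get(s, 0) + k
    return True

state = {"A": 2, "B": 1}
apply_reaction(state, {"A": 1, "B": 1}, {"C": 1})  # consumes the last B
print(state)   # {'A': 1, 'B': 0, 'Z': 1, 'C': 1}
```

The signal species Z now records that some count reached zero, which is exactly the capability shown to confer Turing-universality.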

Cite as

Divya Bajaj, Jose-Luis Castellanos, Ryan Knobel, Austin Luchsinger, Aiden Massie, Adrian Salinas, Pablo Santos, Ramiro Santos, Robert Schweller, and Tim Wylie. Polynomial Equivalence of Extended Chemical Reaction Models. In 36th International Symposium on Algorithms and Computation (ISAAC 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 359, pp. 7:1-7:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{bajaj_et_al:LIPIcs.ISAAC.2025.7,
  author =	{Bajaj, Divya and Castellanos, Jose-Luis and Knobel, Ryan and Luchsinger, Austin and Massie, Aiden and Salinas, Adrian and Santos, Pablo and Santos, Ramiro and Schweller, Robert and Wylie, Tim},
  title =	{{Polynomial Equivalence of Extended Chemical Reaction Models}},
  booktitle =	{36th International Symposium on Algorithms and Computation (ISAAC 2025)},
  pages =	{7:1--7:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-408-6},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{359},
  editor =	{Chen, Ho-Lin and Hon, Wing-Kai and Tsai, Meng-Tsung},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2025.7},
  URN =		{urn:nbn:de:0030-drops-249158},
  doi =		{10.4230/LIPIcs.ISAAC.2025.7},
  annote =	{Keywords: Chemical Reaction Networks, Simulations, Petri-nets, Vector Addition Systems, Bi-simulation, Turing-universality, Inhibitors}
}
Document
Approximation Schemes for k-Subset Sum Ratio and k-Way Number Partitioning Ratio

Authors: Sotiris Kanellopoulos, Giorgos Mitropoulos, Antonis Antonopoulos, Nikos Leonardos, Aris Pagourtzis, Christos Pergaminelis, Stavros Petsalakis, and Kanellos Tsitouras

Published in: LIPIcs, Volume 359, 36th International Symposium on Algorithms and Computation (ISAAC 2025)


Abstract
The Subset Sum Ratio problem (SSR) asks, given a multiset A of positive integers, to find two disjoint subsets of A such that the largest-to-smallest ratio of their sums is minimized. In this paper we study the k-version of SSR, namely k-Subset Sum Ratio (k-SSR), which asks to minimize the largest-to-smallest ratio of sums of k disjoint subsets of A. We develop an approximation scheme for k-SSR running in O(n^{2k}/ε^{k-1}) time, where n = |A| and ε is the error parameter. To the best of our knowledge, this is the first FPTAS for k-SSR for fixed k > 2. We also study the k-way Number Partitioning Ratio (k-PART) problem, which differs from k-SSR in that the k subsets must constitute a partition of A; this problem in fact corresponds to the objective of minimizing the largest-to-smallest sum ratio in the family of Multiway Number Partitioning problems. We present a more involved FPTAS for k-PART, also achieving O(n^{2k}/ε^{k-1}) time complexity. Notably, k-PART is also equivalent to the Minimum Envy-Ratio problem with identical valuation functions, which has been studied in the context of fair division of indivisible goods. Thus, for the case of identical valuations, our FPTAS represents a significant improvement over the O(n^{4k²+1}/ε^{2k²}) bound obtained by Nguyen and Rothe’s FPTAS [Trung Thanh Nguyen and Jörg Rothe, 2014] for Minimum Envy-Ratio with general additive valuations. Lastly, we propose a second FPTAS for k-SSR, which employs carefully designed calls to the first one; the new scheme has a time complexity of Õ(n/ε^{3k-1}), thus being much faster when n ≫ 1/ε.
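The k-SSR objective can be stated precisely with a tiny exhaustive solver (exponential time, purely to illustrate what the FPTASes approximate; names are ours):

```python
from itertools import product

def k_ssr_bruteforce(a, k):
    """Exact k-Subset Sum Ratio: over all ways to place each element in
    one of k subsets or leave it out (label k), with every subset
    nonempty, minimize max-sum / min-sum."""
    best = float("inf")
    for labels in product(range(k + 1), repeat=len(a)):
        sums = [0] * k
        for x, lbl in zip(a, labels):
            if lbl < k:
                sums[lbl] += x
        if min(sums) > 0:
            best = min(best, max(sums) / min(sums))
    return best

print(k_ssr_bruteforce([1, 2, 3, 4, 6], k=2))   # 1.0, e.g. {2, 4} vs {6}
```

k-PART is the same objective with the "leave it out" label removed, so every element must be assigned to one of the k subsets.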

Cite as

Sotiris Kanellopoulos, Giorgos Mitropoulos, Antonis Antonopoulos, Nikos Leonardos, Aris Pagourtzis, Christos Pergaminelis, Stavros Petsalakis, and Kanellos Tsitouras. Approximation Schemes for k-Subset Sum Ratio and k-Way Number Partitioning Ratio. In 36th International Symposium on Algorithms and Computation (ISAAC 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 359, pp. 44:1-44:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{kanellopoulos_et_al:LIPIcs.ISAAC.2025.44,
  author =	{Kanellopoulos, Sotiris and Mitropoulos, Giorgos and Antonopoulos, Antonis and Leonardos, Nikos and Pagourtzis, Aris and Pergaminelis, Christos and Petsalakis, Stavros and Tsitouras, Kanellos},
  title =	{{Approximation Schemes for k-Subset Sum Ratio and k-Way Number Partitioning Ratio}},
  booktitle =	{36th International Symposium on Algorithms and Computation (ISAAC 2025)},
  pages =	{44:1--44:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-408-6},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{359},
  editor =	{Chen, Ho-Lin and Hon, Wing-Kai and Tsai, Meng-Tsung},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2025.44},
  URN =		{urn:nbn:de:0030-drops-249521},
  doi =		{10.4230/LIPIcs.ISAAC.2025.44},
  annote =	{Keywords: Fully polynomial-time approximation schemes, Subset Sum Ratio, Number Partitioning, Fair division, Envy minimization, Pseudo-polynomial time algorithms}
}
Document
Fréchet Distance in Unweighted Planar Graphs

Authors: Ivor van der Hoog, Thijs van der Horst, Eva Rotenberg, and Lasse Wulf

Published in: LIPIcs, Volume 351, 33rd Annual European Symposium on Algorithms (ESA 2025)


Abstract
The Fréchet distance is a distance measure between trajectories in ℝ^d or walks in a graph G. Given constant-time shortest path queries, the discrete Fréchet distance D_G(P, Q) between two walks P and Q can be computed in O(|P|⋅|Q|) time using a dynamic program. Driemel, van der Hoog, and Rotenberg [SoCG'22] show that for weighted planar graphs this approach is likely tight, as there can be no strongly-subquadratic algorithm to compute a 1.01-approximation of D_G(P, Q) unless the Orthogonal Vector Hypothesis (OVH) fails. Such quadratic-time conditional lower bounds are common to many Fréchet distance variants. However, they can be circumvented by assuming that the input comes from some well-behaved class: there exist (1+ε)-approximations, both in weighted graphs and in ℝ^d, that take near-linear time for c-packed or κ-straight walks in the graph. In ℝ^d there also exists a near-linear time algorithm to compute the Fréchet distance whenever all input edges are long compared to the distance. We consider computing the Fréchet distance in unweighted planar graphs. We show that, unless OVH fails, there is no strongly-subquadratic 1.25-approximation of the discrete Fréchet distance between two disjoint simple paths in an unweighted planar graph. This improves the previous lower bound, both in terms of generality and approximation factor. We subsequently show that adding graph structure circumvents this lower bound: if the graph is a regular tiling with unit-weighted edges, then there exists an Õ((|P|+|Q|)^{1.5})-time algorithm to compute D_G(P, Q). Our result has natural implications in the plane, as it allows us to define a new class of well-behaved curves that facilitate (1+ε)-approximations of their discrete Fréchet distance in subquadratic time.
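The O(|P|⋅|Q|) dynamic program mentioned above is the standard one for the discrete Fréchet distance; a compact sketch, with dist(p, q) standing in for the constant-time shortest-path query (here: any metric the caller supplies):

```python
def discrete_frechet(P, Q, dist):
    """Textbook O(|P|*|Q|) dynamic program: dp[i][j] is the discrete
    Fréchet distance between prefixes P[:i+1] and Q[:j+1]."""
    n, m = len(P), len(Q)
    INF = float("inf")
    dp = [[INF] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            d = dist(P[i], Q[j])
            if i == 0 and j == 0:
                dp[i][j] = d
            else:
                prev = min(dp[i-1][j] if i else INF,
                           dp[i][j-1] if j else INF,
                           dp[i-1][j-1] if i and j else INF)
                dp[i][j] = max(d, prev)
    return dp[-1][-1]

# Walks on the integer line with |.| as the distance:
print(discrete_frechet([0, 1, 2], [0, 2], dist=lambda a, b: abs(a - b)))  # 1
```

The lower bounds in the paper say that, for weighted (and now unweighted) planar graphs, this quadratic time is hard to beat even approximately.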

Cite as

Ivor van der Hoog, Thijs van der Horst, Eva Rotenberg, and Lasse Wulf. Fréchet Distance in Unweighted Planar Graphs. In 33rd Annual European Symposium on Algorithms (ESA 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 351, pp. 24:1-24:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{vanderhoog_et_al:LIPIcs.ESA.2025.24,
  author =	{van der Hoog, Ivor and van der Horst, Thijs and Rotenberg, Eva and Wulf, Lasse},
  title =	{{Fr\'{e}chet Distance in Unweighted Planar Graphs}},
  booktitle =	{33rd Annual European Symposium on Algorithms (ESA 2025)},
  pages =	{24:1--24:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-395-9},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{351},
  editor =	{Benoit, Anne and Kaplan, Haim and Wild, Sebastian and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2025.24},
  URN =		{urn:nbn:de:0030-drops-244924},
  doi =		{10.4230/LIPIcs.ESA.2025.24},
  annote =	{Keywords: Fr\'{e}chet distance, planar graphs, lower bounds, approximation algorithms}
}
Document
An Optimal Algorithm for Shortest Paths in Unweighted Disk Graphs

Authors: Bruce W. Brewer and Haitao Wang

Published in: LIPIcs, Volume 351, 33rd Annual European Symposium on Algorithms (ESA 2025)


Abstract
Given in the plane a set S of n points and a set of disks centered at these points, the disk graph G(S) induced by these disks has vertex set S and an edge between two vertices if their disks intersect. Note that the disks may have different radii. We consider the problem of computing shortest paths from a source point s ∈ S to all vertices in G(S), where the length of a path in G(S) is defined as the number of edges in the path. The previously best algorithm solves the problem in O(n log² n) time. A lower bound of Ω(n log n) is also known for this problem under the algebraic decision tree model. In this paper, we present an O(n log n) time algorithm, which matches the lower bound and thus is optimal. Another virtue of our algorithm is that it is quite simple.
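The problem can be stated as plain BFS on the explicitly built disk graph; this quadratic baseline (ours, not the paper's O(n log n) algorithm, which avoids materializing the edges) fixes the definitions:

```python
from collections import deque

def disk_graph_bfs(points, radii, s):
    """Unweighted shortest paths (hop counts) from source index s in the
    disk graph: vertices i, j are adjacent iff their disks intersect,
    i.e. |p_i - p_j| <= r_i + r_j. Returns -1 for unreachable vertices."""
    n = len(points)
    def intersects(i, j):
        (xi, yi), (xj, yj) = points[i], points[j]
        return (xi - xj) ** 2 + (yi - yj) ** 2 <= (radii[i] + radii[j]) ** 2
    dist = [-1] * n
    dist[s] = 0
    q = deque([s])
    while q:
        u = q.popleft()
        for v in range(n):
            if dist[v] == -1 and intersects(u, v):
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

pts = [(0, 0), (3, 0), (7, 0), (20, 0)]
rad = [2, 2, 2, 1]
print(disk_graph_bfs(pts, rad, 0))   # [0, 1, 2, -1]
```

Note the different radii: disk 2 is reachable only through disk 1, and disk 3 intersects nothing.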

Cite as

Bruce W. Brewer and Haitao Wang. An Optimal Algorithm for Shortest Paths in Unweighted Disk Graphs. In 33rd Annual European Symposium on Algorithms (ESA 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 351, pp. 31:1-31:8, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{brewer_et_al:LIPIcs.ESA.2025.31,
  author =	{Brewer, Bruce W. and Wang, Haitao},
  title =	{{An Optimal Algorithm for Shortest Paths in Unweighted Disk Graphs}},
  booktitle =	{33rd Annual European Symposium on Algorithms (ESA 2025)},
  pages =	{31:1--31:8},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-395-9},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{351},
  editor =	{Benoit, Anne and Kaplan, Haim and Wild, Sebastian and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2025.31},
  URN =		{urn:nbn:de:0030-drops-244997},
  doi =		{10.4230/LIPIcs.ESA.2025.31},
  annote =	{Keywords: disk graphs, weighted Voronoi diagrams, shortest paths}
}
Document
(Multivariate) k-SUM as Barrier to Succinct Computation

Authors: Geri Gokaj, Marvin Künnemann, Sabine Storandt, and Carina Truschel

Published in: LIPIcs, Volume 351, 33rd Annual European Symposium on Algorithms (ESA 2025)


Abstract
How does the time complexity of a problem change when the input is given succinctly rather than explicitly? We study this question for several geometric problems defined on a set X of N points in ℤ^d. As succinct representation, we choose a sumset (or Minkowski sum) representation: instead of receiving X explicitly, we are given sets A, B of n points that define X as A+B = {a+b ∣ a ∈ A, b ∈ B}. We investigate the fine-grained complexity of this succinct version for several Õ(N)-time computable geometric primitives. Remarkably, we can tie their complexity tightly to the complexity of corresponding k-SUM problems. Specifically, we introduce as All-ints 3-SUM(n,n,k) the following multivariate, multi-output variant of 3-SUM: given sets A, B of size n and a set C of size k, determine for all c ∈ C whether there are a ∈ A and b ∈ B with a+b = c. We obtain the following results:
1) Succinct closest L_∞-pair requires time N^{1-o(1)} under the 3-SUM hypothesis, while succinct furthest L_∞-pair can be solved in time Õ(n).
2) Succinct bichromatic closest L_∞-pair requires time N^{1-o(1)} iff the 4-SUM hypothesis holds.
3) The following problems are fine-grained equivalent to All-ints 3-SUM(n,n,k): succinct skyline computation in 2D with output size k, and succinct batched orthogonal range search with k given ranges. This establishes conditionally tight Õ(min{nk, N})-time algorithms for these problems.
We obtain further connections with All-ints 3-SUM(n,n,k) for succinctly computing independent sets in unit interval graphs. Thus, (multivariate) k-SUM problems precisely capture the barrier for enabling sumset-succinct computation for various geometric primitives.
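The succinct representation can always be expanded explicitly, at the cost of the quadratic blow-up N = |A|⋅|B| that the paper's algorithms avoid; a small sketch of the expansion and one of the primitives (our names):

```python
from itertools import product

def sumset(A, B):
    """Explicit expansion X = A + B of a sumset-succinct point set.
    |X| can be as large as |A| * |B|."""
    return {tuple(a + b for a, b in zip(x, y)) for x, y in product(A, B)}

def closest_linf_pair(X):
    """Naive closest pair under the L_infinity norm over the explicit set."""
    pts = sorted(X)
    return min(max(abs(p - q) for p, q in zip(u, v))
               for i, u in enumerate(pts) for v in pts[i + 1:])

A = [(0, 0), (5, 1)]
B = [(0, 0), (1, 7)]
X = sumset(A, B)              # {(0,0), (1,7), (5,1), (6,8)}
print(closest_linf_pair(X))   # 5
```

The paper's lower bounds say that, for closest L_∞-pair, no algorithm can do much better than this explicit expansion under the 3-SUM hypothesis, while other primitives (e.g. furthest L_∞-pair) admit Õ(n)-time algorithms directly on A and B.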

Cite as

Geri Gokaj, Marvin Künnemann, Sabine Storandt, and Carina Truschel. (Multivariate) k-SUM as Barrier to Succinct Computation. In 33rd Annual European Symposium on Algorithms (ESA 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 351, pp. 42:1-42:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{gokaj_et_al:LIPIcs.ESA.2025.42,
  author =	{Gokaj, Geri and K\"{u}nnemann, Marvin and Storandt, Sabine and Truschel, Carina},
  title =	{{(Multivariate) k-SUM as Barrier to Succinct Computation}},
  booktitle =	{33rd Annual European Symposium on Algorithms (ESA 2025)},
  pages =	{42:1--42:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-395-9},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{351},
  editor =	{Benoit, Anne and Kaplan, Haim and Wild, Sebastian and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2025.42},
  URN =		{urn:nbn:de:0030-drops-245101},
  doi =		{10.4230/LIPIcs.ESA.2025.42},
  annote =	{Keywords: Fine-grained complexity theory, sumsets, additive combinatorics, succinct inputs, computational geometry}
}
Document
Min-Max Correlation Clustering via Neighborhood Similarity

Authors: Nairen Cao, Steven Roche, and Hsin-Hao Su

Published in: LIPIcs, Volume 351, 33rd Annual European Symposium on Algorithms (ESA 2025)


Abstract
We present an efficient algorithm for the min-max correlation clustering problem. The input is a complete graph where edges are labeled as either positive (+) or negative (-), and the objective is to find a clustering that minimizes the 𝓁_∞-norm of the disagreement vector over all vertices. We address this problem with an efficient (3 + ε)-approximation algorithm that runs in nearly linear time, Õ(|E^+|), where |E^+| denotes the number of positive edges. This improves upon the previous best-known approximation guarantee of 4 by Heidrich, Irmai, and Andres [Heidrich et al., 2024], whose algorithm runs in O(|V|² + |V| D²) time, where |V| is the number of nodes and D is the maximum degree in the graph (V,E^+). Furthermore, we extend our algorithm to the massively parallel computation (MPC) model and the semi-streaming model. In the MPC model, our algorithm runs on machines with memory sublinear in the number of nodes and takes O(1) rounds. In the streaming model, our algorithm requires only Õ(|V|) space, where |V| is the number of vertices in the graph. Our algorithms are purely combinatorial. They are based on a novel structural observation about the optimal min-max instance, which enables the construction of a (3 + ε)-approximation algorithm using O(|E^+|) neighborhood similarity queries. By leveraging random projection, we further show these queries can be computed in nearly linear time.
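The objective being approximated can be evaluated directly for any candidate clustering; a minimal sketch (ours, for fixing the definition, not the paper's algorithm):

```python
def minmax_disagreements(n, positives, clusters):
    """Min-max correlation clustering objective: the maximum, over
    vertices, of (positive edges cut) + (negative edges kept inside).
    positives lists the + edges; all other pairs are - edges."""
    pos = {frozenset(e) for e in positives}
    label = {}
    for c, members in enumerate(clusters):
        for v in members:
            label[v] = c
    dis = [0] * n
    for u in range(n):
        for v in range(u + 1, n):
            positive = frozenset((u, v)) in pos
            same = label[u] == label[v]
            if positive != same:     # disagreement on edge (u, v)
                dis[u] += 1
                dis[v] += 1
    return max(dis)

# 4 vertices; positive edges form the path 0-1-2-3.
print(minmax_disagreements(4, [(0, 1), (1, 2), (2, 3)], [[0, 1], [2, 3]]))  # 1
```

Here only the cut positive edge (1, 2) is a disagreement, so the ℓ_∞-norm of the disagreement vector is 1; the algorithm in the paper finds a clustering within a (3 + ε) factor of the optimum of this quantity.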

Cite as

Nairen Cao, Steven Roche, and Hsin-Hao Su. Min-Max Correlation Clustering via Neighborhood Similarity. In 33rd Annual European Symposium on Algorithms (ESA 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 351, pp. 41:1-41:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{cao_et_al:LIPIcs.ESA.2025.41,
  author =	{Cao, Nairen and Roche, Steven and Su, Hsin-Hao},
  title =	{{Min-Max Correlation Clustering via Neighborhood Similarity}},
  booktitle =	{33rd Annual European Symposium on Algorithms (ESA 2025)},
  pages =	{41:1--41:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-395-9},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{351},
  editor =	{Benoit, Anne and Kaplan, Haim and Wild, Sebastian and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2025.41},
  URN =		{urn:nbn:de:0030-drops-245098},
  doi =		{10.4230/LIPIcs.ESA.2025.41},
  annote =	{Keywords: Min Max Correlation Clustering, Approximate algorithms}
}
Document
Separating Two Points with Obstacles in the Plane: Improved Upper and Lower Bounds

Authors: Jack Spalding-Jamieson and Anurag Murty Naredla

Published in: LIPIcs, Volume 351, 33rd Annual European Symposium on Algorithms (ESA 2025)


Abstract
Given two points in the plane, and a set of "obstacles" given as curves through the plane with assigned weights, we consider the point-separation problem, which asks for a minimum-weight subset of the obstacles separating the two points. A few computational models for this problem have been previously studied. We give a unified approach to this problem in all models via a reduction to a particular shortest-path problem, and obtain improved running times in essentially all cases. In addition, we also give fine-grained lower bounds for many cases.
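The paper's unified approach reduces point separation to a particular shortest-path problem; the reduction itself is not reproduced here, but the weighted shortest-path computation such a reduction bottoms out in is standard Dijkstra, sketched below (the graph encoding is illustrative, not the paper's construction):

```python
import heapq

def dijkstra(graph, source):
    """Standard Dijkstra over an adjacency dict {u: [(v, weight), ...]}.
    Returns shortest distances from source to every reachable node."""
    dist = {source: 0.0}
    pq = [(0.0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist
```

In a reduction of this kind, nodes would encode obstacle-crossing states and edge weights the obstacle weights, so the shortest path's weight corresponds to the cheapest separating subset.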

Cite as

Jack Spalding-Jamieson and Anurag Murty Naredla. Separating Two Points with Obstacles in the Plane: Improved Upper and Lower Bounds. In 33rd Annual European Symposium on Algorithms (ESA 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 351, pp. 90:1-90:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{spaldingjamieson_et_al:LIPIcs.ESA.2025.90,
  author =	{Spalding-Jamieson, Jack and Naredla, Anurag Murty},
  title =	{{Separating Two Points with Obstacles in the Plane: Improved Upper and Lower Bounds}},
  booktitle =	{33rd Annual European Symposium on Algorithms (ESA 2025)},
  pages =	{90:1--90:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-395-9},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{351},
  editor =	{Benoit, Anne and Kaplan, Haim and Wild, Sebastian and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2025.90},
  URN =		{urn:nbn:de:0030-drops-245598},
  doi =		{10.4230/LIPIcs.ESA.2025.90},
  annote =	{Keywords: obstacle separation, point separation, geometric intersection graph, Z₂-homology, fine-grained lower bounds}
}
Document
Navigating Exoplanetary Systems in Augmented Reality: Preliminary Insights on ExoAR

Authors: Bryson Lawton, Frank Maurer, and Daniel Zielasko

Published in: OASIcs, Volume 130, Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)


Abstract
With thousands of exoplanets now confirmed by space missions such as NASA’s Kepler and TESS, scientific interest and public curiosity about these distant worlds continue to grow. However, current visualization tools for exploring exoplanetary systems often lack sufficient scientific accuracy or interactive features, limiting their educational effectiveness and analytical utility. To help address this gap, we developed ExoAR, an augmented reality tool designed to offer immersive, scientifically sound visualizations of all known exoplanetary systems using data directly sourced from NASA’s Exoplanet Archive. By leveraging augmented reality’s strengths, ExoAR enables users to immerse themselves in interactive, dynamic 3D models of these planetary systems with data-driven representations of planets and their host stars. The application also allows users to adjust various visualization scales independently, a capability designed to aid comprehension of comparative astronomical properties such as orbital mechanics, planetary sizes, and stellar classifications. To begin assessing ExoAR’s potential as an educational and analytical tool and inform future iterations, a pilot user study was conducted. Its findings indicate that participants found ExoAR improved user engagement and spatial understanding compared to NASA’s Eyes on Exoplanets application, a non-immersive exoplanetary system visualization tool. This work-in-progress paper presents these early insights, acknowledges current system limitations, and outlines future directions for more rigorously evaluating and further improving ExoAR’s capabilities for both educational and scientific communities.
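The abstract describes letting users adjust visualization scales independently so that quantities of vastly different magnitude (orbital distances vs. planetary radii) stay legible together. A minimal sketch of that idea, with invented function and parameter names (ExoAR's actual transform is not published here), might log-compress each quantity under its own user-controlled scale factor:

```python
import math

def display_transform(orbit_au, radius_earth, orbit_scale=1.0, radius_scale=1.0):
    """Map physical values to display units using independent,
    log-compressed scales. orbit_au is the semi-major axis in AU;
    radius_earth is the planetary radius in Earth radii."""
    orbit = orbit_scale * math.log10(1.0 + orbit_au)
    radius = radius_scale * math.log10(1.0 + radius_earth)
    return orbit, radius
```

Because each axis has its own scale factor, a user can exaggerate planet sizes for comparison without distorting the orbital layout, and vice versa.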

Cite as

Bryson Lawton, Frank Maurer, and Daniel Zielasko. Navigating Exoplanetary Systems in Augmented Reality: Preliminary Insights on ExoAR. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 20:1-20:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{lawton_et_al:OASIcs.SpaceCHI.2025.20,
  author =	{Lawton, Bryson and Maurer, Frank and Zielasko, Daniel},
  title =	{{Navigating Exoplanetary Systems in Augmented Reality: Preliminary Insights on ExoAR}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{20:1--20:13},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.20},
  URN =		{urn:nbn:de:0030-drops-240106},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.20},
  annote =	{Keywords: Immersive Analytics, Data Visualization, Astronomy, Astrophysics, Exoplanet, Augmented Reality, AR}
}
Document
NEREUS: An Assistive Decision Support System for Real-Time, Adaptive Route Guidance in Extravehicular Navigation Activities on the Lunar Surface

Authors: Jasmine Q. Wu, Andrew J. Hwang, and Matthew J. Bietz

Published in: OASIcs, Volume 130, Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)


Abstract
Extravehicular Activity (EVA) is one of the most complex operational endeavors during human lunar exploration. A key aspect of successful operations involves adapting procedures to address unexpected hazards on the lunar surface. Current route mapping systems rely heavily on static navigation planning around craters, high elevations, and extreme weather conditions to accomplish pre-defined mission objectives. However, the high-resolution data necessary for reliable route mapping is often unavailable. To address this challenge, we have designed NEREUS, a Decision Support System (DSS) that helps EVA operators on the ground respond to anomalies faster by simulating multiple alternative routes in parallel and visualizing trade-offs in consumable resources, speed, and safety as well as impact on overall mission timeline. The system offloads computationally intensive tasks like calculating the impact of evolving hazard data, allowing operators to focus on higher-level decision-making.
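NEREUS simulates alternative routes in parallel and visualizes their trade-offs in consumables, speed, and safety. One natural way to present such trade-offs, shown here purely as an illustration (the system's actual decision logic is not described in the abstract), is to filter candidate routes down to the Pareto front, discarding any route that another route beats or ties on every criterion:

```python
def pareto_front(routes):
    """Keep only non-dominated routes. Each route is
    (name, (consumables, time_min, risk)); lower is better on all axes."""
    front = []
    for name, cost in routes:
        dominated = any(
            all(o <= c for o, c in zip(other, cost)) and other != cost
            for _, other in routes
        )
        if not dominated:
            front.append(name)
    return front
```

An operator then only has to weigh genuinely different compromises, e.g. a slower but safer route against a faster but riskier one.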

Cite as

Jasmine Q. Wu, Andrew J. Hwang, and Matthew J. Bietz. NEREUS: An Assistive Decision Support System for Real-Time, Adaptive Route Guidance in Extravehicular Navigation Activities on the Lunar Surface. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 25:1-25:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{wu_et_al:OASIcs.SpaceCHI.2025.25,
  author =	{Wu, Jasmine Q. and Hwang, Andrew J. and Bietz, Matthew J.},
  title =	{{NEREUS: An Assistive Decision Support System for Real-Time, Adaptive Route Guidance in Extravehicular Navigation Activities on the Lunar Surface}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{25:1--25:14},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.25},
  URN =		{urn:nbn:de:0030-drops-240158},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.25},
  annote =	{Keywords: Human Computer Interaction (HCI), Adaptive Navigation, Decision Support, Cognitive Load Analysis, Decision Support System, Extravehicular Activity}
}
Document
Movement in Low Gravity (MoLo) – LUNA: Biomechanical Modelling to Mitigate Lunar Surface Operation Risks

Authors: David Andrew Green

Published in: OASIcs, Volume 130, Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)


Abstract
The Artemis programme seeks to develop and test concepts, hardware, and approaches to support long-term habitation of the lunar surface and future missions to Mars. Preparation for the Artemis missions requires determining the tasks to be performed, the functional requirements of those tasks, and, as mission duration extends, whether physiological deconditioning becomes functionally significant, compromising a crew member's ability to perform critical tasks on the surface and/or upon return to Earth. MoLo-LUNA, leveraging the MoLo programme (and several other activities), could become a key supporting activity for LUNA, including validation of the Puppeteer offloading system itself via the creation of a complementary MoLo-LUNA-LAB. Furthermore, the MoLo-LUNA programme could become a key facilitator of simulator suit instrumentation/definition, broader astronaut training activities, and mission architecture development, including Artemis mission simulations. By employing a Puppeteer system external to the LUNA chamber hall, it would optimise the utilisation and cost-effectiveness of LUNA, and as such represents a critical service to future LUNA stakeholders. Furthermore, MoLo-LUNA would generate a unique data set that can be leveraged to predict deconditioning on the lunar surface, and thereby optimise functionality and minimise mission risk, including informing the need for, and prescription of, exercise countermeasures on the lunar surface and in transit. Thus, MoLo-LUNA offers a unique opportunity to position LUNA, and ESA, as a key ongoing provider of evidence to define, optimise, and support crewed Artemis surface missions.

Cite as

David Andrew Green. Movement in Low Gravity (MoLo) – LUNA: Biomechanical Modelling to Mitigate Lunar Surface Operation Risks. In Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025). Open Access Series in Informatics (OASIcs), Volume 130, pp. 26:1-26:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{green:OASIcs.SpaceCHI.2025.26,
  author =	{Green, David Andrew},
  title =	{{Movement in Low Gravity (MoLo) – LUNA: Biomechanical Modelling to Mitigate Lunar Surface Operation Risks}},
  booktitle =	{Advancing Human-Computer Interaction for Space Exploration (SpaceCHI 2025)},
  pages =	{26:1--26:11},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-384-3},
  ISSN =	{2190-6807},
  year =	{2025},
  volume =	{130},
  editor =	{Bensch, Leonie and Nilsson, Tommy and Nisser, Martin and Pataranutaporn, Pat and Schmidt, Albrecht and Sumini, Valentina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SpaceCHI.2025.26},
  URN =		{urn:nbn:de:0030-drops-240166},
  doi =		{10.4230/OASIcs.SpaceCHI.2025.26},
  annote =	{Keywords: Locomotion, hypogravity, modelling, Lunar}
}