14 Search Results for "Seddighin, Saeed"


Document
Even Faster Knapsack via Rectangular Monotone Min-Plus Convolution and Balancing

Authors: Karl Bringmann, Anita Dürr, and Adam Polak

Published in: LIPIcs, Volume 308, 32nd Annual European Symposium on Algorithms (ESA 2024)


Abstract
We present a pseudopolynomial-time algorithm for the Knapsack problem that has running time Õ(n + t√{p_{max}}), where n is the number of items, t is the knapsack capacity, and p_{max} is the maximum item profit. This improves over the Õ(n + t p_{max})-time algorithm based on the convolution and prediction technique by Bateni et al. (STOC 2018). Moreover, we give some evidence, based on a strengthening of the Min-Plus Convolution Hypothesis, that our running time might be optimal. Our algorithm uses two new technical tools, which might be of independent interest. First, we generalize the Õ(n^{1.5})-time algorithm for bounded monotone min-plus convolution by Chi et al. (STOC 2022) to the rectangular case where the range of entries can be different from the sequence length. Second, we give a reduction from general knapsack instances to balanced instances, where all items have nearly the same profit-to-weight ratio, up to a constant factor. Using these techniques, we can also obtain algorithms that run in time Õ(n + OPT√{w_{max}}), Õ(n + (nw_{max}p_{max})^{1/3}t^{2/3}), and Õ(n + (nw_{max}p_{max})^{1/3} OPT^{2/3}), where OPT is the optimal total profit and w_{max} is the maximum item weight.
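
For orientation, the textbook O(n · t) dynamic program over capacities is the simplest pseudopolynomial baseline that results of this kind improve upon. The sketch below shows only that baseline, not the paper's Õ(n + t√{p_{max}}) algorithm; the function name and example data are illustrative.

# Textbook O(n * t) dynamic program for 0-1 Knapsack (illustrative baseline only;
# the paper's Õ(n + t * sqrt(p_max)) algorithm is far more involved).
def knapsack_max_profit(items, t):
    """items: list of (weight, profit) pairs; t: knapsack capacity."""
    best = [0] * (t + 1)                      # best[c] = max profit achievable with capacity c
    for weight, profit in items:
        # iterate capacities downwards so each item is used at most once
        for c in range(t, weight - 1, -1):
            best[c] = max(best[c], best[c - weight] + profit)
    return best[t]

# Example: capacity 10, three items.
print(knapsack_max_profit([(4, 7), (5, 8), (6, 9)], 10))  # -> 16 (take the items of weight 4 and 6)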

Cite as

Karl Bringmann, Anita Dürr, and Adam Polak. Even Faster Knapsack via Rectangular Monotone Min-Plus Convolution and Balancing. In 32nd Annual European Symposium on Algorithms (ESA 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 308, pp. 33:1-33:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{bringmann_et_al:LIPIcs.ESA.2024.33,
  author =	{Bringmann, Karl and D\"{u}rr, Anita and Polak, Adam},
  title =	{{Even Faster Knapsack via Rectangular Monotone Min-Plus Convolution and Balancing}},
  booktitle =	{32nd Annual European Symposium on Algorithms (ESA 2024)},
  pages =	{33:1--33:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-338-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{308},
  editor =	{Chan, Timothy and Fischer, Johannes and Iacono, John and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2024.33},
  URN =		{urn:nbn:de:0030-drops-211047},
  doi =		{10.4230/LIPIcs.ESA.2024.33},
  annote =	{Keywords: 0-1-Knapsack problem, bounded monotone min-plus convolution, fine-grained complexity}
}
Document
Parameterized Quantum Query Algorithms for Graph Problems

Authors: Tatsuya Terao and Ryuhei Mori

Published in: LIPIcs, Volume 308, 32nd Annual European Symposium on Algorithms (ESA 2024)


Abstract
In this paper, we consider the parameterized quantum query complexity of graph problems. We design parameterized quantum query algorithms for the k-vertex cover and k-matching problems, and we present lower bounds on the parameterized quantum query complexity. We then show that our quantum query algorithms are optimal up to a constant factor when the parameters are small. Our main results are as follows.

Parameterized quantum query complexity of vertex cover. In the k-vertex cover problem, we are given an undirected graph G with n vertices and an integer k, and the objective is to determine whether G has a vertex cover of size at most k. We show that the quantum query complexity of the k-vertex cover problem is O(n√k + k^{3/2}√n) in the adjacency matrix model. For the design of the quantum query algorithm, we use the method of kernelization, a well-known tool for the design of parameterized classical algorithms, combined with Grover’s search.

Parameterized quantum query complexity of matching. In the k-matching problem, we are given an undirected graph G with n vertices and an integer k, and the objective is to determine whether G has a matching of size at least k. We show that the quantum query complexity of the k-matching problem is O(n√k + k²) in the adjacency matrix model. We obtain this upper bound by using Grover’s search carefully and by bounding the number of Grover searches via potential functions. We also show that the quantum query complexity of the maximum matching problem is O(n√p + p²), where p is the size of the maximum matching. For small p, this improves the known bounds of Õ(n^{3/2}) for bipartite graphs [Blikstad-v.d.Brand-Efron-Mukhopadhyay-Nanongkai, FOCS 2022] and O(n^{7/4}) for general graphs [Kimmel-Witter, WADS 2021].

Lower bounds on parameterized quantum query complexity. We also present lower bounds on the quantum query complexities of the k-vertex cover and k-matching problems. These lower bounds prove the optimality of the above parameterized quantum query algorithms up to a constant factor when k is small. Indeed, the quantum query complexities of the k-vertex cover and k-matching problems are both Θ(n√k) when k = O(√n) and k = O(n^{2/3}), respectively.
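
The quantum algorithm itself is not reproduced here, but the classical kernelization idea it builds on is easy to illustrate. The sketch below applies the standard Buss reduction rules for k-Vertex Cover and then solves the kernel by naive branching; the graph encoding and function names are hypothetical, and nothing quantum (in particular, no Grover search) appears.

# Classical kernelization (Buss rules) for k-Vertex Cover, followed by naive branching
# on the kernel. Illustrative only: the paper combines kernelization with Grover search
# to obtain a quantum query algorithm; nothing quantum is shown here.
def buss_kernel(edges, k):
    """Return (kernel_edges, forced_vertices, remaining_k), or None if no cover of size <= k exists."""
    edges = {frozenset(e) for e in edges}
    forced = set()
    changed = True
    while changed and k > 0:
        changed = False
        degree = {}
        for e in edges:
            for v in e:
                degree[v] = degree.get(v, 0) + 1
        for v, d in degree.items():
            if d > k:                          # v must belong to every cover of size <= k
                forced.add(v)
                edges = {e for e in edges if v not in e}
                k -= 1
                changed = True
                break
    if len(edges) > k * k:                     # max degree <= k, so k vertices cover at most k^2 edges
        return None
    return edges, forced, k

def has_vertex_cover(edges, k):
    kernel = buss_kernel(edges, k)
    if kernel is None:
        return False
    edges, _, k = kernel
    if not edges:
        return True
    if k == 0:
        return False
    u, v = next(iter(edges))                   # branch on an arbitrary remaining edge
    return (has_vertex_cover({e for e in edges if u not in e}, k - 1)
            or has_vertex_cover({e for e in edges if v not in e}, k - 1))

# Example: a 4-cycle has a vertex cover of size 2 but not of size 1.
cycle = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(has_vertex_cover(cycle, 2), has_vertex_cover(cycle, 1))  # True False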

Cite as

Tatsuya Terao and Ryuhei Mori. Parameterized Quantum Query Algorithms for Graph Problems. In 32nd Annual European Symposium on Algorithms (ESA 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 308, pp. 99:1-99:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{terao_et_al:LIPIcs.ESA.2024.99,
  author =	{Terao, Tatsuya and Mori, Ryuhei},
  title =	{{Parameterized Quantum Query Algorithms for Graph Problems}},
  booktitle =	{32nd Annual European Symposium on Algorithms (ESA 2024)},
  pages =	{99:1--99:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-338-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{308},
  editor =	{Chan, Timothy and Fischer, Johannes and Iacono, John and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2024.99},
  URN =		{urn:nbn:de:0030-drops-211707},
  doi =		{10.4230/LIPIcs.ESA.2024.99},
  annote =	{Keywords: Quantum query complexity, parameterized algorithms, vertex cover, matching, kernelization}
}
Document
APPROX
Distributional Online Weighted Paging with Limited Horizon

Authors: Yaron Fairstein, Joseph (Seffi) Naor, and Tomer Tsachor

Published in: LIPIcs, Volume 317, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2024)


Abstract
In this work we study the classic problem of online weighted paging in a probabilistic prediction model, in which we are given additional information about the input in the form of distributions over page requests; this setting is known as distributional online paging (DOP). This work continues a recent line of research on learning-augmented algorithms, which incorporate machine-learned predictions into online algorithms so as to go beyond traditional worst-case competitive analysis, thus circumventing known lower bounds for online paging. We first provide an efficient online algorithm that achieves a constant-factor competitive ratio with respect to the best online algorithm (policy) for weighted DOP; this algorithm follows from earlier work on the stochastic k-server problem. Our main contribution concerns the question of whether distributional information over a limited horizon suffices for obtaining a constant competitive factor. To this end, we define in a natural way a new predictive model with a limited horizon, which we call Per-Request Stochastic Prediction (PRSP). We show that we can obtain a constant-factor competitive algorithm with respect to the optimal online algorithm for this model.

Cite as

Yaron Fairstein, Joseph (Seffi) Naor, and Tomer Tsachor. Distributional Online Weighted Paging with Limited Horizon. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 317, pp. 15:1-15:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{fairstein_et_al:LIPIcs.APPROX/RANDOM.2024.15,
  author =	{Fairstein, Yaron and Naor, Joseph (Seffi) and Tsachor, Tomer},
  title =	{{Distributional Online Weighted Paging with Limited Horizon}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2024)},
  pages =	{15:1--15:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-348-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{317},
  editor =	{Kumar, Amit and Ron-Zewi, Noga},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX/RANDOM.2024.15},
  URN =		{urn:nbn:de:0030-drops-210088},
  doi =		{10.4230/LIPIcs.APPROX/RANDOM.2024.15},
  annote =	{Keywords: Online algorithms, Caching, Stochastic analysis, Predictions}
}
Document
Track A: Algorithms, Complexity and Games
Minimizing Tardy Processing Time on a Single Machine in Near-Linear Time

Authors: Nick Fischer and Leo Wennmann

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
In this work we revisit the elementary scheduling problem 1||∑ p_j U_j. The goal is to select, among n jobs with processing times and due dates, a subset of jobs with maximum total processing time that can be scheduled in sequence without violating their due dates. This problem is NP-hard, but a classical algorithm by Lawler and Moore from the 1960s solves it in pseudo-polynomial time O(nP), where P is the total processing time of all jobs. With the aim of developing best-possible pseudo-polynomial-time algorithms, a recent wave of results has improved Lawler and Moore’s algorithm for 1||∑ p_j U_j: first to time Õ(P^{7/4}) [Bringmann, Fischer, Hermelin, Shabtay, Wellnitz; ICALP'20], then to time Õ(P^{5/3}) [Klein, Polak, Rohwedder; SODA'23], and finally to time Õ(P^{7/5}) [Schieber, Sitaraman; WADS'23]. It remained an exciting open question whether these results could be improved further. In this work we develop an algorithm running in near-linear time Õ(P) for the 1||∑ p_j U_j problem. This running time not only significantly improves upon the previous results, but also matches conditional lower bounds based on the Strong Exponential Time Hypothesis or the Set Cover Hypothesis and is therefore likely optimal (up to subpolynomial factors). Our new algorithm also extends to the case of m machines in time Õ(P^m). In contrast to the previous improvements, we take a different, more direct approach inspired by the recent reductions from Modular Subset Sum to dynamic string problems. We thereby arrive at a satisfyingly simple algorithm.
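
For context, here is a sketch of the Lawler-Moore pseudopolynomial dynamic program mentioned above, which runs in O(nP) time. The function name and input format are illustrative; this is the classical baseline, not the paper's near-linear-time algorithm.

# Lawler-Moore O(n * P) dynamic program for 1||∑ p_j U_j (the classical baseline;
# the paper's Õ(P) algorithm uses a different, data-structure-based approach).
def min_tardy_processing_time(jobs):
    """jobs: list of (processing_time, due_date) pairs. Returns the minimum total tardy processing time."""
    jobs = sorted(jobs, key=lambda job: job[1])     # earliest-due-date order
    total = sum(p for p, _ in jobs)
    reachable = [False] * (total + 1)               # reachable[s]: some feasible on-time set has total time s
    reachable[0] = True
    for p, d in jobs:
        # a job can be scheduled on time only if the on-time jobs finish by its due date
        for s in range(min(d, total), p - 1, -1):
            if reachable[s - p]:
                reachable[s] = True
    best_on_time = max(s for s in range(total + 1) if reachable[s])
    return total - best_on_time

# Example: the jobs (2,3) and (4,6) can be completed on time; the job (3,5) is tardy.
print(min_tardy_processing_time([(2, 3), (3, 5), (4, 6)]))  # -> 3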

Cite as

Nick Fischer and Leo Wennmann. Minimizing Tardy Processing Time on a Single Machine in Near-Linear Time. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 64:1-64:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{fischer_et_al:LIPIcs.ICALP.2024.64,
  author =	{Fischer, Nick and Wennmann, Leo},
  title =	{{Minimizing Tardy Processing Time on a Single Machine in Near-Linear Time}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{64:1--64:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.64},
  URN =		{urn:nbn:de:0030-drops-202079},
  doi =		{10.4230/LIPIcs.ICALP.2024.64},
  annote =	{Keywords: Scheduling, Fine-Grained Complexity, Dynamic Strings}
}
Document
Track A: Algorithms, Complexity and Games
Subexponential Parameterized Directed Steiner Network Problems on Planar Graphs: A Complete Classification

Authors: Esther Galby, Sándor Kisfaludi-Bak, Dániel Marx, and Roohani Sharma

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
In the Directed Steiner Network problem, the input is a directed graph G, a set T ⊆ V(G) of k terminals, and a demand graph D on T. The task is to find a subgraph H ⊆ G with the minimum number of edges such that for every (s,t) ∈ E(D), the solution H contains a directed s → t path. The goal of this paper is to investigate how the complexity of the problem depends on the demand pattern in planar graphs. Formally, if 𝒟 is a class of directed graphs, then the 𝒟-Steiner Network (𝒟-DSN) problem is the special case where the demand graph D is restricted to be from 𝒟. We give a complete characterization of the behavior of every 𝒟-DSN problem on planar graphs. We classify every class 𝒟 closed under transitive equivalence and identification of vertices into three cases: assuming ETH, the problem is either 1) solvable in time 2^O(k)⋅n^O(1), i.e., FPT parameterized by the number k of terminals, but not solvable in time 2^o(k)⋅n^O(1), 2) solvable in time f(k)⋅n^O(√k), but not solvable in time f(k)⋅n^o(√k), or 3) solvable in time f(k)⋅n^O(k), but not solvable in time f(k)⋅n^o(k). Our result is a far-reaching generalization and unification of earlier results on Directed Steiner Tree, Directed Steiner Network, and Strongly Connected Steiner Subgraph on planar graphs. As an important step of our lower bound proof, we discover a rare example of a genuinely planar problem (i.e., described by a planar graph and two sets of vertices) that cannot be solved in time f(k)⋅n^o(k): given two sets of terminals S and T with |S|+|T| = k, find a subgraph with the minimum number of edges such that every vertex of T is reachable from every vertex of S.

Cite as

Esther Galby, Sándor Kisfaludi-Bak, Dániel Marx, and Roohani Sharma. Subexponential Parameterized Directed Steiner Network Problems on Planar Graphs: A Complete Classification. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 67:1-67:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{galby_et_al:LIPIcs.ICALP.2024.67,
  author =	{Galby, Esther and Kisfaludi-Bak, S\'{a}ndor and Marx, D\'{a}niel and Sharma, Roohani},
  title =	{{Subexponential Parameterized Directed Steiner Network Problems on Planar Graphs: A Complete Classification}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{67:1--67:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.67},
  URN =		{urn:nbn:de:0030-drops-202104},
  doi =		{10.4230/LIPIcs.ICALP.2024.67},
  annote =	{Keywords: Directed Steiner Network, Sub-exponential algorithm}
}
Document
Track A: Algorithms, Complexity and Games
No Polynomial Kernels for Knapsack

Authors: Klaus Heeger, Danny Hermelin, Matthias Mnich, and Dvir Shabtay

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
This paper focuses on kernelization algorithms for the fundamental Knapsack problem. A kernelization algorithm (or kernel) is a polynomial-time reduction from a problem onto itself, where the output size is bounded by a function of some problem-specific parameter. Such algorithms provide a theoretical model for data reduction and preprocessing and are central in the area of parameterized complexity. In this way, a kernel for Knapsack with respect to some parameter k reduces any instance of Knapsack in polynomial time to an equivalent instance of size at most f(k), for some computable function f. When f(k) = k^{O(1)}, we call such a reduction a polynomial kernel. Our study focuses on two natural parameters for Knapsack: the number w_# of different item weights, and the number p_# of different item profits. Our main technical contribution is a proof showing that Knapsack does not admit a polynomial kernel for either of these two parameters under standard complexity-theoretic assumptions. Our proof is an elaborate application of the standard kernelization lower-bound framework and develops along the way novel ideas that should be useful for other problems as well. We complement our lower bounds by showing that Knapsack admits a polynomial kernel for the combined parameter w_# ⋅ p_#.

Cite as

Klaus Heeger, Danny Hermelin, Matthias Mnich, and Dvir Shabtay. No Polynomial Kernels for Knapsack. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 83:1-83:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{heeger_et_al:LIPIcs.ICALP.2024.83,
  author =	{Heeger, Klaus and Hermelin, Danny and Mnich, Matthias and Shabtay, Dvir},
  title =	{{No Polynomial Kernels for Knapsack}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{83:1--83:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.83},
  URN =		{urn:nbn:de:0030-drops-202261},
  doi =		{10.4230/LIPIcs.ICALP.2024.83},
  annote =	{Keywords: Knapsack, polynomial kernels, compositions, number of different weights, number of different profits}
}
Document
APPROX
On Complexity of 1-Center in Various Metrics

Authors: Amir Abboud, MohammadHossein Bateni, Vincent Cohen-Addad, Karthik C. S., and Saeed Seddighin

Published in: LIPIcs, Volume 275, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2023)


Abstract
We consider the classic 1-center problem: given a set P of n points in a metric space, find the point in P that minimizes the maximum distance to the other points of P. We study the complexity of this problem in d-dimensional 𝓁_p-metrics and in edit and Ulam metrics over strings of length d. Our results for the 1-center problem may be classified based on d as follows.
- Small d. Assuming the hitting set conjecture (HSC), we show that when d = ω(log n), no subquadratic algorithm can solve the 1-center problem in any of the 𝓁_p-metrics, or in the edit or Ulam metrics.
- Large d. When d = Ω(n), we extend our conditional lower bound to rule out subquartic algorithms for the 1-center problem in the edit metric (assuming Quantified SETH). On the other hand, we give a (1+ε)-approximation for 1-center in the Ulam metric with running time Õ_ε(nd+n²√d).
We also strengthen some of the above lower bounds by allowing approximation algorithms or by reducing the dimension d, but only against a weaker class of algorithms that list all requisite solutions. Moreover, we extend one of our hardness results to rule out subquartic algorithms for the well-studied 1-median problem in the edit metric, where, given a set of n strings each of length n, the goal is to find a string in the set that minimizes the sum of the edit distances to the rest of the strings in the set.
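
For reference, the quadratic-time brute force that these conditional lower bounds target is straightforward; the sketch below compares all pairs of points and uses a Hamming-distance example purely for illustration.

# Naive O(n^2 * d) 1-center: compare all pairs of points and return the point whose
# maximum distance to the rest is smallest. This is the brute force that the
# conditional lower bounds in the paper aim to show is essentially unavoidable.
def one_center(points, dist):
    """points: list of points; dist: a metric on pairs of points. Returns (center, radius)."""
    best_center, best_radius = None, float("inf")
    for i, p in enumerate(points):
        radius = max(dist(p, q) for j, q in enumerate(points) if j != i)
        if radius < best_radius:
            best_center, best_radius = p, radius
    return best_center, best_radius

# Example in the Hamming metric over strings of length d = 4.
hamming = lambda a, b: sum(x != y for x, y in zip(a, b))
print(one_center(["0000", "0011", "0101", "1111"], hamming))  # -> ('0011', 2)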

Cite as

Amir Abboud, MohammadHossein Bateni, Vincent Cohen-Addad, Karthik C. S., and Saeed Seddighin. On Complexity of 1-Center in Various Metrics. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 275, pp. 1:1-1:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{abboud_et_al:LIPIcs.APPROX/RANDOM.2023.1,
  author =	{Abboud, Amir and Bateni, MohammadHossein and Cohen-Addad, Vincent and Karthik C. S. and Seddighin, Saeed},
  title =	{{On Complexity of 1-Center in Various Metrics}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2023)},
  pages =	{1:1--1:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-296-9},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{275},
  editor =	{Megow, Nicole and Smith, Adam},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX/RANDOM.2023.1},
  URN =		{urn:nbn:de:0030-drops-188260},
  doi =		{10.4230/LIPIcs.APPROX/RANDOM.2023.1},
  annote =	{Keywords: Center, Clustering, Edit metric, Ulam metric, Hamming metric, Fine-grained Complexity, Approximation}
}
Document
Quantum Meets Fine-Grained Complexity: Sublinear Time Quantum Algorithms for String Problems

Authors: François Le Gall and Saeed Seddighin

Published in: LIPIcs, Volume 215, 13th Innovations in Theoretical Computer Science Conference (ITCS 2022)


Abstract
Longest common substring (LCS), longest palindrome substring (LPS), and Ulam distance (UL) are three fundamental string problems that can be classically solved in near linear time. In this work, we present sublinear time quantum algorithms for these problems along with quantum lower bounds. Our results shed light on a very surprising fact: although the classic solutions for LCS and LPS are almost identical (via suffix trees), their quantum computational complexities are different. While we give an exact Õ(√n) time algorithm for LPS, we prove that LCS needs time at least Ω̃(n^{2/3}) even for 0/1 strings.
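
For contrast with the sublinear quantum bounds, here is the simplest classical baseline: the quadratic dynamic program for longest common substring. The near-linear suffix-tree solution alluded to in the abstract, and the quantum algorithms themselves, are not shown; names and the example are illustrative.

# Classical O(n * m) dynamic program for longest common substring (i.e., a contiguous
# common factor, as in the abstract). Shown only as a simple exact baseline; the
# near-linear suffix-tree solution and the quantum algorithms are not reproduced.
def longest_common_substring(a, b):
    best = 0
    prev = [0] * (len(b) + 1)
    for i in range(1, len(a) + 1):
        cur = [0] * (len(b) + 1)
        for j in range(1, len(b) + 1):
            if a[i - 1] == b[j - 1]:
                cur[j] = prev[j - 1] + 1          # extend the common substring ending at (i, j)
                best = max(best, cur[j])
        prev = cur
    return best

print(longest_common_substring("banana", "ananas"))  # -> 5 ("anana")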

Cite as

François Le Gall and Saeed Seddighin. Quantum Meets Fine-Grained Complexity: Sublinear Time Quantum Algorithms for String Problems. In 13th Innovations in Theoretical Computer Science Conference (ITCS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 215, pp. 97:1-97:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{legall_et_al:LIPIcs.ITCS.2022.97,
  author =	{Le Gall, Fran\c{c}ois and Seddighin, Saeed},
  title =	{{Quantum Meets Fine-Grained Complexity: Sublinear Time Quantum Algorithms for String Problems}},
  booktitle =	{13th Innovations in Theoretical Computer Science Conference (ITCS 2022)},
  pages =	{97:1--97:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-217-4},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{215},
  editor =	{Braverman, Mark},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2022.97},
  URN =		{urn:nbn:de:0030-drops-156934},
  doi =		{10.4230/LIPIcs.ITCS.2022.97},
  annote =	{Keywords: Longest common substring, Longest palindrome substring, Quantum algorithms, Sublinear algorithms}
}
Document
3+ε Approximation of Tree Edit Distance in Truly Subquadratic Time

Authors: Masoud Seddighin and Saeed Seddighin

Published in: LIPIcs, Volume 215, 13th Innovations in Theoretical Computer Science Conference (ITCS 2022)


Abstract
Tree edit distance is a well-known generalization of the edit distance problem to rooted trees. In this problem, the goal is to transform one rooted tree into another via (i) node addition, (ii) node deletion, and (iii) node relabeling. In this work, we give a truly subquadratic time algorithm that approximates tree edit distance within a factor of 3+ε. Our result is obtained through a novel extension of a 3-step framework that approximates edit distance in truly subquadratic time. This framework has also been used previously to approximate longest common subsequence in subquadratic time.
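
As a point of comparison, the sketch below computes exact unit-cost tree edit distance by memoizing the standard forest recurrence. It is exact but far from subquadratic, and the nested-tuple tree encoding is a hypothetical format chosen for illustration, not anything from the paper.

from functools import lru_cache

# Exact unit-cost tree edit distance via the standard forest recurrence, memoized on
# pairs of forests. Exact and simple, but nowhere near the truly subquadratic
# approximation of the paper. A tree is a nested tuple (label, (child, child, ...)).
def tree_edit_distance(t1, t2):
    @lru_cache(maxsize=None)
    def forest_dist(f, g):                             # f, g: tuples of trees
        if not f and not g:
            return 0
        if not f:                                      # insert the rightmost root of g
            (_, wc), g_rest = g[-1], g[:-1]
            return forest_dist((), g_rest + wc) + 1
        if not g:                                      # delete the rightmost root of f
            (_, vc), f_rest = f[-1], f[:-1]
            return forest_dist(f_rest + vc, ()) + 1
        (vl, vc), f_rest = f[-1], f[:-1]
        (wl, wc), g_rest = g[-1], g[:-1]
        return min(
            forest_dist(f_rest + vc, g) + 1,           # delete v
            forest_dist(f, g_rest + wc) + 1,           # insert w
            forest_dist(vc, wc) + forest_dist(f_rest, g_rest) + (vl != wl),  # match v with w
        )
    return forest_dist((t1,), (t2,))

# Example: relabeling the single child 'b' to 'c' costs 1.
print(tree_edit_distance(("a", (("b", ()),)), ("a", (("c", ()),))))  # -> 1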

Cite as

Masoud Seddighin and Saeed Seddighin. 3+ε Approximation of Tree Edit Distance in Truly Subquadratic Time. In 13th Innovations in Theoretical Computer Science Conference (ITCS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 215, pp. 115:1-115:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{seddighin_et_al:LIPIcs.ITCS.2022.115,
  author =	{Seddighin, Masoud and Seddighin, Saeed},
  title =	{{3+\epsilon Approximation of Tree Edit Distance in Truly Subquadratic Time}},
  booktitle =	{13th Innovations in Theoretical Computer Science Conference (ITCS 2022)},
  pages =	{115:1--115:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-217-4},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{215},
  editor =	{Braverman, Mark},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2022.115},
  URN =		{urn:nbn:de:0030-drops-157116},
  doi =		{10.4230/LIPIcs.ITCS.2022.115},
  annote =	{Keywords: tree edit distance, approximation, subquadratic, edit distance}
}
Document
Track A: Algorithms, Complexity and Games
Streaming and Small Space Approximation Algorithms for Edit Distance and Longest Common Subsequence

Authors: Kuan Cheng, Alireza Farhadi, MohammadTaghi Hajiaghayi, Zhengzhong Jin, Xin Li, Aviad Rubinstein, Saeed Seddighin, and Yu Zheng

Published in: LIPIcs, Volume 198, 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021)


Abstract
The edit distance (ED) and longest common subsequence (LCS) are two fundamental problems which quantify how similar two strings are to one another. In this paper, we first consider these problems in the asymmetric streaming model introduced by Andoni, Krauthgamer and Onak [Andoni et al., 2010] (FOCS'10) and Saks and Seshadhri [Saks and Seshadhri, 2013] (SODA'13). In this model we have random access to one string and streaming access to the other one. Our main contribution is a constant factor approximation algorithm for ED with memory Õ(n^δ) for any constant δ > 0. In addition to this, we present an upper bound of Õ_ε(√n) on the memory needed to approximate ED or LCS within a factor of 1±ε. All our algorithms are deterministic and run in polynomial time in a single pass. We further study small-space approximation algorithms for ED, LCS, and longest increasing subsequence (LIS) in the non-streaming setting. Here, we design algorithms that achieve 1 ± ε approximation for all three problems, where ε > 0 can be any constant and even slightly sub-constant. Our algorithms only use poly-logarithmic space while maintaining a polynomial running time. This significantly improves previous results in terms of space complexity, where all previously known results require space at least Ω(√n). Our algorithms make novel use of the triangle inequality and carefully designed recursions to save space, which can be of independent interest.
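
For orientation, the classic quadratic-time, linear-space dynamic program for edit distance is the exact baseline that these streaming and small-space approximation algorithms trade accuracy against; the sketch below is illustrative only.

# Classic O(n * m)-time, O(min(n, m))-space dynamic program for edit distance.
# This exact offline baseline is what the streaming and polylog-space approximation
# algorithms in the paper trade accuracy against.
def edit_distance(a, b):
    if len(a) < len(b):
        a, b = b, a                           # keep the DP row as short as possible
    prev = list(range(len(b) + 1))            # distances from the empty prefix of a
    for i, ca in enumerate(a, start=1):
        cur = [i] + [0] * len(b)
        for j, cb in enumerate(b, start=1):
            cur[j] = min(prev[j] + 1,                  # delete ca
                         cur[j - 1] + 1,               # insert cb
                         prev[j - 1] + (ca != cb))     # substitute (or match)
        prev = cur
    return prev[len(b)]

print(edit_distance("kitten", "sitting"))  # -> 3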

Cite as

Kuan Cheng, Alireza Farhadi, MohammadTaghi Hajiaghayi, Zhengzhong Jin, Xin Li, Aviad Rubinstein, Saeed Seddighin, and Yu Zheng. Streaming and Small Space Approximation Algorithms for Edit Distance and Longest Common Subsequence. In 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 198, pp. 54:1-54:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{cheng_et_al:LIPIcs.ICALP.2021.54,
  author =	{Cheng, Kuan and Farhadi, Alireza and Hajiaghayi, MohammadTaghi and Jin, Zhengzhong and Li, Xin and Rubinstein, Aviad and Seddighin, Saeed and Zheng, Yu},
  title =	{{Streaming and Small Space Approximation Algorithms for Edit Distance and Longest Common Subsequence}},
  booktitle =	{48th International Colloquium on Automata, Languages, and Programming (ICALP 2021)},
  pages =	{54:1--54:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-195-5},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{198},
  editor =	{Bansal, Nikhil and Merelli, Emanuela and Worrell, James},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2021.54},
  URN =		{urn:nbn:de:0030-drops-141236},
  doi =		{10.4230/LIPIcs.ICALP.2021.54},
  annote =	{Keywords: Edit Distance, Longest Common Subsequence, Longest Increasing Subsequence, Space Efficient Algorithm, Approximation Algorithm}
}
Document
Greedy Algorithms for Online Survivable Network Design

Authors: Sina Dehghani, Soheil Ehsani, MohammadTaghi Hajiaghayi, Vahid Liaghat, and Saeed Seddighin

Published in: LIPIcs, Volume 107, 45th International Colloquium on Automata, Languages, and Programming (ICALP 2018)


Abstract
In an instance of the network design problem, we are given a graph G=(V,E), an edge-cost function c: E → ℝ^{≥0}, and a connectivity criterion. The goal is to find a minimum-cost subgraph H of G that meets the connectivity requirements. An important problem in this class is the survivable network design problem (SNDP): given non-negative integers r_{uv} for each pair u,v ∈ V, the solution subgraph H should contain r_{uv} edge-disjoint paths between each pair u and v. While this problem is known to admit good approximation algorithms in the offline case, it is much harder in the online setting. Gupta, Krishnaswamy, and Ravi [Gupta et al., 2012] (STOC'09) were the first to consider the online survivable network design problem. They gave an algorithm with a competitive ratio of O(k log^3 n), where k = max_{u,v} r_{uv}. Note that the competitive ratio of the algorithm by Gupta et al. grows linearly in k. Since then, an important open problem in the online community [Naor et al., 2011; Gupta et al., 2012] has been whether the linear dependence on k can be reduced to a logarithmic one. Consider an online greedy algorithm that connects every demand by adding a minimum-cost set of edges to H. Surprisingly, we show that this greedy algorithm significantly improves the competitive ratio when a congestion of 2 is allowed on the edges or when the model is stochastic. While our algorithm is fairly simple, our analysis requires a deep understanding of k-connected graphs. In particular, we prove that the greedy algorithm is O(log^2 n log k)-competitive if one satisfies every demand between u and v by r_{uv}/2 edge-disjoint paths. The spirit of our result is similar to the work of Chuzhoy and Li [Chuzhoy and Li, 2012] (FOCS'12), in which the authors give a polylogarithmic approximation algorithm for edge-disjoint paths with congestion 2. Moreover, we study the greedy algorithm in the online stochastic setting. We consider the i.i.d. model, where each online demand is drawn from a single probability distribution; the unknown i.i.d. model, where every demand is drawn from a single but unknown probability distribution; and the prophet model, in which online demands are drawn from (possibly) different probability distributions. Through a different analysis, we prove that a similar greedy algorithm is constant-competitive for the i.i.d. and prophet models. Also, the greedy algorithm is O(log n)-competitive for the unknown i.i.d. model, which is almost tight due to the lower bound of [Garg et al., 2008] for single connectivity.
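
To make the greedy template concrete in its simplest special case (single connectivity, r_{uv} = 1, undirected graphs), the sketch below buys a cheapest path for each arriving demand while treating already-bought edges as free. This is only an illustration under those simplifying assumptions, not the paper's algorithm or analysis for higher connectivity; names and the example instance are made up.

import heapq

# Online greedy for Steiner forest with unit connectivity requirements (r_uv = 1):
# when a demand (s, t) arrives, buy a cheapest s-t path in which already-bought
# edges cost 0. This illustrates only the greedy template; the paper's results
# concern general r_uv with congestion 2 and stochastic arrival models.
def greedy_online_steiner_forest(n, edges, demands):
    """n: number of vertices; edges: list of (u, v, cost); demands: online sequence of (s, t).
    Assumes each demand pair is connected in the graph. Returns the set of bought edge indices."""
    adj = [[] for _ in range(n)]
    for idx, (u, v, c) in enumerate(edges):
        adj[u].append((v, idx))
        adj[v].append((u, idx))
    bought = set()
    for s, t in demands:
        # Dijkstra with bought edges discounted to cost 0.
        dist = [float("inf")] * n
        via = [None] * n                     # via[v] = (previous vertex, edge index) on a cheapest path
        dist[s] = 0
        heap = [(0, s)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist[u]:
                continue
            for v, idx in adj[u]:
                w = 0 if idx in bought else edges[idx][2]
                if d + w < dist[v]:
                    dist[v], via[v] = d + w, (u, idx)
                    heapq.heappush(heap, (d + w, v))
        node = t
        while node != s:                     # buy every edge on the chosen s-t path
            prev, idx = via[node]
            bought.add(idx)
            node = prev
    return bought

# Example: path 0-1-2-3 with costs 1, 5, 1, plus a shortcut edge (0, 3) of cost 4.
edges = [(0, 1, 1), (1, 2, 5), (2, 3, 1), (0, 3, 4)]
print(sorted(greedy_online_steiner_forest(4, edges, [(0, 3), (1, 2)])))  # -> [0, 2, 3]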

Cite as

Sina Dehghani, Soheil Ehsani, MohammadTaghi Hajiaghayi, Vahid Liaghat, and Saeed Seddighin. Greedy Algorithms for Online Survivable Network Design. In 45th International Colloquium on Automata, Languages, and Programming (ICALP 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 107, pp. 152:1-152:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{dehghani_et_al:LIPIcs.ICALP.2018.152,
  author =	{Dehghani, Sina and Ehsani, Soheil and Hajiaghayi, MohammadTaghi and Liaghat, Vahid and Seddighin, Saeed},
  title =	{{Greedy Algorithms for Online Survivable Network Design}},
  booktitle =	{45th International Colloquium on Automata, Languages, and Programming (ICALP 2018)},
  pages =	{152:1--152:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-076-7},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{107},
  editor =	{Chatzigiannakis, Ioannis and Kaklamanis, Christos and Marx, D\'{a}niel and Sannella, Donald},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2018.152},
  URN =		{urn:nbn:de:0030-drops-91569},
  doi =		{10.4230/LIPIcs.ICALP.2018.152},
  annote =	{Keywords: survivable network design, online, greedy}
}
Document
Stochastic k-Server: How Should Uber Work?

Authors: Sina Dehghani, Soheil Ehsani, MohammadTaghi Hajiaghayi, Vahid Liaghat, and Saeed Seddighin

Published in: LIPIcs, Volume 80, 44th International Colloquium on Automata, Languages, and Programming (ICALP 2017)


Abstract
In this paper we study a stochastic variant of the celebrated k-server problem. In the k-server problem, we are required to minimize the total movement of k servers that are serving an online sequence of t requests in a metric. In the stochastic setting we are given t independent distributions <P_1, P_2, ..., P_t> in advance, and at every time step i a request is drawn from P_i. Designing the optimal online algorithm in such a setting is NP-hard; therefore, the emphasis of our work is on designing an approximately optimal online algorithm. We first show a structural characterization for a certain class of non-adaptive online algorithms. We prove that in general metrics, the best such algorithm has a cost of no worse than three times that of the optimal online algorithm. Next, we present an integer program that finds the optimal algorithm of this class for any arbitrary metric. Finally, by rounding the solution of the linear relaxation of this program, we present an online algorithm for the stochastic k-server problem with an approximation factor of 3 in the line and circle metrics and a factor of O(log n) in general metrics. In this way, we achieve an approximation factor that is independent of k, the number of servers. Moreover, we define the Uber problem, motivated by the extraordinary growth of online network transportation services. In the Uber problem, each demand consists of two points, a source and a destination, in the metric. Serving a demand means moving a server to its source and then to its destination. The objective is again to minimize the total movement of the k given servers. It is not hard to show that, given an α-approximation algorithm for the k-server problem, we can obtain a max{3,α}-approximation algorithm for the Uber problem. Motivated by the fact that demands are usually highly correlated with time (e.g., the day of the week or the time of day at which the demand arrives), we study the stochastic Uber problem. Using our results for the stochastic k-server problem, we obtain a 3-approximation algorithm for the stochastic Uber problem in line and circle metrics, and an O(log n)-approximation algorithm for a general metric of size n. Furthermore, we extend our results to the correlated setting, where the probability of a request arriving at a certain point depends not only on the time step but also on the previously arrived requests.
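
Purely to fix the setting, the toy simulator below runs the naive nearest-server heuristic on the line metric with requests drawn from known per-step distributions. This heuristic is not the paper's LP-based algorithm and is known to be weak in the worst case; the distributions, seed, and names are invented for illustration.

import random

# Toy simulation of the k-server setting on the line metric with the naive
# "move the nearest server" heuristic. This is NOT the paper's algorithm (which
# rounds an LP over non-adaptive policies using the known distributions P_1, ..., P_t);
# it only illustrates the setting: requests arrive online and servers pay movement cost.
def nearest_server_cost(server_positions, requests):
    servers = list(server_positions)
    total_movement = 0.0
    for r in requests:
        i = min(range(len(servers)), key=lambda j: abs(servers[j] - r))
        total_movement += abs(servers[i] - r)
        servers[i] = r                        # move the chosen server to the request
    return total_movement

# Stochastic instance: each request is drawn from its own known distribution P_i.
random.seed(0)
distributions = [lambda: random.gauss(0, 1), lambda: random.gauss(10, 1), lambda: random.gauss(0, 1)]
requests = [draw() for draw in distributions]
print(nearest_server_cost([0.0, 10.0], requests))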

Cite as

Sina Dehghani, Soheil Ehsani, MohammadTaghi Hajiaghayi, Vahid Liaghat, and Saeed Seddighin. Stochastic k-Server: How Should Uber Work?. In 44th International Colloquium on Automata, Languages, and Programming (ICALP 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 80, pp. 126:1-126:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{dehghani_et_al:LIPIcs.ICALP.2017.126,
  author =	{Dehghani, Sina and Ehsani, Soheil and Hajiaghayi, MohammadTaghi and Liaghat, Vahid and Seddighin, Saeed},
  title =	{{Stochastic k-Server: How Should Uber Work?}},
  booktitle =	{44th International Colloquium on Automata, Languages, and Programming (ICALP 2017)},
  pages =	{126:1--126:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-041-5},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{80},
  editor =	{Chatzigiannakis, Ioannis and Indyk, Piotr and Kuhn, Fabian and Muscholl, Anca},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2017.126},
  URN =		{urn:nbn:de:0030-drops-74806},
  doi =		{10.4230/LIPIcs.ICALP.2017.126},
  annote =	{Keywords: k-server, stochastic, competitive ratio, online algorithm, Uber}
}
Document
Price of Competition and Dueling Games

Authors: Sina Dehghani, Mohammad Taghi Hajiaghayi, Hamid Mahini, and Saeed Seddighin

Published in: LIPIcs, Volume 55, 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)


Abstract
We study competition in a general framework introduced by Immorlica, Kalai, Lucier, Moitra, Postlewaite, and Tennenholtz, and we answer their main open question. Immorlica et al. considered classic optimization problems in terms of competition and introduced a general class of games called dueling games. They model this competition as a zero-sum game, where two players are competing for a user’s satisfaction. In their main and most natural game, the ranking duel, a user requests a webpage by submitting a query, and players output an ordering over all possible webpages based on the submitted query. The user tends to choose the ordering which displays her requested webpage in a higher rank. The goal of both players is to maximize the probability that her ordering beats that of her opponent and gets the user's attention. Immorlica et al. show that this game directs both players to provide suboptimal search results. However, they leave the following as their main open question: "does competition between algorithms improve or degrade expected performance?" (see the introduction for more quotes) In this paper, we resolve this question for the ranking duel and a more general class of dueling games. More precisely, we study the quality of orderings in a competition between two players. This game is a zero-sum game, and thus any Nash equilibrium of the game can be described by minimax strategies. Let the value of the user for an ordering be a function of the position of her requested item in the corresponding ordering, and let the social welfare for an ordering be the expected value of the corresponding ordering for the user. We propose the price of competition, which is the ratio of the social welfare for the worst minimax strategy to the social welfare obtained by a social planner. Finding the price of competition is another approach to obtaining structural results about Nash equilibria. We use this criterion for analyzing the quality of orderings in the ranking duel. Although Immorlica et al. show that the competition leads to suboptimal strategies, we prove that the quality of minimax results is surprisingly close to that of the optimum solution. In particular, via a novel factor-revealing LP for computing the price of anarchy, we prove that if the value of the user for an ordering is a linear function of its position, then the price of competition is at least 0.612 and bounded above by 0.833. Moreover, we consider the cost minimization version of the problem. We prove that the social cost of the worst minimax strategy is at most 3 times the optimal social cost. Last but not least, we go beyond linear valuation functions and capture the main challenge for bounding the price of competition for any arbitrary valuation function. We present a principle which states that the lower bound on the price of competition for all 0-1 valuation functions is the same as the lower bound on the price of competition for all possible valuation functions. It is worth mentioning that this principle works not only for the ranking duel but also for all dueling games. This principle says that, in any dueling game, the most challenging part of bounding the price of competition is finding a lower bound for 0-1 valuation functions. We leverage this principle to show that the price of competition is at least 0.25 for the generalized ranking duel.
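
Since the analysis revolves around minimax strategies of a zero-sum game, it may help to recall how such strategies are computed in general. The snippet below solves a small zero-sum matrix game by a generic linear program using scipy; this is the textbook formulation, not the paper's factor-revealing LP, and all names are illustrative.

import numpy as np
from scipy.optimize import linprog

# Minimax (optimal mixed) strategy of the row player in a zero-sum matrix game,
# computed by the textbook LP: maximize v subject to x^T A >= v on every column and
# x being a probability vector. A generic solver, not the paper's factor-revealing LP.
def minimax_strategy(A):
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    c = np.zeros(m + 1)                      # variables: x_1, ..., x_m, v; minimize -v
    c[-1] = -1.0
    A_ub = np.hstack([-A.T, np.ones((n, 1))])               # v - sum_i A[i, j] * x_i <= 0 for every column j
    b_ub = np.zeros(n)
    A_eq = np.hstack([np.ones((1, m)), np.zeros((1, 1))])   # sum_i x_i = 1
    b_eq = np.array([1.0])
    bounds = [(0, None)] * m + [(None, None)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return res.x[:m], res.x[-1]

# Matching pennies: the optimal mixed strategy is uniform and the game value is 0.
strategy, value = minimax_strategy([[1, -1], [-1, 1]])
print(strategy, round(value, 6))  # -> approximately [0.5 0.5] 0.0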

Cite as

Sina Dehghani, Mohammad Taghi Hajiaghayi, Hamid Mahini, and Saeed Seddighin. Price of Competition and Dueling Games. In 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 55, pp. 21:1-21:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{dehghani_et_al:LIPIcs.ICALP.2016.21,
  author =	{Dehghani, Sina and Hajiaghayi, Mohammad Taghi and Mahini, Hamid and Seddighin, Saeed},
  title =	{{Price of Competition and Dueling Games}},
  booktitle =	{43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)},
  pages =	{21:1--21:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-013-2},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{55},
  editor =	{Chatzigiannakis, Ioannis and Mitzenmacher, Michael and Rabani, Yuval and Sangiorgi, Davide},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2016.21},
  URN =		{urn:nbn:de:0030-drops-63009},
  doi =		{10.4230/LIPIcs.ICALP.2016.21},
  annote =	{Keywords: POC, POA, Dueling games, Nash equilibria, sponsored search}
}
Document
Online Weighted Degree-Bounded Steiner Networks via Novel Online Mixed Packing/Covering

Authors: Sina Dehghani, Soheil Ehsani, Mohammad Taghi Hajiaghayi, Vahid Liaghat, Harald Räcke, and Saeed Seddighin

Published in: LIPIcs, Volume 55, 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)


Abstract
We design the first online algorithm with a poly-logarithmic competitive ratio for the edge-weighted degree-bounded Steiner forest (EW-DB-SF) problem and its generalized variant. We obtain our result by demonstrating a new generic approach for solving mixed packing/covering integer programs in the online paradigm. In EW-DB-SF, we are given an edge-weighted graph with a degree bound for every vertex. Given a root vertex in advance, we receive a sequence of terminal vertices in an online manner. Upon the arrival of a terminal, we need to augment our solution subgraph to connect the new terminal to the root. The goal is to minimize the total weight of the solution while respecting the degree bounds on the vertices. In the offline setting, edge-weighted degree-bounded Steiner tree (EW-DB-ST) and its many variations have been extensively studied since the early eighties. Unfortunately, the recent advancements in online network design problems are inherently difficult to adapt to degree-bounded problems. In particular, it is not known whether the fractional solution obtained by standard primal-dual techniques for mixed packing/covering LPs can be rounded online. In contrast, in this paper we obtain our result by using structural properties of the optimal solution and by reducing the EW-DB-SF problem to an exponential-size mixed packing/covering integer program in which every variable appears only once in covering constraints. We then design a generic integral algorithm for solving this restricted family of IPs. As mentioned above, we demonstrate a new technique for solving mixed packing/covering integer programs. Define the covering frequency k of a program as the maximum number of covering constraints in which a variable can participate. Let m denote the number of packing constraints. We design an online deterministic integral algorithm with a competitive ratio of O(k log m) for mixed packing/covering integer programs. We prove the tightness of our result by providing a matching lower bound for any randomized algorithm. We note that our solution depends solely on m and k; indeed, there can be exponentially many variables. Furthermore, our algorithm directly provides an integral solution, even if the integrality gap of the program is unbounded. We believe this technique can be used as an interesting alternative to the standard primal-dual techniques for solving online problems.

Cite as

Sina Dehghani, Soheil Ehsani, Mohammad Taghi Hajiaghayi, Vahid Liaghat, Harald Räcke, and Saeed Seddighin. Online Weighted Degree-Bounded Steiner Networks via Novel Online Mixed Packing/Covering. In 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 55, pp. 42:1-42:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{dehghani_et_al:LIPIcs.ICALP.2016.42,
  author =	{Dehghani, Sina and Ehsani, Soheil and Hajiaghayi, Mohammad Taghi and Liaghat, Vahid and R\"{a}cke, Harald and Seddighin, Saeed},
  title =	{{Online Weighted Degree-Bounded Steiner Networks via Novel Online Mixed Packing/Covering}},
  booktitle =	{43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)},
  pages =	{42:1--42:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-013-2},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{55},
  editor =	{Chatzigiannakis, Ioannis and Mitzenmacher, Michael and Rabani, Yuval and Sangiorgi, Davide},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2016.42},
  URN =		{urn:nbn:de:0030-drops-63221},
  doi =		{10.4230/LIPIcs.ICALP.2016.42},
  annote =	{Keywords: Online, Steiner Tree, Approximation, Competitive ratio}
}
  • Refine by Author
  • 8 Seddighin, Saeed
  • 4 Dehghani, Sina
  • 3 Ehsani, Soheil
  • 3 Hajiaghayi, MohammadTaghi
  • 3 Liaghat, Vahid
  • Show More...

  • Refine by Classification
  • 4 Theory of computation → Design and analysis of algorithms
  • 1 Mathematics of computing → Combinatorial optimization
  • 1 Theory of computation → Caching and paging algorithms
  • 1 Theory of computation → Computational geometry
  • 1 Theory of computation → Discrete optimization
  • Show More...

  • Refine by Keyword
  • 2 Approximation
  • 1 0-1-Knapsack problem
  • 1 Approximation Algorithm
  • 1 Caching
  • 1 Center
  • Show More...

  • Refine by Type
  • 14 document

  • Refine by Publication Year
  • 6 2024
  • 2 2016
  • 2 2022
  • 1 2017
  • 1 2018
  • Show More...
