5 Search Results for "Jiang, Haotian"


Document
Track A: Algorithms, Complexity and Games
Smoothed Analysis of the Komlós Conjecture

Authors: Nikhil Bansal, Haotian Jiang, Raghu Meka, Sahil Singla, and Makrand Sinha

Published in: LIPIcs, Volume 229, 49th International Colloquium on Automata, Languages, and Programming (ICALP 2022)


Abstract
The well-known Komlós conjecture states that given n vectors in ℝ^d with Euclidean norm at most one, there always exists a ±1 coloring such that the 𝓁_∞ norm of the signed-sum vector is a constant independent of n and d. We prove this conjecture in a smoothed analysis setting where the vectors are perturbed by adding small Gaussian noise and the number of vectors is n = ω(d log d). The dependence of n on d is the best possible even in a completely random setting. Our proof relies on a weighted second moment method: instead of considering uniformly random colorings, we apply the second moment method to an implicit distribution on colorings obtained by applying the Gram-Schmidt walk algorithm to a suitable set of vectors. The main technical idea is to use various properties of these colorings, including subgaussianity, to control the second moment.
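
For concreteness, the statement in the abstract can be written as a display (the symbols v_i, ε, and K below are labels introduced here for readability, not notation taken from the paper): given vectors v_1, …, v_n ∈ ℝ^d with ‖v_i‖_2 ≤ 1, the conjecture asserts that there are signs ε ∈ {-1,+1}^n with

\[
\Big\| \sum_{i=1}^{n} \varepsilon_i v_i \Big\|_{\infty} \;\le\; K,
\]

where K is an absolute constant independent of n and d. The smoothed-analysis result establishes this when the v_i are perturbed by small Gaussian noise and n = ω(d log d).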

Cite as

Nikhil Bansal, Haotian Jiang, Raghu Meka, Sahil Singla, and Makrand Sinha. Smoothed Analysis of the Komlós Conjecture. In 49th International Colloquium on Automata, Languages, and Programming (ICALP 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 229, pp. 14:1-14:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{bansal_et_al:LIPIcs.ICALP.2022.14,
  author =	{Bansal, Nikhil and Jiang, Haotian and Meka, Raghu and Singla, Sahil and Sinha, Makrand},
  title =	{{Smoothed Analysis of the Koml\'{o}s Conjecture}},
  booktitle =	{49th International Colloquium on Automata, Languages, and Programming (ICALP 2022)},
  pages =	{14:1--14:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-235-8},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{229},
  editor =	{Boja\'{n}czyk, Miko{\l}aj and Merelli, Emanuela and Woodruff, David P.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2022.14},
  URN =		{urn:nbn:de:0030-drops-163556},
  doi =		{10.4230/LIPIcs.ICALP.2022.14},
  annote =	{Keywords: Koml\'{o}s conjecture, smoothed analysis, weighted second moment method, subgaussian coloring}
}
Document
Prefix Discrepancy, Smoothed Analysis, and Combinatorial Vector Balancing

Authors: Nikhil Bansal, Haotian Jiang, Raghu Meka, Sahil Singla, and Makrand Sinha

Published in: LIPIcs, Volume 215, 13th Innovations in Theoretical Computer Science Conference (ITCS 2022)


Abstract
A well-known result of Banaszczyk in discrepancy theory concerns the prefix discrepancy problem (also known as the signed series problem): given a sequence of T unit vectors in ℝ^d, find ± signs for each of them such that the signed sum vector along every prefix has a small 𝓁_∞-norm. This problem is central to proving upper bounds for the Steinitz problem, and the popular Komlós problem is a special case where one is only concerned with the final signed sum vector instead of all prefixes. Banaszczyk gave an O(√{log d + log T}) bound for the prefix discrepancy problem. We investigate the tightness of Banaszczyk’s bound and consider natural generalizations of prefix discrepancy:
- We first consider a smoothed analysis setting, where a small amount of additive noise perturbs the input vectors. We show an exponential improvement in T compared to Banaszczyk’s bound. Using a primal-dual approach and a careful chaining argument, we show that one can achieve a bound of O(√{log d + log log T}) with high probability in the smoothed setting. Moreover, this smoothed analysis bound is the best possible without further improvement on Banaszczyk’s bound in the worst case.
- We also introduce a generalization of the prefix discrepancy problem to arbitrary DAGs. Here, vertices correspond to unit vectors, and the discrepancy constraints correspond to paths of a DAG on T vertices; prefix discrepancy is precisely captured when the DAG is a simple path. We show that an analog of Banaszczyk’s O(√{log d + log T}) bound continues to hold in this setting for adversarially given unit vectors, and that the √{log T} factor is unavoidable for DAGs. We also show that, unlike for prefix discrepancy, the dependence on T cannot be improved significantly in the smoothed case for DAGs.
- We conclude by exploring a more general notion of vector balancing, which we call combinatorial vector balancing. In this problem, the discrepancy constraints are generalized from paths of a DAG to an arbitrary set system. We obtain near-optimal bounds in this setting, up to poly-logarithmic factors.
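
In the notation of the abstract (the symbols v_t and ε below are labels introduced here, not taken from the paper), Banaszczyk’s prefix discrepancy bound for unit vectors v_1, …, v_T ∈ ℝ^d reads

\[
\min_{\varepsilon \in \{-1,+1\}^{T}} \; \max_{1 \le k \le T} \Big\| \sum_{t=1}^{k} \varepsilon_t v_t \Big\|_{\infty} \;=\; O\big(\sqrt{\log d + \log T}\,\big),
\]

and the smoothed-analysis result above replaces the log T term by log log T.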

Cite as

Nikhil Bansal, Haotian Jiang, Raghu Meka, Sahil Singla, and Makrand Sinha. Prefix Discrepancy, Smoothed Analysis, and Combinatorial Vector Balancing. In 13th Innovations in Theoretical Computer Science Conference (ITCS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 215, pp. 13:1-13:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{bansal_et_al:LIPIcs.ITCS.2022.13,
  author =	{Bansal, Nikhil and Jiang, Haotian and Meka, Raghu and Singla, Sahil and Sinha, Makrand},
  title =	{{Prefix Discrepancy, Smoothed Analysis, and Combinatorial Vector Balancing}},
  booktitle =	{13th Innovations in Theoretical Computer Science Conference (ITCS 2022)},
  pages =	{13:1--13:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-217-4},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{215},
  editor =	{Braverman, Mark},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2022.13},
  URN =		{urn:nbn:de:0030-drops-156092},
  doi =		{10.4230/LIPIcs.ITCS.2022.13},
  annote =	{Keywords: Prefix discrepancy, smoothed analysis, combinatorial vector balancing}
}
Document
Invited Talk
Convex Optimization and Dynamic Data Structure (Invited Talk)

Authors: Yin Tat Lee

Published in: LIPIcs, Volume 182, 40th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2020)


Abstract
In the last three years, there have been many breakthroughs in optimization, such as nearly quadratic time algorithms for bipartite matching and linear programming algorithms that are as fast as solving Ax = b. All of these algorithms are based on a careful combination of optimization techniques and dynamic data structures. In this talk, we will explain the framework underlying all of these recent breakthroughs. Joint work with Jan van den Brand, Michael B. Cohen, Sally Dong, Haotian Jiang, Tarun Kathuria, Danupon Nanongkai, Swati Padmanabhan, Richard Peng, Thatchaphol Saranurak, Aaron Sidford, Zhao Song, Di Wang, Sam Chiu-wai Wong, Guanghao Ye, and Qiuyi Zhang.

Cite as

Yin Tat Lee. Convex Optimization and Dynamic Data Structure (Invited Talk). In 40th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 182, p. 3:1, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{lee:LIPIcs.FSTTCS.2020.3,
  author =	{Lee, Yin Tat},
  title =	{{Convex Optimization and Dynamic Data Structure}},
  booktitle =	{40th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2020)},
  pages =	{3:1--3:1},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-174-0},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{182},
  editor =	{Saxena, Nitin and Simon, Sunil},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2020.3},
  URN =		{urn:nbn:de:0030-drops-132440},
  doi =		{10.4230/LIPIcs.FSTTCS.2020.3},
  annote =	{Keywords: Convex Optimization, Dynamic Data Structure}
}
Document
Algorithms and Adaptivity Gaps for Stochastic k-TSP

Authors: Haotian Jiang, Jian Li, Daogao Liu, and Sahil Singla

Published in: LIPIcs, Volume 151, 11th Innovations in Theoretical Computer Science Conference (ITCS 2020)


Abstract
Given a metric (V,d) and a root ∈ V, the classic k-TSP problem is to find a tour originating at the root of minimum length that visits at least k nodes in V. In this work, motivated by applications where the input to an optimization problem is uncertain, we study two stochastic versions of k-TSP. In Stoch-Reward k-TSP, originally defined by Ene-Nagarajan-Saket [Ene et al., 2018], each vertex v in the given metric (V,d) contains a stochastic reward R_v. The goal is to adaptively find a tour of minimum expected length that collects at least reward k; here "adaptively" means our next decision may depend on previous outcomes. Ene et al. gave an O(log k)-approximation adaptive algorithm for this problem, and left open whether there is an O(1)-approximation algorithm. We fully resolve their open question, and even give an O(1)-approximation non-adaptive algorithm for Stoch-Reward k-TSP.

We also introduce and obtain similar results for the Stoch-Cost k-TSP problem. In this problem, each vertex v has a stochastic cost C_v, and the goal is to visit and select at least k vertices so as to minimize the expected sum of the tour length and the cost of the selected vertices. Besides being a natural stochastic generalization of k-TSP, this problem is also interesting because it generalizes the Price of Information framework [Singla, 2018] from deterministic probing costs to metric probing costs.

Our techniques are based on two crucial ideas: "repetitions" and "critical scaling". In general, replacing a random variable with its expectation leads to very poor results. We show that for our problems, if we truncate the random variables at an ideal threshold, then their expected values form a good surrogate. Here, we rely on running several repetitions of our algorithm with the same threshold, and then argue concentration using Freedman’s and Jogdeo-Samuels' inequalities. Unfortunately, this ideal threshold depends on how far we are from achieving our target k, which a non-adaptive algorithm does not know. To overcome this barrier, we truncate the random variables at various different scales and identify a "critical" scale.
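
A minimal way to write down the truncation surrogate mentioned above (the threshold symbol θ is a label introduced here, not notation from the paper): each reward R_v is replaced by the expectation of its truncation,

\[
\tilde{R}_v(\theta) \;=\; \mathbb{E}\big[\min(R_v, \theta)\big],
\]

which serves as a good surrogate only when θ is close to the ideal threshold; since that threshold depends on how far the algorithm is from the target k, the non-adaptive algorithm tries several scales of θ and identifies a "critical" one.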

Cite as

Haotian Jiang, Jian Li, Daogao Liu, and Sahil Singla. Algorithms and Adaptivity Gaps for Stochastic k-TSP. In 11th Innovations in Theoretical Computer Science Conference (ITCS 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 151, pp. 45:1-45:25, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{jiang_et_al:LIPIcs.ITCS.2020.45,
  author =	{Jiang, Haotian and Li, Jian and Liu, Daogao and Singla, Sahil},
  title =	{{Algorithms and Adaptivity Gaps for Stochastic k-TSP}},
  booktitle =	{11th Innovations in Theoretical Computer Science Conference (ITCS 2020)},
  pages =	{45:1--45:25},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-134-4},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{151},
  editor =	{Vidick, Thomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2020.45},
  URN =		{urn:nbn:de:0030-drops-117308},
  doi =		{10.4230/LIPIcs.ITCS.2020.45},
  annote =	{Keywords: approximation algorithms, stochastic optimization, travelling salesman problem}
}
Document
A Unified PTAS for Prize Collecting TSP and Steiner Tree Problem in Doubling Metrics

Authors: T-H. Hubert Chan, Haotian Jiang, and Shaofeng H.-C. Jiang

Published in: LIPIcs, Volume 112, 26th Annual European Symposium on Algorithms (ESA 2018)


Abstract
We present a unified (randomized) polynomial-time approximation scheme (PTAS) for the prize collecting traveling salesman problem (PCTSP) and the prize collecting Steiner tree problem (PCSTP) in doubling metrics. Given a metric space and a penalty function on a subset of points known as terminals, a solution is a subgraph on points in the metric space whose cost is the weight of its edges plus the penalty due to terminals not covered by the subgraph. Under our unified framework, the solution subgraph needs to be Eulerian for PCTSP, while it needs to be a tree for PCSTP. Before our work, not even a QPTAS was known for these problems in doubling metrics. Our unified PTAS is based on the previous dynamic programming frameworks proposed in [Talwar STOC 2004] and [Bartal, Gottlieb, Krauthgamer STOC 2012]. However, since it is unknown which part of the optimal cost is due to edge lengths and which part is due to penalties of uncovered terminals, we need to develop new techniques to apply the previous divide-and-conquer strategies and sparse instance decompositions.
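
The objective described in the abstract can be written as follows (the symbols F, w, and π are labels introduced here, not notation from the paper): for a candidate subgraph F on the points of the metric, with edge weights w and terminal penalties π,

\[
\mathrm{cost}(F) \;=\; \sum_{e \in F} w(e) \;+\; \sum_{t \ \text{terminal not covered by}\ F} \pi(t),
\]

where F is required to be Eulerian for PCTSP and a tree for PCSTP.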

Cite as

T-H. Hubert Chan, Haotian Jiang, and Shaofeng H.-C. Jiang. A Unified PTAS for Prize Collecting TSP and Steiner Tree Problem in Doubling Metrics. In 26th Annual European Symposium on Algorithms (ESA 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 112, pp. 15:1-15:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{chan_et_al:LIPIcs.ESA.2018.15,
  author =	{Chan, T-H. Hubert and Jiang, Haotian and Jiang, Shaofeng H.-C.},
  title =	{{A Unified PTAS for Prize Collecting TSP and Steiner Tree Problem in Doubling Metrics}},
  booktitle =	{26th Annual European Symposium on Algorithms (ESA 2018)},
  pages =	{15:1--15:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-081-1},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{112},
  editor =	{Azar, Yossi and Bast, Hannah and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2018.15},
  URN =		{urn:nbn:de:0030-drops-94781},
  doi =		{10.4230/LIPIcs.ESA.2018.15},
  annote =	{Keywords: Doubling Dimension, Traveling Salesman Problem, Polynomial Time Approximation Scheme, Steiner Tree Problem, Prize Collecting}
}
  • Refine by Author
  • 4 Jiang, Haotian
  • 3 Singla, Sahil
  • 2 Bansal, Nikhil
  • 2 Meka, Raghu
  • 2 Sinha, Makrand

  • Refine by Classification
  • 1 Mathematics of computing → Combinatoric problems
  • 1 Mathematics of computing → Mathematical optimization
  • 1 Theory of computation → Approximation algorithms analysis
  • 1 Theory of computation → Randomness, geometry and discrete structures
  • 1 Theory of computation → Routing and network design problems

  • Refine by Keyword
  • 2 smoothed analysis
  • 1 Convex Optimization
  • 1 Doubling Dimension
  • 1 Dynamic Data Structure
  • 1 Komlós conjecture

  • Refine by Type
  • 5 document

  • Refine by Publication Year
  • 2 2020
  • 2 2022
  • 1 2018
