Search Results

Documents authored by Jiang, Shunhua


Document
Track A: Algorithms, Complexity and Games
Acceleration Meets Inverse Maintenance: Faster 𝓁_∞-Regression

Authors: Deeksha Adil, Shunhua Jiang, and Rasmus Kyng

Published in: LIPIcs, Volume 334, 52nd International Colloquium on Automata, Languages, and Programming (ICALP 2025)


Abstract
We propose a randomized multiplicative weight update (MWU) algorithm for 𝓁_{∞} regression that runs in Õ(n^{2+1/22.5} poly(1/ε)) time when ω = 2+o(1), improving upon the previous best Õ(n^{2+1/18} polylog(1/ε)) runtime in the low-accuracy regime. Our algorithm combines state-of-the-art inverse maintenance data structures with acceleration. In order to do so, we propose a novel acceleration scheme for MWU that exhibits stability and robustness, which are required for the efficient implementations of the inverse maintenance data structures. We also design a faster deterministic MWU algorithm that runs in Õ(n^{2+1/12} poly(1/ε)) time when ω = 2+o(1), improving upon the previous best Õ(n^{2+1/6} polylog(1/ε)) runtime in the low-accuracy regime. We achieve this by showing a novel stability result that goes beyond previously known works based on interior point methods (IPMs). Our work is the first to use acceleration and inverse maintenance together efficiently, finally making the two most important building blocks of modern structured convex optimization compatible.
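
For intuition, here is a minimal, unaccelerated MWU loop for 𝓁_∞ regression (min_x ‖Ax − b‖_∞) in Python. This is an illustrative, textbook-style sketch, not the randomized accelerated algorithm of the paper: the step size eta, the iteration count, and the plain weighted least-squares oracle are placeholder choices.

import numpy as np

def linf_regression_mwu(A, b, iters=200, eta=0.5):
    """Illustrative MWU loop for min_x ||Ax - b||_inf (not the paper's algorithm)."""
    m, n = A.shape
    w = np.ones(m) / m                       # multiplicative weights over the m rows
    x_sum = np.zeros(n)
    for _ in range(iters):
        # Oracle step: weighted least squares, x = argmin_x sum_i w_i (a_i^T x - b_i)^2.
        s = np.sqrt(w)
        x, *_ = np.linalg.lstsq(s[:, None] * A, s * b, rcond=None)
        # Hedge-style update: rows with large residuals get more weight,
        # forcing later oracle calls to reduce them.
        r = np.abs(A @ x - b)
        w *= np.exp(eta * r / (r.max() + 1e-12))
        w /= w.sum()
        x_sum += x
    return x_sum / iters                     # averaged iterate

# Tiny usage example on random data.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 5))
b = rng.standard_normal(50)
x = linf_regression_mwu(A, b)
print(np.max(np.abs(A @ x - b)))

The inverse-maintenance aspect the abstract refers to would replace the repeated least-squares solves above with a data structure that maintains the relevant matrix inverse as the weights change slowly; none of that is reflected in this sketch.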

Cite as

Deeksha Adil, Shunhua Jiang, and Rasmus Kyng. Acceleration Meets Inverse Maintenance: Faster 𝓁_∞-Regression. In 52nd International Colloquium on Automata, Languages, and Programming (ICALP 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 334, pp. 5:1-5:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{adil_et_al:LIPIcs.ICALP.2025.5,
  author =	{Adil, Deeksha and Jiang, Shunhua and Kyng, Rasmus},
  title =	{{Acceleration Meets Inverse Maintenance: Faster 𝓁\underline∞-Regression}},
  booktitle =	{52nd International Colloquium on Automata, Languages, and Programming (ICALP 2025)},
  pages =	{5:1--5:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-372-0},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{334},
  editor =	{Censor-Hillel, Keren and Grandoni, Fabrizio and Ouaknine, Jo\"{e}l and Puppis, Gabriele},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2025.5},
  URN =		{urn:nbn:de:0030-drops-233823},
  doi =		{10.4230/LIPIcs.ICALP.2025.5},
  annote =	{Keywords: Regression, Inverse Maintenance, Multiplicative Weights Update}
}
Document
Hardness Amplification for Dynamic Binary Search Trees

Authors: Shunhua Jiang, Victor Lecomte, Omri Weinstein, and Sorrachai Yingchareonthawornchai

Published in: LIPIcs, Volume 322, 35th International Symposium on Algorithms and Computation (ISAAC 2024)


Abstract
We prove direct-sum theorems for Wilber’s two lower bounds [Wilber, FOCS'86] on the cost of access sequences in the binary search tree (BST) model. These bounds are central to the question of dynamic optimality [Sleator and Tarjan, JACM'85]: the Alternation bound is the only bound to have yielded online BST algorithms beating the log n competitive ratio, while the Funnel bound has repeatedly been conjectured to exactly characterize the cost of executing an access sequence using the optimal tree [Wilber, FOCS'86, Kozma'16], and has been explicitly linked to splay trees [Levy and Tarjan, SODA'19]. Previously, the direct-sum theorem for the Alternation bound was known only when approximation was allowed [Chalermsook, Chuzhoy and Saranurak, APPROX'20, ToC'24]. We use these direct-sum theorems to amplify the sequences from [Lecomte and Weinstein, ESA'20] that separate Wilber’s Alternation and Funnel bounds, increasing the Alternation and Funnel bounds while optimally maintaining the separation. As a corollary, we show that Tango trees [Demaine et al., FOCS'04] are optimal among any BST algorithms that charge their costs to the Alternation bound. This is true for any value of the Alternation bound, even values for which Tango trees achieve a competitive ratio of o(log log n) instead of the default O(log log n). Previously, the optimality of Tango trees was shown only for a limited range of the Alternation bound [Lecomte and Weinstein, ESA'20].
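
For concreteness, the following small Python sketch computes one common version of Wilber's Alternation (interleave) bound for an access sequence, using a fixed balanced reference tree over the key universe. It is illustrative only and not taken from the paper; in particular, the convention used here (the root of each reference subtree counted with its left side, no additive terms) is just one of several that appear in the literature.

def alternation_bound(universe, accesses):
    """Sum, over nodes of a balanced reference tree, of the number of times the
    accesses inside that node's subtree switch between its left and right side."""
    keys = sorted(set(universe))

    def count(lo, hi):                       # reference subtree over keys[lo:hi] (half-open)
        if hi - lo <= 1:
            return 0
        mid = (lo + hi) // 2                 # root of this reference subtree
        alts, last = 0, None
        for x in accesses:
            if keys[lo] <= x <= keys[hi - 1]:
                side = 'L' if x <= keys[mid] else 'R'   # root counted with the left side
                if last is not None and side != last:
                    alts += 1
                last = side
        return alts + count(lo, mid) + count(mid + 1, hi)

    return count(0, len(keys))

# Example: a bit-reversal-style sequence on 8 keys, which keeps alternating
# between the two halves at every level of the reference tree.
print(alternation_bound(range(8), [0, 4, 2, 6, 1, 5, 3, 7]))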

Cite as

Shunhua Jiang, Victor Lecomte, Omri Weinstein, and Sorrachai Yingchareonthawornchai. Hardness Amplification for Dynamic Binary Search Trees. In 35th International Symposium on Algorithms and Computation (ISAAC 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 322, pp. 42:1-42:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{jiang_et_al:LIPIcs.ISAAC.2024.42,
  author =	{Jiang, Shunhua and Lecomte, Victor and Weinstein, Omri and Yingchareonthawornchai, Sorrachai},
  title =	{{Hardness Amplification for Dynamic Binary Search Trees}},
  booktitle =	{35th International Symposium on Algorithms and Computation (ISAAC 2024)},
  pages =	{42:1--42:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-354-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{322},
  editor =	{Mestre, Juli\'{a}n and Wirth, Anthony},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2024.42},
  URN =		{urn:nbn:de:0030-drops-221696},
  doi =		{10.4230/LIPIcs.ISAAC.2024.42},
  annote =	{Keywords: Data Structures, Amortized Analysis}
}
Document
Track A: Algorithms, Complexity and Games
A Faster Interior-Point Method for Sum-Of-Squares Optimization

Authors: Shunhua Jiang, Bento Natura, and Omri Weinstein

Published in: LIPIcs, Volume 229, 49th International Colloquium on Automata, Languages, and Programming (ICALP 2022)


Abstract
We present a faster interior-point method for optimizing sum-of-squares (SOS) polynomials, which are a central tool in polynomial optimization and capture convex programming in the Lasserre hierarchy. Let p = ∑_i q_i² be an n-variate SOS polynomial of degree 2d. Denoting by L := binom(n+d,d) and U := binom(n+2d,2d) the dimensions of the vector spaces in which the q_i’s and p live respectively, our algorithm runs in time Õ(LU^{1.87}). This is polynomially faster than state-of-the-art SOS and semidefinite programming solvers [Jiang et al., 2020; Huang et al., 2021; Papp and Yildiz, 2019], which achieve runtime Õ(L^{0.5} min{U^{2.37}, L^{4.24}}). The centerpiece of our algorithm is a dynamic data structure for maintaining the inverse of the Hessian of the SOS barrier function under the polynomial interpolant basis [Papp and Yildiz, 2019], which efficiently extends to multivariate SOS optimization, and requires maintaining spectral approximations to low-rank perturbations of elementwise (Hadamard) products. This is the main challenge and departure from recent IPM breakthroughs using inverse-maintenance, where low-rank updates to the slack matrix readily imply the same for the Hessian matrix.
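
For a sense of scale, the following small Python snippet (illustrative only, not from the paper) evaluates the dimensions L = binom(n+d, d) and U = binom(n+2d, 2d) defined in the abstract and compares the stated runtime exponents, Õ(L·U^{1.87}) versus the previous Õ(L^{0.5}·min{U^{2.37}, L^{4.24}}), with all Õ(·) factors ignored.

from math import comb

def sos_dimensions(n, d):
    L = comb(n + d, d)           # dimension of the space the q_i live in (degree <= d)
    U = comb(n + 2 * d, 2 * d)   # dimension of the space p lives in (degree <= 2d)
    return L, U

for n, d in [(5, 2), (10, 2), (10, 3)]:
    L, U = sos_dimensions(n, d)
    new = L * U ** 1.87                           # this paper, ignoring Õ(·) factors
    prev = L ** 0.5 * min(U ** 2.37, L ** 4.24)   # prior SOS/SDP solvers, ignoring Õ(·) factors
    print(f"n={n}, d={d}: L={L}, U={U}, new ~ {new:.2e}, previous ~ {prev:.2e}")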

Cite as

Shunhua Jiang, Bento Natura, and Omri Weinstein. A Faster Interior-Point Method for Sum-Of-Squares Optimization. In 49th International Colloquium on Automata, Languages, and Programming (ICALP 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 229, pp. 79:1-79:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{jiang_et_al:LIPIcs.ICALP.2022.79,
  author =	{Jiang, Shunhua and Natura, Bento and Weinstein, Omri},
  title =	{{A Faster Interior-Point Method for Sum-Of-Squares Optimization}},
  booktitle =	{49th International Colloquium on Automata, Languages, and Programming (ICALP 2022)},
  pages =	{79:1--79:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-235-8},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{229},
  editor =	{Boja\'{n}czyk, Miko{\l}aj and Merelli, Emanuela and Woodruff, David P.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2022.79},
  URN =		{urn:nbn:de:0030-drops-164205},
  doi =		{10.4230/LIPIcs.ICALP.2022.79},
  annote =	{Keywords: Interior Point Methods, Sum-of-squares Optimization, Dynamic Matrix Inverse}
}