4 Search Results for "Allen-Zhu, Zeyuan"


Document
Relative Lipschitzness in Extragradient Methods and a Direct Recipe for Acceleration

Authors: Michael B. Cohen, Aaron Sidford, and Kevin Tian

Published in: LIPIcs, Volume 185, 12th Innovations in Theoretical Computer Science Conference (ITCS 2021)


Abstract
We show that standard extragradient methods (i.e., mirror prox [Arkadi Nemirovski, 2004] and dual extrapolation [Yurii Nesterov, 2007]) recover optimal accelerated rates for first-order minimization of smooth convex functions. To obtain this result we provide a fine-grained characterization of the convergence rates of extragradient methods for solving monotone variational inequalities in terms of a natural condition we call relative Lipschitzness. We further generalize this framework to handle local and randomized notions of relative Lipschitzness and thereby recover rates for box-constrained 𝓁_∞ regression based on area convexity [Jonah Sherman, 2017] and complexity bounds achieved by accelerated (randomized) coordinate descent [Zeyuan Allen-Zhu et al., 2016; Yurii Nesterov and Sebastian U. Stich, 2017] for smooth convex function minimization.
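For readers unfamiliar with the extragradient template the abstract refers to, the following is a minimal Euclidean sketch of one mirror-prox iteration, written in Python. The operator g, the step size eta, and the toy bilinear saddle-point problem are illustrative choices made here, not taken from the paper; in the general setup the Euclidean projection would be replaced by a Bregman proximal step, and relative Lipschitzness governs the admissible step size.

import numpy as np

def extragradient_step(z, g, eta, project=lambda v: v):
    # One mirror-prox / extragradient iteration in the Euclidean setup:
    # take a step at z, then update z using the operator evaluated at the
    # extrapolated point w.
    w = project(z - eta * g(z))        # extrapolation step
    z_next = project(z - eta * g(w))   # update step
    return w, z_next

# Toy bilinear saddle-point problem min_x max_y x^T A y, whose associated
# monotone operator is g(x, y) = (A y, -A^T x); the saddle point is the origin.
A = np.array([[1.0, 0.5], [0.5, 2.0]])
def g(z):
    x, y = z[:2], z[2:]
    return np.concatenate([A @ y, -A.T @ x])

z = np.ones(4)
for _ in range(1000):
    _, z = extragradient_step(z, g, eta=0.3)
print(z)  # approaches the saddle point at the origin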

Cite as

Michael B. Cohen, Aaron Sidford, and Kevin Tian. Relative Lipschitzness in Extragradient Methods and a Direct Recipe for Acceleration. In 12th Innovations in Theoretical Computer Science Conference (ITCS 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 185, pp. 62:1-62:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{cohen_et_al:LIPIcs.ITCS.2021.62,
  author =	{Cohen, Michael B. and Sidford, Aaron and Tian, Kevin},
  title =	{{Relative Lipschitzness in Extragradient Methods and a Direct Recipe for Acceleration}},
  booktitle =	{12th Innovations in Theoretical Computer Science Conference (ITCS 2021)},
  pages =	{62:1--62:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-177-1},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{185},
  editor =	{Lee, James R.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2021.62},
  URN =		{urn:nbn:de:0030-drops-136011},
  doi =		{10.4230/LIPIcs.ITCS.2021.62},
  annote =	{Keywords: Variational inequalities, minimax optimization, acceleration, $\ell_\infty$ regression}
}
Document
Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent

Authors: Zeyuan Allen-Zhu and Lorenzo Orecchia

Published in: LIPIcs, Volume 67, 8th Innovations in Theoretical Computer Science Conference (ITCS 2017)


Abstract
First-order methods play a central role in large-scale machine learning. Even though many variations exist, each suited to a particular problem, almost all such methods fundamentally rely on two types of algorithmic steps: gradient descent, which yields primal progress, and mirror descent, which yields dual progress. We observe that gradient descent and mirror descent are complementary, so that faster algorithms can be designed by "linearly coupling" the two. We show how to reconstruct Nesterov's accelerated gradient methods using linear coupling, which gives a cleaner interpretation than Nesterov's original proofs. We also discuss the power of linear coupling by extending it to many other settings to which Nesterov's methods do not apply.
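As a rough illustration of the coupling described above, here is a hedged Python sketch of the accelerated scheme in the Euclidean setup, where the mirror step reduces to a plain step on a separate dual sequence; the step-size schedule and the toy quadratic are illustrative choices, not code from the paper.

import numpy as np

def accelerated_linear_coupling(grad_f, L, x0, iters=500):
    # Couple a gradient step (primal progress) with a mirror step
    # (dual progress) on two sequences y and z; x is their linear coupling.
    y = x0.copy()
    z = x0.copy()
    for k in range(iters):
        alpha = (k + 2) / (2.0 * L)      # mirror-step weight
        tau = 1.0 / (alpha * L)          # coupling weight, equals 2/(k+2)
        x = tau * z + (1.0 - tau) * y    # linear coupling of the two sequences
        g = grad_f(x)
        y = x - g / L                    # gradient step on an L-smooth f
        z = z - alpha * g                # Euclidean mirror (dual-averaging) step
    return y

# Toy L-smooth convex objective f(x) = 0.5 * x^T Q x, minimized at the origin.
Q = np.diag([1.0, 10.0, 100.0])
out = accelerated_linear_coupling(lambda x: Q @ x, L=100.0, x0=np.ones(3), iters=500)
print(out)  # approaches the minimizer at the origin at the accelerated O(1/k^2) rate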

Cite as

Zeyuan Allen-Zhu and Lorenzo Orecchia. Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent. In 8th Innovations in Theoretical Computer Science Conference (ITCS 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 67, pp. 3:1-3:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{allenzhu_et_al:LIPIcs.ITCS.2017.3,
  author =	{Allen-Zhu, Zeyuan and Orecchia, Lorenzo},
  title =	{{Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent}},
  booktitle =	{8th Innovations in Theoretical Computer Science Conference (ITCS 2017)},
  pages =	{3:1--3:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-029-3},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{67},
  editor =	{Papadimitriou, Christos H.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2017.3},
  URN =		{urn:nbn:de:0030-drops-81850},
  doi =		{10.4230/LIPIcs.ITCS.2017.3},
  annote =	{Keywords: linear coupling, gradient descent, mirror descent, acceleration}
}
Document
Optimization Algorithms for Faster Computational Geometry

Authors: Zeyuan Allen-Zhu, Zhenyu Liao, and Yang Yuan

Published in: LIPIcs, Volume 55, 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)


Abstract
We study two fundamental problems in computational geometry: finding the maximum inscribed ball (MaxIB) inside a bounded polyhedron defined by m hyperplanes, and the minimum enclosing ball (MinEB) of a set of n points, both in d-dimensional space. We improve the running time of iterative algorithms for MaxIB from ~O(m*d*alpha^3/epsilon^3) to ~O(m*d + m*sqrt(d)*alpha/epsilon), a speed-up of up to ~O(sqrt(d)*alpha^2/epsilon^2), and for MinEB from ~O(n*d/sqrt(epsilon)) to ~O(n*d + n*sqrt(d)/sqrt(epsilon)), a speed-up of up to ~O(sqrt(d)). Our improvements are based on a novel saddle-point optimization framework. We propose a new algorithm, L1L2SPSolver, for solving a class of regularized saddle-point problems, and apply a randomized Hadamard space rotation, a technique borrowed from compressive sensing. Interestingly, the motivation for using the Hadamard rotation comes solely from our optimization view and not from the original geometry problem: indeed, it is not immediately clear why MaxIB or MinEB, as a geometric problem, should become easier to solve after the space is rotated by a unitary matrix. We hope that our optimization perspective sheds light on other geometric problems as well.
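The randomized Hadamard rotation mentioned above is a standard preconditioning step in compressive sensing; a hedged Python sketch of the generic construction (random signs followed by a normalized Walsh-Hadamard transform) follows. The padding to a power of two and the specific parameters are illustrative and not taken from the paper; the point is that the rotation is unitary, so norms and distances are preserved while coordinate mass is spread out.

import numpy as np
from scipy.linalg import hadamard

def randomized_hadamard_rotate(P, seed=0):
    # Rotate the rows of P by H D, where D is a random +/-1 diagonal matrix
    # and H is the orthonormal Walsh-Hadamard matrix. The map is unitary, so
    # Euclidean norms and pairwise distances are unchanged, but no single
    # coordinate can carry most of the mass afterwards.
    n, d = P.shape
    d_pad = 1 << (d - 1).bit_length()            # pad dimension to a power of two
    P_pad = np.zeros((n, d_pad))
    P_pad[:, :d] = P
    rng = np.random.default_rng(seed)
    signs = rng.choice([-1.0, 1.0], size=d_pad)  # random diagonal D
    H = hadamard(d_pad) / np.sqrt(d_pad)         # orthonormal Hadamard matrix
    return (P_pad * signs) @ H.T

# A maximally "spiky" point keeps its norm but spreads across all coordinates.
P = np.zeros((1, 6)); P[0, 0] = 1.0
R = randomized_hadamard_rotate(P)
print(np.linalg.norm(P), np.linalg.norm(R))      # both equal 1.0
print(np.abs(R).max())                           # about 1/sqrt(8), roughly 0.35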

Cite as

Zeyuan Allen-Zhu, Zhenyu Liao, and Yang Yuan. Optimization Algorithms for Faster Computational Geometry. In 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 55, pp. 53:1-53:6, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{allenzhu_et_al:LIPIcs.ICALP.2016.53,
  author =	{Allen-Zhu, Zeyuan and Liao, Zhenyu and Yuan, Yang},
  title =	{{Optimization Algorithms for Faster Computational Geometry}},
  booktitle =	{43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)},
  pages =	{53:1--53:6},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-013-2},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{55},
  editor =	{Chatzigiannakis, Ioannis and Mitzenmacher, Michael and Rabani, Yuval and Sangiorgi, Davide},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2016.53},
  URN =		{urn:nbn:de:0030-drops-63325},
  doi =		{10.4230/LIPIcs.ICALP.2016.53},
  annote =	{Keywords: maximum inscribed balls, minimum enclosing balls, approximation algorithms}
}
Document
Restricted Isometry Property for General p-Norms

Authors: Zeyuan Allen-Zhu, Rati Gelashvili, and Ilya Razenshteyn

Published in: LIPIcs, Volume 34, 31st International Symposium on Computational Geometry (SoCG 2015)


Abstract
The Restricted Isometry Property (RIP) is a fundamental property of a matrix which enables sparse recovery. Informally, an m x n matrix satisfies the RIP of order k for the L_p norm if |Ax|_p is approximately |x|_p for every x with at most k non-zero coordinates. For every 1 <= p < infty we obtain almost tight bounds on the minimum number of rows m necessary for the RIP property to hold. Prior to this work, only the cases p = 1, 1 + 1/log(k), and 2 were studied. Interestingly, our results show that the case p = 2 is a "singularity" point: the optimal number of rows m is Theta(k^p) for all p in [1, infty) \ {2}, as opposed to Theta(k) for p = 2. We also obtain almost tight bounds for the column sparsity of RIP matrices and discuss implications of our results for the Stable Sparse Recovery problem.
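To make the informal definition above concrete, here is a small Python sanity check for the classic p = 2 case: it samples random k-sparse vectors and measures how far |Ax|_2 / |x|_2 strays from 1 for a scaled Gaussian matrix. Sampling random sparse vectors only gives a heuristic check, not a certificate, since the RIP is a worst-case statement over all k-sparse x; the dimensions below are illustrative.

import numpy as np

def rip_distortion(A, k, p=2, trials=2000, seed=0):
    # Empirically estimate how well ||Ax||_p tracks ||x||_p over random
    # k-sparse vectors x; returns the smallest and largest observed ratio.
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    ratios = []
    for _ in range(trials):
        x = np.zeros(n)
        support = rng.choice(n, size=k, replace=False)
        x[support] = rng.standard_normal(k)
        ratios.append(np.linalg.norm(A @ x, p) / np.linalg.norm(x, p))
    return min(ratios), max(ratios)

# For p = 2, a Gaussian matrix with m on the order of k log(n/k) rows,
# scaled by 1/sqrt(m), is the textbook RIP construction; the observed
# ratios concentrate around 1.
n, k, m = 512, 5, 120
A = np.random.default_rng(1).standard_normal((m, n)) / np.sqrt(m)
print(rip_distortion(A, k))  # e.g. roughly (0.8, 1.2)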

Cite as

Zeyuan Allen-Zhu, Rati Gelashvili, and Ilya Razenshteyn. Restricted Isometry Property for General p-Norms. In 31st International Symposium on Computational Geometry (SoCG 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 34, pp. 451-460, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


BibTeX

@InProceedings{allenzhu_et_al:LIPIcs.SOCG.2015.451,
  author =	{Allen-Zhu, Zeyuan and Gelashvili, Rati and Razenshteyn, Ilya},
  title =	{{Restricted Isometry Property for General p-Norms}},
  booktitle =	{31st International Symposium on Computational Geometry (SoCG 2015)},
  pages =	{451--460},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-83-5},
  ISSN =	{1868-8969},
  year =	{2015},
  volume =	{34},
  editor =	{Arge, Lars and Pach, J\'{a}nos},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SOCG.2015.451},
  URN =		{urn:nbn:de:0030-drops-51273},
  doi =		{10.4230/LIPIcs.SOCG.2015.451},
  annote =	{Keywords: compressive sensing, dimension reduction, linear algebra, high-dimensional geometry}
}