2 Search Results for "Diakonikolas, Jelena"


Document
Relative Lipschitzness in Extragradient Methods and a Direct Recipe for Acceleration

Authors: Michael B. Cohen, Aaron Sidford, and Kevin Tian

Published in: LIPIcs, Volume 185, 12th Innovations in Theoretical Computer Science Conference (ITCS 2021)


Abstract
We show that standard extragradient methods (i.e., mirror prox [Arkadi Nemirovski, 2004] and dual extrapolation [Yurii Nesterov, 2007]) recover optimal accelerated rates for first-order minimization of smooth convex functions. To obtain this result we provide fine-grained characterization of the convergence rates of extragradient methods for solving monotone variational inequalities in terms of a natural condition we call relative Lipschitzness. We further generalize this framework to handle local and randomized notions of relative Lipschitzness and thereby recover rates for box-constrained 𝓁_∞ regression based on area convexity [Jonah Sherman, 2017] and complexity bounds achieved by accelerated (randomized) coordinate descent [Zeyuan Allen-Zhu et al., 2016; Yurii Nesterov and Sebastian U. Stich, 2017] for smooth convex function minimization.
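The two-step (predictor-corrector) structure of the extragradient methods discussed in the abstract is easiest to see in the Euclidean case. The sketch below is a minimal 𝓁_2 illustration of a plain extragradient update applied to a bilinear saddle-point problem; the operator g, the step size, and all names are illustrative assumptions and not taken from the paper, which works in the more general mirror prox / dual extrapolation setting under relative Lipschitzness.

import numpy as np

def extragradient(g, w0, eta, iters):
    # Two-step extragradient update (l2 geometry): a predictor step at the
    # current point, then a corrector step that reuses the operator evaluated
    # at the predictor. Illustrative only; the paper's setting is more general.
    w = w0.copy()
    avg = np.zeros_like(w0)
    for _ in range(iters):
        w_half = w - eta * g(w)        # predictor
        w = w - eta * g(w_half)        # corrector
        avg += w_half
    return avg / iters                 # the averaged iterate carries the gap guarantee

# Illustrative monotone operator: the bilinear saddle point min_x max_y x^T A y,
# whose operator g(x, y) = (A y, -A^T x) is monotone with Lipschitz constant ||A||_2.
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
L = np.linalg.norm(A, 2)

def g(w):
    x, y = w[:n], w[n:]
    return np.concatenate([A @ y, -A.T @ x])

w_out = extragradient(g, np.ones(2 * n), eta=0.5 / L, iters=2000)
print(np.linalg.norm(g(w_out)))        # close to zero: (0, 0) solves the saddle point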

Cite as

Michael B. Cohen, Aaron Sidford, and Kevin Tian. Relative Lipschitzness in Extragradient Methods and a Direct Recipe for Acceleration. In 12th Innovations in Theoretical Computer Science Conference (ITCS 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 185, pp. 62:1-62:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{cohen_et_al:LIPIcs.ITCS.2021.62,
  author =	{Cohen, Michael B. and Sidford, Aaron and Tian, Kevin},
  title =	{{Relative Lipschitzness in Extragradient Methods and a Direct Recipe for Acceleration}},
  booktitle =	{12th Innovations in Theoretical Computer Science Conference (ITCS 2021)},
  pages =	{62:1--62:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-177-1},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{185},
  editor =	{Lee, James R.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2021.62},
  URN =		{urn:nbn:de:0030-drops-136011},
  doi =		{10.4230/LIPIcs.ITCS.2021.62},
  annote =	{Keywords: Variational inequalities, minimax optimization, acceleration, 𝓁\_∞ regression}
}
Document
Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method

Authors: Jelena Diakonikolas and Lorenzo Orecchia

Published in: LIPIcs, Volume 94, 9th Innovations in Theoretical Computer Science Conference (ITCS 2018)


Abstract
We provide a novel accelerated first-order method that achieves the asymptotically optimal convergence rate for smooth functions in the first-order oracle model. To this day, Nesterov's Accelerated Gradient Descent (AGD) and variations thereof have been the only methods achieving acceleration in this standard blackbox model. In contrast, our algorithm is significantly different from AGD, as it relies on a predictor-corrector approach similar to that used by Mirror-Prox [Nemirovski, 2004] and Extra-Gradient Descent [Korpelevich, 1977] in the solution of convex-concave saddle point problems. For this reason, we dub our algorithm Accelerated Extra-Gradient Descent (AXGD). Its construction is motivated by the discretization of an accelerated continuous-time dynamics [Krichene et al., 2015] using the classical method of implicit Euler discretization. Our analysis explicitly shows the effects of discretization through a conceptually novel primal-dual viewpoint. Moreover, we show that the method is quite general: it attains optimal convergence rates for other classes of objectives (e.g., those with generalized smoothness properties or that are non-smooth and Lipschitz-continuous) using the appropriate choices of step lengths. Finally, we present experiments showing that our algorithm matches the performance of Nesterov's method, while appearing more robust to noise in some cases.
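As a rough illustration of the predictor-corrector construction the abstract describes, the following is a minimal unconstrained Euclidean sketch: a weighted-average primal iterate is paired with a dual accumulator of gradients, a tentative (predictor) point is formed from the current dual state, and the step is then redone (corrector) with the gradient taken at that tentative point. The variable names, the conservative weights a_k, and the quadratic test problem are assumptions made for this example; it is not the paper's exact AXGD algorithm or step-size analysis.

import numpy as np

def axgd_sketch(grad_f, x0, L, iters):
    # Simplified predictor-corrector accelerated scheme (l2 geometry, unconstrained).
    # Illustrative sketch only, not the paper's exact AXGD update.
    x_hat = x0.copy()            # weighted average of the points queried so far
    z = np.zeros_like(x0)        # accumulated negative gradients (the "dual" state)
    A = 0.0                      # running sum of the weights a_k
    for k in range(1, iters + 1):
        a = (k + 1) / (4.0 * L)                         # conservative AGD-like weights
        A_new = A + a
        x_mid = (A * x_hat + a * (x0 + z)) / A_new      # predictor: step from the current dual state
        z_mid = z - a * grad_f(x_mid)
        x_hat = (A * x_hat + a * (x0 + z_mid)) / A_new  # corrector: redo the step with the fresher gradient
        z = z - a * grad_f(x_hat)
        A = A_new
    return x_hat

# Illustrative smooth convex problem: f(x) = 0.5 x^T Q x - b^T x with Q positive definite.
rng = np.random.default_rng(1)
M = rng.standard_normal((20, 20))
Q = M.T @ M + np.eye(20)
b = rng.standard_normal(20)
L = np.linalg.norm(Q, 2)
x_out = axgd_sketch(lambda x: Q @ x - b, np.zeros(20), L, iters=300)
print(np.linalg.norm(Q @ x_out - b))   # gradient norm at the output; should be small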

Cite as

Jelena Diakonikolas and Lorenzo Orecchia. Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method. In 9th Innovations in Theoretical Computer Science Conference (ITCS 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 94, pp. 23:1-23:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{diakonikolas_et_al:LIPIcs.ITCS.2018.23,
  author =	{Diakonikolas, Jelena and Orecchia, Lorenzo},
  title =	{{Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method}},
  booktitle =	{9th Innovations in Theoretical Computer Science Conference (ITCS 2018)},
  pages =	{23:1--23:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-060-6},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{94},
  editor =	{Karlin, Anna R.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2018.23},
  URN =		{urn:nbn:de:0030-drops-83562},
  doi =		{10.4230/LIPIcs.ITCS.2018.23},
  annote =	{Keywords: Acceleration, dynamical systems, discretization, first-order methods}
}
  • Refine by Author
  • 1 Cohen, Michael B.
  • 1 Diakonikolas, Jelena
  • 1 Orecchia, Lorenzo
  • 1 Sidford, Aaron
  • 1 Tian, Kevin

  • Refine by Classification
  • 1 Mathematics of computing → Convex optimization

  • Refine by Keyword
  • 1 Acceleration
  • 1 Variational inequalities
  • 1 acceleration
  • 1 discretization
  • 1 dynamical systems

  • Refine by Type
  • 2 document

  • Refine by Publication Year
  • 1 2018
  • 1 2021
