When quoting this document, please refer to the following
DOI: 10.4230/LIPIcs.ITCS.2017.2
URN: urn:nbn:de:0030-drops-81640
URL: http://drops.dagstuhl.de/opus/volltexte/2017/8164/

### Gradient Descent Only Converges to Minimizers: Non-Isolated Critical Points and Invariant Regions


### Abstract

Given a twice continuously differentiable cost function f, we prove that the set of initial conditions from which gradient descent converges to saddle points where \nabla^2 f has at least one strictly negative eigenvalue has (Lebesgue) measure zero, even for cost functions f with non-isolated critical points, answering an open question of [Lee, Simchowitz, Jordan, Recht, COLT 2016]. Moreover, this result extends to forward-invariant convex subspaces, allowing for weak (non-globally Lipschitz) smoothness assumptions. Finally, we produce an upper bound on the allowable step-size.
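The phenomenon the abstract describes can be illustrated on a toy example (this sketch is not from the paper; the function, step size, and initial points are chosen for illustration only). For f(x, y) = x^2 - y^2, the origin is a saddle whose Hessian diag(2, -2) has a strictly negative eigenvalue, and the set of initializers converging to it is exactly the line y = 0, which has Lebesgue measure zero:

```python
import numpy as np

def grad_descent(x0, step=0.1, iters=200):
    """Plain gradient descent on f(x, y) = x^2 - y^2 (saddle at the origin)."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        g = np.array([2.0 * x[0], -2.0 * x[1]])  # gradient of x^2 - y^2
        x = x - step * g
    return x

# A generic initializer (y != 0, even tiny) is repelled from the saddle:
escaped = grad_descent([1.0, 1e-6])
# Initializers on the measure-zero stable manifold y = 0 converge to it:
trapped = grad_descent([1.0, 0.0])
```

Here the step size 0.1 is below 1/L for the gradient's Lipschitz constant L = 2, matching the kind of step-size bound the paper makes precise.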

### BibTeX - Entry

```
@InProceedings{panageas_et_al:LIPIcs:2017:8164,
author =	{Ioannis Panageas and Georgios Piliouras},
title =	{{Gradient Descent Only Converges to Minimizers: Non-Isolated Critical Points and Invariant Regions}},
booktitle =	{8th Innovations in Theoretical Computer Science Conference (ITCS 2017)},
pages =	{2:1--2:12},
series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
ISBN =	{978-3-95977-029-3},
ISSN =	{1868-8969},
year =	{2017},
volume =	{67},