2 Search Results for "Panigrahy, Rina"


Document
Convergence Results for Neural Networks via Electrodynamics

Authors: Rina Panigrahy, Ali Rahimi, Sushant Sachdeva, and Qiuyi Zhang

Published in: LIPIcs, Volume 94, 9th Innovations in Theoretical Computer Science Conference (ITCS 2018)


Abstract
We study whether a depth-two neural network can learn another depth-two network using gradient descent. Assuming a linear output node, we show that the question of whether gradient descent converges to the target function is equivalent to the following question in electrodynamics: given k fixed protons in R^d and k electrons, each moving under the attractive force from the protons and the repulsive force from the remaining electrons, will all the electrons, at equilibrium, be matched up with the protons, up to a permutation? Under the standard electrical force, this follows from the classical Earnshaw theorem. In our setting, the force is determined by the activation function and the input distribution. Building on this equivalence, we prove the existence of an activation function such that gradient descent learns at least one of the hidden nodes in the target network. Iterating, we show that gradient descent can be used to learn the entire network one node at a time.
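To make the electron-proton picture concrete, the following is a minimal numerical sketch of the dynamics described above, not the paper's construction. The target network's hidden units play the role of fixed protons and the learned units play the role of electrons; the Gaussian pairwise potential, the learning rate, and the finite-difference gradient are all illustrative assumptions, standing in for the potential that a particular activation function and input distribution would induce.

import numpy as np

rng = np.random.default_rng(0)
d, k = 2, 4                          # input dimension, number of hidden units
protons = rng.normal(size=(k, d))    # fixed hidden units of the target network
electrons = rng.normal(size=(k, d))  # hidden units being learned

def potential(u, v):
    # Pairwise potential Phi(u, v) = E_x[sigma(u.x) sigma(v.x)]; a Gaussian
    # kernel is used here purely as an illustrative stand-in.
    return np.exp(-np.sum((u - v) ** 2))

def loss(e):
    # E[(f_electrons(x) - f_protons(x))^2] expands, up to a constant, into
    # electron-electron repulsion minus electron-proton attraction.
    rep = sum(potential(e[i], e[j]) for i in range(k) for j in range(k))
    att = sum(potential(e[i], protons[j]) for i in range(k) for j in range(k))
    return rep - 2.0 * att

def grad(e, eps=1e-5):
    # Finite-difference gradient, chosen for brevity over a closed form.
    g = np.zeros_like(e)
    for idx in np.ndindex(*e.shape):
        step = np.zeros_like(e)
        step[idx] = eps
        g[idx] = (loss(e + step) - loss(e - step)) / (2 * eps)
    return g

for _ in range(3000):                # plain gradient descent = electron dynamics
    electrons -= 0.05 * grad(electrons)

# At (approximate) equilibrium, check whether each electron found a proton.
dists = np.linalg.norm(electrons[:, None] - protons[None, :], axis=-1)
print(dists.min(axis=1))             # small entries = matched, up to permutation

Whether the electrons end up matched to the protons depends on the potential; the paper's contribution is to exhibit an activation function whose induced dynamics provably recovers at least one hidden node.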

Cite as

Rina Panigrahy, Ali Rahimi, Sushant Sachdeva, and Qiuyi Zhang. Convergence Results for Neural Networks via Electrodynamics. In 9th Innovations in Theoretical Computer Science Conference (ITCS 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 94, pp. 22:1-22:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{panigrahy_et_al:LIPIcs.ITCS.2018.22,
  author =	{Panigrahy, Rina and Rahimi, Ali and Sachdeva, Sushant and Zhang, Qiuyi},
  title =	{{Convergence Results for Neural Networks via Electrodynamics}},
  booktitle =	{9th Innovations in Theoretical Computer Science Conference (ITCS 2018)},
  pages =	{22:1--22:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-060-6},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{94},
  editor =	{Karlin, Anna R.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2018.22},
  URN =		{urn:nbn:de:0030-drops-83521},
  doi =		{10.4230/LIPIcs.ITCS.2018.22},
  annote =	{Keywords: Deep Learning, Learning Theory, Non-convex Optimization}
}
Document
Deterministic approximation algorithms for the nearest codeword problem

Authors: Noga Alon, Rina Panigrahy, and Sergey Yekhanin

Published in: Dagstuhl Seminar Proceedings, Volume 9421, Algebraic Methods in Computational Complexity (2010)


Abstract
The Nearest Codeword Problem (NCP) is a basic algorithmic question in the theory of error-correcting codes. Given a point v in F_2^n and a linear space L in F_2^n of dimension k, NCP asks for a point l in L that minimizes the (Hamming) distance from v. It is well known that the nearest codeword problem is NP-hard, so approximation algorithms are of interest. The best efficient approximation algorithms for the NCP to date are due to Berman and Karpinski: a deterministic algorithm that achieves an approximation ratio of O(k/c) for an arbitrary constant c, and a randomized algorithm that achieves an approximation ratio of O(k/log n). In this paper we present new deterministic algorithms for approximating the NCP that improve substantially upon the earlier work, (almost) derandomizing the randomized algorithm of Berman and Karpinski. We also initiate a study of the following Remote Point Problem (RPP). Given a linear space L in F_2^n of dimension k, RPP asks for a point v in F_2^n that is far from L. We say that an algorithm achieves a remoteness of r for the RPP if it always outputs a point v whose distance from L is at least r. In this paper we present a deterministic polynomial-time algorithm that achieves a remoteness of Omega(n log k / k) for all k < n/2. We motivate the remote point problem by relating it both to the nearest codeword problem and to the matrix rigidity approach to circuit lower bounds in computational complexity theory.
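To pin down the problem definition, here is a minimal brute-force sketch of NCP; the toy generator matrix is hypothetical, and the exhaustive search is exponential in k, whereas the paper is about efficient approximation algorithms.

import itertools
import numpy as np

def nearest_codeword(G, v):
    # Exhaustive search over L = rowspan(G) in F_2^n: exponential in k.
    # The same quantity, the minimum distance from v to L, is what the
    # Remote Point Problem asks to make large over choices of v.
    k, _ = G.shape
    best, best_dist = None, len(v) + 1
    for coeffs in itertools.product((0, 1), repeat=k):
        l = (np.array(coeffs) @ G) % 2       # a point l in L
        dist = int(np.sum(l != v))           # Hamming distance from v
        if dist < best_dist:
            best, best_dist = l, dist
    return best, best_dist

# Toy instance (hypothetical generator matrix): k = 2, n = 4.
G = np.array([[1, 0, 1, 1],
              [0, 1, 1, 0]])
v = np.array([1, 1, 1, 1])
print(nearest_codeword(G, v))                # -> (array([1, 0, 1, 1]), 1)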

Cite as

Noga Alon, Rina Panigrahy, and Sergey Yekhanin. Deterministic approximation algorithms for the nearest codeword problem. In Algebraic Methods in Computational Complexity. Dagstuhl Seminar Proceedings, Volume 9421, pp. 1-13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{alon_et_al:DagSemProc.09421.4,
  author =	{Alon, Noga and Panigrahy, Rina and Yekhanin, Sergey},
  title =	{{Deterministic approximation algorithms for the nearest codeword problem}},
  booktitle =	{Algebraic Methods in Computational Complexity},
  pages =	{1--13},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2010},
  volume =	{9421},
  editor =	{Agrawal, Manindra and Fortnow, Lance and Thierauf, Thomas and Umans, Christopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.09421.4},
  URN =		{urn:nbn:de:0030-drops-24133},
  doi =		{10.4230/DagSemProc.09421.4}
}