3 Search Results for "Lugosi, Gabor"


Document
Is the Space Complexity of Planted Clique Recovery the Same as That of Detection?

Authors: Jay Mardia

Published in: LIPIcs, Volume 185, 12th Innovations in Theoretical Computer Science Conference (ITCS 2021)


Abstract
We study the planted clique problem, in which a clique of size k is planted in an Erdős-Rényi graph G(n, 1/2), and one is interested in either detecting or recovering this planted clique. This problem is interesting because it is widely believed to show a statistical-computational gap at clique size k = Θ(√n), and it has emerged as the prototypical problem with such a gap from which average-case hardness of other statistical problems can be deduced. It also displays a tight computational connection between the detection and recovery variants, unlike other problems of a similar nature. The extensive investigation into the computational complexity of the planted clique problem has, however, mostly focused on its time complexity. To begin investigating the robustness of these statistical-computational phenomena to changes in our notion of computational efficiency, we ask: Do the statistical-computational phenomena that make the planted clique an interesting problem also hold when we use "space efficiency" as our notion of computational efficiency? It is relatively easy to show that a positive answer to this question depends on the existence of an O(log n) space algorithm that can recover planted cliques of size k = Ω(√n). Our main result comes very close to designing such an algorithm. We show that for k = Ω(√n), the recovery problem can be solved in O((log^* n - log^*(k/√n)) ⋅ log n) bits of space.
1) If k = ω(√n ⋅ log^{(𝓁)} n) for any constant integer 𝓁 > 0, the space usage is O(log n) bits.
2) If k = Θ(√n), the space usage is O(log^* n ⋅ log n) bits.
Our result suggests that there does exist an O(log n) space algorithm to recover cliques of size k = Ω(√n), since we come very close to achieving such parameters. This provides evidence that the statistical-computational phenomena that (conjecturally) hold for planted clique time complexity also (conjecturally) hold for space complexity. Due to space limitations, we omit proofs from this manuscript. The entire paper with full proofs can be found on arXiv at https://arxiv.org/abs/2008.12825.
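To make the two parameter regimes explicit, here is a brief worked restatement of the stated bound (the symbol S(n,k) for the space usage is our own shorthand, not the paper's; log^{(𝓁)} denotes the 𝓁-times iterated logarithm and log^* the iterated-logarithm function):

\[
  S(n,k) = O\!\left(\left(\log^* n - \log^* \frac{k}{\sqrt{n}}\right) \cdot \log n\right)
\]
\[
  k = \Theta(\sqrt{n}) \;\Rightarrow\; \log^* \frac{k}{\sqrt{n}} = O(1) \;\Rightarrow\; S(n,k) = O(\log^* n \cdot \log n)
\]
\[
  k = \omega\!\left(\sqrt{n}\,\log^{(\ell)} n\right) \;\Rightarrow\; \log^* n - \log^* \frac{k}{\sqrt{n}} \le \ell + O(1) \;\Rightarrow\; S(n,k) = O(\log n)
\]

The second implication uses the fact that applying log^* to an 𝓁-times iterated logarithm of n removes exactly 𝓁 from log^* n, so for constant 𝓁 the difference in the bound is a constant.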

Cite as

Jay Mardia. Is the Space Complexity of Planted Clique Recovery the Same as That of Detection?. In 12th Innovations in Theoretical Computer Science Conference (ITCS 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 185, pp. 34:1-34:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{mardia:LIPIcs.ITCS.2021.34,
  author =	{Mardia, Jay},
  title =	{{Is the Space Complexity of Planted Clique Recovery the Same as That of Detection?}},
  booktitle =	{12th Innovations in Theoretical Computer Science Conference (ITCS 2021)},
  pages =	{34:1--34:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-177-1},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{185},
  editor =	{Lee, James R.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2021.34},
  URN =		{urn:nbn:de:0030-drops-135734},
  doi =		{10.4230/LIPIcs.ITCS.2021.34},
  annote =	{Keywords: Statistical computational gaps, Planted clique, Space complexity, Average case computational complexity}
}
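For readers who want to reuse this entry, a minimal LaTeX sketch follows; the file name refs.bib and the surrounding document are placeholders (not part of the Dagstuhl record), while the citation key is the one given in the entry above.

% refs.bib should contain the @InProceedings entry shown above.
\documentclass{article}
\begin{document}
Planted clique recovery in small space is studied in \cite{mardia:LIPIcs.ITCS.2021.34}.
\bibliographystyle{plain}
\bibliography{refs} % compile with: pdflatex, bibtex, pdflatex, pdflatex
\end{document}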
Document
Mathematical and Computational Foundations of Learning Theory (Dagstuhl Seminar 15361)

Authors: Matthias Hein, Gabor Lugosi, and Lorenzo Rosasco

Published in: Dagstuhl Reports, Volume 5, Issue 8 (2016)


Abstract
Machine learning has become a core field in computer science. Over the last decade the statistical machine learning approach has been successfully applied in many areas such as bioinformatics, computer vision, robotics, and information retrieval. The main reasons for the success of machine learning are its strong theoretical foundations and its multidisciplinary approach, integrating aspects of computer science, applied mathematics, and statistics, among others. The goal of the seminar was to once again bring together experts from computer science, mathematics, and statistics to discuss the state of the art in machine learning and to identify and formulate the key challenges in learning that must be addressed in the future. The main topics of this seminar were:
  • Interplay between Optimization and Learning,
  • Learning Data Representations.

Cite as

Matthias Hein, Gabor Lugosi, and Lorenzo Rosasco. Mathematical and Computational Foundations of Learning Theory (Dagstuhl Seminar 15361). In Dagstuhl Reports, Volume 5, Issue 8, pp. 54-73, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@Article{hein_et_al:DagRep.5.8.54,
  author =	{Hein, Matthias and Lugosi, Gabor and Rosasco, Lorenzo},
  title =	{{Mathematical and Computational Foundations of Learning Theory (Dagstuhl Seminar 15361)}},
  pages =	{54--73},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2016},
  volume =	{5},
  number =	{8},
  editor =	{Hein, Matthias and Lugosi, Gabor and Rosasco, Lorenzo},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.5.8.54},
  URN =		{urn:nbn:de:0030-drops-56783},
  doi =		{10.4230/DagRep.5.8.54},
  annote =	{Keywords: learning theory, non-smooth optimization (convex and non-convex), signal processing}
}
Document
Mathematical and Computational Foundations of Learning Theory (Dagstuhl Seminar 11291)

Authors: Matthias Hein, Gabor Lugosi, Lorenzo Rosasco, and Steve Smale

Published in: Dagstuhl Reports, Volume 1, Issue 7 (2011)


Abstract
The main goal of the seminar "Mathematical and Computational Foundations of Learning Theory" was to bring together experts from computer science, mathematics, and statistics to discuss the state of the art in machine learning, broadly construed, and to identify and formulate the key challenges in learning that must be addressed in the future. This Dagstuhl seminar was one of the first meetings to cover the full range of facets of modern learning theory. The meeting was very successful, and all participants agreed that such a meeting should take place on a regular basis.

Cite as

Matthias Hein, Gabor Lugosi, Lorenzo Rosasco, and Steve Smale. Mathematical and Computational Foundations of Learning Theory (Dagstuhl Seminar 11291). In Dagstuhl Reports, Volume 1, Issue 7, pp. 53-69, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2011)


BibTeX

@Article{hein_et_al:DagRep.1.7.53,
  author =	{Hein, Matthias and Lugosi, Gabor and Rosasco, Lorenzo and Smale, Steve},
  title =	{{Mathematical and Computational Foundations of Learning Theory (Dagstuhl Seminar 11291)}},
  pages =	{53--69},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2011},
  volume =	{1},
  number =	{7},
  editor =	{Hein, Matthias and Lugosi, Gabor and Rosasco, Lorenzo and Smale, Steve},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.1.7.53},
  URN =		{urn:nbn:de:0030-drops-33093},
  doi =		{10.4230/DagRep.1.7.53},
  annote =	{Keywords: learning theory, machine learning, sparsity, high-dimensional geometry, manifold learning, online learning}
}
  • Refine by Author
  • 2 Hein, Matthias
  • 2 Lugosi, Gabor
  • 2 Rosasco, Lorenzo
  • 1 Mardia, Jay
  • 1 Smale, Steve

  • Refine by Classification
  • 1 Theory of computation → Computational complexity and cryptography

  • Refine by Keyword
  • 2 learning theory
  • 1 Average case computational complexity
  • 1 Planted clique
  • 1 Space complexity
  • 1 Statistical computational gaps

  • Refine by Type
  • 3 document

  • Refine by Publication Year
  • 1 2011
  • 1 2016
  • 1 2021
