2 Search Results for "Fandina, Ora Nova"


Barriers for Faster Dimensionality Reduction

Authors: Ora Nova Fandina, Mikael Møller Høgsgaard, and Kasper Green Larsen

Published in: LIPIcs, Volume 254, 40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023)


Abstract
The Johnson-Lindenstrauss transform allows one to embed a dataset of n points in ℝ^d into ℝ^m, while preserving the pairwise distance between any pair of points up to a factor (1 ± ε), provided that m = Ω(ε^{-2} lg n). The transform has found an overwhelming number of algorithmic applications, allowing algorithms to be sped up and memory consumption to be reduced at the price of a small loss in accuracy. A central line of research on such transforms focuses on developing fast embedding algorithms, with the classic example being the Fast JL transform by Ailon and Chazelle. All known such algorithms have an embedding time of Ω(d lg d), but no lower bounds rule out a clean O(d) embedding time. In this work, we establish the first non-trivial lower bounds (of magnitude Ω(m lg m)) for a large class of embedding algorithms, including in particular most known upper bounds.
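The setup in the abstract can be illustrated with a minimal dense Gaussian JL embedding in Python (a hedged sketch: the function name `jl_embed` and the constant 8 in the target dimension m are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def jl_embed(X, eps, rng=None):
    """Embed the rows of X (an n x d array) into m = O(eps^-2 log n)
    dimensions via a dense Gaussian random projection.
    The constant 8 in m is illustrative, not from the paper."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    m = int(np.ceil(8 * np.log(n) / eps**2))
    # Entries are i.i.d. N(0, 1/m), so squared distances are preserved
    # in expectation; concentration gives the (1 +/- eps) guarantee whp.
    A = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, d))
    return X @ A.T
```

Note that applying a dense m × d matrix costs Θ(dm) time per point; it is exactly this cost that fast JL transforms reduce to O(d lg d), and that the paper's Ω(m lg m) lower bound addresses.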

Cite as

Ora Nova Fandina, Mikael Møller Høgsgaard, and Kasper Green Larsen. Barriers for Faster Dimensionality Reduction. In 40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 254, pp. 31:1-31:15, Schloss Dagstuhl - Leibniz-Zentrum für Informatik (2023)



@InProceedings{novafandina_et_al:LIPIcs.STACS.2023.31,
  author =	{Nova Fandina, Ora and M{\o}ller H{\o}gsgaard, Mikael and Green Larsen, Kasper},
  title =	{{Barriers for Faster Dimensionality Reduction}},
  booktitle =	{40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023)},
  pages =	{31:1--31:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-266-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{254},
  editor =	{Berenbrink, Petra and Bouyer, Patricia and Dawar, Anuj and Kant\'{e}, Mamadou Moustapha},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2023.31},
  URN =		{urn:nbn:de:0030-drops-176838},
  doi =		{10.4230/LIPIcs.STACS.2023.31},
  annote =	{Keywords: Dimensional reduction, Lower bound, Linear Circuits}
}
Optimality of the Johnson-Lindenstrauss Dimensionality Reduction for Practical Measures

Authors: Yair Bartal, Ora Nova Fandina, and Kasper Green Larsen

Published in: LIPIcs, Volume 224, 38th International Symposium on Computational Geometry (SoCG 2022)


Abstract
It is well known that the Johnson-Lindenstrauss dimensionality reduction method is optimal for worst case distortion. While in practice many other methods and heuristics are used, not much is known in terms of bounds on their performance. The question of whether the JL method is optimal for practical measures of distortion was recently raised in [Yair Bartal et al., 2019] (NeurIPS'19). They provided upper bounds on its quality for a wide range of practical measures and showed that these are indeed best possible in many cases. Yet, some of the most important cases, including the fundamental case of average distortion, were left open. In particular, they show that the JL transform has 1+ε average distortion for embedding into k-dimensional Euclidean space, where k = O(1/ε²), and for more general q-norms of distortion, k = O(max{1/ε²,q/ε}), whereas tight lower bounds were established only for large values of q via reduction to the worst case. In this paper we prove that these bounds are best possible for any dimensionality reduction method, for any 1 ≤ q ≤ O((log (2ε² n))/ε) and ε ≥ 1/(√n), where n is the size of the subset of Euclidean space. Our results also imply that the JL method is optimal for various distortion measures commonly used in practice, such as stress, energy and relative error. We prove that if any of these measures is bounded by ε then k = Ω(1/ε²), for any ε ≥ 1/(√n), matching the upper bounds of [Yair Bartal et al., 2019] and extending their tightness results to the full range of moments. Our results may indicate that the JL dimensionality reduction method should be considered more often in practical applications, and the bounds we provide for its quality should serve as a baseline for comparison when evaluating the performance of other methods and heuristics.
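The q-norm distortion measures discussed above can be sketched as follows (a hedged illustration based on the standard ℓ_q-distortion definition: the pairwise distortion is the larger of expansion and contraction, and the ℓ_q-distortion is its q-th moment; the function `lq_distortion` and its exact normalization are assumptions, not the paper's formulation):

```python
import numpy as np
from itertools import combinations

def lq_distortion(X, Y, q=1):
    """q-norm of pairwise distance distortions between an original
    point set X and its embedding Y; q=1 gives average distortion."""
    ratios = []
    for i, j in combinations(range(len(X)), 2):
        d_orig = np.linalg.norm(X[i] - X[j])
        d_emb = np.linalg.norm(Y[i] - Y[j])
        r = d_emb / d_orig
        # Distortion of a pair: expansion or contraction, whichever is larger.
        ratios.append(max(r, 1.0 / r))
    ratios = np.asarray(ratios)
    return (ratios ** q).mean() ** (1.0 / q)
```

Under this definition an isometry has distortion exactly 1 for every q, and larger q puts more weight on the worst-distorted pairs, recovering worst case distortion as q → ∞.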

Cite as

Yair Bartal, Ora Nova Fandina, and Kasper Green Larsen. Optimality of the Johnson-Lindenstrauss Dimensionality Reduction for Practical Measures. In 38th International Symposium on Computational Geometry (SoCG 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 224, pp. 13:1-13:16, Schloss Dagstuhl - Leibniz-Zentrum für Informatik (2022)



@InProceedings{bartal_et_al:LIPIcs.SoCG.2022.13,
  author =	{Bartal, Yair and Fandina, Ora Nova and Larsen, Kasper Green},
  title =	{{Optimality of the Johnson-Lindenstrauss Dimensionality Reduction for Practical Measures}},
  booktitle =	{38th International Symposium on Computational Geometry (SoCG 2022)},
  pages =	{13:1--13:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-227-3},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{224},
  editor =	{Goaoc, Xavier and Kerber, Michael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SoCG.2022.13},
  URN =		{urn:nbn:de:0030-drops-160219},
  doi =		{10.4230/LIPIcs.SoCG.2022.13},
  annote =	{Keywords: average distortion, practical dimensionality reduction, JL transform}
}
