3 Search Results for "Houen, Jakob Bæk Tejs"


Document
RANDOM
Bias Reduction for Sum Estimation

Authors: Talya Eden, Jakob Bæk Tejs Houen, Shyam Narayanan, Will Rosenbaum, and Jakub Tětek

Published in: LIPIcs, Volume 275, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2023)


Abstract
In classical statistics and distribution testing, it is often assumed that elements can be sampled exactly from some distribution 𝒫, and that when an element x is sampled, the probability 𝒫(x) of sampling x is also known. In this setting, recent work in distribution testing has shown that many algorithms are robust in the sense that they still produce correct output if the elements are drawn from any distribution 𝒬 that is sufficiently close to 𝒫. This phenomenon raises interesting questions: under what conditions is a "noisy" distribution 𝒬 sufficient, and what is the algorithmic cost of coping with this noise? In this paper, we investigate these questions for the problem of estimating the sum of a multiset of N real values x_1, …, x_N. This problem is well-studied in the statistical literature in the case 𝒫 = 𝒬, where the Hansen-Hurwitz estimator [Annals of Mathematical Statistics, 1943] is frequently used. We assume that for some (known) distribution 𝒫, values are sampled from a distribution 𝒬 that is pointwise close to 𝒫. That is, there is a parameter γ < 1 such that for all x_i, (1 - γ) 𝒫(i) ≤ 𝒬(i) ≤ (1 + γ) 𝒫(i). For every positive integer k we define an estimator ζ_k for μ = ∑_i x_i whose bias is proportional to γ^k (where our ζ₁ reduces to the classical Hansen-Hurwitz estimator). As a special case, we show that if 𝒬 is pointwise γ-close to uniform and all x_i ∈ {0, 1}, for any ε > 0, we can estimate μ to within additive error ε N using m = Θ(N^{1-1/k}/ε^{2/k}) samples, where k = ⌈lg ε/lg γ⌉. We then show that this sample complexity is essentially optimal. Interestingly, our upper and lower bounds show that the sample complexity need not vary uniformly with the desired error parameter ε: for some values of ε, perturbations in its value have no asymptotic effect on the sample complexity, while for other values, any decrease in its value results in an asymptotically larger sample complexity.
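
To make the sampling setting concrete, the following is a minimal Python sketch of the classical Hansen-Hurwitz estimator (the paper's ζ₁): indices are drawn from some actual distribution Q, but each draw is importance-weighted by the known probability P(i). The function and variable names (hansen_hurwitz, sample_index, draw) are illustrative choices, not taken from the paper, and the higher-order estimators ζ_k are not reproduced here.

import random

def hansen_hurwitz(values, P, sample_index, m):
    """Hansen-Hurwitz estimate of mu = sum(values).
    P[i] is the known probability index i is assumed to be drawn with;
    sample_index() actually draws from Q, which may only be pointwise
    gamma-close to P.  For non-negative values the bias is then at most
    gamma * mu (the zeta_1 case described in the abstract)."""
    total = 0.0
    for _ in range(m):
        i = sample_index()
        total += values[i] / P[i]   # importance-weight by the assumed probability
    return total / m

# Toy usage: x_i in {0, 1}, Q a (roughly) gamma-perturbed uniform distribution.
N = 1000
values = [random.randint(0, 1) for _ in range(N)]
P = [1.0 / N] * N                          # the known sampling distribution
gamma = 0.1
Q = [1.0 + random.uniform(-gamma, gamma) for _ in range(N)]
total_q = sum(Q)
Q = [q / total_q for q in Q]               # normalise the perturbed weights
draw = lambda: random.choices(range(N), weights=Q)[0]
print(hansen_hurwitz(values, P, draw, m=200), "vs true sum", sum(values))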

Cite as

Talya Eden, Jakob Bæk Tejs Houen, Shyam Narayanan, Will Rosenbaum, and Jakub Tětek. Bias Reduction for Sum Estimation. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 275, pp. 62:1-62:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{eden_et_al:LIPIcs.APPROX/RANDOM.2023.62,
  author =	{Eden, Talya and Houen, Jakob B{\ae}k Tejs and Narayanan, Shyam and Rosenbaum, Will and T\v{e}tek, Jakub},
  title =	{{Bias Reduction for Sum Estimation}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2023)},
  pages =	{62:1--62:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-296-9},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{275},
  editor =	{Megow, Nicole and Smith, Adam},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX/RANDOM.2023.62},
  URN =		{urn:nbn:de:0030-drops-188872},
  doi =		{10.4230/LIPIcs.APPROX/RANDOM.2023.62},
  annote =	{Keywords: bias reduction, sum estimation, sublinear time algorithms, sample complexity}
}
Document
Track A: Algorithms, Complexity and Games
A Sparse Johnson-Lindenstrauss Transform Using Fast Hashing

Authors: Jakob Bæk Tejs Houen and Mikkel Thorup

Published in: LIPIcs, Volume 261, 50th International Colloquium on Automata, Languages, and Programming (ICALP 2023)


Abstract
The Sparse Johnson-Lindenstrauss Transform of Kane and Nelson (SODA 2012) provides a linear dimensionality-reducing map A ∈ ℝ^{m × u} in 𝓁₂ that preserves distances up to distortion of 1 + ε with probability 1 - δ, where m = O(ε^{-2} log 1/δ) and each column of A has O(ε m) non-zero entries. Previous analyses of the Sparse Johnson-Lindenstrauss Transform all assumed access to an Ω(log 1/δ)-wise independent hash function. The main contribution of this paper is a more general analysis of the Sparse Johnson-Lindenstrauss Transform with fewer assumptions on the hash function. We also show that the Mixed Tabulation hash function of Dahlgaard, Knudsen, Rotenberg, and Thorup (FOCS 2015) satisfies the conditions of our analysis, thus giving us the first analysis of a Sparse Johnson-Lindenstrauss Transform that works with a practical hash function.
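
For intuition, here is a small Python sketch of a generic sparse Johnson-Lindenstrauss map in the Kane-Nelson style: each column carries s = O(ε m) non-zero entries of value ±1/√s, so applying the map costs s operations per non-zero input coordinate. This is an illustrative construction only; it uses Python's random module where the paper instead drives the row and sign choices with a fast, practical hash function (mixed tabulation), and the function names below are ad hoc.

import math
import random

def sparse_jl_columns(u, m, s, seed=0):
    """Columns of a sparse JL map A in R^{m x u}: each column has s non-zero
    entries equal to +/- 1/sqrt(s), placed in s distinct random rows."""
    rng = random.Random(seed)
    cols = []
    for _ in range(u):
        rows = rng.sample(range(m), s)
        cols.append([(r, rng.choice((-1.0, 1.0)) / math.sqrt(s)) for r in rows])
    return cols

def apply_sparse_jl(cols, x, m):
    """Compute y = A x; the cost is s per non-zero coordinate of x."""
    y = [0.0] * m
    for j, xj in enumerate(x):
        if xj != 0.0:
            for r, a in cols[j]:
                y[r] += a * xj
    return y

# Toy usage: project a 10,000-dimensional vector down to m = 256 dimensions.
u, m, s = 10_000, 256, 8
cols = sparse_jl_columns(u, m, s)
x = [random.gauss(0.0, 1.0) for _ in range(u)]
y = apply_sparse_jl(cols, x, m)
print(sum(v * v for v in x) ** 0.5, sum(v * v for v in y) ** 0.5)  # norms should be close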

Cite as

Jakob Bæk Tejs Houen and Mikkel Thorup. A Sparse Johnson-Lindenstrauss Transform Using Fast Hashing. In 50th International Colloquium on Automata, Languages, and Programming (ICALP 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 261, pp. 76:1-76:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{houen_et_al:LIPIcs.ICALP.2023.76,
  author =	{Houen, Jakob B{\ae}k Tejs and Thorup, Mikkel},
  title =	{{A Sparse Johnson-Lindenstrauss Transform Using Fast Hashing}},
  booktitle =	{50th International Colloquium on Automata, Languages, and Programming (ICALP 2023)},
  pages =	{76:1--76:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-278-5},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{261},
  editor =	{Etessami, Kousha and Feige, Uriel and Puppis, Gabriele},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2023.76},
  URN =		{urn:nbn:de:0030-drops-181281},
  doi =		{10.4230/LIPIcs.ICALP.2023.76},
  annote =	{Keywords: dimensionality reduction, hashing, concentration bounds, moment bounds}
}
Document
Track A: Algorithms, Complexity and Games
Understanding the Moments of Tabulation Hashing via Chaoses

Authors: Jakob Bæk Tejs Houen and Mikkel Thorup

Published in: LIPIcs, Volume 229, 49th International Colloquium on Automata, Languages, and Programming (ICALP 2022)


Abstract
Simple tabulation hashing dates back to Zobrist in 1970 and is defined as follows: Each key is viewed as c characters from some alphabet Σ, we have c fully random hash functions h₀, …, h_{c - 1} : Σ → {0, …, 2^l - 1}, and a key x = (x₀, …, x_{c - 1}) is hashed to h(x) = h₀(x₀) ⊕ … ⊕ h_{c - 1}(x_{c - 1}), where ⊕ is the bitwise XOR operation. The previous results on tabulation hashing by Pǎtraşcu and Thorup [J.ACM'11] and by Aamand et al. [STOC'20] focused on proving Chernoff-style tail bounds on hash-based sums, e.g., the number of keys hashing to a given value, for simple tabulation hashing, but their bounds do not cover the entire tail. Thus their results cannot bound moments. The paper by Dahlgaard et al. [FOCS'15] provides a bound on the moments of certain hash-based sums, but their bound only holds for constant moments, and we need logarithmic moments. Chaoses are random variables of the form ∑ a_{i₀, …, i_{c - 1}} X_{i₀} ⋅ … ⋅ X_{i_{c - 1}} where the X_i are independent random variables. Chaoses are a well-studied concept from probability theory, and tight analyses have been proven in several instances, e.g., when the independent random variables are standard Gaussian variables and when the independent random variables have logarithmically convex tails. We notice that hash-based sums of simple tabulation hashing can be seen as a sum of chaoses that are not independent. This motivates us to use techniques from the theory of chaoses to analyze hash-based sums of simple tabulation hashing. In this paper, we obtain bounds for all the moments of hash-based sums for simple tabulation hashing which are tight up to constants depending only on c. In contrast with previous attempts, our approach is mostly analytical and does not employ intricate combinatorial arguments. The improved analysis of simple tabulation hashing allows us to obtain bounds for the moments of hash-based sums for the mixed tabulation hashing introduced by Dahlgaard et al. [FOCS'15]. With simple tabulation hashing, there are certain inputs for which the concentration is much worse than with fully random hashing. However, with mixed tabulation, we get logarithmic moment bounds that are only a constant factor worse than those with fully random hashing for any possible input. This is a strong addition to other powerful probabilistic properties of mixed tabulation hashing proved by Dahlgaard et al.
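
Since the definition above is fully explicit, a short Python sketch of simple tabulation hashing may help. The parameter names (c, char_bits, out_bits) are illustrative choices, and Python's random module stands in for the fully random tables h₀, …, h_{c - 1}.

import random

def make_simple_tabulation(c=4, char_bits=8, out_bits=32, seed=0):
    """Simple tabulation hashing as defined above: split a key into c
    characters of char_bits bits each, look each character up in its own
    fully random table, and XOR the c table entries together."""
    rng = random.Random(seed)
    sigma = 1 << char_bits                                   # alphabet size |Sigma|
    tables = [[rng.randrange(1 << out_bits) for _ in range(sigma)]
              for _ in range(c)]

    def h(key):
        out = 0
        for i in range(c):
            char_i = (key >> (i * char_bits)) & (sigma - 1)  # i-th character of the key
            out ^= tables[i][char_i]                         # XOR in h_i(x_i)
        return out

    return h

# Toy usage: hashing a key costs c table lookups and XORs; equal keys collide by design.
h = make_simple_tabulation()
print(h(0xDEADBEEF), h(0xDEADBEEF), h(0x12345678))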

Cite as

Jakob Bæk Tejs Houen and Mikkel Thorup. Understanding the Moments of Tabulation Hashing via Chaoses. In 49th International Colloquium on Automata, Languages, and Programming (ICALP 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 229, pp. 74:1-74:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{houen_et_al:LIPIcs.ICALP.2022.74,
  author =	{Houen, Jakob B{\ae}k Tejs and Thorup, Mikkel},
  title =	{{Understanding the Moments of Tabulation Hashing via Chaoses}},
  booktitle =	{49th International Colloquium on Automata, Languages, and Programming (ICALP 2022)},
  pages =	{74:1--74:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-235-8},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{229},
  editor =	{Boja\'{n}czyk, Miko{\l}aj and Merelli, Emanuela and Woodruff, David P.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2022.74},
  URN =		{urn:nbn:de:0030-drops-164154},
  doi =		{10.4230/LIPIcs.ICALP.2022.74},
  annote =	{Keywords: hashing, concentration bounds, moment bounds}
}
  • Refine by Author
  • 3 Houen, Jakob Bæk Tejs
  • 2 Thorup, Mikkel
  • 1 Eden, Talya
  • 1 Narayanan, Shyam
  • 1 Rosenbaum, Will

  • Refine by Classification
  • 2 Theory of computation → Pseudorandomness and derandomization
  • 1 Mathematics of computing → Probabilistic algorithms
  • 1 Theory of computation → Lower bounds and information complexity
  • 1 Theory of computation → Random projections and metric embeddings
  • 1 Theory of computation → Sample complexity and generalization bounds

  • Refine by Keyword
  • 2 concentration bounds
  • 2 hashing
  • 2 moment bounds
  • 1 bias reduction
  • 1 dimensionality reduction

  • Refine by Type
  • 3 document

  • Refine by Publication Year
  • 2 2023
  • 1 2022
