14 Search Results for "Blocki, Jeremiah"


Document
RANDOM
Additive Noise Mechanisms for Making Randomized Approximation Algorithms Differentially Private

Authors: Jakub Tětek

Published in: LIPIcs, Volume 317, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2024)


Abstract
The exponential increase in the amount of available data makes taking advantage of it without violating users' privacy one of the fundamental problems of computer science. This question has been investigated thoroughly under the framework of differential privacy. However, most of the literature has not focused on settings where the amount of data is so large that we are not even able to compute the exact answer in the non-private setting (such as in the streaming setting, sublinear-time setting, etc.). This can often make the use of differential privacy infeasible in practice. In this paper, we show a general approach for making Monte-Carlo randomized approximation algorithms differentially private. We only need to assume that the error R of the approximation algorithm is sufficiently concentrated around 0 (e.g. 𝔼[|R|] is bounded) and that the function being approximated has a small global sensitivity Δ. Specifically, if we have a randomized approximation algorithm with sufficiently concentrated error which has time/space/query complexity T(n,ρ) with ρ being an accuracy parameter, we can, generally speaking, get an algorithm with the same accuracy and complexity T(n,Θ(ε ρ)) that is ε-differentially private. Our technical results are as follows. First, we show that if the error is subexponential, then the Laplace mechanism with error magnitude proportional to the sum of the global sensitivity Δ and the subexponential diameter of the error of the algorithm makes the algorithm differentially private. This is true even if the worst-case global sensitivity of the algorithm is large or infinite. We then introduce a new additive noise mechanism, which we call the zero-symmetric Pareto mechanism. We show that using this mechanism, we can make an algorithm differentially private even if we only assume a bound on the first absolute moment of the error 𝔼[|R|]. Finally, we use our results to give either the first known or improved sublinear-complexity differentially private algorithms for various problems. This includes results for frequency moments, estimating the average degree of a graph in sublinear time, rank queries, and estimating the size of the maximum matching. Our results raise many new questions, and we state multiple open problems.
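As an illustration of the Laplace-mechanism result above, the following minimal Python sketch perturbs the output of a non-private Monte-Carlo estimator with Laplace noise whose scale grows with the sum of the global sensitivity Δ and the subexponential diameter of the error. It is only a sketch of the output-perturbation step, not the paper's exact mechanism or its analysis; the names approx_algorithm, global_sensitivity and subexp_diameter are illustrative placeholders.

import random

def laplace_noise(scale: float) -> float:
    # Laplace(0, scale), sampled as the difference of two i.i.d. exponentials.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def private_estimate(approx_algorithm, data, eps, global_sensitivity, subexp_diameter):
    # Run the non-private approximation algorithm once and add noise whose
    # magnitude is proportional to (Delta + subexponential diameter) / eps,
    # mirroring the Laplace-mechanism statement in the abstract.
    estimate = approx_algorithm(data)
    scale = (global_sensitivity + subexp_diameter) / eps
    return estimate + laplace_noise(scale)

For instance, a counting-style statistic with Δ = 1 and a tightly concentrated estimator would be released as private_estimate(estimator, data, eps=1.0, global_sensitivity=1.0, subexp_diameter=b) for the appropriate diameter bound b.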

Cite as

Jakub Tětek. Additive Noise Mechanisms for Making Randomized Approximation Algorithms Differentially Private. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 317, pp. 73:1-73:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{tetek:LIPIcs.APPROX/RANDOM.2024.73,
  author =	{T\v{e}tek, Jakub},
  title =	{{Additive Noise Mechanisms for Making Randomized Approximation Algorithms Differentially Private}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2024)},
  pages =	{73:1--73:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-348-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{317},
  editor =	{Kumar, Amit and Ron-Zewi, Noga},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX/RANDOM.2024.73},
  URN =		{urn:nbn:de:0030-drops-210660},
  doi =		{10.4230/LIPIcs.APPROX/RANDOM.2024.73},
  annote =	{Keywords: Differential privacy, Randomized approximation algorithms}
}
Document
Asymptotically-Good RLCCs with (log n)^(2+o(1)) Queries

Authors: Gil Cohen and Tal Yankovitz

Published in: LIPIcs, Volume 300, 39th Computational Complexity Conference (CCC 2024)


Abstract
Recently, Kumar and Mon reached a significant milestone by constructing asymptotically good relaxed locally correctable codes (RLCCs) with poly-logarithmic query complexity. Specifically, they constructed n-bit RLCCs with O(log^{69} n) queries. Their construction relies on a clever reduction to locally testable codes (LTCs), capitalizing on recent breakthrough works in LTCs. As for lower bounds, Gur and Lachish (SICOMP 2021) proved that any asymptotically-good RLCC must make Ω̃(√{log n}) queries. Hence emerges the intriguing question regarding the identity of the least value 1/2 ≤ e ≤ 69 for which asymptotically-good RLCCs with query complexity (log n)^{e+o(1)} exist. In this work, we make substantial progress in narrowing the gap by devising asymptotically-good RLCCs with a query complexity of (log n)^{2+o(1)}. The key insight driving our work lies in recognizing that the strong guarantee of local testability overshoots the requirements for the Kumar-Mon reduction. In particular, we prove that we can replace the LTCs by "vanilla" expander codes which indeed have the necessary property: local testability in the code’s vicinity.

Cite as

Gil Cohen and Tal Yankovitz. Asymptotically-Good RLCCs with (log n)^(2+o(1)) Queries. In 39th Computational Complexity Conference (CCC 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 300, pp. 8:1-8:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{cohen_et_al:LIPIcs.CCC.2024.8,
  author =	{Cohen, Gil and Yankovitz, Tal},
  title =	{{Asymptotically-Good RLCCs with (log n)^(2+o(1)) Queries}},
  booktitle =	{39th Computational Complexity Conference (CCC 2024)},
  pages =	{8:1--8:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-331-7},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{300},
  editor =	{Santhanam, Rahul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2024.8},
  URN =		{urn:nbn:de:0030-drops-204045},
  doi =		{10.4230/LIPIcs.CCC.2024.8},
  annote =	{Keywords: Relaxed locally decodable codes, Relaxed locally correctable codes, RLCC, RLDC}
}
Document
RANDOM
How to Make Your Approximation Algorithm Private: A Black-Box Differentially-Private Transformation for Tunable Approximation Algorithms of Functions with Low Sensitivity

Authors: Jeremiah Blocki, Elena Grigorescu, Tamalika Mukherjee, and Samson Zhou

Published in: LIPIcs, Volume 275, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2023)


Abstract
We develop a framework for efficiently transforming certain approximation algorithms into differentially-private variants, in a black-box manner. Specifically, our results focus on algorithms A that output an approximation to a function f of the form (1-α)f(x)-κ ≤ A(x) ≤ (1+α)f(x)+κ, where the additive error κ ∈ ℝ_{≥ 0} and the multiplicative error α ∈ [0,1) can be "tuned" to small-enough values while incurring only a polynomial blowup in the running time/space. We show that such algorithms can be made differentially private without sacrificing accuracy, as long as the function f has small "global sensitivity". We achieve these results by applying the "smooth sensitivity" framework developed by Nissim, Raskhodnikova, and Smith (STOC 2007). Our framework naturally applies to transform non-private FPRAS and FPTAS algorithms into ε-differentially private approximation algorithms; in the former case an additional postprocessing step is required. We apply our framework in the context of sublinear-time and sublinear-space algorithms, while preserving the nature of the algorithm in meaningful ranges of the parameters. Our results include the first (to the best of our knowledge) ε-edge differentially-private sublinear-time algorithms for estimating the number of triangles, the number of connected components, and the weight of a minimum spanning tree of a graph, whose accuracy holds with high probability. In the area of streaming algorithms, our results include ε-DP algorithms for estimating L_p-norms, distinct elements, and weighted minimum spanning tree for both insertion-only and turnstile streams. Our transformation also provides a private version of the smooth histogram framework, which is commonly used for converting streaming algorithms into sliding window variants, and achieves a multiplicative approximation to many problems, such as estimating L_p-norms, distinct elements, and the length of the longest increasing subsequence.
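The tunable accuracy contract above can be written down concretely. The sketch below shows only the shape of such a black-box wrapper with hypothetical names (tunable_algorithm, global_sensitivity); it deliberately omits the smooth-sensitivity calibration that the paper actually uses and should not be read as the paper's transformation.

import random

def laplace_noise(scale: float) -> float:
    # Laplace(0, scale), sampled as the difference of two i.i.d. exponentials.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def satisfies_contract(estimate, true_value, alpha, kappa):
    # The tunable guarantee (1-alpha)*f(x) - kappa <= A(x) <= (1+alpha)*f(x) + kappa.
    return (1 - alpha) * true_value - kappa <= estimate <= (1 + alpha) * true_value + kappa

def private_wrapper(tunable_algorithm, data, eps, global_sensitivity, alpha, kappa):
    # Schematic black-box pattern only: run A with its error parameters tuned
    # small, then perturb the output with noise scaled to the global sensitivity
    # of f. The paper's actual transformation calibrates noise via the smooth
    # sensitivity framework of Nissim, Raskhodnikova, and Smith (STOC 2007),
    # which is not reproduced here.
    estimate = tunable_algorithm(data, alpha, kappa)
    return estimate + laplace_noise(global_sensitivity / eps)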

Cite as

Jeremiah Blocki, Elena Grigorescu, Tamalika Mukherjee, and Samson Zhou. How to Make Your Approximation Algorithm Private: A Black-Box Differentially-Private Transformation for Tunable Approximation Algorithms of Functions with Low Sensitivity. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 275, pp. 59:1-59:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{blocki_et_al:LIPIcs.APPROX/RANDOM.2023.59,
  author =	{Blocki, Jeremiah and Grigorescu, Elena and Mukherjee, Tamalika and Zhou, Samson},
  title =	{{How to Make Your Approximation Algorithm Private: A Black-Box Differentially-Private Transformation for Tunable Approximation Algorithms of Functions with Low Sensitivity}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2023)},
  pages =	{59:1--59:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-296-9},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{275},
  editor =	{Megow, Nicole and Smith, Adam},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX/RANDOM.2023.59},
  URN =		{urn:nbn:de:0030-drops-188849},
  doi =		{10.4230/LIPIcs.APPROX/RANDOM.2023.59},
  annote =	{Keywords: Differential privacy, approximation algorithms}
}
Document
On Relaxed Locally Decodable Codes for Hamming and Insertion-Deletion Errors

Authors: Alexander R. Block, Jeremiah Blocki, Kuan Cheng, Elena Grigorescu, Xin Li, Yu Zheng, and Minshen Zhu

Published in: LIPIcs, Volume 264, 38th Computational Complexity Conference (CCC 2023)


Abstract
Locally Decodable Codes (LDCs) are error-correcting codes C:Σⁿ → Σ^m, encoding messages in Σⁿ to codewords in Σ^m, with super-fast decoding algorithms. They are important mathematical objects in many areas of theoretical computer science, yet the best constructions so far have codeword length m that is super-polynomial in n, for codes with constant query complexity and constant alphabet size. In a very surprising result, Ben-Sasson, Goldreich, Harsha, Sudan, and Vadhan (SICOMP 2006) showed how to construct a relaxed version of LDCs (RLDCs) with constant query complexity and almost linear codeword length over the binary alphabet, and used them to obtain significantly-improved constructions of Probabilistically Checkable Proofs. In this work, we study RLDCs in the standard Hamming-error setting, and introduce their variants in the insertion and deletion (Insdel) error setting. Standard LDCs for Insdel errors were first studied by Ostrovsky and Paskin-Cherniavsky (Information Theoretic Security, 2015), and are further motivated by recent advances in DNA random access bio-technologies. Our first result is an exponential lower bound on the length of Hamming RLDCs making 2 queries (even adaptively), over the binary alphabet. This answers a question explicitly raised by Gur and Lachish (SICOMP 2021) and is the first exponential lower bound for RLDCs. Combined with the results of Ben-Sasson et al., our result exhibits a "phase-transition"-type behavior on the codeword length for some constant-query complexity. We achieve these lower bounds via a transformation of RLDCs to standard Hamming LDCs, using a careful analysis of restrictions of message bits that fix codeword bits. We further define two variants of RLDCs in the Insdel-error setting, a weak and a strong version. On the one hand, we construct weak Insdel RLDCs with almost linear codeword length and constant query complexity, matching the parameters of the Hamming variants. On the other hand, we prove exponential lower bounds for strong Insdel RLDCs. These results demonstrate that, while these variants are equivalent in the Hamming setting, they are significantly different in the insdel setting. Our results also prove a strict separation between Hamming RLDCs and Insdel RLDCs.

Cite as

Alexander R. Block, Jeremiah Blocki, Kuan Cheng, Elena Grigorescu, Xin Li, Yu Zheng, and Minshen Zhu. On Relaxed Locally Decodable Codes for Hamming and Insertion-Deletion Errors. In 38th Computational Complexity Conference (CCC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 264, pp. 14:1-14:25, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{block_et_al:LIPIcs.CCC.2023.14,
  author =	{Block, Alexander R. and Blocki, Jeremiah and Cheng, Kuan and Grigorescu, Elena and Li, Xin and Zheng, Yu and Zhu, Minshen},
  title =	{{On Relaxed Locally Decodable Codes for Hamming and Insertion-Deletion Errors}},
  booktitle =	{38th Computational Complexity Conference (CCC 2023)},
  pages =	{14:1--14:25},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-282-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{264},
  editor =	{Ta-Shma, Amnon},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2023.14},
  URN =		{urn:nbn:de:0030-drops-182847},
  doi =		{10.4230/LIPIcs.CCC.2023.14},
  annote =	{Keywords: Relaxed Locally Decodable Codes, Hamming Errors, Insdel Errors, Lower Bounds}
}
Document
Track A: Algorithms, Complexity and Games
Privately Estimating Graph Parameters in Sublinear Time

Authors: Jeremiah Blocki, Elena Grigorescu, and Tamalika Mukherjee

Published in: LIPIcs, Volume 229, 49th International Colloquium on Automata, Languages, and Programming (ICALP 2022)


Abstract
We initiate a systematic study of algorithms that are both differentially-private and run in sublinear time for several problems in which the goal is to estimate natural graph parameters. Our main result is a differentially-private (1+ρ)-approximation algorithm for the problem of computing the average degree of a graph, for every ρ > 0. The running time of the algorithm is roughly the same (for sparse graphs) as its non-private version proposed by Goldreich and Ron (Sublinear Algorithms, 2005). We also obtain the first differentially-private sublinear-time approximation algorithms for the maximum matching size and the minimum vertex cover size of a graph. An overarching technique we employ is the notion of coupled global sensitivity of randomized algorithms. Related variants of this notion of sensitivity have been used in the literature in ad-hoc ways. Here we formalize the notion and develop it as a unifying framework for privacy analysis of randomized approximation algorithms.
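For the private average-degree result, the baseline output-perturbation step can be written down directly. The sketch below is only that baseline: it reads the whole graph, so it does not capture the sublinear-time sampling or the coupled-global-sensitivity analysis that are the paper's actual contribution; the function name dp_average_degree and the adjacency-dict representation are illustrative choices.

import random

def laplace_noise(scale: float) -> float:
    # Laplace(0, scale), sampled as the difference of two i.i.d. exponentials.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def dp_average_degree(adj, eps):
    # adj: dict mapping each node to a set of its neighbours (undirected graph).
    # Adding or removing a single edge changes the average degree 2m/n by 2/n,
    # so Laplace noise of scale (2/n)/eps suffices for eps-edge-DP of the exact
    # statistic. Illustrative baseline only.
    n = len(adj)
    m = sum(len(neighbours) for neighbours in adj.values()) // 2
    return 2.0 * m / n + laplace_noise((2.0 / n) / eps)

For example, dp_average_degree({0: {1}, 1: {0, 2}, 2: {1}}, eps=1.0) releases a noisy version of the exact average degree 4/3.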

Cite as

Jeremiah Blocki, Elena Grigorescu, and Tamalika Mukherjee. Privately Estimating Graph Parameters in Sublinear Time. In 49th International Colloquium on Automata, Languages, and Programming (ICALP 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 229, pp. 26:1-26:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{blocki_et_al:LIPIcs.ICALP.2022.26,
  author =	{Blocki, Jeremiah and Grigorescu, Elena and Mukherjee, Tamalika},
  title =	{{Privately Estimating Graph Parameters in Sublinear Time}},
  booktitle =	{49th International Colloquium on Automata, Languages, and Programming (ICALP 2022)},
  pages =	{26:1--26:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-235-8},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{229},
  editor =	{Boja\'{n}czyk, Miko{\l}aj and Merelli, Emanuela and Woodruff, David P.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2022.26},
  URN =		{urn:nbn:de:0030-drops-163674},
  doi =		{10.4230/LIPIcs.ICALP.2022.26},
  annote =	{Keywords: differential privacy, sublinear time, graph algorithms}
}
Document
On Explicit Constructions of Extremely Depth Robust Graphs

Authors: Jeremiah Blocki, Mike Cinkoske, Seunghoon Lee, and Jin Young Son

Published in: LIPIcs, Volume 219, 39th International Symposium on Theoretical Aspects of Computer Science (STACS 2022)


Abstract
A directed acyclic graph G = (V,E) is said to be (e,d)-depth-robust if for every subset S ⊆ V of |S| ≤ e nodes the graph G-S still contains a directed path of length d. If the graph is (e,d)-depth-robust for every e,d such that e+d ≤ (1-ε)|V|, then the graph is said to be ε-extreme depth-robust. In the field of cryptography, (extremely) depth-robust graphs with low indegree have found numerous applications, including the design of side-channel resistant Memory-Hard Functions, Proofs of Space and Replication, and Computationally Relaxed Locally Correctable Codes. In these applications, it is desirable to ensure the graphs are locally navigable, i.e., there is an efficient algorithm GetParents running in time polylog|V| which takes as input a node v ∈ V and returns the set of v’s parents. We give the first explicit construction of locally navigable ε-extreme depth-robust graphs with indegree O(log |V|). Previous constructions of ε-extreme depth-robust graphs either had indegree ω̃(log² |V|) or were not explicit.
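The local-navigability requirement only concerns the interface: given a node v, return v's parents in polylog(|V|) time. The toy example below shows such a GetParents interface for a hypothetical DAG with logarithmic indegree; it is emphatically not the extreme depth-robust construction of the paper.

def get_parents(v: int) -> list:
    # Hypothetical locally navigable DAG on nodes 0, 1, 2, ...: node v has an
    # edge from v - 2^i for every power of two 2^i <= v. The indegree is
    # O(log v) and the parent set is computed in O(log v) time from v alone,
    # which is the kind of GetParents access the abstract refers to.
    parents = []
    power = 1
    while power <= v:
        parents.append(v - power)
        power *= 2
    return parents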

Cite as

Jeremiah Blocki, Mike Cinkoske, Seunghoon Lee, and Jin Young Son. On Explicit Constructions of Extremely Depth Robust Graphs. In 39th International Symposium on Theoretical Aspects of Computer Science (STACS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 219, pp. 14:1-14:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{blocki_et_al:LIPIcs.STACS.2022.14,
  author =	{Blocki, Jeremiah and Cinkoske, Mike and Lee, Seunghoon and Son, Jin Young},
  title =	{{On Explicit Constructions of Extremely Depth Robust Graphs}},
  booktitle =	{39th International Symposium on Theoretical Aspects of Computer Science (STACS 2022)},
  pages =	{14:1--14:11},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-222-8},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{219},
  editor =	{Berenbrink, Petra and Monmege, Benjamin},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2022.14},
  URN =		{urn:nbn:de:0030-drops-158241},
  doi =		{10.4230/LIPIcs.STACS.2022.14},
  annote =	{Keywords: Depth-Robust Graphs, Explicit Constructions, Data-Independent Memory Hard Functions, Proofs of Space and Replication}
}
Document
On the Security of Proofs of Sequential Work in a Post-Quantum World

Authors: Jeremiah Blocki, Seunghoon Lee, and Samson Zhou

Published in: LIPIcs, Volume 199, 2nd Conference on Information-Theoretic Cryptography (ITC 2021)


Abstract
A Proof of Sequential Work (PoSW) allows a prover to convince a resource-bounded verifier that the prover invested a substantial amount of sequential time to perform some underlying computation. PoSWs have many applications, including time-stamping, blockchain design, and universally verifiable CPU benchmarks. Mahmoody, Moran, and Vadhan (ITCS 2013) gave the first construction of a PoSW in the random oracle model, though the construction relied on expensive depth-robust graphs. In a recent breakthrough, Cohen and Pietrzak (EUROCRYPT 2018) gave an efficient PoSW construction that does not require expensive depth-robust graphs. In the classical parallel random oracle model, it is straightforward to argue that any successful PoSW attacker must produce a long ℋ-sequence and that any malicious party running in sequential time T-1 will fail to produce an ℋ-sequence of length T except with negligible probability. In this paper, we prove that any quantum attacker running in sequential time T-1 will fail to produce an ℋ-sequence except with negligible probability - even if the attacker submits a large batch of quantum queries in each round. The proof is substantially more challenging and highlights the power of Zhandry’s recent compressed oracle technique (CRYPTO 2019). We further extend this result to establish post-quantum security of a non-interactive PoSW obtained by applying the Fiat-Shamir transform to Cohen and Pietrzak’s efficient construction (EUROCRYPT 2018).
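For concreteness, an ℋ-sequence of length T is simply a chain x_0, x_1 = H(x_0), ..., x_T = H(x_{T-1}); the security arguments above concern the classical and now quantum infeasibility of producing such a chain in fewer than T sequential rounds. Below is a minimal stand-in with SHA-256 playing the role of the random oracle; the Cohen-Pietrzak PoSW builds a labelled DAG and a commitment on top of such calls, none of which is shown here.

import hashlib

def h_sequence(seed: bytes, length: int) -> list:
    # Hash chain x_0, x_1 = H(x_0), ..., x_T = H(x_{T-1}). Each link depends on
    # the previous one, so computing x_T is believed to require T sequential
    # calls to H; this is only the sequential-work baseline.
    chain = [seed]
    for _ in range(length):
        chain.append(hashlib.sha256(chain[-1]).digest())
    return chain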

Cite as

Jeremiah Blocki, Seunghoon Lee, and Samson Zhou. On the Security of Proofs of Sequential Work in a Post-Quantum World. In 2nd Conference on Information-Theoretic Cryptography (ITC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 199, pp. 22:1-22:27, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{blocki_et_al:LIPIcs.ITC.2021.22,
  author =	{Blocki, Jeremiah and Lee, Seunghoon and Zhou, Samson},
  title =	{{On the Security of Proofs of Sequential Work in a Post-Quantum World}},
  booktitle =	{2nd Conference on Information-Theoretic Cryptography (ITC 2021)},
  pages =	{22:1--22:27},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-197-9},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{199},
  editor =	{Tessaro, Stefano},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITC.2021.22},
  URN =		{urn:nbn:de:0030-drops-143415},
  doi =		{10.4230/LIPIcs.ITC.2021.22},
  annote =	{Keywords: Proof of Sequential Work, Parallel Quantum Random Oracle Model, Lower Bounds}
}
Document
A New Connection Between Node and Edge Depth Robust Graphs

Authors: Jeremiah Blocki and Mike Cinkoske

Published in: LIPIcs, Volume 185, 12th Innovations in Theoretical Computer Science Conference (ITCS 2021)


Abstract
Given a directed acyclic graph (DAG) G = (V,E), we say that G is (e,d)-depth-robust (resp. (e,d)-edge-depth-robust) if for any set S ⊂ V (resp. S ⊆ E) of at most |S| ≤ e nodes (resp. edges) the graph G-S contains a directed path of length d. While edge-depth-robust graphs are potentially easier to construct, many applications in cryptography require node depth-robust graphs with small indegree. We create a graph reduction that transforms an (e, d)-edge-depth-robust graph with m edges into an (e/2,d)-depth-robust graph with O(m) nodes and constant indegree. One immediate consequence of this result is the first construction of a provably ((n log log n)/log n, n/(log n)^{1 + log log n})-depth-robust graph with constant indegree, where previous constructions for e = (n log log n)/log n had d = O(n^{1-ε}). Our reduction crucially relies on ST-Robust graphs, a new graph property we introduce which may be of independent interest. We say that a directed, acyclic graph with n inputs and n outputs is (k₁, k₂)-ST-Robust if we can remove any k₁ nodes and there exists a subgraph containing at least k₂ inputs and k₂ outputs such that each of the k₂ inputs is connected to all of the k₂ outputs. If the graph is (k₁,n-k₁)-ST-Robust for all k₁ ≤ n, we say that the graph is maximally ST-robust. We show how to construct maximally ST-robust graphs with constant indegree and O(n) nodes. Given a family 𝕄 of ST-robust graphs and an arbitrary (e, d)-edge-depth-robust graph G, we construct a new constant-indegree graph Reduce(G, 𝕄) by replacing each node in G with an ST-robust graph from 𝕄. We also show that ST-robust graphs can be used to construct (tight) proofs-of-space and (asymptotically) improved wide-block labeling functions.
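The depth-robustness definition above is easy to evaluate for a single removal set: delete S and measure the longest remaining directed path. The sketch below does exactly that for a DAG whose nodes are numbered in topological order; certifying (e,d)-depth-robustness, i.e. checking every S of size at most e, is of course infeasible in general, and the counting convention (edges vs. nodes on a path) is an assumption of this sketch.

def longest_path_after_removal(n, edges, removed):
    # n: number of nodes 0..n-1, assumed numbered in topological order
    # (u < v for every edge (u, v)). Returns the number of edges on the
    # longest directed path of G - S for the single removal set S = removed.
    removed = set(removed)
    longest_to = [0] * n            # longest path (in edges) ending at each node
    for u, v in sorted(edges):      # sorting by source respects the topological order
        if u in removed or v in removed:
            continue
        longest_to[v] = max(longest_to[v], longest_to[u] + 1)
    return max(longest_to) if n else 0

For example, on the path 0→1→2→3 (edges [(0,1), (1,2), (2,3)]), removing node 2 leaves a longest path of one edge.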

Cite as

Jeremiah Blocki and Mike Cinkoske. A New Connection Between Node and Edge Depth Robust Graphs. In 12th Innovations in Theoretical Computer Science Conference (ITCS 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 185, pp. 64:1-64:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{blocki_et_al:LIPIcs.ITCS.2021.64,
  author =	{Blocki, Jeremiah and Cinkoske, Mike},
  title =	{{A New Connection Between Node and Edge Depth Robust Graphs}},
  booktitle =	{12th Innovations in Theoretical Computer Science Conference (ITCS 2021)},
  pages =	{64:1--64:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-177-1},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{185},
  editor =	{Lee, James R.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2021.64},
  URN =		{urn:nbn:de:0030-drops-136038},
  doi =		{10.4230/LIPIcs.ITCS.2021.64},
  annote =	{Keywords: Depth robust graphs, memory hard functions}
}
Document
Locally Decodable/Correctable Codes for Insertions and Deletions

Authors: Alexander R. Block, Jeremiah Blocki, Elena Grigorescu, Shubhang Kulkarni, and Minshen Zhu

Published in: LIPIcs, Volume 182, 40th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2020)


Abstract
Recent efforts in coding theory have focused on building codes for insertions and deletions, called insdel codes, with optimal trade-offs between their redundancy and their error-correction capabilities, as well as efficient encoding and decoding algorithms. In many applications, polynomial running time may still be prohibitively expensive, which has motivated the study of codes with super-efficient decoding algorithms. These have led to the well-studied notions of Locally Decodable Codes (LDCs) and Locally Correctable Codes (LCCs). Inspired by these notions, Ostrovsky and Paskin-Cherniavsky (Information Theoretic Security, 2015) generalized Hamming LDCs to insertions and deletions. To the best of our knowledge, these are the only known results that study the analogues of Hamming LDCs in channels performing insertions and deletions. Here we continue the study of insdel codes that admit local algorithms. Specifically, we reprove the results of Ostrovsky and Paskin-Cherniavsky for insdel LDCs using a different set of techniques. We also observe that the techniques extend to constructions of LCCs. Specifically, we obtain insdel LDCs and LCCs from their Hamming LDC and LCC analogues, respectively. The rate and error-correction capability blow up only by a constant factor, while the query complexity blows up by a polylogarithmic factor in the block length. Since insdel locally decodable/correctable codes are scarcely studied in the literature, we believe our results and techniques may lead to further research. In particular, we conjecture that constant-query insdel LDCs/LCCs do not exist.

Cite as

Alexander R. Block, Jeremiah Blocki, Elena Grigorescu, Shubhang Kulkarni, and Minshen Zhu. Locally Decodable/Correctable Codes for Insertions and Deletions. In 40th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 182, pp. 16:1-16:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{block_et_al:LIPIcs.FSTTCS.2020.16,
  author =	{Block, Alexander R. and Blocki, Jeremiah and Grigorescu, Elena and Kulkarni, Shubhang and Zhu, Minshen},
  title =	{{Locally Decodable/Correctable Codes for Insertions and Deletions}},
  booktitle =	{40th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2020)},
  pages =	{16:1--16:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-174-0},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{182},
  editor =	{Saxena, Nitin and Simon, Sunil},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2020.16},
  URN =		{urn:nbn:de:0030-drops-132577},
  doi =		{10.4230/LIPIcs.FSTTCS.2020.16},
  annote =	{Keywords: Locally decodable/correctable codes, insert-delete channel}
}
Document
On Locally Decodable Codes in Resource Bounded Channels

Authors: Jeremiah Blocki, Shubhang Kulkarni, and Samson Zhou

Published in: LIPIcs, Volume 163, 1st Conference on Information-Theoretic Cryptography (ITC 2020)


Abstract
Constructions of locally decodable codes (LDCs) have one of two undesirable properties: low rate or high locality (polynomial in the length of the message). In settings where the encoder/decoder have already exchanged cryptographic keys and the channel is a probabilistic polynomial time (PPT) algorithm, it is possible to circumvent these barriers and design LDCs with constant rate and small locality. However, the assumption that the encoder/decoder have exchanged cryptographic keys is often prohibitive. We thus consider the problem of designing explicit and efficient LDCs in settings where the channel is slightly more constrained than the encoder/decoder with respect to some resource, e.g., space or (sequential) time. Given an explicit function f that the channel cannot compute, we show how the encoder can transmit a random secret key to the local decoder using f(⋅) and a random oracle 𝖧(⋅). We then bootstrap the private key LDC construction of Ostrovsky, Pandey and Sahai (ICALP, 2007), thereby answering an open question posed by Guruswami and Smith (FOCS 2010) of whether such bootstrapping techniques are applicable to LDCs in channel models weaker than just PPT algorithms. Specifically, in the random oracle model we show how to construct explicit constant-rate LDCs whose locality is polylogarithmic in the security parameter against various resource-constrained channels.
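The key-transmission idea above can be sketched as follows, with iterated hashing standing in, purely as a placeholder, for a function f that the resource-bounded channel cannot afford to evaluate, and SHA-256 standing in for the random oracle 𝖧. The paper then feeds the shared key into the private-key LDC of Ostrovsky, Pandey and Sahai; that step is not shown, and all parameter choices here are illustrative assumptions.

import hashlib
import os

ITERATIONS = 100_000  # illustrative cost parameter for the channel-hard function f

def f(x: bytes) -> bytes:
    # Placeholder for a function the channel cannot compute within its resource
    # bound (e.g. a sequentially hard function); here just iterated hashing.
    for _ in range(ITERATIONS):
        x = hashlib.sha256(x).digest()
    return x

def H(x: bytes) -> bytes:
    # Random-oracle stand-in.
    return hashlib.sha256(x).digest()

def transmit_key(secret_key: bytes):
    # Encoder: publish (r, secret_key XOR H(f(r))). A channel that cannot
    # evaluate f learns nothing useful about secret_key, while the decoder,
    # which can evaluate f, recovers it below. secret_key is assumed 32 bytes.
    r = os.urandom(32)
    pad = H(f(r))
    return r, bytes(a ^ b for a, b in zip(secret_key, pad))

def recover_key(r: bytes, masked: bytes) -> bytes:
    pad = H(f(r))
    return bytes(a ^ b for a, b in zip(masked, pad))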

Cite as

Jeremiah Blocki, Shubhang Kulkarni, and Samson Zhou. On Locally Decodable Codes in Resource Bounded Channels. In 1st Conference on Information-Theoretic Cryptography (ITC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 163, pp. 16:1-16:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{blocki_et_al:LIPIcs.ITC.2020.16,
  author =	{Blocki, Jeremiah and Kulkarni, Shubhang and Zhou, Samson},
  title =	{{On Locally Decodable Codes in Resource Bounded Channels}},
  booktitle =	{1st Conference on Information-Theoretic Cryptography (ITC 2020)},
  pages =	{16:1--16:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-151-1},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{163},
  editor =	{Tauman Kalai, Yael and Smith, Adam D. and Wichs, Daniel},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITC.2020.16},
  URN =		{urn:nbn:de:0030-drops-121216},
  doi =		{10.4230/LIPIcs.ITC.2020.16},
  annote =	{Keywords: Locally Decodable Codes, Resource Bounded Channels}
}
Document
Approximating Cumulative Pebbling Cost Is Unique Games Hard

Authors: Jeremiah Blocki, Seunghoon Lee, and Samson Zhou

Published in: LIPIcs, Volume 151, 11th Innovations in Theoretical Computer Science Conference (ITCS 2020)


Abstract
The cumulative pebbling complexity of a directed acyclic graph G is defined as cc(G) = min_P ∑_i |P_i|, where the minimum is taken over all legal (parallel) black pebblings of G and |P_i| denotes the number of pebbles on the graph during round i. Intuitively, cc(G) captures the amortized Space-Time complexity of pebbling m copies of G in parallel. The cumulative pebbling complexity of a graph G is of particular interest in the field of cryptography as cc(G) is tightly related to the amortized Area-Time complexity of the Data-Independent Memory-Hard Function (iMHF) f_{G,H} [Joël Alwen and Vladimir Serbinenko, 2015] defined using a constant indegree directed acyclic graph (DAG) G and a random oracle H(⋅). A secure iMHF should have amortized Space-Time complexity as high as possible, e.g., to deter a brute-force password attacker who wants to find x such that f_{G,H}(x) = h. Thus, to analyze the (in)security of a candidate iMHF f_{G,H}, it is crucial to estimate the value cc(G), but currently upper and lower bounds for leading iMHF candidates differ by several orders of magnitude. Blocki and Zhou recently showed that it is NP-Hard to compute cc(G), but their techniques do not even rule out an efficient (1+ε)-approximation algorithm for any constant ε>0. We show that for any constant c > 0, it is Unique Games hard to approximate cc(G) to within a factor of c. Along the way, we show the hardness of approximation of the DAG Vertex Deletion problem on DAGs of constant indegree. Namely, we show that for any k,ε >0 and given a DAG G with N nodes and constant indegree, it is Unique Games hard to distinguish between the case that G is (e_1, d_1)-reducible with e_1=N^{1/(1+2 ε)}/k and d_1=k N^{2 ε/(1+2 ε)}, and the case that G is (e_2, d_2)-depth-robust with e_2 = (1-ε)k e_1 and d_2= 0.9 N^{(1+ε)/(1+2 ε)}, which may be of independent interest. Our result generalizes a result of Svensson who proved an analogous result for DAGs with indegree 𝒪(N).
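The quantity being approximated has a very direct definition: a (parallel) pebbling P_1, ..., P_t of G has cumulative cost ∑_i |P_i|, and cc(G) is the minimum over all legal pebblings. The sketch below merely scores one candidate pebbling and checks the legality rule; finding the minimizing pebbling is exactly what the abstract shows to be Unique Games hard to approximate. The dictionary-of-parents representation is an assumption of this sketch.

def cumulative_cost(pebbling):
    # pebbling: list of sets P_1, ..., P_t of currently pebbled nodes.
    return sum(len(round_pebbles) for round_pebbles in pebbling)

def is_legal(pebbling, parents):
    # Parallel black-pebbling rule: a pebble newly placed on v in round i
    # requires every parent of v to carry a pebble in round i-1 (source nodes
    # have no parents and may always be pebbled).
    previous = set()
    for current in pebbling:
        for v in set(current) - previous:
            if not set(parents.get(v, ())) <= previous:
                return False
        previous = set(current)
    return True

For the path 0→1→2 with parents = {1: [0], 2: [1]}, the pebbling [{0}, {0, 1}, {1, 2}] is legal and has cumulative cost 5.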

Cite as

Jeremiah Blocki, Seunghoon Lee, and Samson Zhou. Approximating Cumulative Pebbling Cost Is Unique Games Hard. In 11th Innovations in Theoretical Computer Science Conference (ITCS 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 151, pp. 13:1-13:27, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{blocki_et_al:LIPIcs.ITCS.2020.13,
  author =	{Blocki, Jeremiah and Lee, Seunghoon and Zhou, Samson},
  title =	{{Approximating Cumulative Pebbling Cost Is Unique Games Hard}},
  booktitle =	{11th Innovations in Theoretical Computer Science Conference (ITCS 2020)},
  pages =	{13:1--13:27},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-134-4},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{151},
  editor =	{Vidick, Thomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2020.13},
  URN =		{urn:nbn:de:0030-drops-116983},
  doi =		{10.4230/LIPIcs.ITCS.2020.13},
  annote =	{Keywords: Cumulative Pebbling Cost, Approximation Algorithm, Unique Games Conjecture, \gamma-Extreme Depth Robust Graph, Superconcentrator, Memory-Hard Function}
}
Document
Computationally Data-Independent Memory Hard Functions

Authors: Mohammad Hassan Ameri, Jeremiah Blocki, and Samson Zhou

Published in: LIPIcs, Volume 151, 11th Innovations in Theoretical Computer Science Conference (ITCS 2020)


Abstract
Memory hard functions (MHFs) are an important cryptographic primitive that are used to design egalitarian proofs of work and in the construction of moderately expensive key-derivation functions resistant to brute-force attacks. Broadly speaking, MHFs can be divided into two categories: data-dependent memory hard functions (dMHFs) and data-independent memory hard functions (iMHFs). iMHFs are resistant to certain side-channel attacks as the memory access pattern induced by the honest evaluation algorithm is independent of the potentially sensitive input, e.g., password. While dMHFs are potentially vulnerable to side-channel attacks (the induced memory access pattern might leak useful information to a brute-force attacker), they can achieve higher cumulative memory complexity (CMC) than an iMHF. In particular, any iMHF that can be evaluated in N steps on a sequential machine has CMC at most 𝒪((N^2 log log N)/log N). By contrast, the dMHF scrypt achieves maximal CMC Ω(N^2) - though the CMC of scrypt would be reduced to just 𝒪(N) after a side-channel attack. In this paper, we introduce the notion of computationally data-independent memory hard functions (ciMHFs). Intuitively, we require that the memory access pattern induced by the (randomized) ciMHF evaluation algorithm appears to be independent from the standpoint of a computationally bounded eavesdropping attacker - even if the attacker selects the initial input. We then ask whether it is possible to circumvent the known upper bound for iMHFs and build a ciMHF with CMC Ω(N^2). Surprisingly, we answer the question in the affirmative when the ciMHF evaluation algorithm is executed on a two-tiered memory architecture (RAM/Cache). We introduce the notion of a k-restricted dynamic graph to quantify the continuum between unrestricted dMHFs (k=n) and iMHFs (k=1). For any ε > 0 we show how to construct a k-restricted dynamic graph with k=Ω(N^(1-ε)) that provably achieves maximum cumulative pebbling cost Ω(N^2). We can use k-restricted dynamic graphs to build a ciMHF provided that cache is large enough to hold k hash outputs and the dynamic graph satisfies a certain property that we call "amenable to shuffling". In particular, we prove that the induced memory access pattern is indistinguishable to a polynomial time attacker who can monitor the locations of read/write requests to RAM, but not cache. We also show that when k=o(N^(1/log log N)), then any k-restricted graph with constant indegree has cumulative pebbling cost o(N^2). Our results almost completely characterize the spectrum of k-restricted dynamic graphs.

Cite as

Mohammad Hassan Ameri, Jeremiah Blocki, and Samson Zhou. Computationally Data-Independent Memory Hard Functions. In 11th Innovations in Theoretical Computer Science Conference (ITCS 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 151, pp. 36:1-36:28, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{ameri_et_al:LIPIcs.ITCS.2020.36,
  author =	{Ameri, Mohammad Hassan and Blocki, Jeremiah and Zhou, Samson},
  title =	{{Computationally Data-Independent Memory Hard Functions}},
  booktitle =	{11th Innovations in Theoretical Computer Science Conference (ITCS 2020)},
  pages =	{36:1--36:28},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-134-4},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{151},
  editor =	{Vidick, Thomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2020.36},
  URN =		{urn:nbn:de:0030-drops-117214},
  doi =		{10.4230/LIPIcs.ITCS.2020.36},
  annote =	{Keywords: Computationally Data-Independent Memory Hard Function, Cumulative Memory Complexity, Dynamic Pebbling Game}
}
Document
Brief Announcement
Brief Announcement: Relaxed Locally Correctable Codes in Computationally Bounded Channels

Authors: Jeremiah Blocki, Venkata Gandikota, Elena Grigorescu, and Samson Zhou

Published in: LIPIcs, Volume 107, 45th International Colloquium on Automata, Languages, and Programming (ICALP 2018)


Abstract
We study variants of locally decodable and locally correctable codes in computationally bounded, adversarial channels, under the assumption that collision-resistant hash functions exist, and with no public-key or private-key cryptographic setup. Specifically, we provide constructions of relaxed locally correctable and relaxed locally decodable codes over the binary alphabet, with constant information rate, and poly-logarithmic locality. Our constructions compare favorably with existing schemes built under much stronger cryptographic assumptions, and with their classical analogues in the computationally unbounded, Hamming channel. Our constructions crucially employ collision-resistant hash functions and local expander graphs, extending ideas from recent cryptographic constructions of memory-hard functions.

Cite as

Jeremiah Blocki, Venkata Gandikota, Elena Grigorescu, and Samson Zhou. Brief Announcement: Relaxed Locally Correctable Codes in Computationally Bounded Channels. In 45th International Colloquium on Automata, Languages, and Programming (ICALP 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 107, pp. 106:1-106:4, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


Copy BibTex To Clipboard

@InProceedings{blocki_et_al:LIPIcs.ICALP.2018.106,
  author =	{Blocki, Jeremiah and Gandikota, Venkata and Grigorescu, Elena and Zhou, Samson},
  title =	{{Brief Announcement: Relaxed Locally Correctable Codes in Computationally Bounded Channels}},
  booktitle =	{45th International Colloquium on Automata, Languages, and Programming (ICALP 2018)},
  pages =	{106:1--106:4},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-076-7},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{107},
  editor =	{Chatzigiannakis, Ioannis and Kaklamanis, Christos and Marx, D\'{a}niel and Sannella, Donald},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2018.106},
  URN =		{urn:nbn:de:0030-drops-91102},
  doi =		{10.4230/LIPIcs.ICALP.2018.106},
  annote =	{Keywords: Relaxed locally correctable codes, computationally bounded channels, local expanders}
}
Document
Towards Human Computable Passwords

Authors: Jeremiah Blocki, Manuel Blum, Anupam Datta, and Santosh Vempala

Published in: LIPIcs, Volume 67, 8th Innovations in Theoretical Computer Science Conference (ITCS 2017)


Abstract
An interesting challenge for the cryptography community is to design authentication protocols that are so simple that a human can execute them without relying on a fully trusted computer. We propose several candidate authentication protocols for a setting in which the human user can only receive assistance from a semi-trusted computer - a computer that stores information and performs computations correctly but does not provide confidentiality. Our schemes use a semi-trusted computer to store and display public challenges C_i ∈ [n]^k. The human user memorizes a random secret mapping σ: [n] → ℤ_d and authenticates by computing responses f(σ(C_i)) to a sequence of public challenges where f: ℤ_d^k → ℤ_d is a function that is easy for the human to evaluate. We prove that any statistical adversary needs to sample m = Ω̃(n^{s(f)}) challenge-response pairs to recover σ, for a security parameter s(f) that depends on two key properties of f. Our lower bound generalizes recent results of Feldman et al. [Feldman'15] who proved analogous results for the special case d=2. To obtain our results, we apply the general hypercontractivity theorem [O'Donnell'14] to lower bound the statistical dimension of the distribution over challenge-response pairs induced by f and σ. Our statistical dimension lower bounds apply to arbitrary functions f: ℤ_d^k → ℤ_d (not just to functions that are easy for a human to evaluate). As an application, we propose a family of human computable password functions f_{k₁,k₂} in which the user needs to perform 2k₁+2k₂+1 primitive operations (e.g., adding two digits or remembering a secret value σ(i)), and we show that s(f) = min{k₁+1, (k₂+1)/2}. For these schemes, we prove that forging passwords is equivalent to recovering the secret mapping. Thus, our human computable password schemes can maintain strong security guarantees even after an adversary has observed the user login to many different accounts.
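The challenge-response mechanics are simple to write down. In the sketch below, σ is the memorized secret mapping and f is taken to be the sum modulo d of the challenged secret digits; this toy f and the parameters N, K, D are illustrative placeholders only, not the f_{k₁,k₂} family whose security parameter s(f) is analyzed in the paper.

import random

N, K, D = 26, 6, 10   # illustrative parameters: n objects, challenge length k, digits mod d

def generate_secret(n: int = N, d: int = D) -> list:
    # The user's memorized secret mapping sigma: [n] -> Z_d.
    return [random.randrange(d) for _ in range(n)]

def respond(sigma, challenge, d: int = D) -> int:
    # Response f(sigma(C)) to a public challenge C in [n]^k; here f is just the
    # sum mod d of the secret digits at the challenged positions (toy choice).
    return sum(sigma[i] for i in challenge) % d

# Example round: the semi-trusted device displays a random challenge,
# and the user answers from memory.
sigma = generate_secret()
challenge = tuple(random.randrange(N) for _ in range(K))
print(challenge, respond(sigma, challenge))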

Cite as

Jeremiah Blocki, Manuel Blum, Anupam Datta, and Santosh Vempala. Towards Human Computable Passwords. In 8th Innovations in Theoretical Computer Science Conference (ITCS 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 67, pp. 10:1-10:47, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


Copy BibTex To Clipboard

@InProceedings{blocki_et_al:LIPIcs.ITCS.2017.10,
  author =	{Blocki, Jeremiah and Blum, Manuel and Datta, Anupam and Vempala, Santosh},
  title =	{{Towards Human Computable Passwords}},
  booktitle =	{8th Innovations in Theoretical Computer Science Conference (ITCS 2017)},
  pages =	{10:1--10:47},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-029-3},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{67},
  editor =	{Papadimitriou, Christos H.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2017.10},
  URN =		{urn:nbn:de:0030-drops-81847},
  doi =		{10.4230/LIPIcs.ITCS.2017.10},
  annote =	{Keywords: Passwords, Cognitive Authentication, Human Computation, Planted Constraint Satisfaction Problem, Statistical Dimension}
}
  • Refine by Author
  • 12 Blocki, Jeremiah
  • 6 Zhou, Samson
  • 5 Grigorescu, Elena
  • 3 Lee, Seunghoon
  • 2 Block, Alexander R.
  • Show More...

  • Refine by Classification
  • 5 Theory of computation → Error-correcting codes
  • 3 Security and privacy → Hash functions and message authentication codes
  • 2 Theory of computation → Cryptographic primitives
  • 2 Theory of computation → Streaming, sublinear and near linear time algorithms
  • 1 Mathematics of computing → Coding theory
  • Show More...

  • Refine by Keyword
  • 2 Differential privacy
  • 2 Lower Bounds
  • 1 Approximation Algorithm
  • 1 Cognitive Authentication
  • 1 Computationally Data-Independent Memory Hard Function
  • Show More...

  • Refine by Type
  • 14 document

  • Refine by Publication Year
  • 4 2020
  • 2 2021
  • 2 2022
  • 2 2023
  • 2 2024
  • Show More...
