12 Search Results for "De, Anindya"


Document
Testing Intersecting and Union-Closed Families

Authors: Xi Chen, Anindya De, Yuhao Li, Shivam Nadimpalli, and Rocco A. Servedio

Published in: LIPIcs, Volume 287, 15th Innovations in Theoretical Computer Science Conference (ITCS 2024)


Abstract
Inspired by the classic problem of Boolean function monotonicity testing, we investigate the testability of other well-studied properties of combinatorial finite set systems, specifically intersecting families and union-closed families. A function f: {0,1}ⁿ → {0,1} is intersecting (respectively, union-closed) if its set of satisfying assignments corresponds to an intersecting family (respectively, a union-closed family) of subsets of [n]. Our main results are that - in sharp contrast with the property of being a monotone set system - the property of being an intersecting set system, and the property of being a union-closed set system, both turn out to be information-theoretically difficult to test. We show that:
- For ε ≥ Ω(1/√n), any non-adaptive two-sided ε-tester for intersectingness must make 2^{Ω(n^{1/4}/√{ε})} queries. We also give a 2^{Ω(√{n log(1/ε)})}-query lower bound for non-adaptive one-sided ε-testers for intersectingness.
- For ε ≥ 1/2^{Ω(n^{0.49})}, any non-adaptive two-sided ε-tester for union-closedness must make n^{Ω(log(1/ε))} queries.
Thus, neither intersectingness nor union-closedness shares the poly(n,1/ε)-query non-adaptive testability that is enjoyed by monotonicity. To complement our lower bounds, we also give a simple poly(n^{√{n log(1/ε)}}, 1/ε)-query, one-sided, non-adaptive algorithm for ε-testing each of these properties (intersectingness and union-closedness). We thus achieve nearly tight upper and lower bounds for two-sided testing of intersectingness when ε = Θ(1/√n), and for one-sided testing of intersectingness when ε = Θ(1).
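
As a point of reference for these definitions, here is a brute-force sketch (a hedged illustration, not from the paper; feasible only for small n, with f given as a Python callable) that checks whether the satisfying assignments of f form an intersecting family (every two satisfying sets share an element) and whether they form a union-closed family (the union of any two satisfying sets is again satisfying).

from itertools import product

def satisfying_sets(f, n):
    """Satisfying assignments of f: {0,1}^n -> {0,1}, viewed as subsets of [n]."""
    return [frozenset(i for i in range(n) if x[i])
            for x in product([0, 1], repeat=n) if f(x)]

def is_intersecting(f, n):
    """Every two satisfying sets share at least one element."""
    sets = satisfying_sets(f, n)
    return all(s & t for s in sets for t in sets)

def is_union_closed(f, n):
    """The union of any two satisfying sets is again satisfying."""
    sets = set(satisfying_sets(f, n))
    return all((s | t) in sets for s in sets for t in sets)

# Example: f(x) = x_0; every satisfying set contains coordinate 0.
print(is_intersecting(lambda x: x[0], 3), is_union_closed(lambda x: x[0], 3))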

Cite as

Xi Chen, Anindya De, Yuhao Li, Shivam Nadimpalli, and Rocco A. Servedio. Testing Intersecting and Union-Closed Families. In 15th Innovations in Theoretical Computer Science Conference (ITCS 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 287, pp. 33:1-33:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{chen_et_al:LIPIcs.ITCS.2024.33,
  author =	{Chen, Xi and De, Anindya and Li, Yuhao and Nadimpalli, Shivam and Servedio, Rocco A.},
  title =	{{Testing Intersecting and Union-Closed Families}},
  booktitle =	{15th Innovations in Theoretical Computer Science Conference (ITCS 2024)},
  pages =	{33:1--33:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-309-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{287},
  editor =	{Guruswami, Venkatesan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2024.33},
  URN =		{urn:nbn:de:0030-drops-195610},
  doi =		{10.4230/LIPIcs.ITCS.2024.33},
  annote =	{Keywords: Sublinear algorithms, property testing, computational complexity, monotonicity, intersecting families, union-closed families}
}
Document
Improved Pseudorandom Generators for AC⁰ Circuits

Authors: Xin Lyu

Published in: LIPIcs, Volume 234, 37th Computational Complexity Conference (CCC 2022)


Abstract
We give a PRG for depth-d, size-m AC⁰ circuits with seed length O(log^{d-1}(m) log(m/ε) log log(m)). Our PRG improves on previous work [Luca Trevisan and Tongke Xue, 2013; Rocco A. Servedio and Li-Yang Tan, 2019; Zander Kelley, 2021] in several respects. It has optimal dependence on 1/ε and is only one "log log(m)" factor away from the lower bound barrier. For the case of d = 2, the seed length tightly matches the best-known PRG for CNFs [Anindya De et al., 2010; Avishay Tal, 2017].

There are two technical ingredients behind our new result; both of them might be of independent interest. First, we use a partitioning-based approach to construct PRGs based on restriction lemmas for AC⁰. Previous works [Luca Trevisan and Tongke Xue, 2013; Rocco A. Servedio and Li-Yang Tan, 2019; Zander Kelley, 2021] usually built PRGs using the Ajtai-Wigderson framework [Miklós Ajtai and Avi Wigderson, 1989]. Compared with them, the partitioning approach avoids the extra "log(n)" factor that usually arises from the Ajtai-Wigderson framework, allowing us to get the almost-tight seed length. The partitioning approach is quite general, and we believe it can help design PRGs for classes beyond constant-depth circuits.

Second, improving and extending [Luca Trevisan and Tongke Xue, 2013; Rocco A. Servedio and Li-Yang Tan, 2019; Zander Kelley, 2021], we prove a full derandomization of the powerful multi-switching lemma [Johan Håstad, 2014]. We show that one can use a short random seed to sample a restriction such that a family of DNFs simultaneously simplifies under the restriction with high probability. This answers an open question in [Zander Kelley, 2021]. Previous derandomizations were either partial (that is, they pseudorandomly choose the variables to restrict, and then fix those variables to truly random bits) or had sub-optimal seed length. In our application, having a fully derandomized switching lemma is crucial, and the randomness efficiency of our derandomization allows us to get an almost-tight seed length.
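
For readers less familiar with the terminology, the standard definition of ε-fooling (stated here as general background, not quoted from the paper) is the benchmark that the seed-length bound above refers to: a generator G: {0,1}^s → {0,1}ⁿ ε-fools a circuit class if no circuit in the class can distinguish G's output from uniform,

% Standard PRG definition (background only).
\[
  \Bigl|\;\Pr_{z \sim \{0,1\}^{s}}\bigl[C(G(z)) = 1\bigr] \;-\; \Pr_{x \sim \{0,1\}^{n}}\bigl[C(x) = 1\bigr]\;\Bigr| \;\le\; \varepsilon
  \quad\text{for every circuit } C \text{ in the class,}
\]

and the result above achieves this for depth-d, size-m AC⁰ with s = O(log^{d-1}(m) log(m/ε) log log(m)).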

Cite as

Xin Lyu. Improved Pseudorandom Generators for AC⁰ Circuits. In 37th Computational Complexity Conference (CCC 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 234, pp. 34:1-34:25, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{lyu:LIPIcs.CCC.2022.34,
  author =	{Lyu, Xin},
  title =	{{Improved Pseudorandom Generators for AC⁰ Circuits}},
  booktitle =	{37th Computational Complexity Conference (CCC 2022)},
  pages =	{34:1--34:25},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-241-9},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{234},
  editor =	{Lovett, Shachar},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2022.34},
  URN =		{urn:nbn:de:0030-drops-165963},
  doi =		{10.4230/LIPIcs.CCC.2022.34},
  annote =	{Keywords: pseudorandom generators, derandomization, switching Lemmas, AC⁰}
}
Document
Convex Influences

Authors: Anindya De, Shivam Nadimpalli, and Rocco A. Servedio

Published in: LIPIcs, Volume 215, 13th Innovations in Theoretical Computer Science Conference (ITCS 2022)


Abstract
We introduce a new notion of influence for symmetric convex sets over Gaussian space, which we term "convex influence". We show that this new notion of influence shares many of the familiar properties of influences of variables for monotone Boolean functions f: {±1}ⁿ → {±1}. Our main results for convex influences give Gaussian space analogues of many important results on influences for monotone Boolean functions. These include (robust) characterizations of extremal functions, the Poincaré inequality, the Kahn-Kalai-Linial theorem [J. Kahn et al., 1988], a sharp threshold theorem of Kalai [G. Kalai, 2004], a stability version of the Kruskal-Katona theorem due to O'Donnell and Wimmer [R. O'Donnell and K. Wimmer, 2009], and some partial results towards a Gaussian space analogue of Friedgut’s junta theorem [E. Friedgut, 1998]. The proofs of our results for convex influences use very different techniques than the analogous proofs for Boolean influences over {±1}ⁿ. Taken as a whole, our results extend the emerging analogy between symmetric convex sets in Gaussian space and monotone Boolean functions from {±1}ⁿ to {±1}.
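
For orientation, two of the Boolean-cube statements whose symmetric-convex-set analogues are established here can be stated as standard background (these are the classical forms for f: {±1}ⁿ → {±1} with variable influences Inf_i[f]; they are not quoted from the paper):

% Classical Boolean-cube background for the analogy.
\[
  \mathrm{Var}[f] \;\le\; \sum_{i=1}^{n} \mathrm{Inf}_i[f] \qquad \text{(Poincar\'e inequality)},
\]
\[
  \max_{i \in [n]} \mathrm{Inf}_i[f] \;\ge\; c \cdot \mathrm{Var}[f] \cdot \frac{\log n}{n} \qquad \text{(Kahn-Kalai-Linial, for an absolute constant } c > 0\text{)}.
\]

The paper's convex influences play the role of Inf_i[f] once Boolean functions on the cube are replaced by symmetric convex subsets of Gaussian space.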

Cite as

Anindya De, Shivam Nadimpalli, and Rocco A. Servedio. Convex Influences. In 13th Innovations in Theoretical Computer Science Conference (ITCS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 215, pp. 53:1-53:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{de_et_al:LIPIcs.ITCS.2022.53,
  author =	{De, Anindya and Nadimpalli, Shivam and Servedio, Rocco A.},
  title =	{{Convex Influences}},
  booktitle =	{13th Innovations in Theoretical Computer Science Conference (ITCS 2022)},
  pages =	{53:1--53:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-217-4},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{215},
  editor =	{Braverman, Mark},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2022.53},
  URN =		{urn:nbn:de:0030-drops-156498},
  doi =		{10.4230/LIPIcs.ITCS.2022.53},
  annote =	{Keywords: Fourier analysis of Boolean functions, convex geometry, influences, threshold phenomena}
}
Document
Junta Distance Approximation with Sub-Exponential Queries

Authors: Vishnu Iyer, Avishay Tal, and Michael Whitmeyer

Published in: LIPIcs, Volume 200, 36th Computational Complexity Conference (CCC 2021)


Abstract
Leveraging tools of De, Mossel, and Neeman [FOCS, 2019], we show two different results pertaining to the tolerant testing of juntas. Given black-box access to a Boolean function f:{±1}ⁿ → {±1}:
1) We give a poly(k, 1/ε)-query algorithm that distinguishes between functions that are γ-close to k-juntas and (γ+ε)-far from k'-juntas, where k' = O(k/ε²).
2) In the non-relaxed setting, we extend our ideas to give a 2^{Õ(√{k/ε})} (adaptive) query algorithm that distinguishes between functions that are γ-close to k-juntas and (γ+ε)-far from k-juntas.
To the best of our knowledge, this is the first subexponential-in-k query algorithm for approximating the distance of f to being a k-junta (previous results of Blais, Canonne, Eden, Levi, and Ron [SODA, 2018] and De, Mossel, and Neeman [FOCS, 2019] required exponentially many queries in k). Our techniques are Fourier-analytic and make use of the notion of "normalized influences" that was introduced by Talagrand [Michel Talagrand, 1994].
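
To make the objective concrete: f is a k-junta if it depends on at most k coordinates, and dist(f,g) = Pr_x[f(x) ≠ g(x)]. The brute-force sketch below (an illustration only, written over {0,1}ⁿ for simplicity, feasible just for very small n, and unrelated to the query-efficient algorithms above) computes the exact distance from f to the nearest k-junta: for each candidate set S of k coordinates, the closest junta on S answers each setting of those coordinates with the majority value of f over the remaining ones.

from itertools import combinations, product

def dist_to_nearest_k_junta(f, n, k):
    """Exact fraction of inputs on which f differs from its closest k-junta."""
    points = list(product([0, 1], repeat=n))
    best = 1.0
    for S in combinations(range(n), k):
        buckets = {}
        for x in points:
            buckets.setdefault(tuple(x[i] for i in S), []).append(f(x))
        # Best junta on S: answer each bucket with the majority value of f.
        errors = sum(min(sum(v), len(v) - sum(v)) for v in buckets.values())
        best = min(best, errors / len(points))
    return best

# Parity of the first three bits is a 3-junta, so its distance to 3-juntas is 0.
print(dist_to_nearest_k_junta(lambda x: x[0] ^ x[1] ^ x[2], 4, 3))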

Cite as

Vishnu Iyer, Avishay Tal, and Michael Whitmeyer. Junta Distance Approximation with Sub-Exponential Queries. In 36th Computational Complexity Conference (CCC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 200, pp. 24:1-24:38, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{iyer_et_al:LIPIcs.CCC.2021.24,
  author =	{Iyer, Vishnu and Tal, Avishay and Whitmeyer, Michael},
  title =	{{Junta Distance Approximation with Sub-Exponential Queries}},
  booktitle =	{36th Computational Complexity Conference (CCC 2021)},
  pages =	{24:1--24:38},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-193-1},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{200},
  editor =	{Kabanets, Valentine},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2021.24},
  URN =		{urn:nbn:de:0030-drops-142988},
  doi =		{10.4230/LIPIcs.CCC.2021.24},
  annote =	{Keywords: Algorithms, Complexity Theory, Fourier Analysis, Juntas, Normalized Influence, Property Testing, Tolerant Property Testing}
}
Document
Track A: Algorithms, Complexity and Games
Minimum-Norm Load Balancing Is (Almost) as Easy as Minimizing Makespan

Authors: Sharat Ibrahimpur and Chaitanya Swamy

Published in: LIPIcs, Volume 198, 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021)


Abstract
We consider the minimum-norm load-balancing (MinNormLB) problem, wherein there are n jobs, each of which needs to be assigned to one of m machines, and we are given the processing times {p_{ij}} of the jobs on the machines. We also have a monotone, symmetric norm f:ℝ^m → ℝ_{≥ 0}. We seek an assignment σ of jobs to machines that minimizes the f-norm of the induced load vector load_σ ∈ ℝ_{≥ 0}^m, where load_σ(i) = ∑_{j: σ(j) = i} p_{ij}. This problem was introduced by [Deeparnab Chakrabarty and Chaitanya Swamy, 2019], and the current-best result for MinNormLB is a (4+ε)-approximation [Deeparnab Chakrabarty and Chaitanya Swamy, 2019]. In the stochastic version of MinNormLB, the job processing times are given by nonnegative random variables X_{ij}, and jobs are independent; the goal is to find an assignment σ that minimizes the expected f-norm of the induced random load vector. We obtain results that (essentially) match the best-known guarantees for deterministic makespan minimization (MinNormLB with the 𝓁_∞ norm). For MinNormLB, we obtain a (2+ε)-approximation for unrelated machines, and a PTAS for identical machines. For stochastic MinNormLB, we consider the setting where the X_{ij}s are Poisson random variables, denoted PoisNormLB. Our main result here is a novel and powerful reduction showing that, for any machine environment (e.g., unrelated/identical machines), any α-approximation algorithm for MinNormLB in that machine environment yields a randomized α(1+ε)-approximation for PoisNormLB in that machine environment. Combining this with our results for MinNormLB, we immediately obtain a (2+ε)-approximation for PoisNormLB on unrelated machines, and a PTAS for PoisNormLB on identical machines. The latter result substantially generalizes a PTAS for makespan minimization with Poisson jobs obtained recently by [Anindya De et al., 2020].
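
As a small illustration of the objective (a sketch with arbitrary made-up numbers, not taken from the paper), the load vector and a couple of monotone symmetric norms of it follow directly from the definition load_σ(i) = ∑_{j: σ(j) = i} p_{ij}; the 𝓁_∞ norm recovers the classical makespan objective.

def load_vector(assignment, p, m):
    """load[i] = sum of p[i][j] over jobs j with assignment[j] == i."""
    load = [0.0] * m
    for j, i in enumerate(assignment):
        load[i] += p[i][j]
    return load

p = [[2.0, 1.0, 4.0],   # p[i][j]: processing time of job j on machine i
     [3.0, 2.0, 1.0]]
sigma = [0, 1, 1]        # job 0 -> machine 0, jobs 1 and 2 -> machine 1
load = load_vector(sigma, p, m=2)
print(load, max(load), sum(load))   # load vector, l_inf norm (makespan), l_1 norm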

Cite as

Sharat Ibrahimpur and Chaitanya Swamy. Minimum-Norm Load Balancing Is (Almost) as Easy as Minimizing Makespan. In 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 198, pp. 81:1-81:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{ibrahimpur_et_al:LIPIcs.ICALP.2021.81,
  author =	{Ibrahimpur, Sharat and Swamy, Chaitanya},
  title =	{{Minimum-Norm Load Balancing Is (Almost) as Easy as Minimizing Makespan}},
  booktitle =	{48th International Colloquium on Automata, Languages, and Programming (ICALP 2021)},
  pages =	{81:1--81:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-195-5},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{198},
  editor =	{Bansal, Nikhil and Merelli, Emanuela and Worrell, James},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2021.81},
  URN =		{urn:nbn:de:0030-drops-141504},
  doi =		{10.4230/LIPIcs.ICALP.2021.81},
  annote =	{Keywords: Approximation algorithms, Load balancing, Minimum-norm optimization, LP rounding}
}
Document
Polynomial-Time Trace Reconstruction in the Low Deletion Rate Regime

Authors: Xi Chen, Anindya De, Chin Ho Lee, Rocco A. Servedio, and Sandip Sinha

Published in: LIPIcs, Volume 185, 12th Innovations in Theoretical Computer Science Conference (ITCS 2021)


Abstract
In the trace reconstruction problem, an unknown source string x ∈ {0,1}ⁿ is transmitted through a probabilistic deletion channel which independently deletes each bit with some fixed probability δ and concatenates the surviving bits, resulting in a trace of x. The problem is to reconstruct x given access to independent traces. Trace reconstruction of arbitrary (worst-case) strings is a challenging problem, with the current state of the art for poly(n)-time algorithms being the 2004 algorithm of Batu et al. [T. Batu et al., 2004]. This algorithm can reconstruct an arbitrary source string x ∈ {0,1}ⁿ in poly(n) time provided that the deletion rate δ satisfies δ ≤ n^{-(1/2 + ε)} for some ε > 0. In this work we improve on the result of [T. Batu et al., 2004] by giving a poly(n)-time algorithm for trace reconstruction for any deletion rate δ ≤ n^{-(1/3 + ε)}. Our algorithm works by alternating an alignment-based procedure, which we show effectively reconstructs portions of the source string that are not "highly repetitive", with a novel procedure that efficiently determines the length of highly repetitive subwords of the source string.
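
The channel itself is easy to simulate (a minimal sketch of the noise model described above, not of the reconstruction algorithm): each bit of the source string survives independently with probability 1 - δ, and the surviving bits are concatenated to form a trace.

import random

def trace(x, delta):
    """One trace of x through the deletion channel with deletion rate delta."""
    return [b for b in x if random.random() >= delta]

x = [1, 0, 1, 1, 0, 0, 1, 0]
print(trace(x, delta=0.25))   # length and contents vary from draw to draw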

Cite as

Xi Chen, Anindya De, Chin Ho Lee, Rocco A. Servedio, and Sandip Sinha. Polynomial-Time Trace Reconstruction in the Low Deletion Rate Regime. In 12th Innovations in Theoretical Computer Science Conference (ITCS 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 185, pp. 20:1-20:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{chen_et_al:LIPIcs.ITCS.2021.20,
  author =	{Chen, Xi and De, Anindya and Lee, Chin Ho and Servedio, Rocco A. and Sinha, Sandip},
  title =	{{Polynomial-Time Trace Reconstruction in the Low Deletion Rate Regime}},
  booktitle =	{12th Innovations in Theoretical Computer Science Conference (ITCS 2021)},
  pages =	{20:1--20:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-177-1},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{185},
  editor =	{Lee, James R.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2021.20},
  URN =		{urn:nbn:de:0030-drops-135595},
  doi =		{10.4230/LIPIcs.ITCS.2021.20},
  annote =	{Keywords: trace reconstruction}
}
Document
Quantitative Correlation Inequalities via Semigroup Interpolation

Authors: Anindya De, Shivam Nadimpalli, and Rocco A. Servedio

Published in: LIPIcs, Volume 185, 12th Innovations in Theoretical Computer Science Conference (ITCS 2021)


Abstract
Most correlation inequalities for high-dimensional functions in the literature, such as the Fortuin-Kasteleyn-Ginibre inequality and the celebrated Gaussian Correlation Inequality of Royen, are qualitative statements which establish that any two functions of a certain type have non-negative correlation. We give a general approach that can be used to bootstrap many qualitative correlation inequalities for functions over product spaces into quantitative statements. The approach combines a new extremal result about power series, proved using complex analysis, with harmonic analysis of functions over product spaces. We instantiate this general approach in several different concrete settings to obtain a range of new and near-optimal quantitative correlation inequalities, including:
- A quantitative version of Royen’s celebrated Gaussian Correlation Inequality [Royen, 2014]. In [Royen, 2014] Royen confirmed a conjecture, open for 40 years, stating that any two symmetric convex sets must be non-negatively correlated under any centered Gaussian distribution. We give a lower bound on the correlation in terms of the vector of degree-2 Hermite coefficients of the two convex sets, conceptually similar to Talagrand’s quantitative correlation bound for monotone Boolean functions over {0,1}ⁿ [M. Talagrand, 1996]. We show that our quantitative version of Royen’s theorem is within a logarithmic factor of being optimal.
- A quantitative version of the well-known FKG inequality for monotone functions over any finite product probability space. This is a broad generalization of Talagrand’s quantitative correlation bound for functions from {0,1}ⁿ to {0,1} under the uniform distribution [M. Talagrand, 1996]; the only prior generalization of which we are aware is due to Keller [Nathan Keller, 2012; Keller, 2008; Nathan Keller, 2009], which extended [M. Talagrand, 1996] to product distributions over {0,1}ⁿ. In the special case of p-biased distributions over {0,1}ⁿ that was considered by Keller, our new bound essentially saves a factor of p log(1/p) over the quantitative bounds given in [Nathan Keller, 2012; Keller, 2008; Nathan Keller, 2009]. We also give a quantitative version of the FKG inequality for monotone functions over the continuous domain [0,1]ⁿ, answering a question of Keller [Nathan Keller, 2009].
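
In its simplest instance, the qualitative statement being bootstrapped is the FKG/Harris fact that two monotone Boolean functions are non-negatively correlated under the uniform distribution. The exact enumeration below (a background illustration only; it verifies the qualitative inequality, not the paper's quantitative bounds) computes E[fg] - E[f]E[g] for two small monotone functions.

from itertools import product

def correlation(f, g, n):
    """E[fg] - E[f]E[g] under the uniform distribution on {0,1}^n, by enumeration."""
    pts = list(product([0, 1], repeat=n))
    Ef = sum(f(x) for x in pts) / len(pts)
    Eg = sum(g(x) for x in pts) / len(pts)
    Efg = sum(f(x) * g(x) for x in pts) / len(pts)
    return Efg - Ef * Eg

f = lambda x: x[0] | x[1]        # monotone: OR of the first two bits
g = lambda x: int(sum(x) >= 2)   # monotone: majority of all three bits
print(correlation(f, g, 3))      # 0.125 >= 0, as the FKG inequality guarantees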

Cite as

Anindya De, Shivam Nadimpalli, and Rocco A. Servedio. Quantitative Correlation Inequalities via Semigroup Interpolation. In 12th Innovations in Theoretical Computer Science Conference (ITCS 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 185, pp. 69:1-69:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{de_et_al:LIPIcs.ITCS.2021.69,
  author =	{De, Anindya and Nadimpalli, Shivam and Servedio, Rocco A.},
  title =	{{Quantitative Correlation Inequalities via Semigroup Interpolation}},
  booktitle =	{12th Innovations in Theoretical Computer Science Conference (ITCS 2021)},
  pages =	{69:1--69:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-177-1},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{185},
  editor =	{Lee, James R.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2021.69},
  URN =		{urn:nbn:de:0030-drops-136081},
  doi =		{10.4230/LIPIcs.ITCS.2021.69},
  annote =	{Keywords: complex analysis, correlation inequality, FKG inequality, Gaussian correlation inequality, harmonic analysis, Markov semigroups}
}
Document
Track A: Algorithms, Complexity and Games
An Efficient PTAS for Stochastic Load Balancing with Poisson Jobs

Authors: Anindya De, Sanjeev Khanna, Huan Li, and Hesam Nikpey

Published in: LIPIcs, Volume 168, 47th International Colloquium on Automata, Languages, and Programming (ICALP 2020)


Abstract
We give the first polynomial-time approximation scheme (PTAS) for the stochastic load balancing problem when the job sizes follow Poisson distributions. This improves upon the 2-approximation algorithm due to Goel and Indyk (FOCS'99). Moreover, our approximation scheme is an efficient PTAS that has a running time double exponential in 1/ε but nearly linear in n, where n is the number of jobs and ε is the target error. Previously, a PTAS (not efficient) was only known for jobs that obey exponential distributions (Goel and Indyk, FOCS'99). Our algorithm relies on several probabilistic ingredients, including some (seemingly) new results on scaling and the so-called "focusing effect" of the maximum of Poisson random variables, which might be of independent interest.
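
To illustrate the objective being approximated (a Monte Carlo sketch with made-up rates, assuming identical machines; this is not the PTAS): for a fixed assignment, each machine's load is a sum of independent Poisson job sizes, hence itself Poisson with the summed rate, and the quantity of interest is the expected maximum load over machines.

import numpy as np

def expected_makespan(assignment, rates, m, trials=100_000, seed=0):
    """Monte Carlo estimate of E[max_i load_i] when job j has size Poisson(rates[j])."""
    rng = np.random.default_rng(seed)
    machine_rate = np.zeros(m)
    for j, i in enumerate(assignment):   # assignment[j] = machine of job j
        machine_rate[i] += rates[j]      # sum of independent Poissons is Poisson
    loads = rng.poisson(machine_rate, size=(trials, m))
    return loads.max(axis=1).mean()

print(expected_makespan([0, 1, 0, 1], rates=[1.0, 2.0, 0.5, 3.0], m=2))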

Cite as

Anindya De, Sanjeev Khanna, Huan Li, and Hesam Nikpey. An Efficient PTAS for Stochastic Load Balancing with Poisson Jobs. In 47th International Colloquium on Automata, Languages, and Programming (ICALP 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 168, pp. 37:1-37:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{de_et_al:LIPIcs.ICALP.2020.37,
  author =	{De, Anindya and Khanna, Sanjeev and Li, Huan and Nikpey, Hesam},
  title =	{{An Efficient PTAS for Stochastic Load Balancing with Poisson Jobs}},
  booktitle =	{47th International Colloquium on Automata, Languages, and Programming (ICALP 2020)},
  pages =	{37:1--37:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-138-2},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{168},
  editor =	{Czumaj, Artur and Dawar, Anuj and Merelli, Emanuela},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2020.37},
  URN =		{urn:nbn:de:0030-drops-124449},
  doi =		{10.4230/LIPIcs.ICALP.2020.37},
  annote =	{Keywords: Efficient PTAS, Makespan Minimization, Scheduling, Stochastic Load Balancing, Poisson Distribution}
}
Document
Simple and Efficient Pseudorandom Generators from Gaussian Processes

Authors: Eshan Chattopadhyay, Anindya De, and Rocco A. Servedio

Published in: LIPIcs, Volume 137, 34th Computational Complexity Conference (CCC 2019)


Abstract
We show that a very simple pseudorandom generator fools intersections of k linear threshold functions (LTFs) and arbitrary functions of k LTFs over n-dimensional Gaussian space. The two analyses of our PRG (for intersections versus arbitrary functions of LTFs) are quite different from each other and from previous analyses of PRGs for functions of halfspaces. Our analysis for arbitrary functions of LTFs establishes bounds on the Wasserstein distance between Gaussian random vectors with similar covariance matrices, and combines these bounds with a conversion from Wasserstein distance to "union-of-orthants" distance from [Xi Chen et al., 2014]. Our analysis for intersections of LTFs uses extensions of the classical Sudakov-Fernique type inequalities, which give bounds on the difference between the expectations of the maxima of two Gaussian random vectors with similar covariance matrices. For all values of k, our generator has seed length O(log n) + poly(k) for arbitrary functions of k LTFs and O(log n) + poly(log k) for intersections of k LTFs. The best previous result, due to [Gopalan et al., 2010], only gave such PRGs for arbitrary functions of k LTFs when k=O(log log n) and for intersections of k LTFs when k=O((log n)/(log log n)). Thus our PRG achieves an O(log n) seed length for values of k that are exponentially larger than previous work could achieve. By combining our PRG over Gaussian space with an invariance principle for arbitrary functions of LTFs and with a regularity lemma, we obtain a deterministic algorithm that approximately counts satisfying assignments of arbitrary functions of k general LTFs over {0,1}^n in time poly(n) * 2^{poly(k,1/epsilon)} for all values of k. This algorithm has a poly(n) runtime for k = (log n)^c for some absolute constant c>0, while the previous best poly(n)-time algorithms could only handle k = O(log log n). For intersections of LTFs, by combining these tools with a recent PRG due to [R. O'Donnell et al., 2018], we obtain a deterministic algorithm that can approximately count satisfying assignments of intersections of k general LTFs over {0,1}^n in time poly(n) * 2^{poly(log k, 1/epsilon)}. This algorithm has a poly(n) runtime for k = 2^{(log n)^c} for some absolute constant c>0, while the previous best poly(n)-time algorithms for intersections of k LTFs, due to [Gopalan et al., 2010], could only handle k=O((log n)/(log log n)).
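
To make the fooled class concrete (an illustrative sketch with randomly chosen weights and thresholds; this is not the PRG construction itself): an intersection of k LTFs over Gaussian space accepts an input x exactly when every threshold test passes, and a generator epsilon-fools such functions if its acceptance probability always stays within epsilon of the truly Gaussian one.

import numpy as np

rng = np.random.default_rng(1)
n, k = 20, 5
W = rng.standard_normal((k, n))        # weight vectors of the k LTFs (arbitrary example)
theta = rng.standard_normal(k)         # thresholds

def intersection_of_ltfs(X):
    """Accept x iff w_i . x >= theta_i for every i (intersection of k halfspaces)."""
    return (X @ W.T >= theta).all(axis=1)

X = rng.standard_normal((200_000, n))  # truly Gaussian inputs
print(intersection_of_ltfs(X).mean())  # a PRG must match this acceptance probability
                                       # to within epsilon using only a short seed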

Cite as

Eshan Chattopadhyay, Anindya De, and Rocco A. Servedio. Simple and Efficient Pseudorandom Generators from Gaussian Processes. In 34th Computational Complexity Conference (CCC 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 137, pp. 4:1-4:33, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{chattopadhyay_et_al:LIPIcs.CCC.2019.4,
  author =	{Chattopadhyay, Eshan and De, Anindya and Servedio, Rocco A.},
  title =	{{Simple and Efficient Pseudorandom Generators from Gaussian Processes}},
  booktitle =	{34th Computational Complexity Conference (CCC 2019)},
  pages =	{4:1--4:33},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-116-0},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{137},
  editor =	{Shpilka, Amir},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2019.4},
  URN =		{urn:nbn:de:0030-drops-108262},
  doi =		{10.4230/LIPIcs.CCC.2019.4},
  annote =	{Keywords: Polynomial threshold functions, Gaussian processes, Johnson-Lindenstrauss, pseudorandom generators}
}
Document
Density Estimation for Shift-Invariant Multidimensional Distributions

Authors: Anindya De, Philip M. Long, and Rocco A. Servedio

Published in: LIPIcs, Volume 124, 10th Innovations in Theoretical Computer Science Conference (ITCS 2019)


Abstract
We study density estimation for classes of shift-invariant distributions over R^d. A multidimensional distribution is "shift-invariant" if, roughly speaking, it is close in total variation distance to a small shift of it in any direction. Shift-invariance relaxes smoothness assumptions commonly used in non-parametric density estimation to allow jump discontinuities. The different classes of distributions that we consider correspond to different rates of tail decay. For each such class we give an efficient algorithm that learns any distribution in the class from independent samples with respect to total variation distance. As a special case of our general result, we show that d-dimensional shift-invariant distributions which satisfy an exponential tail bound can be learned to total variation distance error epsilon using O~_d(1/epsilon^{d+2}) examples and O~_d(1/epsilon^{2d+2}) time. This implies that, for constant d, multivariate log-concave distributions can be learned in O~_d(1/epsilon^{2d+2}) time using O~_d(1/epsilon^{d+2}) samples, answering a question of [Diakonikolas et al., 2016]. All of our results extend to a model of noise-tolerant density estimation using Huber's contamination model, in which the target distribution to be learned is a (1-epsilon,epsilon) mixture of some unknown distribution in the class with some other arbitrary and unknown distribution, and the learning algorithm must output a hypothesis distribution with total variation distance error O(epsilon) from the target distribution. We show that our general results are close to best possible by proving a simple Omega(1/epsilon^d) information-theoretic lower bound on sample complexity even for learning bounded distributions that are shift-invariant.
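
The contamination model referred to above is simple to state operationally (a sketch with placeholder target and noise distributions; the function names are illustrative only): each sample comes from the target distribution with probability 1-epsilon and from an arbitrary, unknown distribution otherwise.

import numpy as np

def huber_contaminated_samples(target_sampler, noise_sampler, eps, n_samples, seed=0):
    """Draw n_samples from the (1-eps, eps) mixture of target and arbitrary noise."""
    rng = np.random.default_rng(seed)
    from_noise = rng.random(n_samples) < eps
    return np.where(from_noise[:, None],
                    noise_sampler(rng, n_samples),
                    target_sampler(rng, n_samples))

target = lambda rng, n: rng.standard_normal((n, 2))       # placeholder "clean" distribution
noise = lambda rng, n: rng.uniform(-10, 10, size=(n, 2))  # placeholder contamination
print(huber_contaminated_samples(target, noise, eps=0.1, n_samples=5))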

Cite as

Anindya De, Philip M. Long, and Rocco A. Servedio. Density Estimation for Shift-Invariant Multidimensional Distributions. In 10th Innovations in Theoretical Computer Science Conference (ITCS 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 124, pp. 28:1-28:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{de_et_al:LIPIcs.ITCS.2019.28,
  author =	{De, Anindya and Long, Philip M. and Servedio, Rocco A.},
  title =	{{Density Estimation for Shift-Invariant Multidimensional Distributions}},
  booktitle =	{10th Innovations in Theoretical Computer Science Conference (ITCS 2019)},
  pages =	{28:1--28:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-095-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{124},
  editor =	{Blum, Avrim},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2019.28},
  URN =		{urn:nbn:de:0030-drops-101214},
  doi =		{10.4230/LIPIcs.ITCS.2019.28},
  annote =	{Keywords: Density estimation, unsupervised learning, log-concave distributions, non-parametrics}
}
Document
Noise Stability Is Computable and Approximately Low-Dimensional

Authors: Anindya De, Elchanan Mossel, and Joe Neeman

Published in: LIPIcs, Volume 79, 32nd Computational Complexity Conference (CCC 2017)


Abstract
Questions of noise stability play an important role in hardness of approximation in computer science as well as in the theory of voting. In many applications, the goal is to find an optimizer of noise stability among all possible partitions of R^n, for n >= 1, into k parts with given Gaussian measures mu_1, ..., mu_k. We call a partition epsilon-optimal if its noise stability is optimal up to an additive epsilon. In this paper, we give an explicit, computable function n(epsilon) such that an epsilon-optimal partition exists in R^{n(epsilon)}. This result has implications for the computability of certain problems in non-interactive simulation, which are addressed in a subsequent work.
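
For context, the notion of noise stability being optimized over partitions is standardly defined through rho-correlated Gaussian vectors (stated here as general background with the usual parameterization, not quoted from the paper): for a partition A_1, ..., A_k of R^n and a correlation parameter rho in (0,1),

% Standard background definition of the noise stability of a partition.
\[
  \mathrm{Stab}_\rho(A_1,\dots,A_k) \;=\; \sum_{i=1}^{k} \Pr\bigl[X \in A_i \ \text{and}\ Y \in A_i\bigr],
  \qquad X, Y \sim N(0, I_n) \ \text{jointly Gaussian with}\ \mathbb{E}[X Y^{\top}] = \rho\, I_n.
\]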

Cite as

Anindya De, Elchanan Mossel, and Joe Neeman. Noise Stability Is Computable and Approximately Low-Dimensional. In 32nd Computational Complexity Conference (CCC 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 79, pp. 10:1-10:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{de_et_al:LIPIcs.CCC.2017.10,
  author =	{De, Anindya and Mossel, Elchanan and Neeman, Joe},
  title =	{{Noise Stability Is Computable and Approximately Low-Dimensional}},
  booktitle =	{32nd Computational Complexity Conference (CCC 2017)},
  pages =	{10:1--10:11},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-040-8},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{79},
  editor =	{O'Donnell, Ryan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2017.10},
  URN =		{urn:nbn:de:0030-drops-75390},
  doi =		{10.4230/LIPIcs.CCC.2017.10},
  annote =	{Keywords: Gaussian noise stability; Plurality is stablest; Ornstein Uhlenbeck operator}
}
Document
Concurrent Data Structures Linked in Time (Artifact)

Authors: Germán Andrés Delbianco, Ilya Sergey, Aleksandar Nanevski, and Anindya Banerjee

Published in: DARTS, Volume 3, Issue 2, Special Issue of the 31st European Conference on Object-Oriented Programming (ECOOP 2017)


Abstract
This artifact provides the full mechanization in FCSL of the developments in the companion paper, "Concurrent Data Structures Linked in Time". In the latter, we propose a new method, based on a separation-style logic, for reasoning about concurrent objects with non-fixed linearization points. We embrace the dynamic nature of linearization points, and encode it as part of the data structure's auxiliary state, so that it can be dynamically modified in place by auxiliary code, as needed when some appropriate run-time event occurs. We illustrate the method by verifying (mechanically in FCSL) an intricate optimal snapshot algorithm due to Jayanti, as well as some clients. FCSL is the first completely formalized framework for mechanized verification of full functional correctness of fine-grained concurrent programs. It is implemented as an embedded domain-specific language (DSL) in the dependently-typed language of the Coq proof assistant, and is powerful enough to reason about programming features such as higher-order functions and local thread spawning. By incorporating a uniform concurrency model, based on state-transition systems and partial commutative monoids, FCSL makes it possible to build proofs about concurrent libraries in a thread-local, compositional way, thus facilitating scalability and reuse: libraries are verified just once, and their specifications are used ubiquitously in client-side reasoning.

Cite as

Germán Andrés Delbianco, Ilya Sergey, Aleksandar Nanevski, and Anindya Banerjee. Concurrent Data Structures Linked in Time (Artifact). In Special Issue of the 31st European Conference on Object-Oriented Programming (ECOOP 2017). Dagstuhl Artifacts Series (DARTS), Volume 3, Issue 2, pp. 4:1-4:4, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@Article{delbianco_et_al:DARTS.3.2.4,
  author =	{Delbianco, Germ\'{a}n Andr\'{e}s and Sergey, Ilya and Nanevski, Aleksandar and Banerjee, Anindya},
  title =	{{Concurrent Data Structures Linked in Time (Artifact)}},
  pages =	{4:1--4:4},
  journal =	{Dagstuhl Artifacts Series},
  ISSN =	{2509-8195},
  year =	{2017},
  volume =	{3},
  number =	{2},
  editor =	{Delbianco, Germ\'{a}n Andr\'{e}s and Sergey, Ilya and Nanevski, Aleksandar and Banerjee, Anindya},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DARTS.3.2.4},
  URN =		{urn:nbn:de:0030-drops-72850},
  doi =		{10.4230/DARTS.3.2.4},
  annote =	{Keywords: separation logic, linearization Points, concurrent snapshots, FCSL}
}
  • Refine by Author
  • 8 De, Anindya
  • 6 Servedio, Rocco A.
  • 3 Nadimpalli, Shivam
  • 2 Chen, Xi
  • 1 Banerjee, Anindya

  • Refine by Classification
  • 2 Mathematics of computing
  • 2 Mathematics of computing → Probability and statistics
  • 2 Theory of computation → Approximation algorithms analysis
  • 2 Theory of computation → Computational complexity and cryptography
  • 1 Mathematics of computing → Combinatorics

  • Refine by Keyword
  • 2 pseudorandom generators
  • 1 AC⁰
  • 1 Algorithms
  • 1 Approximation algorithms
  • 1 Complexity Theory

  • Refine by Type
  • 12 document

  • Refine by Publication Year
  • 4 2021
  • 2 2017
  • 2 2019
  • 2 2022
  • 1 2020
