13 Search Results for "Bhangale, Amey"


Document
Mixing of 3-Term Progressions in Quasirandom Groups

Authors: Amey Bhangale, Prahladh Harsha, and Sourya Roy

Published in: LIPIcs, Volume 215, 13th Innovations in Theoretical Computer Science Conference (ITCS 2022)


Abstract
In this paper, we show the mixing of three-term progressions (x, xg, xg²) in every finite quasirandom group, fully answering a question of Gowers. More precisely, we show that for any D-quasirandom group G and any three sets A₁, A₂, A₃ ⊂ G, we have |Pr_{x,y ∼ G}[x ∈ A₁, xy ∈ A₂, xy² ∈ A₃] - ∏_{i=1}^{3} Pr_{x ∼ G}[x ∈ A_i]| ≤ (2/√D)^{1/4}. Prior to this, Tao answered this question when the underlying quasirandom group is SL_{d}(𝔽_q). Subsequently, Peluse extended the result to all non-abelian finite simple groups. In this work, we show that a slight modification of Peluse’s argument is sufficient to fully resolve Gowers' quasirandom conjecture for 3-term progressions. Surprisingly, unlike the proofs of Tao and Peluse, our proof is elementary and only uses basic facts from non-abelian Fourier analysis.
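
As an illustration of the quantity being bounded (this is not material from the paper), here is a minimal Monte Carlo sketch in Python that compares Pr_{x,y}[x ∈ A₁, xy ∈ A₂, xy² ∈ A₃] with the product of the densities for random dense sets in a small non-abelian group. The choice of S₅ and of the random sets is purely illustrative; S₅ is only mildly quasirandom, so no quantitative match with the (2/√D)^{1/4} bound should be expected.

# Illustrative Monte Carlo sketch (not from the paper): estimate
#   Pr_{x,y}[x in A1, x*y in A2, x*y^2 in A3]  vs.  prod_i Pr[x in A_i]
# for the small non-abelian group S_5.
import itertools, random

def compose(p, q):
    # Composition of permutations given as tuples: (p*q)(i) = p(q(i)).
    return tuple(p[q[i]] for i in range(len(q)))

G = list(itertools.permutations(range(5)))                   # the group S_5
A = [set(random.sample(G, len(G) // 3)) for _ in range(3)]   # three random dense sets

samples, hits = 200_000, 0
for _ in range(samples):
    x, y = random.choice(G), random.choice(G)
    xy = compose(x, y)
    if x in A[0] and xy in A[1] and compose(xy, y) in A[2]:
        hits += 1

product_of_densities = 1.0
for Ai in A:
    product_of_densities *= len(Ai) / len(G)
print(f"Pr[3-AP event] ~ {hits / samples:.4f}, product of densities = {product_of_densities:.4f}")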

Cite as

Amey Bhangale, Prahladh Harsha, and Sourya Roy. Mixing of 3-Term Progressions in Quasirandom Groups. In 13th Innovations in Theoretical Computer Science Conference (ITCS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 215, pp. 20:1-20:9, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{bhangale_et_al:LIPIcs.ITCS.2022.20,
  author =	{Bhangale, Amey and Harsha, Prahladh and Roy, Sourya},
  title =	{{Mixing of 3-Term Progressions in Quasirandom Groups}},
  booktitle =	{13th Innovations in Theoretical Computer Science Conference (ITCS 2022)},
  pages =	{20:1--20:9},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-217-4},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{215},
  editor =	{Braverman, Mark},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2022.20},
  URN =		{urn:nbn:de:0030-drops-156163},
  doi =		{10.4230/LIPIcs.ITCS.2022.20},
  annote =	{Keywords: Quasirandom groups, 3-term arithmetic progressions}
}
Document
Max-3-Lin over Non-Abelian Groups with Universal Factor Graphs

Authors: Amey Bhangale and Aleksa Stanković

Published in: LIPIcs, Volume 215, 13th Innovations in Theoretical Computer Science Conference (ITCS 2022)


Abstract
The factor graph of an instance of a constraint satisfaction problem (CSP) with n variables and m constraints is the bipartite graph between [m] and [n] that records which variables appear in which constraints. Thus, an instance of a CSP is completely defined by its factor graph and the list of predicates. We show inapproximability of Max-3-LIN over non-abelian groups (both in the perfect-completeness case and in the imperfect-completeness case), with the same inapproximability factor as in the general case, even when the factor graph is fixed. Along the way, we also show that these optimal hardness results hold even when we restrict the linear equations in the Max-3-LIN instances to the form x⋅y⋅z = g, where x, y, z are the variables and g is a group element. We use representation theory and Fourier analysis over non-abelian groups to analyze the reductions.
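
For concreteness (an illustrative sketch, not material from the paper), the following minimal Python snippet separates a toy Max-3-LIN instance over S₃ into its factor graph, which only records which variables each equation touches, and the right-hand-side group elements; the concrete equations here are made up.

# Illustrative sketch: the factor graph of a toy Max-3-LIN instance over S_3
# records only which variables each equation x_i * x_j * x_k = g touches;
# the group elements g are kept separately as the predicate data.
from itertools import permutations

S3 = list(permutations(range(3)))        # the group S_3; S3[0] is the identity

def mul(p, q):
    return tuple(p[q[i]] for i in range(len(q)))

# Toy equations of the form x_i * x_j * x_k = g, stored as ((i, j, k), g).
equations = [((0, 1, 2), S3[1]),
             ((1, 2, 3), S3[4]),
             ((0, 2, 3), S3[0])]

factor_graph = [vars_ for (vars_, _) in equations]   # constraint -> variables
rhs          = [g for (_, g) in equations]           # right-hand sides

def satisfied(assignment, t):
    i, j, k = factor_graph[t]
    return mul(mul(assignment[i], assignment[j]), assignment[k]) == rhs[t]

assignment = {v: S3[0] for v in range(4)}            # all-identity assignment
print(factor_graph)
print([satisfied(assignment, t) for t in range(len(equations))])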

Cite as

Amey Bhangale and Aleksa Stanković. Max-3-Lin over Non-Abelian Groups with Universal Factor Graphs. In 13th Innovations in Theoretical Computer Science Conference (ITCS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 215, pp. 21:1-21:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{bhangale_et_al:LIPIcs.ITCS.2022.21,
  author =	{Bhangale, Amey and Stankovi\'{c}, Aleksa},
  title =	{{Max-3-Lin over Non-Abelian Groups with Universal Factor Graphs}},
  booktitle =	{13th Innovations in Theoretical Computer Science Conference (ITCS 2022)},
  pages =	{21:1--21:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-217-4},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{215},
  editor =	{Braverman, Mark},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2022.21},
  URN =		{urn:nbn:de:0030-drops-156177},
  doi =		{10.4230/LIPIcs.ITCS.2022.21},
  annote =	{Keywords: Universal factor graphs, linear equations, non-abelian groups, hardness of approximation}
}
Document
APPROX
Hardness of Approximation of (Multi-)LCS over Small Alphabet

Authors: Amey Bhangale, Diptarka Chakraborty, and Rajendra Kumar

Published in: LIPIcs, Volume 176, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2020)


Abstract
The problem of finding a longest common subsequence (LCS) is one of the fundamental problems in computer science, with applications in fields such as computational biology, text processing, information retrieval, data compression, etc. It is well known that the (decision version of the) problem of finding the length of an LCS of an arbitrary number of input sequences (which we refer to as the Multi-LCS problem) is NP-complete. Jiang and Li [SICOMP'95] showed that if Max-Clique is hard to approximate within a factor of s, then Multi-LCS is also hard to approximate within a factor of Θ(s). By the NP-hardness of approximating Max-Clique due to Zuckerman [ToC'07], for any constant δ > 0, the length of an LCS of an arbitrary number of input sequences of length n each cannot be approximated within an n^{1-δ} factor in polynomial time unless P = NP. However, the reduction of Jiang and Li assumes the alphabet size to be Ω(n). So far, no hardness result is known for the problem of approximating Multi-LCS over a sub-linear-sized alphabet. On the other hand, it is easy to get a 1/|Σ|-factor approximation for strings over an alphabet Σ. In this paper, we make significant progress towards proving hardness of approximation over small alphabets by showing a polynomial-time reduction from the well-studied densest k-subgraph problem with perfect completeness to approximating Multi-LCS over an alphabet of size poly(n/k). As a consequence, from the known hardness result for the densest k-subgraph problem (e.g. [Manurangsi, STOC'17]), we get that no polynomial-time algorithm can give an n^{-o(1)}-factor approximation of Multi-LCS over an alphabet of size n^{o(1)}, unless the Exponential Time Hypothesis is false.
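
As a small illustration of the easy direction mentioned above (a sketch of the folklore argument, not the paper's construction): in any common subsequence of length L some symbol occurs at least L/|Σ| times, so the best single symbol, repeated as often as it occurs in every input string, is a common subsequence of length at least LCS/|Σ|.

# Illustrative sketch of the 1/|Sigma|-factor approximation for Multi-LCS:
# for each symbol c, the string c^m with m = min over inputs of count(c) is a
# common subsequence; the best such c has length >= LCS / |Sigma|.
def multi_lcs_single_symbol(strings):
    alphabet = set().union(*map(set, strings))
    best_len, best_sym = 0, None
    for c in alphabet:
        m = min(s.count(c) for s in strings)
        if m > best_len:
            best_len, best_sym = m, c
    return best_sym * best_len if best_sym else ""

print(multi_lcs_single_symbol(["abcabc", "cabcab", "bcabca"]))  # some symbol repeated twice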

Cite as

Amey Bhangale, Diptarka Chakraborty, and Rajendra Kumar. Hardness of Approximation of (Multi-)LCS over Small Alphabet. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 176, pp. 38:1-38:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{bhangale_et_al:LIPIcs.APPROX/RANDOM.2020.38,
  author =	{Bhangale, Amey and Chakraborty, Diptarka and Kumar, Rajendra},
  title =	{{Hardness of Approximation of (Multi-)LCS over Small Alphabet}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2020)},
  pages =	{38:1--38:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-164-1},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{176},
  editor =	{Byrka, Jaros{\l}aw and Meka, Raghu},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX/RANDOM.2020.38},
  URN =		{urn:nbn:de:0030-drops-126418},
  doi =		{10.4230/LIPIcs.APPROX/RANDOM.2020.38},
  annote =	{Keywords: Longest common subsequence, Hardness of approximation, ETH-hardness, Densest k-subgraph problem}
}
Document
Simultaneous Max-Cut Is Harder to Approximate Than Max-Cut

Authors: Amey Bhangale and Subhash Khot

Published in: LIPIcs, Volume 169, 35th Computational Complexity Conference (CCC 2020)


Abstract
A systematic study of simultaneous optimization of constraint satisfaction problems was initiated by Bhangale et al. [ICALP, 2015]. The simplest such problem is simultaneous Max-Cut. Bhangale et al. [SODA, 2018] gave a 0.878-minimum-approximation algorithm for simultaneous Max-Cut, which is almost optimal assuming the Unique Games Conjecture (UGC). For single-instance Max-Cut, Goemans-Williamson [JACM, 1995] gave an α_GW-approximation algorithm, where α_GW ≈ 0.87856720..., which is optimal assuming the UGC. It was left open whether one can achieve an α_GW-minimum-approximation algorithm for simultaneous Max-Cut. We answer the question by showing that there exists an absolute constant ε₀ ≥ 10^{-5} such that it is NP-hard to get an (α_GW - ε₀)-minimum approximation for simultaneous Max-Cut assuming the Unique Games Conjecture.
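
To make the simultaneous objective concrete (an illustrative sketch, not from the paper): a single cut is evaluated on both instances and one maximizes the minimum of the two cut values, which is the quantity the "minimum approximation" guarantees above refer to. The brute-force search below is exponential and only meant for toy instances.

# Illustrative brute-force sketch of simultaneous Max-Cut on two toy graphs
# sharing the vertex set {0,...,n-1}: find one cut maximizing the minimum of
# the two cut values.
from itertools import product

def cut_value(edges, side):
    return sum(1 for u, v in edges if side[u] != side[v])

n = 6
G1 = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0)]   # a 6-cycle
G2 = [(0, 2), (2, 4), (4, 0), (1, 3), (3, 5), (5, 1)]   # two triangles

best = max(product([0, 1], repeat=n),
           key=lambda side: min(cut_value(G1, side), cut_value(G2, side)))
print(best, cut_value(G1, best), cut_value(G2, best))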

Cite as

Amey Bhangale and Subhash Khot. Simultaneous Max-Cut Is Harder to Approximate Than Max-Cut. In 35th Computational Complexity Conference (CCC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 169, pp. 9:1-9:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{bhangale_et_al:LIPIcs.CCC.2020.9,
  author =	{Bhangale, Amey and Khot, Subhash},
  title =	{{Simultaneous Max-Cut Is Harder to Approximate Than Max-Cut}},
  booktitle =	{35th Computational Complexity Conference (CCC 2020)},
  pages =	{9:1--9:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-156-6},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{169},
  editor =	{Saraf, Shubhangi},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2020.9},
  URN =		{urn:nbn:de:0030-drops-125610},
  doi =		{10.4230/LIPIcs.CCC.2020.9},
  annote =	{Keywords: Simultaneous CSPs, Unique Games hardness, Max-Cut}
}
Document
Smooth and Strong PCPs

Authors: Orr Paradise

Published in: LIPIcs, Volume 151, 11th Innovations in Theoretical Computer Science Conference (ITCS 2020)


Abstract
Probabilistically checkable proofs (PCPs) can be verified using only a constant number of random queries, such that any correct claim has a proof that is always accepted, and incorrect claims are rejected with high probability (regardless of the given alleged proof). We consider two possible features of PCPs: - A PCP is strong if it rejects an alleged proof of a correct claim with probability proportional to its distance from some correct proof of that claim. - A PCP is smooth if each location in a proof is queried with equal probability. We prove that all sets in NP have PCPs that are both smooth and strong, are of polynomial length, and can be verified based on a constant number of queries. This is achieved by following the proof of the PCP theorem of Arora, Lund, Motwani, Sudan and Szegedy (JACM, 1998), providing a stronger analysis of the Hadamard- and Reed-Muller-based PCPs and a refined PCP composition theorem. In fact, we show that any set in NP has a smooth strong canonical PCP of Proximity (PCPP), meaning that there is an efficiently computable bijection of NP witnesses to correct proofs. This improves on the recent construction of Dinur, Gur and Goldreich (ITCS, 2019) of PCPPs that are strong canonical but inherently non-smooth. Our result implies the hardness of approximating the satisfiability of "stable" 3CNF formulae with bounded variable occurrence, where stable means that the number of clauses violated by an assignment is proportional to its distance from a satisfying assignment (in the relative Hamming metric). This proves a hypothesis used in the work of Friggstad, Khodamoradi and Salavatipour (SODA, 2019), suggesting a connection between the hardness of these instances and other stable optimization problems.

Cite as

Orr Paradise. Smooth and Strong PCPs. In 11th Innovations in Theoretical Computer Science Conference (ITCS 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 151, pp. 2:1-2:41, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{paradise:LIPIcs.ITCS.2020.2,
  author =	{Paradise, Orr},
  title =	{{Smooth and Strong PCPs}},
  booktitle =	{11th Innovations in Theoretical Computer Science Conference (ITCS 2020)},
  pages =	{2:1--2:41},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-134-4},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{151},
  editor =	{Vidick, Thomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2020.2},
  URN =		{urn:nbn:de:0030-drops-116875},
  doi =		{10.4230/LIPIcs.ITCS.2020.2},
  annote =	{Keywords: Interactive and probabilistic proof systems, Probabilistically checkable proofs, Hardness of approximation}
}
Document
APPROX
Rainbow Coloring Hardness via Low Sensitivity Polymorphisms

Authors: Venkatesan Guruswami and Sai Sandeep

Published in: LIPIcs, Volume 145, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2019)


Abstract
A k-uniform hypergraph is said to be r-rainbow colorable if there is an r-coloring of its vertices such that every hyperedge intersects all r color classes. Given such a hypergraph as input, finding an r-rainbow coloring of it is NP-hard for all k >= 3 and r >= 2. Therefore, one settles for finding a rainbow coloring with fewer colors (which is an easier task). When r = k (the maximum possible value), i.e., the hypergraph is k-partite, one can efficiently 2-rainbow color the hypergraph, i.e., 2-color its vertices so that there are no monochromatic edges. In this work we consider the next smaller value of r = k-1, and prove that in this case it is NP-hard to rainbow color the hypergraph with q := ⌈(k-2)/2⌉ colors. In particular, for k <= 6, it is NP-hard to 2-color (k-1)-rainbow colorable k-uniform hypergraphs. Our proof follows the algebraic approach to promise constraint satisfaction problems. It proceeds by characterizing the polymorphisms associated with the approximate rainbow coloring problem, which are rainbow colorings of certain product hypergraphs on vertex set [r]^n. We prove that any such polymorphism f: [r]^n -> [q] must be C-fixing, i.e., there is a small subset S of C coordinates and a setting a in [q]^S such that fixing x_{|S} = a determines the value of f(x). The key step in our proof is bounding the sensitivity of certain rainbow colorings, thereby arguing that they must be juntas. Armed with the C-fixing characterization, our NP-hardness result is obtained via a reduction from smooth Label Cover.
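
As a definitional illustration (not from the paper), a minimal checker for r-rainbow colorings of a k-uniform hypergraph: every hyperedge must meet all r color classes. The toy hypergraph below is 3-partite, hence 3-rainbow colorable by construction.

# Illustrative sketch: verify that a vertex coloring is an r-rainbow coloring,
# i.e. every hyperedge intersects all r color classes.
def is_rainbow_coloring(hyperedges, coloring, r):
    return all(len({coloring[v] for v in e}) == r for e in hyperedges)

edges = [(0, 1, 2), (0, 4, 5), (3, 1, 5), (3, 4, 2)]      # toy 3-uniform, 3-partite
coloring = {0: 0, 3: 0, 1: 1, 4: 1, 2: 2, 5: 2}           # parts {0,3}, {1,4}, {2,5}
print(is_rainbow_coloring(edges, coloring, r=3))           # True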

Cite as

Venkatesan Guruswami and Sai Sandeep. Rainbow Coloring Hardness via Low Sensitivity Polymorphisms. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 145, pp. 15:1-15:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{guruswami_et_al:LIPIcs.APPROX-RANDOM.2019.15,
  author =	{Guruswami, Venkatesan and Sandeep, Sai},
  title =	{{Rainbow Coloring Hardness via Low Sensitivity Polymorphisms}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2019)},
  pages =	{15:1--15:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-125-2},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{145},
  editor =	{Achlioptas, Dimitris and V\'{e}gh, L\'{a}szl\'{o} A.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX-RANDOM.2019.15},
  URN =		{urn:nbn:de:0030-drops-112303},
  doi =		{10.4230/LIPIcs.APPROX-RANDOM.2019.15},
  annote =	{Keywords: inapproximability, hardness of approximation, constraint satisfaction, hypergraph coloring, polymorphisms}
}
Document
UG-Hardness to NP-Hardness by Losing Half

Authors: Amey Bhangale and Subhash Khot

Published in: LIPIcs, Volume 137, 34th Computational Complexity Conference (CCC 2019)


Abstract
The 2-to-2 Games Theorem of [Subhash Khot et al., 2017; Dinur et al., 2018; Dinur et al., 2018; Dinur et al., 2018] implies that it is NP-hard to distinguish between Unique Games instances with an assignment satisfying at least a (1/2-epsilon) fraction of the constraints vs. no assignment satisfying more than an epsilon fraction of the constraints, for every constant epsilon > 0. We show that the reduction can be transformed in a non-trivial way to give a stronger guarantee in the completeness case: for at least a (1/2-epsilon) fraction of the vertices on one side, all the constraints associated with them in the Unique Games instance can be satisfied. We use this guarantee to convert known UG-hardness results to NP-hardness. We show: 1) Tight inapproximability for independent sets in degree-d graphs, within a factor of Omega(d/(log^2 d)), where d is a constant. 2) NP-hardness of approximating the Maximum Acyclic Subgraph problem within a factor of 2/3+epsilon, improving the previous ratio of 14/15+epsilon by Austrin et al. [Austrin et al., 2015]. 3) For any predicate P with P^{-1}(1) subseteq [q]^k supporting a balanced pairwise independent distribution, given a P-CSP instance with value at least 1/2-epsilon, it is NP-hard to satisfy more than a (|P^{-1}(1)|/(q^k))+epsilon fraction of the constraints.

Cite as

Amey Bhangale and Subhash Khot. UG-Hardness to NP-Hardness by Losing Half. In 34th Computational Complexity Conference (CCC 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 137, pp. 3:1-3:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{bhangale_et_al:LIPIcs.CCC.2019.3,
  author =	{Bhangale, Amey and Khot, Subhash},
  title =	{{UG-Hardness to NP-Hardness by Losing Half}},
  booktitle =	{34th Computational Complexity Conference (CCC 2019)},
  pages =	{3:1--3:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-116-0},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{137},
  editor =	{Shpilka, Amir},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2019.3},
  URN =		{urn:nbn:de:0030-drops-108258},
  doi =		{10.4230/LIPIcs.CCC.2019.3},
  annote =	{Keywords: NP-hardness, Inapproximability, Unique Games Conjecture}
}
Document
NP-Hardness of Coloring 2-Colorable Hypergraph with Poly-Logarithmically Many Colors

Authors: Amey Bhangale

Published in: LIPIcs, Volume 107, 45th International Colloquium on Automata, Languages, and Programming (ICALP 2018)


Abstract
We give very short and simple proofs of the following statements. Given a 2-colorable 4-uniform hypergraph on n vertices: 1) It is NP-hard to color it with log^delta n colors for some delta > 0. 2) It is quasi-NP-hard to color it with O(log^{1-o(1)} n) colors. In terms of NP-hardness, this improves the result of Guruswami, Håstad and Sudan [SIAM Journal on Computing, 2002], combined with Moshkovitz-Raz [Journal of the ACM, 2010], by an 'exponential' factor. The second result improves the result of Saket [Conference on Computational Complexity (CCC), 2014], which shows quasi-NP-hardness of coloring a 2-colorable 4-uniform hypergraph with O(log^gamma n) colors for a sufficiently small constant gamma > 0. Our result is the first to show the NP-hardness of coloring a c-colorable k-uniform hypergraph with poly-logarithmically many colors, for any constants c >= 2 and k >= 3.

Cite as

Amey Bhangale. NP-Hardness of Coloring 2-Colorable Hypergraph with Poly-Logarithmically Many Colors. In 45th International Colloquium on Automata, Languages, and Programming (ICALP 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 107, pp. 15:1-15:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{bhangale:LIPIcs.ICALP.2018.15,
  author =	{Bhangale, Amey},
  title =	{{NP-Hardness of Coloring 2-Colorable Hypergraph with Poly-Logarithmically Many Colors}},
  booktitle =	{45th International Colloquium on Automata, Languages, and Programming (ICALP 2018)},
  pages =	{15:1--15:11},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-076-7},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{107},
  editor =	{Chatzigiannakis, Ioannis and Kaklamanis, Christos and Marx, D\'{a}niel and Sannella, Donald},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2018.15},
  URN =		{urn:nbn:de:0030-drops-90190},
  doi =		{10.4230/LIPIcs.ICALP.2018.15},
  annote =	{Keywords: Hypergraph coloring, Inapproximability, Schrijver graph}
}
Document
An Improved Dictatorship Test with Perfect Completeness

Authors: Amey Bhangale, Subhash Khot, and Devanathan Thiruvenkatachari

Published in: LIPIcs, Volume 93, 37th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2017)


Abstract
A Boolean function f: {0,1}^n -> {0,1} is called a dictator if it depends on exactly one variable, i.e., f(x_1, x_2, ..., x_n) = x_i for some i in [n]. In this work, we study k-query dictatorship tests. Dictatorship tests are central in proving many hardness results for constraint satisfaction problems. A dictatorship test is said to have perfect completeness if it accepts every dictator function. The soundness of a test is the maximum probability with which it accepts any function far from a dictator. Our main result is a k-query dictatorship test with perfect completeness and soundness (2k + 1)/(2^k), where k is of the form 2^t - 1 for any integer t > 2. This improves upon the result of [Tamaki-Yoshida, Random Structures & Algorithms, 2015], which gave a dictatorship test with soundness (2k + 3)/(2^k).
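
As a small illustration of the notion being tested (this reflects the definition only, not the k-query test from the paper), a brute-force check of whether a Boolean function given by its truth table is a dictator.

# Illustrative sketch: brute-force check whether f: {0,1}^n -> {0,1} is a
# dictator, i.e. f(x) = x_i for some coordinate i; f is a dict over bit-tuples.
from itertools import product

def is_dictator(f, n):
    return any(all(f[x] == x[i] for x in product((0, 1), repeat=n))
               for i in range(n))

n = 3
f = {x: x[1] for x in product((0, 1), repeat=n)}   # dictator of coordinate 1
print(is_dictator(f, n))                            # True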

Cite as

Amey Bhangale, Subhash Khot, and Devanathan Thiruvenkatachari. An Improved Dictatorship Test with Perfect Completeness. In 37th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 93, pp. 15:1-15:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{bhangale_et_al:LIPIcs.FSTTCS.2017.15,
  author =	{Bhangale, Amey and Khot, Subhash and Thiruvenkatachari, Devanathan},
  title =	{{An Improved Dictatorship Test with Perfect Completeness}},
  booktitle =	{37th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2017)},
  pages =	{15:1--15:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-055-2},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{93},
  editor =	{Lokam, Satya and Ramanujam, R.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2017.15},
  URN =		{urn:nbn:de:0030-drops-83854},
  doi =		{10.4230/LIPIcs.FSTTCS.2017.15},
  annote =	{Keywords: Property Testing, Dictatorship Test, Fourier Analysis}
}
Document
Cube vs. Cube Low Degree Test

Authors: Amey Bhangale, Irit Dinur, and Inbal Livni Navon

Published in: LIPIcs, Volume 67, 8th Innovations in Theoretical Computer Science Conference (ITCS 2017)


Abstract
We revisit the Raz-Safra plane-vs.-plane test and study the closely related cube-vs.-cube test. In this test the tester has access to a "cubes table" which assigns a low-degree polynomial to every cube. The tester randomly selects two cubes (affine subspaces of dimension 3) that intersect in a point x in F^m, and checks that the assignments to the two cubes agree with each other at the point x. Our main result is a new combinatorial proof for a low-degree test that comes closer to the soundness limit, as it works for all epsilon >= poly(d)/|F|^{1/2}, where d is the degree. This should be compared to the previous best soundness value of epsilon >= poly(m, d)/|F|^{1/8}. Our soundness bound improves the dependence on the field size and does not depend on the dimension of the ambient space. Our proof is combinatorial and direct: unlike the Raz-Safra proof, it proceeds in one shot and does not require induction on the dimension of the ambient space. The ideas in our proof come from works on direct product testing, which are even simpler in the current setting thanks to the low degree. Along the way we also prove a somewhat surprising fact about the connection between different agreement tests: it does not matter whether the tester chooses the cubes to intersect in points or in lines; for every given table, its success probability in either test is nearly the same.

Cite as

Amey Bhangale, Irit Dinur, and Inbal Livni Navon. Cube vs. Cube Low Degree Test. In 8th Innovations in Theoretical Computer Science Conference (ITCS 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 67, pp. 40:1-40:31, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{bhangale_et_al:LIPIcs.ITCS.2017.40,
  author =	{Bhangale, Amey and Dinur, Irit and Livni Navon, Inbal},
  title =	{{Cube vs. Cube Low Degree Test}},
  booktitle =	{8th Innovations in Theoretical Computer Science Conference (ITCS 2017)},
  pages =	{40:1--40:31},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-029-3},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{67},
  editor =	{Papadimitriou, Christos H.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2017.40},
  URN =		{urn:nbn:de:0030-drops-81748},
  doi =		{10.4230/LIPIcs.ITCS.2017.40},
  annote =	{Keywords: Low Degree Test, Probabilistically Checkable Proofs, Locally Testable Codes}
}
Document
Bicovering: Covering Edges With Two Small Subsets of Vertices

Authors: Amey Bhangale, Rajiv Gandhi, Mohammad Taghi Hajiaghayi, Rohit Khandekar, and Guy Kortsarz

Published in: LIPIcs, Volume 55, 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)


Abstract
We study the following basic problem, called Bi-Covering. Given a graph G(V, E), find two (not necessarily disjoint) sets A ⊆ V and B ⊆ V such that A ∪ B = V and every edge e belongs either to the graph induced by A or to the graph induced by B. The goal is to minimize max{|A|, |B|}. This is the simplest case of the Channel Allocation problem [Gandhi et al., Networks, 2006]. A solution that outputs (V, ∅) gives a ratio of at most 2. We show that, under the Strong Unique Games Conjecture of [Bansal-Khot, FOCS, 2009], there is no (2 - epsilon)-ratio algorithm for the problem, for any constant epsilon > 0. Given a bipartite graph, Max-Bi-Clique is the problem of finding the largest k × k complete bipartite subgraph. For the Max-Bi-Clique problem, a constant-factor hardness was known under the random 3-SAT hypothesis of Feige [Feige, STOC, 2002] and also under the assumption that NP ⊄ ⋂_{epsilon > 0} BPTIME(2^{n^epsilon}) [Khot, SIAM J. on Comp., 2011]. It was an open problem in [Ambühl et al., SIAM J. on Comp., 2011] to prove inapproximability of Max-Bi-Clique assuming a weaker conjecture. Our result implies a similar hardness result assuming the Strong Unique Games Conjecture. On the algorithmic side, we also give better-than-2 approximations for Bi-Covering on numerous special graph classes. In particular, we get a 1.876-approximation for chordal graphs, an exact algorithm for interval graphs, 1 + o(1) for minor-free graphs, 2 - 4δ/3 for graphs with minimum degree δn, 2/(1 + δ²/8) for δ-vertex expanders, 8/5 for split graphs, 2 - (6/5)·(1/d) for graphs with minimum constant degree d, etc. Our algorithmic results are quite non-trivial. In achieving these results, we use various known structural results about these graph classes, combined with techniques that we develop tailored to obtaining better-than-2 approximations.
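
To make the objective concrete (an illustrative brute-force sketch, not an algorithm from the paper): assign each vertex to A only, to B only, or to both, keep only assignments in which every edge lies inside A or inside B, and minimize max(|A|, |B|). This takes 3^n time and is only meant for toy graphs.

# Illustrative brute-force sketch of Bi-Covering on a tiny graph.
from itertools import product

def bicover(n, edges):
    best, best_val = None, n + 1
    for labels in product(("A", "B", "AB"), repeat=n):
        A = {v for v in range(n) if "A" in labels[v]}
        B = {v for v in range(n) if "B" in labels[v]}
        if all((u in A and v in A) or (u in B and v in B) for u, v in edges):
            val = max(len(A), len(B))
            if val < best_val:
                best, best_val = (A, B), val
    return best, best_val

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]   # a 4-cycle plus one chord
print(bicover(4, edges))                            # optimum value 3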

Cite as

Amey Bhangale, Rajiv Gandhi, Mohammad Taghi Hajiaghayi, Rohit Khandekar, and Guy Kortsarz. Bicovering: Covering Edges With Two Small Subsets of Vertices. In 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 55, pp. 6:1-6:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{bhangale_et_al:LIPIcs.ICALP.2016.6,
  author =	{Bhangale, Amey and Gandhi, Rajiv and Hajiaghayi, Mohammad Taghi and Khandekar, Rohit and Kortsarz, Guy},
  title =	{{Bicovering: Covering Edges With Two Small Subsets of Vertices}},
  booktitle =	{43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)},
  pages =	{6:1--6:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-013-2},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{55},
  editor =	{Chatzigiannakis, Ioannis and Mitzenmacher, Michael and Rabani, Yuval and Sangiorgi, Davide},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2016.6},
  URN =		{urn:nbn:de:0030-drops-62728},
  doi =		{10.4230/LIPIcs.ICALP.2016.6},
  annote =	{Keywords: Bi-covering, Unique Games, Max Bi-clique}
}
Document
On Fortification of Projection Games

Authors: Amey Bhangale, Ramprasad Saptharishi, Girish Varma, and Rakesh Venkat

Published in: LIPIcs, Volume 40, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2015)


Abstract
A recent result of Moshkovitz [Moshkovitz14] presented an ingenious method to provide a completely elementary proof of the Parallel Repetition Theorem for certain projection games via a construction called fortification. However, the construction used in [Moshkovitz14] to fortify arbitrary label cover instances using an arbitrary extractor is insufficient to prove parallel repetition. In this paper, we provide a fix by using stronger graphs that we call fortifiers. Fortifiers are graphs that have both l_1 and l_2 guarantees on induced distributions from large subsets. We then show that an expander with sufficient spectral gap, or a bi-regular extractor with stronger parameters (the latter is also the construction used in an independent update [Moshkovitz15] of [Moshkovitz14] with an alternate argument), is a good fortifier. We also show that using a fortifier (in particular, the l_2 guarantee) is necessary for obtaining the robustness required for fortification.

Cite as

Amey Bhangale, Ramprasad Saptharishi, Girish Varma, and Rakesh Venkat. On Fortification of Projection Games. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 40, pp. 497-511, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


BibTeX

@InProceedings{bhangale_et_al:LIPIcs.APPROX-RANDOM.2015.497,
  author =	{Bhangale, Amey and Saptharishi, Ramprasad and Varma, Girish and Venkat, Rakesh},
  title =	{{On Fortification of Projection Games}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2015)},
  pages =	{497--511},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-89-7},
  ISSN =	{1868-8969},
  year =	{2015},
  volume =	{40},
  editor =	{Garg, Naveen and Jansen, Klaus and Rao, Anup and Rolim, Jos\'{e} D. P.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX-RANDOM.2015.497},
  URN =		{urn:nbn:de:0030-drops-53204},
  doi =		{10.4230/LIPIcs.APPROX-RANDOM.2015.497},
  annote =	{Keywords: Parallel Repetition, Fortification}
}
Document
A Characterization of Hard-to-cover CSPs

Authors: Amey Bhangale, Prahladh Harsha, and Girish Varma

Published in: LIPIcs, Volume 33, 30th Conference on Computational Complexity (CCC 2015)


Abstract
We continue the study of the covering complexity of constraint satisfaction problems (CSPs) initiated by Guruswami, Håstad and Sudan [SIAM J. Computing, 31(6):1663--1686, 2002] and Dinur and Kol [in Proc. 28th IEEE Conference on Computational Complexity, 2013]. The covering number of a CSP instance Phi, denoted by nu(Phi), is the smallest number of assignments to the variables of Phi such that each constraint of Phi is satisfied by at least one of the assignments. We show the following results regarding how well efficient algorithms can approximate the covering number of a given CSP instance. 1. Assuming a covering unique games conjecture, introduced by Dinur and Kol, we show that for every non-odd predicate P over any constant-sized alphabet and every integer K, it is NP-hard to distinguish between P-CSP instances (i.e., CSP instances where all the constraints are of type P) which are coverable by a constant number of assignments and those whose covering number is at least K. Previously, Dinur and Kol, using the same covering unique games conjecture, had shown a similar hardness result for every non-odd predicate over the Boolean alphabet that supports a pairwise independent distribution. Our generalization yields a complete characterization of CSPs over a constant-sized alphabet Sigma that are hard to cover, since CSPs over odd predicates are trivially coverable with |Sigma| assignments. 2. For a large class of predicates that are contained in the 2k-LIN predicate, we show that it is quasi-NP-hard to distinguish between instances which have covering number at most two and those with covering number at least Omega(log(log(n))). This generalizes the 4-LIN result of Dinur and Kol, which states that it is quasi-NP-hard to distinguish between 4-LIN-CSP instances which have covering number at most two and those with covering number at least Omega(log(log(log(n)))).
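
As a definitional illustration (not from the paper), a brute-force sketch that computes the covering number nu(Phi) of a tiny Boolean CSP: the least t such that some t assignments jointly satisfy every constraint, each constraint being satisfied by at least one of them.

# Illustrative brute-force sketch of the covering number of a tiny Boolean CSP.
from itertools import combinations, product

def covering_number(n_vars, constraints):
    assignments = list(product((0, 1), repeat=n_vars))
    for t in range(1, len(assignments) + 1):
        for chosen in combinations(assignments, t):
            if all(any(c(a) for a in chosen) for c in constraints):
                return t
    return None   # only reached if some constraint is individually unsatisfiable

constraints = [lambda a: (a[0] ^ a[1]) == 0,    # x0 + x1 = 0 (mod 2)
               lambda a: (a[0] ^ a[1]) == 1]    # x0 + x1 = 1 (mod 2)
print(covering_number(2, constraints))          # 2: no single assignment covers both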

Cite as

Amey Bhangale, Prahladh Harsha, and Girish Varma. A Characterization of Hard-to-cover CSPs. In 30th Conference on Computational Complexity (CCC 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 33, pp. 280-303, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


BibTeX

@InProceedings{bhangale_et_al:LIPIcs.CCC.2015.280,
  author =	{Bhangale, Amey and Harsha, Prahladh and Varma, Girish},
  title =	{{A Characterization of Hard-to-cover CSPs}},
  booktitle =	{30th Conference on Computational Complexity (CCC 2015)},
  pages =	{280--303},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-81-1},
  ISSN =	{1868-8969},
  year =	{2015},
  volume =	{33},
  editor =	{Zuckerman, David},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2015.280},
  URN =		{urn:nbn:de:0030-drops-50574},
  doi =		{10.4230/LIPIcs.CCC.2015.280},
  annote =	{Keywords: CSPs, Covering Problem, Hardness of Approximation, Unique Games, Invariance Principle}
}
