12 Search Results for "Lee, D. T."


Document
A Largish Sum-Of-Squares Implies Circuit Hardness and Derandomization

Authors: Pranjal Dutta, Nitin Saxena, and Thomas Thierauf

Published in: LIPIcs, Volume 185, 12th Innovations in Theoretical Computer Science Conference (ITCS 2021)


Abstract
For a polynomial f, we study the sum of squares representation (SOS), i.e., f = ∑_{i ∈ [s]} c_i f_i², where the c_i are field elements and the f_i are polynomials. The size of the representation is the number of monomials that appear across the f_i. Its minimum is the support-sum S(f) of f. For simplicity of exposition, we consider univariate f. A trivial lower bound for the support-sum of a full-support univariate polynomial f of degree d is S(f) ≥ d^{0.5}. We show that the existence of an explicit polynomial f with support-sum just slightly larger than the trivial bound, that is, S(f) ≥ d^{0.5+ε(d)}, for a sub-constant function ε(d) > ω(√{log log d/log d}), implies that VP ≠ VNP. The latter is a major open problem in algebraic complexity. A further consequence is that blackbox-PIT is in SUBEXP. Note that a random polynomial fulfills the condition, since for such f we have S(f) = Θ(d). We also consider the sum-of-cubes representation (SOC) of polynomials. In a similar way, we show that here an explicit hard polynomial even implies that blackbox-PIT is in P.
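
As an illustration of the two measures just defined, the following sketch (our own, not taken from the paper; it assumes polynomials over the rationals, stored as coefficient lists) builds the trivial representation f = ((f+1)/2)² − ((f−1)/2)², which witnesses S(f) = O(d) for every degree-d polynomial, and counts the monomials across the f_i:

from fractions import Fraction

def add(p, q):
    n = max(len(p), len(q))
    return [(p[i] if i < len(p) else 0) + (q[i] if i < len(q) else 0) for i in range(n)]

def scale(p, c):
    return [c * a for a in p]

def mul(p, q):
    r = [Fraction(0)] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def support(p):
    return {i for i, a in enumerate(p) if a != 0}

def trivial_sos(f):
    """f = 1*f1^2 + (-1)*f2^2 with f1 = (f+1)/2 and f2 = (f-1)/2."""
    f = [Fraction(a) for a in f]
    f1 = scale(add(f, [Fraction(1)]), Fraction(1, 2))
    f2 = scale(add(f, [Fraction(-1)]), Fraction(1, 2))
    return [Fraction(1), Fraction(-1)], [f1, f2]

def trim(p):
    while p and p[-1] == 0:
        p = p[:-1]
    return p

if __name__ == "__main__":
    d = 8
    f = [Fraction(1)] * (d + 1)                      # a full-support polynomial of degree d
    c, parts = trivial_sos(f)
    recon = add(scale(mul(parts[0], parts[0]), c[0]),
                scale(mul(parts[1], parts[1]), c[1]))
    assert trim(recon) == list(f)                    # the identity holds exactly
    print("support-sum of the trivial SOS:", sum(len(support(g)) for g in parts))  # about 2(d+1)

The gap between this O(d) upper bound and the trivial d^{0.5} lower bound is exactly the range in which the hypothesis above lives.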

Cite as

Pranjal Dutta, Nitin Saxena, and Thomas Thierauf. A Largish Sum-Of-Squares Implies Circuit Hardness and Derandomization. In 12th Innovations in Theoretical Computer Science Conference (ITCS 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 185, pp. 23:1-23:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{dutta_et_al:LIPIcs.ITCS.2021.23,
  author =	{Dutta, Pranjal and Saxena, Nitin and Thierauf, Thomas},
  title =	{{A Largish Sum-Of-Squares Implies Circuit Hardness and Derandomization}},
  booktitle =	{12th Innovations in Theoretical Computer Science Conference (ITCS 2021)},
  pages =	{23:1--23:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-177-1},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{185},
  editor =	{Lee, James R.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2021.23},
  URN =		{urn:nbn:de:0030-drops-135629},
  doi =		{10.4230/LIPIcs.ITCS.2021.23},
  annote =	{Keywords: VP, VNP, hitting set, circuit, polynomial, sparsity, SOS, SOC, PIT, lower bound}
}
Document
Extended Abstract
High-Entropy Dual Functions and Locally Decodable Codes (Extended Abstract)

Authors: Jop Briët and Farrokh Labib

Published in: LIPIcs, Volume 185, 12th Innovations in Theoretical Computer Science Conference (ITCS 2021)


Abstract
Locally decodable codes (LDCs) allow any single encoded message symbol to be retrieved from a codeword with good probability by reading only a tiny number of codeword symbols, even if the codeword is partially corrupted. LDCs have surprisingly many applications in computer science and mathematics (we refer to [Yekhanin, 2012; Lovett, 2007] for extensive surveys). But despite their ubiquity, they are poorly understood. Of particular interest is the trade-off between the codeword length N and the message length k when the query complexity - the number of probed codeword symbols - and the alphabet size are constant. The Hadamard code is a 2-query LDC of length N = 2^O(k) and this length is optimal in the 2-query regime [Lovett, 2007]. For q ≥ 3, near-exponential gaps persist between the best-known upper and lower bounds. The family of Reed-Muller codes, which generalize the Hadamard code, were for a long time the best-known examples, giving q-query LDCs of length exp(O(k^{1/(q-1)})), until breakthrough constructions of matching vector LDCs of Yekhanin and Efremenko [Yekhanin, 2008; Efremenko, 2012].

In contrast with other combinatorial objects such as expander graphs, the probabilistic method has so far not been successfully used to beat the best explicit LDC constructions. In [Lovett, 2007], a probabilistic framework was given that could in principle yield best-possible LDCs, albeit non-constructively. A special instance of this framework connects LDCs with a probabilistic version of Szemerédi’s theorem. The setup for this is as follows: for a finite abelian group G of size N = |G|, let D ⊆ G be a random subset where each element is present with probability ρ independently of all others. For k ≥ 3 and ε ∈ (0,1), let E be the event that every subset A ⊆ G of size |A| ≥ ε |G| contains a proper k-term arithmetic progression with common difference in D. For fixed ε > 0 and sufficiently large N, it is an open problem to determine the smallest value of ρ - denoted ρ_k - such that Pr[E] ≥ 1/2. In [Lovett, 2007] it is shown that there exist k-query LDCs of message length Ω(ρ_k N) and codeword length O(N). As such, Szemerédi’s theorem with random differences, in particular lower bounds on ρ_k, can be used to show the existence of LDCs. Conversely, this connection indirectly implies the best-known upper bounds on ρ_k for all k ≥ 3 [Lovett, 2007; Lovett, 2007].

However, a conjecture from [Lovett, 2007] states that over ℤ_N we have ρ_k ≤ O_k(N^{-1}log N) for all k, which would be best-possible. Truth of this conjecture would imply that over this group, Szemerédi’s theorem with random differences cannot give LDCs better than the Hadamard code. For finite fields, Altman [Lovett, 2007] showed that this is false. In particular, over 𝔽_pⁿ for p odd, he proved that ρ₃ ≥ Ω(p^{-n} n²); generally, ρ_k ≥ Ω(p^{-n} n^{k-1}) holds when p ≥ k+1 [Lovett, 2007]. In turn, these bounds are conjectured to be optimal for the finite-field setting, which would imply that over finite fields, Szemerédi’s theorem with random differences cannot give LDCs better than Reed-Muller codes.

The finite-field conjecture is motivated mainly by the possibility that so-called dual functions can be approximated well by polynomial phases, functions of the form e^{2π i P(x)/p} where P is a multivariate polynomial over 𝔽_p. We show that this is false. Using Yekhanin’s matching-vector-code construction, we give dual functions of order k over 𝔽_pⁿ that cannot be approximated in L_∞-distance by polynomial phases of degree k-1. This answers in the negative a natural finite-field analog of a problem of Frantzikinakis over ℕ [Lovett, 2007].
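
To make the 2-query regime above concrete, here is a small self-contained sketch (our own illustration; the corruption rate and the number of decoding trials are arbitrary choices) of the Hadamard code and its standard 2-query local decoder:

import random

def hadamard_encode(x):
    """Message x in {0,1}^k; codeword position a holds the inner product <x,a> mod 2."""
    k = len(x)
    cw = []
    for j in range(1 << k):
        a = [(j >> t) & 1 for t in range(k)]
        cw.append(sum(xi & ai for xi, ai in zip(x, a)) % 2)
    return cw

def decode_bit(cw, k, i, trials):
    """Recover x_i with 2 queries per trial: read positions a and a XOR e_i, then XOR the answers."""
    votes = [cw[a] ^ cw[a ^ (1 << i)] for a in (random.randrange(1 << k) for _ in range(trials))]
    return max(set(votes), key=votes.count)

if __name__ == "__main__":
    k = 10
    x = [random.randrange(2) for _ in range(k)]
    cw = hadamard_encode(x)
    for pos in random.sample(range(len(cw)), len(cw) // 20):   # corrupt about 5% of the codeword
        cw[pos] ^= 1
    print(all(decode_bit(cw, k, i, trials=31) == x[i] for i in range(k)))   # True with high probability

If a δ fraction of positions is corrupted, each trial is correct with probability at least 1 − 2δ, so the majority vote recovers x_i with high probability; the price is the codeword length N = 2^k.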

Cite as

Jop Briët and Farrokh Labib. High-Entropy Dual Functions and Locally Decodable Codes (Extended Abstract). In 12th Innovations in Theoretical Computer Science Conference (ITCS 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 185, pp. 76:1-76:2, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{briet_et_al:LIPIcs.ITCS.2021.76,
  author =	{Bri\"{e}t, Jop and Labib, Farrokh},
  title =	{{High-Entropy Dual Functions and Locally Decodable Codes}},
  booktitle =	{12th Innovations in Theoretical Computer Science Conference (ITCS 2021)},
  pages =	{76:1--76:2},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-177-1},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{185},
  editor =	{Lee, James R.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2021.76},
  URN =		{urn:nbn:de:0030-drops-136157},
  doi =		{10.4230/LIPIcs.ITCS.2021.76},
  annote =	{Keywords: Higher-order Fourier analysis, dual functions, finite fields, additive combinatorics, coding theory}
}
Document
Radon Numbers Grow Linearly

Authors: Dömötör Pálvölgyi

Published in: LIPIcs, Volume 164, 36th International Symposium on Computational Geometry (SoCG 2020)


Abstract
Define the k-th Radon number r_k of a convexity space as the smallest number (if it exists) for which any set of r_k points can be partitioned into k parts whose convex hulls intersect. Combining the recent abstract fractional Helly theorem of Holmsen and Lee with earlier methods of Bukh, we prove that r_k grows linearly, i.e., r_k ≤ c(r₂)⋅ k.
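
The abstract works with general convexity spaces; as a concrete reference point (our own sketch, not from the paper), the classical Euclidean case has r₂ = d+2, witnessed constructively by solving one homogeneous linear system. The sketch below assumes numpy and returns a partition whose two convex hulls intersect, by Radon's argument:

import numpy as np

def radon_partition(points):
    """points: array of shape (d+2, d). Returns index sets (I, J) whose convex hulls intersect."""
    pts = np.asarray(points, dtype=float)
    n, d = pts.shape
    assert n == d + 2
    # Find a nontrivial lambda with  sum_i lambda_i p_i = 0  and  sum_i lambda_i = 0.
    A = np.vstack([pts.T, np.ones(n)])            # a (d+1) x (d+2) system
    lam = np.linalg.svd(A)[2][-1]                 # a null-space vector of A
    I = [i for i in range(n) if lam[i] > 0]
    J = [i for i in range(n) if lam[i] <= 0]
    # With s = sum_{i in I} lam_i, the point sum_{i in I} (lam_i / s) p_i lies in both hulls.
    return I, J

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    print(radon_partition(rng.standard_normal((5, 3))))    # 5 random points in R^3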

Cite as

Dömötör Pálvölgyi. Radon Numbers Grow Linearly. In 36th International Symposium on Computational Geometry (SoCG 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 164, pp. 60:1-60:5, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{palvolgyi:LIPIcs.SoCG.2020.60,
  author =	{P\'{a}lv\"{o}lgyi, D\"{o}m\"{o}t\"{o}r},
  title =	{{Radon Numbers Grow Linearly}},
  booktitle =	{36th International Symposium on Computational Geometry (SoCG 2020)},
  pages =	{60:1--60:5},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-143-6},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{164},
  editor =	{Cabello, Sergio and Chen, Danny Z.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SoCG.2020.60},
  URN =		{urn:nbn:de:0030-drops-122183},
  doi =		{10.4230/LIPIcs.SoCG.2020.60},
  annote =	{Keywords: discrete geometry, convexity space, Radon number}
}
Document
Near-Optimal Pseudorandom Generators for Constant-Depth Read-Once Formulas

Authors: Dean Doron, Pooya Hatami, and William M. Hoza

Published in: LIPIcs, Volume 137, 34th Computational Complexity Conference (CCC 2019)


Abstract
We give an explicit pseudorandom generator (PRG) for read-once AC^0, i.e., constant-depth read-once formulas over the basis {∧, ∨, ¬} with unbounded fan-in. The seed length of our PRG is Õ(log(n/ε)). Previously, PRGs with near-optimal seed length were known only for the depth-2 case [Gopalan et al., 2012]. For a constant depth d > 2, the best prior PRG is a recent construction by Forbes and Kelley with seed length Õ(log² n + log n ⋅ log(1/ε)) for the more general model of constant-width read-once branching programs with arbitrary variable order [Michael A. Forbes and Zander Kelley, 2018]. Looking beyond read-once AC^0, we also show that our PRG fools read-once AC^0[⊕] with seed length Õ(t + log(n/ε)), where t is the number of parity gates in the formula. Our construction follows Ajtai and Wigderson’s approach of iterated pseudorandom restrictions [Ajtai and Wigderson, 1989]. We assume by recursion that we already have a PRG for depth-d AC^0 formulas. To fool depth-(d+1) AC^0 formulas, we use the given PRG, combined with a small-bias distribution and almost k-wise independence, to sample a pseudorandom restriction. The analysis of Forbes and Kelley [Michael A. Forbes and Zander Kelley, 2018] shows that our restriction approximately preserves the expectation of the formula. The crux of our work is showing that after poly(log log n) independent applications of our pseudorandom restriction, the formula simplifies in the sense that every gate other than the output has only polylog n remaining children. Finally, we use a recent PRG by Meka, Reingold, and Tal [Meka et al., 2019] to fool this simpler formula.
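
As a toy illustration of the restriction step described above (our own sketch: it uses genuine randomness in place of the paper's pseudorandom restriction built from a small-bias distribution and almost k-wise independence, and it only treats a monotone depth-2 formula), the code below fixes most variables of a read-once CNF and reports how the surviving bottom gates shrink:

import random

def restrict(clauses, n, p, rng):
    """Each variable stays alive with probability p, otherwise it is fixed to a uniform bit.
    Clauses are lists of positive variable indices (an AND of ORs, read-once).
    Returns the undecided clauses, or None if some clause became the constant 0."""
    alive = {v for v in range(1, n + 1) if rng.random() < p}
    value = {v: rng.randrange(2) for v in range(1, n + 1) if v not in alive}
    remaining = []
    for clause in clauses:
        if any(value.get(v) == 1 for v in clause):
            continue                         # some literal was fixed to 1: the OR gate becomes 1
        kept = [v for v in clause if v in alive]
        if not kept:
            return None                      # all literals fixed to 0: the whole CNF becomes 0
        remaining.append(kept)
    return remaining

if __name__ == "__main__":
    rng = random.Random(0)
    clauses, n = [], 0                       # bottom fan-ins 1..40, three disjoint clauses of each width
    for width in range(1, 41):
        for _ in range(3):
            clauses.append(list(range(n + 1, n + width + 1)))
            n += width
    out = restrict(clauses, n, p=0.3, rng=rng)
    if out is None:
        print("restricted formula is the constant 0")
    elif not out:
        print("restricted formula is the constant 1")
    else:
        print("before:", len(clauses), "clauses, max fan-in", max(map(len, clauses)))
        print("after: ", len(out), "clauses undecided, max remaining fan-in", max(map(len, out)))

In the paper the analogous (but pseudorandom) statement is the crux: after poly(log log n) restrictions, every gate other than the output keeps only polylog n children.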

Cite as

Dean Doron, Pooya Hatami, and William M. Hoza. Near-Optimal Pseudorandom Generators for Constant-Depth Read-Once Formulas. In 34th Computational Complexity Conference (CCC 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 137, pp. 16:1-16:34, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{doron_et_al:LIPIcs.CCC.2019.16,
  author =	{Doron, Dean and Hatami, Pooya and Hoza, William M.},
  title =	{{Near-Optimal Pseudorandom Generators for Constant-Depth Read-Once Formulas}},
  booktitle =	{34th Computational Complexity Conference (CCC 2019)},
  pages =	{16:1--16:34},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-116-0},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{137},
  editor =	{Shpilka, Amir},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2019.16},
  URN =		{urn:nbn:de:0030-drops-108382},
  doi =		{10.4230/LIPIcs.CCC.2019.16},
  annote =	{Keywords: Pseudorandom generators, Constant-depth formulas, Explicit constructions}
}
Document
Relaxed Voronoi: A Simple Framework for Terminal-Clustering Problems

Authors: Arnold Filtser, Robert Krauthgamer, and Ohad Trabelsi

Published in: OASIcs, Volume 69, 2nd Symposium on Simplicity in Algorithms (SOSA 2019)


Abstract
We reprove three known algorithmic bounds for terminal-clustering problems, using a single framework that leads to simpler proofs. In this genre of problems, the input is a metric space (X,d) (possibly arising from a graph) and a subset of terminals K ⊆ X, and the goal is to partition the points of X such that each part, called a cluster, contains exactly one terminal (possibly with connectivity requirements) so as to minimize some objective. The three bounds we reprove are for Steiner Point Removal on trees [Gupta, SODA 2001], for Metric 0-Extension in bounded doubling dimension [Lee and Naor, unpublished 2003], and for Connected Metric 0-Extension [Englert et al., SICOMP 2014]. A natural approach is to cluster each point with its closest terminal, which would partition X into so-called Voronoi cells, but this approach can fail miserably due to its stringent cluster boundaries. A now-standard fix, which we call the Relaxed-Voronoi framework, is to use enlarged Voronoi cells, but to obtain disjoint clusters, the cells are computed greedily according to some order. This method, first proposed by Calinescu, Karloff and Rabani [SICOMP 2004], was employed successfully to provide state-of-the-art results for terminal-clustering problems on general metrics. However, for restricted families of metrics, e.g., trees and doubling metrics, only more complicated, ad-hoc algorithms are known. Our main contribution is to demonstrate that the Relaxed-Voronoi algorithm is applicable to restricted metrics, and actually leads to relatively simple algorithms and analyses.
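
A schematic of the Relaxed-Voronoi framework on a finite metric given as a distance matrix (our own sketch; the enlargement rule, the processing order, and the connectivity requirements are deliberately simplified):

def relaxed_voronoi(dist, terminals, delta=0.25):
    """dist[x][y]: metric on points 0..n-1; terminals: list of point indices.
    Returns cluster[x] = the terminal whose (enlarged, greedily computed) cell contains x."""
    n = len(dist)
    d_to_K = [min(dist[x][t] for t in terminals) for x in range(n)]   # distance to nearest terminal
    cluster = [None] * n
    for t in terminals:                      # greedy order; randomized orders are also common
        for x in range(n):
            if cluster[x] is None and dist[x][t] <= (1 + delta) * d_to_K[x]:
                cluster[x] = t               # the enlarged Voronoi cell of t claims x
    return cluster

if __name__ == "__main__":
    pos = [0, 1, 2, 3, 4, 5]                 # six points on a line, terminals at the two ends
    dist = [[abs(a - b) for b in pos] for a in pos]
    print(relaxed_voronoi(dist, terminals=[0, 5]))   # e.g. [0, 0, 0, 5, 5, 5]

Every point is claimed, since its nearest terminal always qualifies, and no terminal can fall into another terminal's cell, so each cluster contains exactly one terminal; the cluster boundaries, however, are no longer the stringent Voronoi ones.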

Cite as

Arnold Filtser, Robert Krauthgamer, and Ohad Trabelsi. Relaxed Voronoi: A Simple Framework for Terminal-Clustering Problems. In 2nd Symposium on Simplicity in Algorithms (SOSA 2019). Open Access Series in Informatics (OASIcs), Volume 69, pp. 10:1-10:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{filtser_et_al:OASIcs.SOSA.2019.10,
  author =	{Filtser, Arnold and Krauthgamer, Robert and Trabelsi, Ohad},
  title =	{{Relaxed Voronoi: A Simple Framework for Terminal-Clustering Problems}},
  booktitle =	{2nd Symposium on Simplicity in Algorithms (SOSA 2019)},
  pages =	{10:1--10:14},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-099-6},
  ISSN =	{2190-6807},
  year =	{2019},
  volume =	{69},
  editor =	{Fineman, Jeremy T. and Mitzenmacher, Michael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SOSA.2019.10},
  URN =		{urn:nbn:de:0030-drops-100369},
  doi =		{10.4230/OASIcs.SOSA.2019.10},
  annote =	{Keywords: Clustering, Steiner point removal, Zero extension, Doubling dimension, Relaxed voronoi}
}
Document
Colouring (P_r+P_s)-Free Graphs

Authors: Tereza Klimosová, Josef Malík, Tomás Masarík, Jana Novotná, Daniël Paulusma, and Veronika Slívová

Published in: LIPIcs, Volume 123, 29th International Symposium on Algorithms and Computation (ISAAC 2018)


Abstract
The k-Colouring problem is to decide if the vertices of a graph can be coloured with at most k colours for a fixed integer k such that no two adjacent vertices are coloured alike. If each vertex u must be assigned a colour from a prescribed list L(u) ⊆ {1,...,k}, then we obtain the List k-Colouring problem. A graph G is H-free if G does not contain H as an induced subgraph. We continue an extensive study into the complexity of these two problems for H-free graphs. We prove that List 3-Colouring is polynomial-time solvable for (P_2+P_5)-free graphs and for (P_3+P_4)-free graphs. Combining our results with known results yields complete complexity classifications of 3-Colouring and List 3-Colouring on H-free graphs for all graphs H up to seven vertices. We also prove that 5-Colouring is NP-complete for (P_3+P_5)-free graphs.
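
For concreteness, a brute-force reference implementation of List k-Colouring as defined above (our own sketch; it is exponential in the worst case, whereas the paper's point is that the problem becomes polynomial-time solvable on the stated H-free classes for k = 3):

def list_colouring(adj, lists):
    """adj: vertex -> set of neighbours; lists: vertex -> list of allowed colours.
    Returns a proper colouring respecting the lists, or None if none exists."""
    order = sorted(adj, key=lambda u: len(lists[u]))     # most constrained vertices first
    colour = {}

    def backtrack(i):
        if i == len(order):
            return True
        u = order[i]
        for c in lists[u]:
            if all(colour.get(v) != c for v in adj[u]):  # no neighbour already uses c
                colour[u] = c
                if backtrack(i + 1):
                    return True
                del colour[u]
        return False

    return colour if backtrack(0) else None

if __name__ == "__main__":
    adj = {"a": {"b", "d"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c", "a"}}   # a 4-cycle
    lists = {"a": [1], "b": [1, 2], "c": [2, 3], "d": [2, 3]}
    print(list_colouring(adj, lists))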

Cite as

Tereza Klimosová, Josef Malík, Tomás Masarík, Jana Novotná, Daniël Paulusma, and Veronika Slívová. Colouring (P_r+P_s)-Free Graphs. In 29th International Symposium on Algorithms and Computation (ISAAC 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 123, pp. 5:1-5:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{klimosova_et_al:LIPIcs.ISAAC.2018.5,
  author =	{Klimosov\'{a}, Tereza and Mal{\'\i}k, Josef and Masar{\'\i}k, Tom\'{a}s and Novotn\'{a}, Jana and Paulusma, Dani\"{e}l and Sl{\'\i}vov\'{a}, Veronika},
  title =	{{Colouring (P\underline{r}+P\underline{s})-Free Graphs}},
  booktitle =	{29th International Symposium on Algorithms and Computation (ISAAC 2018)},
  pages =	{5:1--5:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-094-1},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{123},
  editor =	{Hsu, Wen-Lian and Lee, Der-Tsai and Liao, Chung-Shou},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2018.5},
  URN =		{urn:nbn:de:0030-drops-99533},
  doi =		{10.4230/LIPIcs.ISAAC.2018.5},
  annote =	{Keywords: vertex colouring, H-free graph, linear forest}
}
Document
New and Improved Algorithms for Unordered Tree Inclusion

Authors: Tatsuya Akutsu, Jesper Jansson, Ruiming Li, Atsuhiro Takasu, and Takeyuki Tamura

Published in: LIPIcs, Volume 123, 29th International Symposium on Algorithms and Computation (ISAAC 2018)


Abstract
The tree inclusion problem is, given two node-labeled trees P and T (the "pattern tree" and the "text tree"), to locate every minimal subtree in T (if any) that can be obtained by applying a sequence of node insertion operations to P. Although the ordered tree inclusion problem is solvable in polynomial time, the unordered tree inclusion problem is NP-hard. The currently fastest algorithm for the latter is from 1995 and runs in O(poly(m,n) ⋅ 2^{2d}) = O^*(2^{2d}) time, where m and n are the sizes of the pattern and text trees, respectively, and d is the maximum outdegree of the pattern tree. Here, we develop a new algorithm that improves the exponent 2d to d by considering a particular type of ancestor-descendant relationship and applying dynamic programming, thus reducing the time complexity to O^*(2^d). We then study restricted variants of the unordered tree inclusion problem where the number of occurrences of different node labels and/or the heights of the input trees are bounded. Suppose that the leaves of P are distinctly labeled and that each label occurs at most c times in T. We show that although the problem remains NP-hard in many such cases, it can be solved in polynomial time for c = 2 and in O^*(1.8^d) time for c = 3. We also present a randomized O^*(1.883^d)-time algorithm for the case that the heights of P and T are one and two, respectively.

Cite as

Tatsuya Akutsu, Jesper Jansson, Ruiming Li, Atsuhiro Takasu, and Takeyuki Tamura. New and Improved Algorithms for Unordered Tree Inclusion. In 29th International Symposium on Algorithms and Computation (ISAAC 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 123, pp. 27:1-27:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{akutsu_et_al:LIPIcs.ISAAC.2018.27,
  author =	{Akutsu, Tatsuya and Jansson, Jesper and Li, Ruiming and Takasu, Atsuhiro and Tamura, Takeyuki},
  title =	{{New and Improved Algorithms for Unordered Tree Inclusion}},
  booktitle =	{29th International Symposium on Algorithms and Computation (ISAAC 2018)},
  pages =	{27:1--27:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-094-1},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{123},
  editor =	{Hsu, Wen-Lian and Lee, Der-Tsai and Liao, Chung-Shou},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2018.27},
  URN =		{urn:nbn:de:0030-drops-99752},
  doi =		{10.4230/LIPIcs.ISAAC.2018.27},
  annote =	{Keywords: parameterized algorithms, tree inclusion, unordered trees, dynamic programming}
}
Document
On Polynomial Time Constructions of Minimum Height Decision Tree

Authors: Nader H. Bshouty and Waseem Makhoul

Published in: LIPIcs, Volume 123, 29th International Symposium on Algorithms and Computation (ISAAC 2018)


Abstract
A decision tree T over B_m := {0,1}^m is a binary tree where each internal node is labeled with an integer in [m] = {1,2,...,m}, each leaf is labeled with an assignment a ∈ B_m, and each internal node has two outgoing edges that are labeled with 0 and 1, respectively. Let A ⊆ B_m. We say that T is a decision tree for A if (1) for every a ∈ A there is one leaf of T that is labeled with a, and (2) for every path from the root to a leaf with internal nodes labeled with i_1,i_2,...,i_k ∈ [m], a leaf labeled with a ∈ A, and edges labeled with ξ_{i_1},...,ξ_{i_k} ∈ {0,1}, a is the only element in A that satisfies a_{i_j} = ξ_{i_j} for all j = 1,...,k. Our goal is to give a polynomial-time (in n := |A| and m) algorithm that for an input A ⊆ B_m outputs a decision tree for A of minimum depth. This problem has many applications that include, to name a few, computer vision, group testing, exact learning from membership queries and game theory. Arkin et al. and Moshkov [Esther M. Arkin et al., 1998; Mikhail Ju. Moshkov, 2004] gave a polynomial-time (ln |A|)-approximation algorithm (for the depth). The result of Dinur and Steurer [Irit Dinur and David Steurer, 2014] for set cover implies that this problem cannot be approximated with ratio (1−o(1))⋅ln |A|, unless P=NP. Moshkov studied in [Mikhail Ju. Moshkov, 2004; Mikhail Ju. Moshkov, 1982; Mikhail Ju. Moshkov, 1982] the combinatorial measure of extended teaching dimension of A, ETD(A). He showed that ETD(A) is a lower bound for the depth of the decision tree for A and then gave an exponential-time (ETD(A)/log(ETD(A)))-approximation algorithm and a polynomial-time 2(ln 2)ETD(A)-approximation algorithm. In this paper we further study the ETD(A) measure and a new combinatorial measure, DEN(A), that we call the density of the set A. We show that DEN(A) ≤ ETD(A)+1. We then give two results. The first result is that the lower bound ETD(A) of Moshkov for the depth of the decision tree for A is greater than the bounds that are obtained by the classical technique used in the literature. The second result is a polynomial-time (ln 2)DEN(A)-approximation (and therefore (ln 2)ETD(A)-approximation) algorithm for the depth of the decision tree of A. We then apply the above results to learning the class of disjunctions of predicates from membership queries [Nader H. Bshouty et al., 2017]. We show that the ETD of this class is bounded from above by the degree d of its Hasse diagram. We then show that Moshkov's algorithm can be run in polynomial time and is a (d/log d)-approximation algorithm. This gives optimal algorithms when the degree is constant, for example, for learning axis-parallel rays over a constant-dimensional space.
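
A simple greedy heuristic (our own sketch, using a balanced-split rule rather than the paper's DEN/ETD-based algorithm) that constructs some decision tree for A in the sense defined above and reports its depth:

def build_tree(A, m):
    """A: list of distinct 0/1 tuples of length m.
    Returns ('leaf', a) or ('node', i, subtree_for_bit_0, subtree_for_bit_1)."""
    if len(A) == 1:
        return ("leaf", A[0])
    best = None
    for i in range(m):
        zeros = [a for a in A if a[i] == 0]
        ones = [a for a in A if a[i] == 1]
        if zeros and ones:                               # coordinate i actually splits A
            worst = max(len(zeros), len(ones))
            if best is None or worst < best[0]:
                best = (worst, i, zeros, ones)
    _, i, zeros, ones = best                             # exists because the elements of A are distinct
    return ("node", i, build_tree(zeros, m), build_tree(ones, m))

def depth(tree):
    return 0 if tree[0] == "leaf" else 1 + max(depth(tree[2]), depth(tree[3]))

if __name__ == "__main__":
    A = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
    print("depth:", depth(build_tree(A, 3)))             # 2 for this A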

Cite as

Nader H. Bshouty and Waseem Makhoul. On Polynomial Time Constructions of Minimum Height Decision Tree. In 29th International Symposium on Algorithms and Computation (ISAAC 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 123, pp. 34:1-34:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{h.bshouty_et_al:LIPIcs.ISAAC.2018.34,
  author =	{H. Bshouty, Nader and Makhoul, Waseem},
  title =	{{On Polynomial Time Constructions of Minimum Height Decision Tree}},
  booktitle =	{29th International Symposium on Algorithms and Computation (ISAAC 2018)},
  pages =	{34:1--34:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-094-1},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{123},
  editor =	{Hsu, Wen-Lian and Lee, Der-Tsai and Liao, Chung-Shou},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2018.34},
  URN =		{urn:nbn:de:0030-drops-99824},
  doi =		{10.4230/LIPIcs.ISAAC.2018.34},
  annote =	{Keywords: Decision Tree, Minimal Depth, Approximation algorithms}
}
Document
Multimedia Exposition
Geometric Realizations of the 3D Associahedron (Multimedia Exposition)

Authors: Satyan L. Devadoss, Daniel D. Johnson, Justin Lee, and Jackson Warley

Published in: LIPIcs, Volume 99, 34th International Symposium on Computational Geometry (SoCG 2018)


Abstract
The associahedron is a convex polytope whose 1-skeleton is isomorphic to the flip graph of a convex polygon. There exists an elegant geometric realization of the associahedron, using the remarkable theory of secondary polytopes, based on the geometry of the underlying polygon. We present an interactive application that visualizes this correspondence in the 3D case.
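
A small check of the stated correspondence in the 3D case (our own sketch, not part of the exposition's application): enumerate the triangulations of a convex hexagon, build their flip graph, and confirm the 14 vertices and 21 edges of the 3-dimensional associahedron:

from itertools import combinations

N = 6                                                    # convex hexagon with vertices 0..5
diagonals = [(a, b) for a, b in combinations(range(N), 2) if (b - a) % N not in (1, N - 1)]

def crossing(d1, d2):
    """Two diagonals of a convex polygon cross iff their endpoints strictly interleave."""
    (a, b), (c, d) = sorted(d1), sorted(d2)
    return a < c < b < d or c < a < d < b

# A triangulation of a convex N-gon is a set of N-3 pairwise non-crossing diagonals.
triangulations = [set(t) for t in combinations(diagonals, N - 3)
                  if all(not crossing(x, y) for x, y in combinations(t, 2))]

# Flip graph: two triangulations are adjacent iff they differ in exactly one diagonal.
flips = [(i, j) for i, j in combinations(range(len(triangulations)), 2)
         if len(triangulations[i] & triangulations[j]) == N - 4]

print(len(triangulations), "triangulations,", len(flips), "flips")   # expect 14 and 21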

Cite as

Satyan L. Devadoss, Daniel D. Johnson, Justin Lee, and Jackson Warley. Geometric Realizations of the 3D Associahedron (Multimedia Exposition). In 34th International Symposium on Computational Geometry (SoCG 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 99, pp. 75:1-75:4, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{devadoss_et_al:LIPIcs.SoCG.2018.75,
  author =	{Devadoss, Satyan L. and Johnson, Daniel D. and Lee, Justin and Warley, Jackson},
  title =	{{Geometric Realizations of the 3D Associahedron}},
  booktitle =	{34th International Symposium on Computational Geometry (SoCG 2018)},
  pages =	{75:1--75:4},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-066-8},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{99},
  editor =	{Speckmann, Bettina and T\'{o}th, Csaba D.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SoCG.2018.75},
  URN =		{urn:nbn:de:0030-drops-87886},
  doi =		{10.4230/LIPIcs.SoCG.2018.75},
  annote =	{Keywords: associahedron, secondary polytope, realization}
}
Document
Tight Approximation for Partial Vertex Cover with Hard Capacities

Authors: Jia-Yau Shiau, Mong-Jen Kao, Ching-Chi Lin, and D. T. Lee

Published in: LIPIcs, Volume 92, 28th International Symposium on Algorithms and Computation (ISAAC 2017)


Abstract
We consider the partial vertex cover problem with hard capacity constraints (Partial VC-HC) on hypergraphs. In this problem we are given a hypergraph G=(V,E) with a maximum edge size f and a covering requirement R. Each edge is associated with a demand, and each vertex is associated with a capacity and an (integral) available multiplicity. The objective is to compute a minimum vertex multiset such that at least R units of demand from the edges are covered by the capacities of the vertices in the multiset and the multiplicity of each vertex does not exceed its available multiplicity. In this paper we present an f-approximation for this problem, improving the previous (2f+2)(1+ε) bound of Cheung et al. to the tight extent possible. The new ingredients of this work are a generalized analysis of the extreme points of the natural LP, developed from previous works, and a strengthened LP lower bound on the optimal solutions.
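
One natural way to write the LP referred to above (our schematic reconstruction from the problem statement; the paper works with a strengthened variant) uses a multiplicity variable x(v) for each vertex, an assignment variable y(e,v) for each vertex v of each edge e, and a covered-demand variable z(e) for each edge, where d(e), c(v), and m(v) denote the demand, the capacity, and the available multiplicity:

\begin{align*}
\min\ & \sum_{v \in V} x(v) \\
\text{s.t.}\ & \sum_{e \in E} z(e) \ \ge\ R, \\
& \sum_{v \in e} y(e,v) \ \ge\ z(e) && \forall e \in E, \\
& \sum_{e \ni v} y(e,v) \ \le\ c(v)\, x(v) && \forall v \in V, \\
& 0 \le z(e) \le d(e), \qquad y(e,v) \ge 0, \qquad 0 \le x(v) \le m(v).
\end{align*}

The "extreme points of the natural LP" mentioned above are the vertices of (a strengthening of) a program of this shape.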

Cite as

Jia-Yau Shiau, Mong-Jen Kao, Ching-Chi Lin, and D. T. Lee. Tight Approximation for Partial Vertex Cover with Hard Capacities. In 28th International Symposium on Algorithms and Computation (ISAAC 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 92, pp. 64:1-64:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{shiau_et_al:LIPIcs.ISAAC.2017.64,
  author =	{Shiau, Jia-Yau and Kao, Mong-Jen and Lin, Ching-Chi and Lee, D. T.},
  title =	{{Tight Approximation for Partial Vertex Cover with Hard Capacities}},
  booktitle =	{28th International Symposium on Algorithms and Computation (ISAAC 2017)},
  pages =	{64:1--64:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-054-5},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{92},
  editor =	{Okamoto, Yoshio and Tokuyama, Takeshi},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2017.64},
  URN =		{urn:nbn:de:0030-drops-82694},
  doi =		{10.4230/LIPIcs.ISAAC.2017.64},
  annote =	{Keywords: Approximation Algorithm, Capacitated Vertex Cover, Hard Capacities}
}
Document
Global and Fixed-Terminal Cuts in Digraphs

Authors: Kristóf Bérczi, Karthekeyan Chandrasekaran, Tamás Király, Euiwoong Lee, and Chao Xu

Published in: LIPIcs, Volume 81, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2017)


Abstract
The computational complexity of multicut-like problems may vary significantly depending on whether the terminals are fixed or not. In this work we present a comprehensive study of this phenomenon in two types of cut problems in directed graphs: double cut and bicut. 1. Fixed-terminal edge-weighted double cut is known to be solvable efficiently. We show that fixed-terminal node-weighted double cut cannot be approximated to a factor smaller than 2 under the Unique Games Conjecture (UGC), and we also give a 2-approximation algorithm. For the global version of the problem, we prove an inapproximability bound of 3/2 under UGC. 2. Fixed-terminal edge-weighted bicut is known to have an approximability factor of 2 that is tight under UGC. We show that the global edge-weighted bicut is approximable to a factor strictly better than 2, and that the global node-weighted bicut cannot be approximated to a factor smaller than 3/2 under UGC. 3. In relation to these investigations, we also prove two results on undirected graphs which are of independent interest. First, we show NP-completeness and a tight inapproximability bound of 4/3 for the node-weighted 3-cut problem under UGC. Second, we show that for constant k, there exists an efficient algorithm to solve the minimum {s,t}-separating k-cut problem. Our techniques for the algorithms are combinatorial, based on LPs and based on the enumeration of approximate min-cuts. Our hardness results are based on combinatorial reductions and integrality gap instances.

Cite as

Kristóf Bérczi, Karthekeyan Chandrasekaran, Tamás Király, Euiwoong Lee, and Chao Xu. Global and Fixed-Terminal Cuts in Digraphs. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 81, pp. 2:1-2:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{berczi_et_al:LIPIcs.APPROX-RANDOM.2017.2,
  author =	{B\'{e}rczi, Krist\'{o}f and Chandrasekaran, Karthekeyan and Kir\'{a}ly, Tam\'{a}s and Lee, Euiwoong and Xu, Chao},
  title =	{{Global and Fixed-Terminal Cuts in Digraphs}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2017)},
  pages =	{2:1--2:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-044-6},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{81},
  editor =	{Jansen, Klaus and Rolim, Jos\'{e} D. P. and Williamson, David P. and Vempala, Santosh S.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX-RANDOM.2017.2},
  URN =		{urn:nbn:de:0030-drops-75511},
  doi =		{10.4230/LIPIcs.APPROX-RANDOM.2017.2},
  annote =	{Keywords: Directed Graphs, Arborescence, Graph Cuts, Hardness of Approximation}
}
Document
O(f) Bi-Approximation for Capacitated Covering with Hard Capacities

Authors: Mong-Jen Kao, Hai-Lun Tu, and D. T. Lee

Published in: LIPIcs, Volume 64, 27th International Symposium on Algorithms and Computation (ISAAC 2016)


Abstract
We consider capacitated vertex cover with hard capacity constraints (VC-HC) on hypergraphs. In this problem we are given a hypergraph G = (V, E) with a maximum edge size f. Each edge is associated with a demand and each vertex is associated with a weight (cost), a capacity, and an available multiplicity. The objective is to find a minimum-weight vertex multiset such that the demands of the edges can be covered by the capacities of the vertices and the multiplicity of each vertex does not exceed its available multiplicity. In this paper we present an O(f) bi-approximation for VC-HC that gives a trade-off between the amount of multiplicity augmentation and the cost of the resulting cover. In particular, we show that, by augmenting the available multiplicity by a factor of k ≥ 2, a cover with a cost ratio of (1 + 1/(k−1))(f−1) to the optimal cover for the original instance can be obtained. This improves over a previous result, which has a cost ratio of f² via augmenting the available multiplicity by a factor of f.
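
To make the trade-off concrete, a few instantiations of the stated bound (simple arithmetic from the formula above):

\[
k = 2:\ \Bigl(1+\tfrac{1}{1}\Bigr)(f-1) = 2(f-1), \qquad
k = f:\ \Bigl(1+\tfrac{1}{f-1}\Bigr)(f-1) = f, \qquad
k \to \infty:\ \text{ratio} \to f-1,
\]

so already with the same f-fold augmentation used by the previous result, the cost ratio drops from f² to f.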

Cite as

Mong-Jen Kao, Hai-Lun Tu, and D. T. Lee. O(f) Bi-Approximation for Capacitated Covering with Hard Capacities. In 27th International Symposium on Algorithms and Computation (ISAAC 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 64, pp. 40:1-40:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{kao_et_al:LIPIcs.ISAAC.2016.40,
  author =	{Kao, Mong-Jen and Tu, Hai-Lun and Lee, D. T.},
  title =	{{O(f) Bi-Approximation for Capacitated Covering with Hard Capacities}},
  booktitle =	{27th International Symposium on Algorithms and Computation (ISAAC 2016)},
  pages =	{40:1--40:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-026-2},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{64},
  editor =	{Hong, Seok-Hee},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2016.40},
  URN =		{urn:nbn:de:0030-drops-68102},
  doi =		{10.4230/LIPIcs.ISAAC.2016.40},
  annote =	{Keywords: Capacitated Covering, Hard Capacities, Bi-criteria Approximation}
}