Search Results

Documents authored by Chaudhury, Bhaskar Ray


Document
On the Existence of Competitive Equilibrium with Chores

Authors: Bhaskar Ray Chaudhury, Jugal Garg, Peter McGlaughlin, and Ruta Mehta

Published in: LIPIcs, Volume 215, 13th Innovations in Theoretical Computer Science Conference (ITCS 2022)


Abstract
We study the chore division problem in the classic Arrow-Debreu exchange setting, where a set of agents wants to divide their divisible chores (bads) to minimize their disutilities (costs). We assume that agents have linear disutility functions. As in the setting with goods, a division based on competitive equilibrium is regarded as one of the best mechanisms for bads. Equilibrium existence for goods has been extensively studied, resulting in a simple, polynomial-time verifiable, necessary and sufficient condition. However, dividing bads has not received a similarly extensive study, even though it is as relevant as dividing goods in day-to-day life. In this paper, we show that the problem of checking whether an equilibrium exists in chore division is NP-complete, which is in sharp contrast to the case of goods. Further, we derive a simple, polynomial-time verifiable, sufficient condition for existence. Our fixed-point formulation to show existence makes novel use of both the Kakutani and Brouwer fixed-point theorems, the latter nested inside the former, to avoid the undefined demand issue specific to bads.
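
For context, here is a minimal formal sketch of what a competitive equilibrium with chores requires in the linear exchange setting described above; the symbols (disutilities d_{ij}, endowments w_{ij}, payments p_j, allocation x) are illustrative, and the paper's exact formulation may differ. Agent i brings w_{ij} units of chore j, is paid p_j per unit of chore j she performs, and must earn at least the payment owed for the chores she brought while incurring minimum disutility; in addition, all chores must be fully assigned.

% Minimal sketch of competitive equilibrium with chores (linear disutilities); notation is illustrative.
\begin{align*}
& x_i \in \arg\min\Big\{ \textstyle\sum_j d_{ij} y_{ij} \;:\; \sum_j p_j y_{ij} \ge \sum_j p_j w_{ij},\ y_{ij} \ge 0 \Big\}
  && \text{(each agent meets her earning requirement at minimum disutility)} \\
& \textstyle\sum_i x_{ij} = \sum_i w_{ij} \quad \text{for every chore } j
  && \text{(every chore is fully assigned)}
\end{align*}

This mirrors the goods setting, with utility maximization replaced by disutility minimization and the budget constraint turned into an earning requirement.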

Cite as

Bhaskar Ray Chaudhury, Jugal Garg, Peter McGlaughlin, and Ruta Mehta. On the Existence of Competitive Equilibrium with Chores. In 13th Innovations in Theoretical Computer Science Conference (ITCS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 215, pp. 41:1-41:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{chaudhury_et_al:LIPIcs.ITCS.2022.41,
  author =	{Chaudhury, Bhaskar Ray and Garg, Jugal and McGlaughlin, Peter and Mehta, Ruta},
  title =	{{On the Existence of Competitive Equilibrium with Chores}},
  booktitle =	{13th Innovations in Theoretical Computer Science Conference (ITCS 2022)},
  pages =	{41:1--41:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-217-4},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{215},
  editor =	{Braverman, Mark},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2022.41},
  URN =		{urn:nbn:de:0030-drops-156378},
  doi =		{10.4230/LIPIcs.ITCS.2022.41},
  annote =	{Keywords: Fair Division, Competitive Equilibrium, Fixed Point Theorems}
}
Document
Polyline Simplification has Cubic Complexity

Authors: Karl Bringmann and Bhaskar Ray Chaudhury

Published in: LIPIcs, Volume 129, 35th International Symposium on Computational Geometry (SoCG 2019)


Abstract
In the classic polyline simplification problem we want to replace a given polygonal curve P, consisting of n vertices, by a subsequence P' of k vertices from P such that the polygonal curves P and P' are "close". Closeness is usually measured using the Hausdorff or Fréchet distance. These distance measures can be applied globally, i.e., to the whole curves P and P', or locally, i.e., to each simplified subcurve and the line segment that replaces it (taking the maximum over all subcurves). We provide an O(n^3) time algorithm for simplification under Global-Fréchet distance, improving the previous best algorithm by a factor of Omega(kn^2). We also provide evidence that in high dimensions cubic time is essentially optimal for all three problems (Local-Hausdorff, Local-Fréchet, and Global-Fréchet). Specifically, improving the cubic time to O(n^{3-epsilon} poly(d)) for polyline simplification over (R^d, L_p) for p = 1 would violate plausible conjectures. We obtain similar results for all p in [1,infty), p != 2. In total, in high dimensions and over general L_p-norms we resolve the complexity of polyline simplification with respect to Local-Hausdorff, Local-Fréchet, and Global-Fréchet by providing new algorithms and conditional lower bounds.
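
To make the local/global distinction above concrete, one standard way to formalize the two error measures is given below; the notation is generic and not taken verbatim from the paper. Write P = (p_1, ..., p_n), let the simplification P' = (p_{i_1}, ..., p_{i_k}) be a subsequence of P with i_1 = 1 and i_k = n, and let d denote the Hausdorff or Fréchet distance.

% Illustrative formalization of the global and local error measures; P[i_j..i_{j+1}] is the
% subcurve of P between two consecutive kept vertices, and the overlined term is the
% line segment that replaces it.
\begin{align*}
\text{global error:}\quad & d\big(P,\, P'\big) \\
\text{local error:}\quad & \max_{1 \le j < k}\ d\big(P[i_j .. i_{j+1}],\ \overline{p_{i_j} p_{i_{j+1}}}\big)
\end{align*}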

Cite as

Karl Bringmann and Bhaskar Ray Chaudhury. Polyline Simplification has Cubic Complexity. In 35th International Symposium on Computational Geometry (SoCG 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 129, pp. 18:1-18:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{bringmann_et_al:LIPIcs.SoCG.2019.18,
  author =	{Bringmann, Karl and Chaudhury, Bhaskar Ray},
  title =	{{Polyline Simplification has Cubic Complexity}},
  booktitle =	{35th International Symposium on Computational Geometry (SoCG 2019)},
  pages =	{18:1--18:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-104-7},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{129},
  editor =	{Barequet, Gill and Wang, Yusu},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SoCG.2019.18},
  URN =		{urn:nbn:de:0030-drops-104224},
  doi =		{10.4230/LIPIcs.SoCG.2019.18},
  annote =	{Keywords: Polyline simplification, Fr\'{e}chet distance, Hausdorff distance, Conditional lower bounds}
}
Document
On Fair Division for Indivisible Items

Authors: Bhaskar Ray Chaudhury, Yun Kuen Cheung, Jugal Garg, Naveen Garg, Martin Hoefer, and Kurt Mehlhorn

Published in: LIPIcs, Volume 122, 38th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2018)


Abstract
We consider the task of assigning indivisible goods to a set of agents in a fair manner. Our notion of fairness is Nash social welfare, i.e., the goal is to maximize the geometric mean of the utilities of the agents. Each good comes in multiple items or copies, and the utility of an agent diminishes as it receives more items of the same good. The utility of a bundle of items for an agent is the sum of the utilities of the items in the bundle. Each agent has a utility cap beyond which he does not value additional items. We give a polynomial time approximation algorithm that maximizes Nash social welfare up to a factor of e^{1/e} ≈ 1.445. The computed allocation is Pareto-optimal and approximates envy-freeness up to one item up to a factor of 2 + epsilon.
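
Since the abstract defines the objective only in words, the short sketch below spells out the Nash social welfare computation for the model described (additive utilities over item copies, truncated at a per-agent cap); all function and variable names are illustrative, and nothing here is taken from the paper's algorithm.

import math

def agent_utility(copy_utils, cap):
    """Additive utility of a bundle: copy_utils lists the (diminishing)
    per-copy utilities of the items an agent received; the total is
    truncated at the agent's utility cap."""
    return min(sum(copy_utils), cap)

def nash_social_welfare(utilities):
    """Geometric mean of the agents' utilities, (u_1 * ... * u_n)^(1/n),
    computed in log-space for numerical stability."""
    if any(u <= 0 for u in utilities):
        return 0.0
    return math.exp(sum(math.log(u) for u in utilities) / len(utilities))

# Example with two agents: NSW = sqrt(8 * 6) ~ 6.93.
u1 = agent_utility([5, 3, 1], cap=8)   # raw sum 9, capped at 8
u2 = agent_utility([4, 2], cap=10)     # 6
print(nash_social_welfare([u1, u2]))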

Cite as

Bhaskar Ray Chaudhury, Yun Kuen Cheung, Jugal Garg, Naveen Garg, Martin Hoefer, and Kurt Mehlhorn. On Fair Division for Indivisible Items. In 38th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 122, pp. 25:1-25:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{chaudhury_et_al:LIPIcs.FSTTCS.2018.25,
  author =	{Chaudhury, Bhaskar Ray and Cheung, Yun Kuen and Garg, Jugal and Garg, Naveen and Hoefer, Martin and Mehlhorn, Kurt},
  title =	{{On Fair Division for Indivisible Items}},
  booktitle =	{38th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2018)},
  pages =	{25:1--25:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-093-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{122},
  editor =	{Ganguly, Sumit and Pandya, Paritosh},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2018.25},
  URN =		{urn:nbn:de:0030-drops-99242},
  doi =		{10.4230/LIPIcs.FSTTCS.2018.25},
  annote =	{Keywords: Fair Division, Indivisible Goods, Envy-Free}
}
Document
Combinatorial Algorithms for General Linear Arrow-Debreu Markets

Authors: Bhaskar Ray Chaudhury and Kurt Mehlhorn

Published in: LIPIcs, Volume 122, 38th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2018)


Abstract
We present a combinatorial algorithm for determining the market clearing prices of a general linear Arrow-Debreu market, where every agent can own multiple goods. The existing combinatorial algorithms for linear Arrow-Debreu markets consider the case where each agent can own all of one good only. We present an O~((n+m)^7 log^3(UW)) algorithm, where n, m, U and W refer to the number of agents, the number of goods, the maximal integral utility, and the maximum quantity of any good in the market, respectively. The algorithm refines the iterative algorithm of Duan, Garg, and Mehlhorn using several new ideas. We also identify hard instances for the existing combinatorial algorithms for linear Arrow-Debreu markets. In particular, we find instances where the ratio of the maximum to the minimum equilibrium price of a good is U^{Omega(n)} and the number of iterations required by the existing iterative combinatorial algorithms of Duan and Mehlhorn, and of Duan, Garg, and Mehlhorn, is high. Our instances also separate the two algorithms.
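
For readers new to the model, the following are the standard equilibrium conditions for a linear Arrow-Debreu exchange market, stated here only for context; the notation (utilities u_{ij}, endowments w_{ij}, prices p_j, allocation x_{ij}) is illustrative and not tied to the paper. Agent i owns w_{ij} units of good j, earns the value of her endowment at prices p, and spends it only on goods that maximize her utility per unit of money ("bang per buck").

% Standard market clearing conditions for a linear Arrow-Debreu exchange market; notation is illustrative.
\begin{align*}
& x_{ij} > 0 \;\Longrightarrow\; \frac{u_{ij}}{p_j} = \max_{\ell} \frac{u_{i\ell}}{p_{\ell}}
  && \text{(only maximum bang-per-buck goods are bought)} \\
& \textstyle\sum_j p_j x_{ij} = \sum_j p_j w_{ij} \quad \text{for every agent } i
  && \text{(each budget is exactly exhausted)} \\
& \textstyle\sum_i x_{ij} = \sum_i w_{ij} \quad \text{for every good } j
  && \text{(every good is fully sold)}
\end{align*}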

Cite as

Bhaskar Ray Chaudhury and Kurt Mehlhorn. Combinatorial Algorithms for General Linear Arrow-Debreu Markets. In 38th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 122, pp. 26:1-26:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{chaudhury_et_al:LIPIcs.FSTTCS.2018.26,
  author =	{Chaudhury, Bhaskar Ray and Mehlhorn, Kurt},
  title =	{{Combinatorial Algorithms for General Linear Arrow-Debreu Markets}},
  booktitle =	{38th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2018)},
  pages =	{26:1--26:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-093-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{122},
  editor =	{Ganguly, Sumit and Pandya, Paritosh},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2018.26},
  URN =		{urn:nbn:de:0030-drops-99255},
  doi =		{10.4230/LIPIcs.FSTTCS.2018.26},
  annote =	{Keywords: Linear Exchange Markets, Equilibrium, Combinatorial Algorithms}
}
Document
Sketching, Streaming, and Fine-Grained Complexity of (Weighted) LCS

Authors: Karl Bringmann and Bhaskar Ray Chaudhury

Published in: LIPIcs, Volume 122, 38th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2018)


Abstract
We study sketching and streaming algorithms for the Longest Common Subsequence problem (LCS) on strings of small alphabet size |Sigma|. For the problem of deciding whether the LCS of strings x,y has length at least L, we obtain a sketch size and streaming space usage of O(L^{|Sigma| - 1} log L). We also prove matching unconditional lower bounds. As an application, we study a variant of LCS where each alphabet symbol is equipped with a weight that is given as input, and the task is to compute a common subsequence of maximum total weight. Using our sketching algorithm, we obtain an O(min{nm, n + m^{|Sigma|}})-time algorithm for this problem, on strings x,y of length n,m, with n >= m. We prove optimality of this running time up to lower order factors, assuming the Strong Exponential Time Hypothesis.
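
To make the weighted variant concrete: each alphabet symbol carries a weight given as part of the input, and the goal is a common subsequence of maximum total weight. The sketch below is the textbook O(nm) dynamic program for this variant, i.e., the nm branch of the running time quoted above, not the paper's sketching-based algorithm; all names are illustrative.

def weighted_lcs(x, y, weight):
    """Maximum total weight of a common subsequence of strings x and y,
    where `weight` maps each symbol to a non-negative weight.
    Classic O(n*m) dynamic program over prefixes of x and y."""
    n, m = len(x), len(y)
    # dp[i][j] = best achievable weight using the prefixes x[:i] and y[:j]
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if x[i - 1] == y[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + weight[x[i - 1]]
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[n][m]

# Example over the alphabet {a, b}: "aba" is optimal, with weight 3 + 1 + 3 = 7.
print(weighted_lcs("abab", "baba", {"a": 3, "b": 1}))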

Cite as

Karl Bringmann and Bhaskar Ray Chaudhury. Sketching, Streaming, and Fine-Grained Complexity of (Weighted) LCS. In 38th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 122, pp. 40:1-40:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{bringmann_et_al:LIPIcs.FSTTCS.2018.40,
  author =	{Bringmann, Karl and Chaudhury, Bhaskar Ray},
  title =	{{Sketching, Streaming, and Fine-Grained Complexity of (Weighted) LCS}},
  booktitle =	{38th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2018)},
  pages =	{40:1--40:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-093-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{122},
  editor =	{Ganguly, Sumit and Pandya, Paritosh},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2018.40},
  URN =		{urn:nbn:de:0030-drops-99390},
  doi =		{10.4230/LIPIcs.FSTTCS.2018.40},
  annote =	{Keywords: algorithms, SETH, communication complexity, run-length encoding}
}