Search Results

Documents authored by Sen, Sandeep


Document
Generalizations of Length Limited Huffman Coding for Hierarchical Memory Settings

Authors: Shashwat Banchhor, Rishikesh Gajjala, Yogish Sabharwal, and Sandeep Sen

Published in: LIPIcs, Volume 213, 41st IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2021)


Abstract
In this paper, we study the problem of designing prefix-free encoding schemes having minimum average code length that can be decoded efficiently under a decode cost model that captures memory hierarchy induced cost functions. We also study a special case of this problem that is closely related to the length limited Huffman coding (LLHC) problem; we call this the soft-length limited Huffman coding problem. In this version, there is a penalty associated with each of the n characters of the alphabet whose encodings exceed a specified bound D (≤ n), where the penalty increases linearly with the length of the encoding beyond D. The goal is to find a prefix-free encoding having minimum average code length and total penalty within a pre-specified bound P. This generalizes the LLHC problem. We present an algorithm that solves this problem in O(nD) time. We study a further generalization in which the penalty function and the objective function can both be arbitrary monotonically non-decreasing functions of the codeword length, and we provide dynamic programming based exact and PTAS algorithms for this setting.
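For orientation, the classical unconstrained Huffman construction that the length-limited variants above generalize can be sketched in a few lines. The following is a minimal illustrative sketch (the function name and example frequencies are ours, not from the paper); it ignores the depth bound D and the penalty budget P.

import heapq

def huffman_code_lengths(freqs):
    # Return optimal codeword lengths for an unconstrained prefix-free code.
    # freqs: list of positive character frequencies.
    if len(freqs) == 1:
        return [1]
    # Heap entries: (subtree weight, tie-breaking id, leaf indices in subtree).
    heap = [(f, i, [i]) for i, f in enumerate(freqs)]
    heapq.heapify(heap)
    depth = [0] * len(freqs)
    uid = len(freqs)
    while len(heap) > 1:
        w1, _, leaves1 = heapq.heappop(heap)
        w2, _, leaves2 = heapq.heappop(heap)
        # Merging the two lightest subtrees pushes their leaves one level deeper.
        for i in leaves1 + leaves2:
            depth[i] += 1
        heapq.heappush(heap, (w1 + w2, uid, leaves1 + leaves2))
        uid += 1
    return depth

print(huffman_code_lengths([45, 13, 12, 16, 9, 5]))  # e.g. [1, 3, 3, 3, 4, 4]

The soft-length limited variant additionally charges a linearly growing penalty for every codeword longer than D, which rules out this simple greedy merge.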

Cite as

Shashwat Banchhor, Rishikesh Gajjala, Yogish Sabharwal, and Sandeep Sen. Generalizations of Length Limited Huffman Coding for Hierarchical Memory Settings. In 41st IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 213, pp. 8:1-8:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{banchhor_et_al:LIPIcs.FSTTCS.2021.8,
  author =	{Banchhor, Shashwat and Gajjala, Rishikesh and Sabharwal, Yogish and Sen, Sandeep},
  title =	{{Generalizations of Length Limited Huffman Coding for Hierarchical Memory Settings}},
  booktitle =	{41st IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2021)},
  pages =	{8:1--8:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-215-0},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{213},
  editor =	{Boja\'{n}czyk, Miko{\l}aj and Chekuri, Chandra},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2021.8},
  URN =		{urn:nbn:de:0030-drops-155193},
  doi =		{10.4230/LIPIcs.FSTTCS.2021.8},
  annote =	{Keywords: Approximation algorithms, Hierarchical memory, Prefix free codes}
}
Document
A Unified Approach to Tail Estimates for Randomized Incremental Construction

Authors: Sandeep Sen

Published in: LIPIcs, Volume 126, 36th International Symposium on Theoretical Aspects of Computer Science (STACS 2019)


Abstract
By combining several interesting applications of random sampling in geometric algorithms such as point location, linear programming, segment intersections, and binary space partitioning, Clarkson and Shor [Kenneth L. Clarkson and Peter W. Shor, 1989] developed a general framework of randomized incremental construction (RIC). The basic idea is to add objects in a random order and show that this approach yields efficient/optimal bounds on expected running time. Even quicksort can be viewed as a special case of this paradigm. However, unlike quicksort, for most of these problems, sharper tail estimates on their running times are not known. Barring some promising attempts in [Kurt Mehlhorn et al., 1993; Kenneth L. Clarkson et al., 1992; Raimund Seidel, 1991], the general question remains unresolved. In this paper we present a general technique to obtain tail estimates for RIC and provide applications to some fundamental problems such as Delaunay triangulations and the construction of visibility maps of intersecting line segments. The main result of the paper is derived from a new and careful application of Freedman's [David Freedman, 1975] inequality for martingale concentration, which overcomes the bottleneck of the better known Azuma-Hoeffding inequality. Further, we explore instances where an RIC based algorithm may not have inverse polynomial tail estimates. In particular, we show that the RIC based algorithm for the trapezoidal map can encounter a running time of Ω(n log n log log n) with probability exceeding 1/√n. This rules out inverse polynomial concentration bounds within a constant factor of the O(n log n) expected running time.
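The abstract's remark that quicksort is a special case of RIC is easy to make concrete: picking a uniformly random pivot at every level is equivalent to inserting the elements in random order. A minimal illustrative sketch of that special case (not the tail-estimate machinery of the paper):

import random

def randomized_quicksort(a):
    # The random pivot plays the role of the randomly ordered insertions in RIC.
    if len(a) <= 1:
        return list(a)
    pivot = random.choice(a)
    smaller = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    larger = [x for x in a if x > pivot]
    return randomized_quicksort(smaller) + equal + randomized_quicksort(larger)

print(randomized_quicksort([5, 3, 8, 1, 9, 2, 7]))

Quicksort is also the setting where sharp tail bounds are already known; the paper's contribution is to obtain comparable concentration results for geometric RIC algorithms such as Delaunay triangulation.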

Cite as

Sandeep Sen. A Unified Approach to Tail Estimates for Randomized Incremental Construction. In 36th International Symposium on Theoretical Aspects of Computer Science (STACS 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 126, pp. 58:1-58:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{sen:LIPIcs.STACS.2019.58,
  author =	{Sen, Sandeep},
  title =	{{A Unified Approach to Tail Estimates for Randomized Incremental Construction}},
  booktitle =	{36th International Symposium on Theoretical Aspects of Computer Science (STACS 2019)},
  pages =	{58:1--58:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-100-9},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{126},
  editor =	{Niedermeier, Rolf and Paul, Christophe},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2019.58},
  URN =		{urn:nbn:de:0030-drops-102977},
  doi =		{10.4230/LIPIcs.STACS.2019.58},
  annote =	{Keywords: ric, tail estimates, martingale, lower bound}
}
Document
Complete Volume
LIPIcs, Volume 65, FSTTCS'16, Complete Volume

Authors: Akash Lal, S. Akshay, Saket Saurabh, and Sandeep Sen

Published in: LIPIcs, Volume 65, 36th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2016)


Abstract
LIPIcs, Volume 65, FSTTCS'16, Complete Volume

Cite as

36th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 65, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@Proceedings{lal_et_al:LIPIcs.FSTTCS.2016,
  title =	{{LIPIcs, Volume 65, FSTTCS'16, Complete Volume}},
  booktitle =	{36th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2016)},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-027-9},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{65},
  editor =	{Lal, Akash and Akshay, S. and Saurabh, Saket and Sen, Sandeep},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2016},
  URN =		{urn:nbn:de:0030-drops-69074},
  doi =		{10.4230/LIPIcs.FSTTCS.2016},
  annote =	{Keywords: Software/Program Verification, Models of Computation, Modes of Computation, Complexity Measures and Classes, Nonnumerical Algorithms and Problems, Specifying and Verifying and Reasoning about Programs, Mathematical Logic, Formal Languages}
}
Document
Front Matter
Front Matter, Table of Contents, Preface, Conference Organization, External Reviewers

Authors: Akash Lal, S. Akshay, Saket Saurabh, and Sandeep Sen

Published in: LIPIcs, Volume 65, 36th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2016)


Abstract
Front Matter, Table of Contents, Preface, Conference Organization, External Reviewers

Cite as

36th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 65, pp. 0:i-0:xiv, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{lal_et_al:LIPIcs.FSTTCS.2016.0,
  author =	{Lal, Akash and Akshay, S. and Saurabh, Saket and Sen, Sandeep},
  title =	{{Front Matter, Table of Contents, Preface, Conference Organization, External Reviewers}},
  booktitle =	{36th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2016)},
  pages =	{0:i--0:xiv},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-027-9},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{65},
  editor =	{Lal, Akash and Akshay, S. and Saurabh, Saket and Sen, Sandeep},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2016.0},
  URN =		{urn:nbn:de:0030-drops-68412},
  doi =		{10.4230/LIPIcs.FSTTCS.2016.0},
  annote =	{Keywords: Front Matter, Table of Contents, Preface, Conference Organization, External Reviewers}
}
Document
On Density, Threshold and Emptiness Queries for Intervals in the Streaming Model

Authors: Arijit Bishnu, Amit Chakrabarti, Subhas C. Nandy, and Sandeep Sen

Published in: LIPIcs, Volume 45, 35th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2015)


Abstract
In this paper, we study the maximum density, threshold and emptiness queries for intervals in the streaming model. The input is a stream S of n points on the real line R and a floating closed interval W of width α. The specific problems we consider are as follows.
- Maximum density: find a placement of W in R containing the maximum number of points of S.
- Threshold query: find a placement of W in R, if it exists, that contains at least Δ elements of S.
- Emptiness query: find, if possible, a placement of W within the extent of S so that the interior of W does not contain any element of S.
The stream S, being huge, does not fit into main memory and can be read sequentially at most a constant number of times, usually once. The problems studied here in the geometric setting are related to frequency estimation and heavy hitter identification in a stream of data. We provide lower bounds and results on the trade-off between extra space and quality of solution. We also discuss generalizations to higher-dimensional variants for a few cases.
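For intuition about the first query type, the offline version of the maximum-density problem (the whole point set in memory) reduces to a sorted two-pointer sweep; the paper's streaming results concern what can still be achieved when S arrives as a one-pass stream with limited extra space. A hypothetical offline sketch (names and example values are ours):

def max_density_offline(points, alpha):
    # Maximum number of points of S covered by some closed interval of width alpha.
    pts = sorted(points)
    best, left = 0, 0
    for right, x in enumerate(pts):
        # Shrink the window from the left until it spans at most alpha.
        while x - pts[left] > alpha:
            left += 1
        best = max(best, right - left + 1)
    return best

print(max_density_offline([0.1, 0.2, 0.9, 1.0, 1.05, 2.5], alpha=0.2))  # 3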

Cite as

Arijit Bishnu, Amit Chakrabarti, Subhas C. Nandy, and Sandeep Sen. On Density, Threshold and Emptiness Queries for Intervals in the Streaming Model. In 35th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 45, pp. 336-349, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


BibTeX

@InProceedings{bishnu_et_al:LIPIcs.FSTTCS.2015.336,
  author =	{Bishnu, Arijit and Chakrabarti, Amit and Nandy, Subhas C. and Sen, Sandeep},
  title =	{{On Density, Threshold and Emptiness Queries for Intervals in the Streaming Model}},
  booktitle =	{35th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2015)},
  pages =	{336--349},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-97-2},
  ISSN =	{1868-8969},
  year =	{2015},
  volume =	{45},
  editor =	{Harsha, Prahladh and Ramalingam, G.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2015.336},
  URN =		{urn:nbn:de:0030-drops-56488},
  doi =		{10.4230/LIPIcs.FSTTCS.2015.336},
  annote =	{Keywords: Density, threshold, emptiness queries, interval queries, streaming model, heavy hitter, frequency estimation}
}
Document
Maintaining Approximate Maximum Weighted Matching in Fully Dynamic Graphs

Authors: Abhash Anand, Surender Baswana, Manoj Gupta, and Sandeep Sen

Published in: LIPIcs, Volume 18, IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2012)


Abstract
We present a fully dynamic algorithm for maintaining an approximate maximum weight matching in general weighted graphs. The algorithm maintains a matching M whose weight is at least M*/8, where M* is the weight of a maximum weight matching. The algorithm achieves an expected amortized time of O(log n log C) per edge insertion or deletion, where C is the ratio of the weight of the heaviest edge to that of the lightest edge in the given graph.
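For contrast with the dynamic guarantee above, the standard static baseline is the greedy rule: scan edges in decreasing weight order and take an edge whenever both endpoints are free, which yields at least half the weight of a maximum weight matching. A minimal illustrative sketch (not the paper's dynamic data structure; names are ours):

def greedy_weighted_matching(edges):
    # edges: iterable of (u, v, weight); returns a 1/2-approximate matching.
    matched = set()
    matching = []
    for u, v, w in sorted(edges, key=lambda e: -e[2]):
        if u not in matched and v not in matched:
            matching.append((u, v, w))
            matched.update((u, v))
    return matching

print(greedy_weighted_matching([(1, 2, 5.0), (2, 3, 4.0), (3, 4, 3.0)]))

The difficulty addressed by the paper is maintaining a constant-factor guarantee under edge insertions and deletions in O(log n log C) expected amortized time per update.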

Cite as

Abhash Anand, Surender Baswana, Manoj Gupta, and Sandeep Sen. Maintaining Approximate Maximum Weighted Matching in Fully Dynamic Graphs. In IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2012). Leibniz International Proceedings in Informatics (LIPIcs), Volume 18, pp. 257-266, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2012)


BibTeX

@InProceedings{anand_et_al:LIPIcs.FSTTCS.2012.257,
  author =	{Anand, Abhash and Baswana, Surender and Gupta, Manoj and Sen, Sandeep},
  title =	{{Maintaining Approximate Maximum Weighted Matching in Fully Dynamic Graphs}},
  booktitle =	{IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2012)},
  pages =	{257--266},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-47-7},
  ISSN =	{1868-8969},
  year =	{2012},
  volume =	{18},
  editor =	{D'Souza, Deepak and Radhakrishnan, Jaikumar and Telikepalli, Kavitha},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2012.257},
  URN =		{urn:nbn:de:0030-drops-38648},
  doi =		{10.4230/LIPIcs.FSTTCS.2012.257},
  annote =	{Keywords: Matching, Dynamic Algorithm, Graph Algorithm}
}
Document
The update complexity of selection and related problems

Authors: Manoj Gupta, Yogish Sabharwal, and Sandeep Sen

Published in: LIPIcs, Volume 13, IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2011)


Abstract
We present a framework for computing with input data specified by intervals, which represent uncertainty in the values of the input parameters. To compute a solution, the algorithm can query input parameters to obtain more refined estimates in the form of sub-intervals, and the objective is to minimize the number of queries. Previous approaches address the scenario where every query returns an exact value. Our framework is more general, as it can deal with a wider variety of inputs and query responses, and we establish interesting relationships between them that have not been investigated previously. Although some approaches from the previous restricted models can be adapted to the more general model, the analysis requires more sophisticated techniques, and we also obtain improved algorithms for the previous model. We address selection problems in the generalized model and show that there exist 2-update competitive algorithms that do not depend on the lengths or distribution of the sub-intervals and that hold against a worst-case adversary. We also obtain similar bounds on the competitive ratio for the MST problem in graphs.
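To make the query model concrete for the simplest selection task, identifying the minimum, here is a naive illustrative strategy (not the paper's 2-update competitive algorithm; all names and the example are ours): each value is known only as an interval, a query reveals it exactly, and we keep querying until some element is certainly the minimum.

def identify_minimum(intervals, reveal):
    # intervals: list of (lo, hi) bounds on the unknown values.
    # reveal(i): query oracle returning the exact value of input i.
    # Returns (index of a certain minimum, number of queries made).
    lo = [l for l, _ in intervals]
    hi = [h for _, h in intervals]
    queried = [False] * len(intervals)
    num_queries = 0
    while True:
        cand = min(range(len(lo)), key=lambda i: lo[i])
        # cand is certainly a minimum once its upper bound is <= every other lower bound.
        if all(hi[cand] <= lo[j] for j in range(len(lo)) if j != cand):
            return cand, num_queries
        if not queried[cand]:
            lo[cand] = hi[cand] = reveal(cand)
        else:
            # cand is already exact but still ambiguous: query its cheapest competitor.
            cand = min((k for k in range(len(lo)) if not queried[k]), key=lambda k: lo[k])
            lo[cand] = hi[cand] = reveal(cand)
        queried[cand] = True
        num_queries += 1

values = [3, 5, 4]  # hidden exact values
print(identify_minimum([(1, 6), (4, 7), (2, 8)], reveal=lambda i: values[i]))  # (0, 2)

Choosing which interval to query next is exactly what the competitive analysis measures; the paper gives strategies that are 2-update competitive regardless of the lengths or distribution of the sub-intervals.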

Cite as

Manoj Gupta, Yogish Sabharwal, and Sandeep Sen. The update complexity of selection and related problems. In IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2011). Leibniz International Proceedings in Informatics (LIPIcs), Volume 13, pp. 325-338, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2011)


BibTeX

@InProceedings{gupta_et_al:LIPIcs.FSTTCS.2011.325,
  author =	{Gupta, Manoj and Sabharwal, Yogish and Sen, Sandeep},
  title =	{{The update complexity of selection and related problems}},
  booktitle =	{IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2011)},
  pages =	{325--338},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-34-7},
  ISSN =	{1868-8969},
  year =	{2011},
  volume =	{13},
  editor =	{Chakraborty, Supratik and Kumar, Amit},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2011.325},
  URN =		{urn:nbn:de:0030-drops-33314},
  doi =		{10.4230/LIPIcs.FSTTCS.2011.325},
  annote =	{Keywords: Uncertain data, Competitive analysis}
}