21 Search Results for "Mehlhorn, Kurt"


Document
A Formal Analysis of RANKING

Authors: Mohammad Abdulaziz and Christoph Madlener

Published in: LIPIcs, Volume 268, 14th International Conference on Interactive Theorem Proving (ITP 2023)


Abstract
We describe a formal correctness proof of RANKING, an online algorithm for bipartite matching. An outcome of our formalisation is that it exposes a gap in all combinatorial proofs of the algorithm. Filling that gap constituted the majority of the effort that went into this work, despite the algorithm being one of the most studied algorithms and a central result in theoretical computer science. This gap is an example of the difficulties of formalising the graphical arguments that are ubiquitous in the theory of computing.
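
For readers unfamiliar with the algorithm itself, a minimal Python sketch of the textbook formulation of RANKING follows (hypothetical vertex names; this is a sketch of the algorithm, not of the formalisation described in the paper). Offline vertices receive ranks from a uniformly random permutation, and each arriving online vertex is matched to its best-ranked free neighbour.

import random

def ranking(offline, online_arrivals):
    # Draw a uniformly random permutation of the offline side and record ranks.
    order = list(offline)
    random.shuffle(order)
    rank = {v: i for i, v in enumerate(order)}
    matched_offline = set()
    matching = []
    # Online vertices arrive one by one, each with its list of offline neighbours.
    for u, neighbours in online_arrivals:
        free = [v for v in neighbours if v not in matched_offline]
        if free:
            v = min(free, key=rank.__getitem__)  # best-ranked free neighbour
            matched_offline.add(v)
            matching.append((u, v))
    return matching

# Example: offline vertices {a, b}; online vertices x and y arrive in this order.
print(ranking(["a", "b"], [("x", ["a", "b"]), ("y", ["a"])]))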

Cite as

Mohammad Abdulaziz and Christoph Madlener. A Formal Analysis of RANKING. In 14th International Conference on Interactive Theorem Proving (ITP 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 268, pp. 3:1-3:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{abdulaziz_et_al:LIPIcs.ITP.2023.3,
  author =	{Abdulaziz, Mohammad and Madlener, Christoph},
  title =	{{A Formal Analysis of RANKING}},
  booktitle =	{14th International Conference on Interactive Theorem Proving (ITP 2023)},
  pages =	{3:1--3:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-284-6},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{268},
  editor =	{Naumowicz, Adam and Thiemann, Ren\'{e}},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITP.2023.3},
  URN =		{urn:nbn:de:0030-drops-183785},
  doi =		{10.4230/LIPIcs.ITP.2023.3},
  annote =	{Keywords: Matching Theory, Formalized Mathematics, Online Algorithms}
}
Document
Invited Talk
Trustworthy Graph Algorithms (Invited Talk)

Authors: Mohammad Abdulaziz, Kurt Mehlhorn, and Tobias Nipkow

Published in: LIPIcs, Volume 138, 44th International Symposium on Mathematical Foundations of Computer Science (MFCS 2019)


Abstract
The goal of the LEDA project was to build an easy-to-use and extendable library of correct and efficient data structures, graph algorithms and geometric algorithms. We report on the use of formal program verification to achieve an even higher level of trustworthiness. Specifically, we report on an ongoing and largely finished verification of the blossom-shrinking algorithm for maximum cardinality matching.

Cite as

Mohammad Abdulaziz, Kurt Mehlhorn, and Tobias Nipkow. Trustworthy Graph Algorithms (Invited Talk). In 44th International Symposium on Mathematical Foundations of Computer Science (MFCS 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 138, pp. 1:1-1:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{abdulaziz_et_al:LIPIcs.MFCS.2019.1,
  author =	{Abdulaziz, Mohammad and Mehlhorn, Kurt and Nipkow, Tobias},
  title =	{{Trustworthy Graph Algorithms}},
  booktitle =	{44th International Symposium on Mathematical Foundations of Computer Science (MFCS 2019)},
  pages =	{1:1--1:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-117-7},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{138},
  editor =	{Rossmanith, Peter and Heggernes, Pinar and Katoen, Joost-Pieter},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2019.1},
  URN =		{urn:nbn:de:0030-drops-109456},
  doi =		{10.4230/LIPIcs.MFCS.2019.1},
  annote =	{Keywords: graph algorithms, formal correct proofs, Isabelle, LEDA, certifying algorithms}
}
Document
A Unified Approach to Tail Estimates for Randomized Incremental Construction

Authors: Sandeep Sen

Published in: LIPIcs, Volume 126, 36th International Symposium on Theoretical Aspects of Computer Science (STACS 2019)


Abstract
By combining several interesting applications of random sampling in geometric algorithms like point location, linear programming, segment intersections, and binary space partitioning, Clarkson and Shor [Kenneth L. Clarkson and Peter W. Shor, 1989] developed a general framework of randomized incremental construction (RIC). The basic idea is to add objects in a random order and show that this approach yields efficient/optimal bounds on expected running time. Even quicksort can be viewed as a special case of this paradigm. However, unlike quicksort, for most of these problems sharper tail estimates on their running times are not known. Barring some promising attempts in [Kurt Mehlhorn et al., 1993; Kenneth L. Clarkson et al., 1992; Raimund Seidel, 1991], the general question remains unresolved. In this paper we present a general technique to obtain tail estimates for RIC and provide applications to some fundamental problems like Delaunay triangulations and the construction of visibility maps of intersecting line segments. The main result of the paper is derived from a new and careful application of Freedman's [David Freedman, 1975] inequality for martingale concentration that overcomes the bottleneck of the better-known Azuma-Hoeffding inequality. Further, we explore instances where an RIC-based algorithm may not have inverse polynomial tail estimates. In particular, we show that the RIC time bound for the trapezoidal map can encounter a running time of Omega(n log n log log n) with probability exceeding 1/sqrt(n). This rules out inverse polynomial concentration bounds within a constant factor of the O(n log n) expected running time.
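
As a purely illustrative instance of the RIC paradigm described above (the quicksort/BST special case, not the paper's analysis), the sketch below inserts n keys in a uniformly random order into an unbalanced binary search tree and records the total number of comparisons; repeating the experiment gives an empirical view of how concentrated the Theta(n log n) expected cost is.

import random

def ric_bst_cost(n):
    # Randomized incremental construction in its simplest form: insert the
    # objects in a random order and charge one unit of work per comparison.
    keys = list(range(n))
    random.shuffle(keys)
    children = {keys[0]: [None, None]}  # key -> [left child, right child]
    root, cost = keys[0], 0
    for x in keys[1:]:
        cur = root
        while True:
            cost += 1
            side = 0 if x < cur else 1
            if children[cur][side] is None:
                children[cur][side] = x
                children[x] = [None, None]
                break
            cur = children[cur][side]
    return cost

samples = [ric_bst_cost(2000) for _ in range(20)]
print(min(samples), sum(samples) // len(samples), max(samples))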

Cite as

Sandeep Sen. A Unified Approach to Tail Estimates for Randomized Incremental Construction. In 36th International Symposium on Theoretical Aspects of Computer Science (STACS 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 126, pp. 58:1-58:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{sen:LIPIcs.STACS.2019.58,
  author =	{Sen, Sandeep},
  title =	{{A Unified Approach to Tail Estimates for Randomized Incremental Construction}},
  booktitle =	{36th International Symposium on Theoretical Aspects of Computer Science (STACS 2019)},
  pages =	{58:1--58:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-100-9},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{126},
  editor =	{Niedermeier, Rolf and Paul, Christophe},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2019.58},
  URN =		{urn:nbn:de:0030-drops-102977},
  doi =		{10.4230/LIPIcs.STACS.2019.58},
  annote =	{Keywords: ric, tail estimates, martingale, lower bound}
}
Document
Multi-Finger Binary Search Trees

Authors: Parinya Chalermsook, Mayank Goswami, László Kozma, Kurt Mehlhorn, and Thatchaphol Saranurak

Published in: LIPIcs, Volume 123, 29th International Symposium on Algorithms and Computation (ISAAC 2018)


Abstract
We study multi-finger binary search trees (BSTs), a far-reaching extension of the classical BST model, with connections to the well-studied k-server problem. Finger search is a popular technique for speeding up BST operations when a query sequence has locality of reference. BSTs with multiple fingers can exploit more general regularities in the input. In this paper we consider the cost of serving a sequence of queries in an optimal (offline) BST with k fingers, a powerful benchmark against which other algorithms can be measured. We show that the k-finger optimum can be matched by a standard dynamic BST (having a single root-finger) with an O(log k) factor overhead. This result is tight for all k, improving the O(k) factor implicit in earlier work. Furthermore, we describe new online BSTs that match this bound up to a (log k)^{O(1)} factor. Previously only the "one-finger" special case was known to hold for an online BST (Iacono, Langerman, 2016; Cole et al., 2000). Splay trees, assuming their conjectured optimality (Sleator and Tarjan, 1983), would have to match our bounds for all k. Our online algorithms are randomized and combine techniques developed for the k-server problem with a multiplicative-weights scheme for learning tree metrics. To our knowledge, this is the first time that tools developed for the k-server problem have been used in BSTs. As an application of our k-finger results, we show that BSTs can efficiently serve queries that are close to some recently accessed item. This is a (restricted) form of the unified property (Iacono, 2001) that was previously not known to hold for any BST algorithm, online or offline.

Cite as

Parinya Chalermsook, Mayank Goswami, László Kozma, Kurt Mehlhorn, and Thatchaphol Saranurak. Multi-Finger Binary Search Trees. In 29th International Symposium on Algorithms and Computation (ISAAC 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 123, pp. 55:1-55:26, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{chalermsook_et_al:LIPIcs.ISAAC.2018.55,
  author =	{Chalermsook, Parinya and Goswami, Mayank and Kozma, L\'{a}szl\'{o} and Mehlhorn, Kurt and Saranurak, Thatchaphol},
  title =	{{Multi-Finger Binary Search Trees}},
  booktitle =	{29th International Symposium on Algorithms and Computation (ISAAC 2018)},
  pages =	{55:1--55:26},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-094-1},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{123},
  editor =	{Hsu, Wen-Lian and Lee, Der-Tsai and Liao, Chung-Shou},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2018.55},
  URN =		{urn:nbn:de:0030-drops-100032},
  doi =		{10.4230/LIPIcs.ISAAC.2018.55},
  annote =	{Keywords: binary search trees, dynamic optimality, finger search, k-server}
}
Document
On Fair Division for Indivisible Items

Authors: Bhaskar Ray Chaudhury, Yun Kuen Cheung, Jugal Garg, Naveen Garg, Martin Hoefer, and Kurt Mehlhorn

Published in: LIPIcs, Volume 122, 38th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2018)


Abstract
We consider the task of assigning indivisible goods to a set of agents in a fair manner. Our notion of fairness is Nash social welfare, i.e., the goal is to maximize the geometric mean of the utilities of the agents. Each good comes in multiple items or copies, and the utility of an agent diminishes as it receives more items of the same good. The utility of a bundle of items for an agent is the sum of the utilities of the items in the bundle. Each agent has a utility cap beyond which he does not value additional items. We give a polynomial time approximation algorithm that maximizes Nash social welfare up to a factor of e^{1/e} ≈ 1.445. The computed allocation is Pareto-optimal and approximates envy-freeness up to one item up to a factor of 2 + epsilon.
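
For concreteness, the objective is the geometric mean of the agents' utilities. The few lines below (with invented utility numbers) show why it favours balanced allocations over lopsided ones of equal total utility.

from math import prod

def nash_social_welfare(utilities):
    # Geometric mean of the agents' utilities under a fixed allocation.
    return prod(utilities) ** (1.0 / len(utilities))

# Both allocations give total utility 12 to three agents, but the balanced
# one has a much higher Nash social welfare.
print(nash_social_welfare([4, 4, 4]))    # 4.0
print(nash_social_welfare([10, 1, 1]))   # ~2.15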

Cite as

Bhaskar Ray Chaudhury, Yun Kuen Cheung, Jugal Garg, Naveen Garg, Martin Hoefer, and Kurt Mehlhorn. On Fair Division for Indivisible Items. In 38th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 122, pp. 25:1-25:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{chaudhury_et_al:LIPIcs.FSTTCS.2018.25,
  author =	{Chaudhury, Bhaskar Ray and Cheung, Yun Kuen and Garg, Jugal and Garg, Naveen and Hoefer, Martin and Mehlhorn, Kurt},
  title =	{{On Fair Division for Indivisible Items}},
  booktitle =	{38th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2018)},
  pages =	{25:1--25:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-093-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{122},
  editor =	{Ganguly, Sumit and Pandya, Paritosh},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2018.25},
  URN =		{urn:nbn:de:0030-drops-99242},
  doi =		{10.4230/LIPIcs.FSTTCS.2018.25},
  annote =	{Keywords: Fair Division, Indivisible Goods, Envy-Free}
}
Document
Combinatorial Algorithms for General Linear Arrow-Debreu Markets

Authors: Bhaskar Ray Chaudhury and Kurt Mehlhorn

Published in: LIPIcs, Volume 122, 38th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2018)


Abstract
We present a combinatorial algorithm for determining the market clearing prices of a general linear Arrow-Debreu market, where every agent can own multiple goods. The existing combinatorial algorithms for linear Arrow-Debreu markets consider the case where each agent can own all of one good only. We present an O~((n+m)^7 log^3(UW)) algorithm, where n, m, U, and W refer to the number of agents, the number of goods, the maximal integral utility, and the maximum quantity of any good in the market, respectively. The algorithm refines the iterative algorithm of Duan, Garg, and Mehlhorn using several new ideas. We also identify hard instances for existing combinatorial algorithms for linear Arrow-Debreu markets. In particular, we find instances where the ratio of the maximum to the minimum equilibrium price of a good is U^{Omega(n)} and the number of iterations required by the existing iterative combinatorial algorithms of Duan and Mehlhorn, and of Duan, Garg, and Mehlhorn, is high. Our instances also separate the two algorithms.
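
To make "market clearing prices" concrete, here is a toy check for a two-agent, two-good linear exchange market (a hypothetical instance, not the paper's algorithm): at the candidate prices each agent sells its endowment and spends its whole budget on its best bang-per-buck good, and total demand equals total supply.

def best_response_demand(prices, utility, budget):
    # Spend the entire budget on the good maximising u_j / p_j
    # (assumed unique in this toy instance).
    best = max(range(len(prices)), key=lambda j: utility[j] / prices[j])
    demand = [0.0] * len(prices)
    demand[best] = budget / prices[best]
    return demand

def market_clears(prices, endowments, utilities):
    supply = [sum(w[j] for w in endowments) for j in range(len(prices))]
    demand = [0.0] * len(prices)
    for w, u in zip(endowments, utilities):
        budget = sum(wj * pj for wj, pj in zip(w, prices))
        for j, dj in enumerate(best_response_demand(prices, u, budget)):
            demand[j] += dj
    return all(abs(d - s) < 1e-9 for d, s in zip(demand, supply))

# Agent 1 owns good 1 but prefers good 2; agent 2 owns good 2 but prefers good 1.
endowments = [[1.0, 0.0], [0.0, 1.0]]
utilities = [[1.0, 2.0], [2.0, 1.0]]
print(market_clears([1.0, 1.0], endowments, utilities))  # True: these prices clear the market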

Cite as

Bhaskar Ray Chaudhury and Kurt Mehlhorn. Combinatorial Algorithms for General Linear Arrow-Debreu Markets. In 38th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 122, pp. 26:1-26:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{chaudhury_et_al:LIPIcs.FSTTCS.2018.26,
  author =	{Chaudhury, Bhaskar Ray and Mehlhorn, Kurt},
  title =	{{Combinatorial Algorithms for General Linear Arrow-Debreu Markets}},
  booktitle =	{38th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2018)},
  pages =	{26:1--26:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-093-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{122},
  editor =	{Ganguly, Sumit and Pandya, Paritosh},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2018.26},
  URN =		{urn:nbn:de:0030-drops-99255},
  doi =		{10.4230/LIPIcs.FSTTCS.2018.26},
  annote =	{Keywords: Linear Exchange Markets, Equilibrium, Combinatorial Algorithms}
}
Document
Computing Equilibria in Markets with Budget-Additive Utilities

Authors: Xiaohui Bei, Jugal Garg, Martin Hoefer, and Kurt Mehlhorn

Published in: LIPIcs, Volume 57, 24th Annual European Symposium on Algorithms (ESA 2016)


Abstract
We present the first analysis of Fisher markets with buyers that have budget-additive utility functions. Budget-additive utilities are elementary concave functions with numerous applications in online adword markets and revenue optimization problems. They extend the standard case of linear utilities and have been studied in a variety of other market models. In contrast to the frequently studied CES utilities, they have a global satiation point, which can imply multiple market equilibria with quite different characteristics. Our main result is an efficient combinatorial algorithm to compute a market equilibrium with a Pareto-optimal allocation of goods. It relies on a new descending-price approach and, as a special case, also implies a novel combinatorial algorithm for computing a market equilibrium in linear Fisher markets. We complement this positive result with a number of hardness results for related computational questions. We prove that it is NP-hard to compute a market equilibrium that maximizes social welfare, and that it is PPAD-hard to find any market equilibrium for utility functions with separate satiation points for each buyer and each good.
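
The utility class itself is simple: the additive value of the bundle, truncated at a per-buyer satiation cap. A short illustration with invented numbers:

def budget_additive_utility(item_values, cap):
    # Additive value of the bundle, capped at the buyer's satiation point.
    return min(cap, sum(item_values))

print(budget_additive_utility([3, 5], cap=10))     # 8: below the cap, behaves linearly
print(budget_additive_utility([3, 5, 6], cap=10))  # 10: satiation reached, extra value is lost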

Cite as

Xiaohui Bei, Jugal Garg, Martin Hoefer, and Kurt Mehlhorn. Computing Equilibria in Markets with Budget-Additive Utilities. In 24th Annual European Symposium on Algorithms (ESA 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 57, pp. 8:1-8:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{bei_et_al:LIPIcs.ESA.2016.8,
  author =	{Bei, Xiaohui and Garg, Jugal and Hoefer, Martin and Mehlhorn, Kurt},
  title =	{{Computing Equilibria in Markets with Budget-Additive Utilities}},
  booktitle =	{24th Annual European Symposium on Algorithms (ESA 2016)},
  pages =	{8:1--8:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-015-6},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{57},
  editor =	{Sankowski, Piotr and Zaroliagis, Christos},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2016.8},
  URN =		{urn:nbn:de:0030-drops-63504},
  doi =		{10.4230/LIPIcs.ESA.2016.8},
  annote =	{Keywords: Budget-Additive Utility, Market Equilibrium, Equilibrium Computation}
}
Document
A Note On Spectral Clustering

Authors: Pavel Kolev and Kurt Mehlhorn

Published in: LIPIcs, Volume 57, 24th Annual European Symposium on Algorithms (ESA 2016)


Abstract
Spectral clustering is a popular and successful approach for partitioning the nodes of a graph into clusters for which the ratio of outside connections compared to the volume (sum of degrees) is small. In order to partition into k clusters, one first computes an approximation of the bottom k eigenvectors of the (normalized) Laplacian of G, uses it to embed the vertices of G into k-dimensional Euclidean space R^k, and then partitions the resulting points via a k-means clustering algorithm. It is an important task for theory to explain the success of spectral clustering. Peng et al. (COLT, 2015) made an important step in this direction. They showed that spectral clustering provably works if the gap between the (k+1)-th and the k-th eigenvalue of the normalized Laplacian is sufficiently large. They proved a structural and an algorithmic result. The algorithmic result needs a considerably stronger gap assumption and does not analyze the standard spectral clustering paradigm; it replaces spectral embedding by heat kernel embedding and k-means clustering by locality sensitive hashing. We extend their work in two directions. Structurally, we improve the quality guarantee for spectral clustering by a factor of k and simultaneously weaken the gap assumption. Algorithmically, we show that the standard paradigm for spectral clustering works. Moreover, it even works with the same gap assumption as required for the structural result.
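
The "standard paradigm" analysed here is the usual three-step pipeline: embed the vertices via the bottom k eigenvectors of the normalized Laplacian, then cluster the embedded points with k-means. A minimal sketch using numpy, scipy and scikit-learn (illustrative only; not the paper's code or its exact normalisation):

import numpy as np
from scipy.linalg import eigh
from sklearn.cluster import KMeans

def spectral_clustering(A, k):
    # Normalized Laplacian L = I - D^{-1/2} A D^{-1/2}.
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    L = np.eye(A.shape[0]) - D_inv_sqrt @ A @ D_inv_sqrt
    _, vecs = eigh(L)                  # eigenvectors, eigenvalues in ascending order
    embedding = vecs[:, :k]            # bottom-k eigenvector embedding of the vertices
    return KMeans(n_clusters=k, n_init=10).fit_predict(embedding)

# Two triangles joined by a single edge; k = 2 recovers them as the two clusters.
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
print(spectral_clustering(A, 2))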

Cite as

Pavel Kolev and Kurt Mehlhorn. A Note On Spectral Clustering. In 24th Annual European Symposium on Algorithms (ESA 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 57, pp. 57:1-57:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{kolev_et_al:LIPIcs.ESA.2016.57,
  author =	{Kolev, Pavel and Mehlhorn, Kurt},
  title =	{{A Note On Spectral Clustering}},
  booktitle =	{24th Annual European Symposium on Algorithms (ESA 2016)},
  pages =	{57:1--57:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-015-6},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{57},
  editor =	{Sankowski, Piotr and Zaroliagis, Christos},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2016.57},
  URN =		{urn:nbn:de:0030-drops-63994},
  doi =		{10.4230/LIPIcs.ESA.2016.57},
  annote =	{Keywords: spectral embedding, k-means clustering, power method, gap assumption}
}
Document
Equilibrium Computation (Dagstuhl Seminar 14342)

Authors: Nimrod Megiddo, Kurt Mehlhorn, Rahul Savani, and Vijay V. Vazirani

Published in: Dagstuhl Reports, Volume 4, Issue 8 (2015)


Abstract
This report documents the program and outcomes of Dagstuhl Seminar 14342 "Equilibrium Computation". The seminar was at the leading edge of current topics related to equilibrium computation for games and markets. We summarize these topics, give the talk abstracts, and give brief summaries of the problems that were discussed in the open problem sessions.

Cite as

Nimrod Megiddo, Kurt Mehlhorn, Rahul Savani, and Vijay V. Vazirani. Equilibrium Computation (Dagstuhl Seminar 14342). In Dagstuhl Reports, Volume 4, Issue 8, pp. 73-88, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2014)


BibTeX

@Article{megiddo_et_al:DagRep.4.8.73,
  author =	{Megiddo, Nimrod and Mehlhorn, Kurt and Savani, Rahul and Vazirani, Vijay V.},
  title =	{{Equilibrium Computation (Dagstuhl Seminar 14342)}},
  pages =	{73--88},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2014},
  volume =	{4},
  number =	{8},
  editor =	{Megiddo, Nimrod and Mehlhorn, Kurt and Savani, Rahul and Vazirani, Vijay V.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.4.8.73},
  URN =		{urn:nbn:de:0030-drops-47990},
  doi =		{10.4230/DagRep.4.8.73},
  annote =	{Keywords: Algorithms, Computational Complexity, Equilibrium Computation, Game Theory, Market Equilibrium, Nash Equilibrium}
}
Document
Fair Matchings and Related Problems

Authors: Chien-Chung Huang, Telikepalli Kavitha, Kurt Mehlhorn, and Dimitrios Michail

Published in: LIPIcs, Volume 24, IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2013)


Abstract
Let G = (A union B, E) be a bipartite graph, where every vertex ranks its neighbors in an order of preference (with ties allowed) and let r be the worst rank used. A matching M is fair in G if it has maximum cardinality, subject to this, M matches the minimum number of vertices to rank r neighbors, subject to that, M matches the minimum number of vertices to rank (r-1) neighbors, and so on. We show an efficient combinatorial algorithm based on LP duality to compute a fair matching in G. We also show a scaling based algorithm for the fair b-matching problem. Our two algorithms can be extended to solve other profile-based matching problems. In designing our combinatorial algorithm, we show how to solve a generalized version of the minimum weighted vertex cover problem in bipartite graphs, using a single-source shortest paths computation---this can be of independent interest.
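
The fairness criterion is profile-based and lexicographic, which a small helper makes explicit (a sketch of the definition only, not of the paper's LP-duality algorithm): prefer larger cardinality, then fewer vertices matched to rank-r partners, then fewer matched to rank r-1, and so on.

def fairness_key(partner_ranks, r):
    # partner_ranks: for each matched vertex, the rank it assigns to its partner.
    # Larger keys are better: cardinality first, then, from the worst rank r
    # downwards, the (negated) number of vertices matched at that rank.
    counts = [0] * (r + 1)
    for rk in partner_ranks:
        counts[rk] += 1
    return (len(partner_ranks),) + tuple(-counts[i] for i in range(r, 0, -1))

# Two matchings of the same size with worst rank r = 3: the second matches
# nobody at rank 3 and is therefore the fairer one.
M1 = [1, 1, 3, 2]
M2 = [2, 2, 1, 2]
print(max([M1, M2], key=lambda m: fairness_key(m, 3)))  # [2, 2, 1, 2]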

Cite as

Chien-Chung Huang, Telikepalli Kavitha, Kurt Mehlhorn, and Dimitrios Michail. Fair Matchings and Related Problems. In IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 24, pp. 339-350, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{huang_et_al:LIPIcs.FSTTCS.2013.339,
  author =	{Huang, Chien-Chung and Kavitha, Telikepalli and Mehlhorn, Kurt and Michail, Dimitrios},
  title =	{{Fair Matchings and Related Problems}},
  booktitle =	{IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2013)},
  pages =	{339--350},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-64-4},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{24},
  editor =	{Seth, Anil and Vishnoi, Nisheeth K.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2013.339},
  URN =		{urn:nbn:de:0030-drops-43841},
  doi =		{10.4230/LIPIcs.FSTTCS.2013.339},
  annote =	{Keywords: Matching with Preferences, Fairness and Rank-Maximality, Bipartite Vertex Cover, Linear Programming Duality, Complementary Slackness}
}
Document
Computational Geometry (Dagstuhl Seminar 13101)

Authors: Otfried Cheong, Kurt Mehlhorn, and Monique Teillaud

Published in: Dagstuhl Reports, Volume 3, Issue 3 (2013)


Abstract
This report documents the program and the outcomes of Dagstuhl Seminar 13101 "Computational Geometry". The seminar was held from 3rd to 8th March 2013 and 47 senior and young researchers from various countries and continents attended it. Recent developments in the field were presented and new challenges in computational geometry were identified. This report collects abstracts of the talks and a list of open problems.

Cite as

Otfried Cheong, Kurt Mehlhorn, and Monique Teillaud. Computational Geometry (Dagstuhl Seminar 13101). In Dagstuhl Reports, Volume 3, Issue 3, pp. 1-23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@Article{cheong_et_al:DagRep.3.3.1,
  author =	{Cheong, Otfried and Mehlhorn, Kurt and Teillaud, Monique},
  title =	{{Computational Geometry (Dagstuhl Seminar 13101)}},
  pages =	{1--23},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2013},
  volume =	{3},
  number =	{3},
  editor =	{Cheong, Otfried and Mehlhorn, Kurt and Teillaud, Monique},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.3.3.1},
  URN =		{urn:nbn:de:0030-drops-40210},
  doi =		{10.4230/DagRep.3.3.1},
  annote =	{Keywords: Algorithms, geometry, theory, approximation, implementation, combinatorics, topology}
}
Document
Publication Culture in Computing Research (Dagstuhl Perspectives Workshop 12452)

Authors: Kurt Mehlhorn, Moshe Y. Vardi, and Marc Herbstritt

Published in: Dagstuhl Reports, Volume 2, Issue 11 (2013)


Abstract
The dissemination of research results is an integral part of research and hence a crucial component for any scientific discipline. In the area of computing research, concerns have recently been raised about its publication culture, most notably regarding the high priority of conferences (compared to journals in other disciplines) and -- from an economic viewpoint -- the costs of preparing and accessing research results. The Dagstuhl Perspectives Workshop 12452 ``Publication Culture in Computing Research'' aimed at discussing the main problems with a selected group of researchers and practitioners. The goal was to identify and classify the current problems and to suggest potential remedies. The group of participants was selected so that a wide spectrum of opinions would be represented, which led to intensive discussions. The workshop is seen as an important step in the ongoing discussion. As a main result, the root causes of the problems were identified and potential solutions were discussed. The insights will be part of an upcoming manifesto on Publication Culture in Computing Research.

Cite as

Kurt Mehlhorn, Moshe Y. Vardi, and Marc Herbstritt. Publication Culture in Computing Research (Dagstuhl Perspectives Workshop 12452). In Dagstuhl Reports, Volume 2, Issue 11, pp. 20-44, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@Article{mehlhorn_et_al:DagRep.2.11.20,
  author =	{Mehlhorn, Kurt and Vardi, Moshe Y. and Herbstritt, Marc},
  title =	{{Publication Culture in Computing Research (Dagstuhl Perspectives Workshop 12452)}},
  pages =	{20--44},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2013},
  volume =	{2},
  number =	{11},
  editor =	{Mehlhorn, Kurt and Vardi, Moshe Y. and Herbstritt, Marc},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.2.11.20},
  URN =		{urn:nbn:de:0030-drops-39778},
  doi =		{10.4230/DagRep.2.11.20},
  annote =	{Keywords: scholarly publishing, conference, journal, peer review, open archive, open access, indexing, research evaluation}
}
Document
Invited Talk
Physarum Computations (Invited Talk)

Authors: Kurt Mehlhorn

Published in: LIPIcs, Volume 20, 30th International Symposium on Theoretical Aspects of Computer Science (STACS 2013)


Abstract
Physarum is a slime mold. It was observed over the past 10 years that the mold is able to solve shortest path problems and to construct good Steiner networks [9, 11, 8]. In a nutshell, the shortest path experiment is as follows: a maze is covered with mold, food is then provided at two positions s and t, and the evolution of the slime is observed. Over time, the slime retracts to the shortest s-t-path. A video showing the wet-lab experiment can be found at http://www.youtube.com/watch?v=tLO2n3YMcXw&t=4m43s. We strongly recommend watching this video. A mathematical model of the slime's dynamic behavior was proposed in 2007 [10]. Extensive computer simulations of the mathematical model confirm the wet-lab findings. For the edges on the shortest path, the diameter converges to one, and for the edges off the shortest path, the diameter converges to zero. We review the wet-lab and the computer experiments and provide a proof of these experimental findings. The proof was developed over a sequence of papers [6, 7, 4, 2, 1, 3]. We recommend the last two papers for first reading. An interesting connection between Physarum and ant computations is made in [5].
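
The mathematical model referred to above drives each edge diameter x_e by the current through it: dx_e/dt = |q_e| - x_e, where q is the unit s-t electrical flow with conductance x_e/c_e on an edge of length c_e. A small numpy sketch of an Euler discretisation (edge lengths, step size and iteration count are illustrative):

import numpy as np

def physarum_step(num_nodes, edges, x, s, t, h=0.1):
    # edges: list of (u, v, length c_e); x: current edge diameters.
    L = np.zeros((num_nodes, num_nodes))
    for (u, v, c), xe in zip(edges, x):
        g = xe / c                                 # conductance of the edge
        L[u, u] += g; L[v, v] += g
        L[u, v] -= g; L[v, u] -= g
    b = np.zeros(num_nodes); b[s], b[t] = 1.0, -1.0
    p = np.linalg.lstsq(L, b, rcond=None)[0]       # node potentials (Laplacian system)
    q = [(p[u] - p[v]) * xe / c for (u, v, c), xe in zip(edges, x)]
    return x + h * (np.abs(q) - x)                 # Euler step of dx_e/dt = |q_e| - x_e

# Two parallel s-t edges of lengths 1 and 2: only the shorter one survives.
edges = [(0, 1, 1.0), (0, 1, 2.0)]
x = np.ones(2)
for _ in range(500):
    x = physarum_step(2, edges, x, s=0, t=1)
print(x)  # approximately [1, 0]: the diameters converge to the shortest path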

Cite as

Kurt Mehlhorn. Physarum Computations (Invited Talk). In 30th International Symposium on Theoretical Aspects of Computer Science (STACS 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 20, pp. 5-6, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{mehlhorn:LIPIcs.STACS.2013.5,
  author =	{Mehlhorn, Kurt},
  title =	{{Physarum Computations}},
  booktitle =	{30th International Symposium on Theoretical Aspects of Computer Science (STACS 2013)},
  pages =	{5--6},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-50-7},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{20},
  editor =	{Portier, Natacha and Wilke, Thomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2013.5},
  URN =		{urn:nbn:de:0030-drops-39166},
  doi =		{10.4230/LIPIcs.STACS.2013.5},
  annote =	{Keywords: Biological computation, shortest path problems}
}
Document
Computational Geometry (Dagstuhl Seminar 11111)

Authors: Pankaj Kumar Agarwal, Kurt Mehlhorn, and Monique Teillaud

Published in: Dagstuhl Reports, Volume 1, Issue 3 (2011)


Abstract
This report documents the outcomes of Dagstuhl Seminar 11111 ``Computational Geometry''. The Seminar gathered fifty-three senior and younger researchers from various countries in the unique atmosphere offered by Schloss Dagstuhl. Abstracts of talks are collected in this report as well as a list of open problems.

Cite as

Pankaj Kumar Agarwal, Kurt Mehlhorn, and Monique Teillaud. Computational Geometry (Dagstuhl Seminar 11111). In Dagstuhl Reports, Volume 1, Issue 3, pp. 19-41, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2011)


BibTeX

@Article{agarwal_et_al:DagRep.1.3.19,
  author =	{Agarwal, Pankaj Kumar and Mehlhorn, Kurt and Teillaud, Monique},
  title =	{{Computational Geometry (Dagstuhl Seminar 11111)}},
  pages =	{19--41},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2011},
  volume =	{1},
  number =	{3},
  editor =	{Agarwal, Pankaj Kumar and Mehlhorn, Kurt and Teillaud, Monique},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.1.3.19},
  URN =		{urn:nbn:de:0030-drops-31997},
  doi =		{10.4230/DagRep.1.3.19},
  annote =	{Keywords: Algorithms, geometry, combinatorics, topology, theory, applications, implementation}
}
Document
A Descartes Algorithm for Polynomials with Bit-Stream Coefficients

Authors: Kurt Mehlhorn, Arno Eigenwillig, Lutz Kettner, Werner Krandick, Susanne Schmitt, and Nicola Wolpert

Published in: Dagstuhl Seminar Proceedings, Volume 6021, Reliable Implementation of Real Number Algorithms: Theory and Practice (2006)


Abstract
The Descartes method is an algorithm for isolating the real roots of square-free polynomials with real coefficients. We assume that coefficients are given as (potentially infinite) bit-streams. In other words, coefficients can be approximated to any desired accuracy, but are not known exactly. We show that a variant of the Descartes algorithm can cope with bit-stream coefficients. To isolate the real roots of a square-free real polynomial $q(x) = q_n x^n + \ldots + q_0$ with root separation $\rho$, coefficients $|q_n| \ge 1$ and $|q_i| \le 2^\tau$, it needs coefficient approximations to $O(n(\log(1/\rho) + \tau))$ bits after the binary point and has an expected cost of $O(n^4 (\log(1/\rho) + \tau)^2)$ bit operations.
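
For orientation, the classical Descartes method that the bit-stream variant builds on can be sketched in a few lines: map the candidate interval onto (0, infinity), bound the number of roots there by the sign variations of the transformed coefficients, and bisect whenever that bound exceeds one. The sketch below assumes exactly known coefficients, so it is not the bit-stream algorithm of the paper.

def taylor_shift(p, a):
    # Coefficients (lowest degree first) of p(x + a), via repeated synthetic division.
    q = list(p)
    for i in range(len(q) - 1):
        for j in range(len(q) - 2, i - 1, -1):
            q[j] += a * q[j + 1]
    return q

def sign_variations(coeffs):
    signs = [c for c in coeffs if c != 0]
    return sum(1 for u, v in zip(signs, signs[1:]) if u * v < 0)

def descartes_isolate(p, a, b, out):
    # Isolate the real roots of the square-free polynomial p in the open interval (a, b).
    r = taylor_shift(p, a)                           # roots moved into (0, b - a)
    r = [c * (b - a) ** i for i, c in enumerate(r)]  # roots moved into (0, 1)
    t = taylor_shift(r[::-1], 1)                     # reverse + shift: roots in (0, inf)
    v = sign_variations(t)                           # Descartes' bound on those roots
    if v == 0:
        return
    if v == 1:
        out.append((a, b))                           # exactly one root in (a, b)
        return
    m = (a + b) / 2
    if sum(c * m ** i for i, c in enumerate(p)) == 0:
        out.append((m, m))                           # root hit exactly at the midpoint
    descartes_isolate(p, a, m, out)
    descartes_isolate(p, m, b, out)

# Isolate the roots of x^2 - 4x + 3 = (x - 1)(x - 3) in (0, 4).
intervals = []
descartes_isolate([3, -4, 1], 0, 4, intervals)
print(intervals)  # [(0, 2.0), (2.0, 4)]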

Cite as

Kurt Mehlhorn, Arno Eigenwillig, Lutz Kettner, Werner Krandick, Susanne Schmitt, and Nicola Wolpert. A Descartes Algorithm for Polynomials with Bit-Stream Coefficients. In Reliable Implementation of Real Number Algorithms: Theory and Practice. Dagstuhl Seminar Proceedings, Volume 6021, pp. 1-12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


BibTeX

@InProceedings{mehlhorn_et_al:DagSemProc.06021.3,
  author =	{Mehlhorn, Kurt and Eigenwillig, Arno and Kettner, Lutz and Krandick, Werner and Schmitt, Susanne and Wolpert, Nicola},
  title =	{{A Descartes Algorithm for Polynomials with Bit-Stream Coefficients}},
  booktitle =	{Reliable Implementation of Real Number Algorithms: Theory and Practice},
  pages =	{1--12},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{6021},
  editor =	{Hertling, Peter and Hoffmann, Christoph M. and Luther, Wolfram and Revol, Nathalie},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.06021.3},
  URN =		{urn:nbn:de:0030-drops-7157},
  doi =		{10.4230/DagSemProc.06021.3},
  annote =	{Keywords: Root Isolation, Interval Arithmetic, Descartes Algorithm}
}
  • Refine by Author
  • 17 Mehlhorn, Kurt
  • 2 Abdulaziz, Mohammad
  • 2 Arge, Lars
  • 2 Chaudhury, Bhaskar Ray
  • 2 Garg, Jugal
  • Show More...

  • Refine by Classification
  • 2 Theory of computation → Design and analysis of algorithms
  • 1 Mathematics of computing
  • 1 Software and its engineering → Formal software verification
  • 1 Software and its engineering → Software libraries and repositories
  • 1 Theory of computation
  • Show More...

  • Refine by Keyword
  • 3 Algorithms
  • 2 Equilibrium Computation
  • 2 Market Equilibrium
  • 2 combinatorics
  • 2 geometry
  • Show More...

  • Refine by Type
  • 21 document

  • Refine by Publication Year
  • 4 2013
  • 3 2005
  • 3 2018
  • 2 2016
  • 2 2019
  • Show More...
