13 Search Results for "Schmidt, Thomas"


Document
Dynamic Constant Time Parallel Graph Algorithms with Sub-Linear Work

Authors: Jonas Schmidt and Thomas Schwentick

Published in: LIPIcs, Volume 272, 48th International Symposium on Mathematical Foundations of Computer Science (MFCS 2023)


Abstract
The paper proposes dynamic parallel algorithms for connectivity and bipartiteness of undirected graphs that require constant time and 𝒪(n^{1/2+ε}) work on the CRCW PRAM model. The work of these algorithms almost matches the work of the 𝒪(log n) time algorithm for connectivity by Kopelowitz et al. (2018) on the EREW PRAM model and the time of the sequential algorithm for bipartiteness by Eppstein et al. (1997). In particular, we show that the sparsification technique, which has been used in both mentioned papers, can in principle also be used for constant time algorithms in the CRCW PRAM model, despite the logarithmic depth of sparsification trees.
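
For intuition, a minimal sequential sketch of the sparsification idea referenced above, assuming a recompute-based certificate: edge groups sit at the leaves of a balanced binary tree, each inner node keeps a spanning forest of its children's certificates, and an update recomputes one leaf-to-root path. All names are invented for illustration; the paper's actual contribution, constant-time updates with 𝒪(n^{1/2+ε}) work on a CRCW PRAM despite the tree's logarithmic depth, is not reproduced here.

class DSU:
    """Union-find over vertices 0..n-1."""
    def __init__(self, n):
        self.parent = list(range(n))
    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x
    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False
        self.parent[ra] = rb
        return True

def certificate(edges, n):
    """A spanning forest: a sparse certificate for the edge set's connectivity."""
    dsu, forest = DSU(n), []
    for u, v in edges:
        if dsu.union(u, v):
            forest.append((u, v))
    return forest

class SparsificationTree:
    def __init__(self, n, edge_groups):
        self.n = n
        self.leaves = 1
        while self.leaves < len(edge_groups):
            self.leaves *= 2
        self.groups = list(edge_groups) + [[]] * (self.leaves - len(edge_groups))
        self.cert = [[] for _ in range(2 * self.leaves)]
        for i in range(self.leaves):
            self.cert[self.leaves + i] = certificate(self.groups[i], n)
        for v in range(self.leaves - 1, 0, -1):
            self.cert[v] = certificate(self.cert[2 * v] + self.cert[2 * v + 1], n)

    def replace_group(self, i, edges):
        """Update one edge group; only certificates on its root path change."""
        self.groups[i] = edges
        v = self.leaves + i
        self.cert[v] = certificate(edges, self.n)
        v //= 2
        while v:
            self.cert[v] = certificate(self.cert[2 * v] + self.cert[2 * v + 1], self.n)
            v //= 2

    def connected(self, u, v):
        """Answer connectivity from the root certificate alone."""
        dsu = DSU(self.n)
        for a, b in self.cert[1]:
            dsu.union(a, b)
        return dsu.find(u) == dsu.find(v)

t = SparsificationTree(4, [[(0, 1)], [(1, 2)], [(2, 3)], []])
print(t.connected(0, 3))      # True
t.replace_group(1, [])        # delete edge (1, 2)
print(t.connected(0, 3))      # False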

Cite as

Jonas Schmidt and Thomas Schwentick. Dynamic Constant Time Parallel Graph Algorithms with Sub-Linear Work. In 48th International Symposium on Mathematical Foundations of Computer Science (MFCS 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 272, pp. 80:1-80:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{schmidt_et_al:LIPIcs.MFCS.2023.80,
  author =	{Schmidt, Jonas and Schwentick, Thomas},
  title =	{{Dynamic Constant Time Parallel Graph Algorithms with Sub-Linear Work}},
  booktitle =	{48th International Symposium on Mathematical Foundations of Computer Science (MFCS 2023)},
  pages =	{80:1--80:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-292-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{272},
  editor =	{Leroux, J\'{e}r\^{o}me and Lombardy, Sylvain and Peleg, David},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2023.80},
  URN =		{urn:nbn:de:0030-drops-186140},
  doi =		{10.4230/LIPIcs.MFCS.2023.80},
  annote =	{Keywords: Dynamic parallel algorithms, Undirected connectivity, Bipartiteness}
}
Document
On the Work of Dynamic Constant-Time Parallel Algorithms for Regular Tree Languages and Context-Free Languages

Authors: Jonas Schmidt, Thomas Schwentick, and Jennifer Todtenhoefer

Published in: LIPIcs, Volume 272, 48th International Symposium on Mathematical Foundations of Computer Science (MFCS 2023)


Abstract
Previous work on Dynamic Complexity has established that there exist dynamic constant-time parallel algorithms for regular tree languages and context-free languages under label or symbol changes. However, these algorithms were not developed with the goal of minimising work (or, equivalently, the number of processors). In fact, their inspection yields the work bounds 𝒪(n²) and 𝒪(n⁷) per change operation, respectively. In this paper, dynamic algorithms for regular tree languages are proposed that generalise the previous algorithms in that they allow unbounded node rank and leaf insertions, while improving the work bound from 𝒪(n²) to 𝒪(n^ε), for arbitrary ε > 0. For context-free languages, algorithms with better work bounds (compared with 𝒪(n⁷)) for restricted classes are proposed: for every ε > 0 there are such algorithms for deterministic context-free languages with work bound 𝒪(n^{3+ε}) and for visibly pushdown languages with work bound 𝒪(n^{2+ε}).

Cite as

Jonas Schmidt, Thomas Schwentick, and Jennifer Todtenhoefer. On the Work of Dynamic Constant-Time Parallel Algorithms for Regular Tree Languages and Context-Free Languages. In 48th International Symposium on Mathematical Foundations of Computer Science (MFCS 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 272, pp. 81:1-81:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{schmidt_et_al:LIPIcs.MFCS.2023.81,
  author =	{Schmidt, Jonas and Schwentick, Thomas and Todtenhoefer, Jennifer},
  title =	{{On the Work of Dynamic Constant-Time Parallel Algorithms for Regular Tree Languages and Context-Free Languages}},
  booktitle =	{48th International Symposium on Mathematical Foundations of Computer Science (MFCS 2023)},
  pages =	{81:1--81:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-292-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{272},
  editor =	{Leroux, J\'{e}r\^{o}me and Lombardy, Sylvain and Peleg, David},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2023.81},
  URN =		{urn:nbn:de:0030-drops-186152},
  doi =		{10.4230/LIPIcs.MFCS.2023.81},
  annote =	{Keywords: Dynamic complexity, work, parallel constant time}
}
Document
Work-Efficient Query Evaluation with PRAMs

Authors: Jens Keppeler, Thomas Schwentick, and Christopher Spinrath

Published in: LIPIcs, Volume 255, 26th International Conference on Database Theory (ICDT 2023)


Abstract
The paper studies query evaluation in parallel constant time in the PRAM model. While it is well-known that all relational algebra queries can be evaluated in constant time on an appropriate CRCW-PRAM, this paper is interested in the efficiency of evaluation algorithms, that is, in the number of processors or, asymptotically equivalently, in the work. Naive evaluation in the parallel setting results in huge (polynomial) bounds on the work of such algorithms and in presentations of the result sets that can be extremely scattered in memory. The paper first discusses some obstacles for constant-time PRAM query evaluation. It presents algorithms for relational operators that are considerably more efficient than the naive approaches. Further, it explores three settings in which efficient sequential query evaluation algorithms exist: acyclic queries, semi-join algebra queries, and join queries, the latter in the worst-case optimal framework. Under natural assumptions on the representation of the database, the work of the given algorithms matches the best sequential algorithms in the case of semi-join queries, and it comes close in the other two settings. An important tool is the compaction technique from Hagerup (1992).
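
As a point of reference for the semi-join case, where the stated work bound matches the best sequential algorithms, here is a minimal sketch of that sequential baseline, a hash-based semi-join. Relation contents and attribute names are invented; the paper's constant-time PRAM algorithm and Hagerup's compaction technique are not shown.

def semijoin(R, S, attrs):
    """R semi-join S on the shared attributes `attrs` (tuple of column names):
    all R-tuples that have a join partner in S, in O(|R| + |S|) time."""
    s_keys = {tuple(t[a] for a in attrs) for t in S}
    return [t for t in R if tuple(t[a] for a in attrs) in s_keys]

# Example with invented data:
R = [{"emp": "a", "dept": 1}, {"emp": "b", "dept": 2}]
S = [{"dept": 1, "city": "Dortmund"}]
print(semijoin(R, S, ("dept",)))   # -> [{'emp': 'a', 'dept': 1}]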

Cite as

Jens Keppeler, Thomas Schwentick, and Christopher Spinrath. Work-Efficient Query Evaluation with PRAMs. In 26th International Conference on Database Theory (ICDT 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 255, pp. 16:1-16:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{keppeler_et_al:LIPIcs.ICDT.2023.16,
  author =	{Keppeler, Jens and Schwentick, Thomas and Spinrath, Christopher},
  title =	{{Work-Efficient Query Evaluation with PRAMs}},
  booktitle =	{26th International Conference on Database Theory (ICDT 2023)},
  pages =	{16:1--16:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-270-9},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{255},
  editor =	{Geerts, Floris and Vandevoort, Brecht},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICDT.2023.16},
  URN =		{urn:nbn:de:0030-drops-177589},
  doi =		{10.4230/LIPIcs.ICDT.2023.16},
  annote =	{Keywords: PRAM, query evaluation, work-efficient, parallel, acyclic queries, free-connex queries}
}
Document
Dynamic Complexity of Regular Languages: Big Changes, Small Work

Authors: Felix Tschirbs, Nils Vortmeier, and Thomas Zeume

Published in: LIPIcs, Volume 252, 31st EACSL Annual Conference on Computer Science Logic (CSL 2023)


Abstract
Whether a changing string is a member of a certain regular language can be maintained in the DynFO framework of Patnaik and Immerman: after changing the symbol at one position of the string, a first-order update formula can express, using additionally stored information, whether the resulting string is in the regular language. We extend this and further known results by considering changes of many positions at once. We also investigate to which degree the obtained update formulas imply work-efficient parallel dynamic algorithms.
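
A hedged sketch of the folklore data structure behind such maintenance results: store, at every segment-tree node, the DFA's transition function on the corresponding substring; these functions compose associatively, so a single symbol change recombines O(log n) nodes. The paper's actual setting, first-order update formulas, batch changes, and parallel work bounds, is richer than this sequential sketch. The example DFA (even number of a's) is invented.

DFA = {  # transitions of an invented 2-state DFA over {a, b}
    ("even", "a"): "odd", ("odd", "a"): "even",
    ("even", "b"): "even", ("odd", "b"): "odd",
}
STATES = ("even", "odd")

def func_of(symbol):
    return {q: DFA[(q, symbol)] for q in STATES}

def compose(f, g):                  # apply f first, then g
    return {q: g[f[q]] for q in STATES}

class DynamicMembership:
    def __init__(self, word):
        self.size = 1
        while self.size < len(word):
            self.size *= 2
        ident = {q: q for q in STATES}
        self.t = [ident] * (2 * self.size)
        for i, c in enumerate(word):
            self.t[self.size + i] = func_of(c)
        for v in range(self.size - 1, 0, -1):
            self.t[v] = compose(self.t[2 * v], self.t[2 * v + 1])

    def change(self, i, symbol):    # O(log n) recombinations per symbol change
        v = self.size + i
        self.t[v] = func_of(symbol)
        v //= 2
        while v:
            self.t[v] = compose(self.t[2 * v], self.t[2 * v + 1])
            v //= 2

    def member(self, start="even", accepting=("even",)):
        return self.t[1][start] in accepting

w = DynamicMembership("abab")
print(w.member())        # True: "abab" has two a's
w.change(1, "a")         # word becomes "aaab"
print(w.member())        # False: three a's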

Cite as

Felix Tschirbs, Nils Vortmeier, and Thomas Zeume. Dynamic Complexity of Regular Languages: Big Changes, Small Work. In 31st EACSL Annual Conference on Computer Science Logic (CSL 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 252, pp. 35:1-35:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{tschirbs_et_al:LIPIcs.CSL.2023.35,
  author =	{Tschirbs, Felix and Vortmeier, Nils and Zeume, Thomas},
  title =	{{Dynamic Complexity of Regular Languages: Big Changes, Small Work}},
  booktitle =	{31st EACSL Annual Conference on Computer Science Logic (CSL 2023)},
  pages =	{35:1--35:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-264-8},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{252},
  editor =	{Klin, Bartek and Pimentel, Elaine},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CSL.2023.35},
  URN =		{urn:nbn:de:0030-drops-174963},
  doi =		{10.4230/LIPIcs.CSL.2023.35},
  annote =	{Keywords: dynamic descriptive complexity, regular languages, batch changes, work}
}
Document
Towards a Corpus of Historical German Plays with Emotion Annotations

Authors: Thomas Schmidt, Katrin Dennerlein, and Christian Wolff

Published in: OASIcs, Volume 93, 3rd Conference on Language, Data and Knowledge (LDK 2021)


Abstract
In this paper, we present first work-in-progress annotation results of a project investigating computational methods of emotion analysis for historical German plays around 1800. We report on the development of an annotation scheme focussing on the annotation of emotions that are important for this time span from a literary studies perspective, as well as on the annotation process we have developed. We annotate emotions expressed or attributed by characters of the plays in the written texts. The scheme consists of 13 hierarchically structured emotion concepts as well as the source (who experiences or attributes the emotion) and the target (who or what the emotion is directed towards). We have conducted the annotation of five example plays of our corpus with two annotators per play and report on annotation distributions and agreement statistics. We collected over 6,500 emotion annotations and identified fair agreement for most concepts, at κ-values around 0.4. We discuss how we plan to improve annotator consistency and continue our work. The results also have implications for similar projects in the context of Digital Humanities.
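
For reference, the agreement figure quoted above is a chance-corrected coefficient; below is a minimal sketch of Cohen's κ for two annotators, assuming that is the statistic used. The label sequences are invented, not taken from the corpus.

from collections import Counter

def cohens_kappa(ann1, ann2):
    assert len(ann1) == len(ann2)
    n = len(ann1)
    p_o = sum(a == b for a, b in zip(ann1, ann2)) / n       # observed agreement
    c1, c2 = Counter(ann1), Counter(ann2)
    p_e = sum((c1[l] / n) * (c2[l] / n) for l in set(c1) | set(c2))  # chance agreement
    return (p_o - p_e) / (1 - p_e)

a = ["joy", "fear", "joy", "anger", "joy", "fear"]
b = ["joy", "joy",  "joy", "anger", "fear", "fear"]
print(round(cohens_kappa(a, b), 2))   # 0.45: "fair" agreement on this toy data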

Cite as

Thomas Schmidt, Katrin Dennerlein, and Christian Wolff. Towards a Corpus of Historical German Plays with Emotion Annotations. In 3rd Conference on Language, Data and Knowledge (LDK 2021). Open Access Series in Informatics (OASIcs), Volume 93, pp. 9:1-9:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{schmidt_et_al:OASIcs.LDK.2021.9,
  author =	{Schmidt, Thomas and Dennerlein, Katrin and Wolff, Christian},
  title =	{{Towards a Corpus of Historical German Plays with Emotion Annotations}},
  booktitle =	{3rd Conference on Language, Data and Knowledge (LDK 2021)},
  pages =	{9:1--9:11},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-199-3},
  ISSN =	{2190-6807},
  year =	{2021},
  volume =	{93},
  editor =	{Gromann, Dagmar and S\'{e}rasset, Gilles and Declerck, Thierry and McCrae, John P. and Gracia, Jorge and Bosque-Gil, Julia and Bobillo, Fernando and Heinisch, Barbara},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.LDK.2021.9},
  URN =		{urn:nbn:de:0030-drops-145459},
  doi =		{10.4230/OASIcs.LDK.2021.9},
  annote =	{Keywords: Emotion, Annotation, Digital Humanities, Computational Literary Studies, German Drama, Sentiment Analysis, Emotion Analysis, Corpus}
}
Document
Robust Algorithms for the Secretary Problem

Authors: Domagoj Bradac, Anupam Gupta, Sahil Singla, and Goran Zuzic

Published in: LIPIcs, Volume 151, 11th Innovations in Theoretical Computer Science Conference (ITCS 2020)


Abstract
In classical secretary problems, a sequence of n elements arrives in a uniformly random order, and we want to choose a single item, or a set of size K. The random order model allows us to escape from the strong lower bounds for the adversarial order setting, and excellent algorithms are known in this setting. However, one worrying aspect of these results is that the algorithms overfit to the model: they are not very robust. Indeed, if a few "outlier" arrivals are adversarially placed in the arrival sequence, the algorithms perform poorly. For example, Dynkin’s popular 1/e-secretary algorithm is sensitive to even a single adversarial arrival: if the adversary gives one large bid at the beginning of the stream, the algorithm does not select any element at all. We investigate a robust version of the secretary problem. In the Byzantine Secretary model, we have two kinds of elements: green (good) and red (rogue). The values of all elements are chosen by the adversary. The green elements arrive at times uniformly randomly drawn from [0,1]. The red elements, however, arrive at adversarially chosen times. Naturally, the algorithm does not see these colors: how well can it solve secretary problems? We show that selecting the highest-value red set, or the single largest green element, is not possible with even a small fraction of red items. However, on the positive side, we show that these are the only bad cases, by giving algorithms which get value comparable to the value of the optimal green set minus the largest green item. (This benchmark reminds us of regret minimization and digital auctions, where we subtract an additive term depending on the "scale" of the problem.) Specifically, we give an algorithm to pick K elements, which gets within a (1-ε) factor of the above benchmark, as long as K ≥ poly(ε^{-1} log n). We extend this to the knapsack secretary problem, for large knapsack size K. For the single-item case, an analogous benchmark is the value of the second-largest green item. For value-maximization, we give a poly(log^* n)-competitive algorithm, using a multi-layered bucketing scheme that adaptively refines our estimates of the second-max over time. For probability-maximization, we show the existence of a good randomized algorithm, using the minimax principle. We hope that this work will spur further research on robust algorithms for the secretary problem, and for other problems in sequential decision-making, where the existing algorithms are not robust and often tend to overfit to the model.
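
A hedged sketch of the fragility described above: Dynkin's 1/e rule samples the first roughly n/e arrivals, then accepts the first later value beating the sample maximum. One adversarial huge bid placed at the start of the stream poisons the threshold, and the rule selects nothing. Stream length and values are invented for illustration.

import math, random

def dynkin(stream):
    """Observe the first ~n/e values, then take the first one beating their max."""
    n = len(stream)
    cutoff = max(1, round(n / math.e))
    threshold = max(stream[:cutoff])
    for v in stream[cutoff:]:
        if v > threshold:
            return v
    return None                        # rejected every arrival

random.seed(0)
green = [random.random() for _ in range(99)]
print(dynkin(green))            # succeeds unless the best value fell into the sample (prob. about 1/e)
print(dynkin([10**9] + green))  # None: a single adversarial bid at time 0 poisons the threshold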

Cite as

Domagoj Bradac, Anupam Gupta, Sahil Singla, and Goran Zuzic. Robust Algorithms for the Secretary Problem. In 11th Innovations in Theoretical Computer Science Conference (ITCS 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 151, pp. 32:1-32:26, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{bradac_et_al:LIPIcs.ITCS.2020.32,
  author =	{Bradac, Domagoj and Gupta, Anupam and Singla, Sahil and Zuzic, Goran},
  title =	{{Robust Algorithms for the Secretary Problem}},
  booktitle =	{11th Innovations in Theoretical Computer Science Conference (ITCS 2020)},
  pages =	{32:1--32:26},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-134-4},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{151},
  editor =	{Vidick, Thomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2020.32},
  URN =		{urn:nbn:de:0030-drops-117171},
  doi =		{10.4230/LIPIcs.ITCS.2020.32},
  annote =	{Keywords: stochastic optimization, robust optimization, secretary problem, matroid secretary, robust secretary}
}
Document
Algorithms and Adaptivity Gaps for Stochastic k-TSP

Authors: Haotian Jiang, Jian Li, Daogao Liu, and Sahil Singla

Published in: LIPIcs, Volume 151, 11th Innovations in Theoretical Computer Science Conference (ITCS 2020)


Abstract
Given a metric (V,d) and a root ∈ V, the classic k-TSP problem is to find a tour originating at the root of minimum length that visits at least k nodes in V. In this work, motivated by applications where the input to an optimization problem is uncertain, we study two stochastic versions of k-TSP. In Stoch-Reward k-TSP, originally defined by Ene-Nagarajan-Saket [Ene et al., 2018], each vertex v in the given metric (V,d) contains a stochastic reward R_v. The goal is to adaptively find a tour of minimum expected length that collects at least reward k; here "adaptively" means our next decision may depend on previous outcomes. Ene et al. give an O(log k)-approximation adaptive algorithm for this problem, and left open if there is an O(1)-approximation algorithm. We totally resolve their open question, and even give an O(1)-approximation non-adaptive algorithm for Stoch-Reward k-TSP. We also introduce and obtain similar results for the Stoch-Cost k-TSP problem. In this problem each vertex v has a stochastic cost C_v, and the goal is to visit and select at least k vertices to minimize the expected sum of tour length and cost of selected vertices. Besides being a natural stochastic generalization of k-TSP, this problem is also interesting because it generalizes the Price of Information framework [Singla, 2018] from deterministic probing costs to metric probing costs. Our techniques are based on two crucial ideas: "repetitions" and "critical scaling". In general, replacing a random variable with its expectation leads to very poor results. We show that for our problems, if we truncate the random variables at an ideal threshold, then their expected values form a good surrogate. Here, we rely on running several repetitions of our algorithm with the same threshold, and then argue concentration using Freedman’s and Jogdeo-Samuels' inequalities. Unfortunately, this ideal threshold depends on how far we are from achieving our target k, which a non-adaptive algorithm does not know. To overcome this barrier, we truncate the random variables at various different scales and identify a "critical" scale.
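
A hedged numeric illustration of the truncation idea described above: for a heavy-tailed reward, the raw expectation is dominated by rare huge outcomes and misrepresents typical runs, while the expectation truncated at a suitable threshold is a faithful surrogate. The distribution and threshold are invented; the paper's repetition and critical-scaling machinery is not shown.

import random

def sample_reward():
    # value 1 almost surely, value 10**6 with probability 10**-4
    return 10**6 if random.random() < 1e-4 else 1

random.seed(1)
xs = [sample_reward() for _ in range(100_000)]
tau = 10                                        # truncation threshold
print(sum(xs) / len(xs))                        # about 101: inflated by rare spikes
print(sum(min(x, tau) for x in xs) / len(xs))   # about 1: matches typical behaviour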

Cite as

Haotian Jiang, Jian Li, Daogao Liu, and Sahil Singla. Algorithms and Adaptivity Gaps for Stochastic k-TSP. In 11th Innovations in Theoretical Computer Science Conference (ITCS 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 151, pp. 45:1-45:25, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{jiang_et_al:LIPIcs.ITCS.2020.45,
  author =	{Jiang, Haotian and Li, Jian and Liu, Daogao and Singla, Sahil},
  title =	{{Algorithms and Adaptivity Gaps for Stochastic k-TSP}},
  booktitle =	{11th Innovations in Theoretical Computer Science Conference (ITCS 2020)},
  pages =	{45:1--45:25},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-134-4},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{151},
  editor =	{Vidick, Thomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2020.45},
  URN =		{urn:nbn:de:0030-drops-117308},
  doi =		{10.4230/LIPIcs.ITCS.2020.45},
  annote =	{Keywords: approximation algorithms, stochastic optimization, travelling salesman problem}
}
Document
Dynamic Complexity Meets Parameterised Algorithms

Authors: Jonas Schmidt, Thomas Schwentick, Nils Vortmeier, Thomas Zeume, and Ioannis Kokkinis

Published in: LIPIcs, Volume 152, 28th EACSL Annual Conference on Computer Science Logic (CSL 2020)


Abstract
Dynamic Complexity studies the maintainability of queries with logical formulas in a setting where the underlying structure or database changes over time. Most often, these formulas are from first-order logic, giving rise to the dynamic complexity class DynFO. This paper investigates extensions of DynFO in the spirit of parameterised algorithms. In this setting structures come with a parameter k and the extensions allow additional "space" of size f(k) (in the form of an additional structure of this size) or additional time f(k) (in the form of iterations of formulas) or both. The resulting classes are compared with their non-dynamic counterparts and other classes. The main part of the paper explores the applicability of methods for parameterised algorithms to this setting through case studies for various well-known parameterised problems.

Cite as

Jonas Schmidt, Thomas Schwentick, Nils Vortmeier, Thomas Zeume, and Ioannis Kokkinis. Dynamic Complexity Meets Parameterised Algorithms. In 28th EACSL Annual Conference on Computer Science Logic (CSL 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 152, pp. 36:1-36:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{schmidt_et_al:LIPIcs.CSL.2020.36,
  author =	{Schmidt, Jonas and Schwentick, Thomas and Vortmeier, Nils and Zeume, Thomas and Kokkinis, Ioannis},
  title =	{{Dynamic Complexity Meets Parameterised Algorithms}},
  booktitle =	{28th EACSL Annual Conference on Computer Science Logic (CSL 2020)},
  pages =	{36:1--36:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-132-0},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{152},
  editor =	{Fern\'{a}ndez, Maribel and Muscholl, Anca},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CSL.2020.36},
  URN =		{urn:nbn:de:0030-drops-116792},
  doi =		{10.4230/LIPIcs.CSL.2020.36},
  annote =	{Keywords: Dynamic complexity, parameterised complexity}
}
Document
Towards Realistic Pedestrian Route Planning

Authors: Simeon Andreev, Julian Dibbelt, Martin Nöllenburg, Thomas Pajor, and Dorothea Wagner

Published in: OASIcs, Volume 48, 15th Workshop on Algorithmic Approaches for Transportation Modelling, Optimization, and Systems (ATMOS 2015)


Abstract
Pedestrian routing has its specific set of challenges, which are often neglected by state-of-the-art route planners. For instance, the lack of detailed sidewalk data and the inability to traverse plazas and parks in a natural way often lead to unappealing and suboptimal routes. In this work, we first propose to augment the network by generating sidewalks based on the street geometry and by adding edges for routing over plazas and squares. Using this and further information, our query algorithm seamlessly handles node-to-node queries as well as queries whose origin or destination is an arbitrary location on a plaza or inside a park. Our experiments show that we are able to compute appealing pedestrian routes at negligible overhead over standard routing algorithms.
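
A hedged sketch of the augmentation step described above: generate two sidewalk polylines by offsetting a street's centerline to either side. Real sidewalk generation must also join segments at corners and handle intersections and crossings; coordinates and the offset width are invented.

import math

def offset_polyline(points, d):
    """Offset each segment of a polyline by distance d along its left normal."""
    out = []
    for (x1, y1), (x2, y2) in zip(points, points[1:]):
        L = math.hypot(x2 - x1, y2 - y1)
        nx, ny = -(y2 - y1) / L, (x2 - x1) / L     # unit left normal
        out.append(((x1 + d * nx, y1 + d * ny), (x2 + d * nx, y2 + d * ny)))
    return out

street = [(0.0, 0.0), (10.0, 0.0), (10.0, 8.0)]
left  = offset_polyline(street, 2.5)    # sidewalk on one side
right = offset_polyline(street, -2.5)   # sidewalk on the other
print(left[0])   # ((0.0, 2.5), (10.0, 2.5))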

Cite as

Simeon Andreev, Julian Dibbelt, Martin Nöllenburg, Thomas Pajor, and Dorothea Wagner. Towards Realistic Pedestrian Route Planning. In 15th Workshop on Algorithmic Approaches for Transportation Modelling, Optimization, and Systems (ATMOS 2015). Open Access Series in Informatics (OASIcs), Volume 48, pp. 1-15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


BibTeX

@InProceedings{andreev_et_al:OASIcs.ATMOS.2015.1,
  author =	{Andreev, Simeon and Dibbelt, Julian and N\"{o}llenburg, Martin and Pajor, Thomas and Wagner, Dorothea},
  title =	{{Towards Realistic Pedestrian Route Planning}},
  booktitle =	{15th Workshop on Algorithmic Approaches for Transportation Modelling, Optimization, and Systems (ATMOS 2015)},
  pages =	{1--15},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-939897-99-6},
  ISSN =	{2190-6807},
  year =	{2015},
  volume =	{48},
  editor =	{Italiano, Giuseppe F. and Schmidt, Marie},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.ATMOS.2015.1},
  URN =		{urn:nbn:de:0030-drops-54592},
  doi =		{10.4230/OASIcs.ATMOS.2015.1},
  annote =	{Keywords: pedestrian routing, realistic model, shortest paths, speed-up technique}
}
Document
Weighted Polynomial Approximations: Limits for Learning and Pseudorandomness

Authors: Mark Bun and Thomas Steinke

Published in: LIPIcs, Volume 40, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2015)


Abstract
Low-degree polynomial approximations to the sign function underlie pseudorandom generators for halfspaces, as well as algorithms for agnostically learning halfspaces. We study the limits of these constructions by proving inapproximability results for the sign function. First, we investigate the derandomization of Chernoff-type concentration inequalities. Schmidt et al. (SIAM J. Discrete Math. 1995) showed that a tail bound of delta can be established for sums of Bernoulli random variables with only O(log(1/delta))-wise independence. We show that their results are tight up to constant factors. Second, the "polynomial regression" algorithm of Kalai et al. (SIAM J. Comput. 2008) shows that halfspaces can be efficiently learned with respect to log-concave distributions on R^n in the challenging agnostic learning model. The power of this algorithm relies on the fact that under log-concave distributions, halfspaces can be approximated arbitrarily well by low-degree polynomials. In contrast, we exhibit a large class of non-log-concave distributions under which polynomials of any degree cannot approximate the sign function to within arbitrarily low error.

Cite as

Mark Bun and Thomas Steinke. Weighted Polynomial Approximations: Limits for Learning and Pseudorandomness. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 40, pp. 625-644, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


BibTeX

@InProceedings{bun_et_al:LIPIcs.APPROX-RANDOM.2015.625,
  author =	{Bun, Mark and Steinke, Thomas},
  title =	{{Weighted Polynomial Approximations: Limits for Learning and Pseudorandomness}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2015)},
  pages =	{625--644},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-89-7},
  ISSN =	{1868-8969},
  year =	{2015},
  volume =	{40},
  editor =	{Garg, Naveen and Jansen, Klaus and Rao, Anup and Rolim, Jos\'{e} D. P.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX-RANDOM.2015.625},
  URN =		{urn:nbn:de:0030-drops-53274},
  doi =		{10.4230/LIPIcs.APPROX-RANDOM.2015.625},
  annote =	{Keywords: Polynomial Approximations, Pseudorandomness, Concentration, Learning Theory, Halfspaces}
}
Document
The Complexity of Integrating Routing Decisions in Public Transportation Models

Authors: Marie Schmidt and Anita Schöbel

Published in: OASIcs, Volume 14, 10th Workshop on Algorithmic Approaches for Transportation Modelling, Optimization, and Systems (ATMOS'10) (2010)


Abstract
To model and solve optimization problems arising in public transportation, data about the passengers is necessary and has to be included in the models in any phase of the planning process. Many approaches assume a two-step procedure: in a first step, the data about the passengers is distributed over the public transportation network using traffic-assignment procedures. In a second step, the actual planning of lines, timetables, etc. takes place. This approach ignores that for most passengers there are many possible ways to reach their destinations in the public transportation network; thus, the actual connections the passengers will take depend strongly on the decisions made during the planning phase. In this paper we investigate the influence of integrating the traffic assignment procedure in the optimization process on the complexity of line planning and aperiodic timetabling. In both problems, our objective is to maximize the passengers' benefit, namely to minimize the overall travel time of the passengers in the network. We present new models, analyze NP-hardness results arising from the integration of the routing decisions in the traditional models, and derive polynomial algorithms for special cases.

Cite as

Marie Schmidt and Anita Schöbel. The Complexity of Integrating Routing Decisions in Public Transportation Models. In 10th Workshop on Algorithmic Approaches for Transportation Modelling, Optimization, and Systems (ATMOS'10). Open Access Series in Informatics (OASIcs), Volume 14, pp. 156-169, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{schmidt_et_al:OASIcs.ATMOS.2010.156,
  author =	{Schmidt, Marie and Sch\"{o}bel, Anita},
  title =	{{The Complexity of Integrating Routing Decisions in Public Transportation Models}},
  booktitle =	{10th Workshop on Algorithmic Approaches for Transportation Modelling, Optimization, and Systems (ATMOS'10)},
  pages =	{156--169},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-939897-20-0},
  ISSN =	{2190-6807},
  year =	{2010},
  volume =	{14},
  editor =	{Erlebach, Thomas and L\"{u}bbecke, Marco},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.ATMOS.2010.156},
  URN =		{urn:nbn:de:0030-drops-27575},
  doi =		{10.4230/OASIcs.ATMOS.2010.156},
  annote =	{Keywords: Line Planning, Timetabling, Routing}
}
Document
Construction Sequences and Certifying 3-Connectedness

Authors: Jens M. Schmidt

Published in: LIPIcs, Volume 5, 27th International Symposium on Theoretical Aspects of Computer Science (2010)


Abstract
Given two 3-connected graphs G and H, a construction sequence constructs G from H (e.g. from the K₄) with three basic operations, called the Barnette-Grünbaum operations. These operations are known to be able to construct all 3-connected graphs. We extend this result by identifying every intermediate graph in the construction sequence with a subdivision in G and showing, under some minor assumptions, that there is still a construction sequence to G when we start from an arbitrary prescribed H-subdivision. This leads to the first algorithm that computes a construction sequence in time 𝒪(|V(G)|²). As an application, we develop a certificate for the 3-connectedness of graphs that can be easily computed and verified. Based on this, a certifying test for 3-connectedness is designed.
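
For contrast, a hedged sketch of the naive 3-connectedness test that an easily verified certificate lets one avoid: a graph on more than 3 vertices is 3-connected iff removing any pair of vertices leaves it connected, which costs O(n² · (n + m)) by brute force. The example graph, K₄, is the base case of the construction sequences above.

from itertools import combinations

def connected(adj, removed):
    """BFS/DFS connectivity of the graph with the `removed` vertices deleted."""
    alive = [v for v in adj if v not in removed]
    if not alive:
        return True
    seen, stack = {alive[0]}, [alive[0]]
    while stack:
        v = stack.pop()
        for w in adj[v]:
            if w not in removed and w not in seen:
                seen.add(w)
                stack.append(w)
    return len(seen) == len(alive)

def is_three_connected(adj):
    if len(adj) <= 3:
        return False
    return all(connected(adj, {u, v}) for u, v in combinations(adj, 2))

K4 = {0: [1, 2, 3], 1: [0, 2, 3], 2: [0, 1, 3], 3: [0, 1, 2]}
print(is_three_connected(K4))   # True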

Cite as

Jens M. Schmidt. Construction Sequences and Certifying 3-Connectedness. In 27th International Symposium on Theoretical Aspects of Computer Science. Leibniz International Proceedings in Informatics (LIPIcs), Volume 5, pp. 633-644, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{schmidt:LIPIcs.STACS.2010.2491,
  author =	{Schmidt, Jens M.},
  title =	{{Construction Sequences and Certifying 3-Connectedness}},
  booktitle =	{27th International Symposium on Theoretical Aspects of Computer Science},
  pages =	{633--644},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-16-3},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{5},
  editor =	{Marion, Jean-Yves and Schwentick, Thomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2010.2491},
  URN =		{urn:nbn:de:0030-drops-24918},
  doi =		{10.4230/LIPIcs.STACS.2010.2491},
  annote =	{Keywords: Construction sequence, 3-connected graph, nested subdivisions, inductive characterization, 3-connectedness, certifying algorithm}
}
Document
Weighted L_2 B Discrepancy and Approximation of Integrals over Reproducing Kernel Hilbert Spaces

Authors: Michael Gnewuch

Published in: Dagstuhl Seminar Proceedings, Volume 9391, Algorithms and Complexity for Continuous Problems (2009)


Abstract
We extend the notion of $L_2$ $B$ discrepancy provided in [E. Novak, H. Woźniakowski, $L_2$ discrepancy and multivariate integration, in: Analytic number theory. Essays in honour of Klaus Roth. W. W. L. Chen, W. T. Gowers, H. Halberstam, W. M. Schmidt, and R. C. Vaughan (Eds.), Cambridge University Press, Cambridge, 2009, 359–388] to the weighted $L_2$ $\mathcal{B}$ discrepancy. This newly defined notion allows us to consider weights, but also volume measures different from the Lebesgue measure and classes of test sets different from measurable subsets of some Euclidean space. We relate the weighted $L_2$ $\mathcal{B}$ discrepancy to numerical integration defined over weighted reproducing kernel Hilbert spaces and settle in this way an open problem posed by Novak and Woźniakowski.
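
For orientation, the classical unweighted special case that this notion generalises is the $L_2$ star discrepancy of a point set $P = \{p_1, \ldots, p_n\} \subset [0,1]^d$, where the test sets are anchored boxes and the volume measure is the Lebesgue measure $\lambda$ (a standard definition, not quoted from the paper):

$D_2(P) = \Big( \int_{[0,1]^d} \Big( \lambda\big([0,x)\big) - \frac{1}{n} \sum_{i=1}^{n} \mathbf{1}_{[0,x)}(p_i) \Big)^2 \, dx \Big)^{1/2}$, where $[0,x) = [0,x_1) \times \cdots \times [0,x_d)$.

The weighted $L_2$ $\mathcal{B}$ discrepancy replaces the anchored boxes, the Lebesgue measure, and the uniform weighting by more general choices, as the abstract describes.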

Cite as

Michael Gnewuch. Weighted L_2 B Discrepancy and Approximation of Integrals over Reproducing Kernel Hilbert Spaces. In Algorithms and Complexity for Continuous Problems. Dagstuhl Seminar Proceedings, Volume 9391, pp. 1-9, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2009)


BibTeX

@InProceedings{gnewuch:DagSemProc.09391.5,
  author =	{Gnewuch, Michael},
  title =	{{Weighted L\_2 B Discrepancy and Approximation of Integrals over Reproducing Kernel Hilbert Spaces}},
  booktitle =	{Algorithms and Complexity for Continuous Problems},
  pages =	{1--9},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2009},
  volume =	{9391},
  editor =	{M\"{u}ller-Gronbach, Thomas and Plaskota, Leszek and Traub, Joseph F.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.09391.5},
  URN =		{urn:nbn:de:0030-drops-22966},
  doi =		{10.4230/DagSemProc.09391.5},
  annote =	{Keywords: Discrepancy, Numerical Integration, Quasi-Monte Carlo, Reproducing Kernel Hilbert Space}
}
  • Refine by Author
  • 4 Schwentick, Thomas
  • 3 Schmidt, Jonas
  • 2 Singla, Sahil
  • 2 Vortmeier, Nils
  • 2 Zeume, Thomas

  • Refine by Classification
  • 2 Theory of computation → Complexity theory and logic
  • 2 Theory of computation → Logic and databases
  • 2 Theory of computation → Parallel algorithms
  • 1 Applied computing → Arts and humanities
  • 1 Computing methodologies → Machine learning

  • Refine by Keyword
  • 2 Dynamic complexity
  • 2 stochastic optimization
  • 2 work
  • 1 3-connected graph
  • 1 3-connectedness

  • Refine by Type
  • 13 document

  • Refine by Publication Year
  • 4 2023
  • 3 2020
  • 2 2010
  • 2 2015
  • 1 2009
