8 Search Results for "Heeger, Klaus"


Document
Minimizing the Weighted Number of Tardy Jobs Is W[1]-Hard

Authors: Klaus Heeger and Danny Hermelin

Published in: LIPIcs, Volume 308, 32nd Annual European Symposium on Algorithms (ESA 2024)


Abstract
We consider the 1∣∣∑ w_jU_j problem of minimizing the weighted number of tardy jobs on a single machine. This is one of the most fundamental problems in scheduling theory, with applications in both theory and practice. Using a reduction from the Multicolored Clique problem, we prove that 1∣∣∑ w_jU_j is W[1]-hard with respect to the number p_# of different processing times in the input, as well as with respect to the number w_# of different weights. Together with previous work, this provides a complete picture of the parameterized complexity of 1∣∣∑ w_jU_j, as well as almost tight complexity bounds for the problem under the Exponential Time Hypothesis (ETH).
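
For context (standard background, not the paper's reduction): the problem is pseudopolynomial-time solvable by the classical dynamic program of Lawler and Moore, which is why fine-grained parameters such as p_# and w_# are the natural next question. A minimal sketch, assuming jobs are given as (processing time, weight, due date) triples:

```python
def min_weighted_tardy(jobs):
    """Lawler-Moore DP for 1||sum w_j U_j (background sketch).

    jobs: list of (p, w, d) triples.  Runs in O(n * P) time,
    P = total processing time; returns the minimum weight of tardy jobs.
    """
    jobs = sorted(jobs, key=lambda j: j[2])    # earliest-due-date order
    P = sum(p for p, _, _ in jobs)
    NEG = float("-inf")
    dp = [NEG] * (P + 1)                       # dp[t] = max weight of on-time
    dp[0] = 0                                  # jobs finishing exactly at t
    for p, w, d in jobs:
        for t in range(min(d, P), p - 1, -1):  # job must complete by d
            if dp[t - p] > NEG:
                dp[t] = max(dp[t], dp[t - p] + w)
    return sum(w for _, w, _ in jobs) - max(dp)
```

For example, `min_weighted_tardy([(2, 3, 2), (2, 4, 3)])` returns 3: only one of the two jobs can finish on time, and keeping the weight-4 job on time is optimal.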

Cite as

Klaus Heeger and Danny Hermelin. Minimizing the Weighted Number of Tardy Jobs Is W[1]-Hard. In 32nd Annual European Symposium on Algorithms (ESA 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 308, pp. 68:1-68:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{heeger_et_al:LIPIcs.ESA.2024.68,
  author =	{Heeger, Klaus and Hermelin, Danny},
  title =	{{Minimizing the Weighted Number of Tardy Jobs Is W\lbrack1\rbrack-Hard}},
  booktitle =	{32nd Annual European Symposium on Algorithms (ESA 2024)},
  pages =	{68:1--68:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-338-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{308},
  editor =	{Chan, Timothy and Fischer, Johannes and Iacono, John and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2024.68},
  URN =		{urn:nbn:de:0030-drops-211392},
  doi =		{10.4230/LIPIcs.ESA.2024.68},
  annote =	{Keywords: single-machine scheduling, number of different weights, number of different processing times}
}
Document
Track A: Algorithms, Complexity and Games
No Polynomial Kernels for Knapsack

Authors: Klaus Heeger, Danny Hermelin, Matthias Mnich, and Dvir Shabtay

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
This paper focuses on kernelization algorithms for the fundamental Knapsack problem. A kernelization algorithm (or kernel) is a polynomial-time reduction from a problem to itself, where the output size is bounded by a function of some problem-specific parameter. Such algorithms provide a theoretical model for data reduction and preprocessing and are central in the area of parameterized complexity. Thus, a kernel for Knapsack for some parameter k reduces any instance of Knapsack to an equivalent instance of size at most f(k) in polynomial time, for some computable function f. When f(k) = k^{O(1)}, we call such a reduction a polynomial kernel. Our study focuses on two natural parameters for Knapsack: the number w_# of different item weights, and the number p_# of different item profits. Our main technical contribution is a proof that Knapsack does not admit a polynomial kernel for either of these two parameters under standard complexity-theoretic assumptions. The proof develops an elaborate application of the standard kernelization lower-bound framework, introducing along the way novel ideas that should be useful for other problems as well. We complement our lower bounds by showing that Knapsack admits a polynomial kernel for the combined parameter w_# ⋅ p_#.
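
To make "data reduction" concrete, here is a toy reduction rule (an illustration only, not the paper's kernel or its composition framework), assuming positive integer weights: items sharing a (weight, profit) pair are interchangeable, so no equivalent instance ever needs more than ⌊capacity/weight⌋ copies of any pair.

```python
from collections import Counter

def reduce_duplicates(items, capacity):
    """Toy Knapsack reduction rule (illustrative sketch only).

    items: list of (weight, profit) pairs.  Keeps at most
    capacity // weight copies of each distinct pair and drops items
    too heavy to pack; the reduced instance is equivalent.
    """
    reduced = []
    for (w, p), count in Counter(items).items():
        if w > capacity:
            continue                       # can never fit
        reduced.extend([(w, p)] * min(count, capacity // w))
    return reduced
```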

Cite as

Klaus Heeger, Danny Hermelin, Matthias Mnich, and Dvir Shabtay. No Polynomial Kernels for Knapsack. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 83:1-83:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{heeger_et_al:LIPIcs.ICALP.2024.83,
  author =	{Heeger, Klaus and Hermelin, Danny and Mnich, Matthias and Shabtay, Dvir},
  title =	{{No Polynomial Kernels for Knapsack}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{83:1--83:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.83},
  URN =		{urn:nbn:de:0030-drops-202261},
  doi =		{10.4230/LIPIcs.ICALP.2024.83},
  annote =	{Keywords: Knapsack, polynomial kernels, compositions, number of different weights, number of different profits}
}
Document
Single Machine Scheduling with Few Deadlines

Authors: Klaus Heeger, Danny Hermelin, and Dvir Shabtay

Published in: LIPIcs, Volume 285, 18th International Symposium on Parameterized and Exact Computation (IPEC 2023)


Abstract
We study single-machine scheduling problems with few deadlines. We focus on two classical objectives, namely minimizing the weighted number of tardy jobs and the total weighted completion time. For both problems, we give a pseudopolynomial-time algorithm for a constant number of different deadlines, complemented by an ETH-based, almost tight lower bound. Furthermore, we study the case where the number of jobs with a nontrivial deadline is taken as the parameter. Here the two problems differ in complexity: minimizing the total number of tardy jobs becomes fixed-parameter tractable, while minimizing the total weighted completion time is W[1]-hard.
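
A useful background fact for this setting (a standard subroutine, not the paper's algorithm): a job set with hard deadlines is feasible on a single machine exactly when scheduling in earliest-deadline-first order violates no deadline. A minimal check:

```python
def deadlines_feasible(jobs):
    """Can all jobs meet their hard deadlines on one machine?

    jobs: list of (p, dbar) pairs (processing time, deadline).
    Feasible iff the earliest-deadline-first schedule meets every
    deadline (Jackson's rule).
    """
    t = 0
    for p, dbar in sorted(jobs, key=lambda j: j[1]):
        t += p                  # completion time in EDD order
        if t > dbar:
            return False
    return True
```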

Cite as

Klaus Heeger, Danny Hermelin, and Dvir Shabtay. Single Machine Scheduling with Few Deadlines. In 18th International Symposium on Parameterized and Exact Computation (IPEC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 285, pp. 24:1-24:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{heeger_et_al:LIPIcs.IPEC.2023.24,
  author =	{Heeger, Klaus and Hermelin, Danny and Shabtay, Dvir},
  title =	{{Single Machine Scheduling with Few Deadlines}},
  booktitle =	{18th International Symposium on Parameterized and Exact Computation (IPEC 2023)},
  pages =	{24:1--24:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-305-8},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{285},
  editor =	{Misra, Neeldhara and Wahlstr\"{o}m, Magnus},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2023.24},
  URN =		{urn:nbn:de:0030-drops-194434},
  doi =		{10.4230/LIPIcs.IPEC.2023.24},
  annote =	{Keywords: Single-machine scheduling, weighted completion time, tardy jobs, pseudopolynomial algorithms, parameterized complexity}
}
Document
Fully Polynomial-Time Algorithms Parameterized by Vertex Integrity Using Fast Matrix Multiplication

Authors: Matthias Bentert, Klaus Heeger, and Tomohiro Koana

Published in: LIPIcs, Volume 274, 31st Annual European Symposium on Algorithms (ESA 2023)


Abstract
We study the computational complexity of several polynomial-time-solvable graph problems parameterized by vertex integrity, a measure of a graph’s vulnerability to vertex removal in terms of connectivity. The vertex integrity of a graph G is the smallest number ι such that there is a set S of ι' ≤ ι vertices for which every connected component of G-S contains at most ι-ι' vertices. It is known that vertex integrity lies between the well-studied parameters vertex cover number and tree-depth. Our work follows similar studies for the vertex cover number [Alon and Yuster, ESA 2007] and tree-depth [Iwata, Ogasawara, and Ohsaka, STACS 2018]. Alon and Yuster designed algorithms for graphs with small vertex cover number using fast matrix multiplication. We demonstrate that fast matrix multiplication can also be used effectively when parameterizing by the vertex integrity ι, developing efficient algorithms that include an O(ι^{ω-1}n)-time algorithm for Maximum Matching and an O(ι^{(ω-1)/2}n²) ⊆ O(ι^{0.687} n²)-time algorithm for All-Pairs Shortest Paths. These algorithms can be faster than previous algorithms parameterized by tree-depth, for which fast matrix multiplication is not known to be effective.
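
The definition can be checked directly by brute force; a definitional sketch (exponential time, tiny graphs only; `adj` is a hypothetical representation mapping each vertex to its neighbour set):

```python
from itertools import combinations

def largest_component(adj, removed):
    """Order of the largest connected component of G - removed."""
    seen, best = set(removed), 0
    for start in adj:
        if start in seen:
            continue
        stack, size = [start], 0
        seen.add(start)
        while stack:
            u = stack.pop()
            size += 1
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        best = max(best, size)
    return best

def vertex_integrity_at_most(adj, k):
    """Does G have vertex integrity <= k?  Tests every set S of
    i' <= k vertices and checks that each component of G - S has
    at most k - i' vertices, exactly as in the definition above."""
    vertices = list(adj)
    return any(
        largest_component(adj, S) <= k - size
        for size in range(k + 1)
        for S in map(set, combinations(vertices, size))
    )
```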

Cite as

Matthias Bentert, Klaus Heeger, and Tomohiro Koana. Fully Polynomial-Time Algorithms Parameterized by Vertex Integrity Using Fast Matrix Multiplication. In 31st Annual European Symposium on Algorithms (ESA 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 274, pp. 16:1-16:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{bentert_et_al:LIPIcs.ESA.2023.16,
  author =	{Bentert, Matthias and Heeger, Klaus and Koana, Tomohiro},
  title =	{{Fully Polynomial-Time Algorithms Parameterized by Vertex Integrity Using Fast Matrix Multiplication}},
  booktitle =	{31st Annual European Symposium on Algorithms (ESA 2023)},
  pages =	{16:1--16:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-295-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{274},
  editor =	{G{\o}rtz, Inge Li and Farach-Colton, Martin and Puglisi, Simon J. and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2023.16},
  URN =		{urn:nbn:de:0030-drops-186692},
  doi =		{10.4230/LIPIcs.ESA.2023.16},
  annote =	{Keywords: FPT in P, Algebraic Algorithms, Adaptive Algorithms, Subgraph Detection, Matching, APSP}
}
Document
Parameterized Lower Bounds for Problems in P via Fine-Grained Cross-Compositions

Authors: Klaus Heeger, André Nichterlein, and Rolf Niedermeier

Published in: LIPIcs, Volume 254, 40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023)


Abstract
We provide a general framework to exclude parameterized running times of the form O(l^β + n^γ) for problems that have polynomial running time lower bounds under hypotheses from fine-grained complexity. Our framework is based on cross-compositions from parameterized complexity. We (conditionally) exclude running times of the form O(l^{γ/(γ-1) - ε} + n^γ) for any 1 < γ < 2 and ε > 0 for the following problems:
- Longest Common (Increasing) Subsequence: Given two length-n strings over an alphabet Σ (over ℕ) and l ∈ ℕ, is there a common (increasing) subsequence of length l in both strings?
- Discrete Fréchet Distance: Given two lists of n points each and k ∈ ℕ, is the Fréchet distance of the lists at most k? Here l is the maximum number of points by which one list is ahead of the other in an optimum traversal.
- Planar Motion Planning: Given a set of n non-intersecting axis-parallel line segment obstacles in the plane and a line segment robot (called a rod), can the rod be moved to a specified target without touching any obstacle? Here l is the maximum number of segments any segment has in its vicinity.
Moreover, we exclude running times O(l^{2γ/(γ-1) - ε} + n^γ) for any 1 < γ < 3 and ε > 0 for:
- Negative Triangle: Given an edge-weighted graph with n vertices, is there a triangle whose sum of edge weights is negative? Here l is the order of a maximum connected component.
- Triangle Collection: Given a vertex-colored graph with n vertices, is there for each triple of colors a triangle whose vertices have these three colors? Here l is the order of a maximum connected component.
- 2nd Shortest Path: Given an n-vertex edge-weighted digraph, vertices s and t, and k ∈ ℕ, is the length of the second shortest s-t-path at most k? Here l is the directed feedback vertex set number.
Except for 2nd Shortest Path, all these running time bounds are tight; that is, algorithms with running time O(l^{γ/(γ-1)} + n^γ) for any 1 < γ < 2 and O(l^{2γ/(γ-1)} + n^γ) for any 1 < γ < 3, respectively, are known. Our running time lower bounds also imply lower bounds on kernelization algorithms for these problems.
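
A worked instantiation of the first bound, using only the figures stated in the abstract:

```latex
% For gamma = 3/2 the excluded exponent gamma/(gamma-1) equals 3, so
% conditionally no algorithm runs in time O(l^{3-eps} + n^{3/2}), while
% an O(l^3 + n^{3/2})-time algorithm exists, e.g. for Longest Common
% Subsequence -- the bound is tight at this gamma.
\[
  \left.\frac{\gamma}{\gamma-1}\right|_{\gamma=3/2} = 3
  \;\Longrightarrow\;
  O\!\left(l^{3-\varepsilon} + n^{3/2}\right)\ \text{excluded},
  \qquad
  O\!\left(l^{3} + n^{3/2}\right)\ \text{achievable}.
\]
```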

Cite as

Klaus Heeger, André Nichterlein, and Rolf Niedermeier. Parameterized Lower Bounds for Problems in P via Fine-Grained Cross-Compositions. In 40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 254, pp. 35:1-35:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{heeger_et_al:LIPIcs.STACS.2023.35,
  author =	{Heeger, Klaus and Nichterlein, Andr\'{e} and Niedermeier, Rolf},
  title =	{{Parameterized Lower Bounds for Problems in P via Fine-Grained Cross-Compositions}},
  booktitle =	{40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023)},
  pages =	{35:1--35:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-266-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{254},
  editor =	{Berenbrink, Petra and Bouyer, Patricia and Dawar, Anuj and Kant\'{e}, Mamadou Moustapha},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2023.35},
  URN =		{urn:nbn:de:0030-drops-176876},
  doi =		{10.4230/LIPIcs.STACS.2023.35},
  annote =	{Keywords: FPT in P, Kernelization, Decomposition}
}
Document
Deepening the (Parameterized) Complexity Analysis of Incremental Stable Matching Problems

Authors: Niclas Boehmer, Klaus Heeger, and Rolf Niedermeier

Published in: LIPIcs, Volume 241, 47th International Symposium on Mathematical Foundations of Computer Science (MFCS 2022)


Abstract
When computing stable matchings, it is usually assumed that the preferences of the agents in the matching market are fixed. However, in many realistic scenarios, preferences change over time. Consequently, an initially stable matching may become unstable. Then, a natural goal is to find a matching which is stable with respect to the modified preferences and as close as possible to the initial one. For Stable Marriage/Roommates, this problem was formally defined as Incremental Stable Marriage/Roommates by Bredereck et al. [AAAI '20]. As they showed that Incremental Stable Roommates and Incremental Stable Marriage with Ties are NP-hard, we focus on the parameterized complexity of these problems. We answer two open questions of Bredereck et al. [AAAI '20]: We show that Incremental Stable Roommates is W[1]-hard parameterized by the number of changes in the preferences, yet admits an intricate XP-algorithm, and we show that Incremental Stable Marriage with Ties is W[1]-hard parameterized by the number of ties. Furthermore, we analyze the influence of the degree of "similarity" between the agents' preference lists, identifying several polynomial-time solvable and fixed-parameter tractable cases, but also proving that Incremental Stable Roommates and Incremental Stable Marriage with Ties parameterized by the number of different preference lists are W[1]-hard.
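
For background (the classical base case only, without ties or incremental changes): Stable Marriage with strict, complete preference lists is solved by the Gale-Shapley proposal algorithm. A sketch, with hypothetical dict-based inputs:

```python
def gale_shapley(men_prefs, women_prefs):
    """Gale-Shapley for Stable Marriage with strict, complete lists;
    the incremental variants above additionally handle ties and
    modified preferences.

    men_prefs[m]   = list of women, most preferred first
    women_prefs[w] = list of men, most preferred first
    Returns a stable matching as a dict woman -> man.
    """
    rank = {w: {m: i for i, m in enumerate(prefs)}
            for w, prefs in women_prefs.items()}
    next_proposal = {m: 0 for m in men_prefs}   # next index to propose to
    fiance = {}                                 # woman -> current partner
    free = list(men_prefs)
    while free:
        m = free.pop()
        w = men_prefs[m][next_proposal[m]]
        next_proposal[m] += 1
        if w not in fiance:
            fiance[w] = m
        elif rank[w][m] < rank[w][fiance[w]]:   # w prefers m to her partner
            free.append(fiance[w])
            fiance[w] = m
        else:
            free.append(m)                      # w rejects m
    return fiance
```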

Cite as

Niclas Boehmer, Klaus Heeger, and Rolf Niedermeier. Deepening the (Parameterized) Complexity Analysis of Incremental Stable Matching Problems. In 47th International Symposium on Mathematical Foundations of Computer Science (MFCS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 241, pp. 21:1-21:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{boehmer_et_al:LIPIcs.MFCS.2022.21,
  author =	{Boehmer, Niclas and Heeger, Klaus and Niedermeier, Rolf},
  title =	{{Deepening the (Parameterized) Complexity Analysis of Incremental Stable Matching Problems}},
  booktitle =	{47th International Symposium on Mathematical Foundations of Computer Science (MFCS 2022)},
  pages =	{21:1--21:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-256-3},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{241},
  editor =	{Szeider, Stefan and Ganian, Robert and Silva, Alexandra},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2022.21},
  URN =		{urn:nbn:de:0030-drops-168194},
  doi =		{10.4230/LIPIcs.MFCS.2022.21},
  annote =	{Keywords: Stable Marriage, Stable Roommates, adapting to changing preferences, NP-hardness, W\lbrack1\rbrack-hardness, XP, FPT, master lists, incremental algorithms}
}
Document
Length-Bounded Cuts: Proper Interval Graphs and Structural Parameters

Authors: Matthias Bentert, Klaus Heeger, and Dušan Knop

Published in: LIPIcs, Volume 181, 31st International Symposium on Algorithms and Computation (ISAAC 2020)


Abstract
We study the Length-Bounded Cut problem for special graph classes as well as from a parameterized-complexity viewpoint. Here, we are given a graph G, two vertices s and t, and positive integers β and λ. The task is to find a set F of at most β edges such that every s-t-path of length at most λ in G contains some edge in F. Bazgan et al. [Networks, 2019] conjectured that Length-Bounded Cut admits a polynomial-time algorithm if the input graph G is a proper interval graph. We confirm this conjecture by providing a polynomial-time algorithm based on dynamic programming. Moreover, we strengthen the W[1]-hardness result of Dvořák and Knop [Algorithmica, 2018] for Length-Bounded Cut parameterized by pathwidth: our reduction is shorter, and the target of the reduction has stronger structural properties. Consequently, we obtain W[1]-hardness for the combined parameter pathwidth and maximum degree of the input graph. Finally, we prove that Length-Bounded Cut is W[1]-hard parameterized by the feedback vertex number. Both of our hardness results complement known XP algorithms.
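
Verifying a candidate solution is simple: F is a valid cut iff the s-t distance in G-F exceeds λ (checking the budget |F| ≤ β is separate). A BFS sketch for the unweighted, undirected case, with edges given as vertex pairs:

```python
from collections import deque

def is_length_bounded_cut(edges, s, t, lam, F):
    """Does deleting edge set F destroy every s-t path of length <= lam?

    edges: iterable of (u, v) pairs; F: iterable of (u, v) pairs.
    Equivalent to checking that the s-t distance in G - F exceeds lam.
    """
    banned = {frozenset(e) for e in F}
    adj = {}
    for u, v in edges:
        if frozenset((u, v)) in banned:
            continue
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    dist = {s: 0}                      # BFS distances from s in G - F
    queue = deque([s])
    while queue:
        u = queue.popleft()
        if u == t:
            break
        for v in adj.get(u, []):
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist.get(t, lam + 1) > lam  # unreachable counts as cut
```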

Cite as

Matthias Bentert, Klaus Heeger, and Dušan Knop. Length-Bounded Cuts: Proper Interval Graphs and Structural Parameters. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 36:1-36:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{bentert_et_al:LIPIcs.ISAAC.2020.36,
  author =	{Bentert, Matthias and Heeger, Klaus and Knop, Du\v{s}an},
  title =	{{Length-Bounded Cuts: Proper Interval Graphs and Structural Parameters}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{36:1--36:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.36},
  URN =		{urn:nbn:de:0030-drops-133800},
  doi =		{10.4230/LIPIcs.ISAAC.2020.36},
  annote =	{Keywords: Edge-disjoint paths, pathwidth, feedback vertex number}
}
Document
Parameterized Complexity of Stable Roommates with Ties and Incomplete Lists Through the Lens of Graph Parameters

Authors: Robert Bredereck, Klaus Heeger, Dušan Knop, and Rolf Niedermeier

Published in: LIPIcs, Volume 149, 30th International Symposium on Algorithms and Computation (ISAAC 2019)


Abstract
We continue and extend previous work on the parameterized complexity analysis of the NP-hard Stable Roommates with Ties and Incomplete Lists problem, strengthening earlier results both on the side of parameterized hardness and on the side of fixed-parameter tractability. Unlike its famous sister problem Stable Marriage, which is restricted to a bipartite scenario, Stable Roommates with Incomplete Lists allows for arbitrary acceptability graphs, whose edges specify which pairs of agents can be matched (agents are represented by graph vertices). Here, incomplete lists and ties reflect the fact that in realistic application scenarios the agents cannot rank all other agents in a linear order. Among our main contributions is showing that it is W[1]-hard to compute a maximum-cardinality stable matching for acceptability graphs of bounded treedepth, bounded tree-cut width, and bounded feedback vertex number (taking each of these as the respective parameter). However, if we "only" ask for perfect stable matchings or the mere existence of a stable matching, then we obtain fixed-parameter tractability with respect to tree-cut width, but not with respect to treedepth. On the positive side, we also provide fixed-parameter tractability results for the parameter feedback edge set number.
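
For concreteness (a definitional sketch under weak stability, with a hypothetical representation): an acceptable pair blocks a matching if both agents are unmatched or strictly prefer each other to their current partners.

```python
def blocking_pairs(adj, prefs, matching):
    """Blocking pairs for Stable Roommates with ties and incomplete
    lists (weak stability).  Definitional sketch.

    adj[u]:      set of agents acceptable to u (the acceptability graph);
    prefs[u][v]: rank of v in u's list (smaller = better; ties share a rank);
    matching:    dict agent -> partner, unmatched agents absent.
    Agent names are assumed comparable (e.g. integers).
    """
    def improves(u, v):                  # would u rather be with v?
        partner = matching.get(u)
        return partner is None or prefs[u][v] < prefs[u][partner]

    return [(u, v)
            for u in adj for v in adj[u]
            if u < v and u in adj.get(v, set())  # mutually acceptable
            and matching.get(u) != v             # not already together
            and improves(u, v) and improves(v, u)]
```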

Cite as

Robert Bredereck, Klaus Heeger, Dušan Knop, and Rolf Niedermeier. Parameterized Complexity of Stable Roommates with Ties and Incomplete Lists Through the Lens of Graph Parameters. In 30th International Symposium on Algorithms and Computation (ISAAC 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 149, pp. 44:1-44:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{bredereck_et_al:LIPIcs.ISAAC.2019.44,
  author =	{Bredereck, Robert and Heeger, Klaus and Knop, Du\v{s}an and Niedermeier, Rolf},
  title =	{{Parameterized Complexity of Stable Roommates with Ties and Incomplete Lists Through the Lens of Graph Parameters}},
  booktitle =	{30th International Symposium on Algorithms and Computation (ISAAC 2019)},
  pages =	{44:1--44:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-130-6},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{149},
  editor =	{Lu, Pinyan and Zhang, Guochuan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2019.44},
  URN =		{urn:nbn:de:0030-drops-115406},
  doi =		{10.4230/LIPIcs.ISAAC.2019.44},
  annote =	{Keywords: Stable matching, acceptability graph, fixed-parameter tractability, W\lbrack1\rbrack-hardness, treewidth, treedepth, tree-cut width, feedback set numbers}
}
  • Refine by Author
  • 8 Heeger, Klaus
  • 3 Hermelin, Danny
  • 3 Niedermeier, Rolf
  • 2 Bentert, Matthias
  • 2 Knop, Dušan

  • Refine by Keyword
  • 2 FPT in P
  • 2 W[1]-hardness
  • 2 number of different weights
  • 1 APSP
  • 1 Adaptive Algorithms

  • Refine by Type
  • 8 document

  • Refine by Publication Year
  • 3 2023
  • 2 2024
  • 1 2019
  • 1 2020
  • 1 2022
