9 Search Results for "Bücker, H. Martin"


Document
The Complexity of Homomorphism Reconstructibility

Authors: Jan Böker, Louis Härtel, Nina Runde, Tim Seppelt, and Christoph Standke

Published in: LIPIcs, Volume 289, 41st International Symposium on Theoretical Aspects of Computer Science (STACS 2024)


Abstract
Representing graphs by their homomorphism counts has led to the beautiful theory of homomorphism indistinguishability in recent years. Moreover, homomorphism counts have promising applications in database theory and machine learning, where one would like to answer queries or classify graphs solely based on the representation of a graph G as a finite vector of homomorphism counts from some fixed finite set of graphs to G. We study the computational complexity of arguably the most fundamental computational problem associated with these representations, the homomorphism reconstructibility problem: given a finite sequence of graphs and a corresponding vector of natural numbers, decide whether there exists a graph G that realises the given vector as the homomorphism counts from the given graphs. We show that this problem yields a natural example of an NP^#P-hard problem, which can still be NP-hard when restricted to a fixed number of input graphs of bounded treewidth and a fixed input vector of natural numbers, or alternatively, when restricted to a finite input set of graphs. We further show that, when restricted to a finite input set of graphs and given an upper bound on the order of the graph G as additional input, the problem cannot be NP-hard unless P = NP. For this regime, we obtain partial positive results. We also investigate the problem's parameterised complexity and provide fpt-algorithms for the case that a single graph is given, and for the case that multiple graphs of the same order are given with subgraph counts instead of homomorphism counts.
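
For concreteness, hom(F, G) here is the number of mappings from V(F) to V(G) that send every edge of F to an edge of G. The following minimal Python sketch (an illustration, not code from the paper; the graph names and the adjacency-set representation are chosen for the example) counts hom(F, G) for small graphs by brute force over all vertex maps.

    from itertools import product

    def hom_count(F, G):
        """Count homomorphisms from F to G by brute force.

        F and G are simple undirected graphs given as dicts mapping each
        vertex to the set of its neighbours; vertex labels must be comparable.
        """
        vf, vg = list(F), list(G)
        edges_f = [(u, v) for u in F for v in F[u] if u < v]
        count = 0
        for image in product(vg, repeat=len(vf)):
            h = dict(zip(vf, image))
            if all(h[v] in G[h[u]] for u, v in edges_f):
                count += 1
        return count

    # Example: homomorphisms from a triangle into the complete graph K_4.
    triangle = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
    k4 = {i: {j for j in range(4) if j != i} for i in range(4)}
    print(hom_count(triangle, k4))  # 24 = 4 * 3 * 2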

Cite as

Jan Böker, Louis Härtel, Nina Runde, Tim Seppelt, and Christoph Standke. The Complexity of Homomorphism Reconstructibility. In 41st International Symposium on Theoretical Aspects of Computer Science (STACS 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 289, pp. 19:1-19:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{boker_et_al:LIPIcs.STACS.2024.19,
  author =	{B\"{o}ker, Jan and H\"{a}rtel, Louis and Runde, Nina and Seppelt, Tim and Standke, Christoph},
  title =	{{The Complexity of Homomorphism Reconstructibility}},
  booktitle =	{41st International Symposium on Theoretical Aspects of Computer Science (STACS 2024)},
  pages =	{19:1--19:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-311-9},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{289},
  editor =	{Beyersdorff, Olaf and Kant\'{e}, Mamadou Moustapha and Kupferman, Orna and Lokshtanov, Daniel},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2024.19},
  URN =		{urn:nbn:de:0030-drops-197298},
  doi =		{10.4230/LIPIcs.STACS.2024.19},
  annote =	{Keywords: graph homomorphism, counting complexity, parameterised complexity}
}
Document
The Complexity of Homomorphism Indistinguishability

Authors: Jan Böker, Yijia Chen, Martin Grohe, and Gaurav Rattan

Published in: LIPIcs, Volume 138, 44th International Symposium on Mathematical Foundations of Computer Science (MFCS 2019)


Abstract
For every graph class {F}, let HomInd({F}) be the problem of deciding whether two given graphs are homomorphism-indistinguishable over {F}, i.e., for every graph F in {F}, the number hom(F, G) of homomorphisms from F to G equals the corresponding number hom(F, H) for H. For several natural graph classes (such as paths, trees, bounded treewidth graphs), homomorphism-indistinguishability over the class has an efficient structural characterization, resulting in polynomial time solvability [H. Dell et al., 2018]. In particular, it is known that two non-isomorphic graphs are homomorphism-indistinguishable over the class {T}_k of graphs of treewidth k if and only if they are not distinguished by the k-dimensional Weisfeiler-Leman algorithm, a central heuristic for isomorphism testing: this characterization implies a polynomial time algorithm for HomInd({T}_k), for every fixed k in N. In this paper, we show that there is a polynomial-time-decidable class {F} of undirected graphs of bounded treewidth such that HomInd({F}) is undecidable. Our second hardness result concerns the class {K} of complete graphs. We show that HomInd({K}) is co-NP-hard, and in fact, we show completeness for the class C_=P (under P-time Turing reductions). On the algorithmic side, we show that HomInd({P}) can be solved in polynomial time for the class {P} of directed paths. We end with a brief study of two variants of the HomInd({F}) problem: (a) the problem of lexicographic comparison of homomorphism numbers of two graphs, and (b) the problem of computing certain distance measures (defined via homomorphism numbers) between two graphs.
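
Two textbook identities illustrate what these numbers encode for simple graphs: hom(K_1, G) = |V(G)|, hom(K_2, G) = 2|E(G)|, and hom(K_3, G) = 6 times the number of triangles in G (each triangle is hit by 3! ordered maps). In particular, graphs that are homomorphism-indistinguishable over the class {K} of complete graphs agree on their numbers of vertices, edges, and triangles.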

Cite as

Jan Böker, Yijia Chen, Martin Grohe, and Gaurav Rattan. The Complexity of Homomorphism Indistinguishability. In 44th International Symposium on Mathematical Foundations of Computer Science (MFCS 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 138, pp. 54:1-54:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{boker_et_al:LIPIcs.MFCS.2019.54,
  author =	{B\"{o}ker, Jan and Chen, Yijia and Grohe, Martin and Rattan, Gaurav},
  title =	{{The Complexity of Homomorphism Indistinguishability}},
  booktitle =	{44th International Symposium on Mathematical Foundations of Computer Science (MFCS 2019)},
  pages =	{54:1--54:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-117-7},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{138},
  editor =	{Rossmanith, Peter and Heggernes, Pinar and Katoen, Joost-Pieter},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2019.54},
  URN =		{urn:nbn:de:0030-drops-109980},
  doi =		{10.4230/LIPIcs.MFCS.2019.54},
  annote =	{Keywords: graph homomorphism numbers, counting complexity, treewidth}
}
Document
Tight Complexity Lower Bounds for Integer Linear Programming with Few Constraints

Authors: Dušan Knop, Michał Pilipczuk, and Marcin Wrochna

Published in: LIPIcs, Volume 126, 36th International Symposium on Theoretical Aspects of Computer Science (STACS 2019)


Abstract
We consider the standard ILP Feasibility problem: given an integer linear program of the form {Ax = b, x >= 0}, where A is an integer matrix with k rows and l columns, x is a vector of l variables, and b is a vector of k integers, we ask whether there exists x in N^l that satisfies Ax = b. Each row of A specifies one linear constraint on x; our goal is to study the complexity of ILP Feasibility when both k, the number of constraints, and |A|_infty, the largest absolute value of an entry in A, are small. Papadimitriou [Christos H. Papadimitriou, 1981] was the first to give a fixed-parameter algorithm for ILP Feasibility under parameterization by the number of constraints that runs in time ((|A|_infty + |b|_infty) * k)^O(k^2). This was very recently improved by Eisenbrand and Weismantel [Friedrich Eisenbrand and Robert Weismantel, 2018], who used the Steinitz lemma to design an algorithm with running time (k |A|_infty)^{O(k)} * |b|_infty^2, which was subsequently improved by Jansen and Rohwedder [Klaus Jansen and Lars Rohwedder, 2019] to O(k |A|_infty)^k * log |b|_infty. We prove that for {0,1}-matrices A, the running time of the algorithm of Eisenbrand and Weismantel is probably optimal: an algorithm with running time 2^{o(k log k)} * (l+|b|_infty)^{o(k)} would contradict the Exponential Time Hypothesis (ETH). This improves previous non-tight lower bounds of Fomin et al. [Fedor V. Fomin et al., 2018]. We then consider integer linear programs that may have many constraints, but they need to be structured in a "shallow" way. Precisely, we consider the parameter dual treedepth of the matrix A, denoted td_D(A), which is the treedepth of the graph over the rows of A, where two rows are adjacent if in some column they simultaneously contain a non-zero entry. It was recently shown by Koutecký et al. [Martin Koutecký et al., 2018] that ILP Feasibility can be solved in time |A|_infty^{2^O(td_D(A))} * (k+l+log |b|_infty)^O(1). We present a streamlined proof of this fact and prove that, again, this running time is probably optimal: even assuming that all entries of A and b are in {-1,0,1}, the existence of an algorithm with running time 2^{2^o(td_D(A))} * (k+l)^O(1) would contradict the ETH.
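
The dual treedepth parameter is defined via the row graph described above; the following short Python sketch (purely illustrative, with names chosen for the example) builds that graph, in which two rows of A are adjacent whenever some column is non-zero in both.

    def dual_graph(A):
        """Adjacency sets of the graph over the rows of A whose treedepth is
        td_D(A): rows i and j are adjacent iff they share a column in which
        both have a non-zero entry.  A is a list of equal-length integer rows."""
        k = len(A)
        adj = {i: set() for i in range(k)}
        for i in range(k):
            for j in range(i + 1, k):
                if any(A[i][c] != 0 and A[j][c] != 0 for c in range(len(A[i]))):
                    adj[i].add(j)
                    adj[j].add(i)
        return adj

    # Rows 0 and 1 share column 0; rows 1 and 2 share column 2.
    A = [[1, 1, 0],
         [1, 0, 1],
         [0, 0, 1]]
    print(dual_graph(A))  # {0: {1}, 1: {0, 2}, 2: {1}}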

Cite as

Dušan Knop, Michał Pilipczuk, and Marcin Wrochna. Tight Complexity Lower Bounds for Integer Linear Programming with Few Constraints. In 36th International Symposium on Theoretical Aspects of Computer Science (STACS 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 126, pp. 44:1-44:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{knop_et_al:LIPIcs.STACS.2019.44,
  author =	{Knop, Du\v{s}an and Pilipczuk, Micha{\l} and Wrochna, Marcin},
  title =	{{Tight Complexity Lower Bounds for Integer Linear Programming with Few Constraints}},
  booktitle =	{36th International Symposium on Theoretical Aspects of Computer Science (STACS 2019)},
  pages =	{44:1--44:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-100-9},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{126},
  editor =	{Niedermeier, Rolf and Paul, Christophe},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2019.44},
  URN =		{urn:nbn:de:0030-drops-102831},
  doi =		{10.4230/LIPIcs.STACS.2019.44},
  annote =	{Keywords: integer linear programming, fixed-parameter tractability, ETH}
}
Document
Parametricity, Automorphisms of the Universe, and Excluded Middle

Authors: Auke B. Booij, Martín H. Escardó, Peter LeFanu Lumsdaine, and Michael Shulman

Published in: LIPIcs, Volume 97, 22nd International Conference on Types for Proofs and Programs (TYPES 2016)


Abstract
It is known that one can construct non-parametric functions by assuming classical axioms. Our work is a converse to that: we prove classical axioms in dependent type theory assuming specific instances of non-parametricity. We also address the interaction between classical axioms and the existence of automorphisms of a type universe. We work over intensional Martin-Löf dependent type theory, and for some results assume further principles including function extensionality, propositional extensionality, propositional truncation, and the univalence axiom.

Cite as

Auke B. Booij, Martín H. Escardó, Peter LeFanu Lumsdaine, and Michael Shulman. Parametricity, Automorphisms of the Universe, and Excluded Middle. In 22nd International Conference on Types for Proofs and Programs (TYPES 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 97, pp. 7:1-7:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{booij_et_al:LIPIcs.TYPES.2016.7,
  author =	{Booij, Auke B. and Escard\'{o}, Mart{\'\i}n H. and Lumsdaine, Peter LeFanu and Shulman, Michael},
  title =	{{Parametricity, Automorphisms of the Universe, and Excluded Middle}},
  booktitle =	{22nd International Conference on Types for Proofs and Programs (TYPES 2016)},
  pages =	{7:1--7:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-065-1},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{97},
  editor =	{Ghilezan, Silvia and Geuvers, Herman and Ivetic, Jelena},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.TYPES.2016.7},
  URN =		{urn:nbn:de:0030-drops-98554},
  doi =		{10.4230/LIPIcs.TYPES.2016.7},
  annote =	{Keywords: relational parametricity, dependent type theory, univalent foundations, homotopy type theory, excluded middle, classical mathematics, constructive mathematics}
}
Document
Bounding Helly Numbers via Betti Numbers

Authors: Xavier Goaoc, Pavel Paták, Zuzana Patáková, Martin Tancer, and Uli Wagner

Published in: LIPIcs, Volume 34, 31st International Symposium on Computational Geometry (SoCG 2015)


Abstract
We show that very weak topological assumptions are enough to ensure the existence of a Helly-type theorem. More precisely, we show that for any non-negative integers b and d there exists an integer h(b,d) such that the following holds. If F is a finite family of subsets of R^d such that the ith reduced Betti number (with Z_2 coefficients in singular homology) of the intersection of any proper subfamily G of F is at most b for every non-negative integer i less than or equal to (d-1)/2, then F has Helly number at most h(b,d). These topological conditions are sharp: failing to control any one of these first Betti numbers allows for families with unbounded Helly number. Our proofs combine homological non-embeddability results with a Ramsey-based approach to build, given an arbitrary simplicial complex K, some well-behaved chain map from C_*(K) to C_*(R^d). Both techniques are of independent interest.
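
For orientation, the classical setting fits this statement: by Helly's theorem, a finite family of convex sets in R^d has Helly number at most d+1, which is consistent with the result above since every non-empty intersection of convex sets is convex and hence has vanishing reduced Betti numbers (the case b = 0).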

Cite as

Xavier Goaoc, Pavel Paták, Zuzana Patáková, Martin Tancer, and Uli Wagner. Bounding Helly Numbers via Betti Numbers. In 31st International Symposium on Computational Geometry (SoCG 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 34, pp. 507-521, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


BibTeX

@InProceedings{goaoc_et_al:LIPIcs.SOCG.2015.507,
  author =	{Goaoc, Xavier and Pat\'{a}k, Pavel and Pat\'{a}kov\'{a}, Zuzana and Tancer, Martin and Wagner, Uli},
  title =	{{Bounding Helly Numbers via Betti Numbers}},
  booktitle =	{31st International Symposium on Computational Geometry (SoCG 2015)},
  pages =	{507--521},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-83-5},
  ISSN =	{1868-8969},
  year =	{2015},
  volume =	{34},
  editor =	{Arge, Lars and Pach, J\'{a}nos},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SOCG.2015.507},
  URN =		{urn:nbn:de:0030-drops-51297},
  doi =		{10.4230/LIPIcs.SOCG.2015.507},
  annote =	{Keywords: Helly-type theorem, Ramsey’s theorem, Embedding of simplicial complexes, Homological almost-embedding, Betti numbers}
}
Document
Construction of Implicit Surfaces from Point Clouds Using a Feature-based Approach

Authors: Patric Keller, Oliver Kreylos, Eric S. Cowgill, Louise H. Kellogg, and Martin Hering-Bertram

Published in: Dagstuhl Follow-Ups, Volume 2, Scientific Visualization: Interactions, Features, Metaphors (2011)


Abstract
We present a novel feature-based approach to surface generation from point clouds in three-dimensional space obtained by terrestrial and airborne laser scanning. In a first step, we apply a multiscale clustering and classification of local point set neighborhoods by considering their geometric shape. Corresponding feature values quantify the similarity to curve-like, surface-like, and solid-like shapes. For selecting and extracting surface features, we build a hierarchical trivariate B-spline representation of this surface feature function. Surfaces are extracted with a variant of marching cubes (MC), providing an inner and outer shell that are merged into a single non-manifold surface component at the field’s ridges. By adapting the isovalue of the feature function the user may control surface topology and thus adapt the extracted features to the noise level of the underlying point cloud. User control and adaptive approximation make our method robust for noisy and complex point data.
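
One common way to derive such curve-, surface-, and solid-like feature values, sketched below purely for illustration (a generic formulation, not necessarily the authors' exact one), uses the eigenvalues of the covariance matrix of a local point neighbourhood.

    import numpy as np

    def shape_features(neighbourhood):
        """Heuristic (curve_like, surface_like, solid_like) values for an
        (n, 3) array of points around a query point, computed from the
        sorted covariance eigenvalues l1 >= l2 >= l3.  Illustrative only."""
        centred = neighbourhood - neighbourhood.mean(axis=0)
        cov = centred.T @ centred / len(neighbourhood)
        l1, l2, l3 = sorted(np.linalg.eigvalsh(cov), reverse=True)
        s = l1 + l2 + l3
        if s == 0.0:
            return 0.0, 0.0, 0.0
        return (l1 - l2) / s, 2 * (l2 - l3) / s, 3 * l3 / s

    # Points scattered on a thin plane should score high on "surface_like".
    rng = np.random.default_rng(0)
    pts = np.column_stack([rng.normal(size=200),
                           rng.normal(size=200),
                           0.01 * rng.normal(size=200)])
    print(shape_features(pts))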

Cite as

Patric Keller, Oliver Kreylos, Eric S. Cowgill, Louise H. Kellogg, and Martin Hering-Bertram. Construction of Implicit Surfaces from Point Clouds Using a Feature-based Approach. In Scientific Visualization: Interactions, Features, Metaphors. Dagstuhl Follow-Ups, Volume 2, pp. 129-143, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2011)


BibTeX

@InCollection{keller_et_al:DFU.Vol2.SciViz.2011.129,
  author =	{Keller, Patric and Kreylos, Oliver and Cowgill, Eric S. and Kellogg, Louise H. and Hering-Bertram, Martin},
  title =	{{Construction of Implicit Surfaces from Point Clouds Using a Feature-based Approach}},
  booktitle =	{Scientific Visualization: Interactions, Features, Metaphors},
  pages =	{129--143},
  series =	{Dagstuhl Follow-Ups},
  ISBN =	{978-3-939897-26-2},
  ISSN =	{1868-8977},
  year =	{2011},
  volume =	{2},
  editor =	{Hagen, Hans},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DFU.Vol2.SciViz.2011.129},
  URN =		{urn:nbn:de:0030-drops-33032},
  doi =		{10.4230/DFU.Vol2.SciViz.2011.129},
  annote =	{Keywords: 3D Point Clouds, Surface Reconstruction, Implicit Surfaces}
}
Document
Termination of Integer Term Rewriting

Authors: Carsten Fuhs, Jürgen Giesl, Martin Plücker, Peter Schneider-Kamp, and Stephan Falke

Published in: Dagstuhl Seminar Proceedings, Volume 9411, Interaction versus Automation: The two Faces of Deduction (2010)


Abstract
Recently, techniques and tools from term rewriting have been successfully applied to prove termination automatically for different programming languages. The advantage of rewrite techniques is that they are very powerful for algorithms on user-defined data structures. But in contrast to techniques for termination analysis of imperative programs, the drawback of rewrite techniques is that they do not support data structures like integer numbers which are pre-defined in almost all programming languages. To solve this problem, we extend term rewriting by built-in integers and adapt the dependency pair framework to prove termination of integer term rewriting automatically. Our experiments show that this indeed combines the power of rewrite techniques on user-defined data types with a powerful treatment of pre-defined integers.
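
As a toy illustration of the setting (not an example taken from the paper), consider the conditional rewrite rule over built-in integers

    f(x) -> f(x - 1)   [ x > 0 ]

which terminates because every rule application strictly decreases the natural-number measure max(x, 0); the adapted dependency pair framework automates termination arguments of this kind for far less obvious integer rewrite systems.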

Cite as

Carsten Fuhs, Jürgen Giesl, Martin Plücker, Peter Schneider-Kamp, and Stephan Falke. Termination of Integer Term Rewriting. In Interaction versus Automation: The two Faces of Deduction. Dagstuhl Seminar Proceedings, Volume 9411, pp. 1-4, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{fuhs_et_al:DagSemProc.09411.5,
  author =	{Fuhs, Carsten and Giesl, J\"{u}rgen and Pl\"{u}cker, Martin and Schneider-Kamp, Peter and Falke, Stephan},
  title =	{{Termination of Integer Term Rewriting}},
  booktitle =	{Interaction versus Automation: The two Faces of Deduction},
  pages =	{1--4},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2010},
  volume =	{9411},
  editor =	{Thomas Ball and J\"{u}rgen Giesl and Reiner H\"{a}hnle and Tobias Nipkow},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.09411.5},
  URN =		{urn:nbn:de:0030-drops-24233},
  doi =		{10.4230/DagSemProc.09411.5},
  annote =	{Keywords: Termination analysis, integers, term rewriting, dependency pairs}
}
Document
Assessing an approximation algorithm for the minimum fill-in problem in practice

Authors: H. Martin Bücker, Michael Lülfesmann, and Arno Rasch

Published in: Dagstuhl Seminar Proceedings, Volume 9061, Combinatorial Scientific Computing (2009)


Abstract
We investigate an implementation of an approximation algorithm for the minimum fill-in problem. The algorithm has some degree of freedom since it is composed of several subtasks for which one can choose between different algorithms. The goal of the present work is to study the impact of these components and to carefully examine the practicability of the overall approximation algorithm on a set of numerical examples.
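
For context, the fill-in produced by a fixed elimination ordering can be computed with the classical elimination game, and the minimum fill-in problem asks for an ordering that minimises the number of added edges. The following minimal Python sketch (an illustration, not the implementation studied here) computes the fill-in of a given ordering.

    def fill_in(adj, order):
        """Number of fill edges created by eliminating the vertices of the
        graph `adj` (dict: vertex -> set of neighbours) in the given order.
        Eliminating a vertex turns its remaining neighbourhood into a clique;
        edges added this way are the fill-in of the ordering."""
        adj = {v: set(ns) for v, ns in adj.items()}  # work on a copy
        fill = 0
        for v in order:
            nbrs = list(adj[v])
            for i in range(len(nbrs)):
                for j in range(i + 1, len(nbrs)):
                    a, b = nbrs[i], nbrs[j]
                    if b not in adj[a]:
                        adj[a].add(b)
                        adj[b].add(a)
                        fill += 1
            for u in nbrs:  # remove v from the remaining graph
                adj[u].discard(v)
            del adj[v]
        return fill

    # 4-cycle a-b-c-d: eliminating a first adds the single chord b-d.
    cycle = {'a': {'b', 'd'}, 'b': {'a', 'c'}, 'c': {'b', 'd'}, 'd': {'a', 'c'}}
    print(fill_in(cycle, ['a', 'b', 'c', 'd']))  # 1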

Cite as

H. Martin Bücker, Michael Lülfesmann, and Arno Rasch. Assessing an approximation algorithm for the minimum fill-in problem in practice. In Combinatorial Scientific Computing. Dagstuhl Seminar Proceedings, Volume 9061, p. 1, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2009)


BibTeX

@InProceedings{bucker_et_al:DagSemProc.09061.5,
  author =	{B\"{u}cker, H. Martin and L\"{u}lfesmann, Michael and Rasch, Arno},
  title =	{{Assessing an approximation algorithm for the minimum fill-in problem in practice}},
  booktitle =	{Combinatorial Scientific Computing},
  pages =	{1--1},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2009},
  volume =	{9061},
  editor =	{Uwe Naumann and Olaf Schenk and Horst D. Simon and Sivan Toledo},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.09061.5},
  URN =		{urn:nbn:de:0030-drops-21191},
  doi =		{10.4230/DagSemProc.09061.5},
  annote =	{Keywords: Sparse linear algebra}
}
Document
Towards the Development of an Interval Arithmetic Environment for Validated Computer-Aided Design and Verification of Systems in Control Engineering

Authors: Andreas Rauh, Johanna Minisini, and Eberhard P. Hofer

Published in: Dagstuhl Seminar Proceedings, Volume 8021, Numerical Validation in Current Hardware Architectures (2008)


Abstract
Modern techniques for the design and analysis of control strategies for nonlinear dynamical systems are often based on the simulation of the open-loop as well as the closed-loop dynamical behavior of suitable mathematical models. In control engineering, continuous-time and discrete-time state-space representations are widely used which are given by sets of ordinary differential equations and difference equations, respectively. In addition to these representations, sets of differential algebraic equations are commonly used. Since we will focus on computational techniques which are applied for the design and mathematical verification of controllers for lumped parameter systems, i.e., systems which do not contain elements with distributed parameters, partial differential equations will not be considered in this talk. The prerequisite for the design and robustness analysis of each control system is the identification of mathematical models which describe the dynamics of the plant to be controlled as well as the available measurement devices with sufficient accuracy. The model identification task comprises the derivation of physically motivated state equations, their parameterization based on measured data, as well as simplifications to apply specific approaches for controller design. In the design stage, both open-loop and closed-loop control strategies can be considered. Since dynamical system models are subject to uncertain parameters and uncertain initial conditions in most practical applications, detailed mathematical formulations of the desired dynamics of the controlled system are necessary. These specifications involve the definition of robustness with respect to the above-mentioned uncertainties. For linear system representations, robustness is commonly specified in terms of regions in the complex domain containing all admissible poles of the closed-loop transfer functions (Γ-stability) or in terms of specifications of worst-case bounds for the frequency response (ℬ-stability) [1]. However, these specifications do not allow for the inclusion of bounds for the state variables, which are often available in the time domain if controllers are designed for safety-critical applications. Especially for nonlinear dynamical systems, pole assignment based on the linearization of nonlinear mathematical models generally leads to the necessity for the analysis of asymptotic stability of the resulting closed-loop dynamics. In this presentation, we will give an overview of the potential use of validated techniques for the analysis and design of controllers for nonlinear dynamical systems with uncertainties, where the systems under consideration will be subject to constraints for both state and control variables. As an application scenario, the design of robust control strategies for a biological wastewater treatment process will be discussed. In the design and verification process, constraints for both state and control variables which are given by guaranteed interval bounds in the time domain are taken into account. Suitable computational techniques are, for example, based on an extension of the validated initial value problem solver ValEncIA-IVP [2,6]. For that purpose, differential sensitivities of the trajectories of all state variables with respect to variations of the parameters of the mathematical system model as well as the adaptation of controller parameters are computed.
This information can then be used for online identification and adaptation of parameters during the operation of a closed-loop controller as well as in offline design, verification, and optimization. Here, the interval arithmetic routines for sensitivity analysis allow computing guaranteed differential sensitivity measures for system models with both nominal parameters and interval uncertainties. The presented interval arithmetic techniques are the basis for a general-purpose tool for the analysis and design of robust and optimal control strategies for uncertain dynamical systems. The presentation is concluded with an outlook on the formulation of control problems using sets of differential algebraic equations. Possibilities for the extension of ValEncIA-IVP to this type of system representation will be summarized. Relations between the presented interval arithmetic approach and methods for stabilizing control of nonlinear dynamical systems which make use of structural system properties such as differential flatness [3] and exact feedback linearization [4,5] are highlighted. In the latter case, input-output linearization as well as (in special cases) input-to-state linearization are of practical importance.

References:
[1] J. Ackermann, P. Blue, T. Bünte, L. Güvenc, D. Kaesbauer, M. Kordt, M. Muhler, and D. Odenthal. Robust Control: The Parameter Space Approach. Springer-Verlag, London, 2nd edition, 2002.
[2] E. Auer, A. Rauh, E. P. Hofer, and W. Luther. Validated Modeling of Mechanical Systems with SmartMOBILE: Improvement of Performance by ValEncIA-IVP. In Proceedings of Dagstuhl Seminar 06021: Reliable Implementation of Real Number Algorithms: Theory and Practice, Lecture Notes in Computer Science, Dagstuhl, Germany, 2006. In print.
[3] M. Fliess, J. Lévine, P. Martin, and P. Rouchon. Flatness and Defect of Nonlinear Systems: Introductory Theory and Examples. International Journal of Control, vol. 61, pp. 1327-1361, 1995.
[4] H. K. Khalil. Nonlinear Systems. Prentice-Hall, Upper Saddle River, New Jersey, 3rd edition, 2002.
[5] H. J. Marquez. Nonlinear Control Systems. John Wiley & Sons, Inc., New Jersey, 2003.
[6] A. Rauh and E. Auer, www.valencia-ivp.com.
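
To make the notion of guaranteed interval bounds concrete, the following self-contained Python sketch implements toy interval arithmetic (purely illustrative; it is not part of ValEncIA-IVP and, unlike a rigorous library, it omits outward rounding): each operation returns an interval containing all results obtainable when the arguments range over their input intervals, which is the basic mechanism behind validated enclosures of states and sensitivities.

    class Interval:
        """Toy interval [lo, hi] with lo <= hi.  A rigorous implementation
        would additionally round lo down and hi up (outward rounding)."""

        def __init__(self, lo, hi=None):
            self.lo, self.hi = lo, (lo if hi is None else hi)

        def __add__(self, other):
            return Interval(self.lo + other.lo, self.hi + other.hi)

        def __mul__(self, other):
            p = [self.lo * other.lo, self.lo * other.hi,
                 self.hi * other.lo, self.hi * other.hi]
            return Interval(min(p), max(p))

        def __repr__(self):
            return f"[{self.lo}, {self.hi}]"

    # Enclosure of x * (1 + x) for an uncertain parameter x in [0.9, 1.1]:
    x = Interval(0.9, 1.1)
    print(x * (Interval(1.0) + x))  # an interval containing all values of x*(1+x)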

Cite as

Andreas Rauh, Johanna Minisini, and Eberhard P. Hofer. Towards the Development of an Interval Arithmetic Environment for Validated Computer-Aided Design and Verification of Systems in Control Engineering. In Numerical Validation in Current Hardware Architectures. Dagstuhl Seminar Proceedings, Volume 8021, pp. 1-10, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{rauh_et_al:DagSemProc.08021.21,
  author =	{Rauh, Andreas and Minisini, Johanna and Hofer, Eberhard P.},
  title =	{{Towards the Development of an Interval Arithmetic Environment for Validated Computer-Aided Design and Verification of Systems in Control Engineering}},
  booktitle =	{Numerical Validation in Current Hardware Architectures},
  pages =	{1--10},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2008},
  volume =	{8021},
  editor =	{Annie Cuyt and Walter Kr\"{a}mer and Wolfram Luther and Peter Markstein},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.08021.21},
  URN =		{urn:nbn:de:0030-drops-14529},
  doi =		{10.4230/DagSemProc.08021.21},
  annote =	{Keywords: Interval techniques, ValEncIA-IVP, controller design, robustness, validated integration of ODEs, parameter uncertainties, sensitivity analysis}
}
