18 Search Results for "Winter, Andreas"


Document
Public-Key Pseudoentanglement and the Hardness of Learning Ground State Entanglement Structure

Authors: Adam Bouland, Bill Fefferman, Soumik Ghosh, Tony Metger, Umesh Vazirani, Chenyi Zhang, and Zixin Zhou

Published in: LIPIcs, Volume 300, 39th Computational Complexity Conference (CCC 2024)


Abstract
Given a local Hamiltonian, how difficult is it to determine the entanglement structure of its ground state? We show that this problem is computationally intractable even if one is only trying to decide if the ground state is volume-law vs near area-law entangled. We prove this by constructing strong forms of pseudoentanglement in a public-key setting, where the circuits used to prepare the states are public knowledge. In particular, we construct two families of quantum circuits which produce volume-law vs near area-law entangled states, but nonetheless the classical descriptions of the circuits are indistinguishable under the Learning with Errors (LWE) assumption. Indistinguishability of the circuits then allows us to translate our construction to Hamiltonians. Our work opens new directions in Hamiltonian complexity, for example whether it is difficult to learn certain phases of matter.

Cite as

Adam Bouland, Bill Fefferman, Soumik Ghosh, Tony Metger, Umesh Vazirani, Chenyi Zhang, and Zixin Zhou. Public-Key Pseudoentanglement and the Hardness of Learning Ground State Entanglement Structure. In 39th Computational Complexity Conference (CCC 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 300, pp. 21:1-21:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)



@InProceedings{bouland_et_al:LIPIcs.CCC.2024.21,
  author =	{Bouland, Adam and Fefferman, Bill and Ghosh, Soumik and Metger, Tony and Vazirani, Umesh and Zhang, Chenyi and Zhou, Zixin},
  title =	{{Public-Key Pseudoentanglement and the Hardness of Learning Ground State Entanglement Structure}},
  booktitle =	{39th Computational Complexity Conference (CCC 2024)},
  pages =	{21:1--21:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-331-7},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{300},
  editor =	{Santhanam, Rahul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2024.21},
  URN =		{urn:nbn:de:0030-drops-204175},
  doi =		{10.4230/LIPIcs.CCC.2024.21},
  annote =	{Keywords: Quantum computing, Quantum complexity theory, entanglement}
}
Track A: Algorithms, Complexity and Games
Subexponential Parameterized Directed Steiner Network Problems on Planar Graphs: A Complete Classification

Authors: Esther Galby, Sándor Kisfaludi-Bak, Dániel Marx, and Roohani Sharma

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
In the Directed Steiner Network problem, the input is a directed graph G, a set T ⊆ V(G) of k terminals, and a demand graph D on T. The task is to find a subgraph H ⊆ G with the minimum number of edges such that for every (s,t) ∈ E(D), the solution H contains a directed s → t path. The goal of this paper is to investigate how the complexity of the problem depends on the demand pattern in planar graphs. Formally, if 𝒟 is a class of directed graphs, then the 𝒟-Steiner Network (𝒟-DSN) problem is the special case where the demand graph D is restricted to be from 𝒟. We give a complete characterization of the behavior of every 𝒟-DSN problem on planar graphs. We classify every class 𝒟 closed under transitive equivalence and identification of vertices into three cases: assuming ETH, either the problem is 1) solvable in time 2^O(k)⋅n^O(1), i.e., FPT parameterized by the number k of terminals, but not solvable in time 2^o(k)⋅n^O(1), 2) solvable in time f(k)⋅n^O(√k), but cannot be solved in time f(k)⋅n^o(√k), or 3) solvable in time f(k)⋅n^O(k), but cannot be solved in time f(k)⋅n^o(k). Our result is a far-reaching generalization and unification of earlier results on Directed Steiner Tree, Directed Steiner Network, and Strongly Connected Steiner Subgraph on planar graphs. As an important step of our lower bound proof, we discover a rare example of a genuinely planar problem (i.e., described by a planar graph and two sets of vertices) that cannot be solved in time f(k)⋅n^o(k): given two sets of terminals S and T with |S|+|T| = k, find a subgraph with minimum number of edges such that every vertex of T is reachable from every vertex of S.

Cite as

Esther Galby, Sándor Kisfaludi-Bak, Dániel Marx, and Roohani Sharma. Subexponential Parameterized Directed Steiner Network Problems on Planar Graphs: A Complete Classification. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 67:1-67:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)



@InProceedings{galby_et_al:LIPIcs.ICALP.2024.67,
  author =	{Galby, Esther and Kisfaludi-Bak, S\'{a}ndor and Marx, D\'{a}niel and Sharma, Roohani},
  title =	{{Subexponential Parameterized Directed Steiner Network Problems on Planar Graphs: A Complete Classification}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{67:1--67:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.67},
  URN =		{urn:nbn:de:0030-drops-202104},
  doi =		{10.4230/LIPIcs.ICALP.2024.67},
  annote =	{Keywords: Directed Steiner Network, Sub-exponential algorithm}
}
Short Paper
Flexible Patterns of Place for Function-based Search of Space (Short Paper)

Authors: Emmanuel Papadakis, Andreas Petutschnig, and Thomas Blaschke

Published in: LIPIcs, Volume 114, 10th International Conference on Geographic Information Science (GIScience 2018)


Abstract
Place is a human interpretation of space; it augments the latter with information related to human activities, services, emotions and so forth. Searching for places, rather than performing traditional space-based search, presents significant challenges. The most prevalent method of addressing place-related queries is based on placenames, but it has limited potential due to the vagueness of natural language and its tendency to lead to ambiguous interpretations. In previous work we proposed a system-oriented formalization of place that goes beyond placenames by introducing composition patterns of place. In this study, we introduce flexibility into these patterns in terms of what is necessarily or possibly included when describing the spatial composition of a place, and we propose a novel automated process for extracting these patterns that relies on both theoretical and empirical knowledge. The proposed methodology is exemplified through the use case of locating all the shopping areas within London, UK.

Cite as

Emmanuel Papadakis, Andreas Petutschnig, and Thomas Blaschke. Flexible Patterns of Place for Function-based Search of Space (Short Paper). In 10th International Conference on Geographic Information Science (GIScience 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 114, pp. 54:1-54:7, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)



@InProceedings{papadakis_et_al:LIPIcs.GISCIENCE.2018.54,
  author =	{Papadakis, Emmanuel and Petutschnig, Andreas and Blaschke, Thomas},
  title =	{{Flexible Patterns of Place for Function-based Search of Space}},
  booktitle =	{10th International Conference on Geographic Information Science (GIScience 2018)},
  pages =	{54:1--54:7},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-083-5},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{114},
  editor =	{Winter, Stephan and Griffin, Amy and Sester, Monika},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.GISCIENCE.2018.54},
  URN =		{urn:nbn:de:0030-drops-93825},
  doi =		{10.4230/LIPIcs.GISCIENCE.2018.54},
  annote =	{Keywords: Functions, Place, Patterns, Function-based search, Place-based GIS}
}
Quantum Enhancement of Randomness Distribution

Authors: Raul Garcia-Patron, William Matthews, and Andreas Winter

Published in: LIPIcs, Volume 44, 10th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2015)


Abstract
The capability of a given channel to transmit information is, a priori, distinct from its capability to distribute random correlations. Despite that, for classical channels, the capacity to distribute information and randomness turns out to be the same, even with the assistance of auxiliary communication. In this work we show that this is no longer true for quantum channels when feedback is allowed. We prove this by constructing a channel that is noisy for the transmission of information but behaves as a virtual noiseless channel for randomness distribution when assisted by feedback communication. Our result can be seen as a way of unlocking quantum randomness internal to the channel.

Cite as

Raul Garcia-Patron, William Matthews, and Andreas Winter. Quantum Enhancement of Randomness Distribution. In 10th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 44, pp. 180-190, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)



@InProceedings{garciapatron_et_al:LIPIcs.TQC.2015.180,
  author =	{Garcia-Patron, Raul and Matthews, William and Winter, Andreas},
  title =	{{Quantum Enhancement of Randomness Distribution}},
  booktitle =	{10th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2015)},
  pages =	{180--190},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-96-5},
  ISSN =	{1868-8969},
  year =	{2015},
  volume =	{44},
  editor =	{Beigi, Salman and K\"{o}nig, Robert},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.TQC.2015.180},
  URN =		{urn:nbn:de:0030-drops-55567},
  doi =		{10.4230/LIPIcs.TQC.2015.180},
  annote =	{Keywords: Quantum Shannon theory, noisy channels, capacity, randomness}
}
Implementing Unitary 2-Designs Using Random Diagonal-unitary Matrices

Authors: Yoshifumi Nakata, Christoph Hirche, Ciara Morgan, and Andreas Winter

Published in: LIPIcs, Volume 44, 10th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2015)


Abstract
Unitary 2-designs are random unitary matrices which, in contrast to their Haar-distributed counterparts, have been shown to be efficiently realized by quantum circuits. Most notably, unitary 2-designs are known to achieve decoupling, a fundamental primitive of paramount importance in quantum Shannon theory. Here we prove that unitary 2-designs can be implemented approximately using random diagonal-unitaries.
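For context, the defining property of an exact unitary 2-design referenced in this abstract can be stated as follows; this is the standard textbook definition, not a formula taken from the paper itself:

```latex
% A probability measure \nu on the unitary group U(d) is an exact unitary
% 2-design if, for every operator X on (\mathbb{C}^d)^{\otimes 2},
\mathbb{E}_{U \sim \nu}\!\left[ U^{\otimes 2} X \, (U^\dagger)^{\otimes 2} \right]
  \;=\; \int_{U(d)} U^{\otimes 2} X \, (U^\dagger)^{\otimes 2} \, \mathrm{d}U ,
% where the integral on the right is with respect to the Haar measure.
% An approximate 2-design satisfies this equality up to a small error
% in a suitable norm, which is the sense relevant to the paper's result.
```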

Cite as

Yoshifumi Nakata, Christoph Hirche, Ciara Morgan, and Andreas Winter. Implementing Unitary 2-Designs Using Random Diagonal-unitary Matrices. In 10th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 44, pp. 191-205, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)



@InProceedings{nakata_et_al:LIPIcs.TQC.2015.191,
  author =	{Nakata, Yoshifumi and Hirche, Christoph and Morgan, Ciara and Winter, Andreas},
  title =	{{Implementing Unitary 2-Designs Using Random Diagonal-unitary Matrices}},
  booktitle =	{10th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2015)},
  pages =	{191--205},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-96-5},
  ISSN =	{1868-8969},
  year =	{2015},
  volume =	{44},
  editor =	{Beigi, Salman and K\"{o}nig, Robert},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.TQC.2015.191},
  URN =		{urn:nbn:de:0030-drops-55570},
  doi =		{10.4230/LIPIcs.TQC.2015.191},
  annote =	{Keywords: unitary 2-designs, commuting quantum circuits}
}
Bounds on Entanglement Assisted Source-channel Coding Via the Lovász Theta Number and Its Variants

Authors: Toby Cubitt, Laura Mancinska, David Roberson, Simone Severini, Dan Stahlke, and Andreas Winter

Published in: LIPIcs, Volume 27, 9th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2014)


Abstract
We study zero-error entanglement assisted source-channel coding (communication in the presence of side information). Adapting a technique of Beigi, we show that such coding requires existence of a set of vectors satisfying orthogonality conditions related to suitably defined graphs G and H. Such vectors exist if and only if theta(G) <= theta(H) where theta represents the Lovász number. We also obtain similar inequalities for the related Schrijver theta^- and Szegedy theta^+ numbers. These inequalities reproduce several known bounds and also lead to new results. We provide a lower bound on the entanglement assisted cost rate. We show that the entanglement assisted independence number is bounded by the Schrijver number: alpha^*(G) <= theta^-(G). Therefore, we are able to disprove the conjecture that the one-shot entanglement-assisted zero-error capacity is equal to the integer part of the Lovász number. Beigi introduced a quantity beta as an upper bound on alpha^* and posed the question of whether beta(G) = \lfloor theta(G) \rfloor. We answer this in the affirmative and show that a related quantity is equal to \lceil theta(G) \rceil. We show that a quantity chi_{vect}(G) recently introduced in the context of Tsirelson's conjecture is equal to \lceil theta^+(G) \rceil.
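For reference, the Lovász number used throughout this abstract admits the following standard semidefinite-programming characterization (textbook material, not specific to this paper):

```latex
% One standard SDP formulation of the Lov\'asz number of a graph G:
\vartheta(G) \;=\; \max \Big\{ \textstyle\sum_{i,j} B_{ij} \;:\;
  B \succeq 0,\ \operatorname{Tr} B = 1,\ B_{ij} = 0
  \text{ for all } \{i,j\} \in E(G) \Big\},
% which satisfies the sandwich theorem
\alpha(G) \;\le\; \vartheta(G) \;\le\; \chi(\bar{G}),
% where \alpha is the independence number and \chi(\bar{G}) the chromatic
% number of the complement graph. Schrijver's \vartheta^- and Szegedy's
% \vartheta^+ arise by tightening and relaxing this SDP, respectively.
```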

Cite as

Toby Cubitt, Laura Mancinska, David Roberson, Simone Severini, Dan Stahlke, and Andreas Winter. Bounds on Entanglement Assisted Source-channel Coding Via the Lovász Theta Number and Its Variants. In 9th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2014). Leibniz International Proceedings in Informatics (LIPIcs), Volume 27, pp. 48-51, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2014)



@InProceedings{cubitt_et_al:LIPIcs.TQC.2014.48,
  author =	{Cubitt, Toby and Mancinska, Laura and Roberson, David and Severini, Simone and Stahlke, Dan and Winter, Andreas},
  title =	{{Bounds on Entanglement Assisted Source-channel Coding Via the Lov\'{a}sz Theta Number and Its Variants}},
  booktitle =	{9th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2014)},
  pages =	{48--51},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-73-6},
  ISSN =	{1868-8969},
  year =	{2014},
  volume =	{27},
  editor =	{Flammia, Steven T. and Harrow, Aram W.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.TQC.2014.48},
  URN =		{urn:nbn:de:0030-drops-48054},
  doi =		{10.4230/LIPIcs.TQC.2014.48},
  annote =	{Keywords: source-channel coding, zero-error capacity, Lov\'{a}sz theta}
}
Strong Converse for the Quantum Capacity of the Erasure Channel for Almost All Codes

Authors: Mark M. Wilde and Andreas Winter

Published in: LIPIcs, Volume 27, 9th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2014)


Abstract
A strong converse theorem for channel capacity establishes that the error probability in any communication scheme for a given channel necessarily tends to one if the rate of communication exceeds the channel's capacity. Establishing such a theorem for the quantum capacity of degradable channels has been an elusive task, with the strongest progress so far being a so-called "pretty strong converse." In that work, Morgan and Winter proved that the quantum error of any quantum communication scheme for a given degradable channel converges to a value larger than 1/sqrt(2) in the limit of many channel uses if the quantum rate of communication exceeds the channel's quantum capacity. The present paper establishes a theorem that is a counterpart to this "pretty strong converse." We prove that a large fraction of codes with a rate exceeding the erasure channel's quantum capacity have a quantum error tending to one in the limit of many channel uses. Thus, our work adds to the body of evidence that a fully strong converse theorem should hold for the quantum capacity of the erasure channel. As a side result, we prove that the classical capacity of the quantum erasure channel obeys the strong converse property.

Cite as

Mark M. Wilde and Andreas Winter. Strong Converse for the Quantum Capacity of the Erasure Channel for Almost All Codes. In 9th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2014). Leibniz International Proceedings in Informatics (LIPIcs), Volume 27, pp. 52-66, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2014)



@InProceedings{wilde_et_al:LIPIcs.TQC.2014.52,
  author =	{Wilde, Mark M. and Winter, Andreas},
  title =	{{Strong Converse for the Quantum Capacity of the Erasure Channel for Almost All Codes}},
  booktitle =	{9th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2014)},
  pages =	{52--66},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-73-6},
  ISSN =	{1868-8969},
  year =	{2014},
  volume =	{27},
  editor =	{Flammia, Steven T. and Harrow, Aram W.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.TQC.2014.52},
  URN =		{urn:nbn:de:0030-drops-48068},
  doi =		{10.4230/LIPIcs.TQC.2014.52},
  annote =	{Keywords: strong converse, quantum erasure channel, quantum capacity}
}
Quantum Learning of Classical Stochastic Processes: The Completely-Positive Realization Problem

Authors: Alex Monras and Andreas Winter

Published in: LIPIcs, Volume 27, 9th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2014)


Abstract
Among the several tasks in machine learning is the problem of inferring the latent variables of a system and their causal relations with the observed behavior. A paradigmatic instance of such a problem is the task of inferring the Hidden Markov Model underlying a given stochastic process. This is known as the positive realization problem (PRP) [Benvenuti,Farina(2004)] and constitutes a central problem in machine learning. The PRP and its solutions have far-reaching consequences in many areas of systems and control theory, and the PRP is nowadays an important piece of the broad field of positive systems theory [Luenberger(1979)]. We consider the scenario where the latent variables are quantum (e.g., quantum states of a finite-dimensional system), and the system dynamics is constrained only by physical transformations on the quantum system. The observable dynamics is then described by a quantum instrument, and the task is to determine which quantum instrument, if any, yields the process at hand by iterative application. We take as a starting point the theory of quasi-realizations, in which the dynamics of the process is described in terms of linear maps on state vectors and probabilities are given by linear functionals on the state vectors. This description, despite its remarkable resemblance to the Hidden Markov Model, or the iterated quantum instrument, is however devoid of any stochastic or quantum mechanical interpretation, as said maps fail to satisfy any positivity conditions. The Completely-Positive realization problem then consists in determining whether an equivalent quantum mechanical description of the same process exists. We generalize some key results of stochastic realization theory, and show that the problem has deep connections with operator systems theory, giving possible insight into the lifting problem in quotient operator systems.
Our results have potential applications in quantum machine learning, device-independent characterization and reverse-engineering of stochastic processes and quantum processors, and more generally, of dynamical processes with quantum memory [Guta(2011), Guta&Yamamoto(2013)].
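The quasi-realization picture described above has a standard closed form in classical stochastic realization theory, shown here for orientation; the notation is generic, not the paper's own:

```latex
% A quasi-realization of a stationary process over alphabet \Sigma assigns
% to each symbol x a matrix A_x, together with vectors b and c, such that
p(x_1 x_2 \cdots x_n) \;=\; c^{\top} A_{x_n} \cdots A_{x_2} A_{x_1} \, b .
% The positive realization problem asks when the A_x, b, c can be chosen
% entrywise nonnegative (yielding a hidden Markov model); the
% completely-positive version studied in the paper asks instead for
% completely positive maps on a quantum system, i.e., an iterated
% quantum instrument.
```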

Cite as

Alex Monras and Andreas Winter. Quantum Learning of Classical Stochastic Processes: The Completely-Positive Realization Problem. In 9th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2014). Leibniz International Proceedings in Informatics (LIPIcs), Volume 27, pp. 99-109, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2014)



@InProceedings{monras_et_al:LIPIcs.TQC.2014.99,
  author =	{Monras, Alex and Winter, Andreas},
  title =	{{Quantum Learning of Classical Stochastic Processes: The Completely-Positive Realization Problem}},
  booktitle =	{9th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2014)},
  pages =	{99--109},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-73-6},
  ISSN =	{1868-8969},
  year =	{2014},
  volume =	{27},
  editor =	{Flammia, Steven T. and Harrow, Aram W.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.TQC.2014.99},
  URN =		{urn:nbn:de:0030-drops-48100},
  doi =		{10.4230/LIPIcs.TQC.2014.99},
  annote =	{Keywords: quantum instrument, hidden Markov model, machine learning, quantum measurement}
}
The Quantum Entropy Cone of Stabiliser States

Authors: Noah Linden, Frantisek Matus, Mary Beth Ruskai, and Andreas Winter

Published in: LIPIcs, Volume 22, 8th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2013)


Abstract
We investigate the universal linear inequalities that hold for the von Neumann entropies in a multi-party system prepared in a stabiliser state. We demonstrate here that entropy vectors for stabiliser states satisfy, in addition to the classic inequalities, a family of linear rank inequalities associated with the combinatorial structure of normal subgroups of certain matrix groups. In the 4-party case, there is only one such inequality, the so-called Ingleton inequality. For these systems we show that strong subadditivity, weak monotonicity and the Ingleton inequality exactly characterize the entropy cone for stabiliser states.
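For concreteness, the Ingleton inequality mentioned in the abstract is commonly stated in entropy form as follows (a standard statement, with S the von Neumann entropy and I the mutual information, not a formula quoted from the paper):

```latex
% Ingleton inequality for four parties A, B, C, D, mutual-information form:
I(A\!:\!B) \;\le\; I(A\!:\!B\,|\,C) \;+\; I(A\!:\!B\,|\,D) \;+\; I(C\!:\!D),
% where I(A:B)   = S(A)  + S(B)  - S(AB)  and
%       I(A:B|C) = S(AC) + S(BC) - S(ABC) - S(C).
% It holds for ranks of subspaces (and hence for stabiliser states) but
% is violated by some general entropy vectors.
```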

Cite as

Noah Linden, Frantisek Matus, Mary Beth Ruskai, and Andreas Winter. The Quantum Entropy Cone of Stabiliser States. In 8th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 22, pp. 270-284, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)



@InProceedings{linden_et_al:LIPIcs.TQC.2013.270,
  author =	{Linden, Noah and Matus, Frantisek and Ruskai, Mary Beth and Winter, Andreas},
  title =	{{The Quantum Entropy Cone of Stabiliser States}},
  booktitle =	{8th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2013)},
  pages =	{270--284},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-55-2},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{22},
  editor =	{Severini, Simone and Brandao, Fernando},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.TQC.2013.270},
  URN =		{urn:nbn:de:0030-drops-43278},
  doi =		{10.4230/LIPIcs.TQC.2013.270},
  annote =	{Keywords: Entropy inequalities, Stabiliser states, Ingleton inequality}
}
Megamodelling and Etymology

Authors: Jean-Marie Favre

Published in: Dagstuhl Seminar Proceedings, Volume 5161, Transformation Techniques in Software Engineering (2006)


Abstract
Is a model of a model a metamodel? Is the relational model a metamodel? Is it a model? What is a component metamodel? Is it a model of a component model? The word MODEL is subject to a lot of debate in Model Driven Engineering. Add the notion of metamodel on top of it and you will just enter what some people call the Meta-muddle. Recently, megamodels have been proposed to avoid the Meta-muddle. This approach is very promising, but it does not, however, solve the primary problem: even a simple use of the word Model can lead to misunderstanding and confusion. This paper tackles the problem at its very source: the polysemic nature of the word MODEL. The evolution and semantic variations of the word MODEL are modelled from many different perspectives. This paper tells how the prefix MED in Indo-European has led, five millennia later, to the acronym MDE, and this via the word MODEL. Based on an extensive study of encyclopedias, dictionaries, thesauri, and etymological sources, it is shown that the many senses of the word MODEL can be clustered into four groups, namely model-as-representation, model-as-example, model-as-type, and model-as-mold. All these groups are fundamental to understanding the real nature of Model Driven Engineering. Megamodels and etymology are indeed keys to avoiding the Meta-muddle.

Cite as

Jean-Marie Favre. Megamodelling and Etymology. In Transformation Techniques in Software Engineering. Dagstuhl Seminar Proceedings, Volume 5161, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)



@InProceedings{favre:DagSemProc.05161.6,
  author =	{Favre, Jean-Marie},
  title =	{{Megamodelling and Etymology}},
  booktitle =	{Transformation Techniques in Software Engineering},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{5161},
  editor =	{Cordy, James R. and L\"{a}mmel, Ralf and Winter, Andreas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.05161.6},
  URN =		{urn:nbn:de:0030-drops-4276},
  doi =		{10.4230/DagSemProc.05161.6},
  annote =	{Keywords: MDE, MDD, MDA, Model Driven Architecture, Model, Metamodel, Etymology, Definition, Taxonomy}
}
05161 Executive Summary – Transformation Techniques in Software Engineering

Authors: James R. Cordy, Ralf Lämmel, and Andreas Winter

Published in: Dagstuhl Seminar Proceedings, Volume 5161, Transformation Techniques in Software Engineering (2006)


Abstract
TrafoDagstuhl brought together representatives of the research communities in re-engineering, XML processing, model-driven architecture and other areas of software engineering that involve grammar- or schema-driven transformations. These various existing fields and application contexts involve widely varying transformation techniques – the tradeoffs of which are worth analysing. This seminar initiated a process of understanding each other's transformation techniques – their use cases, corresponding methods, tool support, best practices, and open problems. This process makes it possible to exchange knowledge and experience between these various communities. This effort should also help in transposing transformation concepts from established application fields to new ones. This executive summary reports on the conception of the seminar, the program, outcomes and future work. Most of the material from the seminar (including abstracts of all talks) as well as additional papers can be found on the dedicated web site: http://www.dagstuhl.de/05161/

Cite as

James R. Cordy, Ralf Lämmel, and Andreas Winter. 05161 Executive Summary – Transformation Techniques in Software Engineering. In Transformation Techniques in Software Engineering. Dagstuhl Seminar Proceedings, Volume 5161, pp. 1-24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)



@InProceedings{cordy_et_al:DagSemProc.05161.1,
  author =	{Cordy, James R. and L\"{a}mmel, Ralf and Winter, Andreas},
  title =	{{05161 Executive Summary – Transformation Techniques in Software Engineering}},
  booktitle =	{Transformation Techniques in Software Engineering},
  pages =	{1--24},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{5161},
  editor =	{Cordy, James R. and L\"{a}mmel, Ralf and Winter, Andreas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.05161.1},
  URN =		{urn:nbn:de:0030-drops-4978},
  doi =		{10.4230/DagSemProc.05161.1},
  annote =	{Keywords: Program transformation, transformational programming, generative programming, generative language technology, automated software testing, engineering of metamodels, engineering for XML schemas, engineering of data models}
}
CAViT: a Consistency Maintenance Framework based on Transformation Contracts

Authors: Pieter Van Gorp and Dirk Janssens

Published in: Dagstuhl Seminar Proceedings, Volume 5161, Transformation Techniques in Software Engineering (2006)


Abstract
Design by contract is a software correctness methodology for procedural and object-oriented software. It relies on logical assertions to detect implementation mistakes at run-time or to prove the absence thereof at compile-time. Design by contract has found a new application in model driven engineering, a methodology that aims to manage the complexity of frameworks by relying on models and transformations. A "transformation contract" is a pair of constraints that together describe the effect of a transformation rule on the set of models contained in its transformation definition: the postcondition describes the model consistency state that the rule can establish provided that its precondition is satisfied. A transformation contract of a rule can be maintained automatically by calling the rule (1) as soon as the invariant corresponding to its postcondition is violated and (2) provided that its precondition is satisfied. Domain-specific visual languages can facilitate the implementation of the actual transformation rules, since they hide the complexity of graph transformation algorithms and standards for tool interoperability. In this talk, we describe CAViT: a framework that integrates a visual model transformation tool with a design by contract tool by relying on OMG standards such as UML, OCL and MOF.
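The rule-firing discipline described in this abstract (call a rule when the invariant corresponding to its postcondition is violated and its precondition holds) can be sketched in a few lines. This is a minimal illustration with invented names (`Rule`, `maintain`, the toy model), not CAViT's actual API:

```python
# Minimal sketch of a "transformation contract": a rule carries a
# precondition and a postcondition, and is fired automatically to
# re-establish the postcondition whenever it is violated.

class Rule:
    def __init__(self, pre, post, action):
        self.pre = pre        # precondition: model -> bool
        self.post = post      # postcondition/invariant: model -> bool
        self.action = action  # the transformation itself

    def maintain(self, model):
        # Fire the rule only when (1) its postcondition invariant is
        # violated and (2) its precondition is satisfied.
        if not self.post(model) and self.pre(model):
            self.action(model)
        return self.post(model)

# Toy consistency constraint: every class must have a table counterpart.
model = {"classes": {"Order"}, "tables": set()}
rule = Rule(
    pre=lambda m: True,
    post=lambda m: m["classes"] <= m["tables"],
    action=lambda m: m["tables"].update(m["classes"]),
)
rule.maintain(model)  # fires once and re-establishes the invariant
```

Calling `maintain` again is a no-op, since the postcondition now holds; in CAViT this loop is driven by the design-by-contract tool rather than by explicit calls.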

Cite as

Pieter Van Gorp and Dirk Janssens. CAViT: a Consistency Maintenance Framework based on Transformation Contracts. In Transformation Techniques in Software Engineering. Dagstuhl Seminar Proceedings, Volume 5161, pp. 1-27, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)



@InProceedings{vangorp_et_al:DagSemProc.05161.2,
  author =	{Van Gorp, Pieter and Janssens, Dirk},
  title =	{{CAViT: a Consistency Maintenance Framework based on Transformation Contracts}},
  booktitle =	{Transformation Techniques in Software Engineering},
  pages =	{1--27},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{5161},
  editor =	{James R. Cordy and Ralf L\"{a}mmel and Andreas Winter},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.05161.2},
  URN =		{urn:nbn:de:0030-drops-4294},
  doi =		{10.4230/DagSemProc.05161.2},
  annote =	{Keywords: Consistency maintenance, metamodeling, transformation, graph rewriting, UML, OCL, MOF}
}
Document
Experiences in Teaching Program Transformation for Software Reengineering

Authors: Mohammad El-Ramly

Published in: Dagstuhl Seminar Proceedings, Volume 5161, Transformation Techniques in Software Engineering (2006)


Abstract
Little attention is given to teaching the theory and practice of software evolution and change in software engineering curricula. Program transformation is no exception. This paper presents the author's experience in teaching program transformation as a unit in a postgraduate module on software systems reengineering. It describes the teaching context of this unit and two different offerings of it, one using the Turing eXtender Language (TXL) and the other using the Legacy Computer Aided Reengineering Environment (Legacy-CARE, or L-CARE) from ATX Software. From this experience, it was found that selecting suitable material (balancing theory and practice) and the right tool(s) for the students' level and the required depth of coverage is a non-trivial task. It was also found that teaching with toy exercises and assignments does not convey the practical aspects of the subject well, while teaching with real exercises and assignments, even small ones, is almost infeasible; finding the right balance is very important but not easy. Finally, students' understanding and appreciation of program transformation increases when they are presented with real industrial case studies.

Cite as

Mohammad El-Ramly. Experiences in Teaching Program Transformation for Software Reengineering. In Transformation Techniques in Software Engineering. Dagstuhl Seminar Proceedings, Volume 5161, pp. 1-6, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


Copy BibTex To Clipboard

@InProceedings{elramly:DagSemProc.05161.3,
  author =	{El-Ramly, Mohammad},
  title =	{{Experiences in Teaching Program Transformation for Software Reengineering}},
  booktitle =	{Transformation Techniques in Software Engineering},
  pages =	{1--6},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{5161},
  editor =	{James R. Cordy and Ralf L\"{a}mmel and Andreas Winter},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.05161.3},
  URN =		{urn:nbn:de:0030-drops-4230},
  doi =		{10.4230/DagSemProc.05161.3},
  annote =	{Keywords: Teaching Program Transformation, Reengineering, Source to Source Transformation, Software Engineering Education}
}
Document
g4re: Harnessing GCC to Reverse Engineer C++ Applications

Authors: Nicholas A. Kraft, Brian A. Malloy, and James F. Power

Published in: Dagstuhl Seminar Proceedings, Volume 5161, Transformation Techniques in Software Engineering (2006)


Abstract
In this paper, we describe g4re, our tool chain that exploits GENERIC, an intermediate format incorporated into the gcc C++ compiler, to facilitate analysis of real C++ applications. The gcc GENERIC representation is available through a file generated for each translation unit (TU), and g4re reads each TU file and constructs a corresponding Abstract Semantic Graph (ASG). Since TU files can be prohibitively large, ranging from 11 megabytes for a "hello world" program to 18 gigabytes for a version of Mozilla Thunderbird, we describe our approach for reducing the size of the generated ASG.
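One generic way to shrink such a graph, sketched below in Python, is reachability pruning: keep only nodes reachable from a set of root declarations (for example, those defined in the user's own translation unit), discarding the bulk of header-induced nodes. This is a hypothetical illustration of the idea, not g4re's actual reduction algorithm, and all names are invented.

```python
# Hypothetical sketch of ASG size reduction by reachability pruning
# (illustration only; g4re's actual approach may differ).
from collections import deque

def prune_asg(nodes, edges, roots):
    """Return the subgraph of `nodes`/`edges` reachable from `roots`."""
    succ = {}
    for src, dst in edges:
        succ.setdefault(src, []).append(dst)
    keep, queue = set(roots), deque(roots)
    while queue:                      # breadth-first reachability
        n = queue.popleft()
        for m in succ.get(n, []):
            if m not in keep:
                keep.add(m)
                queue.append(m)
    kept_nodes = {n: nodes[n] for n in keep}
    kept_edges = [e for e in edges if e[0] in keep and e[1] in keep]
    return kept_nodes, kept_edges

# Example: node 3 is never reached from main's declaration, so it is dropped.
nodes = {1: "main", 2: "helper", 3: "unused_header_decl"}
edges = [(1, 2)]
small_nodes, small_edges = prune_asg(nodes, edges, roots=[1])
```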

Cite as

Nicholas A. Kraft, Brian A. Malloy, and James F. Power. g4re: Harnessing GCC to Reverse Engineer C++ Applications. In Transformation Techniques in Software Engineering. Dagstuhl Seminar Proceedings, Volume 5161, pp. 1-11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


Copy BibTex To Clipboard

@InProceedings{kraft_et_al:DagSemProc.05161.4,
  author =	{Kraft, Nicholas A. and Malloy, Brian A. and Power, James F.},
  title =	{{g4re: Harnessing GCC to Reverse Engineer C++ Applications}},
  booktitle =	{Transformation Techniques in Software Engineering},
  pages =	{1--11},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{5161},
  editor =	{James R. Cordy and Ralf L\"{a}mmel and Andreas Winter},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.05161.4},
  URN =		{urn:nbn:de:0030-drops-4244},
  doi =		{10.4230/DagSemProc.05161.4},
  annote =	{Keywords: Reverse engineering, schema, GXL}
}
Document
How to make a bridge between transformation and analysis technologies?

Authors: Jurgen Vinju and James R. Cordy

Published in: Dagstuhl Seminar Proceedings, Volume 5161, Transformation Techniques in Software Engineering (2006)


Abstract
At the Dagstuhl seminar on Transformation Techniques in Software Engineering we held an organized discussion on the intricacies of engineering practical connections between software analysis and software transformation tools. This abstract summarizes it. The discussion contributes mainly by explicitly focusing on this subject from a general perspective and by providing a first sketch of a domain analysis. First we discuss the solution space in general, and then we compare the merits of two entirely different designs: the monolithic versus the heterogeneous approach.

Cite as

Jurgen Vinju and James R. Cordy. How to make a bridge between transformation and analysis technologies? In Transformation Techniques in Software Engineering. Dagstuhl Seminar Proceedings, Volume 5161, pp. 1-7, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


Copy BibTex To Clipboard

@InProceedings{vinju_et_al:DagSemProc.05161.5,
  author =	{Vinju, Jurgen and Cordy, James R.},
  title =	{{How to make a bridge between transformation and analysis technologies?}},
  booktitle =	{Transformation Techniques in Software Engineering},
  pages =	{1--7},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{5161},
  editor =	{James R. Cordy and Ralf L\"{a}mmel and Andreas Winter},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.05161.5},
  URN =		{urn:nbn:de:0030-drops-4265},
  doi =		{10.4230/DagSemProc.05161.5},
  annote =	{Keywords: Transformation, analysis, fact extraction, middleware, source code representations}
}
  • Refine by Author
  • 7 Winter, Andreas
  • 2 Cordy, James R.
  • 1 Blaschke, Thomas
  • 1 Bouland, Adam
  • 1 Cubitt, Toby
  • Show More...

  • Refine by Classification
  • 1 Information systems → Geographic information systems
  • 1 Theory of computation → Fixed parameter tractability
  • 1 Theory of computation → Graph algorithms analysis
  • 1 Theory of computation → Modal and temporal logics
  • 1 Theory of computation → Pseudorandomness and derandomization
  • Show More...

  • Refine by Keyword
  • 2 Metamodel
  • 1 Consistency maintenance
  • 1 Datamodel transformation
  • 1 Definition
  • 1 Directed Steiner Network
  • Show More...

  • Refine by Type
  • 18 document

  • Refine by Publication Year
  • 9 2006
  • 3 2014
  • 2 2015
  • 2 2024
  • 1 2013
  • Show More...