Search Results

Documents by authors named Nguyen



Nguyen, An

Document
Multimedia Contribution
Zapping Zika with a Mosquito-Managing Drone: Computing Optimal Flight Patterns with Minimum Turn Cost (Multimedia Contribution)

Authors: Aaron T. Becker, Mustapha Debboun, Sándor P. Fekete, Dominik Krupke, and An Nguyen

Published in: LIPIcs, Volume 77, 33rd International Symposium on Computational Geometry (SoCG 2017)


Abstract
We present results arising from the problem of sweeping a mosquito-infested area with an Unmanned Aerial Vehicle (UAV) equipped with an electrified metal grid. This is related to the Traveling Salesman Problem, the Lawn Mower Problem and, most closely, Milling with Turn Cost. Planning a good trajectory can be reduced to considering penalty and budget variants of covering a grid graph with minimum turn cost. On the theoretical side, we solve a problem from The Open Problems Project that had been open for more than 15 years, and hint at approximation algorithms. On the practical side, we describe an exact method based on Integer Programming that is able to compute provably optimal solutions for instances with over 500 pixels. These solutions are actually used for practical trajectories, as demonstrated in the video.
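
To make the notion of turn cost concrete, here is a minimal Python sketch (not from the paper): it evaluates a simple boustrophedon covering path of a small pixel grid under an assumed cost model of one unit per 90-degree turn and two per U-turn. The exact Integer Programming method of the paper is far more involved.

# Toy illustration: count turns along a boustrophedon (back-and-forth)
# covering path of a w x h pixel grid. Assumed cost model: one unit per
# 90-degree turn, two units per 180-degree U-turn.

def boustrophedon(w, h):
    """Visit every cell of a w x h grid row by row, alternating direction."""
    path = []
    for y in range(h):
        xs = range(w) if y % 2 == 0 else range(w - 1, -1, -1)
        for x in xs:
            path.append((x, y))
    return path

def turn_cost(path):
    """Sum of turn costs along the path under the assumed cost model."""
    cost = 0
    for i in range(1, len(path) - 1):
        d_in = (path[i][0] - path[i - 1][0], path[i][1] - path[i - 1][1])
        d_out = (path[i + 1][0] - path[i][0], path[i + 1][1] - path[i][1])
        if d_in == d_out:
            continue                                  # straight: no cost
        cost += 2 if d_in == (-d_out[0], -d_out[1]) else 1
    return cost

if __name__ == "__main__":
    p = boustrophedon(5, 4)
    print(len(p), "pixels covered, turn cost", turn_cost(p))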

Cite as

Aaron T. Becker, Mustapha Debboun, Sándor P. Fekete, Dominik Krupke, and An Nguyen. Zapping Zika with a Mosquito-Managing Drone: Computing Optimal Flight Patterns with Minimum Turn Cost (Multimedia Contribution). In 33rd International Symposium on Computational Geometry (SoCG 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 77, pp. 62:1-62:5, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{becker_et_al:LIPIcs.SoCG.2017.62,
  author =	{Becker, Aaron T. and Debboun, Mustapha and Fekete, S\'{a}ndor P. and Krupke, Dominik and Nguyen, An},
  title =	{{Zapping Zika with a Mosquito-Managing Drone: Computing Optimal Flight Patterns with Minimum Turn Cost}},
  booktitle =	{33rd International Symposium on Computational Geometry (SoCG 2017)},
  pages =	{62:1--62:5},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-038-5},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{77},
  editor =	{Aronov, Boris and Katz, Matthew J.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SoCG.2017.62},
  URN =		{urn:nbn:de:0030-drops-72394},
  doi =		{10.4230/LIPIcs.SoCG.2017.62},
  annote =	{Keywords: Covering tours, turn cost, complexity, exact optimization}
}

Nguyen, Hoang Gia

Document
Informal Presentation
Enhanced Distributed Behavioral Cartography of Parametric Timed Automata (Informal Presentation)

Authors: Étienne André, Camille Coti, and Hoang Gia Nguyen

Published in: OASIcs, Volume 44, 2nd International Workshop on Synthesis of Complex Parameters (SynCoP'15) (2015)


Abstract
Parametric timed automata (PTA) allow the specification and verification of incompletely specified timed systems, or systems featuring timing constants that may change either in the design phase or at runtime. The behavioral cartography of PTA (BC) relies on the idea of covering a bounded parameter domain with tiles, i.e., parts of the parameter domain in which the discrete behavior is uniform. This is achieved by iterating the inverse method (IM) on the (yet uncovered) integer parameter valuations ("points") of the bounded parameter domain: given a reference point, IM generalizes the behavior corresponding to this point by synthesizing a constraint containing other (integer and real-valued) points with the same discrete behavior. Then, given a linear-time property, it is easy to partition the parameter domain into a subset of "good" tiles and a subset of "bad" ones (which correspond to good and bad behaviors). Useful applications of BC include the optimization of timing constants and the measurement of the system's robustness (values around the reference parameters) w.r.t. the untimed language. In practice, a parameter domain with a large number of integer points requires a long time to compute BC. To alleviate this, our goal is to take advantage of powerful distributed architectures. Distributing BC is theoretically easy, since two executions of IM from two different points can trivially be performed on two different nodes; distributing it efficiently, however, is challenging. For example, two executions of IM started from two contiguous integer points are very likely to yield the same tile, resulting in a loss of time for one of the two nodes. Thus, the critical question is how to distribute efficiently the points on which to call IM. In previous work, a master-worker scheme was proposed in which the master assigns points to each worker process (a point-based distribution scheme). In this scheme, the point distribution strategy on the master side is the key factor that decides the algorithm's performance. Since the master cannot foresee the tiles of the cartography (the "shape" of a cartography is unknown in general), two or more processes can receive close points that then yield the same result, leading to a loss of efficiency. Besides that, two or more tiles can overlap each other; hence, the question is whether to stop an ongoing process started from a point that is already covered by another tile. Finally, a very large parameter domain (with many integer points) can cause a bottleneck on the master side, since many worker processes ask for points while the master is busy finding uncovered points. To address these problems, we propose an enhanced master-worker distributed algorithm based on a domain decomposition scheme. The main idea is that the master splits the parameter domain into subdomains and assigns them to the workers. Each worker then works on its own set of points, which reduces the probability of choosing close points, since the workers work as far apart from each other as possible. When a worker finishes the coverage of its subdomain, it asks the master for a new one: the master splits a slow worker's subdomain into two parts and sends one to the fast worker. Furthermore, we use a heuristic approach to decide whether to stop a process working on a point that has been covered by a tile of another worker. In all our experiments, our enhanced distributed algorithm outperforms previous algorithms.
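
The domain decomposition idea can be sketched independently of the verification machinery. The following Python toy (not the authors' implementation, which runs across distributed processes; the bounds and names here are illustrative assumptions) simulates a master that splits a 2D integer parameter domain into subdomains, hands them out, and re-splits the largest remaining subdomain when a worker becomes idle.

# Toy simulation of the subdomain-splitting master. A 2D integer
# parameter domain is a box given as inclusive bounds ((x1, x2), (y1, y2)).

from collections import deque

def split(dom):
    """Split a box along its longer side into two halves."""
    (x1, x2), (y1, y2) = dom
    if x2 - x1 >= y2 - y1:
        mid = (x1 + x2) // 2
        return ((x1, mid), (y1, y2)), ((mid + 1, x2), (y1, y2))
    mid = (y1 + y2) // 2
    return ((x1, x2), (y1, mid)), ((x1, x2), (mid + 1, y2))

def initial_assignment(domain, n_workers):
    """Master side: keep splitting until every worker owns a subdomain."""
    queue = deque([domain])
    while len(queue) < n_workers:
        a, b = split(queue.popleft())
        queue.extend([a, b])
    return list(queue)

def on_worker_idle(pending):
    """When a worker finishes, split the largest remaining subdomain
    (a stand-in for the slowest worker's domain) and hand half over."""
    def size(dom):
        (x1, x2), (y1, y2) = dom
        return (x2 - x1 + 1) * (y2 - y1 + 1)
    pending.sort(key=size)
    a, b = split(pending.pop())
    pending.append(a)
    return b   # new subdomain for the idle worker

if __name__ == "__main__":
    subdomains = initial_assignment(((0, 31), (0, 31)), 4)
    print("initial subdomains:", subdomains)
    print("handed to idle worker:", on_worker_idle(subdomains))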

Cite as

Étienne André, Camille Coti, and Hoang Gia Nguyen. Enhanced Distributed Behavioral Cartography of Parametric Timed Automata (Informal Presentation). In 2nd International Workshop on Synthesis of Complex Parameters (SynCoP'15). Open Access Series in Informatics (OASIcs), Volume 44, pp. 104-105, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


BibTeX

@InProceedings{andre_et_al:OASIcs.SynCoP.2015.104,
  author =	{Andr\'{e}, \'{E}tienne and Coti, Camille and Nguyen, Hoang Gia},
  title =	{{Enhanced Distributed Behavioral Cartography of Parametric Timed Automata}},
  booktitle =	{2nd International Workshop on Synthesis of Complex Parameters (SynCoP'15)},
  pages =	{104--105},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-939897-82-8},
  ISSN =	{2190-6807},
  year =	{2015},
  volume =	{44},
  editor =	{Andr\'{e}, \'{E}tienne and Frehse, Goran},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.SynCoP.2015.104},
  URN =		{urn:nbn:de:0030-drops-56134},
  doi =		{10.4230/OASIcs.SynCoP.2015.104},
  annote =	{Keywords: Formal methods, model checking, distributed algorithmic, real-time systems, parameter synthesis}
}

Nguyen, Van Duc

Document
Explanations Generation For Web Service Workflow

Authors: Van Duc Nguyen, Son Cao Tran, and Enrico Pontelli

Published in: OASIcs, Volume 64, Technical Communications of the 34th International Conference on Logic Programming (ICLP 2018)


Abstract
The motivation for the work is the challenge of providing textual explanations of automatically generated scientific workflows (e.g., paragraphs that scientists can include in their publications). The extended abstract presents a system which generates explanations for a web service workflow from sets of atoms derived from a collection of ontologies. The system, called nlgPhylogeny, demonstrates the feasibility of the task in the Phylotastic project, which aims at providing evolutionary biologists with a platform for the automatic generation of phylogenetic trees from suitable inputs.

Cite as

Van Duc Nguyen, Son Cao Tran, and Enrico Pontelli. Explanations Generation For Web Service Workflow. In Technical Communications of the 34th International Conference on Logic Programming (ICLP 2018). Open Access Series in Informatics (OASIcs), Volume 64, pp. 14:1-14:3, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{nguyen_et_al:OASIcs.ICLP.2018.14,
  author =	{Nguyen, Van Duc and Tran, Son Cao and Pontelli, Enrico},
  title =	{{Explanations Generation For Web Service Workflow}},
  booktitle =	{Technical Communications of the 34th International Conference on Logic Programming (ICLP 2018)},
  pages =	{14:1--14:3},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-090-3},
  ISSN =	{2190-6807},
  year =	{2018},
  volume =	{64},
  editor =	{Dal Palu', Alessandro and Tarau, Paul and Saeedloei, Neda and Fodor, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.ICLP.2018.14},
  URN =		{urn:nbn:de:0030-drops-98801},
  doi =		{10.4230/OASIcs.ICLP.2018.14},
  annote =	{Keywords: Phylotastic, Grammatical Framework}
}
Document
Natural Language Generation From Ontologies Using Grammatical Framework

Authors: Van Duc Nguyen

Published in: OASIcs, Volume 64, Technical Communications of the 34th International Conference on Logic Programming (ICLP 2018)


Abstract
The paper addresses the problem of automatically generating natural language descriptions for ontology-described artifacts. The motivation for the work is the challenge of providing textual descriptions of automatically generated scientific workflows (e.g., paragraphs that scientists can include in their publications). The extended abstract presents a system which generates descriptions of sets of atoms derived from a collection of ontologies. The system, called nlgPhylogeny, demonstrates the feasibility of the task in the Phylotastic project, which aims at providing evolutionary biologists with a platform for the automatic generation of phylogenetic trees from suitable inputs. nlgPhylogeny exploits the fact that the Grammatical Framework (GF) is well suited to the natural language generation (NLG) task; the abstract shows how elements of the ontologies in Phylotastic, such as web services and their inputs and outputs, can be encoded in GF for the NLG task.
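
As a rough illustration of the underlying generation task, the following hand-rolled Python toy maps ontology-style atoms describing a workflow step to English sentences via templates; the atom, service, and resource names are invented for illustration, and this is not the GF-based pipeline that nlgPhylogeny actually uses.

# Toy template-based verbalization of workflow atoms. All predicate,
# step, and service names below are made up for illustration only.

atoms = [
    ("uses_service", "step1", "species_name_resolver"),
    ("has_input", "step1", "species_name_list"),
    ("has_output", "step1", "resolved_taxon_list"),
]

templates = {
    "uses_service": "{0} invokes the web service '{1}'.",
    "has_input":    "{0} takes '{1}' as input.",
    "has_output":   "{0} produces '{1}' as output.",
}

def verbalize(atoms):
    """Render each atom with its template; unknown predicates are skipped."""
    sentences = []
    for pred, *args in atoms:
        tpl = templates.get(pred)
        if tpl is not None:
            sentences.append(tpl.format(*args))
    return " ".join(sentences)

if __name__ == "__main__":
    print(verbalize(atoms))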

Cite as

Van Duc Nguyen. Natural Language Generation From Ontologies Using Grammatical Framework. In Technical Communications of the 34th International Conference on Logic Programming (ICLP 2018). Open Access Series in Informatics (OASIcs), Volume 64, pp. 22:1-22:7, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{nguyen:OASIcs.ICLP.2018.22,
  author =	{Nguyen, Van Duc},
  title =	{{Natural Language Generation From Ontologies Using Grammatical Framework}},
  booktitle =	{Technical Communications of the 34th International Conference on Logic Programming (ICLP 2018)},
  pages =	{22:1--22:7},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-090-3},
  ISSN =	{2190-6807},
  year =	{2018},
  volume =	{64},
  editor =	{Dal Palu', Alessandro and Tarau, Paul and Saeedloei, Neda and Fodor, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.ICLP.2018.22},
  URN =		{urn:nbn:de:0030-drops-98885},
  doi =		{10.4230/OASIcs.ICLP.2018.22},
  annote =	{Keywords: Phylotastic, Grammatical Framework}
}

Nguyen, Manh Thang

Document
Termination of Programs using Term Rewriting and SAT Solving

Authors: Jürgen Giesl, Peter Schneider-Kamp, René Thiemann, Stephan Swiderski, Manh Thang Nguyen, Daniel De Schreye, and Alexander Serebrenik

Published in: Dagstuhl Seminar Proceedings, Volume 7401, Deduction and Decision Procedures (2007)


Abstract
There are many powerful techniques for automated termination analysis of term rewrite systems (TRSs). However, up to now they have hardly been used for real programming languages. In this talk, we describe recent results which permit the application of existing techniques from term rewriting in order to prove termination of programs. We discuss two possible approaches: 1. One could translate programs into TRSs and then use existing tools to verify termination of the resulting TRSs. 2. One could adapt TRS-techniques to the respective programming languages in order to analyze programs directly. We present such approaches for the functional language Haskell and the logic language Prolog. Our results have been implemented in the termination provers AProVE and Polytool. In order to handle termination problems resulting from real programs, these provers had to be coupled with modern SAT solvers, since the automation of the TRS-termination techniques had to improve significantly. Our resulting termination analyzers are currently the most powerful ones for Haskell and Prolog.
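
As background, here is a standard textbook-style example (not taken from the talk) of a TRS-termination proof: the rewrite rules for addition terminate because a polynomial interpretation strictly decreases at every rule application.

\[
\begin{aligned}
&\text{TRS for addition:}\quad \mathit{add}(0, y) \to y, \qquad \mathit{add}(s(x), y) \to s(\mathit{add}(x, y)) \\
&\text{Interpretation over } \mathbb{N}:\quad [0] = 0, \quad [s](x) = x + 1, \quad [\mathit{add}](x, y) = 2x + y + 1 \\
&[\mathit{add}(0, y)] = y + 1 \;>\; y = [y] \\
&[\mathit{add}(s(x), y)] = 2x + y + 3 \;>\; 2x + y + 2 = [s(\mathit{add}(x, y))]
\end{aligned}
\]

Since both rules strictly decrease under the interpretation, every rewrite sequence is finite.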

Cite as

Jürgen Giesl, Peter Schneider-Kamp, René Thiemann, Stephan Swiderski, Manh Thang Nguyen, Daniel De Schreye, and Alexander Serebrenik. Termination of Programs using Term Rewriting and SAT Solving. In Deduction and Decision Procedures. Dagstuhl Seminar Proceedings, Volume 7401, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2007)


BibTeX

@InProceedings{giesl_et_al:DagSemProc.07401.7,
  author =	{Giesl, J\"{u}rgen and Schneider-Kamp, Peter and Thiemann, Ren\'{e} and Swiderski, Stephan and Nguyen, Manh Thang and De Schreye, Daniel and Serebrenik, Alexander},
  title =	{{Termination of Programs using Term Rewriting and SAT Solving}},
  booktitle =	{Deduction and Decision Procedures},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2007},
  volume =	{7401},
  editor =	{Franz Baader and Byron Cook and J\"{u}rgen Giesl and Robert Nieuwenhuis},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.07401.7},
  URN =		{urn:nbn:de:0030-drops-12481},
  doi =		{10.4230/DagSemProc.07401.7},
  annote =	{Keywords: Termination, Term Rewriting, Haskell, Prolog, SAT Solving}
}

Nguyen, Viet Anh

Document
Cache-Conscious Offline Real-Time Task Scheduling for Multi-Core Processors

Authors: Viet Anh Nguyen, Damien Hardy, and Isabelle Puaut

Published in: LIPIcs, Volume 76, 29th Euromicro Conference on Real-Time Systems (ECRTS 2017)


Abstract
Most schedulability analysis techniques for multi-core architectures assume a single Worst-Case Execution Time (WCET) per task, which is valid in all execution conditions. This assumption is too pessimistic for parallel applications running on multi-core architectures with local instruction or data caches, for which the WCET of a task depends on the cache contents at the beginning of its execution, itself depending on the task that was executed before the task under study. In this paper, we propose two scheduling techniques for multi-core architectures equipped with local instruction and data caches. The two techniques schedule a parallel application modeled as a task graph, and generate a static partitioned non-preemptive schedule. We propose an optimal method, using an Integer Linear Programming (ILP) formulation, as well as a heuristic method based on list scheduling. Experimental results show that by taking into account the effect of private caches on tasks' WCETs, the length of generated schedules is significantly reduced as compared to schedules generated by cache-unaware scheduling methods. The observed schedule length reduction on streaming applications is 11% on average for the optimal method and 9% on average for the heuristic method.
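
The core ingredient, a list scheduler that picks each task's WCET according to the task that ran immediately before it on the same core, can be sketched as follows. This Python toy is a simplification (not the paper's ILP formulation or heuristic), and the task graph and context-sensitive WCET table are invented for illustration.

# Toy cache-conscious list scheduling: each task's WCET depends on the
# task that immediately precedes it on the same core (context-sensitive
# WCETs are assumed to be given, e.g. from static cache analysis).

# wcet[(prev, task)]: WCET of `task` when `prev` ran just before it on
# the same core; prev = None means a cold cache. Values are invented.
wcet = {
    (None, "A"): 10, (None, "B"): 12, (None, "C"): 15, (None, "D"): 9,
    ("A", "B"): 8,  ("A", "C"): 14, ("A", "D"): 7,
    ("B", "C"): 10, ("B", "D"): 9,  ("C", "D"): 6,
}
deps = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}   # task graph

def list_schedule(tasks, n_cores):
    finish = {}                                    # task -> finish time
    cores = [{"ready": 0, "last": None} for _ in range(n_cores)]
    while len(finish) < len(tasks):
        ready = [t for t in tasks
                 if t not in finish and all(p in finish for p in deps[t])]
        best = None
        for t in ready:                            # greedy: earliest finish
            for c in cores:
                start = max(c["ready"],
                            max((finish[p] for p in deps[t]), default=0))
                end = start + wcet.get((c["last"], t), wcet[(None, t)])
                if best is None or end < best[0]:
                    best = (end, t, c)
        end, t, c = best
        finish[t] = end
        c["ready"], c["last"] = end, t
    return finish

if __name__ == "__main__":
    print(list_schedule(["A", "B", "C", "D"], n_cores=2))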

Cite as

Viet Anh Nguyen, Damien Hardy, and Isabelle Puaut. Cache-Conscious Offline Real-Time Task Scheduling for Multi-Core Processors. In 29th Euromicro Conference on Real-Time Systems (ECRTS 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 76, pp. 14:1-14:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{nguyen_et_al:LIPIcs.ECRTS.2017.14,
  author =	{Nguyen, Viet Anh and Hardy, Damien and Puaut, Isabelle},
  title =	{{Cache-Conscious Offline Real-Time Task Scheduling for Multi-Core Processors}},
  booktitle =	{29th Euromicro Conference on Real-Time Systems (ECRTS 2017)},
  pages =	{14:1--14:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-037-8},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{76},
  editor =	{Bertogna, Marko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ECRTS.2017.14},
  URN =		{urn:nbn:de:0030-drops-71642},
  doi =		{10.4230/LIPIcs.ECRTS.2017.14},
  annote =	{Keywords: Real-time scheduling, Cache-conscious scheduling, Many-core architectures, ILP, Static list scheduling}
}

Nguyen, Danny

Document
The Computational Complexity of Integer Programming with Alternations

Authors: Danny Nguyen and Igor Pak

Published in: LIPIcs, Volume 79, 32nd Computational Complexity Conference (CCC 2017)


Abstract
We prove that integer programming with three alternating quantifiers is NP-complete, even for a fixed number of variables. This complements earlier results by Lenstra and Kannan, which together say that integer programming with at most two alternating quantifiers can be done in polynomial time for a fixed number of variables. As a byproduct of the proof, we show that for two polytopes P, Q in R^4, counting the projection of integer points in Q\P is #P-complete. This contrasts with the 2003 result by Barvinok and Woods, which allows counting in polynomial time the projection of integer points in P and Q separately.
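
Schematically, the sentences shown NP-complete to decide have the form

\[
\exists\, x \in \mathbb{Z}^{n_1}\;\; \forall\, y \in \mathbb{Z}^{n_2}\;\; \exists\, z \in \mathbb{Z}^{n_3} :\;\; \Phi(x, y, z),
\]

where Φ is built from linear inequalities with integer coefficients and the dimensions n_1, n_2, n_3 are fixed constants (see the paper for the precise statement); by the Lenstra and Kannan results mentioned above, analogous sentences with at most two alternating quantifier blocks can be decided in polynomial time for fixed dimension.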

Cite as

Danny Nguyen and Igor Pak. The Computational Complexity of Integer Programming with Alternations. In 32nd Computational Complexity Conference (CCC 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 79, pp. 6:1-6:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{nguyen_et_al:LIPIcs.CCC.2017.6,
  author =	{Nguyen, Danny and Pak, Igor},
  title =	{{The Computational Complexity of Integer Programming with Alternations}},
  booktitle =	{32nd Computational Complexity Conference (CCC 2017)},
  pages =	{6:1--6:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-040-8},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{79},
  editor =	{O'Donnell, Ryan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2017.6},
  URN =		{urn:nbn:de:0030-drops-75151},
  doi =		{10.4230/LIPIcs.CCC.2017.6},
  annote =	{Keywords: Integer Programming, Alternations, Projection of Integer Points}
}

Nguyen, Lê Thành Dung

Document
Track B: Automata, Logic, Semantics, and Theory of Programming
Implicit Automata in Typed λ-Calculi I: Aperiodicity in a Non-Commutative Logic

Authors: Lê Thành Dũng Nguyễn and Pierre Pradic

Published in: LIPIcs, Volume 168, 47th International Colloquium on Automata, Languages, and Programming (ICALP 2020)


Abstract
We give a characterization of star-free languages in a λ-calculus with support for non-commutative affine types (in the sense of linear logic), via the algebraic characterization of the former using aperiodic monoids. When the type system is made commutative, we show that we get regular languages instead. A key ingredient in our approach – which it shares with higher-order model checking – is the use of Church encodings for inputs and outputs. Our result is, to our knowledge, the first use of non-commutativity in implicit computational complexity.
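
The Church encoding of strings mentioned above can be illustrated outside the typed setting. In the untyped Python toy below (an illustration only; the paper's contribution concerns what the non-commutative affine type system permits), a word over {a, b} becomes the function that successively applies the chosen interpretation of each of its letters, and interpreting the letters as transition functions of a finite automaton evaluates the word.

# A word w over {a, b} is encoded as the fold that applies the chosen
# interpretation of each letter, left to right, to a start value.

def church(word):
    def encoded(fa, fb):
        def run(x):
            for letter in word:
                x = fa(x) if letter == "a" else fb(x)
            return x
        return run
    return encoded

# Interpreting letters as transitions of a 3-state DFA for the star-free
# language a*b* (state 2 is the rejecting sink):
def delta_a(q): return {0: 0, 1: 2, 2: 2}[q]
def delta_b(q): return {0: 1, 1: 1, 2: 2}[q]

def accepts(word):
    return church(word)(delta_a, delta_b)(0) in (0, 1)

if __name__ == "__main__":
    print(accepts("aaabbb"), accepts("aba"))   # True False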

Cite as

Lê Thành Dũng Nguyễn and Pierre Pradic. Implicit Automata in Typed λ-Calculi I: Aperiodicity in a Non-Commutative Logic. In 47th International Colloquium on Automata, Languages, and Programming (ICALP 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 168, pp. 135:1-135:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{nguyen_et_al:LIPIcs.ICALP.2020.135,
  author =	{Nguy\~{ê}n, L\^{e} Th\`{a}nh D\~{u}ng and Pradic, Pierre},
  title =	{{Implicit Automata in Typed \lambda-Calculi I: Aperiodicity in a Non-Commutative Logic}},
  booktitle =	{47th International Colloquium on Automata, Languages, and Programming (ICALP 2020)},
  pages =	{135:1--135:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-138-2},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{168},
  editor =	{Czumaj, Artur and Dawar, Anuj and Merelli, Emanuela},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2020.135},
  URN =		{urn:nbn:de:0030-drops-125426},
  doi =		{10.4230/LIPIcs.ICALP.2020.135},
  annote =	{Keywords: Church encodings, ordered linear types, star-free languages}
}
Document
Track B: Automata, Logic, Semantics, and Theory of Programming
From Normal Functors to Logarithmic Space Queries (Track B: Automata, Logic, Semantics, and Theory of Programming)

Authors: Lê Thành Dũng Nguyễn and Pierre Pradic

Published in: LIPIcs, Volume 132, 46th International Colloquium on Automata, Languages, and Programming (ICALP 2019)


Abstract
We introduce a new approach to implicit complexity in linear logic, inspired by functional database query languages and using recent developments in effective denotational semantics of polymorphism. We give the first sub-polynomial upper bound in a type system with impredicative polymorphism; adding restrictions on quantifiers yields a characterization of logarithmic space, for which extensional completeness is established via descriptive complexity.

Cite as

Lê Thành Dũng Nguyễn and Pierre Pradic. From Normal Functors to Logarithmic Space Queries (Track B: Automata, Logic, Semantics, and Theory of Programming). In 46th International Colloquium on Automata, Languages, and Programming (ICALP 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 132, pp. 123:1-123:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{nguyen_et_al:LIPIcs.ICALP.2019.123,
  author =	{Nguy\~{ê}n, L\^{e} Th\`{a}nh D\~{u}ng and Pradic, Pierre},
  title =	{{From Normal Functors to Logarithmic Space Queries}},
  booktitle =	{46th International Colloquium on Automata, Languages, and Programming (ICALP 2019)},
  pages =	{123:1--123:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-109-2},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{132},
  editor =	{Baier, Christel and Chatzigiannakis, Ioannis and Flocchini, Paola and Leonardi, Stefano},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2019.123},
  URN =		{urn:nbn:de:0030-drops-106994},
  doi =		{10.4230/LIPIcs.ICALP.2019.123},
  annote =	{Keywords: coherence spaces, elementary linear logic, semantic evaluation}
}
Document
Unique perfect matchings and proof nets

Authors: Lê Thành Dung Nguyen

Published in: LIPIcs, Volume 108, 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)


Abstract
This paper establishes a bridge between linear logic and mainstream graph theory, building on previous work by Retoré (2003). We show that the problem of correctness for MLL+Mix proof nets is equivalent to the problem of uniqueness of a perfect matching. By applying matching theory, we obtain new results for MLL+Mix proof nets: a linear-time correctness criterion, a quasi-linear sequentialization algorithm, and a characterization of the sub-polynomial complexity of the correctness problem. We also use graph algorithms to compute the dependency relation of Bagnol et al. (2015) and the kingdom ordering of Bellin (1997), and relate them to the notion of blossom, which is central to combinatorial maximum matching algorithms.
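
For reference, uniqueness of a perfect matching can be checked naively by enumeration, as in the Python toy below (exponential, for tiny graphs only); the point of the paper is that, via matching theory, the corresponding proof-net correctness problem admits a linear-time criterion instead.

# Brute-force uniqueness check for perfect matchings (exponential; only
# for tiny graphs). Vertices are 0..n-1, edges are pairs (u, v).

from itertools import combinations

def perfect_matchings(n, edges):
    """Yield every set of n/2 disjoint edges covering all n vertices."""
    for subset in combinations(edges, n // 2):
        covered = set()
        for u, v in subset:
            covered.update((u, v))
        if len(covered) == n:
            yield subset

def has_unique_perfect_matching(n, edges):
    found = None
    for m in perfect_matchings(n, edges):
        if found is not None:
            return False           # a second matching exists
        found = m
    return found is not None       # exactly one matching

if __name__ == "__main__":
    path4 = [(0, 1), (1, 2), (2, 3)]            # path: unique PM {01, 23}
    cycle4 = [(0, 1), (1, 2), (2, 3), (3, 0)]   # 4-cycle: two PMs
    print(has_unique_perfect_matching(4, path4))    # True
    print(has_unique_perfect_matching(4, cycle4))   # False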

Cite as

Lê Thành Dung Nguyen. Unique perfect matchings and proof nets. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 25:1-25:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{nguyen:LIPIcs.FSCD.2018.25,
  author =	{Nguyen, L\^{e} Th\`{a}nh Dung},
  title =	{{Unique perfect matchings and proof nets}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{25:1--25:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.25},
  URN =		{urn:nbn:de:0030-drops-91957},
  doi =		{10.4230/LIPIcs.FSCD.2018.25},
  annote =	{Keywords: correctness criteria, matching algorithms}
}

Nguyen, Trung Thanh

Document
Bi-Criteria Approximation Algorithms for Load Balancing on Unrelated Machines with Costs

Authors: Trung Thanh Nguyen and Jörg Rothe

Published in: LIPIcs, Volume 181, 31st International Symposium on Algorithms and Computation (ISAAC 2020)


Abstract
We study a generalized version of the load balancing problem on unrelated machines with cost constraints: given a set of m machines (of certain types) and a set of n jobs, where each job j processed on machine i requires p_{i,j} time units and incurs a cost c_{i,j}, the goal is to find a schedule of jobs to machines, defined as an ordered partition of the n jobs into m disjoint subsets, such that some objective function of the vector of machine completion times is optimized, subject to the constraint that the total cost incurred by the schedule is within a given budget B. Motivated by recent results from the literature, our focus is on the case where the number of machine types is a fixed constant, and we develop a bi-criteria approximation scheme for the studied problem. Our result generalizes several known results for certain special cases, such as the case with identical machines, or the case with a constant number of machines with cost constraints. Building on the elegant technique recently proposed by Jansen and Maack [K. Jansen and M. Maack, 2019], we construct a more general approach that can be used to derive approximation schemes for a wider class of load balancing problems with constraints.
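
To pin down the objective and the budget constraint, the following Python brute force solves a toy instance exactly (exponential enumeration, not the bi-criteria approximation scheme of the paper); the processing times, costs, and budget are invented for illustration.

# Toy exact solver for budgeted makespan minimization on unrelated
# machines: p[i][j] and c[i][j] are the processing time and cost of job j
# on machine i.

from itertools import product

def min_makespan_within_budget(p, c, budget):
    m, n = len(p), len(p[0])
    best = None
    for assign in product(range(m), repeat=n):      # machine of each job
        cost = sum(c[i][j] for j, i in enumerate(assign))
        if cost > budget:
            continue                                # violates the budget B
        loads = [0] * m
        for j, i in enumerate(assign):
            loads[i] += p[i][j]
        makespan = max(loads)
        if best is None or makespan < best[0]:
            best = (makespan, assign, cost)
    return best

if __name__ == "__main__":
    p = [[4, 2, 3], [1, 5, 2]]    # 2 machines, 3 jobs (invented numbers)
    c = [[1, 1, 2], [3, 1, 1]]
    print(min_makespan_within_budget(p, c, budget=4))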

Cite as

Trung Thanh Nguyen and Jörg Rothe. Bi-Criteria Approximation Algorithms for Load Balancing on Unrelated Machines with Costs. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 14:1-14:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{nguyen_et_al:LIPIcs.ISAAC.2020.14,
  author =	{Nguyen, Trung Thanh and Rothe, J\"{o}rg},
  title =	{{Bi-Criteria Approximation Algorithms for Load Balancing on Unrelated Machines with Costs}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{14:1--14:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.14},
  URN =		{urn:nbn:de:0030-drops-133582},
  doi =		{10.4230/LIPIcs.ISAAC.2020.14},
  annote =	{Keywords: bi-criteria approximation algorithm, polynomial-time approximation algorithm, load balancing, machine scheduling}
}

Nguyễn, Lê Thành Dũng (Tito)

Document
Track B: Automata, Logic, Semantics, and Theory of Programming
Function Spaces for Orbit-Finite Sets

Authors: Mikołaj Bojańczyk, Lê Thành Dũng (Tito) Nguyễn, and Rafał Stefański

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
Orbit-finite sets are a generalisation of finite sets, and as such support many operations allowed for finite sets, such as pairing, quotienting, or taking subsets. However, they do not support function spaces, i.e. if X and Y are orbit-finite sets, then the space of finitely supported functions from X to Y is not orbit-finite. We propose a solution to this problem inspired by linear logic.

Cite as

Mikołaj Bojańczyk, Lê Thành Dũng (Tito) Nguyễn, and Rafał Stefański. Function Spaces for Orbit-Finite Sets. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 130:1-130:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{bojanczyk_et_al:LIPIcs.ICALP.2024.130,
  author =	{Boja\'{n}czyk, Miko{\l}aj and Nguy\~{ê}n, L\^{e} Th\`{a}nh D\~{u}ng (Tito) and Stefa\'{n}ski, Rafa{\l}},
  title =	{{Function Spaces for Orbit-Finite Sets}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{130:1--130:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.130},
  URN =		{urn:nbn:de:0030-drops-202730},
  doi =		{10.4230/LIPIcs.ICALP.2024.130},
  annote =	{Keywords: Orbit-finite sets, automata, linear types, game semantics}
}
Document
Syntactically and Semantically Regular Languages of λ-Terms Coincide Through Logical Relations

Authors: Vincent Moreau and Lê Thành Dũng (Tito) Nguyễn

Published in: LIPIcs, Volume 288, 32nd EACSL Annual Conference on Computer Science Logic (CSL 2024)


Abstract
A fundamental theme in automata theory is regular languages of words and trees, and their many equivalent definitions. Salvati has proposed a generalization to regular languages of simply typed λ-terms, defined using denotational semantics in finite sets. We provide here some evidence for its robustness. First, we give an equivalent syntactic characterization that naturally extends the seminal work of Hillebrand and Kanellakis connecting regular languages of words and syntactic λ-definability. Second, we show that any finitary extensional model of the simply typed λ-calculus, when used in Salvati’s definition, recognizes exactly the same class of languages of λ-terms as the category of finite sets does. The proofs of these two results rely on logical relations and can be seen as instances of a more general construction of a categorical nature, inspired by previous categorical accounts of logical relations using the gluing construction.

Cite as

Vincent Moreau and Lê Thành Dũng (Tito) Nguyễn. Syntactically and Semantically Regular Languages of λ-Terms Coincide Through Logical Relations. In 32nd EACSL Annual Conference on Computer Science Logic (CSL 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 288, pp. 40:1-40:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{moreau_et_al:LIPIcs.CSL.2024.40,
  author =	{Moreau, Vincent and Nguy\~{ê}n, L\^{e} Th\`{a}nh D\~{u}ng (Tito)},
  title =	{{Syntactically and Semantically Regular Languages of \lambda-Terms Coincide Through Logical Relations}},
  booktitle =	{32nd EACSL Annual Conference on Computer Science Logic (CSL 2024)},
  pages =	{40:1--40:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-310-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{288},
  editor =	{Murano, Aniello and Silva, Alexandra},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CSL.2024.40},
  URN =		{urn:nbn:de:0030-drops-196831},
  doi =		{10.4230/LIPIcs.CSL.2024.40},
  annote =	{Keywords: regular languages, simple types, denotational semantics, logical relations}
}
Document
Track B: Automata, Logic, Semantics, and Theory of Programming
Algebraic Recognition of Regular Functions

Authors: Mikołaj Bojańczyk and Lê Thành Dũng (Tito) Nguyễn

Published in: LIPIcs, Volume 261, 50th International Colloquium on Automata, Languages, and Programming (ICALP 2023)


Abstract
We consider regular string-to-string functions, i.e. functions that are recognized by copyless streaming string transducers, or any of their equivalent models, such as deterministic two-way automata. We give yet another characterization, which is very succinct: finiteness-preserving functors from the category of semigroups to itself, together with a certain output function that is a natural transformation.

Cite as

Mikołaj Bojańczyk and Lê Thành Dũng (Tito) Nguyễn. Algebraic Recognition of Regular Functions. In 50th International Colloquium on Automata, Languages, and Programming (ICALP 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 261, pp. 117:1-117:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{bojanczyk_et_al:LIPIcs.ICALP.2023.117,
  author =	{Boja\'{n}czyk, Miko{\l}aj and Nguy\~{ê}n, L\^{e} Th\`{a}nh D\~{u}ng (Tito)},
  title =	{{Algebraic Recognition of Regular Functions}},
  booktitle =	{50th International Colloquium on Automata, Languages, and Programming (ICALP 2023)},
  pages =	{117:1--117:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-278-5},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{261},
  editor =	{Etessami, Kousha and Feige, Uriel and Puppis, Gabriele},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2023.117},
  URN =		{urn:nbn:de:0030-drops-181697},
  doi =		{10.4230/LIPIcs.ICALP.2023.117},
  annote =	{Keywords: string transducers, semigroups, category theory}
}
Document
BV and Pomset Logic Are Not the Same

Authors: Lê Thành Dũng (Tito) Nguyễn and Lutz Straßburger

Published in: LIPIcs, Volume 216, 30th EACSL Annual Conference on Computer Science Logic (CSL 2022)


Abstract
BV and pomset logic are two logics that both conservatively extend unit-free multiplicative linear logic by a third binary connective, which (i) is non-commutative, (ii) is self-dual, and (iii) lies between the "par" and the "tensor". It was conjectured early on (more than 20 years ago) that these two logics, which share the same language, both admit cut elimination, and have connectives with essentially the same properties, are in fact the same. In this paper we show that this is not the case. We present a formula that is provable in pomset logic but not in BV.

Cite as

Lê Thành Dũng (Tito) Nguyễn and Lutz Straßburger. BV and Pomset Logic Are Not the Same. In 30th EACSL Annual Conference on Computer Science Logic (CSL 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 216, pp. 32:1-32:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{nguyen_et_al:LIPIcs.CSL.2022.32,
  author =	{Nguy\~{ê}n, L\^{e} Th\`{a}nh D\~{u}ng (Tito) and Stra{\ss}burger, Lutz},
  title =	{{BV and Pomset Logic Are Not the Same}},
  booktitle =	{30th EACSL Annual Conference on Computer Science Logic (CSL 2022)},
  pages =	{32:1--32:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-218-1},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{216},
  editor =	{Manea, Florin and Simpson, Alex},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CSL.2022.32},
  URN =		{urn:nbn:de:0030-drops-157521},
  doi =		{10.4230/LIPIcs.CSL.2022.32},
  annote =	{Keywords: proof nets, deep inference, pomset logic, system BV, cographs, dicographs, series-parallel orders}
}
Document
Track B: Automata, Logic, Semantics, and Theory of Programming
Comparison-Free Polyregular Functions

Authors: Lê Thành Dũng (Tito) Nguyễn, Camille Noûs, and Pierre Pradic

Published in: LIPIcs, Volume 198, 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021)


Abstract
This paper introduces a new automata-theoretic class of string-to-string functions with polynomial growth. Several equivalent definitions are provided: a machine model which is a restricted variant of pebble transducers, and a few inductive definitions that close the class of regular functions under certain operations. Our motivation for studying this class comes from another characterization, which we merely mention here but prove elsewhere, based on a λ-calculus with a linear type system. As their name suggests, these comparison-free polyregular functions form a subclass of polyregular functions; we prove that the inclusion is strict. We also show that they are incomparable with HDT0L transductions, closed under usual function composition - but not under a certain "map" combinator - and satisfy a comparison-free version of the pebble minimization theorem. On the broader topic of polynomial growth transductions, we also consider the recently introduced layered streaming string transducers (SSTs), or equivalently k-marble transducers. We prove that a function can be obtained by composing such transducers together if and only if it is polyregular, and that k-layered SSTs (or k-marble transducers) are closed under "map" and equivalent to a corresponding notion of (k+1)-layered HDT0L systems.
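
A classic concrete example of a string-to-string function with polynomial (here quadratic) growth is "squaring", which maps w to |w| concatenated copies of w. The Python one-liner below is only meant to fix intuition about polynomial-growth transductions; no claim is made here about where this function sits among the subclasses studied in the paper.

# "Squaring": w is mapped to |w| copies of w, so the output has length
# |w|^2 -- a standard example of a polynomial-growth string transduction.

def squaring(w):
    return "".join(w for _ in w)

if __name__ == "__main__":
    print(squaring("abc"))   # abcabcabc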

Cite as

Lê Thành Dũng (Tito) Nguyễn, Camille Noûs, and Pierre Pradic. Comparison-Free Polyregular Functions. In 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 198, pp. 139:1-139:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{nguyen_et_al:LIPIcs.ICALP.2021.139,
  author =	{Nguy\~{ê}n, L\^{e} Th\`{a}nh D\~{u}ng (Tito) and No\^{u}s, Camille and Pradic, Pierre},
  title =	{{Comparison-Free Polyregular Functions}},
  booktitle =	{48th International Colloquium on Automata, Languages, and Programming (ICALP 2021)},
  pages =	{139:1--139:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-195-5},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{198},
  editor =	{Bansal, Nikhil and Merelli, Emanuela and Worrell, James},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2021.139},
  URN =		{urn:nbn:de:0030-drops-142087},
  doi =		{10.4230/LIPIcs.ICALP.2021.139},
  annote =	{Keywords: pebble transducers, HDT0L systems, polyregular functions}
}