60 Search Results for "Chen, Xi"


Document
Conversational Agents: A Framework for Evaluation (CAFE) (Dagstuhl Perspectives Workshop 24352)

Authors: Christine Bauer, Li Chen, Nicola Ferro, Norbert Fuhr, Avishek Anand, Timo Breuer, Guglielmo Faggioli, Ophir Frieder, Hideo Joho, Jussi Karlgren, Johannes Kiesel, Bart P. Knijnenburg, Aldo Lipani, Lien Michiels, Andrea Papenmeier, Maria Soledad Pera, Mark Sanderson, Scott Sanner, Benno Stein, Johanne R. Trippas, Karin Verspoor, and Martijn C. Willemsen

Published in: Dagstuhl Manifestos, Volume 11, Issue 1 (2025)


Abstract
During the workshop, we discussed in depth what CONversational Information ACcess (CONIAC) is and what its unique features are, proposed a world model abstracting it, and defined the Conversational Agents Framework for Evaluation (CAFE) for the evaluation of CONIAC systems, consisting of six major components: 1) goals of the system’s stakeholders, 2) user tasks to be studied in the evaluation, 3) aspects of the users carrying out the tasks, 4) evaluation criteria to be considered, 5) evaluation methodology to be applied, and 6) measures for the quantitative criteria chosen.

Cite as

Christine Bauer, Li Chen, Nicola Ferro, Norbert Fuhr, Avishek Anand, Timo Breuer, Guglielmo Faggioli, Ophir Frieder, Hideo Joho, Jussi Karlgren, Johannes Kiesel, Bart P. Knijnenburg, Aldo Lipani, Lien Michiels, Andrea Papenmeier, Maria Soledad Pera, Mark Sanderson, Scott Sanner, Benno Stein, Johanne R. Trippas, Karin Verspoor, and Martijn C. Willemsen. Conversational Agents: A Framework for Evaluation (CAFE) (Dagstuhl Perspectives Workshop 24352). In Dagstuhl Manifestos, Volume 11, Issue 1, pp. 19-67, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@Article{bauer_et_al:DagMan.11.1.19,
  author =	{Bauer, Christine and Chen, Li and Ferro, Nicola and Fuhr, Norbert and Anand, Avishek and Breuer, Timo and Faggioli, Guglielmo and Frieder, Ophir and Joho, Hideo and Karlgren, Jussi and Kiesel, Johannes and Knijnenburg, Bart P. and Lipani, Aldo and Michiels, Lien and Papenmeier, Andrea and Pera, Maria Soledad and Sanderson, Mark and Sanner, Scott and Stein, Benno and Trippas, Johanne R. and Verspoor, Karin and Willemsen, Martijn C.},
  title =	{{Conversational Agents: A Framework for Evaluation (CAFE) (Dagstuhl Perspectives Workshop 24352)}},
  pages =	{19--67},
  journal =	{Dagstuhl Manifestos},
  ISSN =	{2193-2433},
  year =	{2025},
  volume =	{11},
  number =	{1},
  editor =	{Bauer, Christine and Chen, Li and Ferro, Nicola and Fuhr, Norbert and Anand, Avishek and Breuer, Timo and Faggioli, Guglielmo and Frieder, Ophir and Joho, Hideo and Karlgren, Jussi and Kiesel, Johannes and Knijnenburg, Bart P. and Lipani, Aldo and Michiels, Lien and Papenmeier, Andrea and Pera, Maria Soledad and Sanderson, Mark and Sanner, Scott and Stein, Benno and Trippas, Johanne R. and Verspoor, Karin and Willemsen, Martijn C.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagMan.11.1.19},
  URN =		{urn:nbn:de:0030-drops-252722},
  doi =		{10.4230/DagMan.11.1.19},
  annote =	{Keywords: Conversational Agents, Evaluation, Information Access}
}
Survey
Resilience in Knowledge Graph Embeddings

Authors: Arnab Sharma, N'Dah Jean Kouagou, and Axel-Cyrille Ngonga Ngomo

Published in: Transactions on Graph Data and Knowledge (TGDK), Volume 3, Issue 2 (2025)


Abstract
In recent years, knowledge graphs have attracted growing interest and found widespread application in domains such as information retrieval, question answering, and recommendation systems. Large-scale knowledge graphs have demonstrated their utility in effectively representing structured knowledge. To further facilitate the application of machine learning techniques, knowledge graph embedding (KGE) models have been developed; such models transform the entities and relationships of a knowledge graph into vectors. However, these embedding models often face challenges related to noise, missing information, distribution shift, and adversarial attacks, which can lead to sub-optimal embeddings and incorrect inferences, negatively impacting downstream applications. While the existing literature has so far focused on adversarial attacks on KGE models, the challenges related to the other critical aspects remain unexplored. In this paper, we first give a unified definition of resilience, encompassing factors such as generalization, in-distribution generalization, distribution adaptation, and robustness. After formalizing these concepts for machine learning in general, we define them in the context of knowledge graphs. To identify the gaps in existing work on resilience in the context of knowledge graphs, we perform a systematic survey covering all the aspects mentioned above. Our survey shows that most existing work focuses on a single aspect of resilience, namely robustness. After categorizing these works by the aspects of resilience they address, we discuss open challenges and future research directions.
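As a concrete instance of the embedding models the survey covers, a TransE-style score (a standard KGE model, shown here with hypothetical toy vectors; not specific to this survey) maps entities and relations to vectors and rates a triple (h, r, t) by how close h + r lands to t:

```python
import math

# Knowledge graph embeddings in miniature: a TransE-style score. Entities and
# relations are vectors; a triple (h, r, t) is plausible when h + r is close
# to t, i.e. when ||h + r - t|| is small. The vectors below are hypothetical
# toy values chosen purely for illustration.
emb = {
    "Paris":      [0.9, 0.1],
    "France":     [1.0, 1.0],
    "Berlin":     [0.1, 0.9],
    "capital_of": [0.1, 0.9],
}

def score(h, r, t):
    # Higher (less negative) score means a more plausible triple.
    translated = [a + b for a, b in zip(emb[h], emb[r])]
    return -math.dist(translated, emb[t])

print(score("Paris", "capital_of", "France") >
      score("Berlin", "capital_of", "France"))  # True
```

Real KGE models learn such vectors from the graph; the survey's point is that noise, missing triples, distribution shift, or adversarial edits can degrade exactly this kind of score.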

Cite as

Arnab Sharma, N'Dah Jean Kouagou, and Axel-Cyrille Ngonga Ngomo. Resilience in Knowledge Graph Embeddings. In Transactions on Graph Data and Knowledge (TGDK), Volume 3, Issue 2, pp. 1:1-1:38, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@Article{sharma_et_al:TGDK.3.2.1,
  author =	{Sharma, Arnab and Kouagou, N'Dah Jean and Ngomo, Axel-Cyrille Ngonga},
  title =	{{Resilience in Knowledge Graph Embeddings}},
  journal =	{Transactions on Graph Data and Knowledge},
  pages =	{1:1--1:38},
  ISSN =	{2942-7517},
  year =	{2025},
  volume =	{3},
  number =	{2},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/TGDK.3.2.1},
  URN =		{urn:nbn:de:0030-drops-248117},
  doi =		{10.4230/TGDK.3.2.1},
  annote =	{Keywords: Knowledge graphs, Resilience, Robustness}
}
Testing Sumsets Is Hard

Authors: Xi Chen, Shivam Nadimpalli, Tim Randolph, Rocco A. Servedio, and Or Zamir

Published in: LIPIcs, Volume 351, 33rd Annual European Symposium on Algorithms (ESA 2025)


Abstract
A subset S of the Boolean hypercube 𝔽₂ⁿ is a sumset if S = {a + b : a, b ∈ A} for some A ⊆ 𝔽₂ⁿ. Sumsets are central objects of study in additive combinatorics, where they play a role in several of the field’s most important results. We prove a lower bound of Ω(2^{n/2}) for the number of queries needed to test whether a Boolean function f:𝔽₂ⁿ → {0,1} is the indicator function of a sumset, ruling out an efficient testing algorithm for sumsets. Our lower bound for testing sumsets follows from sharp bounds on the related problem of shift testing, which may be of independent interest. We also give a near-optimal 2^{n/2} ⋅ poly(n)-query algorithm for a smoothed analysis formulation of the sumset refutation problem. Finally, we include a simple proof that the number of different sumsets in 𝔽₂ⁿ is 2^{(1±o(1))2^{n-1}}.
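The definition is easy to exercise at small n. A brute-force sketch (an illustration of the definition, not from the paper; its point is precisely that no query-efficient test exists):

```python
# Elements of F_2^n are n-bit integers; addition in F_2^n is XOR.

def sumset(A):
    """S = {a + b : a, b in A} with addition in F_2^n, i.e. XOR."""
    return {a ^ b for a in A for b in A}

def is_sumset(S, n):
    """Decide whether S = sumset(A) for some A, by trying every A (2^(2^n) sets)."""
    universe = range(2 ** n)
    for r in range(2 ** (2 ** n)):
        A = {x for x in universe if (r >> x) & 1}
        if sumset(A) == S:
            return True
    return False

# {0} is a sumset (take A = {0}); any nonempty sumset contains 0, since a + a = 0.
print(is_sumset({0}, 2))     # True
print(is_sumset({1, 2}, 2))  # False: 0 is missing
```

Even this toy check is doubly exponential in n; the paper shows that even with query access to the indicator function, Ω(2^{n/2}) queries are unavoidable.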

Cite as

Xi Chen, Shivam Nadimpalli, Tim Randolph, Rocco A. Servedio, and Or Zamir. Testing Sumsets Is Hard. In 33rd Annual European Symposium on Algorithms (ESA 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 351, pp. 14:1-14:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@InProceedings{chen_et_al:LIPIcs.ESA.2025.14,
  author =	{Chen, Xi and Nadimpalli, Shivam and Randolph, Tim and Servedio, Rocco A. and Zamir, Or},
  title =	{{Testing Sumsets Is Hard}},
  booktitle =	{33rd Annual European Symposium on Algorithms (ESA 2025)},
  pages =	{14:1--14:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-395-9},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{351},
  editor =	{Benoit, Anne and Kaplan, Haim and Wild, Sebastian and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2025.14},
  URN =		{urn:nbn:de:0030-drops-244822},
  doi =		{10.4230/LIPIcs.ESA.2025.14},
  annote =	{Keywords: Sumsets, additive combinatorics, property testing, Boolean functions}
}
Streaming Diameter of High-Dimensional Points

Authors: Magnús M. Halldórsson, Nicolaos Matsakis, and Pavel Veselý

Published in: LIPIcs, Volume 351, 33rd Annual European Symposium on Algorithms (ESA 2025)


Abstract
We improve the space bounds for streaming approximation of Diameter, as well as of Farthest Neighbor queries, Minimum Enclosing Ball, and its Coreset, in high-dimensional Euclidean spaces. In particular, our deterministic streaming algorithms store 𝒪(ε^{-2}log(1/ε)) points. This improves by a factor of ε^{-1} upon the previous space bound of Agarwal and Sharathkumar (SODA 2010), while retaining the state-of-the-art approximation guarantees, such as √2+ε for Diameter or Farthest Neighbor queries, and also offers a simpler and more complete argument. Moreover, we show that storing Ω(ε^{-1}) points is necessary for a streaming (√2+ε)-approximation of Farthest Pair and Farthest Neighbor queries.
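For context, the simplest streaming baseline for Diameter is the textbook one-pass 2-approximation that stores a single point (not the paper's algorithm, which stores more points to reach the √2+ε guarantee):

```python
import math

def stream_diameter_2approx(points):
    """One pass, O(1) stored points: returns r with r <= diameter <= 2r.

    Fix the first point p and track the farthest distance r from p seen so
    far. By the triangle inequality, any pair is within distance 2r, so r is
    a 2-approximation in any metric space. The paper's deterministic
    algorithms store O(eps^-2 log(1/eps)) points to get sqrt(2)+eps instead.
    """
    it = iter(points)
    p = next(it)
    r = 0.0
    for q in it:
        r = max(r, math.dist(p, q))
    return r

print(stream_diameter_2approx([(0, 0), (1, 0), (3, 0)]))  # 3.0
```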

Cite as

Magnús M. Halldórsson, Nicolaos Matsakis, and Pavel Veselý. Streaming Diameter of High-Dimensional Points. In 33rd Annual European Symposium on Algorithms (ESA 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 351, pp. 58:1-58:10, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@InProceedings{halldorsson_et_al:LIPIcs.ESA.2025.58,
  author =	{Halld\'{o}rsson, Magn\'{u}s M. and Matsakis, Nicolaos and Vesel\'{y}, Pavel},
  title =	{{Streaming Diameter of High-Dimensional Points}},
  booktitle =	{33rd Annual European Symposium on Algorithms (ESA 2025)},
  pages =	{58:1--58:10},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-395-9},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{351},
  editor =	{Benoit, Anne and Kaplan, Haim and Wild, Sebastian and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2025.58},
  URN =		{urn:nbn:de:0030-drops-245263},
  doi =		{10.4230/LIPIcs.ESA.2025.58},
  annote =	{Keywords: streaming algorithm, farthest pair, diameter, minimum enclosing ball, coreset}
}
New Algorithms for Pigeonhole Equal Subset Sum

Authors: Ce Jin, Ryan Williams, and Stan Zhang

Published in: LIPIcs, Volume 351, 33rd Annual European Symposium on Algorithms (ESA 2025)


Abstract
We study the Pigeonhole Equal Subset Sum problem, which is a total-search variant of the Subset Sum problem introduced by Papadimitriou (1994): we are given a set of n positive integers {w₁,…,w_n} with the additional restriction that ∑_{i=1}^n w_i < 2ⁿ - 1, and want to find two different subsets A,B ⊆ [n] such that ∑_{i∈A} w_i = ∑_{i∈B} w_i. Very recently, Jin and Wu (ICALP 2024) gave a randomized algorithm solving Pigeonhole Equal Subset Sum in O^*(2^{0.4n}) time, beating the classical meet-in-the-middle algorithm with O^*(2^{n/2}) runtime. In this paper, we refine Jin and Wu’s techniques to improve the runtime even further to O^*(2^{n/3}).
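The totality guarantee in the problem statement can be seen directly in code. The brute-force sketch below (an illustration, far from the paper's O^*(2^{n/3}) algorithm) finds a colliding pair by enumerating all 2ⁿ subset sums:

```python
# Pigeonhole demonstration: since sum(w) < 2^n - 1, the 2^n subset sums take
# at most sum(w) + 1 < 2^n distinct values, so two subsets must collide.
def equal_subset_sums(w):
    n = len(w)
    assert sum(w) < 2 ** n - 1, "pigeonhole precondition"
    seen = {}  # subset sum -> bitmask of the first subset achieving it
    for mask in range(2 ** n):
        s = sum(w[i] for i in range(n) if (mask >> i) & 1)
        if s in seen:
            return seen[s], mask  # two different subsets with equal sum
        seen[s] = mask
    raise AssertionError("unreachable by the pigeonhole principle")

w = [1, 2, 3, 5]  # sum = 11 < 2^4 - 1 = 15
a, b = equal_subset_sums(w)  # e.g. {w[0], w[1]} and {w[2]}, both summing to 3
```

This takes O^*(2ⁿ) time and space; the paper's contribution is cutting the exponent to n/3.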

Cite as

Ce Jin, Ryan Williams, and Stan Zhang. New Algorithms for Pigeonhole Equal Subset Sum. In 33rd Annual European Symposium on Algorithms (ESA 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 351, pp. 86:1-86:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@InProceedings{jin_et_al:LIPIcs.ESA.2025.86,
  author =	{Jin, Ce and Williams, Ryan and Zhang, Stan},
  title =	{{New Algorithms for Pigeonhole Equal Subset Sum}},
  booktitle =	{33rd Annual European Symposium on Algorithms (ESA 2025)},
  pages =	{86:1--86:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-395-9},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{351},
  editor =	{Benoit, Anne and Kaplan, Haim and Wild, Sebastian and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2025.86},
  URN =		{urn:nbn:de:0030-drops-245541},
  doi =		{10.4230/LIPIcs.ESA.2025.86},
  annote =	{Keywords: pigeonhole principle, subset sums}
}
Canonical for Automated Theorem Proving in Lean

Authors: Chase Norman and Jeremy Avigad

Published in: LIPIcs, Volume 352, 16th International Conference on Interactive Theorem Proving (ITP 2025)


Abstract
Canonical is a solver for type inhabitation in dependent type theory, that is, the problem of producing a term of a given type. We present a Lean tactic which invokes Canonical to generate proof terms and synthesize programs. The tactic supports higher-order and dependently-typed goals, structural recursion over indexed inductive types, and definitional equality. Canonical finds proofs for 84% of Natural Number Game problems in 51 seconds total.
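The inhabitation problems the tactic targets can be illustrated with hand-written Lean witnesses (illustrative examples of the goal types, not Canonical's output):

```lean
-- Type inhabitation as proof search: each goal asks for a term of the stated
-- type; a solver like Canonical searches for such terms automatically.
example : 2 + 2 = 4 := rfl                    -- closed by definitional equality
example : ∀ n : Nat, n + 0 = n := fun _ => rfl
example (p q : Prop) (h : p ∧ q) : q ∧ p :=   -- a propositional goal
  ⟨h.right, h.left⟩
```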

Cite as

Chase Norman and Jeremy Avigad. Canonical for Automated Theorem Proving in Lean. In 16th International Conference on Interactive Theorem Proving (ITP 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 352, pp. 14:1-14:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@InProceedings{norman_et_al:LIPIcs.ITP.2025.14,
  author =	{Norman, Chase and Avigad, Jeremy},
  title =	{{Canonical for Automated Theorem Proving in Lean}},
  booktitle =	{16th International Conference on Interactive Theorem Proving (ITP 2025)},
  pages =	{14:1--14:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-396-6},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{352},
  editor =	{Forster, Yannick and Keller, Chantal},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITP.2025.14},
  URN =		{urn:nbn:de:0030-drops-246128},
  doi =		{10.4230/LIPIcs.ITP.2025.14},
  annote =	{Keywords: Automated Reasoning, Interactive Theorem Proving, Dependent Type Theory, Inhabitation, Unification, Program Synthesis, Formal Methods}
}
RANDOM
On the Spectral Expansion of Monotone Subsets of the Hypercube

Authors: Yumou Fei and Renato Ferreira Pinto Jr.

Published in: LIPIcs, Volume 353, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2025)


Abstract
We study the spectral gap of subgraphs of the hypercube induced by monotone subsets of vertices. For a monotone subset A ⊆ {0,1}ⁿ of density μ(A), the previous best lower bound on the spectral gap, due to Cohen [Cohen, 2016], was γ ≳ μ(A)/n², improving upon the earlier bound γ ≳ μ(A)²/n² established by Ding and Mossel [Ding and Mossel, 2014]. In this paper, we prove the optimal lower bound γ ≳ μ(A)/n. As a corollary, we improve the mixing time upper bound of the random walk on constant-density monotone sets from O(n³), as shown by Ding and Mossel, to O(n²). Along the way, we develop two new inequalities that may be of independent interest: (1) a directed L²-Poincaré inequality on the hypercube, and (2) an "approximate" FKG inequality for monotone sets.
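The objects in question are easy to make concrete. A minimal sketch (an illustration of the definitions, not of the spectral argument), using the majority function on n = 3 as the canonical constant-density monotone set:

```python
# A monotone subset A of {0,1}^n is upward closed: x in A and x <= y
# (coordinatewise) implies y in A. Elements are encoded as n-bit integers.
def is_monotone(A, n):
    # Upward closure holds iff flipping any single 0-bit of x to 1 stays in A.
    return all(x | (1 << i) in A for x in A for i in range(n))

def density(A, n):
    return len(A) / 2 ** n

# Majority on n = 3: points with at least two 1-coordinates.
maj = {x for x in range(8) if bin(x).count("1") >= 2}
print(is_monotone(maj, 3), density(maj, 3))  # True 0.5
```

The paper studies the random walk on the subgraph of the hypercube induced by such a set A, showing its spectral gap is γ ≳ μ(A)/n.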

Cite as

Yumou Fei and Renato Ferreira Pinto Jr. On the Spectral Expansion of Monotone Subsets of the Hypercube. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 353, pp. 42:1-42:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@InProceedings{fei_et_al:LIPIcs.APPROX/RANDOM.2025.42,
  author =	{Fei, Yumou and Ferreira Pinto Jr., Renato},
  title =	{{On the Spectral Expansion of Monotone Subsets of the Hypercube}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2025)},
  pages =	{42:1--42:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-397-3},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{353},
  editor =	{Ene, Alina and Chattopadhyay, Eshan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX/RANDOM.2025.42},
  URN =		{urn:nbn:de:0030-drops-244081},
  doi =		{10.4230/LIPIcs.APPROX/RANDOM.2025.42},
  annote =	{Keywords: Random walks, mixing time, FKG inequality, Poincar\'{e} inequality, directed isoperimetry}
}
Towards a Complexity-Theoretic Dichotomy for TQFT Invariants

Authors: Nicolas Bridges and Eric Samperton

Published in: LIPIcs, Volume 350, 20th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2025)


Abstract
We show that for any fixed (2+1)-dimensional TQFT over ℂ of either Turaev-Viro-Barrett-Westbury or Reshetikhin-Turaev type, the problem of (exactly) computing its invariants on closed 3-manifolds is either solvable in polynomial time, or else it is #𝖯-hard to (exactly) contract certain tensors that are built from the TQFT’s fusion category. Our proof is an application of a dichotomy result of Cai and Chen [J. ACM, 2017] concerning weighted constraint satisfaction problems over ℂ. We leave for future work the issue of reinterpreting the conditions of Cai and Chen that distinguish between the two cases (i.e. #𝖯-hard tensor contractions vs. polynomial time invariants) in terms of fusion categories. We expect that with more effort, our reduction can be improved so that one gets a dichotomy directly for TQFTs' invariants of 3-manifolds rather than more general tensors built from TQFTs' fusion categories.

Cite as

Nicolas Bridges and Eric Samperton. Towards a Complexity-Theoretic Dichotomy for TQFT Invariants. In 20th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 350, pp. 5:1-5:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@InProceedings{bridges_et_al:LIPIcs.TQC.2025.5,
  author =	{Bridges, Nicolas and Samperton, Eric},
  title =	{{Towards a Complexity-Theoretic Dichotomy for TQFT Invariants}},
  booktitle =	{20th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2025)},
  pages =	{5:1--5:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-392-8},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{350},
  editor =	{Fefferman, Bill},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.TQC.2025.5},
  URN =		{urn:nbn:de:0030-drops-240548},
  doi =		{10.4230/LIPIcs.TQC.2025.5},
  annote =	{Keywords: Complexity, topological quantum field theory, dichotomy theorems, constraint satisfaction problems, tensor categories}
}
U-Prithvi: Integrating a Foundation Model and U-Net for Enhanced Flood Inundation Mapping

Authors: Vit Kostejn, Yamil Essus, Jenna Abrahamson, and Ranga Raju Vatsavai

Published in: LIPIcs, Volume 346, 13th International Conference on Geographic Information Science (GIScience 2025)


Abstract
In recent years, large pre-trained models, commonly referred to as foundation models, have become increasingly popular for tasks that leverage transfer learning. This trend has expanded to remote sensing, where transformer-based foundation models such as Prithvi, msGFM, and SatSwinMAE have been applied to a range of tasks. While these transformer-based models, particularly the Prithvi model, exhibit strong generalization capabilities, they have limitations in capturing fine-grained details compared to convolutional neural network architectures like U-Net in segmentation tasks. In this paper, we propose a novel architecture, U-Prithvi, which combines the strengths of the Prithvi transformer with those of U-Net. We introduce a RandomHalfMaskLayer to ensure balanced learning from both models during training. Our approach is evaluated on the Sen1Floods11 dataset for flood inundation mapping, and experimental results show that U-Prithvi outperforms both individual models, achieving improved performance on out-of-sample data. While this principle is illustrated using the Prithvi model, it is easily adaptable to other foundation models.

Cite as

Vit Kostejn, Yamil Essus, Jenna Abrahamson, and Ranga Raju Vatsavai. U-Prithvi: Integrating a Foundation Model and U-Net for Enhanced Flood Inundation Mapping. In 13th International Conference on Geographic Information Science (GIScience 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 346, pp. 18:1-18:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@InProceedings{kostejn_et_al:LIPIcs.GIScience.2025.18,
  author =	{Kostejn, Vit and Essus, Yamil and Abrahamson, Jenna and Vatsavai, Ranga Raju},
  title =	{{U-Prithvi: Integrating a Foundation Model and U-Net for Enhanced Flood Inundation Mapping}},
  booktitle =	{13th International Conference on Geographic Information Science (GIScience 2025)},
  pages =	{18:1--18:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-378-2},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{346},
  editor =	{Sila-Nowicka, Katarzyna and Moore, Antoni and O'Sullivan, David and Adams, Benjamin and Gahegan, Mark},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.GIScience.2025.18},
  URN =		{urn:nbn:de:0030-drops-238479},
  doi =		{10.4230/LIPIcs.GIScience.2025.18},
  annote =	{Keywords: GeoAI, flood mapping, foundation model, U-Net, Prithvi}
}
Mutational Signature Refitting on Sparse Pan-Cancer Data

Authors: Gal Gilad, Teresa M. Przytycka, and Roded Sharan

Published in: LIPIcs, Volume 344, 25th International Conference on Algorithms for Bioinformatics (WABI 2025)


Abstract
Mutational processes shape cancer genomes, leaving characteristic marks that are termed signatures. The level of activity of each such process, or its signature exposure, provides important information on the disease, improving patient stratification and the prediction of drug response. Thus, there is growing interest in developing refitting methods that decipher those exposures. Previous work in this domain was unsupervised in nature, employing algebraic decomposition and probabilistic inference methods. Here we provide a supervised approach to the problem of signature refitting and show its superiority over current methods. Our method, SuRe, leverages a neural network model to capture correlations between signature exposures in real data. We show that SuRe outperforms previous methods on sparse mutation data from tumor type specific data sets, as well as pan-cancer data sets, with an increasing advantage as the data become sparser. We further demonstrate its utility in clinical settings.
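The refitting model itself fits in a few lines. The sketch below (a toy exact solve on two hypothetical signatures, not SuRe's supervised method) recovers exposures from counts generated by the linear model:

```python
# Refitting in miniature: observed mutation counts m are modeled as a mix of
# known signatures, m = S @ e, and refitting recovers the exposure vector e.
# Here: 2 signatures over 2 mutation types, solved exactly by Cramer's rule.
# Real tools work with 96 mutation types and sparse, noisy counts, using
# non-negative least squares, probabilistic inference, or, as in SuRe, a
# supervised neural model.
def refit_2x2(S, m):
    (a, b), (c, d) = S
    det = a * d - b * c
    return ((m[0] * d - b * m[1]) / det, (a * m[1] - m[0] * c) / det)

S = [[0.8, 0.3],   # rows: mutation types, columns: signatures
     [0.2, 0.7]]
m = [0.8 * 10 + 0.3 * 4, 0.2 * 10 + 0.7 * 4]  # exposures (10, 4) generate m
print(refit_2x2(S, m))  # approximately (10.0, 4.0)
```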

Cite as

Gal Gilad, Teresa M. Przytycka, and Roded Sharan. Mutational Signature Refitting on Sparse Pan-Cancer Data. In 25th International Conference on Algorithms for Bioinformatics (WABI 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 344, pp. 11:1-11:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@InProceedings{gilad_et_al:LIPIcs.WABI.2025.11,
  author =	{Gilad, Gal and Przytycka, Teresa M. and Sharan, Roded},
  title =	{{Mutational Signature Refitting on Sparse Pan-Cancer Data}},
  booktitle =	{25th International Conference on Algorithms for Bioinformatics (WABI 2025)},
  pages =	{11:1--11:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-386-7},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{344},
  editor =	{Brejov\'{a}, Bro\v{n}a and Patro, Rob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.WABI.2025.11},
  URN =		{urn:nbn:de:0030-drops-239374},
  doi =		{10.4230/LIPIcs.WABI.2025.11},
  annote =	{Keywords: mutational signatures, signature refitting, cancer genomics, genomic data analysis, somatic mutations}
}
Bridging Language Models and Symbolic Solvers via the Model Context Protocol

Authors: Stefan Szeider

Published in: LIPIcs, Volume 341, 28th International Conference on Theory and Applications of Satisfiability Testing (SAT 2025)


Abstract
This paper presents the MCP Solver, a system that bridges large language models with symbolic solvers through the Model Context Protocol (MCP). The system comprises a server and a client component. The server provides an interface to constraint programming (via MiniZinc Python), propositional satisfiability and maximum satisfiability (both via PySAT), and satisfiability modulo theories (via Python Z3). The client contains an agent that connects to the server via MCP and uses a language model to autonomously translate problem statements (given in English) into encodings through an incremental editing process, then runs the solver. Our experiments demonstrate that this neurosymbolic integration effectively combines the natural-language understanding of language models with robust solving capabilities across multiple solving paradigms.
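The decision problem behind the server's propositional backend can be shown in miniature. The sketch below (a brute-force illustration using the DIMACS literal convention, not the MCP Solver's actual interface or PySAT) checks satisfiability of a CNF formula of the kind the agent's encoding would produce:

```python
from itertools import product

# A CNF formula is a list of clauses; each clause is a list of nonzero ints,
# where k means variable k and -k its negation (the DIMACS convention used
# by SAT solvers). This toy check tries all 2^n assignments.
def satisfiable(clauses, n_vars):
    for bits in product([False, True], repeat=n_vars):
        value = lambda lit: bits[abs(lit) - 1] ^ (lit < 0)
        if all(any(value(lit) for lit in clause) for clause in clauses):
            return True
    return False

# (x1 or x2) and (not x1 or x2) and (not x2): unsatisfiable
print(satisfiable([[1, 2], [-1, 2], [-2]], 2))  # False
print(satisfiable([[1, 2], [-1, 2]], 2))        # True
```

In the actual system, the language model produces such encodings incrementally and the server delegates solving to PySAT, MiniZinc, or Z3.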

Cite as

Stefan Szeider. Bridging Language Models and Symbolic Solvers via the Model Context Protocol. In 28th International Conference on Theory and Applications of Satisfiability Testing (SAT 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 341, pp. 30:1-30:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@InProceedings{szeider:LIPIcs.SAT.2025.30,
  author =	{Szeider, Stefan},
  title =	{{Bridging Language Models and Symbolic Solvers via the Model Context Protocol}},
  booktitle =	{28th International Conference on Theory and Applications of Satisfiability Testing (SAT 2025)},
  pages =	{30:1--30:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-381-2},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{341},
  editor =	{Berg, Jeremias and Nordstr\"{o}m, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SAT.2025.30},
  URN =		{urn:nbn:de:0030-drops-237649},
  doi =		{10.4230/LIPIcs.SAT.2025.30},
  annote =	{Keywords: Large Language Models, Agents, Constraint Programming, Satisfiability Solvers, Maximum Satisfiability, SAT Modulo Theories, Model Context Protocol}
}
A Min-Entropy Approach to Multi-Party Communication Lower Bounds

Authors: Mi-Ying (Miryam) Huang, Xinyu Mao, Shuo Wang, Guangxu Yang, and Jiapeng Zhang

Published in: LIPIcs, Volume 339, 40th Computational Complexity Conference (CCC 2025)


Abstract
Information complexity is one of the most powerful techniques for proving information-theoretic lower bounds, and Shannon entropy plays a central role in it. Though Shannon entropy has some convenient properties, such as the chain rule, it also has inherent limitations. One of the most notable barriers is the square-root loss: the square-root gap between entropy gaps and statistical distances that appears in, e.g., Pinsker’s inequality. To bypass this barrier, we introduce a new method based on min-entropy analysis. Building on this method, we prove the following results.
- An Ω(N^{∑_i α_i - max_i {α_i}}/k) randomized communication lower bound for the k-party set-intersection problem, where the i-th party holds a random set of size ≈ N^{1-α_i}.
- A tight Ω(n/k) randomized lower bound for the k-party Tree Pointer Jumping problem, improving on an Ω(n/k²) lower bound by Chakrabarti, Cormode, and McGregor (STOC 08).
- An Ω(n/k+√n) lower bound for the Chained Index problem, improving on an Ω(n/k²) lower bound by Cormode, Dark, and Konrad (ICALP 19).
Since these problems have served as hard problems for numerous applications in streaming lower bounds and cryptography, our new lower bounds directly improve the corresponding streaming and cryptographic lower bounds. On the technical side, min-entropy lacks convenient properties such as the chain rule. To address this issue, we enhance the structure-vs-pseudorandomness decomposition used by Göös, Pitassi, and Watson (FOCS 17) and by Yang and Zhang (STOC 24) to prove communication lower bounds. In this paper, we breathe new life into this method in the multi-party setting, presenting a new toolkit for proving multi-party communication lower bounds.
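The two entropy notions at play can be made concrete (a small illustration of the definitions, not of the paper's lower-bound machinery):

```python
import math

# For a distribution p over a finite set:
#   Shannon entropy  H(p)     = -sum_x p_x log2 p_x   (has the chain rule,
#                                but suffers the square-root loss via Pinsker)
#   Min-entropy      H_min(p) = -log2 max_x p_x       (no chain rule, but
#                                avoids the square-root loss)
def shannon(p):
    return -sum(q * math.log2(q) for q in p if q > 0)

def min_entropy(p):
    return -math.log2(max(p))

p = [0.5, 0.25, 0.125, 0.125]
print(shannon(p))      # 1.75
print(min_entropy(p))  # 1.0, always at most the Shannon entropy
```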

Cite as

Mi-Ying (Miryam) Huang, Xinyu Mao, Shuo Wang, Guangxu Yang, and Jiapeng Zhang. A Min-Entropy Approach to Multi-Party Communication Lower Bounds. In 40th Computational Complexity Conference (CCC 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 339, pp. 33:1-33:29, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)



@InProceedings{huang_et_al:LIPIcs.CCC.2025.33,
  author =	{Huang, Mi-Ying (Miryam) and Mao, Xinyu and Wang, Shuo and Yang, Guangxu and Zhang, Jiapeng},
  title =	{{A Min-Entropy Approach to Multi-Party Communication Lower Bounds}},
  booktitle =	{40th Computational Complexity Conference (CCC 2025)},
  pages =	{33:1--33:29},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-379-9},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{339},
  editor =	{Srinivasan, Srikanth},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2025.33},
  URN =		{urn:nbn:de:0030-drops-237273},
  doi =		{10.4230/LIPIcs.CCC.2025.33},
  annote =	{Keywords: communication complexity, lifting theorems, set intersection, chained index}
}
From an Odd Arity Signature to a Holant Dichotomy

Authors: Boning Meng, Juqiu Wang, Mingji Xia, and Jiayi Zheng

Published in: LIPIcs, Volume 339, 40th Computational Complexity Conference (CCC 2025)


Abstract
Holant is an essential framework in the field of counting complexity. For over fifteen years, researchers have been working to clarify the complexity classification of complex-valued Holant on the Boolean domain, a challenge that remains unresolved. In this article, we prove a complexity dichotomy for complex-valued Holant on the Boolean domain when a non-trivial signature of odd arity exists. This dichotomy is based on the dichotomy for #EO and is consequently an FP^NP vs. #P dichotomy as well, stating that each problem is either in FP^NP or #P-hard. Furthermore, we establish a generalized version of the decomposition lemma for complex-valued Holant on the Boolean domain: each signature can be derived from its tensor product with other signatures, or otherwise the problem itself is in FP^NP. We believe this result is a powerful method for building reductions in complex-valued Holant; it also serves as a pivotal technique in the proof of the aforementioned dichotomy.

Cite as

Boning Meng, Juqiu Wang, Mingji Xia, and Jiayi Zheng. From an Odd Arity Signature to a Holant Dichotomy. In 40th Computational Complexity Conference (CCC 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 339, pp. 23:1-23:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{meng_et_al:LIPIcs.CCC.2025.23,
  author =	{Meng, Boning and Wang, Juqiu and Xia, Mingji and Zheng, Jiayi},
  title =	{{From an Odd Arity Signature to a Holant Dichotomy}},
  booktitle =	{40th Computational Complexity Conference (CCC 2025)},
  pages =	{23:1--23:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-379-9},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{339},
  editor =	{Srinivasan, Srikanth},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2025.23},
  URN =		{urn:nbn:de:0030-drops-237177},
  doi =		{10.4230/LIPIcs.CCC.2025.23},
  annote =	{Keywords: Complexity dichotomy, Counting, Holant problem, #P}
}
Document
Direct Sums for Parity Decision Trees

Authors: Tyler Besselman, Mika Göös, Siyao Guo, Gilbert Maystre, and Weiqiang Yuan

Published in: LIPIcs, Volume 339, 40th Computational Complexity Conference (CCC 2025)


Abstract
Direct sum theorems state that the cost of solving k instances of a problem is at least Ω(k) times the cost of solving a single instance. We prove the first such results in the randomised parity decision tree model. We show that a direct sum theorem holds whenever (1) the lower bound for parity decision trees is proved using the discrepancy method; or (2) the lower bound is proved relative to a product distribution.
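The headline statement above can be written out formally. The notation here is a hedged sketch following standard query-complexity conventions, not taken from the paper itself: RPDT denotes randomised parity decision tree complexity and f^k the k-fold product of f. Under either of the paper's two conditions (discrepancy-method lower bounds, or lower bounds relative to a product distribution), the claimed direct sum reads:

```latex
f^{k}(x_1,\dots,x_k) = \bigl(f(x_1),\dots,f(x_k)\bigr),
\qquad
\mathrm{RPDT}\bigl(f^{k}\bigr) \;\ge\; \Omega\!\bigl(k \cdot \mathrm{RPDT}(f)\bigr).
```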

Cite as

Tyler Besselman, Mika Göös, Siyao Guo, Gilbert Maystre, and Weiqiang Yuan. Direct Sums for Parity Decision Trees. In 40th Computational Complexity Conference (CCC 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 339, pp. 16:1-16:38, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{besselman_et_al:LIPIcs.CCC.2025.16,
  author =	{Besselman, Tyler and G\"{o}\"{o}s, Mika and Guo, Siyao and Maystre, Gilbert and Yuan, Weiqiang},
  title =	{{Direct Sums for Parity Decision Trees}},
  booktitle =	{40th Computational Complexity Conference (CCC 2025)},
  pages =	{16:1--16:38},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-379-9},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{339},
  editor =	{Srinivasan, Srikanth},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2025.16},
  URN =		{urn:nbn:de:0030-drops-237105},
  doi =		{10.4230/LIPIcs.CCC.2025.16},
  annote =	{Keywords: direct sum, parity decision trees, query complexity}
}
Document
Enabling Containerisation of Distributed Applications with Real-Time Constraints

Authors: Nasim Samimi, Luca Abeni, Daniel Casini, Mauro Marinoni, Twan Basten, Mitra Nasri, Marc Geilen, and Alessandro Biondi

Published in: LIPIcs, Volume 335, 37th Euromicro Conference on Real-Time Systems (ECRTS 2025)


Abstract
Containerisation is becoming a cornerstone of modern distributed systems thanks to containers' lightweight virtualisation, high portability, and seamless integration with orchestration tools such as Kubernetes. Containers have also gained traction in real-time cyber-physical systems, such as software-defined vehicles, which are characterised by strict timing requirements to ensure safety and performance. Nevertheless, ensuring real-time execution of co-located containers is challenging because of mutual interference caused by sharing the same processing hardware. Existing parallel computing frameworks such as Ray and its Kubernetes-enabled variant, KubeRay, excel at distributed computation but lack support for scheduling policies, such as Linux's SCHED_DEADLINE, that can guarantee real-time timing constraints and CPU resource isolation between containers. To fill this gap, this paper extends Ray to support real-time containers that leverage SCHED_DEADLINE. To this end, we propose KubeDeadline, a novel, modular Kubernetes extension supporting SCHED_DEADLINE. We evaluate our approach through extensive experiments, using synthetic workloads and a case study based on the MobileNet and EfficientNet deep neural networks. Our evaluation shows that KubeDeadline ensures deadline compliance in all synthetic workloads, adds minimal deployment overhead (on the order of milliseconds), and achieves worst-case response times up to 4 times lower than vanilla Kubernetes under background interference.

Cite as

Nasim Samimi, Luca Abeni, Daniel Casini, Mauro Marinoni, Twan Basten, Mitra Nasri, Marc Geilen, and Alessandro Biondi. Enabling Containerisation of Distributed Applications with Real-Time Constraints. In 37th Euromicro Conference on Real-Time Systems (ECRTS 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 335, pp. 3:1-3:29, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


Copy BibTex To Clipboard

@InProceedings{samimi_et_al:LIPIcs.ECRTS.2025.3,
  author =	{Samimi, Nasim and Abeni, Luca and Casini, Daniel and Marinoni, Mauro and Basten, Twan and Nasri, Mitra and Geilen, Marc and Biondi, Alessandro},
  title =	{{Enabling Containerisation of Distributed Applications with Real-Time Constraints}},
  booktitle =	{37th Euromicro Conference on Real-Time Systems (ECRTS 2025)},
  pages =	{3:1--3:29},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-377-5},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{335},
  editor =	{Mancuso, Renato},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ECRTS.2025.3},
  URN =		{urn:nbn:de:0030-drops-235816},
  doi =		{10.4230/LIPIcs.ECRTS.2025.3},
  annote =	{Keywords: Kubernetes, real-time containers, SCHED\_DEADLINE, KubeRay}
}
  • Refine by Type
  • 60 Document/PDF
  • 39 Document/HTML

  • Refine by Publication Year
  • 36 2025
  • 3 2024
  • 7 2023
  • 1 2022
  • 2 2021
  • Show More...

  • Refine by Author
  • 13 Chen, Xi
  • 11 Servedio, Rocco A.
  • 4 De, Anindya
  • 3 Tan, Li-Yang
  • 3 Waingarten, Erik
  • Show More...

  • Refine by Series/Journal
  • 48 LIPIcs
  • 2 OASIcs
  • 2 LITES
  • 7 TGDK
  • 1 DagMan

  • Refine by Classification
  • 7 Theory of computation → Problems, reductions and completeness
  • 6 Theory of computation → Streaming, sublinear and near linear time algorithms
  • 4 Computer systems organization → Real-time systems
  • 4 Theory of computation → Design and analysis of algorithms
  • 3 Computing methodologies → Knowledge representation and reasoning
  • Show More...

  • Refine by Keyword
  • 4 property testing
  • 3 Large Language Models
  • 3 trace reconstruction
  • 2 Boolean functions
  • 2 Complexity dichotomy
  • Show More...
