51 Search Results for "Cohen-Addad, Vincent"


Document
Invited Talk
Recent Progress on Correlation Clustering: From Local Algorithms to Better Approximation Algorithms and Back (Invited Talk)

Authors: Vincent Cohen-Addad

Published in: LIPIcs, Volume 308, 32nd Annual European Symposium on Algorithms (ESA 2024)


Abstract
Correlation clustering is a classic model for clustering problems arising in machine learning and data mining. Given a set of data elements represented as vertices of a graph and pairwise similarities represented as edges, the goal is to find a partition of the vertex set so as to minimize the total number of edges across the parts plus the total number of non-edges within the parts. Introduced in the early 2000s [Bansal et al., 2004], correlation clustering has received a large amount of attention through the years. A natural linear programming relaxation was shown in 2005 to have an integrality gap of at least 2 and at most 2.5 [Ailon et al., 2008], and in 2015 at most 2.06 [Chawla et al., 2015]. In 2021, motivated by large-scale applications, new structural insights allowed the derivation of a simple, practical algorithm that achieves an O(1)-approximation in a variety of models (Massively Parallel, Sublinear, Streaming, or Differentially Private) [Cohen-Addad et al., 2021; Cohen-Addad et al., 2022]. These new insights turned out to be a key building block in designing better algorithms: they serve as a pre-clustering of the input graph that enables algorithms with approximation guarantees significantly better than 2 [Cohen-Addad et al., 2023; Cohen-Addad et al., 2022]. They are also a key component in the new algorithm that achieves a 1.44-approximation [Cao et al., 2024] and in the new local-search-based 1.84-approximation for the Massively Parallel, Sublinear, and Streaming models [Cohen-Addad et al., 2024]. This talk will review these recent developments and the main open research directions. A collection of joint works with Nairen Cao, Silvio Lattanzi, Euiwoong Lee, Shi Li, David Rasmussen Lolck, Slobodan Mitrovic, Alantha Newman, Ashkan Norouzi-Fard, Nikos Parotsidis, Marcin Pilipczuk, Jakub Tarnawski, Mikkel Thorup, Lukas Vogl, Shuyi Yan, and Hanwen Zhang.
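
For concreteness, the disagreement objective described above can be evaluated directly. The following is a minimal Python sketch (an illustration with input conventions of our choosing, not code from the talk): the edges of the graph are the "similar" pairs, and the cost of a partition is the number of edges cut by the partition plus the number of non-edges kept inside a part.

from itertools import combinations

def correlation_clustering_cost(n, edges, partition):
    """Disagreement cost of a partition for (min-disagreement) correlation clustering.

    n         -- number of vertices, labelled 0..n-1
    edges     -- set of frozensets {u, v}: the "similar" pairs
    partition -- dict mapping each vertex to its cluster id
    """
    cost = 0
    for u, v in combinations(range(n), 2):
        similar = frozenset((u, v)) in edges
        together = partition[u] == partition[v]
        if similar and not together:      # similar pair split across parts
            cost += 1
        elif not similar and together:    # dissimilar pair kept in the same part
            cost += 1
    return cost

# Tiny example: a triangle plus an isolated vertex, clustered as {0, 1, 2} and {3}.
edges = {frozenset(e) for e in [(0, 1), (1, 2), (0, 2)]}
print(correlation_clustering_cost(4, edges, {0: 0, 1: 0, 2: 0, 3: 1}))  # -> 0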

Cite as

Vincent Cohen-Addad. Recent Progress on Correlation Clustering: From Local Algorithms to Better Approximation Algorithms and Back (Invited Talk). In 32nd Annual European Symposium on Algorithms (ESA 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 308, pp. 1:1-1:2, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)



@InProceedings{cohenaddad:LIPIcs.ESA.2024.1,
  author =	{Cohen-Addad, Vincent},
  title =	{{Recent Progress on Correlation Clustering: From Local Algorithms to Better Approximation Algorithms and Back}},
  booktitle =	{32nd Annual European Symposium on Algorithms (ESA 2024)},
  pages =	{1:1--1:2},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-338-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{308},
  editor =	{Chan, Timothy and Fischer, Johannes and Iacono, John and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2024.1},
  URN =		{urn:nbn:de:0030-drops-210728},
  doi =		{10.4230/LIPIcs.ESA.2024.1},
  annote =	{Keywords: Approximation Algorithms, Clustering, Local Model}
}
Document
On Connections Between k-Coloring and Euclidean k-Means

Authors: Enver Aman, Karthik C. S., and Sharath Punna

Published in: LIPIcs, Volume 308, 32nd Annual European Symposium on Algorithms (ESA 2024)


Abstract
In the Euclidean k-means problem we are given as input a set P of n points in ℝ^d, and the goal is to find a set C ⊆ ℝ^d of k points so as to minimize the sum of the squared Euclidean distances from each point in P to its closest center in C. In this paper, we formally explore connections between the k-coloring problem on graphs and the Euclidean k-means problem. Our results are as follows:
- For all k ≥ 3, we provide a simple reduction from the k-coloring problem on regular graphs to the Euclidean k-means problem. Moreover, our technique extends to enable a reduction from a structured max-cut problem (which may be considered as a partial 2-coloring problem) to the Euclidean 2-means problem. Thus, we obtain a simple and alternative proof of the NP-hardness of the Euclidean 2-means problem.
- In the other direction, we mimic the O(1.7297ⁿ) time algorithm of Williams [TCS'05] for the max-cut problem on n vertices to obtain an algorithm for the Euclidean 2-means problem with the same runtime, improving on the naive exhaustive search running in 2ⁿ⋅poly(n,d) time.
- We prove similar results and connections as above for the Euclidean k-min-sum problem.
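
As a reference for the objective targeted by these reductions, here is a minimal Python sketch (our own illustration, with hypothetical input conventions) that evaluates the Euclidean k-means cost of a candidate center set C for a point set P.

def kmeans_cost(P, C):
    """Sum of squared Euclidean distances from each point of P to its closest center in C.

    P, C -- lists of points, each point a tuple of d coordinates.
    """
    def sq_dist(p, c):
        return sum((pi - ci) ** 2 for pi, ci in zip(p, c))
    return sum(min(sq_dist(p, c) for c in C) for p in P)

# Example in R^2 with k = 2 centers.
P = [(0.0, 0.0), (1.0, 0.0), (10.0, 0.0)]
C = [(0.5, 0.0), (10.0, 0.0)]
print(kmeans_cost(P, C))  # 0.25 + 0.25 + 0.0 = 0.5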

Cite as

Enver Aman, Karthik C. S., and Sharath Punna. On Connections Between k-Coloring and Euclidean k-Means. In 32nd Annual European Symposium on Algorithms (ESA 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 308, pp. 9:1-9:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)



@InProceedings{aman_et_al:LIPIcs.ESA.2024.9,
  author =	{Aman, Enver and Karthik C. S. and Punna, Sharath},
  title =	{{On Connections Between k-Coloring and Euclidean k-Means}},
  booktitle =	{32nd Annual European Symposium on Algorithms (ESA 2024)},
  pages =	{9:1--9:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-338-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{308},
  editor =	{Chan, Timothy and Fischer, Johannes and Iacono, John and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2024.9},
  URN =		{urn:nbn:de:0030-drops-210808},
  doi =		{10.4230/LIPIcs.ESA.2024.9},
  annote =	{Keywords: k-means, k-minsum, Euclidean space, fine-grained complexity}
}
Document
On Finding Longest Palindromic Subsequences Using Longest Common Subsequences

Authors: Gerth Stølting Brodal, Rolf Fagerberg, and Casper Moldrup Rysgaard

Published in: LIPIcs, Volume 308, 32nd Annual European Symposium on Algorithms (ESA 2024)


Abstract
Two standard textbook problems illustrating dynamic programming are to find the longest common subsequence (LCS) between two strings and to find the longest palindromic subsequence (LPS) of a single string. A popular claim is that the longest palindromic subsequence in a string can be computed as the longest common subsequence between the string and the reversed string. We prove that the correctness of this claim depends on how the longest common subsequence is computed. In particular, we prove that the classical dynamic programming solution by Wagner and Fischer [JACM 1974] for finding an LCS in fact does find an LPS, while a slightly different LCS backtracking strategy makes the algorithm fail to always report a palindrome.
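
To make the claim concrete, the following minimal Python sketch (an illustration, not the paper's code) computes the LCS length with the classical Wagner-Fischer dynamic program; running it on a string and its reverse gives the LPS length. The subtlety studied in the paper concerns which common subsequence a backtracking step reports, not the reported length.

def lcs_length(a, b):
    """Classic O(|a|*|b|) dynamic program for the length of a longest common subsequence."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]

s = "character"
print(lcs_length(s, s[::-1]))  # 5, the length of an LPS of s such as "carac"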

Cite as

Gerth Stølting Brodal, Rolf Fagerberg, and Casper Moldrup Rysgaard. On Finding Longest Palindromic Subsequences Using Longest Common Subsequences. In 32nd Annual European Symposium on Algorithms (ESA 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 308, pp. 35:1-35:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)



@InProceedings{brodal_et_al:LIPIcs.ESA.2024.35,
  author =	{Brodal, Gerth St{\o}lting and Fagerberg, Rolf and Rysgaard, Casper Moldrup},
  title =	{{On Finding Longest Palindromic Subsequences Using Longest Common Subsequences}},
  booktitle =	{32nd Annual European Symposium on Algorithms (ESA 2024)},
  pages =	{35:1--35:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-338-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{308},
  editor =	{Chan, Timothy and Fischer, Johannes and Iacono, John and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2024.35},
  URN =		{urn:nbn:de:0030-drops-211068},
  doi =		{10.4230/LIPIcs.ESA.2024.35},
  annote =	{Keywords: Palindromic subsequence, longest common subsequence, dynamic programming}
}
Document
Making Multicurves Cross Minimally on Surfaces

Authors: Loïc Dubois

Published in: LIPIcs, Volume 308, 32nd Annual European Symposium on Algorithms (ESA 2024)


Abstract
On an orientable surface S, consider a collection Γ of closed curves. The (geometric) intersection number i_S(Γ) is the minimum number of self-intersections that a collection Γ' can have, where Γ' results from a continuous deformation (homotopy) of Γ. We provide algorithms that compute i_S(Γ) and such a Γ', assuming that Γ is given by a collection of closed walks of length n in a graph M cellularly embedded on S, in O(n log n) time when M and S are fixed. The state of the art is a paper of Despré and Lazarus [SoCG 2017, J. ACM 2019], who compute i_S(Γ) in O(n²) time, and Γ' in O(n⁴) time if Γ is a single closed curve. Our result is more general since we can put an arbitrary number of closed curves in minimal position. Also, our algorithms are quasi-linear in n instead of quadratic and quartic. Most importantly, our proofs are simpler, shorter, and more structured. We use techniques from two-dimensional topology and from the theory of hyperbolic surfaces. Most notably, we prove a new property of the reducing triangulations introduced by Colin de Verdière, Despré, and Dubois [SODA 2024], reducing our problem to the case of surfaces with boundary. As a key subroutine, we rely on an algorithm of Fulek and Tóth [JCO 2020].

Cite as

Loïc Dubois. Making Multicurves Cross Minimally on Surfaces. In 32nd Annual European Symposium on Algorithms (ESA 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 308, pp. 50:1-50:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)



@InProceedings{dubois:LIPIcs.ESA.2024.50,
  author =	{Dubois, Lo\"{i}c},
  title =	{{Making Multicurves Cross Minimally on Surfaces}},
  booktitle =	{32nd Annual European Symposium on Algorithms (ESA 2024)},
  pages =	{50:1--50:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-338-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{308},
  editor =	{Chan, Timothy and Fischer, Johannes and Iacono, John and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2024.50},
  URN =		{urn:nbn:de:0030-drops-211216},
  doi =		{10.4230/LIPIcs.ESA.2024.50},
  annote =	{Keywords: Algorithms, Topology, Surfaces, Closed Curves, Geometric Intersection Number}
}
Document
Euclidean Capacitated Vehicle Routing in the Random Setting: A 1.55-Approximation Algorithm

Authors: Zipei Nie and Hang Zhou

Published in: LIPIcs, Volume 308, 32nd Annual European Symposium on Algorithms (ESA 2024)


Abstract
We study the unit-demand capacitated vehicle routing problem in the random setting of the Euclidean plane. The objective is to visit n random terminals in a square using a set of tours of minimum total length, such that each tour visits the depot and at most k terminals. We design an algorithm combining the classical sweep heuristic and the framework for the Euclidean traveling salesman problem due to Arora [J. ACM 1998] and Mitchell [SICOMP 1999]. We show that our algorithm is a polynomial-time approximation algorithm with ratio at most 1.55 asymptotically almost surely. This improves on the prior ratio of 1.915 due to Mathieu and Zhou [RSA 2022]. In addition, we conjecture that, for any ε > 0, our algorithm is a (1+ε)-approximation asymptotically almost surely.
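
The classical sweep heuristic mentioned above is simple to state: order the terminals by polar angle around the depot and cut the ordering into consecutive groups of at most k terminals, one tour per group. The Python sketch below is a simplified illustration of that step only; the paper's algorithm additionally optimizes the routing inside the groups using the Arora/Mitchell framework.

import math

def sweep_tours(depot, terminals, k):
    """Group terminals into tours of at most k points by polar angle around the depot."""
    order = sorted(terminals,
                   key=lambda p: math.atan2(p[1] - depot[1], p[0] - depot[0]))
    return [order[i:i + k] for i in range(0, len(order), k)]

def tour_length(depot, tour):
    """Length of the closed tour depot -> tour[0] -> ... -> tour[-1] -> depot."""
    stops = [depot] + tour + [depot]
    return sum(math.dist(a, b) for a, b in zip(stops, stops[1:]))

depot = (0.5, 0.5)
terminals = [(0.1, 0.9), (0.9, 0.8), (0.2, 0.2), (0.8, 0.1), (0.6, 0.95)]
tours = sweep_tours(depot, terminals, k=2)
print(sum(tour_length(depot, t) for t in tours))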

Cite as

Zipei Nie and Hang Zhou. Euclidean Capacitated Vehicle Routing in the Random Setting: A 1.55-Approximation Algorithm. In 32nd Annual European Symposium on Algorithms (ESA 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 308, pp. 91:1-91:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)



@InProceedings{nie_et_al:LIPIcs.ESA.2024.91,
  author =	{Nie, Zipei and Zhou, Hang},
  title =	{{Euclidean Capacitated Vehicle Routing in the Random Setting: A 1.55-Approximation Algorithm}},
  booktitle =	{32nd Annual European Symposium on Algorithms (ESA 2024)},
  pages =	{91:1--91:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-338-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{308},
  editor =	{Chan, Timothy and Fischer, Johannes and Iacono, John and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2024.91},
  URN =		{urn:nbn:de:0030-drops-211627},
  doi =		{10.4230/LIPIcs.ESA.2024.91},
  annote =	{Keywords: capacitated vehicle routing, approximation algorithm, combinatorial optimization}
}
Document
Fully Dynamic k-Means Coreset in Near-Optimal Update Time

Authors: Max Dupré la Tour, Monika Henzinger, and David Saulpic

Published in: LIPIcs, Volume 308, 32nd Annual European Symposium on Algorithms (ESA 2024)


Abstract
In this paper we study the problem of maintaining a solution to k-median and k-means clustering in a fully dynamic setting. To do so, we present an algorithm to efficiently maintain a coreset, a compressed version of the dataset, that allows easy computation of a clustering solution at query time. Our coreset algorithm has near-optimal update time of Õ(k) in general metric spaces, which reduces to Õ(d) in the Euclidean space ℝ^d. The query time is O(k²) in general metrics, and O(kd) in ℝ^d. To maintain a constant-factor approximation for k-median and k-means clustering in Euclidean space, this directly leads to an algorithm with update time Õ(d) and query time Õ(kd + k²). To maintain an O(polylog k)-approximation, the query time is reduced to Õ(kd).
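
The reason a small coreset suffices is that clustering costs can be evaluated on the weighted summary instead of the full, changing dataset. Here is a minimal Python sketch (the interface is hypothetical, not the paper's) of answering a k-means cost query from a weighted coreset.

def weighted_kmeans_cost(coreset, centers):
    """Approximate k-means cost of `centers`, evaluated on a weighted coreset.

    coreset -- list of (point, weight) pairs, each point a tuple of d coordinates
    centers -- list of k candidate centers
    """
    def sq_dist(p, c):
        return sum((pi - ci) ** 2 for pi, ci in zip(p, c))
    return sum(w * min(sq_dist(p, c) for c in centers) for p, w in coreset)

# Two coreset points standing in for 100 and 50 original points respectively.
coreset = [((0.0, 0.0), 100.0), ((5.0, 5.0), 50.0)]
print(weighted_kmeans_cost(coreset, [(0.0, 0.0), (5.0, 4.0)]))  # 100*0 + 50*1 = 50.0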

Cite as

Max Dupré la Tour, Monika Henzinger, and David Saulpic. Fully Dynamic k-Means Coreset in Near-Optimal Update Time. In 32nd Annual European Symposium on Algorithms (ESA 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 308, pp. 100:1-100:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)



@InProceedings{latour_et_al:LIPIcs.ESA.2024.100,
  author =	{la Tour, Max Dupr\'{e} and Henzinger, Monika and Saulpic, David},
  title =	{{Fully Dynamic k-Means Coreset in Near-Optimal Update Time}},
  booktitle =	{32nd Annual European Symposium on Algorithms (ESA 2024)},
  pages =	{100:1--100:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-338-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{308},
  editor =	{Chan, Timothy and Fischer, Johannes and Iacono, John and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2024.100},
  URN =		{urn:nbn:de:0030-drops-211716},
  doi =		{10.4230/LIPIcs.ESA.2024.100},
  annote =	{Keywords: clustering, fully-dynamic, coreset, k-means}
}
Document
APPROX
Hybrid k-Clustering: Blending k-Median and k-Center

Authors: Fedor V. Fomin, Petr A. Golovach, Tanmay Inamdar, Saket Saurabh, and Meirav Zehavi

Published in: LIPIcs, Volume 317, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2024)


Abstract
We propose a novel clustering model encompassing two well-known clustering models: k-center clustering and k-median clustering. In the Hybrid k-Clustering problem, given a set P of points in ℝ^d, an integer k, and a non-negative real r, our objective is to position k closed balls of radius r so as to minimize the sum of distances from points not covered by the balls to their closest balls. Equivalently, we seek an optimal L₁-fitting of a union of k balls of radius r to a set of points in Euclidean space. When r = 0, this corresponds to k-median; when the minimum sum is zero, indicating complete coverage of all points, it is k-center. Our primary result is a bicriteria approximation algorithm that, for a given ε > 0, produces a hybrid k-clustering with balls of radius (1+ε)r. This algorithm achieves a cost at most (1+ε) times the optimum, and it operates in time 2^{(kd/ε)^𝒪(1)} ⋅ n^𝒪(1). Notably, considering the established lower bounds on k-center and k-median, our bicriteria approximation stands as the best possible result for Hybrid k-Clustering.
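
A minimal Python sketch of the hybrid objective defined above (an illustration with hypothetical input conventions): each point pays its Euclidean distance to the closest of the k balls of radius r, so r = 0 recovers the k-median cost and a total cost of zero certifies a feasible k-center solution of radius r.

import math

def hybrid_cost(P, centers, r):
    """Hybrid k-clustering objective: each point pays max(0, dist to closest center - r)."""
    return sum(max(0.0, min(math.dist(p, c) for c in centers) - r) for p in P)

P = [(0.0, 0.0), (2.0, 0.0), (10.0, 0.0)]
centers = [(1.0, 0.0), (10.0, 0.0)]
print(hybrid_cost(P, centers, r=0.0))  # k-median cost: 1 + 1 + 0 = 2.0
print(hybrid_cost(P, centers, r=1.0))  # all points covered by the balls -> 0.0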

Cite as

Fedor V. Fomin, Petr A. Golovach, Tanmay Inamdar, Saket Saurabh, and Meirav Zehavi. Hybrid k-Clustering: Blending k-Median and k-Center. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 317, pp. 4:1-4:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)



@InProceedings{fomin_et_al:LIPIcs.APPROX/RANDOM.2024.4,
  author =	{Fomin, Fedor V. and Golovach, Petr A. and Inamdar, Tanmay and Saurabh, Saket and Zehavi, Meirav},
  title =	{{Hybrid k-Clustering: Blending k-Median and k-Center}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2024)},
  pages =	{4:1--4:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-348-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{317},
  editor =	{Kumar, Amit and Ron-Zewi, Noga},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX/RANDOM.2024.4},
  URN =		{urn:nbn:de:0030-drops-209975},
  doi =		{10.4230/LIPIcs.APPROX/RANDOM.2024.4},
  annote =	{Keywords: clustering, k-center, k-median, Euclidean space, fpt approximation}
}
Document
APPROX
Online k-Median with Consistent Clusters

Authors: Benjamin Moseley, Heather Newman, and Kirk Pruhs

Published in: LIPIcs, Volume 317, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2024)


Abstract
We consider the problem in which n points arrive online over time, and upon arrival must be irrevocably assigned to one of k clusters where the objective is the standard k-median objective. Lower-bound instances show that for this problem no online algorithm can achieve a competitive ratio bounded by any function of n. Thus we turn to a beyond worst-case analysis approach, namely we assume that the online algorithm is a priori provided with a predicted budget B that is an upper bound to the optimal objective value (e.g., obtained from past instances). Our main result is an online algorithm whose competitive ratio (measured against B) is solely a function of k. We also give a lower bound showing that the competitive ratio of every algorithm must depend on k.

Cite as

Benjamin Moseley, Heather Newman, and Kirk Pruhs. Online k-Median with Consistent Clusters. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 317, pp. 20:1-20:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)



@InProceedings{moseley_et_al:LIPIcs.APPROX/RANDOM.2024.20,
  author =	{Moseley, Benjamin and Newman, Heather and Pruhs, Kirk},
  title =	{{Online k-Median with Consistent Clusters}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2024)},
  pages =	{20:1--20:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-348-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{317},
  editor =	{Kumar, Amit and Ron-Zewi, Noga},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX/RANDOM.2024.20},
  URN =		{urn:nbn:de:0030-drops-210133},
  doi =		{10.4230/LIPIcs.APPROX/RANDOM.2024.20},
  annote =	{Keywords: k-median, online algorithms, learning-augmented algorithms, beyond worst-case analysis}
}
Document
APPROX
Learning-Augmented Maximum Independent Set

Authors: Vladimir Braverman, Prathamesh Dharangutte, Vihan Shah, and Chen Wang

Published in: LIPIcs, Volume 317, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2024)


Abstract
We study the Maximum Independent Set (MIS) problem on general graphs within the framework of learning-augmented algorithms. The MIS problem is known to be NP-hard and is also NP-hard to approximate to within a factor of n^(1-δ) for any δ > 0. We show that we can break this barrier in the presence of an oracle obtained through predictions from a machine learning model that answers vertex membership queries for a fixed MIS with probability 1/2+ε. In the first setting we consider, the oracle can be queried once per vertex to know if a vertex belongs to a fixed MIS, and the oracle returns the correct answer with probability 1/2 + ε. Under this setting, we show an algorithm that obtains an Õ((√Δ)/ε)-approximation in O(m) time where Δ is the maximum degree of the graph. In the second setting, we allow multiple queries to the oracle for a vertex, each of which is correct with probability 1/2 + ε. For this setting, we show an O(1)-approximation algorithm using O(n/ε²) total queries and Õ(m) runtime.
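
In the multi-query setting, a standard way to exploit such a noisy oracle is majority-vote amplification: by a Chernoff bound, on the order of log(1/δ)/ε² independent queries to a (1/2 + ε)-correct oracle make the majority answer wrong with probability at most δ. The Python sketch below is a generic illustration of that amplification step with a simulated oracle; it is not the paper's algorithm.

import random

def majority_vote(noisy_oracle, v, eps, repetitions=None):
    """Query a noisy membership oracle several times and return the majority answer."""
    if repetitions is None:
        repetitions = int(10 / eps ** 2) | 1   # an odd number of queries, ~O(1/eps^2)
    yes = sum(noisy_oracle(v) for _ in range(repetitions))
    return yes * 2 > repetitions

# Simulated oracle for a fixed hidden independent set, correct with probability 1/2 + eps.
hidden_mis, eps = {0, 2, 5}, 0.2
def oracle(v):
    truth = v in hidden_mis
    return truth if random.random() < 0.5 + eps else not truth

print([v for v in range(6) if majority_vote(oracle, v, eps)])  # most likely [0, 2, 5]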

Cite as

Vladimir Braverman, Prathamesh Dharangutte, Vihan Shah, and Chen Wang. Learning-Augmented Maximum Independent Set. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 317, pp. 24:1-24:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)



@InProceedings{braverman_et_al:LIPIcs.APPROX/RANDOM.2024.24,
  author =	{Braverman, Vladimir and Dharangutte, Prathamesh and Shah, Vihan and Wang, Chen},
  title =	{{Learning-Augmented Maximum Independent Set}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2024)},
  pages =	{24:1--24:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-348-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{317},
  editor =	{Kumar, Amit and Ron-Zewi, Noga},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX/RANDOM.2024.24},
  URN =		{urn:nbn:de:0030-drops-210179},
  doi =		{10.4230/LIPIcs.APPROX/RANDOM.2024.24},
  annote =	{Keywords: Learning-augmented algorithms, maximum independent set, graph algorithms}
}
Document
RANDOM
Sparse High Dimensional Expanders via Local Lifts

Authors: Inbar Ben Yaacov, Yotam Dikstein, and Gal Maor

Published in: LIPIcs, Volume 317, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2024)


Abstract
High dimensional expanders (HDXs) are a hypergraph generalization of expander graphs. They are extensively studied in the math and TCS communities due to their many applications. Like expander graphs, HDXs are especially interesting for applications when they are bounded degree, namely, if the number of edges adjacent to every vertex is bounded. However, only a handful of constructions are known to have this property, all of which rely on algebraic techniques. In particular, no random or combinatorial construction of bounded degree high dimensional expanders is known. As a result, our understanding of these objects is limited. The degree of an i-face in an HDX is the number of (i+1)-faces that contain it. In this work we construct complexes whose higher dimensional faces have bounded degree. This is done by giving an elementary and deterministic algorithm that takes as input a regular k-dimensional HDX X and outputs another regular k-dimensional HDX X̂ with twice as many vertices. While the degree of vertices in X̂ grows, the degree of the (k-1)-faces in X̂ stays the same. As a result, we obtain a new "algebra-free" construction of HDXs whose (k-1)-face degree is bounded. Our construction algorithm is based on a simple and natural generalization of the expander graph construction by Bilu and Linial [Bilu and Linial, 2006], which builds expander graphs using lifts coming from edge signings. Our construction is based on local lifts of high dimensional expanders, where a local lift is a new complex whose top-level links are lifts of the links of the original complex. We demonstrate that a local lift of an HDX is also an HDX in many cases. In addition, combining local lifts with existing bounded degree constructions creates new families of bounded degree HDXs with significantly different links than before. For every large enough D, we use this technique to construct families of bounded degree HDXs with links that have diameter ≥ D.

Cite as

Inbar Ben Yaacov, Yotam Dikstein, and Gal Maor. Sparse High Dimensional Expanders via Local Lifts. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 317, pp. 68:1-68:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)



@InProceedings{benyaacov_et_al:LIPIcs.APPROX/RANDOM.2024.68,
  author =	{Ben Yaacov, Inbar and Dikstein, Yotam and Maor, Gal},
  title =	{{Sparse High Dimensional Expanders via Local Lifts}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2024)},
  pages =	{68:1--68:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-348-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{317},
  editor =	{Kumar, Amit and Ron-Zewi, Noga},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX/RANDOM.2024.68},
  URN =		{urn:nbn:de:0030-drops-210612},
  doi =		{10.4230/LIPIcs.APPROX/RANDOM.2024.68},
  annote =	{Keywords: High Dimensional Expanders, HDX, Spectral Expansion, Lifts, Covers, Explicit Constructions, Randomized Constructions, Deterministic Constructions}
}
Document
DeFiAligner: Leveraging Symbolic Analysis and Large Language Models for Inconsistency Detection in Decentralized Finance

Authors: Rundong Gan, Liyi Zhou, Le Wang, Kaihua Qin, and Xiaodong Lin

Published in: LIPIcs, Volume 316, 6th Conference on Advances in Financial Technologies (AFT 2024)


Abstract
Decentralized Finance (DeFi) has witnessed a monumental surge, reaching 53.039 billion USD in total value locked. As this sector continues to expand, ensuring the reliability of DeFi smart contracts becomes increasingly crucial. While some users are adept at reading code or the compiled bytecode to understand smart contracts, many rely on documentation. Therefore, discrepancies between the documentation and the deployed code can pose significant risks, whether these discrepancies are due to errors or intentional fraud. To tackle these challenges, we developed DeFiAligner, an end-to-end system to identify inconsistencies between documentation and smart contracts. DeFiAligner incorporates a symbolic execution tool, SEVM, which explores execution paths of on-chain binary code, recording memory and stack states. It automatically generates symbolic expressions for token balance changes and branch conditions, which, along with related project documents, are processed by LLMs. Using structured prompts, the LLMs evaluate the alignment between the symbolic expressions and the documentation. Our tests across three distinct scenarios demonstrate DeFiAligner’s capability to automate inconsistency detection in DeFi, achieving recall rates of 92% and 90% on two public datasets respectively.

Cite as

Rundong Gan, Liyi Zhou, Le Wang, Kaihua Qin, and Xiaodong Lin. DeFiAligner: Leveraging Symbolic Analysis and Large Language Models for Inconsistency Detection in Decentralized Finance. In 6th Conference on Advances in Financial Technologies (AFT 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 316, pp. 7:1-7:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)



@InProceedings{gan_et_al:LIPIcs.AFT.2024.7,
  author =	{Gan, Rundong and Zhou, Liyi and Wang, Le and Qin, Kaihua and Lin, Xiaodong},
  title =	{{DeFiAligner: Leveraging Symbolic Analysis and Large Language Models for Inconsistency Detection in Decentralized Finance}},
  booktitle =	{6th Conference on Advances in Financial Technologies (AFT 2024)},
  pages =	{7:1--7:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-345-4},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{316},
  editor =	{B\"{o}hme, Rainer and Kiffer, Lucianna},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.AFT.2024.7},
  URN =		{urn:nbn:de:0030-drops-209431},
  doi =		{10.4230/LIPIcs.AFT.2024.7},
  annote =	{Keywords: Decentralized Finance Security, Large Language Models, Project Review, Symbolic Analysis, Smart Contracts}
}
Document
A Dynamic Logic for Symbolic Execution for the Smart Contract Programming Language Michelson

Authors: Barnabas Arvay, Thi Thu Ha Doan, and Peter Thiemann

Published in: LIPIcs, Volume 313, 38th European Conference on Object-Oriented Programming (ECOOP 2024)


Abstract
Verification of smart contracts is an important topic in the context of blockchain technology. We study an approach to verification that is based on symbolic execution. As a formal basis for symbolic execution, we design a dynamic logic for Michelson, the smart contract language of the Tezos blockchain, and prove its soundness in the proof assistant Agda. Towards the soundness proof we formalize the concrete semantics as well as its symbolic counterpart in a unified setting. The logic encompasses single contract runs as well as inter-contract runs chained in a single transaction.

Cite as

Barnabas Arvay, Thi Thu Ha Doan, and Peter Thiemann. A Dynamic Logic for Symbolic Execution for the Smart Contract Programming Language Michelson. In 38th European Conference on Object-Oriented Programming (ECOOP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 313, pp. 3:1-3:26, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)



@InProceedings{arvay_et_al:LIPIcs.ECOOP.2024.3,
  author =	{Arvay, Barnabas and Doan, Thi Thu Ha and Thiemann, Peter},
  title =	{{A Dynamic Logic for Symbolic Execution for the Smart Contract Programming Language Michelson}},
  booktitle =	{38th European Conference on Object-Oriented Programming (ECOOP 2024)},
  pages =	{3:1--3:26},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-341-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{313},
  editor =	{Aldrich, Jonathan and Salvaneschi, Guido},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ECOOP.2024.3},
  URN =		{urn:nbn:de:0030-drops-208529},
  doi =		{10.4230/LIPIcs.ECOOP.2024.3},
  annote =	{Keywords: Smart Contract, Blockchain, Formal Verification, Symbolic Execution}
}
Document
Type Tailoring

Authors: Ashton Wiersdorf, Stephen Chang, Matthias Felleisen, and Ben Greenman

Published in: LIPIcs, Volume 313, 38th European Conference on Object-Oriented Programming (ECOOP 2024)


Abstract
Type systems evolve too slowly to keep up with the quick evolution of libraries - especially libraries that introduce abstractions. Type tailoring offers a lightweight solution by equipping the core language with an API for modifying the elaboration of surface code into the internal language of the typechecker. Through user-programmable elaboration, tailoring rules appear to improve the precision and expressiveness of the underlying type system. Furthermore, type tailoring cooperates with the host type system by expanding to code that the host then typechecks. In the context of a hygienic metaprogramming system, tailoring rules can even harmoniously compose with one another. Type tailoring has emerged as a theme across several languages and metaprogramming systems, but never with direct support and rarely in the same shape twice. For example, both OCaml and Typed Racket enable forms of tailoring, but in quite different ways. This paper identifies key dimensions of type tailoring systems and tradeoffs along each dimension. It demonstrates the usefulness of tailoring with examples that cover sized vectors, database queries, and optional types. Finally, it outlines a vision for future research at the intersection of types and metaprogramming.

Cite as

Ashton Wiersdorf, Stephen Chang, Matthias Felleisen, and Ben Greenman. Type Tailoring. In 38th European Conference on Object-Oriented Programming (ECOOP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 313, pp. 44:1-44:27, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)



@InProceedings{wiersdorf_et_al:LIPIcs.ECOOP.2024.44,
  author =	{Wiersdorf, Ashton and Chang, Stephen and Felleisen, Matthias and Greenman, Ben},
  title =	{{Type Tailoring}},
  booktitle =	{38th European Conference on Object-Oriented Programming (ECOOP 2024)},
  pages =	{44:1--44:27},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-341-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{313},
  editor =	{Aldrich, Jonathan and Salvaneschi, Guido},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ECOOP.2024.44},
  URN =		{urn:nbn:de:0030-drops-208933},
  doi =		{10.4230/LIPIcs.ECOOP.2024.44},
  annote =	{Keywords: Types, Metaprogramming, Macros, Partial Evaluation}
}
Document
Integrals Within Integrals: A Formalization of the Gagliardo-Nirenberg-Sobolev Inequality

Authors: Floris van Doorn and Heather Macbeth

Published in: LIPIcs, Volume 309, 15th International Conference on Interactive Theorem Proving (ITP 2024)


Abstract
We introduce an abstraction which allows arguments involving iterated integrals to be formalized conveniently in type-theory-based proof assistants. We call this abstraction the marginal construction, since it is connected to the marginal distribution in probability theory. The marginal construction gracefully handles permutations to the order of integration (Tonelli’s theorem in several variables), as well as arguments involving an induction over dimension. We implement the marginal construction and several applications in the language Lean. The most difficult of these applications, the Gagliardo-Nirenberg-Sobolev inequality, is a foundational result in the theory of elliptic partial differential equations and has not previously been formalized.
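
For reference, a classical textbook form of the Gagliardo-Nirenberg-Sobolev inequality (the hypotheses and constants in the Lean formalization may be stated differently) reads, for every compactly supported C¹ function u : ℝ^n → ℝ with 1 ≤ p < n and p* = np/(n-p),

\[
  \|u\|_{L^{p^{*}}(\mathbb{R}^{n})} \;\le\; C(n,p)\, \|\nabla u\|_{L^{p}(\mathbb{R}^{n})}.
\]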

Cite as

Floris van Doorn and Heather Macbeth. Integrals Within Integrals: A Formalization of the Gagliardo-Nirenberg-Sobolev Inequality. In 15th International Conference on Interactive Theorem Proving (ITP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 309, pp. 37:1-37:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)



@InProceedings{vandoorn_et_al:LIPIcs.ITP.2024.37,
  author =	{van Doorn, Floris and Macbeth, Heather},
  title =	{{Integrals Within Integrals: A Formalization of the Gagliardo-Nirenberg-Sobolev Inequality}},
  booktitle =	{15th International Conference on Interactive Theorem Proving (ITP 2024)},
  pages =	{37:1--37:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-337-9},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{309},
  editor =	{Bertot, Yves and Kutsia, Temur and Norrish, Michael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITP.2024.37},
  URN =		{urn:nbn:de:0030-drops-207657},
  doi =		{10.4230/LIPIcs.ITP.2024.37},
  annote =	{Keywords: Sobolev inequality, measure theory, Lean, formalized mathematics}
}
Document
Mutational Fuzz Testing for Constraint Modeling Systems

Authors: Wout Vanroose, Ignace Bleukx, Jo Devriendt, Dimos Tsouros, Hélène Verhaeghe, and Tias Guns

Published in: LIPIcs, Volume 307, 30th International Conference on Principles and Practice of Constraint Programming (CP 2024)


Abstract
Constraint programming (CP) modeling languages, like MiniZinc, Essence and CPMpy, play a crucial role in making CP technology accessible to non-experts. Both solver-independent modeling frameworks and solvers themselves are complex pieces of software that can contain bugs, which undermines their usefulness. Mutational fuzz testing is a way to test complex systems by stochastically mutating input and verifying preserved properties of the mutated output. We investigate different mutations and verification methods that can be used on the constraint specifications directly. This includes methods proposed in the context of SMT problem specifications, as well as new methods related to global constraints, optimization, and solution counting/preservation. Our results show that such a fuzz testing approach improves the overall code coverage of a modeling system compared to only unit testing, and is able to find bugs in the whole toolchain, from the modeling language transformations themselves to the underlying solvers.
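
The mutate-and-verify loop itself is simple to outline. The following Python sketch is framework-agnostic (the model, solve, and mutation interfaces are hypothetical placeholders rather than CPMpy or MiniZinc APIs): apply a mutation that is intended to preserve satisfiability, then flag any run where the verdict changes or the toolchain crashes.

import random

def fuzz_model(model, solve, mutations, rounds=100, seed=0):
    """Mutational fuzzing loop over a constraint model.

    model     -- list of constraint objects (opaque to this loop)
    solve     -- function: list of constraints -> True/False (SAT/UNSAT)
    mutations -- list of functions: list of constraints -> mutated list of constraints
    """
    rng = random.Random(seed)
    baseline = solve(model)
    bugs = []
    for i in range(rounds):
        mutate = rng.choice(mutations)
        mutant = mutate(list(model))          # mutate a copy of the constraint list
        try:
            verdict = solve(mutant)
        except Exception as crash:            # crashes anywhere in the toolchain are bugs too
            bugs.append((i, mutate.__name__, repr(crash)))
            continue
        if verdict != baseline:               # satisfiability should have been preserved
            bugs.append((i, mutate.__name__, verdict))
    return bugs

# Example mutation: shuffling the constraint order should never change the answer.
def shuffle_constraints(constraints):
    random.shuffle(constraints)
    return constraints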

Cite as

Wout Vanroose, Ignace Bleukx, Jo Devriendt, Dimos Tsouros, Hélène Verhaeghe, and Tias Guns. Mutational Fuzz Testing for Constraint Modeling Systems. In 30th International Conference on Principles and Practice of Constraint Programming (CP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 307, pp. 29:1-29:25, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)



@InProceedings{vanroose_et_al:LIPIcs.CP.2024.29,
  author =	{Vanroose, Wout and Bleukx, Ignace and Devriendt, Jo and Tsouros, Dimos and Verhaeghe, H\'{e}l\`{e}ne and Guns, Tias},
  title =	{{Mutational Fuzz Testing for Constraint Modeling Systems}},
  booktitle =	{30th International Conference on Principles and Practice of Constraint Programming (CP 2024)},
  pages =	{29:1--29:25},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-336-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{307},
  editor =	{Shaw, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2024.29},
  URN =		{urn:nbn:de:0030-drops-207149},
  doi =		{10.4230/LIPIcs.CP.2024.29},
  annote =	{Keywords: fuzz testing, Constraint modeling language, bugs, mutational testing, modeling, constraint reformulation}
}
