33 Search Results for "Kaplan, Haim"


Document
Optimal Energetic Paths for Electric Cars

Authors: Dani Dorfman, Haim Kaplan, Robert E. Tarjan, and Uri Zwick

Published in: LIPIcs, Volume 274, 31st Annual European Symposium on Algorithms (ESA 2023)


Abstract
A weighted directed graph G = (V,A,c), where A ⊆ V × V and c:A → ℝ, naturally describes a road network in which an electric car, or vehicle (EV), can roam. An arc uv ∈ A models a road segment connecting the two vertices (junctions) u and v. The cost c(uv) of the arc uv is the amount of energy the car needs to travel from u to v. This amount can be positive, zero or negative. We consider both the more realistic scenario in which there are no negative cycles in the graph and the more challenging scenario, which can also be motivated, in which negative cycles may be present. The electric car has a battery that can store up to B units of energy. The car can traverse an arc uv ∈ A only if it is at u and the charge b in its battery satisfies b ≥ c(uv). If the car traverses the arc uv, then it reaches v with a charge of min{b-c(uv),B} in its battery. Arcs with a positive cost deplete the battery, while arcs with negative costs may charge the battery, but not above its capacity of B. If the car is at a vertex u and cannot traverse any outgoing arc of u, then it is stuck and cannot continue traveling. We consider the following natural problem: Given two vertices s,t ∈ V, can the car travel from s to t, starting at s with an initial charge b, where 0 ≤ b ≤ B? If so, what is the maximum charge with which the car can reach t? Equivalently, what is the smallest depletion δ_{B,b}(s,t) such that the car can reach t with a charge of b-δ_{B,b}(s,t) in its battery, and which path should the car follow to achieve this? We also refer to δ_{B,b}(s,t) as the energetic cost of traveling from s to t. We let δ_{B,b}(s,t) = ∞ if the car cannot travel from s to t starting with an initial charge of b. The problem of computing energetic costs is a strict generalization of the standard shortest paths problem. When there are no negative cycles, the single-source version of the problem can be solved using simple adaptations of the classical Bellman-Ford and Dijkstra algorithms. More involved algorithms are required when the graph may contain negative cycles.
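The following is a minimal sketch (not the authors' algorithm) of the Bellman-Ford adaptation that the abstract alludes to for the negative-cycle-free case: it propagates, for every vertex, the maximum charge with which that vertex can be reached, using the traversal rule b ≥ c(uv) and the capped update min{b-c(uv), B}. The function name, the graph encoding, and the assumption that n-1 relaxation rounds suffice when there are no negative cycles are mine.

import math

def max_final_charge(n, arcs, s, t, B, b):
    """Bellman-Ford-style relaxation computing the maximum charge with which an
    electric car that starts at s with charge b (0 <= b <= B) can reach t.
    arcs is a list of (u, v, cost) triples over vertices 0..n-1; costs may be
    negative.  Assumes no negative cycles (the simpler scenario), so n-1
    relaxation rounds suffice.  Returns -inf if t cannot be reached."""
    best = [-math.inf] * n              # best[v] = maximum charge with which v is reachable
    best[s] = b
    for _ in range(n - 1):
        changed = False
        for u, v, c in arcs:
            if best[u] >= c:            # the arc uv can be traversed only if the charge is >= c(uv)
                cand = min(best[u] - c, B)   # the battery never exceeds its capacity B
                if cand > best[v]:
                    best[v] = cand
                    changed = True
        if not changed:
            break
    return best[t]

# The energetic cost delta_{B,b}(s,t) is b minus the returned value (infinite if unreachable).
arcs = [(0, 1, 3), (1, 2, -5), (2, 3, 4)]
print(max_final_charge(4, arcs, 0, 3, B=6, b=4))   # reaches t with charge 2, so the energetic cost is 2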

Cite as

Dani Dorfman, Haim Kaplan, Robert E. Tarjan, and Uri Zwick. Optimal Energetic Paths for Electric Cars. In 31st Annual European Symposium on Algorithms (ESA 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 274, pp. 42:1-42:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)



@InProceedings{dorfman_et_al:LIPIcs.ESA.2023.42,
  author =	{Dorfman, Dani and Kaplan, Haim and Tarjan, Robert E. and Zwick, Uri},
  title =	{{Optimal Energetic Paths for Electric Cars}},
  booktitle =	{31st Annual European Symposium on Algorithms (ESA 2023)},
  pages =	{42:1--42:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-295-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{274},
  editor =	{G{\o}rtz, Inge Li and Farach-Colton, Martin and Puglisi, Simon J. and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2023.42},
  URN =		{urn:nbn:de:0030-drops-186955},
  doi =		{10.4230/LIPIcs.ESA.2023.42},
  annote =	{Keywords: Electric cars, Optimal Paths, Battery depletion}
}
Document
The Unweighted and Weighted Reverse Shortest Path Problem for Disk Graphs

Authors: Haim Kaplan, Matthew J. Katz, Rachel Saban, and Micha Sharir

Published in: LIPIcs, Volume 274, 31st Annual European Symposium on Algorithms (ESA 2023)


Abstract
We study the reverse shortest path problem on disk graphs in the plane. In this problem we consider the proximity graph of a set of n disks of arbitrary radii: two disks are connected in this graph if the distance between them is at most some threshold parameter r, and intersection graphs are the special case r = 0. We give an algorithm that, given a target length k, computes the smallest value of r for which there is a path of length at most k between a given pair of disks in the proximity graph. Our algorithm runs in O^*(n^{5/4}) randomized expected time, which improves to O^*(n^{6/5}) for unit disk graphs, where all the disks have the same radius. Our technique is robust and can be applied to many variants of the problem. One significant variant is the case of weighted proximity graphs, where edges are assigned real weights equal to the distance between the disks or between their centers, and k is replaced by a target weight w. In other variants, we want to optimize a parameter different from r, such as a scale factor of the radii of the disks. The main technique for the decision version of the problem (determining whether the graph with a given r has the desired property) is based on efficient implementations of BFS (for the unweighted case) and of Dijkstra’s algorithm (for the weighted case), using efficient data structures for maintaining the bichromatic closest pair for certain bicliques and several distance functions. The optimization problem is then solved by combining the resulting decision procedure with enhanced variants of the interval shrinking and bifurcation technique of [R. Ben Avraham et al., 2015].
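As a hedged illustration only (it ignores the paper's data structures and its interval-shrinking/bifurcation machinery), the sketch below solves tiny instances of the unweighted reverse shortest path problem directly: the optimal r is one of the O(n²) pairwise disk distances, so it binary-searches over those candidates, running a plain BFS on the proximity graph for each tested threshold. All names are assumptions of this toy version.

from collections import deque
from itertools import combinations

def disk_distance(d1, d2):
    """Distance between two disks given as (cx, cy, radius); 0 if they intersect."""
    (x1, y1, r1), (x2, y2, r2) = d1, d2
    return max(0.0, ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5 - r1 - r2)

def hops(disks, r, s, t):
    """BFS distance (number of edges) between disks s and t in the proximity graph with threshold r."""
    dist = {s: 0}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        for v in range(len(disks)):
            if v not in dist and disk_distance(disks[u], disks[v]) <= r:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist.get(t, float('inf'))

def reverse_shortest_path(disks, s, t, k):
    """Smallest r for which the proximity graph contains an s-t path of at most k edges."""
    candidates = sorted({disk_distance(a, b) for a, b in combinations(disks, 2)})
    lo, hi, best = 0, len(candidates) - 1, None
    while lo <= hi:                      # hops() is monotone in r, so binary search works
        mid = (lo + hi) // 2
        if hops(disks, candidates[mid], s, t) <= k:
            best, hi = candidates[mid], mid - 1
        else:
            lo = mid + 1
    return best

disks = [(0, 0, 1), (4, 0, 1), (9, 0, 1)]
print(reverse_shortest_path(disks, 0, 2, k=2))   # 3.0: with threshold 3 both gaps close in two hops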

Cite as

Haim Kaplan, Matthew J. Katz, Rachel Saban, and Micha Sharir. The Unweighted and Weighted Reverse Shortest Path Problem for Disk Graphs. In 31st Annual European Symposium on Algorithms (ESA 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 274, pp. 67:1-67:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)



@InProceedings{kaplan_et_al:LIPIcs.ESA.2023.67,
  author =	{Kaplan, Haim and Katz, Matthew J. and Saban, Rachel and Sharir, Micha},
  title =	{{The Unweighted and Weighted Reverse Shortest Path Problem for Disk Graphs}},
  booktitle =	{31st Annual European Symposium on Algorithms (ESA 2023)},
  pages =	{67:1--67:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-295-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{274},
  editor =	{G{\o}rtz, Inge Li and Farach-Colton, Martin and Puglisi, Simon J. and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2023.67},
  URN =		{urn:nbn:de:0030-drops-187208},
  doi =		{10.4230/LIPIcs.ESA.2023.67},
  annote =	{Keywords: Computational geometry, geometric optimization, disk graphs, BFS, Dijkstra’s algorithm, reverse shortest path}
}
Document
Track A: Algorithms, Complexity and Games
Expander Decomposition with Fewer Inter-Cluster Edges Using a Spectral Cut Player

Authors: Daniel Agassy, Dani Dorfman, and Haim Kaplan

Published in: LIPIcs, Volume 261, 50th International Colloquium on Automata, Languages, and Programming (ICALP 2023)


Abstract
A (ϕ,ε)-expander decomposition of a graph G (with n vertices and m edges) is a partition of V into clusters V₁,…,V_k with conductance Φ(G[V_i]) ≥ ϕ, such that there are at most ε m inter-cluster edges. Such a decomposition plays a crucial role in many graph algorithms. We give a randomized Õ(m/ϕ) time algorithm for computing a (ϕ, ϕ log² n)-expander decomposition. This improves upon the (ϕ, ϕ log³ n)-expander decomposition also obtained in Õ(m/ϕ) time by [Saranurak and Wang, SODA 2019] (SW) and brings the number of inter-cluster edges within a logarithmic factor of optimal. One crucial component of SW’s algorithm is a non-stop version of the cut-matching game of [Khandekar, Rao, Vazirani, JACM 2009] (KRV): The cut player does not stop when it gets from the matching player an unbalanced sparse cut, but continues to play on a trimmed part of the large side. The crux of our improvement is the design of a non-stop version of the cleverer cut player of [Orecchia, Schulman, Vazirani, Vishnoi, STOC 2008] (OSVV). The cut player of OSVV uses a more sophisticated random walk, a subtle potential function, and spectral arguments. Designing and analysing a non-stop version of this game was an explicit open question asked by SW.
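To make the two guarantees of a (ϕ,ε)-expander decomposition concrete, here is a hedged verification sketch (my own helper, not part of the paper): it brute-forces the conductance of each induced cluster and counts inter-cluster edges, so it is only meaningful for tiny graphs.

from itertools import combinations

def conductance(adj, cluster):
    """Brute-force conductance of the subgraph induced on `cluster`
    (adj[u] is the list/set of neighbours of u).  Exponential in the
    cluster size, so only usable on tiny examples."""
    C = set(cluster)
    deg = {u: sum(1 for v in adj[u] if v in C) for u in C}   # degrees inside the cluster
    best = float('inf')
    for k in range(1, len(C)):
        for S in combinations(sorted(C), k):
            S = set(S)
            cut = sum(1 for u in S for v in adj[u] if v in C - S)
            vol = min(sum(deg[u] for u in S), sum(deg[u] for u in C - S))
            if vol > 0:
                best = min(best, cut / vol)
    return best

def check_expander_decomposition(adj, clusters, phi, eps):
    """Check the two guarantees: every cluster induces conductance >= phi, and
    at most eps*m edges cross between clusters (singletons are trivially fine)."""
    m = sum(len(adj[u]) for u in adj) // 2
    where = {u: i for i, c in enumerate(clusters) for u in c}
    inter = sum(1 for u in adj for v in adj[u] if where[u] != where[v]) // 2
    return inter <= eps * m and all(len(c) <= 1 or conductance(adj, c) >= phi
                                    for c in clusters)

# two triangles joined by a single edge: one of the 7 edges crosses clusters
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
print(check_expander_decomposition(adj, [{0, 1, 2}, {3, 4, 5}], phi=0.5, eps=0.15))   # True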

Cite as

Daniel Agassy, Dani Dorfman, and Haim Kaplan. Expander Decomposition with Fewer Inter-Cluster Edges Using a Spectral Cut Player. In 50th International Colloquium on Automata, Languages, and Programming (ICALP 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 261, pp. 9:1-9:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)



@InProceedings{agassy_et_al:LIPIcs.ICALP.2023.9,
  author =	{Agassy, Daniel and Dorfman, Dani and Kaplan, Haim},
  title =	{{Expander Decomposition with Fewer Inter-Cluster Edges Using a Spectral Cut Player}},
  booktitle =	{50th International Colloquium on Automata, Languages, and Programming (ICALP 2023)},
  pages =	{9:1--9:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-278-5},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{261},
  editor =	{Etessami, Kousha and Feige, Uriel and Puppis, Gabriele},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2023.9},
  URN =		{urn:nbn:de:0030-drops-180619},
  doi =		{10.4230/LIPIcs.ICALP.2023.9},
  annote =	{Keywords: Expander Decomposition, Cut-Matching Game}
}
Document
Track A: Algorithms, Complexity and Games
Fast Approximation of Search Trees on Trees with Centroid Trees

Authors: Benjamin Aram Berendsohn, Ishay Golinsky, Haim Kaplan, and László Kozma

Published in: LIPIcs, Volume 261, 50th International Colloquium on Automata, Languages, and Programming (ICALP 2023)


Abstract
Search trees on trees (STTs) generalize the fundamental binary search tree (BST) data structure: in STTs the underlying search space is an arbitrary tree, whereas in BSTs it is a path. An optimal BST of size n can be computed for a given distribution of queries in 𝒪(n²) time [Knuth, Acta Inf. 1971] and centroid BSTs provide a nearly-optimal alternative, computable in 𝒪(n) time [Mehlhorn, SICOMP 1977]. By contrast, optimal STTs are not known to be computable in polynomial time, and the fastest constant-approximation algorithm runs in 𝒪(n³) time [Berendsohn, Kozma, SODA 2022]. Centroid trees can be defined for STTs analogously to BSTs, and they have been used in a wide range of algorithmic applications. In the unweighted case (i.e., for a uniform distribution of queries), the centroid tree can be computed in 𝒪(n) time [Brodal, Fagerberg, Pedersen, Östlin, ICALP 2001; Della Giustina, Prezza, Venturini, SPIRE 2019]. These algorithms, however, do not readily extend to the weighted case. Moreover, no approximation guarantees were previously known for centroid trees in either the unweighted or weighted cases. In this paper we revisit centroid trees in a general, weighted setting, and we settle both the algorithmic complexity of constructing them, and the quality of their approximation. For constructing a weighted centroid tree, we give an output-sensitive 𝒪(n log h) ⊆ 𝒪(n log n) time algorithm, where h is the height of the resulting centroid tree. If the weights are of polynomial complexity, the running time is 𝒪(n log log n). We show these bounds to be optimal, in a general decision tree model of computation. For approximation, we prove that the cost of a centroid tree is at most twice the optimum, and this guarantee is best possible, both in the weighted and unweighted cases. We also give tight, fine-grained bounds on the approximation-ratio for bounded-degree trees and on the approximation-ratio of more general α-centroid trees.
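A minimal sketch of the classical unweighted centroid-tree construction that the abstract builds on (recursively root at a vertex whose removal leaves components of at most half the size); this is the folklore O(n log n) recursion, not the paper's output-sensitive O(n log h) or weighted algorithms, and the function names are mine.

def centroid_tree(adj):
    """Build the centroid tree of the tree given by adjacency dict `adj`
    (vertex -> list of neighbours).  Returns a dict mapping each vertex to
    its parent in the centroid tree; the root maps to None."""
    removed = set()

    def sizes_and_parents(root):
        parent, order, i = {root: None}, [root], 0
        while i < len(order):                       # iterative DFS over the current component
            u = order[i]; i += 1
            for v in adj[u]:
                if v not in removed and v != parent[u]:
                    parent[v] = u
                    order.append(v)
        sizes = {}
        for u in reversed(order):
            sizes[u] = 1 + sum(sizes[v] for v in adj[u]
                               if v not in removed and v != parent[u])
        return sizes, parent

    def find_centroid(root):
        sizes, parent = sizes_and_parents(root)
        n, u = sizes[root], root
        while True:                                 # walk towards the heavy child
            heavy = max((v for v in adj[u] if v not in removed and v != parent[u]),
                        key=lambda v: sizes[v], default=None)
            if heavy is None or sizes[heavy] <= n // 2:
                return u                            # all components have size <= n/2
            u = heavy

    centroid_parent = {}
    def build(root, par):
        c = find_centroid(root)
        centroid_parent[c] = par
        removed.add(c)
        for v in adj[c]:                            # recurse on each remaining component
            if v not in removed:
                build(v, c)

    if adj:
        build(next(iter(adj)), None)
    return centroid_parent

# path 0-1-2-3-4: vertex 2 becomes the root of the centroid tree
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(centroid_tree(path))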

Cite as

Benjamin Aram Berendsohn, Ishay Golinsky, Haim Kaplan, and László Kozma. Fast Approximation of Search Trees on Trees with Centroid Trees. In 50th International Colloquium on Automata, Languages, and Programming (ICALP 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 261, pp. 19:1-19:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)



@InProceedings{berendsohn_et_al:LIPIcs.ICALP.2023.19,
  author =	{Berendsohn, Benjamin Aram and Golinsky, Ishay and Kaplan, Haim and Kozma, L\'{a}szl\'{o}},
  title =	{{Fast Approximation of Search Trees on Trees with Centroid Trees}},
  booktitle =	{50th International Colloquium on Automata, Languages, and Programming (ICALP 2023)},
  pages =	{19:1--19:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-278-5},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{261},
  editor =	{Etessami, Kousha and Feige, Uriel and Puppis, Gabriele},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2023.19},
  URN =		{urn:nbn:de:0030-drops-180711},
  doi =		{10.4230/LIPIcs.ICALP.2023.19},
  annote =	{Keywords: centroid tree, search trees on trees, approximation}
}
Document
Dynamic Binary Search Trees: Improved Lower Bounds for the Greedy-Future Algorithm

Authors: Yaniv Sadeh and Haim Kaplan

Published in: LIPIcs, Volume 254, 40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023)


Abstract
Binary search trees (BSTs) are one of the most basic and widely used data structures. The best static tree for serving a sequence of queries (searches) can be computed by dynamic programming. In contrast, when the BSTs are allowed to be dynamic (i.e., change by rotations between searches), we still do not know how to compute the optimal algorithm (OPT) for a given sequence. One of the candidate algorithms whose serving cost is suspected to be optimal up to a (multiplicative) constant factor is known by the name Greedy Future (GF). In an equivalent geometric way of representing queries on BSTs, GF is in fact equivalent to another algorithm called Geometric Greedy (GG). Most of the results on GF are obtained using the geometric model and the study of GG. Despite this intensive and fruitful recent research, the best lower bound we have on the competitive ratio of GF is 4/3. Furthermore, it has been conjectured that the additive gap between the cost of GF and OPT is only linear in the number of queries. In this paper we prove a lower bound of 2 on the competitive ratio of GF, and we prove that the additive gap between the cost of GF and OPT can be Ω(m ⋅ log log n), where n is the number of items in the tree and m is the number of queries.
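For context, a hedged sketch of the dynamic program mentioned in the second sentence, computing the cost of the best static BST for given access frequencies; this is the straightforward O(n³) interval DP (Knuth's refinement brings it to O(n²)), and it says nothing about GF or GG themselves. Names and the depth-times-weight cost convention are assumptions.

def optimal_static_bst_cost(weights):
    """Cost of the optimal static BST for access frequencies weights[0..n-1],
    where serving a query for a key at depth d (root = depth 1) costs d.
    Straightforward O(n^3) dynamic program over key intervals."""
    n = len(weights)
    pref = [0] * (n + 1)
    for i, w in enumerate(weights):
        pref[i + 1] = pref[i] + w
    W = lambda i, j: pref[j + 1] - pref[i]          # total weight of keys i..j
    cost = [[0] * n for _ in range(n)]
    for length in range(1, n + 1):
        for i in range(n - length + 1):
            j = i + length - 1
            best = float('inf')
            for r in range(i, j + 1):               # try every key as the root of the interval
                left = cost[i][r - 1] if r > i else 0
                right = cost[r + 1][j] if r < j else 0
                best = min(best, left + right)
            cost[i][j] = best + W(i, j)             # every key in i..j sits one level deeper
    return cost[0][n - 1] if n else 0

print(optimal_static_bst_cost([4, 1, 1]))   # root the heavy key: 4*1 + 1*2 + 1*3 = 9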

Cite as

Yaniv Sadeh and Haim Kaplan. Dynamic Binary Search Trees: Improved Lower Bounds for the Greedy-Future Algorithm. In 40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 254, pp. 53:1-53:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)



@InProceedings{sadeh_et_al:LIPIcs.STACS.2023.53,
  author =	{Sadeh, Yaniv and Kaplan, Haim},
  title =	{{Dynamic Binary Search Trees: Improved Lower Bounds for the Greedy-Future Algorithm}},
  booktitle =	{40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023)},
  pages =	{53:1--53:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-266-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{254},
  editor =	{Berenbrink, Petra and Bouyer, Patricia and Dawar, Anuj and Kant\'{e}, Mamadou Moustapha},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2023.53},
  URN =		{urn:nbn:de:0030-drops-177055},
  doi =		{10.4230/LIPIcs.STACS.2023.53},
  annote =	{Keywords: Binary Search Trees, Greedy Future, Geometric Greedy, Lower Bounds, Dynamic Optimality Conjecture}
}
Document
Dynamic Connectivity in Disk Graphs

Authors: Haim Kaplan, Alexander Kauer, Katharina Klost, Kristin Knorr, Wolfgang Mulzer, Liam Roditty, and Paul Seiferth

Published in: LIPIcs, Volume 224, 38th International Symposium on Computational Geometry (SoCG 2022)


Abstract
Let S ⊆ ℝ² be a set of n planar sites, such that each s ∈ S has an associated radius r_s > 0. Let 𝒟(S) be the disk intersection graph for S. It has vertex set S and an edge between two distinct sites s, t ∈ S if and only if the disks with centers s, t and radii r_s, r_t intersect. Our goal is to design data structures that maintain the connectivity structure of 𝒟(S) as sites are inserted and/or deleted. First, we consider unit disk graphs, i.e., r_s = 1, for all s ∈ S. We describe a data structure that has O(log² n) amortized update and O(log n/log log n) amortized query time. Second, we look at disk graphs with bounded radius ratio Ψ, i.e., for all s ∈ S, we have 1 ≤ r_s ≤ Ψ, for a Ψ ≥ 1 known in advance. In the fully dynamic case, we achieve amortized update time O(Ψ λ₆(log n) log⁷ n) and query time O(log n/log log n), where λ_s(n) is the maximum length of a Davenport-Schinzel sequence of order s on n symbols. In the incremental case, where only insertions are allowed, we get logarithmic dependency on Ψ, with O(α(n)) query time and O(logΨ λ₆(log n) log⁷ n) update time. For the decremental setting, where only deletions are allowed, we first develop an efficient disk revealing structure: given two sets R and B of disks, we can delete disks from R, and upon each deletion, we receive a list of all disks in B that no longer intersect the union of R. Using this, we get decremental data structures with amortized query time O(log n/log log n) that support m deletions in O((nlog⁵ n + m log⁷ n) λ₆(log n) + nlog Ψ log⁴n) overall time for bounded radius ratio Ψ and O((nlog⁶ n + m log⁸n) λ₆(log n)) for arbitrary radii.
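A hedged sketch of the incremental (insertion-only) interface, using a plain union-find and an O(n)-time scan per insertion instead of the paper's polylogarithmic structures; the disk encoding and class name are my own.

class IncrementalDiskConnectivity:
    """Insertions-only connectivity for a disk intersection graph.
    A disk is (x, y, r).  Naive O(n)-per-insertion union-find; it only
    illustrates the interface, not the paper's data structures."""
    def __init__(self):
        self.disks, self.parent = [], []

    def _find(self, i):
        while self.parent[i] != i:
            self.parent[i] = self.parent[self.parent[i]]   # path halving
            i = self.parent[i]
        return i

    def insert(self, x, y, r):
        i = len(self.disks)
        self.disks.append((x, y, r))
        self.parent.append(i)
        for j, (xj, yj, rj) in enumerate(self.disks[:-1]):
            if (x - xj) ** 2 + (y - yj) ** 2 <= (r + rj) ** 2:   # the two disks intersect
                self.parent[self._find(i)] = self._find(j)
        return i

    def connected(self, i, j):
        return self._find(i) == self._find(j)

dc = IncrementalDiskConnectivity()
a, b, c = dc.insert(0, 0, 1), dc.insert(3, 0, 1), dc.insert(1.5, 0, 1)
print(dc.connected(a, b))   # True: the middle disk links the other two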

Cite as

Haim Kaplan, Alexander Kauer, Katharina Klost, Kristin Knorr, Wolfgang Mulzer, Liam Roditty, and Paul Seiferth. Dynamic Connectivity in Disk Graphs. In 38th International Symposium on Computational Geometry (SoCG 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 224, pp. 49:1-49:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)



@InProceedings{kaplan_et_al:LIPIcs.SoCG.2022.49,
  author =	{Kaplan, Haim and Kauer, Alexander and Klost, Katharina and Knorr, Kristin and Mulzer, Wolfgang and Roditty, Liam and Seiferth, Paul},
  title =	{{Dynamic Connectivity in Disk Graphs}},
  booktitle =	{38th International Symposium on Computational Geometry (SoCG 2022)},
  pages =	{49:1--49:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-227-3},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{224},
  editor =	{Goaoc, Xavier and Kerber, Michael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SoCG.2022.49},
  URN =		{urn:nbn:de:0030-drops-160572},
  doi =		{10.4230/LIPIcs.SoCG.2022.49},
  annote =	{Keywords: Disk Graphs, Connectivity, Lower Envelopes}
}
Document
Track A: Algorithms, Complexity and Games
Breaking the 2ⁿ Barrier for 5-Coloring and 6-Coloring

Authors: Or Zamir

Published in: LIPIcs, Volume 198, 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021)


Abstract
The coloring problem (i.e., computing the chromatic number of a graph) can be solved in O^*(2ⁿ) time, as shown by Björklund, Husfeldt and Koivisto in 2009. For k = 3,4, better algorithms are known for the k-coloring problem. 3-coloring can be solved in O(1.33ⁿ) time (Beigel and Eppstein, 2005) and 4-coloring can be solved in O(1.73ⁿ) time (Fomin, Gaspers and Saurabh, 2007). Surprisingly, for k > 4 no improvements over the general O^*(2ⁿ) are known. We show that both 5-coloring and 6-coloring can also be solved in O((2-ε)ⁿ) time for some ε > 0. As a crucial step, we obtain an exponential improvement for computing the chromatic number of a very large family of graphs. In particular, for any constants Δ,α > 0, the chromatic number of graphs with at least α ⋅ n vertices of degree at most Δ can be computed in O((2-ε)ⁿ) time, for some ε = ε_{Δ,α} > 0. This statement generalizes previous results for bounded-degree graphs (Björklund, Husfeldt, Kaski, and Koivisto, 2010) and graphs with bounded average degree (Golovnev, Kulikov and Mihajlin, 2016). We generalize the aforementioned statement to List Coloring, for which no previous improvements are known even for the case of bounded-degree graphs.
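For the O^*(2ⁿ) baseline credited to Björklund, Husfeldt and Koivisto, here is a compact hedged sketch of their inclusion-exclusion method: the graph is k-colorable iff Σ_{S⊆V} (-1)^{n-|S|} a(S)^k > 0, where a(S) counts the independent sets contained in S. This is the known baseline only, not the paper's new algorithm; the function name is mine.

def chromatic_number(n, edges):
    """Chromatic number via the O*(2^n) inclusion-exclusion method of
    Bjorklund-Husfeldt-Koivisto (the baseline cited in the abstract).
    Vertices are 0..n-1; edges is a list of pairs.  Practical for n <= ~20."""
    nbr = [0] * n
    for u, v in edges:
        nbr[u] |= 1 << v
        nbr[v] |= 1 << u
    # a[S] = number of independent sets (including the empty set) contained in S
    a = [0] * (1 << n)
    a[0] = 1
    for S in range(1, 1 << n):
        v = (S & -S).bit_length() - 1            # lowest vertex in S
        a[S] = a[S & ~(1 << v)] + a[S & ~((1 << v) | nbr[v])]
    sign = [(-1) ** (n - bin(S).count('1')) for S in range(1 << n)]
    for k in range(1, n + 1):
        # number of ordered k-tuples of independent sets whose union is V
        if sum(sign[S] * a[S] ** k for S in range(1 << n)) > 0:
            return k
    return n

print(chromatic_number(4, [(0, 1), (1, 2), (2, 0), (2, 3)]))   # triangle plus a pendant vertex: 3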

Cite as

Or Zamir. Breaking the 2ⁿ Barrier for 5-Coloring and 6-Coloring. In 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 198, pp. 113:1-113:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)



@InProceedings{zamir:LIPIcs.ICALP.2021.113,
  author =	{Zamir, Or},
  title =	{{Breaking the 2ⁿ Barrier for 5-Coloring and 6-Coloring}},
  booktitle =	{48th International Colloquium on Automata, Languages, and Programming (ICALP 2021)},
  pages =	{113:1--113:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-195-5},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{198},
  editor =	{Bansal, Nikhil and Merelli, Emanuela and Worrell, James},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2021.113},
  URN =		{urn:nbn:de:0030-drops-141825},
  doi =		{10.4230/LIPIcs.ICALP.2021.113},
  annote =	{Keywords: Algorithms, Graph Algorithms, Graph Coloring}
}
Document
Locality Sensitive Hashing for Efficient Similar Polygon Retrieval

Authors: Haim Kaplan and Jay Tenenbaum

Published in: LIPIcs, Volume 187, 38th International Symposium on Theoretical Aspects of Computer Science (STACS 2021)


Abstract
Locality Sensitive Hashing (LSH) is an effective method of indexing a set of items to support efficient nearest neighbors queries in high-dimensional spaces. The basic idea of LSH is that similar items should produce hash collisions with higher probability than dissimilar items. We study LSH for (not necessarily convex) polygons, and use it to give efficient data structures for similar shape retrieval. Arkin et al. [Arkin et al., 1991] represent polygons by their "turning function": a function that tracks the angle between the polygon’s tangent and the x-axis while traversing the perimeter of the polygon. They define the distance between polygons to be variations of the L_p (for p = 1,2) distance between their turning functions. This metric is invariant under translation, rotation and scaling (and the selection of the initial point on the perimeter) and therefore models well the intuitive notion of shape resemblance. We develop and analyze LSH near neighbor data structures for several variations of the L_p distance for functions (for p = 1,2). By applying our schemes to the turning functions of a collection of polygons we obtain efficient near neighbor LSH-based structures for polygons. To tune our structures to turning functions of polygons, we prove some new properties of these turning functions that may be of independent interest. As part of our analysis, we address the following problem, which is of independent interest: find the vertical translation of a function f that is closest in L₁ distance to a function g. We prove tight bounds on the approximation guarantee obtained by the translation that equals the difference between the averages of g and f.
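A hedged sketch of the turning function itself (the representation of Arkin et al. that the LSH schemes are applied to): traverse the perimeter and record the cumulative tangent angle as a step function of normalized arc length. The encoding and function name are mine.

import math

def turning_function(polygon):
    """Turning function of a simple polygon, given as a list of (x, y)
    vertices in counter-clockwise order, returned as a list of (s, theta)
    breakpoints: for arc-length fraction in [s_i, s_{i+1}) the tangent makes
    cumulative angle theta_i with the x-axis."""
    n = len(polygon)
    edges = [(polygon[(i + 1) % n][0] - polygon[i][0],
              polygon[(i + 1) % n][1] - polygon[i][1]) for i in range(n)]
    lengths = [math.hypot(dx, dy) for dx, dy in edges]
    perimeter = sum(lengths)
    theta = math.atan2(edges[0][1], edges[0][0])
    steps, s = [(0.0, theta)], 0.0
    for i in range(1, n):
        s += lengths[i - 1] / perimeter
        prev, cur = edges[i - 1], edges[i]
        # signed exterior angle between consecutive edges
        turn = math.atan2(prev[0] * cur[1] - prev[1] * cur[0],
                          prev[0] * cur[0] + prev[1] * cur[1])
        theta += turn
        steps.append((s, theta))
    return steps

# unit square: one breakpoint per corner, the tangent turning by pi/2 each time
print(turning_function([(0, 0), (1, 0), (1, 1), (0, 1)]))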

Cite as

Haim Kaplan and Jay Tenenbaum. Locality Sensitive Hashing for Efficient Similar Polygon Retrieval. In 38th International Symposium on Theoretical Aspects of Computer Science (STACS 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 187, pp. 46:1-46:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)



@InProceedings{kaplan_et_al:LIPIcs.STACS.2021.46,
  author =	{Kaplan, Haim and Tenenbaum, Jay},
  title =	{{Locality Sensitive Hashing for Efficient Similar Polygon Retrieval}},
  booktitle =	{38th International Symposium on Theoretical Aspects of Computer Science (STACS 2021)},
  pages =	{46:1--46:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-180-1},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{187},
  editor =	{Bl\"{a}ser, Markus and Monmege, Benjamin},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2021.46},
  URN =		{urn:nbn:de:0030-drops-136910},
  doi =		{10.4230/LIPIcs.STACS.2021.46},
  annote =	{Keywords: Locality sensitive hashing, polygons, turning function, L\_p distance, nearest neighbors, similarity search}
}
Document
Locality Sensitive Hashing for Set-Queries, Motivated by Group Recommendations

Authors: Haim Kaplan and Jay Tenenbaum

Published in: LIPIcs, Volume 162, 17th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2020)


Abstract
Locality Sensitive Hashing (LSH) is an effective method to index a set of points such that we can efficiently find the nearest neighbors of a query point. We extend this method to our novel Set-query LSH (SLSH), such that it can find the nearest neighbors of a set of points, given as a query. Let s(x,y) be the similarity between two points x and y. We define a similarity between a set Q and a point x by aggregating the similarities s(p,x) for all p ∈ Q. For example, we can take s(p,x) to be the angular similarity between p and x (i.e., 1-(∠(x,p)/π)), and aggregate by arithmetic or geometric averaging, or by taking the lowest similarity. We develop locality sensitive hash families and data structures for a large set of such arithmetic and geometric averaging similarities, and analyze their collision probabilities. We also establish an analogous framework and hash families for distance functions. Specifically, we give a structure for the Euclidean distance aggregated by either averaging or taking the maximum. We leverage SLSH to solve a geometric extension of the approximate near neighbors problem. In this version, we consider a metric for which the unit ball is an ellipsoid and its orientation is specified with the query. An important application that motivates our work is group recommendation systems. Such a system embeds movies and users in the same feature space, and the task of recommending a movie for a group to watch together translates to a set-query Q using an appropriate similarity.
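To make the ingredients concrete, a hedged sketch combining random-hyperplane (SimHash) hashing, whose single-bit collision probability equals the angular similarity 1-(∠(x,p)/π), with a set-query similarity that aggregates pointwise similarities by taking the minimum (or an average). The actual SLSH families of the paper are constructed differently; names and parameters here are assumptions.

import math, random

def angular_similarity(x, p):
    dot = sum(a * b for a, b in zip(x, p))
    norm = math.sqrt(sum(a * a for a in x)) * math.sqrt(sum(b * b for b in p))
    return 1.0 - math.acos(max(-1.0, min(1.0, dot / norm))) / math.pi

def simhash(point, hyperplanes):
    """Random-hyperplane LSH: one bit per hyperplane.  Two points agree on a
    single bit with probability equal to their angular similarity."""
    return tuple(int(sum(h_i * x_i for h_i, x_i in zip(h, point)) >= 0)
                 for h in hyperplanes)

def set_query_similarity(Q, x, aggregate=min):
    """Similarity between a query set Q and a point x: aggregate the pointwise
    angular similarities, e.g. by min ('worst member'); averaging is analogous."""
    return aggregate(angular_similarity(p, x) for p in Q)

random.seed(0)
d = 5
planes = [[random.gauss(0, 1) for _ in range(d)] for _ in range(16)]
item = [random.gauss(0, 1) for _ in range(d)]
group = [[v + random.gauss(0, 0.1) for v in item] for _ in range(3)]
print(set_query_similarity(group, item),
      sum(a == b for a, b in zip(simhash(group[0], planes), simhash(item, planes))) / 16)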

Cite as

Haim Kaplan and Jay Tenenbaum. Locality Sensitive Hashing for Set-Queries, Motivated by Group Recommendations. In 17th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 162, pp. 28:1-28:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)



@InProceedings{kaplan_et_al:LIPIcs.SWAT.2020.28,
  author =	{Kaplan, Haim and Tenenbaum, Jay},
  title =	{{Locality Sensitive Hashing for Set-Queries, Motivated by Group Recommendations}},
  booktitle =	{17th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2020)},
  pages =	{28:1--28:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-150-4},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{162},
  editor =	{Albers, Susanne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SWAT.2020.28},
  URN =		{urn:nbn:de:0030-drops-122756},
  doi =		{10.4230/LIPIcs.SWAT.2020.28},
  annote =	{Keywords: Locality sensitive hashing, nearest neighbors, similarity search, group recommendations, distance functions, similarity functions, ellipsoid}
}
Document
How to Find a Point in the Convex Hull Privately

Authors: Haim Kaplan, Micha Sharir, and Uri Stemmer

Published in: LIPIcs, Volume 164, 36th International Symposium on Computational Geometry (SoCG 2020)


Abstract
We study the question of how to compute a point in the convex hull of an input set S of n points in ℝ^d in a differentially private manner. This question, which is trivial without privacy requirements, turns out to be quite deep when imposing differential privacy. In particular, it is known that the input points must reside on a fixed finite subset G ⊆ ℝ^d, and furthermore, the size of S must grow with the size of G. Previous works [Amos Beimel et al., 2010; Amos Beimel et al., 2019; Amos Beimel et al., 2013; Mark Bun et al., 2018; Mark Bun et al., 2015; Haim Kaplan et al., 2019] focused on understanding how n needs to grow with |G|, and showed that n = O(d^{2.5} ⋅ 8^{log^*|G|}) suffices (so n does not have to grow significantly with |G|). However, the available constructions exhibit running time at least |G|^{d²}, where typically |G| = X^d for some (large) discretization parameter X, so the running time is in fact Ω(X^{d³}). In this paper we give a differentially private algorithm that runs in O(n^d) time, assuming that n = Ω(d⁴ log X). To get this result we study and exploit some structural properties of the Tukey levels (the regions D_{≥ k} consisting of points whose Tukey depth is at least k, for k=0,1,…). In particular, we derive lower bounds on their volumes for point sets S in general position, and develop a rather subtle mechanism for handling point sets S in degenerate position (where the deep Tukey regions have zero volume). A naive approach to the construction of the Tukey regions requires n^{O(d²)} time. To reduce the cost to O(n^d), we use an approximation scheme for estimating the volumes of the Tukey regions (within their affine spans in case of degeneracy), and for sampling a point from such a region, a scheme that is based on the volume estimation framework of Lovász and Vempala [László Lovász and Santosh S. Vempala, 2006] and of Cousins and Vempala [Ben Cousins and Santosh S. Vempala, 2018]. Making this framework differentially private raises a set of technical challenges that we address.
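A hedged sketch of the notion underlying the Tukey levels D_{≥k}: the Tukey depth of a point q is the minimum number of input points in a closed halfplane whose boundary passes through q. The planar, O(n²) brute force below is only meant to pin down the definition; it has no privacy component, and the function name is mine.

import math

def tukey_depth(q, points):
    """Tukey (halfspace) depth of q with respect to `points` in the plane:
    the minimum number of points lying in a closed halfplane whose boundary
    line passes through q.  Naive sweep over candidate directions."""
    angles = [math.atan2(y - q[1], x - q[0]) for (x, y) in points if (x, y) != q]
    coincident = len(points) - len(angles)        # points equal to q always count
    if not angles:
        return len(points)
    # the count only changes when a point crosses the halfplane boundary
    events = sorted({(a + s) % (2 * math.pi)
                     for a in angles for s in (math.pi / 2, -math.pi / 2)})
    candidates = []
    for i, e in enumerate(events):
        nxt = events[(i + 1) % len(events)] + (2 * math.pi if i == len(events) - 1 else 0)
        candidates += [e, (e + nxt) / 2]          # event directions and interval midpoints
    best = len(points)
    for phi in candidates:
        inside = sum(1 for a in angles if math.cos(a - phi) >= -1e-12)
        best = min(best, inside + coincident)
    return best

pts = [(1, 0), (-1, 0), (0, 1), (0, -1), (0, 0)]
print(tukey_depth((0, 0), pts))   # 3: any closed halfplane through the centre holds >= 3 points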

Cite as

Haim Kaplan, Micha Sharir, and Uri Stemmer. How to Find a Point in the Convex Hull Privately. In 36th International Symposium on Computational Geometry (SoCG 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 164, pp. 52:1-52:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)



@InProceedings{kaplan_et_al:LIPIcs.SoCG.2020.52,
  author =	{Kaplan, Haim and Sharir, Micha and Stemmer, Uri},
  title =	{{How to Find a Point in the Convex Hull Privately}},
  booktitle =	{36th International Symposium on Computational Geometry (SoCG 2020)},
  pages =	{52:1--52:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-143-6},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{164},
  editor =	{Cabello, Sergio and Chen, Danny Z.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SoCG.2020.52},
  URN =		{urn:nbn:de:0030-drops-122107},
  doi =		{10.4230/LIPIcs.SoCG.2020.52},
  annote =	{Keywords: Differential privacy, Tukey depth, Convex hull}
}
Document
Sample Complexity Bounds for Influence Maximization

Authors: Gal Sadeh, Edith Cohen, and Haim Kaplan

Published in: LIPIcs, Volume 151, 11th Innovations in Theoretical Computer Science Conference (ITCS 2020)


Abstract
Influence maximization (IM) is the problem of finding, for a given s ≥ 1, a set S of |S| = s nodes in a network with maximum influence. With stochastic diffusion models, the influence of a set S of seed nodes is defined as the expectation of its reachability over simulations, where each simulation specifies a deterministic reachability function. Two well-studied special cases are the Independent Cascade (IC) and the Linear Threshold (LT) models of Kempe, Kleinberg, and Tardos [Kempe et al., 2003]. The influence function in stochastic diffusion is unbiasedly estimated by averaging reachability values over i.i.d. simulations. We study the IM sample complexity: the number of simulations needed to determine a (1-ε)-approximate maximizer with confidence 1-δ. Our main result is a surprising upper bound of O(s τ ε^{-2} ln (n/δ)) for a broad class of models that includes IC and LT models and their mixtures, where n is the number of nodes and τ is the number of diffusion steps. Generally τ ≪ n, so this significantly improves over the generic upper bound of O(s n ε^{-2} ln (n/δ)). Our sample complexity bounds are derived from novel upper bounds on the variance of the reachability that allow for small relative error for influential sets and additive error when influence is small. Moreover, we provide a data-adaptive method that can detect and utilize fewer simulations on models where it suffices. Finally, we provide an efficient greedy design that computes a (1-1/e-ε)-approximate maximizer from simulations and applies to any submodular stochastic diffusion model that satisfies the variance bounds.
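A hedged sketch of the estimation scheme the sample-complexity bounds refer to: sample i.i.d. live-edge graphs for the Independent Cascade model (each sample is one deterministic reachability function), estimate influence as the average reachability over the samples, and pick seeds greedily. Function names, the toy graph, and the number of simulations are assumptions, and this ignores the paper's adaptive and variance-based refinements.

import random

def sample_live_graphs(adj_p, num_sims, seed=0):
    """Sample i.i.d. live-edge graphs for the Independent Cascade model:
    adj_p[u] is a list of (v, p) pairs and each edge is kept with probability p.
    Each sample defines one deterministic reachability function."""
    rng = random.Random(seed)
    return [{u: [v for v, p in adj_p[u] if rng.random() < p] for u in adj_p}
            for _ in range(num_sims)]

def reach(live, seeds):
    """Number of nodes reachable from `seeds` in one live-edge graph."""
    reached, stack = set(seeds), list(seeds)
    while stack:
        for v in live.get(stack.pop(), []):
            if v not in reached:
                reached.add(v)
                stack.append(v)
    return len(reached)

def greedy_influence_maximization(adj_p, nodes, s, num_sims=500):
    """Greedy seed selection: influence of S is estimated as the average
    reachability of S over the sampled simulations."""
    sims = sample_live_graphs(adj_p, num_sims)
    S = []
    for _ in range(s):
        best = max((v for v in nodes if v not in S),
                   key=lambda v: sum(reach(g, S + [v]) for g in sims))
        S.append(best)
    return S

graph = {0: [(1, 0.5), (2, 0.5)], 1: [(3, 0.5)], 2: [(3, 0.5)], 3: []}
print(greedy_influence_maximization(graph, [0, 1, 2, 3], s=1))   # most likely [0]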

Cite as

Gal Sadeh, Edith Cohen, and Haim Kaplan. Sample Complexity Bounds for Influence Maximization. In 11th Innovations in Theoretical Computer Science Conference (ITCS 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 151, pp. 29:1-29:36, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)



@InProceedings{sadeh_et_al:LIPIcs.ITCS.2020.29,
  author =	{Sadeh, Gal and Cohen, Edith and Kaplan, Haim},
  title =	{{Sample Complexity Bounds for Influence Maximization}},
  booktitle =	{11th Innovations in Theoretical Computer Science Conference (ITCS 2020)},
  pages =	{29:1--29:36},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-134-4},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{151},
  editor =	{Vidick, Thomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2020.29},
  URN =		{urn:nbn:de:0030-drops-117140},
  doi =		{10.4230/LIPIcs.ITCS.2020.29},
  annote =	{Keywords: Sample complexity, Influence maximization, Submodular maximization}
}
Document
Triangles and Girth in Disk Graphs and Transmission Graphs

Authors: Haim Kaplan, Katharina Klost, Wolfgang Mulzer, Liam Roditty, Paul Seiferth, and Micha Sharir

Published in: LIPIcs, Volume 144, 27th Annual European Symposium on Algorithms (ESA 2019)


Abstract
Let S ⊆ ℝ² be a set of n sites, where each s ∈ S has an associated radius r_s > 0. The disk graph D(S) is the undirected graph with vertex set S and an undirected edge between two sites s, t ∈ S if and only if |st| ≤ r_s + r_t, i.e., if the disks with centers s and t and respective radii r_s and r_t intersect. Disk graphs are used to model sensor networks. Similarly, the transmission graph T(S) is the directed graph with vertex set S and a directed edge from a site s to a site t if and only if |st| ≤ r_s, i.e., if t lies in the disk with center s and radius r_s. We provide algorithms for detecting (directed) triangles and, more generally, computing the length of a shortest cycle (the girth) in D(S) and in T(S). These problems are notoriously hard in general, but better solutions exist for special graph classes such as planar graphs. We obtain similarly efficient results for disk graphs and for transmission graphs. More precisely, we show that a shortest (Euclidean) triangle in D(S) and in T(S) can be found in O(n log n) expected time, and that the (weighted) girth of D(S) can be found in O(n log n) expected time. For this, we develop new tools for batched range searching that may be of independent interest.
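A hedged brute-force sketch of the disk-graph triangle question (here "shortest" is taken as minimizing the sum of the three Euclidean edge lengths, which is an assumption of this sketch): it builds adjacency from |st| ≤ r_s + r_t and scans all triples in O(n³), in contrast to the paper's O(n log n) expected-time methods.

from itertools import combinations
from math import dist, inf

def shortest_triangle_disk_graph(sites):
    """Brute-force shortest triangle in the disk graph D(S): sites are
    (x, y, r) triples, and two sites are adjacent iff |st| <= r_s + r_t.
    The 'length' of a triangle is taken as the sum of its three Euclidean
    edge lengths.  O(n^3); for illustration only."""
    def adjacent(a, b):
        return dist(a[:2], b[:2]) <= a[2] + b[2]
    best, best_len = None, inf
    for a, b, c in combinations(sites, 3):
        if adjacent(a, b) and adjacent(b, c) and adjacent(a, c):
            length = dist(a[:2], b[:2]) + dist(b[:2], c[:2]) + dist(a[:2], c[:2])
            if length < best_len:
                best, best_len = (a, b, c), length
    return best, best_len

sites = [(0, 0, 1), (2, 0, 1), (1, 1.5, 1), (10, 10, 1)]
print(shortest_triangle_disk_graph(sites))   # the three nearby disks form the only triangle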

Cite as

Haim Kaplan, Katharina Klost, Wolfgang Mulzer, Liam Roditty, Paul Seiferth, and Micha Sharir. Triangles and Girth in Disk Graphs and Transmission Graphs. In 27th Annual European Symposium on Algorithms (ESA 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 144, pp. 64:1-64:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)



@InProceedings{kaplan_et_al:LIPIcs.ESA.2019.64,
  author =	{Kaplan, Haim and Klost, Katharina and Mulzer, Wolfgang and Roditty, Liam and Seiferth, Paul and Sharir, Micha},
  title =	{{Triangles and Girth in Disk Graphs and Transmission Graphs}},
  booktitle =	{27th Annual European Symposium on Algorithms (ESA 2019)},
  pages =	{64:1--64:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-124-5},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{144},
  editor =	{Bender, Michael A. and Svensson, Ola and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2019.64},
  URN =		{urn:nbn:de:0030-drops-111859},
  doi =		{10.4230/LIPIcs.ESA.2019.64},
  annote =	{Keywords: disk graph, transmission graph, triangle, girth}
}
Document
Track B: Automata, Logic, Semantics, and Theory of Programming
A Faster Deterministic Exponential Time Algorithm for Energy Games and Mean Payoff Games (Track B: Automata, Logic, Semantics, and Theory of Programming)

Authors: Dani Dorfman, Haim Kaplan, and Uri Zwick

Published in: LIPIcs, Volume 132, 46th International Colloquium on Automata, Languages, and Programming (ICALP 2019)


Abstract
We present an improved exponential time algorithm for Energy Games, and hence also for Mean Payoff Games. The running time of the new algorithm is O(min(m n W, m n 2^{n/2} log W)), where n is the number of vertices, m is the number of edges, and the edge weights are integers of absolute value at most W. For small values of W, the algorithm matches the performance of the pseudopolynomial time algorithm of Brim et al. on which it is based. For W ≥ n 2^{n/2}, the new algorithm is faster than the algorithm of Brim et al. and is currently the fastest deterministic algorithm for Energy Games and Mean Payoff Games. The new algorithm is obtained by introducing a technique of forecasting repetitive actions performed by the algorithm of Brim et al., along with the use of an edge-weight scaling technique.
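A hedged sketch of the value-iteration style of the Brim et al. algorithm the paper builds on: repeatedly lift minimal initial credits f(v) until a fixpoint, treating credits above n·W as infinite. The sign convention (edge weights are added to the energy, which must stay nonnegative), the owner encoding, and the assumption that every vertex has an outgoing edge are mine; this plain Kleene iteration is pseudopolynomial and is not the paper's improved algorithm.

def energy_game_credits(n, edges, owner):
    """Minimal initial credits for an energy game in the style of the
    value-iteration algorithm of Brim et al.  edges[u] is a list of (v, w)
    with integer weights; owner[u] is 'E' if the energy player (who wants the
    running sum to stay nonnegative) moves at u, otherwise the adversary moves.
    Every vertex is assumed to have at least one outgoing edge.
    A returned value of None means no finite initial credit suffices."""
    W = max((abs(w) for u in edges for _, w in edges[u]), default=0)
    top = n * W + 1                                  # credits above n*W are hopeless
    f = [0] * n
    changed = True
    while changed:                                   # chaotic iteration to the least fixpoint
        changed = False
        for u in range(n):
            lifts = [max(f[v] - w, 0) for v, w in edges[u]]
            new = min(min(lifts) if owner[u] == 'E' else max(lifts), top)
            if new > f[u]:
                f[u] = new
                changed = True
    return [x if x < top else None for x in f]

# a two-vertex cycle: the edge 0 -> 1 costs 3 units of energy, 1 -> 0 regains them
edges = {0: [(1, -3)], 1: [(0, 3)]}
print(energy_game_credits(2, edges, owner={0: 'E', 1: 'E'}))   # [3, 0]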

Cite as

Dani Dorfman, Haim Kaplan, and Uri Zwick. A Faster Deterministic Exponential Time Algorithm for Energy Games and Mean Payoff Games (Track B: Automata, Logic, Semantics, and Theory of Programming). In 46th International Colloquium on Automata, Languages, and Programming (ICALP 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 132, pp. 114:1-114:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)



@InProceedings{dorfman_et_al:LIPIcs.ICALP.2019.114,
  author =	{Dorfman, Dani and Kaplan, Haim and Zwick, Uri},
  title =	{{A Faster Deterministic Exponential Time Algorithm for Energy Games and Mean Payoff Games}},
  booktitle =	{46th International Colloquium on Automata, Languages, and Programming (ICALP 2019)},
  pages =	{114:1--114:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-109-2},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{132},
  editor =	{Baier, Christel and Chatzigiannakis, Ioannis and Flocchini, Paola and Leonardi, Stefano},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2019.114},
  URN =		{urn:nbn:de:0030-drops-106909},
  doi =		{10.4230/LIPIcs.ICALP.2019.114},
  annote =	{Keywords: Energy Games, Mean Payoff Games, Scaling}
}
Document
Efficient Algorithms for Geometric Partial Matching

Authors: Pankaj K. Agarwal, Hsien-Chih Chang, and Allen Xiao

Published in: LIPIcs, Volume 129, 35th International Symposium on Computational Geometry (SoCG 2019)


Abstract
Let A and B be two point sets in the plane of sizes r and n, respectively (assume r ≤ n), and let k be a parameter. A matching between A and B is a family of pairs in A × B so that any point of A ∪ B appears in at most one pair. Given two positive integers p and q, we define the cost of a matching M to be c(M) = ∑_{(a,b) ∈ M} ‖a-b‖_p^q, where ‖⋅‖_p is the L_p-norm. The geometric partial matching problem asks to find the minimum-cost size-k matching between A and B. We present efficient algorithms for the geometric partial matching problem that work for any power of the L_p-norm matching objective: an exact algorithm that runs in O((n + k²) polylog n) time, and a (1 + ε)-approximation algorithm that runs in O((n + k√k) polylog n ⋅ log ε^{-1}) time. Both algorithms are based on the primal-dual flow augmentation scheme; the main improvements involve using dynamic data structures to achieve efficient flow augmentations. With similar techniques, we give an exact algorithm for the planar transportation problem running in O(min{n², rn^{3/2}} polylog n) time.
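To fix notation, a hedged sketch of the (L_p)^q matching objective together with an exponential brute-force solver for the minimum-cost size-k matching, useful only for sanity-checking tiny instances against faster implementations; it has nothing to do with the paper's primal-dual flow augmentation, and all names are mine.

from itertools import combinations, permutations

def pair_cost(a, b, p=2, q=1):
    """Cost ||a - b||_p ** q of matching point a to point b."""
    return sum(abs(x - y) ** p for x, y in zip(a, b)) ** (q / p)

def min_cost_k_matching(A, B, k, p=2, q=1):
    """Brute-force minimum-cost size-k partial matching between point sets A
    and B (each point used at most once).  Exponential; small instances only."""
    best = float('inf')
    for As in combinations(range(len(A)), k):          # choose k points of A
        for Bs in permutations(range(len(B)), k):      # assign them to k points of B
            cost = sum(pair_cost(A[i], B[j], p, q) for i, j in zip(As, Bs))
            best = min(best, cost)
    return best

A = [(0, 0), (4, 0)]
B = [(0, 1), (5, 0), (10, 10)]
print(min_cost_k_matching(A, B, k=2))   # matches (0,0)-(0,1) and (4,0)-(5,0): cost 2.0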

Cite as

Pankaj K. Agarwal, Hsien-Chih Chang, and Allen Xiao. Efficient Algorithms for Geometric Partial Matching. In 35th International Symposium on Computational Geometry (SoCG 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 129, pp. 6:1-6:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)



@InProceedings{agarwal_et_al:LIPIcs.SoCG.2019.6,
  author =	{Agarwal, Pankaj K. and Chang, Hsien-Chih and Xiao, Allen},
  title =	{{Efficient Algorithms for Geometric Partial Matching}},
  booktitle =	{35th International Symposium on Computational Geometry (SoCG 2019)},
  pages =	{6:1--6:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-104-7},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{129},
  editor =	{Barequet, Gill and Wang, Yusu},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SoCG.2019.6},
  URN =		{urn:nbn:de:0030-drops-104109},
  doi =		{10.4230/LIPIcs.SoCG.2019.6},
  annote =	{Keywords: partial matching, transportation, optimal transport, minimum-cost flow, bichromatic closest pair}
}
Document
General Techniques for Approximate Incidences and Their Application to the Camera Posing Problem

Authors: Dror Aiger, Haim Kaplan, Efi Kokiopoulou, Micha Sharir, and Bernhard Zeisl

Published in: LIPIcs, Volume 129, 35th International Symposium on Computational Geometry (SoCG 2019)


Abstract
We consider the classical camera pose estimation problem that arises in many computer vision applications, in which we are given n 2D-3D correspondences between points in the scene and points in the camera image (some of which are incorrect associations), and where we aim to determine the camera pose (the position and orientation of the camera in the scene) from this data. We demonstrate that this posing problem can be reduced to the problem of computing ε-approximate incidences between two-dimensional surfaces (derived from the input correspondences) and points (on a grid) in a four-dimensional pose space. Similar reductions can be applied to other camera pose problems, as well as to similar problems in related application areas. We describe and analyze three techniques for solving the resulting ε-approximate incidences problem in the context of our camera posing application. The first is a straightforward assignment of surfaces to the cells of a grid (of side-length ε) that they intersect. The second is a variant of a primal-dual technique, recently introduced by a subset of the authors [Aiger et al., 2017] for different (and simpler) applications. The third is a non-trivial generalization of a data structure of Fonseca and Mount [Da Fonseca and Mount, 2010], originally designed for the case of hyperplanes. We present and analyze this technique in full generality, and then apply it to the camera posing problem at hand. We compare our methods experimentally on real and synthetic data. Our experiments show that for the typical values of n and ε, the primal-dual method is the fastest, also in practice.
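A hedged sketch of the first (grid-assignment) technique in a simplified setting where both families are points rather than surfaces in a 4D pose space: bucket one family into a grid of side ε and compare each query point only against the 3^d neighbouring cells. Names and parameters are assumptions.

from collections import defaultdict
from itertools import product
from math import dist, floor

def approximate_incidences(P, Q, eps):
    """Report all pairs (p, q) in P x Q at distance <= eps by bucketing the
    points of Q into a grid of side eps and comparing each p only with the
    points in its cell and the neighbouring cells.  A simplified stand-in
    (points vs. points) for the grid-assignment technique."""
    cell = lambda pt: tuple(floor(c / eps) for c in pt)
    buckets = defaultdict(list)
    for q in Q:
        buckets[cell(q)].append(q)
    out = []
    for p in P:
        cp = cell(p)
        for offset in product((-1, 0, 1), repeat=len(cp)):   # 3^d neighbouring cells
            for q in buckets[tuple(c + o for c, o in zip(cp, offset))]:
                if dist(p, q) <= eps:
                    out.append((p, q))
    return out

print(approximate_incidences([(0.0, 0.0)], [(0.05, 0.0), (0.3, 0.3)], eps=0.1))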

Cite as

Dror Aiger, Haim Kaplan, Efi Kokiopoulou, Micha Sharir, and Bernhard Zeisl. General Techniques for Approximate Incidences and Their Application to the Camera Posing Problem. In 35th International Symposium on Computational Geometry (SoCG 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 129, pp. 8:1-8:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)



@InProceedings{aiger_et_al:LIPIcs.SoCG.2019.8,
  author =	{Aiger, Dror and Kaplan, Haim and Kokiopoulou, Efi and Sharir, Micha and Zeisl, Bernhard},
  title =	{{General Techniques for Approximate Incidences and Their Application to the Camera Posing Problem}},
  booktitle =	{35th International Symposium on Computational Geometry (SoCG 2019)},
  pages =	{8:1--8:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-104-7},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{129},
  editor =	{Barequet, Gill and Wang, Yusu},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SoCG.2019.8},
  URN =		{urn:nbn:de:0030-drops-104129},
  doi =		{10.4230/LIPIcs.SoCG.2019.8},
  annote =	{Keywords: Camera positioning, Approximate incidences, Incidences}
}
  • Refine by Author
  • 29 Kaplan, Haim
  • 9 Sharir, Micha
  • 6 Mulzer, Wolfgang
  • 6 Zwick, Uri
  • 5 Dorfman, Dani
  • Show More...

  • Refine by Classification
  • 9 Theory of computation → Computational geometry
  • 6 Theory of computation → Data structures design and analysis
  • 3 Theory of computation → Design and analysis of algorithms
  • 3 Theory of computation → Graph algorithms analysis
  • 2 Information systems → Information retrieval
  • Show More...

  • Refine by Keyword
  • 3 Computational geometry
  • 2 Approximate incidences
  • 2 Locality sensitive hashing
  • 2 bipartite matching
  • 2 data structure
  • Show More...

  • Refine by Type
  • 33 document

  • Refine by Publication Year
  • 7 2019
  • 5 2018
  • 5 2023
  • 4 2017
  • 3 2015
  • Show More...
