Computing the similarity of two point sets is a ubiquitous task in medical imaging, geometric shape comparison, trajectory analysis, and many other settings. Arguably the most basic distance measure for this task is the Hausdorff distance, which assigns to each point from one set the closest point in the other set and then evaluates the maximum distance of any assigned pair. A drawback is that this distance measure is not translation invariant, that is, comparing two objects just according to their shape while disregarding their position in space is impossible. Fortunately, there is a canonical translation-invariant version, the Hausdorff distance under translation, which minimizes the Hausdorff distance over all translations of one of the point sets. For point sets of size n and m, the Hausdorff distance under translation can be computed in time 𝒪̃(nm) for the L₁ and L_∞ norms [Chew, Kedem SWAT'92] and in time 𝒪̃(nm(n+m)) for the L₂ norm [Huttenlocher, Kedem, Sharir DCG'93]. As these bounds have not been improved for over 25 years, in this paper we approach the Hausdorff distance under translation from the perspective of fine-grained complexity theory. We show (i) a matching lower bound of (nm)^{1-o(1)} for L₁ and L_∞ assuming the Orthogonal Vectors Hypothesis and (ii) a matching lower bound of n^{2-o(1)} for L₂ in the imbalanced case of m = 𝒪(1) assuming the 3SUM Hypothesis.
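For concreteness, the two distance measures discussed above admit the following standard formulation (the notation here is ours and may differ from the paper's): for finite point sets A of size n and B of size m in the plane,

```latex
% Symmetric Hausdorff distance between point sets A and B
d_H(A, B) = \max\Big\{ \max_{a \in A} \min_{b \in B} \|a - b\|,\;
                       \max_{b \in B} \min_{a \in A} \|a - b\| \Big\}

% Hausdorff distance under translation: minimize over all translations \tau of one set
d_T(A, B) = \min_{\tau \in \mathbb{R}^2} d_H(A, B + \tau),
\qquad \text{where } B + \tau = \{ b + \tau : b \in B \}
```

Here ‖·‖ stands for whichever of the L₁, L₂, or L_∞ norms is considered. Note that even evaluating d_H naively already takes Θ(nm) distance computations, so the 𝒪̃(nm) upper bound for the translational problem under L₁ and L_∞ is essentially the best one could hope for, and lower bound (i) shows that it cannot be improved polynomially under the Orthogonal Vectors Hypothesis.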