LIPIcs.DISC.2017.44.pdf
We revisit the hardness of approximating the diameter of a network. In the CONGEST model, Ω̃(n) rounds are necessary to compute the diameter [Frischknecht et al. SODA'12]. Abboud et al. [DISC 2016] extended this result to sparse graphs and, at a more fine-grained level, showed that, for any integer 1 ≤ ℓ ≤ polylog(n), distinguishing between networks of diameter 4ℓ + 2 and 6ℓ + 1 requires Ω̃(n) rounds. We slightly tighten this result by showing that even distinguishing between diameter 2ℓ + 1 and 3ℓ + 1 requires Ω̃(n) rounds. The reduction of Abboud et al. is inspired by recent conditional lower bounds in the RAM model, where the orthogonal vectors problem plays a pivotal role. In our new lower bound, we make the connection to orthogonal vectors explicit, leading to a conceptually more streamlined exposition. This makes the argument well suited for teaching both the lower bound in the CONGEST model and the conditional lower bound in the RAM model.
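For reference, the Orthogonal Vectors problem mentioned above asks: given two sets of n Boolean vectors of dimension d, is there one vector from each set whose inner product is zero? Conditional lower bounds in the RAM model are commonly based on the hypothesis that, for d = polylog(n), this cannot be solved substantially faster than the quadratic brute force. The following is a minimal illustrative sketch (not taken from the paper; the function name has_orthogonal_pair is our own) of that brute-force baseline in Python:

    from itertools import product

    def has_orthogonal_pair(A, B):
        """Brute-force Orthogonal Vectors check (illustrative sketch).

        A, B: lists of 0/1 vectors of equal dimension d.
        Returns True iff some a in A and b in B have inner product zero.
        Runs in O(n^2 * d) time for |A| = |B| = n.
        """
        return any(all(x * y == 0 for x, y in zip(a, b))
                   for a, b in product(A, B))

    # Example: (1, 0, 1) and (0, 1, 0) are orthogonal, so this prints True.
    A = [(1, 0, 1), (1, 1, 0)]
    B = [(0, 1, 0), (1, 0, 1)]
    print(has_orthogonal_pair(A, B))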