Fine-grained complexity attempts to precisely determine the time complexity of a problem and has emerged as a guide for algorithm design in recent times. Some of the central problems in fine-grained complexity deal with the computation of distances: for example, computing all-pairs shortest paths in a weighted graph, computing the edit distance between two sequences or two trees, and computing the distance of a sequence from a context-free language. Many of these problems reduce to the computation of matrix products over various algebraic structures, predominantly over the (min,+) semiring. Obtaining a truly subcubic algorithm for the (min,+) product is one of the outstanding open questions in computer science.

Interestingly, many of the aforementioned distance computation problems have additional structural properties. Specifically, when the inputs are perturbed slightly, we do not expect a large change in the output. This simple yet powerful observation has led to better algorithms for several problems whose running times had not been improved in decades, including Language Edit Distance, RNA Folding, and Dyck Edit Distance. Indeed, this structure in the problem leads to matrices that have the Lipschitz property, and we gave the first truly subcubic-time algorithm for computing the (min,+) product over such Lipschitz matrices. Follow-up work by several researchers obtained improved bounds for monotone matrices and for (min,+) convolution under similar structures, leading to improved bounds for a series of optimization problems.

These results yield not just faster exact algorithms but also better approximation algorithms. In particular, we show how fast (min,+) product computation over monotone matrices can lead to better additive approximation algorithms for computing all-pairs shortest paths on unweighted undirected graphs, giving an improvement after twenty-four years.
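For concreteness, the central object above can be stated as follows. The sketch below (not taken from the paper, a minimal Python illustration) computes the naive cubic-time (min,+) product C[i][j] = min_k (A[i][k] + B[k][j]); it is this product for which a truly subcubic algorithm in general remains open, and whose Lipschitz or monotone structured variants the cited results exploit.

def min_plus_product(A, B):
    # Naive O(n^3) (min,+) matrix product of two n x n matrices.
    n = len(A)
    INF = float("inf")
    C = [[INF] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            best = INF
            for k in range(n):
                s = A[i][k] + B[k][j]
                if s < best:
                    best = s
            C[i][j] = best
    return C

# Usage: squaring a graph's weighted adjacency matrix under (min,+)
# gives shortest-path distances that use at most two edges, which is
# the connection to all-pairs shortest paths mentioned above.
A = [[0, 3, 10],
     [3, 0, 1],
     [10, 1, 0]]
print(min_plus_product(A, A))  # [[0, 3, 4], [3, 0, 1], [4, 1, 0]]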