Abstract
A central goal of algorithmic research is to determine how fast computational problems can be solved in the worst case. Theorems from complexity theory state that there are problems that, on inputs of size n, can be solved in t(n) time but not in O(t(n)^{1-epsilon}) time for any epsilon > 0. The main challenge is to determine where in this hierarchy various natural and important problems lie. Over the years, many ingenious algorithmic techniques have been developed and applied to obtain blazingly fast algorithms for many problems. Nevertheless, for many other central problems, the best known running times are essentially those of their classical algorithms from the 1950s and 1960s.
Unconditional lower bounds seem very difficult to obtain, and so practically all known time lower bounds are conditional. For years, the main tool for proving hardness of computational problems has been NP-hardness reductions, basing hardness on P ≠ NP. However, when we care about the exact running time (as opposed to merely polynomial vs non-polynomial), NP-hardness is not applicable, especially if the problem is already solvable in polynomial time. In recent years, a new theory has been developed, based on "fine-grained reductions" that focus on exact running times. Mimicking NP-hardness, the approach is to (1) select a key problem X that is conjectured to require essentially t(n) time for some t, and (2) reduce X in a fine-grained way to many important problems. This approach has led to the discovery of many meaningful relationships between problems, and even sometimes to equivalence classes.
The main key problems used to base hardness on have been: the 3SUM problem, the CNF-SAT problem (based on the Strong Exponential Time Hypothesis (SETH)), and the All-Pairs Shortest Paths problem. Research on SETH-based lower bounds has flourished in particular in recent years, showing that the classical algorithms are optimal for problems such as Approximate Diameter, Edit Distance, Fréchet Distance, and Longest Common Subsequence.
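To make one of these key problems concrete, here is an illustrative sketch (not part of the talk) of the classical quadratic-time algorithm for 3SUM: given n integers, decide whether some three of them sum to zero. The 3SUM conjecture posits that no algorithm solves this in O(n^{2-epsilon}) time for any epsilon > 0.

```python
def three_sum(nums):
    """Classic O(n^2) 3SUM: sort, then for each element scan the rest
    with two pointers. Returns a witness triple summing to 0, or None."""
    nums = sorted(nums)
    n = len(nums)
    for i in range(n - 2):
        lo, hi = i + 1, n - 1
        while lo < hi:
            s = nums[i] + nums[lo] + nums[hi]
            if s == 0:
                return (nums[i], nums[lo], nums[hi])
            elif s < 0:
                lo += 1  # sum too small: advance the left pointer
            else:
                hi -= 1  # sum too large: retreat the right pointer
    return None

# Example: -2 + (-1) + 3 == 0, so a witness is found.
print(three_sum([3, -1, -2, 5, 8]))  # (-2, -1, 3)
```

Many geometric and data-structure problems have fine-grained reductions from 3SUM, so a truly subquadratic algorithm for the routine above would have far-reaching consequences.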
In this talk I will give an overview of the current progress in this area of study, and will highlight some exciting new developments and their relationship to database theory.
BibTeX Entry
@InProceedings{vassilevskawilliams:LIPIcs:2018:8613,
author = {Virginia Vassilevska Williams},
title = {{Fine-grained Algorithms and Complexity}},
booktitle = {21st International Conference on Database Theory (ICDT 2018)},
pages = {1:1--1:1},
series = {Leibniz International Proceedings in Informatics (LIPIcs)},
ISBN = {978-3-95977-063-7},
ISSN = {1868-8969},
year = {2018},
volume = {98},
editor = {Benny Kimelfeld and Yael Amsterdamer},
publisher = {Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik},
address = {Dagstuhl, Germany},
URL = {http://drops.dagstuhl.de/opus/volltexte/2018/8613},
URN = {urn:nbn:de:0030-drops-86135},
doi = {10.4230/LIPIcs.ICDT.2018.1},
annote = {Keywords: algorithms, complexity, fine-grained}
}
Keywords: algorithms, complexity, fine-grained
Seminar: 21st International Conference on Database Theory (ICDT 2018)
Issue Date: 2018
Date of publication: 28.02.2018