We prove essentially optimal fine-grained lower bounds on the gap between a data structure and a partially retroactive version of the same data structure. Precisely, assuming any one of three standard conjectures, we describe a problem that has a data structure where operations run in O(T(n,m)) time per operation, but any partially retroactive version of that data structure requires T(n,m)⋅m^{1-o(1)} worst-case time per operation, where n is the size of the data structure at any time and m is the number of operations. Any data structure with operations running in O(T(n,m)) time per operation can be converted (via the "rollback method") into a partially retroactive data structure running in O(T(n,m)⋅m) time per operation, so our lower bound is tight up to an m^{o(1)} factor common in fine-grained complexity.
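To make the O(T(n,m)⋅m) upper bound concrete, below is a minimal sketch of a generic transformation in the spirit of the rollback method, not the paper's exact construction. It assumes the underlying ephemeral data structure is supplied through a hypothetical `make_ds` factory and that updates are passed as callables; all names are illustrative. Every retroactive change simply replays the entire logged timeline, so each operation costs O(T(n,m)⋅m), matching the bound against which the lower bound is tight.

```python
# Minimal sketch (illustrative, not the paper's construction): generic partial
# retroactivity obtained by replaying the logged timeline after every
# retroactive change.

class PartiallyRetroactive:
    """Wraps an arbitrary ephemeral data structure.

    Supports Insert(t, update) and Delete(t) on the operation timeline, and
    queries only on the present version (partial retroactivity). Each
    retroactive change replays all m logged updates, costing O(T(n,m) * m).
    """

    def __init__(self, make_ds):
        self.make_ds = make_ds   # factory producing a fresh, empty structure
        self.timeline = []       # list of (time, update_fn), kept sorted by time
        self.ds = make_ds()      # the present-time structure

    def insert_op(self, t, update_fn):
        """Retroactively perform update_fn at time t."""
        self.timeline.append((t, update_fn))
        self.timeline.sort(key=lambda entry: entry[0])
        self._replay()

    def delete_op(self, t):
        """Retroactively remove the update(s) recorded at time t."""
        self.timeline = [entry for entry in self.timeline if entry[0] != t]
        self._replay()

    def query(self, query_fn):
        """Queries are answered only on the present-time structure."""
        return query_fn(self.ds)

    def _replay(self):
        # Roll back to the initial state and re-apply every logged update in
        # chronological order; this is the O(T(n,m) * m) step per operation.
        self.ds = self.make_ds()
        for _, update_fn in self.timeline:
            update_fn(self.ds)


# Example: a retroactive running sum.
if __name__ == "__main__":
    r = PartiallyRetroactive(make_ds=lambda: {"sum": 0})
    r.insert_op(5, lambda ds: ds.update(sum=ds["sum"] + 3))
    r.insert_op(2, lambda ds: ds.update(sum=ds["sum"] + 4))
    print(r.query(lambda ds: ds["sum"]))  # 7
    r.delete_op(5)
    print(r.query(lambda ds: ds["sum"]))  # 4
```

The rollback method as usually described undoes only the operations after the retroactive time t and re-performs them; the full replay above is a coarser variant that still achieves the same O(T(n,m)⋅m) worst-case bound per operation.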
@InProceedings{chung_et_al:LIPIcs.ISAAC.2022.32,
  author =    {Chung, Lily and Demaine, Erik D. and Hendrickson, Dylan and Lynch, Jayson},
  title =     {{Lower Bounds on Retroactive Data Structures}},
  booktitle = {33rd International Symposium on Algorithms and Computation (ISAAC 2022)},
  pages =     {32:1--32:12},
  series =    {Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =      {978-3-95977-258-7},
  ISSN =      {1868-8969},
  year =      {2022},
  volume =    {248},
  editor =    {Bae, Sang Won and Park, Heejin},
  publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =   {Dagstuhl, Germany},
  URL =       {https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2022.32},
  URN =       {urn:nbn:de:0030-drops-173171},
  doi =       {10.4230/LIPIcs.ISAAC.2022.32},
  annote =    {Keywords: Retroactivity, time travel, rollback, fine-grained complexity}
}