A Novel Prediction Setup for Online Speed-Scaling

Authors: Antonios Antoniadis, Peyman Jabbarzade, and Golnoosh Shahkarami



File

LIPIcs.SWAT.2022.9.pdf
  • Filesize: 0.76 MB
  • 20 pages

Author Details

Antonios Antoniadis
  • University of Twente, Enschede, The Netherlands
Peyman Jabbarzade
  • University of Maryland, College Park, MD, USA
Golnoosh Shahkarami
  • Max Planck Institut für Informatik, Saarbrücken, Germany
  • Universität des Saarlandes, Saarbrücken, Germany

Cite As

Antonios Antoniadis, Peyman Jabbarzade, and Golnoosh Shahkarami. A Novel Prediction Setup for Online Speed-Scaling. In 18th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 227, pp. 9:1-9:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)
https://doi.org/10.4230/LIPIcs.SWAT.2022.9

Abstract

Given the rapid rise in energy demand by data centers and computing systems in general, it is fundamental to incorporate energy considerations when designing (scheduling) algorithms. Machine learning can be a useful approach in practice, for example by predicting the future load of the system from historical data. However, the effectiveness of such an approach depends heavily on the quality of the predictions, and performance can be far from optimal when predictions are sub-par. On the other hand, classical online algorithms, while providing a worst-case guarantee, can be pessimistic for large classes of inputs arising in practice. This paper, in the spirit of the new area of machine-learning-augmented algorithms, attempts to obtain the best of both worlds for the classical, deadline-based, online speed-scaling problem: based on the introduction of a novel prediction setup, we develop algorithms that (i) obtain provably low energy consumption in the presence of adequate predictions, (ii) are robust against inadequate predictions, and (iii) are smooth, i.e., their performance degrades gradually as the prediction error increases.
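For readers unfamiliar with the underlying model, the following is a minimal sketch (not the algorithm of this paper) of the classical Average Rate (AVR) heuristic for deadline-based speed-scaling, in the model introduced by Yao, Demers, and Shenker (FOCS 1995): each job j with release time r_j, deadline d_j, and work w_j contributes its density w_j / (d_j - r_j) to the processor speed throughout [r_j, d_j). Running at the sum of the active densities meets every deadline, and the energy consumed is the integral of speed(t)^alpha over time, where alpha > 1 is a device-dependent constant. The function name and job encoding below are illustrative choices.

```python
def avr_energy(jobs, alpha=3.0):
    """Energy used by the Average Rate (AVR) heuristic.

    jobs: list of (release, deadline, work) triples with release < deadline.
    alpha: exponent of the power function speed**alpha.
    """
    # The speed is piecewise constant between consecutive release/deadline
    # events, so it suffices to integrate over those intervals.
    events = sorted({t for r, d, _ in jobs for t in (r, d)})
    energy = 0.0
    for start, end in zip(events, events[1:]):
        # Sum the densities of all jobs whose window covers [start, end).
        speed = sum(w / (d - r) for r, d, w in jobs if r <= start and end <= d)
        energy += (speed ** alpha) * (end - start)
    return energy
```

For example, two unit-density jobs with windows [0, 2) and [1, 3) overlap on [1, 2), where AVR runs at speed 2; with alpha = 2 this yields total energy 1 + 4 + 1 = 6. AVR is simple and online, but its energy can exceed the offline optimum by a constant factor depending on alpha, which is the kind of gap that prediction-augmented algorithms aim to close.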

Subject Classification

ACM Subject Classification
  • Theory of computation → Scheduling algorithms
Keywords
  • learning augmented algorithms
  • speed-scaling
  • energy-efficiency
  • scheduling theory
  • online algorithms
