Improved Cut Strategy for Tensor Network Contraction Orders

Authors: Christoph Staudt, Mark Blacher, Julien Klaus, Farin Lippmann, Joachim Giesen




File

LIPIcs.SEA.2024.27.pdf
  • Filesize: 1.79 MB
  • 19 pages

Author Details

Christoph Staudt
  • Friedrich Schiller University Jena, Germany
Mark Blacher
  • Friedrich Schiller University Jena, Germany
Julien Klaus
  • Friedrich Schiller University Jena, Germany
Farin Lippmann
  • Friedrich Schiller University Jena, Germany
Joachim Giesen
  • Friedrich Schiller University Jena, Germany

Cite As

Christoph Staudt, Mark Blacher, Julien Klaus, Farin Lippmann, and Joachim Giesen. Improved Cut Strategy for Tensor Network Contraction Orders. In 22nd International Symposium on Experimental Algorithms (SEA 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 301, pp. 27:1-27:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)
https://doi.org/10.4230/LIPIcs.SEA.2024.27

Abstract

In the field of quantum computing, simulating quantum systems on classical computers is crucial, and tensor networks are fundamental to these simulations. A tensor network is a collection of tensors that must be contracted into a result tensor; tensor contraction generalizes matrix multiplication to higher-order tensors. The contractions can be performed in different orders, and the order has a significant impact on the number of floating point operations (flops) needed to compute the result tensor. Finding an optimal contraction order is known to be NP-hard. The current state-of-the-art approach for finding efficient contraction orders combines graph partitioning with a greedy strategy. Although heavily used in practice, this approach ignores so-called free indices, chooses node weights without regard to previous computations, and requires numerous hyperparameters that must be tuned at runtime. In this paper, we address these shortcomings by developing a novel graph cut strategy. The proposed modifications yield contraction orders that significantly reduce the number of flops in the tensor contractions compared to the current state of the art. Moreover, by removing the need for hyperparameter tuning at runtime, our approach converges to an efficient solution faster, reducing the required optimization time by at least an order of magnitude.
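The abstract's central point — that the contraction order alone can change the flop count by orders of magnitude — can be illustrated with a minimal toy example (not the paper's algorithm). Matrix multiplication is the order-2 special case of tensor contraction, so a three-matrix chain already shows the effect: the two possible parenthesizations give the same result tensor at very different cost. The shapes below are arbitrary choices for illustration.

```python
import numpy as np

# Chain A (10x1000) @ B (1000x5) @ C (5x1000): two contraction orders,
# identical result, very different flop counts.
rng = np.random.default_rng(0)
A = rng.random((10, 1000))
B = rng.random((1000, 5))
C = rng.random((5, 1000))

def matmul_flops(m, k, n):
    # Multiplications and additions for an (m x k) @ (k x n) product.
    return 2 * m * k * n

# Order 1: (A @ B) @ C -- contract the cheap bottleneck dimension first.
flops_left = matmul_flops(10, 1000, 5) + matmul_flops(10, 5, 1000)
# Order 2: A @ (B @ C) -- blows up an intermediate 1000x1000 tensor.
flops_right = matmul_flops(1000, 5, 1000) + matmul_flops(10, 1000, 1000)

print(flops_left, flops_right)  # 200000 vs 30000000: a 150x gap
assert np.allclose((A @ B) @ C, A @ (B @ C))
```

For general tensor networks the search space of orders grows super-exponentially with the number of tensors, which is why heuristic order optimizers such as the graph-cut strategy described in this paper are needed.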

Subject Classification

ACM Subject Classification
  • Theory of computation → Algorithm design techniques
  • Mathematics of computing → Solvers
  • Applied computing → Physics
Keywords
  • tensor network
  • contraction order
  • graph partitioning
  • quantum simulation
