

**Published in:** LIPIcs, Volume 67, 8th Innovations in Theoretical Computer Science Conference (ITCS 2017)

One of the central challenges in the study of quantum many-body systems is the complexity of simulating them on a classical computer. A recent advance by Landau et al. gave a polynomial time algorithm to compute a succinct classical description for unique ground states of gapped 1D quantum systems. Despite this progress, many questions remained unresolved, including whether there exist rigorous efficient algorithms when the ground space is degenerate (and poly(n) dimensional), or for the poly(n) lowest energy states of 1D systems, or even whether such states admit succinct classical descriptions or area laws.
In this paper we give a new algorithm for finding low energy states for 1D systems, based on a rigorously justified renormalization group (RG)-type transformation. In the process we resolve some of the aforementioned open questions, including giving a polynomial time algorithm for poly(n) degenerate ground spaces and an n^{O(\log n)} algorithm for the poly(n) lowest energy states for 1D systems (under a mild density condition). We note that for these classes of systems the existence of a succinct classical description and area laws were not rigorously proved before this work. The algorithms are natural and efficient, and for the case of finding unique ground states for frustration-free Hamiltonians the running time is O(nM(n)), where M(n) is the time required to multiply two n by n matrices.

Itai Arad, Zeph Landau, Umesh V. Vazirani, and Thomas Vidick. Rigorous RG Algorithms and Area Laws for Low Energy Eigenstates in 1D. In 8th Innovations in Theoretical Computer Science Conference (ITCS 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 67, pp. 46:1-46:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


@InProceedings{arad_et_al:LIPIcs.ITCS.2017.46, author = {Arad, Itai and Landau, Zeph and Vazirani, Umesh V. and Vidick, Thomas}, title = {{Rigorous Rg Algorithms and Area Laws for Low Energy Eigenstates In 1D}}, booktitle = {8th Innovations in Theoretical Computer Science Conference (ITCS 2017)}, pages = {46:1--46:14}, series = {Leibniz International Proceedings in Informatics (LIPIcs)}, ISBN = {978-3-95977-029-3}, ISSN = {1868-8969}, year = {2017}, volume = {67}, editor = {Papadimitriou, Christos H.}, publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik}, address = {Dagstuhl, Germany}, URL = {https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2017.46}, URN = {urn:nbn:de:0030-drops-81920}, doi = {10.4230/LIPIcs.ITCS.2017.46}, annote = {Keywords: Hamiltonian complexity, area law, gapped ground states, algorithm} }


**Published in:** LIPIcs, Volume 67, 8th Innovations in Theoretical Computer Science Conference (ITCS 2017)

We consider multiplayer games in which the players fall in two teams of size k, with payoffs equal within, and of opposite sign across, the two teams. In the classical case of k=1, such zero-sum games possess a unique value, independent of order of play, due to the von Neumann minimax theorem. However, this fails for all k>1; we can measure this failure by a duality gap, which quantifies the benefit of being the team to commit last to its strategy. In our main result we show that the gap equals 2(1-2^{1-k}) for m=2 and 2(1-m^{-(1-o(1))k}) for m>2, with m being the size of the action space of each player.
At a finer level, the cost to a team of individual players acting independently while the opposition employs joint randomness is 1-2^{1-k} for m=2, and 1-m^{-(1-o(1))k} for m>2.
This class of multiplayer games, apart from being a natural bridge between two-player zero-sum games and general multiplayer games, is motivated from Biology (the weak selection model of evolution) and Economics (players with shared utility but poor coordination).
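For concreteness, the k=1 baseline where the minimax theorem closes the gap can be checked numerically. The sketch below (matching pennies, a standard 2x2 example chosen for illustration, not taken from the paper) approximates max-min and min-max over a grid of mixed strategies:

```python
import numpy as np

# Matching pennies: a 2x2 zero-sum game (k = 1, m = 2).
A = np.array([[1.0, -1.0],
              [-1.0, 1.0]])

grid = np.linspace(0.0, 1.0, 101)  # mixed strategies on a grid (includes 0.5)

def payoff(p, q):
    """Row player's expected payoff for mixed strategies (p, 1-p), (q, 1-q)."""
    x = np.array([p, 1 - p])
    y = np.array([q, 1 - q])
    return x @ A @ y

# Committing first (max-min) vs. committing last (min-max).
maxmin = max(min(payoff(p, q) for q in grid) for p in grid)
minmax = min(max(payoff(p, q) for p in grid) for q in grid)

# Both are 0 (attained at the uniform strategy): no duality gap when k = 1.
```

For k>1 the analogous computation over team strategies exhibits the positive gap quantified in the abstract.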

Leonard Schulman and Umesh V. Vazirani. The Duality Gap for Two-Team Zero-Sum Games. In 8th Innovations in Theoretical Computer Science Conference (ITCS 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 67, pp. 56:1-56:8, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


@InProceedings{schulman_et_al:LIPIcs.ITCS.2017.56, author = {Schulman, Leonard and Vazirani, Umesh V.}, title = {{The Duality Gap for Two-Team Zero-Sum Games}}, booktitle = {8th Innovations in Theoretical Computer Science Conference (ITCS 2017)}, pages = {56:1--56:8}, series = {Leibniz International Proceedings in Informatics (LIPIcs)}, ISBN = {978-3-95977-029-3}, ISSN = {1868-8969}, year = {2017}, volume = {67}, editor = {Papadimitriou, Christos H.}, publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik}, address = {Dagstuhl, Germany}, URL = {https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2017.56}, URN = {urn:nbn:de:0030-drops-81429}, doi = {10.4230/LIPIcs.ITCS.2017.56}, annote = {Keywords: multi-player games, duality gap, zero-sum games, evolution} }


Invited Talk

**Published in:** LIPIcs, Volume 29, 34th International Conference on Foundation of Software Technology and Theoretical Computer Science (FSTTCS 2014)

Even the most seasoned students of evolution, starting with Darwin himself, have occasionally expressed amazement at the fact that the mechanism of natural selection has produced the whole of Life as we see it around us. From a computational perspective, it is natural to marvel at evolution's solution to the problems of robotics, vision and theorem proving! What, then, is the complexity of evolution, viewed as an algorithm? One answer to this question is 10^{12}, roughly the number of sequential steps or generations from the earliest single-celled creatures to today's Homo sapiens. To put this into perspective, the processor of a modern cell phone can perform 10^{12} steps in less than an hour. Another answer is 10^{30}, the degree of parallelism, roughly the maximum number of organisms living on the Earth at any time. Perhaps the answer should be the product of the two numbers, roughly 10^{42}, to reflect the total work done by evolution, viewed as a parallel algorithm.
Here we argue, interpreting our recently published paper, that none of the above answers is really correct. Viewing evolution as an algorithm poses an additional challenge: recombination. Even if evolution succeeds in producing a particularly good solution (a highly fit individual), its offspring would only inherit half its genes, and therefore appear unlikely to be a good solution. This is the core of the problem of explaining the role of sex in evolution, known as the "queen of problems in evolutionary biology".
The starting point is the diffusion-equation-based approach of theoretical population geneticists, who analyze the changing allele frequencies (over the generations) in the gene pool, consisting of the aggregate of the genetic variants (or "alleles") over all genes (or "loci") and over all individuals in a species. Taking this viewpoint to its logical conclusion, rather than acting on individuals or species or genes, evolution acts on this gene pool, or genetic soup, by making it more "potent", in the sense that it increases the expected fitness of a genotype drawn randomly from this soup. Moreover, for much genetic variation, this soup may be assumed to be in the regime of weak selection, a regime where the probability of occurrence of a certain genotype involving various alleles at different loci is simply the product of the probabilities of each of its alleles. In this regime, we show that evolution can be formulated as a game, where the recombining loci are the players, the alleles in those loci are possible moves or actions of each player, and the expected payoff of each player-locus is precisely the organism's expected fitness across the genotypes that are present in the population. Moreover, the dynamics specified by the diffusion equations of theoretical population geneticists is closely approximated by the dynamics of multiplicative weight updates (MWUA).
The algorithmic connection to MWUA brings with it new insights for evolutionary biology, specifically, into the question of how genetic diversity is maintained in the presence of natural selection. For this it is useful to consider a dual view of MWUA, which expresses "what each gene is optimizing" as it plays the game. Remarkably this turns out to be a particular convex combination of the entropy of its distribution over alleles and cumulative expected fitness. This sheds new light on the maintenance of diversity in evolution.
All of this suggests that the complexity of evolution should indeed be viewed as 10^{12}, but for a subtle reason. It is the number of steps of multiplicative weight updates carried out on allele frequencies in the genetic soup. A closer examination reveals further that the accurate tracking of allele frequencies over the generations requires the simulation of a quadratic dynamical system (two parents for each offspring). Moreover, the simulation of even simple quadratic dynamical systems is known to be PSPACE-hard. This suggests that the tracking of allele frequencies might require large population sizes for each species, putting into perspective the number 10^{30}. Finally, it is worth noting that in this view there is a primacy to recombination or sex, which serve to provide robustness to the mechanism of evolution, as well as the framework within which MWUA operates.
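As a minimal illustration of the MWUA dynamics described above, here is a sketch under the simplifying assumption of a fixed fitness landscape at a single locus (hypothetical numbers; in the paper's game the payoff of each allele would itself depend on the frequencies at the other loci):

```python
import numpy as np

# Minimal MWUA sketch on allele frequencies at one locus.
# Simplifying assumption: the marginal fitness u of each allele is frozen.
u = np.array([1.0, 1.2, 0.9, 1.1])   # hypothetical allele fitnesses
freq = np.full(4, 0.25)              # initial allele frequencies
eps = 0.1                            # selection strength (weak selection: eps small)

mean_fitness = []
for _ in range(200):
    mean_fitness.append(float(freq @ u))
    freq = freq * (1 + eps * u)      # multiplicative weight update
    freq = freq / freq.sum()         # renormalize: still a probability distribution

# Mean fitness never decreases under a fixed landscape (the increment per
# step is eps * Var(u) / (1 + eps * E[u]) >= 0), and the frequencies
# concentrate on the fittest allele.
```

The dual view mentioned below corresponds to the fact that this update trades off the entropy of `freq` against cumulative fitness.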

Erick Chastain, Adi Livnat, Christos H. Papadimitriou, and Umesh V. Vazirani. Algorithms, Games, and Evolution (Invited Talk). In 34th International Conference on Foundation of Software Technology and Theoretical Computer Science (FSTTCS 2014). Leibniz International Proceedings in Informatics (LIPIcs), Volume 29, pp. 45-46, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2014)


@InProceedings{chastain_et_al:LIPIcs.FSTTCS.2014.45, author = {Chastain, Erick and Livnat, Adi and Papadimitriou, Christos H. and Vazirani, Umesh V.}, title = {{Algorithms, Games, and Evolution}}, booktitle = {34th International Conference on Foundation of Software Technology and Theoretical Computer Science (FSTTCS 2014)}, pages = {45--46}, series = {Leibniz International Proceedings in Informatics (LIPIcs)}, ISBN = {978-3-939897-77-4}, ISSN = {1868-8969}, year = {2014}, volume = {29}, editor = {Raman, Venkatesh and Suresh, S. P.}, publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik}, address = {Dagstuhl, Germany}, URL = {https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2014.45}, URN = {urn:nbn:de:0030-drops-48310}, doi = {10.4230/LIPIcs.FSTTCS.2014.45}, annote = {Keywords: evolution, recombination, coordination games, multiplicative weight updates} }


Invited Talk

**Published in:** LIPIcs, Volume 13, IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2011)

Quantum states generally require exponential-sized classical descriptions, but the long-conjectured area law provides hope that a large class of natural quantum states can be described succinctly. Recent progress in formally proving the area law is described.

Umesh V. Vazirani. Quantum State Description Complexity (Invited Talk). In IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2011). Leibniz International Proceedings in Informatics (LIPIcs), Volume 13, pp. 26-27, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2011)


@InProceedings{vazirani:LIPIcs.FSTTCS.2011.26, author = {Vazirani, Umesh V.}, title = {{Quantum State Description Complexity}}, booktitle = {IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2011)}, pages = {26--27}, series = {Leibniz International Proceedings in Informatics (LIPIcs)}, ISBN = {978-3-939897-34-7}, ISSN = {1868-8969}, year = {2011}, volume = {13}, editor = {Chakraborty, Supratik and Kumar, Amit}, publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik}, address = {Dagstuhl, Germany}, URL = {https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2011.26}, URN = {urn:nbn:de:0030-drops-33408}, doi = {10.4230/LIPIcs.FSTTCS.2011.26}, annote = {Keywords: area law, Hamiltonian, description complexity, detectability lemma, entanglement} }


**Published in:** LIPIcs, Volume 287, 15th Innovations in Theoretical Computer Science Conference (ITCS 2024)

Entanglement is a quantum resource, in some ways analogous to randomness in classical computation. Inspired by recent work of Gheorghiu and Hoban, we define the notion of "pseudoentanglement", a property exhibited by ensembles of efficiently constructible quantum states which are indistinguishable from quantum states with maximal entanglement. Our construction relies on the notion of quantum pseudorandom states - first defined by Ji, Liu and Song - which are efficiently constructible states indistinguishable from (maximally entangled) Haar-random states. Specifically, we give a construction of pseudoentangled states with entanglement entropy arbitrarily close to log n across every cut, a tight bound providing an exponential separation between computational versus information-theoretic quantum pseudorandomness. We discuss applications of this result to Matrix Product State testing, entanglement distillation, and the complexity of the AdS/CFT correspondence. As compared with a previous version of this manuscript (arXiv:2211.00747v1) this version introduces a new pseudorandom state construction, has a simpler proof of correctness, and achieves a technically stronger result of low entanglement across all cuts simultaneously.
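The quantity being bounded here, entanglement entropy across a cut, can be computed for small states via a Schmidt (singular value) decomposition. A minimal numpy sketch, purely illustrative and unrelated to the paper's construction:

```python
import numpy as np

def cut_entropy(state, n_left, n_right):
    """Von Neumann entanglement entropy (in bits) of a pure state across
    the cut between the first n_left and last n_right qubits."""
    M = state.reshape(2**n_left, 2**n_right)
    s = np.linalg.svd(M, compute_uv=False)   # Schmidt coefficients
    p = s**2
    p = p[p > 1e-15]                         # drop numerical zeros
    return float(-(p * np.log2(p)).sum())

# Product state |00>: no entanglement across the cut.
product = np.kron([1.0, 0.0], [1.0, 0.0])
# Bell state (|00> + |11>)/sqrt(2): maximal entanglement for one qubit.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

e_prod = cut_entropy(product, 1, 1)   # 0 bits
e_bell = cut_entropy(bell, 1, 1)      # 1 bit
```

Pseudoentangled ensembles make this quantity small (near log n) on every cut while remaining computationally indistinguishable from near-maximal-entropy states.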

Scott Aaronson, Adam Bouland, Bill Fefferman, Soumik Ghosh, Umesh Vazirani, Chenyi Zhang, and Zixin Zhou. Quantum Pseudoentanglement. In 15th Innovations in Theoretical Computer Science Conference (ITCS 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 287, pp. 2:1-2:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


@InProceedings{aaronson_et_al:LIPIcs.ITCS.2024.2, author = {Aaronson, Scott and Bouland, Adam and Fefferman, Bill and Ghosh, Soumik and Vazirani, Umesh and Zhang, Chenyi and Zhou, Zixin}, title = {{Quantum Pseudoentanglement}}, booktitle = {15th Innovations in Theoretical Computer Science Conference (ITCS 2024)}, pages = {2:1--2:21}, series = {Leibniz International Proceedings in Informatics (LIPIcs)}, ISBN = {978-3-95977-309-6}, ISSN = {1868-8969}, year = {2024}, volume = {287}, editor = {Guruswami, Venkatesan}, publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik}, address = {Dagstuhl, Germany}, URL = {https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2024.2}, URN = {urn:nbn:de:0030-drops-195300}, doi = {10.4230/LIPIcs.ITCS.2024.2}, annote = {Keywords: Quantum computing, Quantum complexity theory, entanglement} }


**Published in:** LIPIcs, Volume 158, 15th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2020)

A proof of quantumness is a method for provably demonstrating (to a classical verifier) that a quantum device can perform computational tasks that a classical device with comparable resources cannot. Providing a proof of quantumness is the first step towards constructing a useful quantum computer.
There are currently three approaches for exhibiting proofs of quantumness: (i) Inverting a classically-hard one-way function (e.g. using Shor’s algorithm). This seems technologically out of reach. (ii) Sampling from a classically-hard-to-sample distribution (e.g. BosonSampling). This may be within reach of near-term experiments, but for all such tasks known, verification requires exponential time. (iii) Interactive protocols based on cryptographic assumptions. The use of a trapdoor scheme allows for efficient verification, and implementation seems to require far fewer resources than (i), yet still more than (ii).
In this work we propose a significant simplification to approach (iii) by employing the random oracle heuristic. (We note that we do not apply the Fiat-Shamir paradigm.)
We give a two-message (challenge-response) proof of quantumness based on any trapdoor claw-free function. In contrast to earlier proposals we do not need an adaptive hard-core bit property. This allows the use of smaller security parameters and more diverse computational assumptions (such as Ring Learning with Errors), significantly reducing the quantum computational effort required for a successful demonstration.
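To illustrate the claw-free structure in its simplest (insecure, pre-quantum) form, one can look at Rabin's squaring function rather than the LWE-style trapdoor claw-free functions the paper actually relies on; here a "claw" is two essentially different preimages of the same square, and producing one is as hard as factoring:

```python
from math import gcd

# Toy illustration only: f(x) = x^2 mod N with N = p*q is many-to-one, and
# two essentially different preimages x, y of the same value factor N.
# This is why, under the hardness of factoring, no classical party can
# exhibit a claw, while a quantum prover can hold a superposition over
# the colliding preimages.
p, q = 11, 19                 # toy primes; real parameters are cryptographic
N = p * q
x = 13
target = pow(x, 2, N)

# Brute-force a second root; feasible only because N is tiny.
y = next(z for z in range(2, N)
         if pow(z, 2, N) == target and z not in (x, N - x))

factor = gcd(y - x, N)        # the claw reveals a factor of N (= 11 here)
```

The protocol in the paper replaces this number-theoretic toy with any trapdoor claw-free function, and the random oracle heuristic removes the need for the adaptive hard-core bit property.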

Zvika Brakerski, Venkata Koppula, Umesh Vazirani, and Thomas Vidick. Simpler Proofs of Quantumness. In 15th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 158, pp. 8:1-8:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


@InProceedings{brakerski_et_al:LIPIcs.TQC.2020.8, author = {Brakerski, Zvika and Koppula, Venkata and Vazirani, Umesh and Vidick, Thomas}, title = {{Simpler Proofs of Quantumness}}, booktitle = {15th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2020)}, pages = {8:1--8:14}, series = {Leibniz International Proceedings in Informatics (LIPIcs)}, ISBN = {978-3-95977-146-7}, ISSN = {1868-8969}, year = {2020}, volume = {158}, editor = {Flammia, Steven T.}, publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik}, address = {Dagstuhl, Germany}, URL = {https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.TQC.2020.8}, URN = {urn:nbn:de:0030-drops-120677}, doi = {10.4230/LIPIcs.TQC.2020.8}, annote = {Keywords: Proof of Quantumness, Random Oracle, Learning with Errors} }


Abstract

**Published in:** LIPIcs, Volume 151, 11th Innovations in Theoretical Computer Science Conference (ITCS 2020)

The AdS/CFT correspondence is central to efforts to reconcile gravity and quantum mechanics, a fundamental goal of physics. It posits a duality between a gravitational theory in Anti de Sitter (AdS) space and a quantum mechanical conformal field theory (CFT), embodied in a map known as the AdS/CFT dictionary mapping states to states and operators to operators. This dictionary map is not well understood and has only been computed on special, structured instances. In this work we introduce cryptographic ideas to the study of AdS/CFT, and provide evidence that either the dictionary must be exponentially hard to compute, or else the quantum Extended Church-Turing thesis must be false in quantum gravity.
Our argument has its origins in a fundamental paradox in the AdS/CFT correspondence known as the wormhole growth paradox. The paradox is that the CFT is believed to be "scrambling" - i.e. the expectation value of local operators equilibrates in polynomial time - whereas the gravity theory is not, because the interiors of certain black holes known as "wormholes" do not equilibrate and instead their volume grows at a linear rate for at least an exponential amount of time. So what could be the CFT dual to wormhole volume? Susskind’s proposed resolution was to equate the wormhole volume with the quantum circuit complexity of the CFT state. From a computer science perspective, circuit complexity seems like an unusual choice because it should be difficult to compute, in contrast to physical quantities such as wormhole volume.
We show how to create pseudorandom quantum states in the CFT, thereby arguing that their quantum circuit complexity is not "feelable", in the sense that it cannot be approximated by any efficient experiment. This requires a specialized construction inspired by symmetric block ciphers such as DES and AES, since unfortunately existing constructions based on quantum-resistant one-way functions cannot be used in the context of the wormhole growth paradox as only very restricted operations are allowed in the CFT. By contrast we argue that the wormhole volume is "feelable" in some general but non-physical sense. The duality between a "feelable" quantity and an "unfeelable" quantity implies that some aspect of this duality must have exponential complexity. More precisely, it implies that either the dictionary is exponentially complex, or else the quantum gravity theory is exponentially difficult to simulate on a quantum computer.
While at first sight this might seem to justify the discomfort of complexity theorists with equating computational complexity with a physical quantity, a further examination of our arguments shows that any resolution of the wormhole growth paradox must equate wormhole volume to an "unfeelable" quantity, leading to the same conclusions. In other words this discomfort is an inevitable consequence of the paradox.

Adam Bouland, Bill Fefferman, and Umesh Vazirani. Computational Pseudorandomness, the Wormhole Growth Paradox, and Constraints on the AdS/CFT Duality (Abstract). In 11th Innovations in Theoretical Computer Science Conference (ITCS 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 151, pp. 63:1-63:2, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


@InProceedings{bouland_et_al:LIPIcs.ITCS.2020.63, author = {Bouland, Adam and Fefferman, Bill and Vazirani, Umesh}, title = {{Computational Pseudorandomness, the Wormhole Growth Paradox, and Constraints on the AdS/CFT Duality}}, booktitle = {11th Innovations in Theoretical Computer Science Conference (ITCS 2020)}, pages = {63:1--63:2}, series = {Leibniz International Proceedings in Informatics (LIPIcs)}, ISBN = {978-3-95977-134-4}, ISSN = {1868-8969}, year = {2020}, volume = {151}, editor = {Vidick, Thomas}, publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik}, address = {Dagstuhl, Germany}, URL = {https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2020.63}, URN = {urn:nbn:de:0030-drops-117486}, doi = {10.4230/LIPIcs.ITCS.2020.63}, annote = {Keywords: Quantum complexity theory, pseudorandomness, AdS/CFT correspondence} }


**Published in:** LIPIcs, Volume 124, 10th Innovations in Theoretical Computer Science Conference (ITCS 2019)

A critical goal for the field of quantum computation is quantum supremacy - a demonstration of any quantum computation that is prohibitively hard for classical computers. It is both a necessary milestone on the path to useful quantum computers and a test of quantum theory in the realm of high complexity. A leading near-term candidate, put forth by the Google/UCSB team, is sampling from the probability distributions of randomly chosen quantum circuits, called Random Circuit Sampling (RCS).
While RCS was defined with experimental realization in mind, we give strong complexity-theoretic evidence for the classical hardness of RCS, placing it on par with the best theoretical proposals for supremacy. Specifically, we show that RCS satisfies an average-case hardness condition - computing output probabilities of typical quantum circuits is as hard as computing them in the worst-case, and therefore #P-hard. Our reduction exploits the polynomial structure in the output amplitudes of random quantum circuits, enabled by the Feynman path integral. In addition, it follows from known results that RCS also satisfies an anti-concentration property, namely that errors in estimating output probabilities are small with respect to the probabilities themselves. This makes RCS the first proposal for quantum supremacy with both of these properties. We also give a natural condition under which an existing statistical measure, cross-entropy, verifies RCS, as well as describe a new verification measure which in some formal sense maximizes the information gained from experimental samples.
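The cross-entropy measure mentioned above can be illustrated in its simplest empirical form. The following is a toy sketch with a made-up 4-qubit "ideal" distribution standing in for a random circuit's output probabilities (the paper's setting uses actual circuits and a specific verification condition):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 4                                  # toy "circuit" on n qubits
dim = 2**n
# Hypothetical ideal output distribution: squared Gaussian amplitudes
# stand in for |<x|C|0>|^2 of a fixed random circuit C.
amps = rng.normal(size=dim) + 1j * rng.normal(size=dim)
p_ideal = np.abs(amps)**2
p_ideal /= p_ideal.sum()

def cross_entropy(samples, p):
    """Empirical cross-entropy: average of log2(1/p(x)) over samples."""
    return float(np.mean(-np.log2(p[samples])))

# Samples from the ideal distribution drive the cross-entropy toward the
# Shannon entropy of p_ideal; a uniform (classically trivial) sampler
# converges to a strictly larger value, which is what the test detects.
ideal_samples = rng.choice(dim, size=200_000, p=p_ideal)
uniform_samples = rng.integers(0, dim, size=200_000)

ce_ideal = cross_entropy(ideal_samples, p_ideal)
ce_uniform = cross_entropy(uniform_samples, p_ideal)
```

The gap between the two values is the signal a cross-entropy-style verifier looks for in experimental samples.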

Adam Bouland, Bill Fefferman, Chinmay Nirkhe, and Umesh Vazirani. "Quantum Supremacy" and the Complexity of Random Circuit Sampling. In 10th Innovations in Theoretical Computer Science Conference (ITCS 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 124, pp. 15:1-15:2, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


@InProceedings{bouland_et_al:LIPIcs.ITCS.2019.15, author = {Bouland, Adam and Fefferman, Bill and Nirkhe, Chinmay and Vazirani, Umesh}, title = {{"Quantum Supremacy" and the Complexity of Random Circuit Sampling}}, booktitle = {10th Innovations in Theoretical Computer Science Conference (ITCS 2019)}, pages = {15:1--15:2}, series = {Leibniz International Proceedings in Informatics (LIPIcs)}, ISBN = {978-3-95977-095-8}, ISSN = {1868-8969}, year = {2019}, volume = {124}, editor = {Blum, Avrim}, publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik}, address = {Dagstuhl, Germany}, URL = {https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2019.15}, URN = {urn:nbn:de:0030-drops-101084}, doi = {10.4230/LIPIcs.ITCS.2019.15}, annote = {Keywords: quantum supremacy, average-case hardness, verification} }


**Published in:** LIPIcs, Volume 107, 45th International Colloquium on Automata, Languages, and Programming (ICALP 2018)

The No Low-Energy Trivial States (NLTS) conjecture of Freedman and Hastings (Quantum Information and Computation 2014), which asserts the existence of local Hamiltonians whose low-energy states cannot be generated by constant-depth quantum circuits, identifies a fundamental obstacle to resolving the quantum PCP conjecture. Progress towards the NLTS conjecture was made by Eldar and Harrow (Foundations of Computer Science 2017), who proved a closely related theorem called No Low-Error Trivial States (NLETS). In this paper, we give a much simpler proof of the NLETS theorem and use the same technique to establish superpolynomial circuit size lower bounds for noisy ground states of local Hamiltonians (assuming QCMA != QMA), resolving an open question of Eldar and Harrow. We discuss the new light our results cast on the relationship between NLTS and NLETS.
Finally, our techniques imply the existence of approximate quantum low-weight check (qLWC) codes with linear rate, linear distance, and constant weight checks. These codes are similar to quantum LDPC codes except (1) each particle may participate in a large number of checks, and (2) errors only need to be corrected up to fidelity 1 - 1/poly(n). This stands in contrast to the best-known stabilizer LDPC codes due to Freedman, Meyer, and Luo which achieve a distance of O(sqrt{n log n}).
The principal technique used in our results is to leverage the Feynman-Kitaev clock construction to approximately embed a subspace of states defined by a circuit as the ground space of a local Hamiltonian.
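For context, the Feynman-Kitaev clock construction invoked here is standard: a T-gate circuit U_T ⋯ U_1 on input |ψ_0⟩ is encoded via "history states", which are the common ground states of local propagation terms. The textbook form (a reminder of the standard construction, not the paper's specific approximate embedding) is:

```latex
% Feynman-Kitaev history state for a T-gate circuit U_T \cdots U_1 on |\psi_0\rangle:
|\Psi\rangle \;=\; \frac{1}{\sqrt{T+1}} \sum_{t=0}^{T}
  |t\rangle_{\mathrm{clock}} \otimes U_t U_{t-1} \cdots U_1 |\psi_0\rangle
% Standard propagation terms (t = 1, \dots, T) whose ground space these span:
H^{\mathrm{prop}}_t \;=\; \tfrac{1}{2}\bigl(|t\rangle\langle t| + |t-1\rangle\langle t-1|\bigr) \otimes \mathbb{I}
  \;-\; \tfrac{1}{2}\bigl(|t\rangle\langle t-1| \otimes U_t + |t-1\rangle\langle t| \otimes U_t^{\dagger}\bigr)
```

The paper's contribution is to use this embedding approximately, so that a subspace of circuit-defined states becomes (close to) the ground space of a local Hamiltonian.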

Chinmay Nirkhe, Umesh Vazirani, and Henry Yuen. Approximate Low-Weight Check Codes and Circuit Lower Bounds for Noisy Ground States. In 45th International Colloquium on Automata, Languages, and Programming (ICALP 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 107, pp. 91:1-91:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


@InProceedings{nirkhe_et_al:LIPIcs.ICALP.2018.91, author = {Nirkhe, Chinmay and Vazirani, Umesh and Yuen, Henry}, title = {{Approximate Low-Weight Check Codes and Circuit Lower Bounds for Noisy Ground States}}, booktitle = {45th International Colloquium on Automata, Languages, and Programming (ICALP 2018)}, pages = {91:1--91:11}, series = {Leibniz International Proceedings in Informatics (LIPIcs)}, ISBN = {978-3-95977-076-7}, ISSN = {1868-8969}, year = {2018}, volume = {107}, editor = {Chatzigiannakis, Ioannis and Kaklamanis, Christos and Marx, D\'{a}niel and Sannella, Donald}, publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik}, address = {Dagstuhl, Germany}, URL = {https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2018.91}, URN = {urn:nbn:de:0030-drops-90950}, doi = {10.4230/LIPIcs.ICALP.2018.91}, annote = {Keywords: quantum pcps, local hamiltonians, error-correcting codes} }


**Published in:** Dagstuhl Seminar Reports, Volume 1 (2021)

Marek Karpinski, Michael Luby, and Umesh Vazirani. Randomized Algorithms (Dagstuhl Seminar 9124). Dagstuhl Seminar Report 14, pp. 1-24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (1991)


@TechReport{karpinski_et_al:DagSemRep.14, author = {Karpinski, Marek and Luby, Michael and Vazirani, Umesh}, title = {{Randomized Algorithms (Dagstuhl Seminar 9124)}}, pages = {1--24}, ISSN = {1619-0203}, year = {1991}, type = {Dagstuhl Seminar Report}, number = {14}, institution = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik}, publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik}, address = {Dagstuhl, Germany}, URL = {https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemRep.14}, URN = {urn:nbn:de:0030-drops-149022}, doi = {10.4230/DagSemRep.14}, }
