Minimum Circuit Size, Graph Isomorphism, and Related Problems

We study the computational power of deciding whether a given truth-table can be described by a circuit of a given size (the Minimum Circuit Size Problem, or MCSP for short), and of the variant denoted as MKTP where circuit size is replaced by a polynomially-related Kolmogorov measure. All prior reductions from supposedly-intractable problems to MCSP / MKTP hinged on the power of MCSP / MKTP to distinguish random distributions from distributions produced by hardness-based pseudorandom generator constructions. We develop a fundamentally different approach inspired by the well-known interactive proof system for the complement of Graph Isomorphism (GI). It yields a randomized reduction with zero-sided error from GI to MKTP. We generalize the result and show that GI can be replaced by any isomorphism problem for which the underlying group satisfies some elementary properties. Instantiations include Linear Code Equivalence, Permutation Group Conjugacy, and Matrix Subspace Conjugacy. Along the way we develop encodings of isomorphism classes that are efficiently decodable and achieve compression that is at or near the information-theoretic optimum; those encodings may be of independent interest.


How to eliminate error?
MµP is only used to generate witnesses, which are then checked in deterministic polynomial time. Thus, showing GI ∈ coRP^MµP by a similar approach implicitly requires GI ∈ coNP, i.e., NP-witnesses for nonisomorphism.
Main idea: use the KT-complexity of a random sample to estimate the entropy.
Hope: KT(y) is typically near the entropy, and never much larger. Then KT(y) > θ is a witness of nonisomorphism.
Truth: KT(y)/t is typically near the entropy, and never much larger, where y is the concatenation of t independent samples. Then KT(y)/t > θ is a witness of nonisomorphism.
Bounding KT in the isomorphic case: describe y explicitly (encoding detailed below). In the nonisomorphic case: y is t independent samples from a distribution of entropy s + 1, so KT(y)/t ≥ s + 1 − o(1) holds w.h.p.
=⇒ a coRP^MKTP algorithm for GI on rigid graphs.
Assume for simplicity that there are as many distinct permutations of G_0 as of G_1, and let s be the entropy of a random permutation of G_i; the hope is that KT(y)/t looks the same in the isomorphic case. Two complications:
• Encoding the distinct permutations of G_0 as numbers 1, …, n! is too expensive.

Proof of the PAC-underestimator claim. Recall that there is a deterministic algorithm using an oracle for GI that computes generators for Aut(G).
Plug in an existing RP^MKTP algorithm for the oracle: this gives us generators for a group A with A = Aut(G) w.h.p.
Prune away any generators of A that are not in fact automorphisms of G, so that A ≤ Aut(G) always.
|A| can be computed efficiently from its generators. Output log |A|.
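A toy illustration of the last two steps. In practice |A| is computed from generators with Schreier–Sims in polynomial time; the naive closure below (helper names are mine, and it is exponential in general) suffices for tiny graphs, with brute-force search standing in for the oracle-based generator-finding:

```python
from itertools import permutations

def compose(p, q):
    """Composition of permutations given as tuples: (p ∘ q)(i) = p[q[i]]."""
    return tuple(p[i] for i in q)

def group_order(gens):
    """Order of the group generated by gens, by naive closure.
    (Real implementations use Schreier-Sims to stay polynomial.)"""
    n = len(gens[0])
    identity = tuple(range(n))
    group = {identity}
    frontier = [identity]
    while frontier:
        g = frontier.pop()
        for h in gens:
            gh = compose(g, h)
            if gh not in group:
                group.add(gh)
                frontier.append(gh)
    return len(group)

def automorphism_generators(adj):
    """Brute-force all automorphisms of a small graph; a stand-in for the
    GI-oracle-based generator-finding procedure described above."""
    n = len(adj)
    return [p for p in permutations(range(n))
            if all(adj[i][j] == adj[p[i]][p[j]]
                   for i in range(n) for j in range(n))]

# Path on 3 vertices: Aut = {identity, reversal}, order 2.
path3 = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
assert group_order(automorphism_generators(path3)) == 2
# Triangle: Aut = S_3, order 6.
k3 = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
assert group_order(automorphism_generators(k3)) == 6
```

The pruning step in the proof corresponds to keeping only permutations that pass the adjacency check, which guarantees the closure is a subgroup of Aut(G).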
General graphs: Recap

Generic Encoding Lemma
We saw that for any rigid graph G , the n! distinct permutations of G can be encoded as integers 1, . . ., n!.
This can be extended to general graphs, but still involves heavy use of the structure of the symmetric group.

Open Problems
Open problem: SZK ⊆ ZPP^MKTP? The obstacle is devising witnesses for non-flat distributions: there are distributions with low entropy but supported on every string, so a nontrivial worst-case bound on KT-complexity is impossible.
The argument should work for MCSP, but fails for annoying technical reasons. This is true even for rigid-GI.
We use KT complexity in two ways:
• Counting argument: KT(y) ≥ ts + t w.h.p.
• Encoding: any string of length ts has KT ≲ ts
For circuits, we instead get:
• Counting argument: CSIZE(y) ≳ (ts + t)/log(ts + t) w.h.p.
• Encoding: any string of length ts has CSIZE ≲ ts/log(ts)
Low-order terms matter: the best known bounds require exponentially-large t to force a gap between the isomorphic and nonisomorphic cases.

Background: definitions and known reductions

MCSP = {(x, θ) : x has circuit complexity at most θ}
How hard is MCSP?
• Factoring ∈ ZPP^MCSP [Allender-Buhrman-Koucký-van Melkebeek-Ronneburger]
• DiscreteLog ∈ ZPP^MCSP [Allender-Buhrman-Koucký-van Melkebeek-Ronneburger, Rudow]
• GI ∈ RP^MCSP [Allender-Das]
• SZK ⊆ BPP^MCSP [Allender-Das]
where GI = graph isomorphism and SZK = the class of problems with statistical zero-knowledge protocols.
One can replace MCSP by MµP for any complexity measure µ polynomially related to circuit size.

KT complexity: describe a string x by a program p so that p(i) = i-th bit of x. KT(x) = smallest |p| + T, where
• p describes x
• p runs in at most T steps for all i
MKTP = {(x, θ) : KT(x) ≤ θ}. Time-bounded Turing machines with advice ≅ circuits =⇒ KT is polynomially related to circuit complexity. MµP stands for any of MCSP, MKTP, ….

Eliminate error in the GI and SZK reductions?
Theorem. GI ∈ ZPP^MKTP.
This is a fundamentally different reduction from before. It extends to any 'explicit' isomorphism problem, including several where the best known algorithms are still exponential. It doesn't (yet) work for MCSP.

How do the old reductions work? They hinge on MµP breaking PRGs; a PRG can be built from any one-way function [Håstad-Impagliazzo-Levin-Luby].
Inversion Lemma. There is a poly-time randomized Turing machine M using oracle access to MµP so that the following holds. For any circuit C, if σ ∼ {0,1}^n, then Pr[C(τ) = C(σ)] ≥ 1/poly(|C|), where τ = M(C, C(σ)). [Allender-Buhrman-Koucký-van Melkebeek-Ronneburger]
Example (GI ∈ RP^MCSP [Allender-Das]): Fix a graph G. Let C map a permutation σ to σ(G). M inverts C: if σ(G) is a random permutation of G, then M(C, σ(G)) finds τ s.t. τ(G) = σ(G) with good probability.
Theorem. GI ∈ RP^MµP [Allender-Das]

The new approach instead uses MKTP to help with verification.
Theorem. GI ∈ ZPP^MKTP. Nonisomorphism has NP^MKTP witnesses.
Key idea: KT complexity is a good estimator for the entropy of samplable distributions.

Graph Isomorphism in ZPP^MKTP
GI = decide whether two given graphs (G_0, G_1) are isomorphic.
Aut(G) = group of automorphisms of G.
Number of distinct permutations of G = n!/|Aut(G)|.
To show GI ∈ ZPP^MKTP, it suffices to show GI ∈ coRP^MKTP, i.e., to witness nonisomorphism.

Recall: KT(x) = smallest |p| + T where p describes x in time T. Intuition for bounding KT(x): describe a string x by a program p taking advice α so that p^α(i) = i-th bit of x. Then KT(x) is the smallest |p| + |α| + T where
• p with advice α describes x
• p runs in at most T steps for all i

Consider sampling r ∼ {0,1} and π ∼ S_n uniformly, and outputting the adjacency matrix of π(G_r).
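The support sizes behind the entropies of this sampled distribution can be checked directly on small graphs: the number of distinct permutations of G is n!/|Aut(G)|, by orbit-stabilizer. A brute-force sketch (helper names are mine):

```python
from itertools import permutations
from math import factorial

def permute(adj, p):
    """Adjacency matrix of p(G): edge (u, v) of G becomes edge (p[u], p[v])."""
    n = len(adj)
    new = [[0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            new[p[u]][p[v]] = adj[u][v]
    return tuple(map(tuple, new))

def distinct_permutations(adj):
    """All distinct adjacency matrices obtainable by permuting G's vertices."""
    return {permute(adj, p) for p in permutations(range(len(adj)))}

def aut_order(adj):
    """|Aut(G)| by brute force: permutations fixing the adjacency matrix."""
    base = tuple(map(tuple, adj))
    return sum(permute(adj, p) == base for p in permutations(range(len(adj))))

path3 = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]  # path: |Aut| = 2, so 3!/2 = 3 images
k3 = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]     # triangle: |Aut| = 6, so 3!/6 = 1 image
assert len(distinct_permutations(path3)) == factorial(3) // aut_order(path3) == 3
assert len(distinct_permutations(k3)) == factorial(3) // aut_order(k3) == 1
```

So when G_0 ≇ G_1 the sampler above spreads its mass over both supports (one extra bit of entropy from r), while in the isomorphic case the two supports coincide.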

(Diagram: a number line marking s, the threshold θ, and s + 1, with θ lying strictly between s and s + 1.)
Indexing the various permutations of a non-rigid graph G as numbers 1, …, n! is too expensive. We need to use numbers 1, …, N, where N = n!/|Aut(G)|. Such a specific encoding exists, but we will see a more general-purpose substitute soon.
It suffices to give a probably-approximately-correct overestimator (PAC overestimator) for θ: an estimate that never falls below θ and, w.h.p., is close to θ.
(Diagram: θ separates KT(y)/t when G_0 ≅ G_1 from KT(y)/t when G_0 ≇ G_1.)
Equivalently, it suffices to give a PAC underestimator for log |Aut(G_i)|, since θ = (log n! − log |Aut(G_i)|) + 1/2.
Claim. There is an efficient randomized algorithm using MKTP to PAC underestimate log |Aut(G)| when given G.
Rigid graphs: Isomorphic case
Lehmer Code. There is an indexing of S_n by the numbers 1, …, n! so that the i-th permutation can be decoded from the binary representation of i in time poly(n).
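A minimal sketch of such an indexing via the Lehmer code (factorial number system), using 0-based indices 0, …, n!−1 rather than the slides' 1, …, n!; both encoding and decoding run in time poly(n):

```python
from itertools import permutations

def lehmer_encode(perm):
    """Index of perm (a tuple on {0, ..., n-1}) in lexicographic order."""
    n = len(perm)
    index = 0
    for i in range(n):
        # Lehmer digit: how many later entries are smaller than perm[i].
        smaller = sum(1 for j in range(i + 1, n) if perm[j] < perm[i])
        index = index * (n - i) + smaller
    return index

def lehmer_decode(index, n):
    """Inverse map: recover the permutation from its index."""
    digits = []
    for radix in range(1, n + 1):  # factorial-base digits, least significant first
        digits.append(index % radix)
        index //= radix
    digits.reverse()
    remaining = list(range(n))
    return tuple(remaining.pop(d) for d in digits)

# Round trip over all of S_4, in lexicographic order.
perms = list(permutations(range(4)))
assert [lehmer_decode(i, 4) for i in range(24)] == perms
assert all(lehmer_encode(p) == i for i, p in enumerate(perms))
```

Decoding only needs the factorial-base digits of i, which is why the i-th permutation is recoverable directly from the binary representation of i.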
• Fixed data: n, t, adjacency matrix of G_0
• Per-sample data: τ_1, …, τ_t
• Decoding algorithm: to output the j-th bit of y, look up the appropriate τ_i and compute τ_i(G_0)
Suppose each τ_i can be encoded into s bits. Then
KT(y) < O(1) [the |p| term] + (poly(n, log t) + ts) [the |α| term] + poly(n, log t) [the T term] = ts + poly(n, log t) ≤ ts + t (for t large).
Naïve conversion to binary: KT(y) < t⌈s⌉ + poly(n, log t), and t⌈s⌉ can be as large as ts + t, leaving no gap.
Blocking trick: amortize the encoding overhead across samples. This yields, for some δ > 0, KT(y) ≤ ts + t^{1−δ} poly(n), i.e., KT(y)/t ≤ s + poly(n)/t^δ.

Rigid graphs: Recap

The overhead of log s + O(1) per sample is worse than ⌈s⌉ − s, but can still be amortized out. End result: for any graph G, any t permutations of G have KT-complexity at most ts + t^{1−δ} poly(n). In general, for any circuit C of max-entropy s, any t samples from C have KT-complexity at most ts + t^{1−δ} poly(|C|).
Let C be any circuit sampling a distribution of max-entropy s_max and min-entropy s_min, and let y be the concatenation of t independent samples from C. Then KT(y)/t is typically between s_min − o(1) and s_max + o(1), and never much larger. Nice case: s_max − s_min = o(1), i.e., C is "almost flat".
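The bit-counting side of the blocking trick can be seen concretely: storing t indices drawn from {0, …, N−1} naïvely costs t⌈log₂ N⌉ bits, while packing them into a single base-N integer costs only ⌈t log₂ N⌉ bits, the information-theoretic optimum up to rounding. A minimal sketch (function names are mine; this ignores the amortized decoding-time part of the trick):

```python
from math import ceil, log2

def pack(indices, N):
    """Pack t base-N digits into one integer, most significant digit first."""
    value = 0
    for d in indices:
        assert 0 <= d < N
        value = value * N + d
    return value

def unpack(value, N, t):
    """Recover the t base-N digits from the packed integer."""
    digits = []
    for _ in range(t):
        digits.append(value % N)
        value //= N
    return digits[::-1]

N, t = 3, 100
indices = [i % N for i in range(t)]
packed = pack(indices, N)
assert unpack(packed, N, t) == indices
# Naive cost: t * ceil(log2 N) = 200 bits. Packed: at most ceil(t * log2 N) = 159 bits.
assert packed.bit_length() <= ceil(t * log2(N))
```

The fraction of a bit saved per sample is exactly the ⌈s⌉ − s slack that the naïve binary conversion wastes.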
A complete problem for SZK: determine whether a given samplable distribution has entropy at least a given threshold. The entropy-estimator theorem can reproduce SZK ⊆ BPP^MKTP. SZK ⊆ ZPP^MKTP?