
The Equivalence of Sampling and Searching

Scott Aaronson
MIT

In complexity theory, we love at least four types of problems
Given an input x ∈ {0,1}^n:

Languages / Decision Problems. Decide if x ∈ L or x ∉ L
Promise Problems. Decide if x ∈ Π_YES or x ∈ Π_NO
Search Problems. Output an element of a (nonempty) set A_x ⊆ {0,1}^m, with probability 1 − δ, in poly(n, 1/δ) time
Sampling Problems. Output a sample from a probability distribution D_x over m-bit strings (to within ε error in variation distance), in poly(n, 1/ε) time

Suppose we want to know whether quantum computers are stronger than classical computers
(To pick a random example of a complexity question)
Then which formal question do we really mean to ask?

BPP vs. BQP?
PromiseBPP vs. PromiseBQP?
FBPP vs. FBQP?
SampBPP vs. SampBQP?

Easy Implications
SampBPP = SampBQP ⟹ FBPP = FBQP ⟹ PromiseBPP = PromiseBQP ⟹ BPP = BQP
(Each implication holds because each kind of problem specializes the previous one: a language is a promise problem with a trivial promise, a promise problem is a search problem with answers in {0,1}, and matching quantum sampling lets a classical machine imitate the output distribution of any quantum search algorithm.)
Crucial question: Can these implications be reversed?
We show that at least one of them can:
FBPP = FBQP ⟹ SampBPP = SampBQP

Application to Linear Optics

[A.-Arkhipov, STOC'11] study a rudimentary type of quantum computer based entirely on linear optics: identical, non-interacting photons passing through a network of beamsplitters
Our model doesn't seem to be universal for quantum computing (or even classical computing), but it can solve sampling problems that we give evidence are hard classically
Using today's result, we automatically also get search problems solvable with linear optics that we give evidence are hard classically

But the QC stuff is just one application of a much more general result

Informal Statement:

Let S = {D_x}_x be any sampling problem. Then there exists a search problem R_S = {A_x}_x that's equivalent to S, in the following sense:
For any "reasonable" complexity class C (BPP, BQP, BPPSPACE, etc.),

R_S ∈ FC  ⟺  S ∈ SampC

Intuition
Suppose our sampling problem is to sample uniformly from a set A ⊆ {0,1}^n
First stab at an equivalent search problem: output any element of A
That clearly doesn't work: finding an A element could be much easier than sampling a random element!
Better idea: output an element y ∈ A whose Kolmogorov complexity K(y) is close to log₂|A|

Clearly, if we can sample a random y ∈ A, then with high probability K(y) ≈ log₂|A|
But conversely, if a randomized machine M outputs a y with K(y) ≈ log₂|A|, it can only do so by sampling y almost-uniformly from A.
For otherwise, M would yield a succinct description of y, contrary to assumption!
Technical part: Generalize to nonuniform distributions
Requires notion of a "universal randomness test" from algorithmic information theory
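Since K is uncomputable, any runnable illustration needs a stand-in. The following minimal Python sketch (my addition, not from the talk) uses zlib's compressed length as a crude upper-bound proxy for K, with A = {0,1}^n so that log₂|A| = n; it shows why incompressibility separates an honest sampler from a "cheating" searcher:

import os, zlib

n = 4096  # bits; take A = {0,1}^n, so log2|A| = n

def k_proxy(y: bytes) -> int:
    # Crude, computable stand-in for K(y): zlib output length in bits.
    return 8 * len(zlib.compress(y, 9))

y_sampled = os.urandom(n // 8)  # honest solver: uniform sample from A
y_cheap   = bytes(n // 8)       # "cheating" solver: a fixed, structured element of A

print(k_proxy(y_sampled))  # close to n: a typical sample is incompressible
print(k_proxy(y_cheap))    # far below n: a succinct description exists, so the
                           # requirement K(y) close to log2|A| rejects this output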

Comments
If we just wanted a search problem at least as hard as S, that would be easy: Kolmogorov complexity only comes in because we need R_S to be equivalent to S
Our reduction from sampling to search is non-black-box: it requires the assumption that we have a Turing machine to solve R_S!
Our result provides an extremely natural application of Kolmogorov complexity to "standard" complexity: one that doesn't just amount to a counting argument

Kolmogorov Review
K(y | x): prefix-free Kolmogorov complexity of y, conditioned on x
"Kolmogorentropy" Lemma: Let D = {p_y} be a distribution, and let y be in its support. Then

\[
  K(y) \;\le\; \log_2 \frac{1}{p_y} + K(D) + O(1),
\]

where K(D) is the length of the shortest program to sample from D. Same holds if we replace K(y) by K(y|x) and K(D) by K(D|x).
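As a sanity check (my addition, not from the talk), instantiating the lemma with D uniform over a set A recovers both halves of the Intuition slide: the lemma gives the upper bound, and a standard counting argument over short programs gives the matching lower bound for typical samples:

\[
  K(y) \;\le\; \log_2 |A| + K(D) + O(1) \qquad \text{(lemma with } p_y = 1/|A| \text{)},
\]
\[
  \Pr_{y \sim \mathrm{uniform}(A)} \bigl[\, K(y) < \log_2 |A| - c \,\bigr] \;<\; 2^{-c} \qquad \text{(fewer than } 2^{\log_2 |A| - c} \text{ programs of that length exist)}.
\]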

Constructing the Search Problem

We're given a sampling problem S = {D_x}_x, where on input x ∈ {0,1}^n and ε > 0, the goal is to sample an m-bit string from a distribution C that's ε-close to D = D_x, in poly(n, 1/ε) time.
Let

\[
  p_y := \Pr_{D_x}[y], \qquad
  N := \left\lceil \frac{2.1}{\varepsilon} \right\rceil, \qquad
  Y = \langle y_1, \ldots, y_N \rangle \in \bigl(\{0,1\}^m\bigr)^N, \qquad
  \Delta := \log_2 \frac{1}{\varepsilon},
\]
\[
  A_{x,\varepsilon} := \left\{ Y \,:\, K\bigl(Y \,\big|\, x, \lfloor 1/\varepsilon \rfloor\bigr) \;\ge\; \log_2 \frac{1}{p_{y_1} \cdots p_{y_N}} - \Delta \right\}.
\]

Then the search problem R_S is this: on input x ∈ {0,1}^n and ε > 0, output an N-tuple Y = ⟨y_1, …, y_N⟩ ∈ A_{x,ε}, with probability 1 − δ, in poly(n, 1/ε, 1/δ) time.
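Neither K nor, in general, p_y is computable, so the toy Python sketch below (my illustration, not the paper's algorithm) substitutes zlib length for K and uses a D whose probabilities are known, just to show the shape of the membership test defining A_{x,ε}; the names in_A, k_proxy, and delta_bits are hypothetical:

import math, os, zlib

def k_proxy(data: bytes) -> int:
    # Computable upper-bound stand-in for the prefix-free K(Y | x, floor(1/eps)).
    return 8 * len(zlib.compress(data, 9))

def in_A(samples, p, delta_bits):
    # Toy analogue of the A_{x,eps} test:
    # accept Y = <y_1,...,y_N> iff K(Y) >= sum_i log2(1/p(y_i)) - Delta.
    threshold = sum(math.log2(1.0 / p(y)) for y in samples) - delta_bits
    return k_proxy(b"".join(samples)) >= threshold

# Toy D: uniform over 32-byte strings, so p(y) = 2^-256 for every y.
p = lambda y: 2.0 ** -256
honest = [os.urandom(32) for _ in range(20)]  # i.i.d. samples from D
cheat  = [bytes(32)] * 20                     # one low-complexity string, repeated

# delta_bits plays the role of Delta; 200 bits of slack absorbs zlib overhead.
print(in_A(honest, p, delta_bits=200))  # True w.h.p.: random bytes don't compress
print(in_A(cheat,  p, delta_bits=200))  # False: the tuple compresses far below threshold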

Equivalence Proof
Lemma: Let C be any distribution over {0,1}^m such that |C − D_x| ≤ ε. Then

\[
  \Pr_{Y \sim C^N} \bigl[\, Y \notin A_{x,\varepsilon} \,\bigr] \;\le\; \varepsilon N + O\!\left(\frac{1}{2^{\Delta}}\right).
\]
In other words, any algorithm that solves
the sampling problem also solves the
search problem w.h.p.
Proof: Counting argument.
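For completeness, here is the standard counting step behind that proof, stated for exact samples from D^N (the ε-closeness of C contributes the εN term): if K(Y) undercuts log₂(1/Pr[Y]) by Δ bits, then Pr[Y] < 2^{−K(Y)−Δ}, and Kraft's inequality for prefix-free complexity bounds the total probability mass:

\[
  \Pr_{Y \sim D^N} \left[ K(Y) < \log_2 \frac{1}{\Pr[Y]} - \Delta \right]
  \;\le\; \sum_{Y} 2^{-K(Y) - \Delta}
  \;\le\; 2^{-\Delta}.
\]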

Lemma: Given a probabilistic Turing machine B, suppose

\[
  \Pr \bigl[\, B(x, \varepsilon) \in A_{x,\varepsilon} \,\bigr] \;\ge\; 1 - \delta.
\]

Let C be the distribution over m-bit strings obtained by running B(x,ε), then picking one of its N outputs y_1, …, y_N randomly. Then there exists a constant Q_B such that

\[
  \| C - D_x \| \;\le\; O\!\left(\sqrt{Q_B\, \varepsilon}\right).
\]

In other words: if B solves the search problem w.h.p., then it also solves the sampling problem
Proof Sketch: Use the Kolmogorentropy Lemma to show that B(x,ε)'s output distribution has small KL-divergence from D^N. Similar to the Parallel Repetition Theorem, this implies C has small KL-divergence from D. By Pinsker's Inequality, |C − D| is then small as well.
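For reference, the two standard facts the sketch chains together, with the Q_B bookkeeping omitted; here 𝒞 denotes B(x,ε)'s joint distribution over N-tuples and C_i its i-th marginal (my notation, KL in nats). The first, superadditivity of KL-divergence relative to a product measure, is the parallel-repetition-style step; the second converts divergence back to variation distance:

\[
  \sum_{i=1}^{N} D_{\mathrm{KL}}\bigl( C_i \,\big\|\, D \bigr) \;\le\; D_{\mathrm{KL}}\bigl( \mathcal{C} \,\big\|\, D^{\otimes N} \bigr),
  \qquad
  \| C - D \| \;\le\; \sqrt{ \tfrac{1}{2}\, D_{\mathrm{KL}}( C \,\|\, D ) }.
\]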

Wrapping Up
Theorem:
Let O be any oracle that, given x, 0^{1/ε}, and a random string r, outputs a sample from a distribution C such that |C − D_x| ≤ ε. Then R_S ∈ FBPP^O.
Let B be any probabilistic Turing machine that, given x, 0^{1/ε}, outputs a Y ∈ A_{x,ε} with probability 1 − δ. Then S ∈ SampBPP^B.

Application to Quantum Complexity
Suppose FBPP = FBQP.
Let S ∈ SampBQP. Then
R_S ∈ FBQP  [R_S ≤ S reduction]
⟹ R_S ∈ FBPP  [by hypothesis]
⟹ S ∈ SampBPP.  [S ≤ R_S reduction]
Hence SampBPP = SampBQP.

Open Problems
Can we show there's no black-box equivalence between search and sampling problems? (I.e., that our use of Kolmogorov complexity was necessary?)
The converse direction: Given a search problem, can we construct an equivalent sampling problem?
What if we want the search problem to be checkable?
We can redo the proof with space-bounded Kolmogorov complexity to put the search problem in PSPACE, but it seems hard to do better
More equivalence theorems, ideally involving decision and promise problems?
