\documentclass{article}
\usepackage{amsmath,amssymb}
\DeclareMathOperator{\PH}{PH}
\DeclareMathOperator{\PSPACE}{PSPACE}
\DeclareMathOperator{\FP}{FP}
\DeclareMathOperator{\GapP}{Gap-P}
\DeclareMathOperator{\acc}{acc}
\DeclareMathOperator{\rej}{rej}
\DeclareMathOperator{\OR}{OR}
\DeclareMathOperator{\AC}{AC}
\input{preamble.tex}
\begin{document}
\def\st{\ |\ }
\def\sub{\subset}
\def\NumP{\# P}
\def\R{\mathbb{R}}
\lecture{11}{March 9, 2000}{Dan Spielman}{Michael Brewer}
\section{Old News}
First, a clarification of what happened last time. We showed that given
an OR gate of arbitrary fan-in, $\OR(x_1,\dots,x_n)$, we could replace
it with a randomized polynomial $P(r,x_1,\dots,x_n)$, having degree
$O((\log n)^k)$ and with $|r| = O((\log n)^k)$ for some $k$, such that
$$\forall x \left( \Pr_{r}(\OR(x_1,\dots,x_n) = P(r,x_1,\dots,x_n) ) >
1 - \left(\frac{7}{8} \right)^{n^{k-3}} \right).$$
Given an $\AC_0$ circuit $C$ of constant depth $c$, then, we can replace
each $\OR$ gate with $P$, and each AND gate with
$1 - P(r, 1-x_1,\dots,1-x_n)$, by De Morgan's laws, to get a polynomial
$P_C$ of degree $O((\log n)^{kc})$, since composing the polynomials to
depth $c$ multiplies their degrees. Given that
$$\#\{\text{gates}\} \times \left(\frac{7}{8} \right)^{n^{k-3}} < \frac{1}{4},$$
we have that
$$\Pr_{r} (C(x_1,\dots,x_n) = P_C(r,x_1,\dots,x_n)) > \frac{3}{4}.$$
Now, assuming that $C$ computes $\text{parity}(x_1,\dots,x_n)$, and transforming our
polynomials from $\{0,1\}$ to $\{1,-1\}$, as discussed last time, we get
that $\text{parity}(x_1,\dots,x_n) = \prod_{i=1}^n x_i$.
By averaging, there is some fixing of the random bits $r$ for which
$P_C(r,x_1,\dots,x_n) = \text{parity}(x_1,\dots,x_n)$ on at least 3/4 of the inputs.
Let $S = \{\text{inputs on which $P_C$ and parity agree}\}$. Then, for
the function space $S \to \R$, maps from $S$ to the real numbers, we have
$\dim (S \to \R) \geq \tfrac{3}{4} \cdot 2^n$. On the other hand, using
the parity function, we only need monomials of degree at most
$\tfrac{n}{2}$ (plus the small degree of $P_C$) to express all such
functions, using the fact that $(\pm 1)^2 = 1$ and the fact that, modulo
the parity function, any monomial of degree greater than $\tfrac{n}{2}$
is equivalent to one of degree less than $\tfrac{n}{2}$. This is a
contradiction: the number of monomials of degree at most roughly
$\tfrac{n}{2}$ is about $2^{n-1}$, which is smaller than
$\tfrac{3}{4} \cdot 2^n$.
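The dimension comparison behind this contradiction can be checked numerically. The sketch below (a hypothetical illustration in Python, which ignores the additional polylogarithmic degree contributed by $P_C$) counts the multilinear monomials of degree at most $n/2$ and compares against $\tfrac{3}{4} \cdot 2^n$:

```python
from math import comb

def monomials_up_to_half(n):
    """Count multilinear monomials in n variables of degree <= n/2."""
    return sum(comb(n, i) for i in range(n // 2 + 1))

# The span of low-degree monomials is too small to contain all
# functions on the agreement set S, whose dimension is >= (3/4) * 2^n.
for n in (4, 8, 16, 32):
    low = monomials_up_to_half(n)
    bound = 3 * 2**n // 4
    print(n, low, bound, low < bound)
```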
In fact, by careful analysis of the proof above, we get that a circuit
computing parity needs $2^{n^{\Omega(1/d)}}$ gates.
Now on to the new material:
\section{A Relativization for $\PH \not= \PSPACE$}
\begin{theorem}There exists an oracle $A$ such that $\PH^A \subsetneq
\PSPACE^A$.
\end{theorem}
\begin{proof}
The idea is to find something that a $\PSPACE$ machine can do easily, but which
is very hard for a $\PH$ machine. We will essentially use a very long parity
computation.
For an oracle $A$, define
$$L_A = \{0^n \st A \ \text{has an odd number of
strings of length $n$}\}.$$
\begin{itemize}
\item $L_A \in \PSPACE^A$.
To see whether $0^n \in L_A$, we check whether each word of length $n$ is
in $A$, keeping a running count of how many are, and compute the parity of
the total. This requires only polynomial space for the counter.
\item $L_A \not\in \PH^A$.
It suffices to show that $L_A \not\in \Sigma_k^P$ for any $k$, and in
particular to show that no $\Sigma_k^P$ machine which runs in time $n^c$ for
constant $c$ decides $L_A$. Suppose, to the contrary, that such a machine
$M$ did decide $L_A$.
\begin{figure}[h]
\begin{center}
\input{circuit.latex}
\caption{\text{Circuit representing $M$.}}
\label{figure:circuit}
\end{center}
\end{figure}
We can represent $M$ with an alternating circuit $C$, a loose
diagram of which is given in Figure \ref{figure:circuit}, which starts
with an OR gate, has branching factor $2^{n^c}$ at each node, and does
polynomial ($n^c$) work at each leaf.
Note the size of this circuit isn't quite polynomial in $2^n = N$. Its
size is bounded by $2^{k(\log N)^c} < 2^{N^{1/k}}$ for sufficiently large
$N$, however, so we will be able to apply our circuit lower bound on parity.
At each branch, $C$ can access the oracle polynomially many times.
To apply our lower bound, we'll need to correct this:
\begin{claim}
For any oracle machine $M^? \in \Sigma_k^P$, there exists an equivalent
${M'}^? \in \Sigma_{k+2}^P$ which queries the oracle only once on each branch.
\end{claim}
\begin{proof-sketch}
We want $M'$ to simulate $M$. We can assume $M$ passes through all $\exists$
and $\forall$ states before asking questions of the oracle. $M'$ will follow
$M$ through this stage. At this point, $M'$ will:
\begin{enumerate}
\item Use $\exists$ states to guess which questions $M$ will ask of the oracle,
and what answers the oracle will give.
\item Simulate $M$ using these guesses, rejecting if $M$ asks a question whose
answer $M'$ did not guess in step 1.
\item Use $\forall$ states to choose one question from step 1, ask the oracle
that question, and accept if and only if the guessed answer agrees with the
oracle.
\end{enumerate}
\end{proof-sketch}
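The guess-and-verify structure of this sketch can be illustrated concretely. Below is a minimal Python sketch, under the assumption of a toy three-query machine (the names `M`, `M_prime`, and `QUESTIONS` are hypothetical); the brute-force loops play the roles of the $\exists$ and $\forall$ states:

```python
from itertools import product

# Toy base machine: asks three oracle questions and accepts iff an
# odd number of them are answered yes. (Hypothetical example.)
QUESTIONS = ["q1", "q2", "q3"]

def M(oracle):
    return sum(oracle[q] for q in QUESTIONS) % 2 == 1

def M_prime(oracle):
    # Exists: guess the questions M asks and the oracle's answers.
    for guess in product([False, True], repeat=len(QUESTIONS)):
        answers = dict(zip(QUESTIONS, guess))
        # Simulate M using only the guessed answers.
        if sum(answers[q] for q in QUESTIONS) % 2 != 1:
            continue
        # Forall: verify each guessed answer with one oracle query
        # per branch; accept iff every guess agrees with the oracle.
        if all(oracle[q] == answers[q] for q in QUESTIONS):
            return True
    return False

# M' agrees with M on every oracle over these questions.
for bits in product([False, True], repeat=len(QUESTIONS)):
    oracle = dict(zip(QUESTIONS, bits))
    assert M(oracle) == M_prime(oracle)
print("M' matches M on all oracles")
```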
\pagebreak
Now, for any $M \in \Sigma_k^P$, for each input length $n$, on input $0^n$,
we get some depth-$(k+2)$ AND/OR circuit of size $2^{\textit{poly}(n)}$, whose
inputs each indicate whether or not a single word of length $n$ is in the
oracle (there are $2^n$ such inputs), with NOTs just above the inputs. We can
feed the circuit the words in the oracle of length $n$ to decide whether or
not the circuit accepts.
From our circuit lower bound, we know that for some sufficiently large $n$,
there exists $A$ such that the circuit gets parity wrong. That is, either
$M^A(0^n)$ accepts when $0^n \not\in L_A$, or $M^A(0^n)$ rejects when
$0^n \in L_A$.
To diagonalize, build $A$ as follows. Enumerate machines $M_i$, and find
for each $i$ a sufficiently large $n_i$ so that no $M_j$ with $j < i$ asks
questions about words of length at least $n_i$ on input $0^{n_j}$, and so
that there exists some language $A_i$ for which $M_i$ fails to compute
parity at length $n_i$. Make $A$ agree with $A_i$ on words of length $n_i$.
\end{itemize}
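The parity computation in the first bullet can be sketched as follows; the membership predicate standing in for the oracle $A$ is hypothetical, and note that the real machine stores only the counter, never the list of words:

```python
from itertools import product

def in_L_A(n, oracle_contains):
    """Decide whether 0^n is in L_A: does A contain an odd number
    of strings of length n? oracle_contains plays the role of A."""
    parity = 0
    for bits in product("01", repeat=n):  # all 2^n words of length n
        if oracle_contains("".join(bits)):
            parity ^= 1  # one bit of state suffices for the running parity
    return parity == 1

# Example: A contains exactly the length-n strings starting with '1'.
oracle = lambda w: w.startswith("1")
print(in_L_A(3, oracle))  # four such strings of length 3, so False
```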
\end{proof}
\section{$\NumP$ and $\GapP$}
We previously defined $\NumP$. Note that this class is inflexible: the sum or
product of two $\NumP$ functions is again in $\NumP$, but the difference need
not be, since it can be negative. We define something a bit easier to work
with. Write
\begin{align*}
\acc_M(x) &= \#\{\text{accepting paths of $M$ on $x$}\} \\
\rej_M(x) &= \#\{\text{rejecting paths of $M$ on $x$}\}
\end{align*}
\begin{definition} A function $f \in \GapP$ if there exists a polynomial-time
non-deterministic TM $M$ such that
$$\forall x (f(x) = \acc_M(x) - \rej_M(x)).$$
\end{definition}
\begin{theorem}$P^{\NumP} = P^{\GapP}$.\end{theorem}
\begin{proof}
\begin{itemize}
\item $P^{\GapP} \sub P^{\NumP}$.
We must compute in $P^{\NumP}$, for a polynomial time NTM $M$,
$\# \{\text{accepting paths of $M$}\} - \#\{\text{rejecting paths of
$M$}\}$. Using the $\NumP$ oracle will give us the number of accepting
paths. To get the number of rejecting paths, we swap the accept and reject
states for $M$, creating a new NTM $M'$, and call the $\NumP$ oracle on $M'$.
\item $P^{\NumP} \sub P^{\GapP}.$
Given a polynomial time NTM $M$, replace our reject states as follows:
\begin{figure}[h]
\begin{center}
\input{modify.latex}
\caption{\text{Change in reject states of $M$.}}
\end{center}
\end{figure}
Call the resulting machine $M'$. Then
$$\acc_{M'}(x) - \rej_{M'}(x) = \acc_M(x).$$
\end{itemize}
\end{proof}
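Both directions of this proof can be sketched abstractly by modeling a machine on a fixed input as the list of its computation paths' outcomes (a toy model of the transformations, not of the machines themselves):

```python
# True = accepting path, False = rejecting path.

def gap(paths):
    """acc(x) - rej(x) for a machine modeled by its path outcomes."""
    return sum(paths) - sum(not p for p in paths)

def swap_states(paths):
    """M -> M': swap the accept and reject states."""
    return [not p for p in paths]

def cancel_rejects(paths):
    """M -> M': each rejecting path branches once more, into one
    accepting and one rejecting path, contributing 0 to the gap."""
    out = []
    for p in paths:
        out.extend([True] if p else [True, False])
    return out

paths = [True, True, False, True, False]  # 3 accepting, 2 rejecting
print(sum(swap_states(paths)))            # 2: the rejecting paths of M
print(gap(cancel_rejects(paths)))         # 3: acc_M(x), as in the proof
```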
Some great things about $\GapP$:
\begin{lemma}
Let $f,g \in \GapP$ and $h \in \FP$, the class of polynomial-time-computable
functions. Then
\begin{enumerate}
\item $f + g \in \GapP$
\item $-f \in \GapP$
\item $f \cdot g \in \GapP$
\item $h \in \GapP$
\end{enumerate}
\end{lemma}
\begin{proof}
Suppose that $M_f$ and $M_g$ are defining NTMs of $f$ and $g$,
respectively. Then
\begin{enumerate}
\begin{figure}[h]
\begin{center}
\input{fplusg.latex}
\hspace{15ex}
\input{fg.latex}
\caption{\text{Defining NTMs for $f + g$ and $f \cdot g$, respectively.}}
\label{figure:plusandtimes}
\end{center}
\end{figure}
\item $f + g$ has as defining NTM the machine on the left hand side
of Figure \ref{figure:plusandtimes}. Clearly the numbers of accepting
and rejecting paths add.
\item $-f$ has as defining NTM the machine generated from $M_f$ by
swapping accept and reject states.
\item The defining NTM for $f \cdot g$ has computation structure as on the
right hand side of Figure \ref{figure:plusandtimes}. We
modify the machine so that it accepts if both $M_f$ and $M_g$ accept or
both reject. Call the result $M'$. We have on input $x$,
\begin{align*}
\acc_{M'}(x) - \rej_{M'}(x) &= \acc_{M_f}(x) \acc_{M_g}(x) +
\rej_{M_f}(x) \rej_{M_g}(x) \\
&\hspace{10ex}- \acc_{M_f}(x) \rej_{M_g}(x)
- \rej_{M_f}(x) \acc_{M_g}(x) \\
&= (\acc_{M_f}(x) - \rej_{M_f}(x)) \cdot (\acc_{M_g}(x) - \rej_{M_g}(x)) \\
&= f(x) \cdot g(x)
\end{align*}
as desired.
\item Define $M$ to, on input $x$, deterministically compute $h(x)$. If
$h(x) = 0$, then $M$ will branch once, accepting on one branch and
rejecting on the other. If $h(x) > 0$ ($< 0$, respectively), then $M$ will
branch according to the following recursive rule applied to $|h(x)|$,
accepting (resp., rejecting) on each leaf: to create $1$ leaf, $M$ does
not branch; otherwise, to create $m$ leaves, $M$ branches once, creating
$\lceil m/2 \rceil$ leaves on its left branch and $\lfloor m/2 \rfloor$
leaves on its right branch.
\begin{figure}[h]
\begin{center}
\input{three.latex}
\caption{\text{Branching diagram for $M$ when $h(x) = 3$.}}
\label{figure:three}
\end{center}
\end{figure}
In all $M$ will create $|h(x)|$ leaves, so that $\acc_M(x) - \rej_M(x) = h(x)$.
An example for $M$'s branching process when $h(x) = 3$ is in Figure
\ref{figure:three}.
\end{enumerate}
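The recursive branching in item 4 is easy to check in code. The sketch below (hypothetical function names; the machine's leaves are counted rather than simulated) confirms that the rule creates exactly $|h(x)|$ leaves, so the gap equals $h(x)$:

```python
def leaves(m):
    """Number of leaves created by the recursive branching rule, m >= 1."""
    if m == 1:
        return 1  # a single leaf: do not branch
    # branch once: ceil(m/2) leaves on the left, floor(m/2) on the right
    return leaves((m + 1) // 2) + leaves(m // 2)

def gap_of_M(h_x):
    """acc_M(x) - rej_M(x) for the machine built from h(x)."""
    if h_x == 0:
        return 1 - 1  # one accepting branch and one rejecting branch
    n = leaves(abs(h_x))  # all leaves accept (h > 0) or all reject (h < 0)
    return n if h_x > 0 else -n

assert all(gap_of_M(v) == v for v in range(-20, 21))
print("gap equals h(x) for all tested values")
```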
\end{proof}
\end{document}