\documentclass{article}
\usepackage{psfig,amsmath,amscd}
\input{preamble.tex}
\newcommand{\tm}{{\rm TM}}
\newcommand{\prob}[2][]{\underset{#1}{\rm Prob}\left[{#2}\right]}
\newcommand{\func}{\rightarrow}
\newcommand{\Space}[1]{\rm SPACE(#1)}
\begin{document}
\lecture{14}{April 1, 1999}{Daniel A. Spielman}{Prahladh Harsha}
\section{Pseudo-Random Generators}
In today's lecture we shall introduce the notion of a {\it
Pseudo-Random Generator} (PRG). Roughly speaking, a PRG is a
deterministic device that takes a fairly small (random) seed, blows it
up in size and fools a randomised $\tm$ into believing that this
blown-up string was a purely random string. In more formal terms, we have
\begin{definition}
Let $M$ be a randomised $\tm$ that on input $w$ requires $f(|w|)$
random bits. The family $\{g_n\}_{n=1}^{\infty}$ of functions
($g_n:\{0,1\}^{s(n)} \func \{0,1\}^{f(n)}$) is an $\epsilon$-generator
for $M$ if
\begin{eqnarray*}
\forall w\qquad \left| \prob[r \in \{0,1\}^{f(|w|)}]{M(w,r) \text{
accepts}} - \prob[z \in \{0,1\}^{s(|w|)}]{M(w,g_{|w|}(z)) \text{
accepts}} \right| < \epsilon
\end{eqnarray*}
\end{definition}
Instead of working with a family of functions, it is more convenient to
view $g$, the $\epsilon$-generator for $M$, as a $\tm$ that, on an input
string of length $s(n)$, outputs a string of length $f(n)$. Here $s(n)$
is the length of the seed of the PRG.
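The definition can be made concrete for toy parameters: the following sketch (an illustration of the definition only, not of any real generator; the machine \texttt{M} and generator \texttt{g} below are arbitrary stand-ins) enumerates both probability spaces exactly and reads off the distinguishing advantage $\epsilon$.

```python
from itertools import product

# Toy illustration of the epsilon-generator definition.  M is a stand-in
# "randomised machine" given as a predicate M(w, r); g maps an s-bit seed
# to an f-bit string.  For small s and f both probabilities in the
# definition can be computed exactly by enumeration.

def acceptance_prob(M, w, strings):
    """Fraction of the given random strings on which M(w, .) accepts."""
    strings = list(strings)
    return sum(M(w, r) for r in strings) / len(strings)

def advantage(M, w, g, s, f):
    """|Pr_r[M(w,r) accepts] - Pr_z[M(w,g(z)) accepts|, per the definition."""
    all_r = [''.join(bits) for bits in product('01', repeat=f)]
    all_z = [''.join(bits) for bits in product('01', repeat=s)]
    p_true = acceptance_prob(M, w, all_r)
    p_prg = acceptance_prob(M, w, (g(z) for z in all_z))
    return abs(p_true - p_prg)

# A toy machine: accepts iff the first random bit is 1 (it ignores w).
M = lambda w, r: r[0] == '1'
# A toy "generator": repeat the 2-bit seed twice (s = 2, f = 4).
g = lambda z: z + z
print(advantage(M, '', g, s=2, f=4))   # -> 0.0: this g happens to fool this M exactly
```

Of course, a real PRG must achieve a small advantage simultaneously for every input $w$, which is what makes the construction below nontrivial.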
\section{Randomised Logspace $\tm$}
In this lecture, we shall present a PRG that fools all randomised
logspace $\tm$s. In the next lecture, we shall see a PRG for
polynomial time $\tm$s based on some unproven hardness
assumptions. Before proceeding any further, we must clarify a point
regarding how the random bits are accessed in a randomised space
bounded computation (logspace in our case). There are two differing
definitions of randomised space bounded $\tm$s, based on the manner in
which the random bits are accessed:
\begin{itemize}
\item The $\tm$ obtains the random bits one at a time, as and when it requires them.
\item The string of random bits is fed as an auxiliary off-line input
(on a separate tape) in addition to the regular input to the $\tm$. In
this case, the head accessing the random bits on this tape can move
back and forth on the tape.
\end{itemize}
Note that for randomised time bounded computation it is immaterial
which convention we adopt. For randomised space bounded computation, we
shall consider only $\tm$s of the first kind or, equivalently, $\tm$s of
the second kind in which the head on the random tape is restricted so
that it can only move right along the tape (i.e., the random tape is a
one-way read-only tape). We shall also assume that every randomised
space $S$ $\tm$ halts in time less than $2^S$.
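This halting assumption is essentially without loss of generality: a halting space $S$ machine can never repeat a configuration, so its running time is bounded by its number of configurations. The count below is the standard one (with $Q$ the state set and $\Gamma$ the work-tape alphabet; the constants are absorbed into $S$):

```latex
\begin{equation*}
\#\,\text{configurations} \;\le\;
\underbrace{|Q|}_{\text{state}} \cdot
\underbrace{S}_{\text{work head}} \cdot
\underbrace{|\Gamma|^{S}}_{\text{work-tape contents}} \cdot
\underbrace{n}_{\text{input head}}
\;=\; 2^{O(S)} \qquad \text{for } S \ge \log n.
\end{equation*}
```
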
\section{Nisan's Generator}
The PRG for randomised logspace $\tm$s is given by the following theorem:
\begin{theorem}\label{nissan}
There exists a $\Space{n}$ $\tm$ $G$, $G:\{0,1\}^{O(\log^2 m)} \func
\{0,1\}^m$, such that for every randomised logspace $\tm$ $M$, $G$ is a
$\frac{1}{n}$-generator for $M$ (here $n$ denotes the input length and
$m$ the number of random bits $M$ requires).\footnote{We shall in fact
prove a more general result: for every randomised space $S$ $\tm$ $M$,
$G$ is a $\frac{1}{2^S}$-generator for $M$.}
\end{theorem}
Before proving the existence of a PRG as stated in
Theorem~\ref{nissan}, we shall first study the structure of the
computation tableau of a randomised space $S$ $\tm$ and see how this
structure can be exploited to reduce the randomness. The computation
tableau of a randomised space $S$ $\tm$ is shown in
Figure~\ref{comptable}.
\begin{figure}[h]
\begin{center}
\mbox{\psfig{file=computation.ps}}
\caption{The Computation Tableau of a space $S$ TM}\label{comptable}
\end{center}
\end{figure}
In the original definition of the randomised $\tm$, the computation
requires at most $R\,(=2^S)$ random bits. Consider the tableau divided
into two halves, the halves using random strings $r_1$ and $r_2$
respectively, each of length $r\,(=R/2)$. If $r_1$ and $r_2$ are chosen
independently, then by definition the randomised $\tm$ gives the right
results. We would like to choose $r_1$ and $r_2$ in such a manner that
the probability of going from state $s_1$ to state $s_2$ (i.e.,
$\prob{s_1 \func s_2}$) and $\prob{s_2 \func s_3}$ are not
significantly different from the case when $r_1$ and $r_2$ are chosen
independently. We shall use the fact that the computation tableau is
very thin (width $S$ compared to length $2^S$) to choose $r_1$ and
$r_2$ more economically than choosing them independently.
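The utility of the thin tableau can be seen by conditioning on the state that crosses the mid-tableau boundary: with $r_1$ and $r_2$ independent, the probability of passing from $s_1$ at the top to $s_3$ at the bottom factors as a sum over the at most $2^{O(S)}$ possible middle states $s_2$:

```latex
\begin{equation*}
\prob{s_1 \func s_3} \;=\;
\sum_{s_2} \prob[r_1]{s_1 \func s_2} \cdot \prob[r_2]{s_2 \func s_3}.
\end{equation*}
```

Since only the $O(S)$-bit middle state carries information about $r_1$ into the second half, $r_2$ need only look random to a party that learns $O(S)$ bits about $r_1$; this is exactly what the generators defined below exploit.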
To put things more formally, suppose we have two algorithms $A_1$ and $A_2$ such that
\begin{equation*}
\begin{CD}
A_1 @<<< x_1\\
@VV{b_1}V\\
A_2 @<<< x_2\\
@VVV\\
b_2
\end{CD}
\end{equation*}
\begin{center}{{\bf Figure 2:} $A_1,A_2$ with random inputs}\end{center}
\begin{itemize}
\item $A_1$ takes an input $x_1$ of length $r$ and outputs a string
$b_1$ of length $c$.
\item $A_2$ takes as input the output $b_1$ of $A_1$ and another
string $x_2$ of length $r$ and outputs a string $b_2$ of length
$c$.
\end{itemize}
We can regard $A_1$ and $A_2$ as the upper and lower halves,
respectively, of the computation of the randomised space $S$ $\tm$, and
$x_1, x_2$ as the random strings $r_1, r_2$. What we are in search of
is a generator that supplies the strings $x_1$ and $x_2$ using fewer
random bits than choosing them independently would require. For
notational brevity, given a function $g$, define functions $g^l$ and
$g^r$ such that $g^l(z)$ and $g^r(z)$ denote the left half and the
right half of the string $g(z)$ (i.e., $g(z) = g^l(z)\circ g^r(z)$ and
$|g^l(z)| = |g^r(z)|$).\footnote{$\circ$ denotes the concatenation operator.}
\begin{definition}
A function $g$ ($g:\{0,1\}^t \func \{0,1\}^r \times\{0,1\}^r$) is
defined to be an $\epsilon$-generator for communication $c$ if for all
functions $A_1$ and $A_2$ with $A_1:\{0,1\}^r \func \{0,1\}^c$
and $A_2:\{0,1\}^c\times\{0,1\}^r \func \{0,1\}^c$, we have that
\begin{eqnarray*}
\forall b \qquad \left| \prob[x_1,x_2 \in
\{0,1\}^r]{A_2(A_1(x_1),x_2) = b} - \prob[z \in
\{0,1\}^t]{A_2(A_1(g^l(z)),g^r(z)) = b} \right| <
\epsilon
\end{eqnarray*}
\end{definition}
\begin{equation*}
\begin{CD}
A_1 @<<< g^l(z) \\
@VV{b_1}V\\
A_2 @<<< g^r(z) \\
@VVV\\
b_2
\end{CD}
\end{equation*}
\begin{center}{{\bf Figure 3:} $A_1,A_2$ with inputs from
generator}\end{center}
For notational convenience, we shall call a $2^{-c}$-generator for
communication $c$ simply a $c$-generator.
\section{Proof of Theorem \ref{nissan}}
For the present we shall assume the following lemma; a sketch of its
proof is given at the end of the lecture.
\begin{lemma}\label{genexist}
There exists a constant $k>0$ such that for all $r, c$, there exists a
polynomial time computable $c$-generator $g$ with
$g:\{0,1\}^{r+kc} \func \{0,1\}^r\times \{0,1\}^r$.
\end{lemma}
Let $g_i:\{0,1\}^{(i+1)kS} \func \{0,1\}^{ikS} \times \{0,1\}^{ikS}$
be a $2S$-generator, for $i = 1,2,\ldots,S$, as guaranteed by
Lemma~\ref{genexist}. Define functions $G_i:\{0,1\}^{(i+1)kS} \func
\{0,1\}^{2^i kS}$ for $i = 0,1,\ldots,S$ inductively as follows:
\begin{eqnarray*}
G_0(z) & = & z\\
G_i(z) & = & G_{i-1}(g_i^l(z))\circ G_{i-1}(g_i^r(z))
\end{eqnarray*}
We shall show that the existence of $G_S$ implies Theorem
\ref{nissan}. Clearly, by the definition of $G_S$, $G_S$ is a
$\Space{n}$ $\tm$ (i.e., it can be computed in space $O(kS^2)$). It
remains to show that $G_S$ fools all randomised space $S$ $\tm$s,
which is implied by the following theorem.
\begin{theorem}
For all $\tm$s $A$ that run in space $S$ and in time $2^ikS$, $G_i$ is
a $(2^{i+1}-1)2^{-2S}$-generator for $A$.
\end{theorem}
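The form of the error bound can be anticipated before the proof (a sketch of the bookkeeping, with $\epsilon_i$ denoting the error of $G_i$): each level of the recursion pays roughly twice the error of the previous level, once for each sub-tableau, plus one application of a $2S$-generator. Taking $\epsilon_0 \le 2^{-2S}$ as the base case, the recurrence telescopes to the bound in the theorem:

```latex
\begin{equation*}
\epsilon_i \;\le\; 2\epsilon_{i-1} + 2^{-2S}
\qquad\Longrightarrow\qquad
\epsilon_i \;\le\; \left(2^i + 2^{i-1} + \cdots + 1\right)2^{-2S}
\;=\; \left(2^{i+1}-1\right)2^{-2S}.
\end{equation*}
```

In particular, $\epsilon_S < 2^{1-S}$, consistent (up to a constant factor) with the $\frac{1}{2^S}$ claim in the footnote to Theorem~\ref{nissan}.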
\begin{proof}
We proceed by induction on $i$. Assume the statement of the theorem to be
true for all $i