
Note: You are looking at a static copy of the former PineWiki site, used for class notes by James Aspnes from 2003 to 2012. Many mathematical formulas are broken, and there are likely to be other bugs as well. These will most likely not be fixed. You may be able to find more up-to-date versions of some of these notes at http://www.cs.yale.edu/homes/aspnes/#classes.

A collection of random variables is k-wise independent if any subset of k of the variables is independent (see RandomVariables). For discrete random variables, this means that for any subset of k of them, X_1,...,X_k, and any values x_1,...,x_k, it holds that Pr[∧_i X_i = x_i] = ∏_i Pr[X_i = x_i].
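To make the definition concrete, here is a quick sanity check in Python (a sketch using the standard example of three bits X, Y, and X⊕Y, which are pairwise independent but not 3-wise independent; all names here are made up for illustration):

```python
from itertools import product

# Standard example: X and Y are independent fair bits and Z = X xor Y.
# (X, Y, Z) is pairwise independent but not 3-wise (i.e., not fully) independent.
outcomes = [(x, y, x ^ y) for x, y in product([0, 1], repeat=2)]  # each has prob. 1/4

def prob(event):
    """Probability that an outcome triple satisfies the predicate `event`."""
    return sum(1 for o in outcomes if event(o)) / len(outcomes)

# Any two of the three variables are independent:
for i, j in [(0, 1), (0, 2), (1, 2)]:
    for a, b in product([0, 1], repeat=2):
        joint = prob(lambda o: o[i] == a and o[j] == b)
        assert joint == prob(lambda o: o[i] == a) * prob(lambda o: o[j] == b)

# ...but all three together are not: Z is determined by X and Y.
assert prob(lambda o: o == (1, 1, 1)) == 0  # the product of marginals would be 1/8
```

Note that Z is uniform and independent of each of X and Y separately, even though it is a deterministic function of the pair.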

K-wise independent variables are interesting for randomized algorithms because they are often easier to generate than fully independent random variables, and for many purposes k-wise independence for some small k is enough.

1. Generating k-wise independent random variables

1.1. Using a random Boolean matrix

Let M be a random n×m matrix over ℤ_2; then the random variables Mx, for each x ∈ ℤ_2^m − {0}, are uniformly distributed over ℤ_2^n and pairwise independent. Proof: For uniform distribution, Mx is the XOR of one or more uniformly random columns of M, which gives a uniformly random vector. For pairwise independence, observe that for distinct nonzero x and y there is at least one column of M that appears in exactly one of the two products, say in Mx but not in My; conditioning on all the other columns fixes My but leaves that column uniform, so Mx = r ⊕ (fixed vector) where r is a uniformly random vector. Hence Mx is uniform and independent of My.

This gives 2^m − 1 pairwise-independent n-bit values using exactly mn random bits.
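For tiny parameters the construction can be checked exhaustively. The sketch below (toy dimensions n = m = 2 and two arbitrarily chosen inputs; all names are made up) enumerates all 2^(nm) matrices M and confirms that the pair (Mx, My) is uniform over all pairs of n-bit vectors, which is exactly pairwise independence of two uniform values:

```python
import itertools

# Toy dimensions so we can enumerate every matrix: M is n×m over Z_2.
n, m = 2, 2

def mat_vec(M, x):
    """Matrix-vector product over Z_2; equivalently, XOR of the columns selected by x."""
    return tuple(sum(a * b for a, b in zip(row, x)) % 2 for row in M)

x, y = (1, 0), (1, 1)  # two fixed, distinct, nonzero inputs
counts = {}
for bits in itertools.product([0, 1], repeat=n * m):
    M = [bits[i * m:(i + 1) * m] for i in range(n)]  # next n×m matrix
    pair = (mat_vec(M, x), mat_vec(M, y))
    counts[pair] = counts.get(pair, 0) + 1

# (Mx, My) is uniform over all (2^n)^2 pairs of n-bit vectors, so for a
# random M the values Mx and My are each uniform and mutually independent.
assert len(counts) == (2**n) ** 2
assert all(c == 2**(n * m) // (2**n) ** 2 for c in counts.values())
```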

1.2. Using a random polynomial over a finite field

Let a_(k−1),...,a_0 be independent uniform values over some finite field (see FiniteFields). Then the values p(x) = ∑_i a_i x^i are k-wise independent. (This generalizes the 2-universal hash family given in HashTables.)

Proof: Given any distinct points x_1,...,x_k and any values y_1,...,y_k, there is a unique polynomial p of degree at most k−1 with p(x_i) = y_i for i = 1..k. The constructive way to prove this is Lagrange interpolation (see Lagrange polynomial), but in GF(p^n) we can prove it just by counting. For fixed x_1,...,x_k, there are exactly p^(nk) choices for the value vector y_1,...,y_k, and exactly p^(nk) choices for the coefficient vector a_(k−1),...,a_0. Suppose two polynomials p and q of degree at most k−1 agree on x_1,...,x_k. Then each x_i is a zero of p−q, so p−q has k distinct zeros and is either zero or has degree at least k.¹ Since p−q has degree at most k−1, we get p−q = 0, i.e., p = q. So the map from coefficient vectors to value vectors is injective, and since domain and codomain have the same size it must be bijective. It follows that Pr[∧_i p(x_i) = y_i] = 1/p^(nk) for any k distinct x_i, which is exactly k-wise independence.

Here we get p^n k-wise independent (n lg p)-bit values using exactly kn lg p random bits. To compare with the previous case, let p = 2; then we get 2^n k-wise independent n-bit values using kn bits.
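For a prime p (so n = 1 and the field is just the integers mod p), the construction can be verified by brute force. This sketch uses toy parameters p = 5, k = 3 (chosen here for illustration) and checks that the map from coefficient vectors to value vectors is a bijection, which is exactly the counting argument above:

```python
import itertools

# Toy parameters: prime p (so the field is the integers mod p) and k = 3.
p, k = 5, 3

def poly_eval(coeffs, x):
    """Evaluate sum_i coeffs[i] * x^i mod p; coeffs[0] is the constant term."""
    return sum(c * pow(x, i, p) for i, c in enumerate(coeffs)) % p

xs = (0, 2, 3)  # any k distinct evaluation points
counts = {}
for coeffs in itertools.product(range(p), repeat=k):  # all p^k coefficient vectors
    ys = tuple(poly_eval(coeffs, x) for x in xs)
    counts[ys] = counts.get(ys, 0) + 1

# Each of the p^k value vectors arises from exactly one polynomial, so a
# uniform choice of coefficients makes the k evaluations uniform and independent.
assert len(counts) == p**k and all(c == 1 for c in counts.values())
```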

2. Almost k-wise independence

Often we are happy with values that are ε-almost k-wise independent, which means that |Pr[∧_i X_i = x_i] − ∏_i Pr[X_i = x_i]| ≤ ε for some small ε. This can be used to cut down on how many fully random bits we need. We won't talk about these constructions much, but a classic paper on the subject is here.

3. Pairwise independence and Chebyshev's inequality

One application of k-wise independence is in Chebyshev's inequality arguments. For example, if we are sampling a sequence of identically distributed values X_i (e.g., the outcome of an experiment when we run a randomized algorithm), then Pr[|∑_i X_i − nE[X_i]| > α] ≤ Var[∑_i X_i]/α² by Chebyshev's inequality. For arbitrary random variables, Var[∑_i X_i] = ∑_i Var[X_i] + ∑_(i≠j) Cov(X_i,X_j); but if the X_i are pairwise independent, Cov(X_i,X_j) = 0 for all i≠j, and the variance of the sum is just the sum of the variances. So we can save random bits when doing statistical sampling by drawing the X_i from a pairwise independent family.
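As a numerical illustration (a sketch with toy parameters, using the degree-1 polynomials mod a prime from section 1.2 as the pairwise independent family; variable names are made up), we can verify that the covariance terms really do vanish, so the variance of the sum equals the sum of the variances:

```python
import itertools
from statistics import pvariance

p = 5  # prime; degree-1 polynomials mod p give a pairwise independent family

# For each seed (a0, a1), the sampled values are X_x = (a1*x + a0) mod p, x = 0..p-1.
sums = []                          # distribution of sum_x X_x over all seeds
singles = [[] for _ in range(p)]   # distribution of each individual X_x
for a0, a1 in itertools.product(range(p), repeat=2):
    vals = [(a1 * x + a0) % p for x in range(p)]
    sums.append(sum(vals))
    for x, v in enumerate(vals):
        singles[x].append(v)

# Pairwise independence makes every covariance term zero, so the variance of
# the sum equals the sum of the individual variances.
assert pvariance(sums) == sum(pvariance(s) for s in singles)
```

Note that the family uses only 2 lg p random bits to produce p samples, while the Chebyshev bound is just as good as with fully independent samples.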

4. Connection to cryptographically secure pseudorandom generators

A cryptographically secure pseudorandom generator is a function that takes a small number of random bits and generates a large number of bits that are computationally indistinguishable from random, in the sense that any polynomial-time Turing machine, when fed the pseudorandom bits as input, will answer yes with almost exactly the same probability as if fed real random bits as input. This means that pseudorandom bits are just as good as real random bits for running polynomial-time randomized algorithms. In general, we don't expect to get cryptographically secure pseudorandom generators without using cryptography (and unless P≠NP).

The framework for k-wise independent pseudorandom generators and cryptographically secure pseudorandom generators is exactly the same: we extract many bits from few. Since truly random bits are k-wise independent, the output of a cryptographically secure pseudorandom generator will look k-wise independent to any polynomial-time computation, but won't necessarily be k-wise independent. In the other direction, the output of a k-wise independent generator will be k-wise independent, but won't necessarily look random (for example, given any k values for the random-polynomial generator we can immediately predict all the others). Which you want depends on what you are doing. For a problem where you only need k-wise independence, using a k-wise independent family gives provable bounds that don't depend on unproven complexity-theoretic assumptions.

In real life, most people who aren't doing cryptography seem to use something like Mersenne twister, which is neither cryptographically secure nor k-wise independent for large k. This is an example of how practical programming often operates in a more benevolent world than the dystopian tyranny of the universal quantifier.


  1. Proof: Let z be a zero of a nonzero polynomial f(x); then f(x) = (x−z)q(x) + r by the DivisionAlgorithm, and plugging in x = z gives r = 0, proving f(x) = (x−z)q(x) for some q with deg(f) = deg(q)+1. We also have that q is nonzero, as otherwise f would be zero. By induction we get f(x) = ∏_i (x−z_i) · h(x), where the z_i are the distinct zeros of f and h(x) is some nonzero polynomial, implying that deg(f) ≥ k when f has k distinct zeros.

2014-06-17 11:58