1. Randomized merging
Suppose we start with n piles of 1 object each, and at each step we pick two remaining piles uniformly at random, and merge them together at a cost equal to the size of the first of the two piles chosen. (For example, we may need to change a pointer in every element of the first pile to point to the second pile.)
What is the exact expected cost of all merges if we stop when there is only one pile left? Hint: compute E[S] = ∑ E[Xm], where Xm is the cost of the merge performed when m piles remain.
Use moment generating functions to get the best reasonable-looking bound you can on the upper tail of the total cost S. Ideally, your bound should look like Pr[S ≥ (1+δ)(something)] ≤ exp(something).
Note that you can't necessarily apply Chernoff bounds directly, since the Xm are unlikely to be independent. Instead, you may find it helpful to compute a worst-case upper bound on E[exp(αXm) | Xn, ..., Xm+1]. A useful fact here: if x1 + ... + xk = c and you want to maximize ∑ f(xi) subject to xi ≥ a for all i, then when f is convex the maximum is obtained by making all but one of the xi equal to a, and when f is concave the maximum is obtained by making all of the xi equal to c/k.
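Before doing the analysis, it may help to watch the process run. Here is a quick Monte Carlo sketch (the function name merge_cost and the parameter choices are mine); the printed estimate is only a sanity check against your exact formula, not the answer the problem asks for.

```python
import random

def merge_cost(n, rng):
    """Simulate the random merging process starting from n piles of size 1.

    At each step two distinct remaining piles are chosen uniformly at
    random; the cost of the merge is the size of the first pile chosen.
    Returns the total cost over all merges."""
    piles = [1] * n
    total = 0
    while len(piles) > 1:
        i, j = rng.sample(range(len(piles)), 2)  # ordered pair of distinct piles
        total += piles[i]      # pay the size of the first pile chosen
        piles[j] += piles[i]   # merge the first pile into the second
        del piles[i]
    return total

rng = random.Random(0)
n, trials = 64, 2000
est = sum(merge_cost(n, rng) for _ in range(trials)) / trials
print(f"estimated E[S] for n = {n}: {est:.1f}")
```

Comparing the estimate for a few values of n against your closed form is a cheap way to catch an off-by-one in the range of m.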
2. A Collatz-like process
The Collatz conjecture says that if you start with any positive integer x0, and let xt+1 = (3xt+1)/2 if xt is odd and xt/2 if xt is even, then you will eventually get to 1. Since the Collatz conjecture seems to be hard to prove, let's simplify the game by sending x either to 3x/2 (without the extra 1) or to x/2, choosing between the two with an independent fair coin-flip at each step. Note that in this simplified game, xt does not have to be an integer.
What is the expected value of xt (in the simplified game) as a function of t and x0?
Show that there is a constant c such that ln xt - ct is a martingale.
Assuming that x0 = 1, use the Azuma-Hoeffding inequality to get a bound on the probability that xt ≥ N for any fixed N > 0.
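A short simulation of the simplified game can serve as a check on the first two parts: the sample mean of xt should match your formula for E[xt], and the sample mean of (ln xt)/t should settle near your constant c. The function name run and the seed/trial counts below are arbitrary choices of mine.

```python
import math
import random

def run(x0, t, rng):
    """One trajectory of the simplified game: x -> 3x/2 or x/2, fair coin each step."""
    x = x0
    for _ in range(t):
        x = 1.5 * x if rng.random() < 0.5 else 0.5 * x
    return x

rng = random.Random(1)
t, trials = 20, 100_000
xs = [run(1.0, t, rng) for _ in range(trials)]
mean_x = sum(xs) / trials
mean_log_rate = sum(math.log(x) for x in xs) / (trials * t)
print(f"sample mean of x_t at t = {t}:  {mean_x:.3f}")
print(f"sample mean of (ln x_t)/t:      {mean_log_rate:.4f}")
```

Note that xt has a heavy right tail (its higher moments grow quickly with t), so the sample mean of xt converges slowly; the log-rate estimate is much better behaved, which is itself a hint about why working with ln xt pays off.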
3. Random back-off
A collection of n processes wish to transmit information across a shared broadcast channel, where a message can be transmitted successfully in a given time step only if exactly one process attempts to broadcast. If more than one process attempts to transmit, we get a collision, which is detected by each process's hardware. Otherwise all processes either see the message (if one process transmits) or the absence of a message (if no process transmits).
Suppose that at time 0, all n processes are active (and thus attempt to transmit). After any step where there is a collision, each active process remains active with some probability p, where 0 < p < 1. After a successful transmission (one active process) or silence (no active processes), all processes become active again.
- Show that some process eventually successfully transmits a message, with probability 1, and that the expected time E[T] until this happens is finite. (You do not need to calculate E[T].)
- Show that there is a constant c (which may depend on n and p) such that Pr[T ≥ α E[T]] ≤ exp(-cα).
- Let At be the event that a message is successfully transmitted at time t. Show that for fixed n and p, Pr[At] converges to a positive constant as t → ∞.
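The dynamics above are easy to simulate, and doing so gives a feel for both parts: the empirical distribution of T, and the long-run frequency of successes (which renewal-type arguments relate to the limit of Pr[At]). The function name time_to_success is mine, and I am assuming the convention that a success at time 0 (only possible when n = 1) counts as T = 0.

```python
import random

def time_to_success(n, p, rng):
    """Number of steps until exactly one process is active (first success).

    All n processes start active; after a collision each active process
    stays active independently with probability p; after silence (zero
    active processes) everyone becomes active again."""
    active, t = n, 0
    while active != 1:
        if active == 0:
            active = n  # silence: all processes become active again
        else:
            # collision: each active process stays active with probability p
            active = sum(rng.random() < p for _ in range(active))
        t += 1
    return t

rng = random.Random(2)
n, p, trials = 10, 0.5, 20_000
ts = [time_to_success(n, p, rng) for _ in range(trials)]
est_ET = sum(ts) / trials
print(f"estimated E[T] for n = {n}, p = {p}: {est_ET:.2f}")
print(f"largest T seen in {trials} trials: {max(ts)}")
```

Plotting a histogram of ts on a log scale is a good way to eyeball the exponential tail that the second part asks you to prove.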