# 1. Randomized merging

Suppose we start with n piles of 1 object each, and at each step we pick two remaining piles uniformly at random, and merge them together at a cost equal to the size of the first of the two piles chosen. (For example, we may need to change a pointer in every element of the first pile to point to the second pile.)
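The process is easy to simulate directly, which can help sanity-check a closed-form answer. A minimal Monte Carlo sketch (the function name `merge_cost` and the trial parameters are my own, not part of the problem):

```python
import random

def merge_cost(n: int) -> int:
    """Simulate the random merging process starting from n piles of size 1.

    At each step, pick an ordered pair of distinct piles uniformly at
    random and merge the first into the second, paying the size of the
    first pile.  Returns the total cost when one pile remains.
    """
    piles = [1] * n
    total = 0
    while len(piles) > 1:
        i, j = random.sample(range(len(piles)), 2)  # ordered pair, distinct
        total += piles[i]        # cost = size of the first pile chosen
        piles[j] += piles[i]     # merge first pile into the second
        piles.pop(i)
    return total

# Monte Carlo estimate of E[S] for a small n
n, trials = 16, 10_000
estimate = sum(merge_cost(n) for _ in range(trials)) / trials
```

Comparing `estimate` against a conjectured formula for several small n is a quick way to check the exact expectation before proving it.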

- What is the exact expected cost of all merges if we stop when there is only one pile left? Hint: compute E[S] = ∑ E[X_{m}], where X_{m} is the cost of the merge starting with m piles.
- Use moment generating functions to get the best reasonable-looking bound you can on the upper tail of the total cost S. Ideally, your bound should look like Pr[S ≥ (1+δ)(*something*)] ≤ exp(*something*).

Note that you can't necessarily use Chernoff bounds directly, since the X_{m} aren't likely to be independent. Instead, you may find it helpful to compute a worst-case upper bound on E[exp(αX_{m}) | X_{n}...X_{m+1}]. A useful fact here is that if ∑ x_{i} = c, and you want to maximize ∑ f(x_{i}) subject to x_{i} ≥ a for all i, then when f is convex the maximum value is obtained by making all but one of the x_{i} equal to a, and when f is concave the maximum value is obtained by making all of the x_{i} equal to c/k, where k is the number of variables.
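The convex/concave fact is easy to check numerically. A quick illustration with made-up values (c = 10, a = 1, five variables, using `math.exp` as a convex f and `math.log` as a concave one):

```python
import math

# Constraint: sum x_i = c with each x_i >= a; here k = 5 variables.
c, a, k = 10.0, 1.0, 5

extreme = [a] * (k - 1) + [c - a * (k - 1)]  # all but one variable at the floor a
balanced = [c / k] * k                        # all variables equal

f_convex = math.exp   # a convex function
f_concave = math.log  # a concave function (valid since x_i >= 1 here)

# Convex f: the extreme allocation gives the larger sum.
assert sum(map(f_convex, extreme)) > sum(map(f_convex, balanced))
# Concave f: the balanced allocation gives the larger sum.
assert sum(map(f_concave, extreme)) < sum(map(f_concave, balanced))
```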

# 2. A Collatz-like process

The Collatz conjecture says that if you start with any positive integer x_{0}, and let x_{t+1} = (3x_{t}+1)/2 if x_{t} is odd and x_{t}/2 if x_{t} is even, then you will eventually get to 1. Since the Collatz conjecture seems to be hard to prove, let's simplify the game by sending x either to 3x/2 (without the extra 1) or to x/2, with the choice of which to do made using a fair coin-flip. Note that in this simplified game, x_{t} does not have to be an integer.
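A minimal sketch of the simplified game (the helper name `simplified_collatz` is my own), useful for eyeballing trajectories before working out the expectations:

```python
import random

def simplified_collatz(x0: float, steps: int) -> float:
    """Run the simplified game for `steps` fair coin-flips:
    x -> 3x/2 with probability 1/2, x -> x/2 with probability 1/2."""
    x = x0
    for _ in range(steps):
        x = 1.5 * x if random.random() < 0.5 else 0.5 * x
    return x

# A few sample endpoints after 50 steps, starting from x0 = 1
paths = [simplified_collatz(1.0, 50) for _ in range(5)]
```

Note that x_{t} stays strictly positive but need not be an integer, matching the problem statement.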

- What is the expected value of x_{t} (in the simplified game) as a function of t and x_{0}?
- Show that there is a constant c such that ln x_{t} - ct is a martingale.
- Assuming that x_{0} = 1, use the Azuma-Hoeffding inequality to get a bound on the probability that x_{t} ≥ N for any fixed N > 0.

# 3. Random back-off

A collection of n processes wish to transmit information across a shared broadcast channel, where a message can be transmitted successfully in a given time step only if exactly one process attempts to broadcast. If more than one process attempts to transmit, we get a collision, which is detected by each process's hardware. Otherwise all processes either see the message (if one process transmits) or the absence of a message (if no process transmits).

Suppose that at time 0, all n processes are active (and thus attempt to transmit). After any step where there is a collision, each active process remains active with some probability p, where 0 < p < 1. After a successful transmission (one active process) or silence (no active processes), all processes become active again.
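Since all that matters is how many processes are active, the protocol can be simulated by tracking just that count. A hedged sketch (the names `time_to_success` and `mean_T` are my own, and the coin-flips are assumed independent), giving a Monte Carlo estimate of E[T]:

```python
import random

def time_to_success(n: int, p: float, rng: random.Random) -> int:
    """Return the first step t >= 0 at which exactly one process is
    active (a successful transmission), starting with all n active."""
    active = n
    t = 0
    while active != 1:
        if active > 1:
            # collision: each active process stays active with probability p
            active = sum(rng.random() < p for _ in range(active))
        else:
            # silence (no active processes): all n processes wake up again
            active = n
        t += 1
    return t

rng = random.Random(1)
samples = [time_to_success(10, 0.5, rng) for _ in range(1_000)]
mean_T = sum(samples) / len(samples)
```

Plotting how `mean_T` grows with n (for fixed p) can suggest what the constant c in the tail bound below should depend on.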

- Show that some process eventually successfully transmits a message, with probability 1, and that the expected time E[T] until this happens is finite. (You do not need to calculate E[T].)
- Show that there is a constant c (which may depend on n and p) such that Pr[T ≥ α E[T]] ≤ exp(-cα).
- Let A_{t} be the event that a message is transmitted at time t. Show that for fixed n and p, Pr[A_{t}] converges as t → ∞ to a positive constant.