Recall that the moment-generating function for a random variable X is the function E[e^{αX}] of α, which is equal to ∑_k E[X^k] α^k/k!. Here we describe a random variable 2^X for which the moment-generating function is not defined for all α.
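As a quick sanity check on the power-series identity, here is a small Python sketch (the choice of a Bernoulli(1/2) variable and all the function names are ours, purely for illustration) comparing E[e^{αX}] computed directly against the truncated moment series:

    import math

    def mgf_exact(a):
        # For X ~ Bernoulli(1/2): E[e^{aX}] = (1 + e^a) / 2.
        return (1 + math.exp(a)) / 2

    def mgf_series(a, terms=30):
        # E[X^0] = 1 and E[X^k] = 1/2 for every k >= 1.
        total = 1.0
        for k in range(1, terms):
            total += 0.5 * a ** k / math.factorial(k)
        return total

    a = 0.7
    print(mgf_exact(a), mgf_series(a))  # both ~1.50688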
Suppose you start with $1. I repeatedly flip a fair coin. If it comes up heads, I double your money. If it comes up tails, I send you away with whatever you've got so far. If X is the number of times I get heads, then the money you get is 2^X = e^{X ln 2}, and your expected return is E[e^{X ln 2}]. But if we just add up all the cases, you get $1 with probability 1/2, $2 with probability 1/4, $4 with probability 1/8, etc., giving a total of 1⋅(1/2) + 2⋅(1/4) + 4⋅(1/8) + ... = 1/2 + 1/2 + 1/2 + ..., which diverges. So there is no finite expectation for 2^X and thus no value for E[e^{X ln 2}].
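To see the divergence empirically, here is a minimal simulation sketch in Python (the function names are our own; nothing here comes from the game description beyond the rules themselves). The running average payout never settles down; it tends to drift upward, roughly like (log2 n)/2 over n plays:

    import random

    def play():
        # Flip a fair coin until tails; x counts the heads.
        x = 0
        while random.random() < 0.5:
            x += 1
        return 2 ** x  # payout in dollars

    for n in (10 ** 3, 10 ** 4, 10 ** 5, 10 ** 6):
        avg = sum(play() for _ in range(n)) / n
        print(n, avg)  # keeps growing, unlike a variable with finite mean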
This doesn't mean that the distribution of X has no moment-generating function. For smaller α we can calculate E[e^{αX}] = ∑_{k=0}^∞ 2^{-(k+1)} e^{αk} = (1/2) ∑_{k=0}^∞ (e^α/2)^k = 1/(2 − e^α) (provided e^α/2 < 1, i.e., α < ln 2). So we get a moment-generating function that, like many other GeneratingFunctions, happens to have a finite radius of convergence.
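Here is a quick numerical check (again a sketch of our own) that partial sums of the series match the closed form 1/(2 − e^α) for α below ln 2 ≈ 0.6931:

    import math

    def mgf_partial(a, terms=2000):
        # Partial sum of sum_{k>=0} 2^{-(k+1)} e^{ak}, rewritten as
        # (1/2) sum_k (e^a / 2)^k so that each term stays small.
        r = math.exp(a) / 2
        return 0.5 * sum(r ** k for k in range(terms))

    for a in (0.1, 0.3, 0.5, 0.65):  # all below ln 2
        print(a, mgf_partial(a), 1 / (2 - math.exp(a)))

As α approaches ln 2 from below, both columns blow up together, which is the finite radius of convergence showing itself.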
Curiously, in the original game, the probability that you leave with 2^n or more dollars is exactly 2^{-n}, which is bounded by 2⋅2^{-n}, suspiciously similar to the bound from Markov's inequality for a variable with expectation 2. This demonstrates that Markov's inequality is not reversible: knowing that Pr[X ≥ α] ≤ m/α for all α says nothing about EX, which here is infinite.
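For concreteness, a short tabulation (our own sketch) of the exact tail against the Markov bound that a hypothetical variable with expectation 2 would satisfy:

    for n in range(1, 8):
        exact = 2.0 ** -n      # Pr[payout >= 2^n dollars] = 2^{-n}
        markov = 2.0 / 2 ** n  # Markov bound if the expectation were 2
        print(n, exact, markov)

The tail sits a factor of 2 below the bound at every n, yet the actual expectation is infinite.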