Read about the Kelly Criterion... And then imagine a situation where you either:
(a) win 10% of your bankroll 100% of the time (low variance)
(b) win 30% of your bankroll 50% of the time, lose 10% of your bankroll 50% of the time (high variance)
Same EV, but rate of growth is better for (a). Here's how to show that...
In the first case (a), if you start with $1, your bankroll after a number of bets looks like this:
BR = 1 * 1.1 * 1.1 * 1.1... (As many 1.1s as bets are made)
In the second case (b), it would look something like this:
BR = 1 * 1.3 * 0.9 * 0.9 * 1.3 * 1.3 * 1.3 * 0.9 * 0.9...
But in the long run the 1.3s and 0.9s show up equally often, so the per-bet factor is their geometric mean: sqrt(1.3 * 0.9) ≈ 1.082
So in the first case, we are growing our bankroll by 10% every time a bet's made. In the second case, we are growing our bankroll only by about 8.2%.
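To put numbers on it, here's a quick sanity check in Python (just the arithmetic above, nothing fancier):

```python
import math

# Case (a): win 10% of the bankroll every bet.
growth_a = 1.1

# Case (b): win 30% half the time, lose 10% half the time.
# Both cases have the same expected value per bet:
ev_b = 0.5 * 1.3 + 0.5 * 0.9
print(round(ev_b, 10))  # 1.1, same as case (a)

# But over many bets the per-bet growth factor in (b) tends to the
# geometric mean of the two outcomes, since they occur equally often:
growth_b = math.sqrt(1.3 * 0.9)
print(round(growth_b, 4))  # 1.0817, i.e. about 8.2% growth per bet
```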
Now, suppose instead we are on the losing end:
(a) We lose 10% of our bankroll every time a bet is made
(b) We lose 30% of our bankroll 50% of the time, and win 10% of our bankroll 50% of the time.
Again, (a) and (b) have the same EV, but (a) shrinks our bankroll more slowly than (b). Here's why:
Like before, for (a) we have
BR = 1 * 0.9 * 0.9 * 0.9... (as many 0.9s as bets are made)
And for (b) we have something like
BR = 1 * 0.7 * 1.1 * 1.1 * 0.7 *...
And the 0.7 and 1.1 terms average out geometrically to:
sqrt(0.7 * 1.1) ≈ 0.877
So in case (a) we are shrinking our bankroll by 10% per bet, but in case (b) we are shrinking it by about 12.3% per bet.
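Same sanity check for the losing side, again just the arithmetic from above:

```python
import math

# Case (a): lose 10% every bet -> multiply by 0.9 each time.
shrink_a = 0.9

# Case (b): lose 30% or win 10%, each with probability 1/2.
ev_b = 0.5 * 0.7 + 0.5 * 1.1
print(round(ev_b, 10))  # 0.9, same EV as case (a)

# Long-run per-bet factor is the geometric mean of the two outcomes:
shrink_b = math.sqrt(0.7 * 1.1)
print(round(shrink_b, 4))  # 0.8775, i.e. losing about 12.3% per bet
```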
tl;dr: For a fixed EV, the rate of growth of our bankroll always gets worse when we increase variance, whether we are on the winning or losing end of a bet.
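More generally (my framing, not from the link below): the long-run per-bet growth factor of any repeated multiplicative bet is exp(E[log outcome]), i.e. the geometric mean of the outcomes, and variance always drags that below the plain EV. A small sketch covering all four bets from this post:

```python
import math

def long_run_growth(outcomes):
    """Long-run per-bet growth factor: exp(E[log X]),
    where outcomes is a list of (probability, multiplier) pairs."""
    return math.exp(sum(p * math.log(x) for p, x in outcomes))

print(round(long_run_growth([(1.0, 1.1)]), 4))              # 1.1    (winning, low variance)
print(round(long_run_growth([(0.5, 1.3), (0.5, 0.9)]), 4))  # 1.0817 (winning, high variance)
print(round(long_run_growth([(1.0, 0.9)]), 4))              # 0.9    (losing, low variance)
print(round(long_run_growth([(0.5, 0.7), (0.5, 1.1)]), 4))  # 0.8775 (losing, high variance)
```

In both pairs the high-variance version has the same EV but a strictly smaller growth factor, which is exactly what the Kelly Criterion optimizes for.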
Maybe there are better resources, but here's one:
http://www.elem.com/~btilly/kelly-criterion/
Last edited by pocketzeroes; 10-19-2017 at 11:33 AM.