Quote:
Originally Posted by d_saxton
Another problem. In the risk of ruin section, a risk function R is defined where R(x) is the probability of going broke starting with a bankroll of x. (We assume that the gambler's fortune is determined by the partial sums of a sequence of independent and identically distributed random variables X_1, X_2, X_3, ..., so that after n plays his fortune is x + X_1 + X_2 + ... + X_n, and this continues until he goes broke.)
Quote:
Originally Posted by d_saxton
On pg. 282 they state a property of the risk function which is that R(a + b) = R(a)R(b), which comes from the idea that being ruined with a bankroll of a + b is the same as being ruined with a bankroll of a, then ruined with a bankroll of b, and these events are independent. First, I think this is only approximately true, because the gambler's fortune takes discrete jumps, so the event where he is ruined with a bankroll a + b is actually not the same as being ruined with a bankroll of a, then starting over and being ruined with a bankroll of b, because he would be recovering for free whatever amount he was in the hole after being ruined the first time. So, the right-hand side is actually smaller than the left because of this free money. In order for this relation to hold, I think one needs to assume that the gambler's fortune is a *continuous* process rather than a discrete one.
You are right that there is some funny definitional stuff surrounding what being "ruined" means when you have a fixed bet size and the distribution of outcomes can leave you with a non-integer multiple of that fixed bet size (like in blackjack with 3:2 payoffs). There is also some funny stuff (as in poker) where your distribution of outcomes changes because you are shorter-stacked, all-in for less, other people at the table have more money, etc. We chose to gloss over some of those definitional things and treat the situation in the way that most closely conforms to the way we treat bankrolls in practice, which is: the R(x) we are considering are for x >> 1. So in that sense, the details of going broke from a starting bankroll of 1 unit are unimportant in the types of situations we are mostly interested in, and the R(1) in the formulas could reasonably be viewed as an abstracted R(1). So while it is not precisely true that R(2) = R(1)^2, it is an extremely close approximation to say that R(200) = R(100)^2, and that's what really matters when looking at risk of ruin. But technically you are right.
(there's even a short paragraph about this on p282)
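To make the overshoot effect concrete, here's a toy calculation (my own numbers, not from the book): a game that wins +1 with probability 0.7 or loses 2 with probability 0.3. Because losses can jump past zero, ruin from a bankroll of 2 doesn't have to pass through exactly 1, which is the "free money" discrepancy d_saxton describes. The function `ruin_probs` and the cap are my constructions for illustration.

```python
# Toy example (my numbers, not from the book): each play wins +1 with
# probability 0.7 or loses 2 with probability 0.3 (EV = +0.1 per play).
# Losses of size 2 can overshoot zero, so going broke from bankroll 2
# need not visit bankroll 1 -- the "free money" effect in the thread.
# We solve R(x) = 0.7*R(x+1) + 0.3*R(x-2) by sweeping a truncated
# ladder; reaching the cap counts as never going broke, which biases
# R downward by a negligible amount for a generous cap.

P_WIN, P_LOSE = 0.7, 0.3
CAP = 80  # survival cap; the truncation bias here is on the order of 1e-4

def ruin_probs(cap=CAP, max_sweeps=20000, tol=1e-12):
    R = [0.0] * (cap + 1)  # R[x] = P(ruin | bankroll x); R[cap] pinned at 0
    for _ in range(max_sweeps):
        delta = 0.0
        for x in range(1, cap):
            down = 1.0 if x - 2 <= 0 else R[x - 2]  # bankroll <= 0 is ruin
            new = P_WIN * R[x + 1] + P_LOSE * down
            delta = max(delta, abs(new - R[x]))
            R[x] = new
        if delta < tol:  # sweep changed nothing measurable; converged
            break
    return R

R = ruin_probs()
# R[2] comes out strictly larger than R[1]**2, as the overshoot
# argument predicts.
print("R(1)   =", round(R[1], 4))
print("R(2)   =", round(R[2], 4))
print("R(1)^2 =", round(R[1] ** 2, 4))
```

So the discrepancy is real and visible at tiny bankrolls like these; the practical point in the reply above is that the bankrolls we actually care about are many units deep, where the per-unit exponential description is what drives the answer.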
Quote:
Originally Posted by d_saxton
But in any case, I think there's another problem when he goes to apply this formula, which is that we've implicitly assumed that a, b > 0. For instance, for any a > 0 consider that 1 = R(0) = R(a - a) = R(a)R(-a) = R(a) * 1, and therefore R(a) = 1 for all a, which is obviously wrong.
OK, there is some definitional looseness about this, which can be rectified easily. Let d(x) be the distribution of outcomes. If necessary, truncate the left tail of d so that a single loss can't exceed the current bankroll (so the bankroll after one play is never negative). Define z to be the smallest possible value of the bankroll after one play.
Now if b is positive, the argument in the text gives us the log-linearity we need: log R is linear in the bankroll, so R(x) = R(1)^x for x >= 0 and R is an exponential.
If b is negative, write R(a+b) = R(a+b+z-z) = R(z + (a+b-z)). Then z >= 0 by the truncation, and a+b >= z because z is the smallest possible bankroll after one play, so a+b-z >= 0. The multiplicative property then gives R(a+b) = R(z)R(a+b-z). Both arguments are nonnegative, so the same log-linearity argument applies to each: R(z) = R(1)^z and R(a+b-z) = R(1)^(a+b-z), and multiplying shows R(a+b) = R(1)^(a+b), an exponential with the same constant as when b was positive.
Then there's no problem when we apply the law of total probability to the known exponential R(a+b) by conditioning on one draw from the outcome distribution. R remains undefined for negative bankrolls and happy times reign.
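Here is a small sketch of that last step for the same hypothetical +1/-2 game as above (r, p, q, and the helper R are my notation, not the book's): conditioning on one draw turns R(x) = r^x into a characteristic polynomial, the exponential satisfies it, and multiplicativity is exact for the exponential itself.

```python
# Sketch for the hypothetical game: win +1 w.p. p, lose 2 w.p. q.
# If R(x) = r^x, then conditioning on one draw,
#     R(x) = p*R(x+1) + q*R(x-2),
# reduces to the characteristic equation p*r^3 - r^2 + q = 0.

p, q = 0.7, 0.3

def f(r):
    # characteristic polynomial; r = 1 is always a root, we want the
    # root in (0, 1) so that R(x) -> 0 as the bankroll grows
    return p * r**3 - r**2 + q

lo, hi = 0.0, 0.999      # f(lo) > 0 and f(hi) < 0 bracket that root
for _ in range(200):     # plain bisection down to machine precision
    mid = (lo + hi) / 2
    if f(mid) > 0:
        lo = mid
    else:
        hi = mid
r = (lo + hi) / 2

def R(x):
    return r ** x        # defined for x >= 0 only, as in the text

# The exponential satisfies the one-draw total-probability equation
# (checked at x >= 2 so every argument of R stays nonnegative)...
print(abs(R(5) - (p * R(6) + q * R(3))))
# ...and multiplicativity is exact for the exponential itself:
print(abs(R(200) - R(100) ** 2))
```

One caveat worth keeping in mind: the true risk function for such a game is A*r^x for a constant A fixed by the small-bankroll boundary details, not r^x exactly; r plays the role of the "abstracted R(1)" from the reply above.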