Find a RIT situation that changes EV

07-26-2018 , 10:05 PM
I have a bet with my friend:

My position - HU, in a game of HE, there is no possible combination of hole cards and board cards where the EV of an all-in player is different depending on whether he chooses to run it once, twice, three, four, etc. times.

His position - There are cases (not specified) where running it once or three times is more beneficial than running it twice.

How can I demonstrate mathematically that it doesn't matter how many times it's run? My position is that it makes no difference how many times it's run; all that running it more does is reduce variance. However, I do not know how to show this rigorously using maths.

The bet is for $100. Happy to contribute it to the charity of 2+2's choice if I win, and I'm sure he will be too if he wins, once I link him to this thread.
07-26-2018 , 10:13 PM
if a player is drawing to one out, imo it would be advantageous to run it twice or more against him. i don't think the ev changes, but the fact that we're on a finite timeline with no real long run is what seals this for me.
07-26-2018 , 10:16 PM
Quote:
Originally Posted by Tuma
if a player is drawing to one out, imo it would be advantageous to run it twice or more against him. i don't think the ev changes, but the fact that we're on a finite timeline with no real long run is what seals this for me.
My position (about your post and in the bet) is that you're wrong. However, if you could show me mathematically that you're right, I will concede.

Edit: mea culpa, you said EV doesn't change. Agree with you then, reducing variance is usually good.

Last edited by d2_e4; 07-26-2018 at 10:22 PM.
07-26-2018 , 10:24 PM
HU vs 1-outer on the flop for your entire bankroll.

Hero AA
Villain KK

Flop: K 6 7

A is revealed to be dead.

RIO: 4.44% of the time hero doesn't survive.
RIT: 100% of the time hero survives.
07-26-2018 , 10:26 PM
Quote:
Originally Posted by Tuma
HU vs 1-outer on the flop for your entire bankroll.

Hero AA
Villain KK

Flop: K 6 7

A is revealed to be dead.

RIO: 4.44% of the time you don't survive.
RIT: 100% of the time you survive.
I really don't see your point, or how this is relevant to what I asked?

Last edited by d2_e4; 07-26-2018 at 10:32 PM. Reason: And I think you meant ~96%, not that it's in any way relevant to the question. Edit 2: 100%, really? What happened to the ace?
07-26-2018 , 10:44 PM
another argument, this for the camp that running it once is superior:

Barry Greenstein (paraphrased, my words not his, from the Joe Ingram podcast) -

Quote:
...After telling Phil Ivey the reason why I always run it once, he decided to never run it twice again. Players know they can make risky plays to make your life miserable and get two chances to beat you; when that is taken away from them and they only have one chance, they are less willing to go to war with you.
again, same ev, but a more beneficial situation is created by choosing to run it once or twice, which is what your friend posited in the op.
07-26-2018 , 10:47 PM
Quote:
Originally Posted by Tuma
another argument, this for the camp that running it once is superior:

Barry Greenstein (paraphrased, my words not his, from the Joe Ingram podcast) -



again, same ev, but a more beneficial situation is created by choosing to run it once or twice, which is what your friend posited in the op.
You realise there is a reason I posted this in the probability forum, and not in LLSNL, right? We agreed that "beneficial" was defined by "EV". Thanks for pointing out that there may be other reasons to run it twice, though.

Anyway, this specific bet revolves around whether EV changes when running it once or three times vs. twice, so your metagame tips, while welcome, are completely irrelevant to this thread.
07-26-2018 , 10:55 PM
yeah, which is why the tone you are taking is puzzling, considering you asked for follow-up on my first and second replies. good luck on getting the answer you are looking for.
07-26-2018 , 11:01 PM
Quote:
Originally Posted by Tuma
yeah, which is why the tone you are taking is puzzling, considering you asked for follow-up on my first and second replies. good luck on getting the answer you are looking for.
I'm not sure what you mean, but if I have been rude, I apologise. It just seemed like you kept answering a question I didn't ask.

Bottom line is I would like someone's assistance to run the sims on this and am willing to contribute $100 to charity for the results.

Actually, if anyone can solve this analytically rather than through sims, I will put up another $200 outside the bet and donate $300.
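
For anyone who actually wants to run the sims asked for above, here is a minimal Monte Carlo sketch in Python. It deliberately avoids a real hand evaluator and instead assumes an abstract river all-in in which the villain is drawing to "outs" live cards among "unseen" unknown ones; all runs are dealt without replacement from the same stub, and every parameter value is illustrative rather than taken from the thread.

Code:
import random

def run_it_n_times_ev(outs=8, unseen=44, pot=100, runs=2, trials=200000, seed=1):
    """Estimate hero's share of the pot when a river all-in is run `runs` times.
    Villain wins a run iff that run's river card is one of `outs` live outs
    among `unseen` unknown cards; every run is dealt from the same stub."""
    rng = random.Random(seed)
    stub = [1] * outs + [0] * (unseen - outs)   # 1 = a card that wins for villain
    total = 0.0
    for _ in range(trials):
        rng.shuffle(stub)
        dealt = stub[:runs]                     # one river card per run
        hero_runs_won = runs - sum(dealt)       # hero wins every run that misses an out
        total += pot * hero_runs_won / runs     # each run is played for pot/runs
    return total / trials

for r in (1, 2, 3):
    print("runs =", r, "-> hero EV estimate:", round(run_it_n_times_ev(runs=r), 2))
# All three estimates land near (1 - 8/44) * 100 = 81.82; they differ only by Monte Carlo noise.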
07-26-2018 , 11:49 PM
This question has been asked many times. It is pretty common knowledge that the number of runs has no effect whatsoever on EV of the current hand (assuming a uniformly random distribution). It does reduce variance and may have metagame benefits, though. Note that I did not read any of the following threads, but you may find a satisfactory answer in one of them.

https://forumserver.twoplustwo.com/1...estion-633089/
https://forumserver.twoplustwo.com/1...d-orly-406670/
https://forumserver.twoplustwo.com/1...-twice-405853/

My personally preferred argument follows (sorry if it's poorly articulated):

Assume the deck is uniformly randomly distributed, meaning that each card in the deck can be any of the unknown cards with equal frequency. This means the order of the cards does not matter. You could deal from any point of the deck if you wished without benefiting either player. Now, clearly the first and second runs are equal in terms of benefit to one player or the other because the order of the cards doesn't matter. Because our equity in each run is the same, our EV from running it twice is the same as running it once.

The superstition that RIT can improve or hurt EV is really the same superstition as "stealing the dealer's bust card" in blackjack.

Maybe it should be noted that if the deck is not uniformly distributed, RIT could be beneficial to one player. It is possible, although unlikely, that one would be able to benefit from RIT based on knowledge of the non-random distribution of the cards, perhaps from colluding with the dealer, or the shuffle being very weak.
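
A tiny exhaustive check of the "order of the cards does not matter" point, using a deliberately scaled-down, hypothetical stub (6 unknown cards, 2 of which are villain's outs) so every ordering can be enumerated:

Code:
from itertools import permutations
from fractions import Fraction

stub = ['O', 'O', 'x', 'x', 'x', 'x']    # 'O' marks one of villain's outs

perms = list(permutations(stub))         # every possible ordering of the stub
for pos in range(len(stub)):
    p = Fraction(sum(1 for perm in perms if perm[pos] == 'O'), len(perms))
    print("P(card in position", pos, "is an out) =", p)    # 1/3 for every position

Every position prints 1/3, which is exactly the "you could deal from any point of the deck" statement: the card used for a second run is no more or less likely to be an out than the card used for the first.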
07-26-2018 , 11:53 PM
Quote:
Originally Posted by browni3141
This question has been asked many times. It is pretty common knowledge that the number of runs has no effect whatsoever on EV of the current hand (assuming a uniformly random distribution). It does reduce variance and may have metagame benefits, though. Note that I did not read any of the following threads, but you may find a satisfactory answer in one of them.

https://forumserver.twoplustwo.com/1...estion-633089/
https://forumserver.twoplustwo.com/1...d-orly-406670/
https://forumserver.twoplustwo.com/1...-twice-405853/

My personally preferred argument follows (sorry if it's poorly articulated):

Assume the deck is uniformly randomly distributed, meaning that each card in the deck can be any of the unknown cards with equal frequency. This means the order of the cards does not matter. You could deal from any point of the deck if you wished without benefiting either player. Now, clearly the first and second runs are equal in terms of benefit to one player or the other because the order of the cards doesn't matter. Because our equity in each run is the same, our EV from running it twice is the same as running it once.

The superstition that RIT can improve or hurt EV is really the same superstition as "stealing the dealer's bust card" in blackjack.

Maybe it should be noted that if the deck is not uniformly distributed, RIT could be beneficial to one player. It is possible, although unlikely, that one would be able to benefit from RIT based on knowledge of the non-random distribution of the cards, perhaps from colluding with the dealer, or the shuffle being very weak.
I agree with everything you've said (it's basically my position), but I need to be able to prove it to win the bet. I need a rigorous mathematical treatment.

Edit: the bet is basically that my friend cannot find one counterexample to this. If there is a counterexample to a HU all-in where one player benefits (EV-wise) by choosing the number of runouts, I lose. Unfortunately, as stated, the bet is that I have to prove that no such hands exist - he doesn't have to prove that one does. As it stands, it might end up being a push, because I have no idea how to prove this.
07-27-2018 , 12:27 AM
I have not perused the threads already linked to but this has been asked and answered (and proven) many times over the years on 2+2.

The search feature is your friend.
07-27-2018 , 12:34 AM
Quote:
Originally Posted by whosnext
I have not perused the threads already linked to but this has been asked and answered (and proven) many times over the years on 2+2.

The search feature is your friend.
whosnext - Thank you for responding. I agree that this has been asked and answered; however, I have never seen it proven. I am a big lurker of the site and I have seen many threads on this topic, but I have no idea how to prove this mathematically.
07-27-2018 , 12:35 AM
I should add - I have seen it proven for specific hands. I have never seen it proven for an arbitrary hand.
07-27-2018 , 01:12 AM
Quote:
Originally Posted by d2_e4
I should add - I have seen it proven for specific hands. I have never seen it proven for an arbitrary hand.
I do not know how to write “rigorous mathematical proofs,” but I believe my post contains all the elements of the proof you want. Based on the premise that a deck is random, you can prove that RIT does not affect EV by showing that the first and second runs have the same equity for each player.
07-27-2018 , 04:50 AM
Quote:
Originally Posted by browni3141
Assume the deck is uniformly randomly distributed, meaning that each card in the deck can be any of the unknown cards with equal frequency. This means the order of the cards does not matter. You could deal from any point of the deck if you wished without benefiting either player. Now, clearly the first and second runs are equal in terms of benefit to one player or the other because the order of the cards doesn't matter. Because our equity in each run is the same, our EV from running it twice is the same as running it once.
OP, I don't know what your requirements for "a rigorous mathematical treatment" look like, but the above surely is a mathematically rigorous proof.

You might be more verbose, saying that:

1) For the reasons above, the a priori probability of winning each run is the same. Call it p.
2) So, the EV of a single run is p*P/N, where P is the pot size and N the number of runs (for each run you are playing for the Nth part of the pot).
3) Since EV is additive, your total EV is the sum of the EVs of the individual runs:

N*p*P/N = p*P

which is the EV if you run it once.

However, I bet that your friend won't be convinced. What you were missing before this thread wasn't a rigorous proof. I suspect both that you don't fully realize what a proof is and that your friend isn't able to follow one.

Last edited by nickthegeek; 07-27-2018 at 04:56 AM.
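
To make the algebra above concrete with assumed numbers (p = 0.62, P = $1,000, N = 3, none of which come from the thread): each run is played for P/N = $333.33, so a single run is worth p*P/N = 0.62 * 333.33 ≈ $206.67, and the three runs together are worth 3 * $206.67 ≈ $620 = 0.62 * $1,000 = p*P, which is exactly the run-it-once EV.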
07-27-2018 , 12:26 PM
Yes, this is the exact same phenomenon (as stated above) as the well-known edict that you don't need to "take into account" any opponents' unknown cards when calculating the probability of hitting your flush draw. Unseen cards come from, and are rightly considered part of, the unknown deck stub just like all other cards not yet dealt.

Consider the case of all-in pre-flop. Hero and Villain turn over their hole cards and they both understand that Hero has equity 62%. This means that if ALL possible 5-card boards are dealt out (like on a computer program), Hero will win 62% of them. Colloquially, Hero wins 62% of the time "on average".

Now take 5 random cards from the deck stub and place them on the table face down. Each person will understand and agree that Hero has a 62% chance of winning with that board.

Next take another 5 random cards from the deck stub and place them on the table face down next to the first board (which is still face down). Again, it is easy to see, and each person will understand and agree that Hero has a 62% chance of winning with this second (unseen) board.

Do it one more time. Take still another 5 cards from the deck stub and place them on the table next to the other two boards (which are still face down). Once again each player will understand and agree that Hero has a 62% chance of winning with this third (unseen) board.

No matter how many 5-card boards are pulled (unseen) from the deck stub, no matter what hole cards Hero and Villain hold, Hero's equity is always the same (62% in the example) for each board. And, of course, this means that Hero's overall equity is always 62% no matter how many 5-card boards are pulled. Equity is a concept pertaining to how often a player is expected to win before future board cards are revealed.

The exact same argument applies to each and every other possibility of running multiple turn/rivers and rivers. Before the future board cards are revealed, the equities are always identical and unchanged, no matter how many future board cards are pulled.

This argument needs no further proof since it is a definitive proof. Verification of this argument has been given in many specific instances via meticulous algebraic proofs such as on the river if Villain has X outs. But, as stated above by others, these algebraic proofs of specific cases serve only to "confirm" what has been generally proven already.

Last edited by whosnext; 07-27-2018 at 12:32 PM.
07-27-2018 , 02:29 PM
Here's my version of a math proof:

Two players are in a showdown situation, with b board cards yet to be seen (b = 1, 2, or 5). If a player has a winning probability of W, then, prior to any board cards being dealt, W is the probability that any given set of b cards from the remaining deck will give the player a win. For example, dealing the second set of b cards is equivalent to the dealer burning off b+1 cards before dealing instead of the usual one burn card. If the showdown is run r times, then the player's expectation can be found in the following way:

Let
P = total pot
r = number of runs
X_i = 1 if the player wins the i-th run; else X_i = 0. X_i is therefore a binary (indicator) random variable.
Then the expected value of X_i is E(X_i) = W*1 + (1-W)*0 = W.

For each run, if the player wins, he wins P/r according to the standard payout for running it r times.

Therefore, we can write the amount won in r runs as

Amount Won = X_1*P/r + X_2*P/r + ... + X_r*P/r
= (X_1 + X_2 + ... + X_r)*P/r

Since the expected value of a sum is equal to the sum of the expected values (note that this does not require the X_i to be independent),

EV = [E(X_1) + E(X_2) + ... + E(X_r)]*P/r

But E(X_i) = W for all i; therefore
EV = r*W*P/r = W*P

But, W*P is the EV for the player if it is run only once, proving that EV does not change with running it more than once.
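
The one step in the proof above that sometimes raises an eyebrow is applying the sum rule even though the X_i are dependent (all runs share one deck). Linearity of expectation does not require independence, and a small exact check makes that visible. The sketch below uses Python fractions on an assumed, scaled-down spot - 4 villain outs among 10 unseen cards, run three times - so the numbers are purely illustrative.

Code:
from fractions import Fraction
from itertools import permutations

outs, unseen, pot, r = 4, 10, Fraction(100), 3
stub = [1] * outs + [0] * (unseen - outs)      # 1 = a card that wins the run for villain

deals = list(permutations(stub, r))            # every ordered deal of r river cards
n = len(deals)
W = Fraction(unseen - outs, unseen)            # hero's single-run win probability

for i in range(r):
    e_xi = Fraction(sum(1 - deal[i] for deal in deals), n)   # exact E(X_i)
    print("E(X_" + str(i + 1) + ") =", e_xi, " W =", W)

ev = Fraction(sum(r - sum(deal) for deal in deals), n) * pot / r
print("EV over", r, "runs =", ev, " W*P =", W * pot)

Although the runs clearly interfere with each other through card removal, every E(X_i) comes out equal to W, so the sum rule gives EV = r*W*P/r = W*P, matching the proof.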
08-14-2018 , 11:28 AM
Algebraic proof that the EV is the same:

Equity when running once = P(win the first runout)

Equity when running twice = P(win both runouts) + P(chop)⋅1/2
= P(win 1st)⋅P(win 2nd) + P(win 1st)⋅P(lose 2nd)
= P(win 1st) ⋅ P(any outcome 2nd runout)
= P(win 1st runout)
08-23-2018 , 02:35 AM
Quote:
Originally Posted by heehaww
Algebraic proof that the EV is the same:

Equity when running once = P(win the first runout)

Equity when running twice = P(win both runouts) + P(chop)⋅1/2
= P(win 1st)⋅P(win 2nd) + P(win 1st)⋅P(lose 2nd)
= P(win 1st) ⋅ P(any outcome 2nd runout)
= P(win 1st runout)
OMG can you tutor me through these proof classes. Nice proof bro.
08-23-2018 , 04:38 AM
It seems to me that heehaww simplified the proof a bit (maybe I am missing some clever trick). I think a valid proof along those lines would be something like the following:

Let A be winning the first runout and let B be winning the second runout.

Equity when running it twice = P(A & B) + 0.5*P(chop)

= P(A & B) + 0.5*P(A & not B) + 0.5*P(B & not A)

= P(A)*P(B|A) + 0.5*P(A)*P(not B|A) + 0.5*P(B)*P(not A|B)

= P(A)*P(B|A) + 0.5*P(A)*P(not B|A) + 0.5*P(A)*P(not B|A)   (since the two runouts are exchangeable, P(B & not A) = P(A & not B))

= P(A)*P(B|A) + P(A)*P(not B|A)

= P(A)*[P(B|A) + P(not B|A)]

= P(A)*1

= P(A)

I suppose a completely valid proof would incorporate chop probs for each runout but I will stop at this point.
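
As a concrete check of that identity with assumed numbers (villain drawing to 2 outs among 44 unseen river cards, ignoring the possibility of a chop within a single run; the numbers are illustrative only):

P(A) = 42/44
P(A & B) = (42/44)*(41/43)
P(chop) = P(A & not B) + P(not A & B) = (42/44)*(2/43) + (2/44)*(42/43)

P(A & B) + 0.5*P(chop) = (42*41 + 42*2)/(44*43) = (42*43)/(44*43) = 42/44 = P(A)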
08-24-2018 , 02:55 PM
I think he addresses the chop probabilities.

Quote:
P(win both runouts) + P(chop)⋅1/2 = P(win 1st)⋅P(win 2nd) + (P(win 1st)⋅P(lose 2nd) + P(lose 1st)⋅P(win 2nd))/2
Since P(win 1st)⋅P(lose 2nd) = P(lose 1st)⋅P(win 2nd), the two chop terms are equal, so the factor of 2 cancels the division by 2 and leaves P(win 1st)⋅P(lose 2nd).

Quote:
P(win both runouts) + P(chop)⋅1/2 = P(win 1st)⋅P(win 2nd) + P(win 1st)⋅P(lose 2nd)
Using the factoring AB + AB* = A(B + B*) = A, where * denotes the complement.
Quote:
= P(win 1st runout)

Last edited by Pcallinallin; 08-24-2018 at 02:57 PM. Reason: TELL ME IF IM WRONG PLEASE!
08-24-2018 , 03:37 PM
I was merely pointing out that any runout can lead to a chopped pot. Running it once can lead to a chopped pot.

So any complete and valid proof of the underlying proposition (along those lines) would probably have to take that possibility into account.

At this point the thread has run its course, so I am closing the thread. If anybody else has anything urgent they'd like to contribute, they can PM me.