Two Plus Two Poker Forums Baffling Math Paradox


 Poker Theory General poker theory

07-02-2012, 10:14 PM   #196
grinder

Join Date: Feb 2008
Posts: 454

Quote:
 Originally Posted by Kittens I'll take us off on a slight tangent now: Some posters on this thread have used the Dominance Principle: if strategy A gives either the same profit or higher profit than strategy B in every case, then strategy A must have an equal or higher expected profit than strategy B. (Some have even posted a stronger version: if there is at least one case where A's profit beats B's profit (and all other cases are same or equal) then A is definitely better). This seems 'obvious' at first but it turns out to actually not work for impossible games, see http://en.wikipedia.org/wiki/Newcomb's_paradox
I will take a look at that. I'm not really talking about cases, I'm talking about something equivalent to lottery tickets. Two tickets can have infinite EV but one can clearly be better than the other. One ticket could have a 99% chance of winning $100 million and the other could have a 99% chance of losing $100 million while both have a 1% chance of winning various amounts producing infinite EV and identical probability distributions for all the other amounts.

07-02-2012, 10:50 PM   #197
grinder

Join Date: Feb 2008
Posts: 454

After I look at my number and calculate my EV, it will tell me to switch. But if I have an advisor who peeks at the other slip of paper without seeing mine, and he calculates my EV, he will tell me to stay. If all he can do is signal me "switch" or "stay," it seems that calculating EVs has failed.
07-04-2012, 09:51 PM   #198
Pooh-Bah

Join Date: Jun 2009
Posts: 5,862

Quote:
 Originally Posted by bobf Two tickets can have infinite EV but one can clearly be better than the other. One ticket could have a 99% chance of winning $100 million and the other could have a 99% chance of losing $100 million while both have a 1% chance of winning various amounts producing infinite EV and identical probability distributions for all the other amounts.
In this case neither ticket is better than the other (perhaps unintuitive, but true).

What would you pick in real life? Answer: neither, since this situation is not possible in real life.

07-05-2012, 02:38 AM   #199
grinder

Join Date: Feb 2008
Posts: 454

Quote:
 Originally Posted by Kittens In this case neither ticket is better than the other (perhaps unintuitive, but true) What would you pick in real life? answer - neither, since this situation is not possible in real life
In real life, modified to allow for unbounded payouts, they are not equal, and I'm sure which ticket I would choose.

07-05-2012, 07:20 PM   #200
Pooh-Bah

Join Date: Jul 2007
Location: Vancouver, BC
Posts: 5,665

Quote:
 Originally Posted by bobf In real life, modified to allow for unbounded payouts, they are not equal, and I'm sure which ticket I would choose.
In that case I would assume I'm being scammed and not take either ticket.

Also, how come in every example you come up with, the $ amounts keep getting higher and higher? Is that supposed to make your point stronger? lol

07-05-2012, 07:41 PM   #201
Carpal 'Tunnel

Join Date: Sep 2008
Location: central nj
Posts: 7,945

Quote:
 Originally Posted by Kittens This should be the end of the discussion then. We all agree to switch once you look at the paper, and that it's pointless to switch before looking.
I left this thread for a while so I don't know quite where we're at, but I still maintain that it's pointless to switch whether or not you've looked. Your actual expectation for this instance of the game is set once the values are chosen, and no amount of switching back and forth can change your chances of picking the higher or lower value.

07-06-2012, 12:27 AM   #202
grinder

Join Date: Feb 2008
Posts: 454

Quote:
 Originally Posted by DarkMagus In that case I would assume I'm being scammed and not take either ticket. Also how come it seems in every example you come up with the \$ amounts keep getting higher and higher. Is that supposed to make your point stronger? lol
To make the obvious more obvious. Seems you are deflecting to avoid the actual point, which is that if A has some (potentially large) advantages over B, and B has no advantages over A, then A is better. Seems strange to me that I even need to argue this point.

07-06-2012, 09:14 AM   #203
centurion

Join Date: Jul 2010
Location: Sheffield
Posts: 105

Quote:
 Originally Posted by ganstaman I left this thread for a while so I don't know quite where we're at, but I still maintain that it's pointless to switch whether or not you've looked. Your actual expectation for this instance of the game is set once the values are chosen, and no amount of switching back and forth can change your chances of picking the higher or lower value.
That is similar to saying that the expected value of a forced all-in-or-fold hold'em hand is set when the deck has been shuffled and the hole cards dealt face down, because the cards that have been and will be dealt are already determined.

That looking at your hole cards and seeing pocket aces is no different than looking and seeing 32o, or pushing all in blind.

After all, with a random deal there is a 50-50 chance you'd win this all-in confrontation before looking at your hole cards.

Back to the game in question.

If you look and see a 1 then you will always switch to get a 3, which is the obvious counterexample to your statement.

If you look and see a 3, then by switching there is a 2/3 chance you will get a 1 and a 1/3 chance you will get a 9. (Agreed?)

So by switching from a 3 you gain 2/3 in EV. (Agreed?)

For any X > 1 you see, switching gains you 2X/9 in EV? (Agreed?)

So if you look at the sheet and see any integer then you should switch? (Agreed?)

I really can't see the flaw in any of this reasoning. The thing I don't understand is the contradiction between:

Don't look means EV(Switch)=EV(Stay)
Look and see any integer means EV(Switch)>EV(Stay)

I blame infinity.
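A quick way to sanity-check these conditional EVs is to compute them exactly. This is a sketch assuming the game works as discussed upthread (flip a fair coin until the first heads; with t tails, the slips read 3^t and 3^(t+1), and you are shown one at random); the function name is mine:

```python
from fractions import Fraction

def switch_gain(x):
    """Exact E[other slip - seen slip | seen slip shows x], for x = 3^m, m >= 1.

    Given that you see x, the pair (x/3, x) is twice as likely as (x, 3x),
    so the other slip is x/3 with probability 2/3 and 3x with probability 1/3.
    """
    return Fraction(2, 3) * (x // 3 - x) + Fraction(1, 3) * (3 * x - x)

print(switch_gain(3))   # gain from switching off a 3: 2/3
print(switch_gain(9))   # gain from switching off a 9: 2, i.e. 2*9/9
```

The value comes out to 2x/9 for every x > 1, matching the steps above.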

07-06-2012, 10:22 AM   #204
grinder

Join Date: Feb 2008
Posts: 454

Quote:
 Originally Posted by Scouse Rob Don't look means EV(Switch)=EV(Stay) Look and see any integer means EV(Switch)>EV(Stay) I blame infinity.
If we play a bunch of games and always switch without looking, but we save the slips and mark our choices, then retroactively go back and calculate our AverageGain(Switch given initial slip was n), it will still tend towards 2n/9 for each n, seemingly telling us we were smart to switch. But if instead we go back and calculate what would have happened had we stayed, AverageGain(Stay given other slip was n) will also tend towards 2n/9 for each n, seemingly telling us we should have stayed.

I blame infinity too. There is no benefit in raising your EV in the context of having more infinite EV opportunities. Once you look you have moved from having an infinite EV opportunity to not having any more, assuming you won't play the game again.

The only case that is bothering me is when I play the game exactly 1 time and look. Now I am not in an infinite EV situation any more. So I should take EV advice and switch. Yet if I plan ahead of time to do exactly that, look and then switch, I am clearly making a bad plan. So I have a bad plan that always becomes a good plan, once I actually look.

Last edited by bobf; 07-06-2012 at 10:28 AM.
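bobf's retrospective bookkeeping is easy to simulate. A sketch, assuming the same game model as upthread (t tails before the first heads, slips 3^t and 3^(t+1), one shown at random); the `deal` helper, seed, and sample size are my own choices:

```python
import random

random.seed(1)

def deal():
    """One game: count tails t before the first heads, write 3^t and 3^(t+1),
    and return (initial slip, other slip) with the initial chosen at random."""
    t = 0
    while random.random() < 0.5:
        t += 1
    lo, hi = 3 ** t, 3 ** (t + 1)
    return (lo, hi) if random.random() < 0.5 else (hi, lo)

switch_gains = {}  # initial slip n -> gains we realized by switching
stay_gains = {}    # other slip n   -> gains we would have realized by staying

for _ in range(1_000_000):
    initial, other = deal()
    switch_gains.setdefault(initial, []).append(other - initial)
    stay_gains.setdefault(other, []).append(initial - other)

for n in (3, 9, 27):
    print(n,
          sum(switch_gains[n]) / len(switch_gains[n]),  # tends toward 2n/9
          sum(stay_gains[n]) / len(stay_gains[n]))      # also tends toward 2n/9
```

Both conditional averages drift toward 2n/9, which is exactly the two-sided bookkeeping paradox described above.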

07-06-2012, 05:46 PM   #205
Carpal 'Tunnel

Join Date: Sep 2008
Location: central nj
Posts: 7,945

Quote:
 Originally Posted by Scouse Rob That is similar to saying that the expected value of a forced all-in-or-fold hold'em hand is set when the deck has been shuffled and the hole cards dealt face down, because the cards that have and will be dealt are determined. (Assume your opponent must call.) That looking at your hole cards and seeing pocket aces is no different than looking and seeing 32o or pushing all in blind. After all, with a random deal there is a 50-50 chance you'd win this all-in confrontation before looking at your hole cards.
True that it's similar, but the difference is important, I believe. Except for the boundary cases, you aren't going to care what you see -- you will make the same decision no matter what you see, making the looking nothing more than a formality.

If you dealt out two random hands face down, you could switch back and forth all you want without changing the EV. If you then look at the hand, it actually approximates seeing the boundary cases of the slips of paper in that you can actually evaluate the value of switching vs staying.

Quote:
 Originally Posted by bobf If we play a bunch of games and always switch without looking, but we save the slips and mark our choices, then retroactively go back and calculate or AverageGain(Switch given initial slip was n) it will still tend towards 2/9n for each n, seemingly telling us we were smart to switch.
If you go back and evaluate the AverageGain(Switch) for every slip, you will find it is the same as your average win overall from staying. Grouping the results by n is what seems to cause the problems and lead to the somewhat ridiculous results.

07-06-2012, 05:50 PM   #206
old hand

Join Date: Jun 2010
Location: Vancouver, BC
Posts: 1,304

Someone alluded to this earlier, but I think that the source of the "paradox" is the same as the source of the "paradox" behind Martingales: namely, that there is no such thing as an infinite bankroll.

Everyone seemed to agree earlier in the thread that if the game were capped--i.e. if there were a maximum number you could see on the slip of paper--then always switching is equivalent in EV to always staying. Well, by necessity any such game is capped, because the person flipping the coins will not have an infinite bankroll. Therefore there is a number beyond which the flipper has to stop flipping coins and just write the maximum, because he couldn't pay out any more.

So when we, as the player, look at the sheet of paper and see a number written down, part of the thought process before we switch should be, "How likely is it that this number I've just seen is the limit of the game runner's bankroll?" Clearly you should switch if you think there's no chance of that. But if you disregard that question and always switch, you will have the same EV as if you always stay. The "paradox" arises because so far we have been assuming that we can ignore that question.

Notice that no matter what the bankroll limit is, the strategy called "StayExcept1" beats always staying, and it beats always switching as well. But now it is no longer clear that it loses to "StayExcept(1,3)", because 3 could be the maximum payout (a small chance, but a chance nonetheless).
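The capped-game claims are easy to check exactly. Below is a sketch of one way to model the cap (the flipper is forced to stop after K tails; the model, function, and strategy names are my own): always switching ties always staying, StayExcept1 beats both, and switching unless you see the maximum does better still.

```python
from fractions import Fraction

def strategy_ev(K, switch_if):
    """Exact EV in a capped game: t tails before stopping (t = 0..K, with the
    flipper forced to stop at K tails), slips 3^t and 3^(t+1), one shown at
    random; switch_if(seen, maximum) decides whether to switch."""
    maximum = 3 ** (K + 1)
    ev = Fraction(0)
    for t in range(K + 1):
        # t < K: heads on flip t+1; t == K: forced stop absorbs the rest.
        p = Fraction(1, 2 ** (t + 1)) if t < K else Fraction(1, 2 ** K)
        lo, hi = 3 ** t, 3 ** (t + 1)
        for seen, other in ((lo, hi), (hi, lo)):
            ev += p * Fraction(1, 2) * (other if switch_if(seen, maximum) else seen)
    return ev

K = 10
stay = strategy_ev(K, lambda s, m: False)
switch = strategy_ev(K, lambda s, m: True)
stay_except_1 = strategy_ev(K, lambda s, m: s == 1)
switch_unless_max = strategy_ev(K, lambda s, m: s != m)
```

Under this model, `stay == switch`, `stay_except_1` picks up exactly 1/2 over both, and `switch_unless_max` gains a further amount that grows with the cap.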
07-06-2012, 07:48 PM   #207
Pooh-Bah

Join Date: Jul 2007
Location: Vancouver, BC
Posts: 5,665

Quote:
 Originally Posted by bobf To make the obvious more obvious. Seems you are deflecting to avoid the actual point which is that if A has some (potentially large) advantages over B and B has no advantages over A then A is better. Seems strange to me that I even need to argue this point.
I'm not "deflecting" your point or whatever, I'm just saying that your whole point is a non sequitur. You keep making these endless "which lottery ticket would you choose in real life" comparisons which don't really reflect the problem in the OP. Did you read that wiki article I posted a little while ago? Because you keep throwing around words like "always" very loosely, which leads me to think you didn't read it. You have to be very specific about what you mean when you use the word "always".

Quote:
 I will take a look at that. I'm not really talking about cases, I'm talking about something equivalent to lottery tickets. Two tickets can have infinite EV but one can clearly be better than the other. One ticket could have a 99% chance of winning $100 million and the other could have a 99% chance of losing $100 million while both have a 1% chance of winning various amounts producing infinite EV and identical probability distributions for all the other amounts.
Ticket A is "better" in an abstract mathematical sense, I suppose, since $100 million > -$100 million. But for any practical purposes, a currency that has an infinite supply quickly becomes meaningless. People just stop accepting it for anything and move on to a finite-supply currency. So in that sense $100m and -$100m are effectively the same, and equal to $0.

07-06-2012, 10:35 PM   #208
Pooh-Bah

Join Date: Jun 2009
Posts: 5,862

Quote:
 Originally Posted by Scouse Rob I really can't see the flaw in any of this reasoning, the thing I don't understand is the contradiction between: Don't look means EV(Switch)=EV(Stay) Look and see any integer means EV(Switch)>EV(Stay).
That's the supposed paradox. You don't need to run a long series of games or anything to see it or resolve it, though.

The correct resolution (IMHO) is that if you actually act out this game, there will be a maximum possible value (the biggest number you can fit on a piece of paper, or the biggest number that the person paying the money has in his bank account for you). And then the correct strategy is really "Switch unless you see the maximum possible integer", no paradox.

In the theoretical game where there's no maximum and we have infinite bankrolls, EV(Switch)>EV(stay) is no contradiction with EV(Switch)=EV(stay) as they are both infinity.
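The "both are infinity" point can be made concrete with partial sums. A sketch, assuming pair k (slips 3^k and 3^(k+1)) occurs with probability (1/2)^(k+1) as discussed upthread; the helper is mine:

```python
from fractions import Fraction

def partial_ev(K):
    """Sum of the first K terms of EV(Stay) (equivalently EV(Switch)):
    pair k has probability (1/2)^(k+1) and average slip value 2 * 3^k,
    so each term simplifies to (3/2)^k."""
    return sum(Fraction(1, 2 ** (k + 1)) * Fraction(3 ** k + 3 ** (k + 1), 2)
               for k in range(K))

# The partial sums equal 2*((3/2)^K - 1) and grow without bound, so both
# EV(Stay) and EV(Switch) are the same divergent series.
```

Since every term is identical for the two strategies, the unconditional EVs agree term by term while both diverge, which is the resolution stated above.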

07-07-2012, 02:38 AM   #209
grinder

Join Date: Feb 2008
Posts: 454

Quote:
 Originally Posted by DarkMagus I'm not "deflecting" your point or whatever, I'm just saying that your whole point is a non sequitur. You keep making these endless "which lottery ticket would you choose in real life" comparisons which don't really reflect the problem in the OP. Did you read that wiki article I posted a little while ago? Because you keep throwing around words like "always" very loosely, which leads me to think you didn't read it. You have to be very specific about what you mean when you use the word "always".
I don't see any "always" in my post, so I'm not sure what you are referring to exactly.

I did read the infinity article, but the only potential infinity I see in the OP leads to a locking-up of the game on an infinite run of tails. The OP implies we eventually see a heads and the game proceeds with finite values from then on.

The lottery ticket is not a non sequitur. To me it is the crux of the paradox, because two views of the problem seemingly lead to contradictory results.

1. Calculating EV(having seen N) leads to switching for any N, which seems to imply that the LookAndAlwaysSwitch strategy is better than the StayExcept1 strategy.

2. Yet if person A says "I am gonna play StayExcept1" and person B says "I am gonna play LookAndAlwaysSwitch", we can calculate a "lottery ticket" (a set of $-amount-won / probability pairs) for each, and from those I see that B has no advantage whatsoever over A, while A has one (rather small) advantage over B. There would be no reason to invest in person B over A, but there is a (small) reason to invest in person A over person B for a finite number of games.

Quote:
 Ticket A is "better" in an abstract mathematical sense, I suppose, since $100 million > -$100 million. But for any practical purposes, a currency that has an infinite supply quickly becomes meaningless. People just stop accepting it for anything and move on to a finite-supply currency. So in that sense $100m and -$100m are effectively the same, and equal to $0.
They become equal if we can play an infinite number of times, because we can become as sure as we want that the 99% +/- $100m will be erased by the 1%. But if we can only play one time then A is clearly better, since the outcome is usually +$100m vs -$100m and B has no compensating advantage to offset this. The same holds true for any finite number of times we play A vs B, but to a lesser degree.

Last edited by bobf; 07-07-2012 at 02:56 AM.
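The "lottery ticket" comparison in point 2 above can be tabulated explicitly. A sketch assuming the usual pair model (pair k = (3^k, 3^(k+1)) with probability (1/2)^(k+1), truncated at a large K for the table; the helper names are mine): the two strategies' single-game payout distributions differ in exactly one place.

```python
from collections import defaultdict
from fractions import Fraction

def outcome_dist(switch_if, K=12):
    """Single-game payout distribution over pairs k = 0..K-1
    (the remaining 2^-K of probability mass is ignored)."""
    dist = defaultdict(Fraction)
    for k in range(K):
        p = Fraction(1, 2 ** (k + 1))
        lo, hi = 3 ** k, 3 ** (k + 1)
        for seen, other in ((lo, hi), (hi, lo)):
            dist[other if switch_if(seen) else seen] += p * Fraction(1, 2)
    return dict(dist)

A = outcome_dist(lambda s: s == 1)   # StayExcept1
B = outcome_dist(lambda s: True)     # LookAndAlwaysSwitch
# B is paid 1 a quarter of the time; A never is, and holds that mass at 3.
# Every other payout has identical probability under A and B.
```

This is bobf's point as a table: A's distribution matches B's everywhere except that A moves B's probability-1/4 payout of 1 up to 3.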

07-07-2012, 03:29 AM   #210
enthusiast

Join Date: Feb 2012
Location: Wooster, OH
Posts: 54

Quote:
 Originally Posted by Kittens In the theoretical game where there's no maximum and we have infinite bankrolls, EV(Switch)>EV(stay) is no contradiction with EV(Switch)=EV(stay) as they are both infinity.
Winner.
