Two Plus Two Poker Forums Probability Interview Question with an Asset Manager

07-23-2012, 03:49 AM   #31
Carpal 'Tunnel

Join Date: Sep 2002
Posts: 8,896
Re: Probability Interview Question with an Asset Manager

Quote:
 Originally Posted by mburke05 Bruce, part of the solution was calculating the expected profit in these various strategies, hence the calculations.
For the strategy where you don't bet, your EV is $5 since the probability that the last marble is red is 1/2, and 1/2*10 = 5. For any strategy where you MUST bet, your EV is $4 no matter what your strategy is for when to bet. That's because whenever your strategy tells you to bet, you can do equally well by not betting and committing to bet on the last marble, since the last marble will have the same probability of being red as the marble your strategy tells you to bet on. The probability of red is 1/2 for the last marble before the game begins, so your EV will be

1/2*9 - 1/2*1 = 4.

That's all you have to calculate for all strategies where you always bet or never bet.

Even a strategy that looks dumb, like betting whenever the probability of red is LESS THAN 10%, will still have an EV of $4 because it is equivalent to betting on the last marble too. Note that we are assuming that we will bet, so if we never meet our criteria for betting, we would have to bet on the last marble anyway.

You could even have a complicated strategy where your criteria for betting changes depending on the number of marbles left, and it would still be equivalent to just betting on the last marble.

Now the EV is the same if you always bet on the first marble, or on any other marble, since the probability of red is 1/2 for any marble. All strategies are equivalent to just picking a particular marble and betting on it. So if you have to bet, you might as well just bet on the first marble.

Note that in your example with 2 reds and 1 blue remaining, the best EV you got by betting was $5.67, which is exactly the same as you get by betting on the first marble since

2/3*9 - 1/3*1 = 5.67

When you never bet, your EV was $6.67 because the probability that the last marble is red is 2/3.

2/3 * 10 = 6.67.
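Both numbers, and the claim that every must-bet strategy is worth exactly $4, can be brute-forced. A Python sketch, assuming the payoffs described in this thread ($1 to bet, $10 paid on a red, and a free $10 if you never bet and the last marble is red) and, purely as an illustration, 5 red and 5 blue marbles:

```python
# Exhaustive check of the claims above.  Assumptions: 5 red + 5 blue
# marbles (the thread never fixes the count), a bet costs $1 and nets
# +$9 on red / -$1 on blue, and never betting collects $10 for free if
# the last marble turns out red.
from fractions import Fraction
from itertools import combinations

R = B = 5
N = R + B

def ev(strategy):
    """Exact average profit of `strategy` over all equally likely orderings."""
    total = Fraction(0)
    count = 0
    for red_pos in combinations(range(N), R):   # positions of the red marbles
        seq = ['r' if i in red_pos else 'b' for i in range(N)]
        total += strategy(seq)
        count += 1
    return total / count

def bet_first(seq):
    return 9 if seq[0] == 'r' else -1

def bet_when_red_under_10pct(seq):
    # The "dumb" rule from the post: bet as soon as P(next is red) < 10%;
    # if the rule never fires, we are forced to bet on the last marble.
    r, b = R, B
    for i, m in enumerate(seq):
        if i == N - 1 or Fraction(r, r + b) < Fraction(1, 10):
            return 9 if m == 'r' else -1
        r, b = (r - 1, b) if m == 'r' else (r, b - 1)

def never_bet(seq):
    return 10 if seq[-1] == 'r' else 0

print(ev(bet_first), ev(bet_when_red_under_10pct), ev(never_bet))  # 4 4 5
```

Every must-bet rule lands on exactly $4 and never betting on exactly $5, as the argument predicts.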

Last edited by BruceZ; 07-23-2012 at 05:57 AM.

07-23-2012, 12:42 PM   #32
Carpal 'Tunnel

Join Date: Mar 2008
Location: Ruling with an iron pocketbook.
Posts: 10,431
Re: Probability Interview Question with an Asset Manager

yeah that's the number i got bruce. i think basically what they wanted me to demonstrate (that i think i failed at) was to show that the $1 would never be at risk in a strategy where you never bet and you're essentially free-rolling: (the difference between red and blue in your favor * 1) + ([1 - the difference between blue and red you chose there] * -1), which can only equal 1, and should never approach 1 i think in our model? or am i off base here. this is sort of the way i envisioned it after the interview, but maybe i'm just fundamentally misunderstanding.

i also made the mistake of thinking in terms of expected revenue rather than expected profit a few times during my analysis on the phone, which he was pretty quick to correct.
07-30-2012, 06:21 PM   #33
adept

Join Date: Jun 2005
Location: Playin' It Smart
Posts: 743
Re: Probability Interview Question with an Asset Manager

If you're only allowed to bet once, the problem is trivial, and Bruce is right.

The way I read the problem, it sounds like you're allowed to keep betting as long as you have money left. That makes it much more interesting and difficult. Then, waiting until the end is clearly not the best result, since the implied EV of potential future bets is more than enough to outweigh the EV of waiting. The question becomes whether to start betting right off, or wait to see if you can get an advantage. That isn't so obvious to me.

In that case, I simply would have said, "It'd be trivial to simulate this on a computer; give me about 5 mins, and I'll have the program written and running."
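The program really is only a few lines. A Monte Carlo sketch under assumed rules, since the multi-bet variant was never pinned down in the thread: 5 red + 5 blue marbles, before any draw you may pay $1 to bet on it (net +$9 on red, -$1 on blue), as many times as you like, and no free payout on the last marble:

```python
# Monte Carlo for the multiple-bets reading of the game.  All rules here
# are assumptions: 5 red + 5 blue, $1 per bet, +$9 net on red / -$1 on
# blue, bet on as many draws as you like, no free last-marble payout.
import random

R = B = 5

def play(should_bet, trials=200_000, seed=1):
    """Average profit of the rule `should_bet(reds_left, blues_left)`."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        marbles = ['r'] * R + ['b'] * B
        rng.shuffle(marbles)
        r, b = R, B
        for m in marbles:
            if should_bet(r, b):
                total += 9 if m == 'r' else -1
            r, b = (r - 1, b) if m == 'r' else (r, b - 1)
    return total / trials

print(play(lambda r, b: True))       # always bet: every trial nets 9*5 - 1*5 = $40
print(play(lambda r, b: 9 * r > b))  # skip the only -EV spots (r == 0 here)
print(play(lambda r, b: False))      # never bet: $0 under these assumed rules
```

Under these payoffs a single bet is already hugely +EV (9p - (1-p) > 0 whenever P(red) > 10%), so with these assumptions the "wait for an edge" question barely arises; it would only bite with far less generous odds.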
07-31-2012, 02:24 PM   #34
Carpal 'Tunnel

Join Date: Mar 2008
Location: Ruling with an iron pocketbook.
Posts: 10,431
Re: Probability Interview Question with an Asset Manager

Quote:
 Originally Posted by MApoker If you're only allowed to bet once, the problem is trivial, and Bruce is right. The way I read the problem, it sounds like you're allowed to keep betting as long as you have money left. That makes it much more interesting and difficult. Then, waiting until the end is clearly not the best result, since the implied EV of potential future bets is more than enough to outweigh the EV of waiting. The question becomes whether to start betting right off, or wait to see if you can get an advantage. That isn't so obvious to me. In that case, I simply would have said, "It'd be trivial to simulate this on a computer; give me about 5 mins, and I'll have the program written and running."
yeah, this is essentially how i evaluated it too; i went on a pretty hard tangent trying to calculate the spread necessary to make a bet +ev vs no-bet, if you could continue that spread (assuming it would close eventually).

i got passed up for the job, and he said it was my "problem solving skills". i got both questions right, fwiw, same way bruce did, but i guess because i didn't interpret the question in his framework (the way you just described it) immediately i got docked majorly.

eventually i did, it just required some hard questions to get there because he really wasn't very helpful in his explanations. or cogent for that matter.

08-01-2012, 06:46 AM   #35
centurion

Join Date: Aug 2011
Posts: 121
Re: Probability Interview Question with an Asset Manager

My solution, for the way I understand the rules (where you can bet more than once), is that you bet only if all the balls are red and never bet otherwise.

If all balls are red and you have x steps left, your EV is 9x+1. Now take a step back in the tree, to when you have all red balls and one blue ball. If we can prove it's optimal to not bet here, then we should always not bet when there are blue balls. Now if there are x balls left you have a 1/x chance of drawing blue and an (x-1)/x chance of drawing red.

If you bet then the outcomes are:
• 1/x you get -1
• (x-1)/x you get +9 plus the continuation value of x-1 balls with one blue

If you wait then the outcomes are:
• 1/x you get the continuation value 9(x-1)+1
• (x-1)/x you get 0 plus the continuation value of x-1 balls with one blue

To compare the outcomes we just need to see that

1/x*[9(x-1)+1] > -1/x + 9(x-1)/x

because the continuation values cancel out. This inequality is true for any positive x.

This is assuming that you stop whenever you draw a blue ball. The rules weren't 100% clear.

Last edited by random_person; 08-01-2012 at 06:51 AM.
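That final inequality is easy to sanity-check numerically, using only the payoffs stated in the post:

```python
# Check random_person's step-back comparison: with x balls left (all red
# plus the one blue), waiting beats betting because
#     (1/x) * [9(x-1)+1]  >  -1/x + 9(x-1)/x
# once the shared continuation values cancel.  The gap works out to 2/x.
for x in range(1, 1001):
    wait = (9 * (x - 1) + 1) / x    # blue comes next 1/x of the time, then sweep the reds
    bet = (-1 + 9 * (x - 1)) / x    # lose $1 to the blue 1/x of the time instead
    assert wait > bet and abs((wait - bet) - 2 / x) < 1e-12
print("waiting wins for every x from 1 to 1000; the edge is exactly 2/x")
```

The constant 2/x gap makes the induction go through for any positive x, as claimed.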
09-05-2012, 11:25 PM   #36
stranger

Join Date: Apr 2011
Posts: 7
Re: Probability Interview Question with an Asset Manager

The main point of the question (not betting when you have 1r1b remaining) has already been solved, but I think there are still some mistakes being made. If we have 1r1b, our best option is to Not Bet (NB) rather than bet (B) since ($10)(.5) + ($0)(.5) > ($9)(.5) - ($1)(.5), or EV$5 > EV$4, as Bruce pointed out.

Quote:
Originally Posted by BruceZ
Quote:
 Originally Posted by mburke05 he wouldn't tell me what the extension of the solution was but i can tell you that the EV of profit in the scenario of not betting i got (in for example, 2red, 1 blue remaining) was $6.67 (no bet) and $5.67 (bet). to do this i just made a probability distribution tree and calculated the possible scenarios
Of course it's \$6.67. That's because the probability that the last one is red is 2/3. In the original game, it's 1/2, so the EV is \$5. You don't need a decision tree. You're missing the key.
This, however, is incorrect. Since for 2r1b,

• 50% we will get 2r0b, or EV$10 (a situation where not betting will guarantee us $10)
• 50% we will get 1r1b, or EV$5 (a situation as above)

Thus, NB will give EV$7.50.

Quote:
 Originally Posted by BruceZ ...But the EV of waiting will always be greater than the EV of betting because whatever the probability is that the next marble is red, the probability that the last marble is red is the same, and we don't have to bet \$1 on the last marble.
That is true for every situation where the remaining number of r > b. However, for other situations such as 1r3b, 2r4b, 1r4b, 2r5b, 1r5b, ... betting has a higher EV.

If we look at 1r3b, we first have to look at 1r2b.

For 1r2b,
• B gives us EV$2.33 (33% we win $9. 66% we lose $1)
• NB gives us EV$2.50 (50% we get a 0r2b situation, ie EV$0 [where NB would be optimal]. 50% we get 1r1b, ie EV$5 [as above])

So, looking at 1r3b,
• B gives us EV$1.50 (25% we win $9. 75% we lose $1)
• NB gives us EV$1.25 (50% we get a 0r3b situation, ie EV$0. 50% we get a 1r2b situation, ie EV$2.50)

ie the expected value of betting is $0.25 higher than the expected value of not betting.

EDIT: I'm very tired, so I may have made some errors in calculation, but I think the principles are correct. Happy to hear other thoughts on this. It was a very interesting question

Last edited by HoldemMcgroin; 09-05-2012 at 11:51 PM.

09-05-2012, 11:31 PM   #37
stranger

Join Date: Apr 2011
Posts: 7
Re: Probability Interview Question with an Asset Manager

Quote:
 Originally Posted by mburke05 How do you play the game? (Assuming your intention is to maximize profit)
I imagine the answer he wanted was the one you and Bruce concluded: that towards the end it will be more profitable to not invest in the majority of situations, since that provides a higher expected value. They also probably look more at the steps you take to solve the problem, which, in my opinion, means asking them a ton of questions (if allowed) so you 100% know what the question is asking. Wasting 15 minutes doing something incorrect because you didn't understand the question probably didn't impress them much.

That said, I've had a few similar questions in job interviews and it's very tough to get your head around the wording, especially under time control and interview pressure!

09-06-2012, 08:16 AM   #38
Carpal 'Tunnel

Join Date: Sep 2002
Posts: 8,896
Re: Probability Interview Question with an Asset Manager

Quote:
 Originally Posted by HoldemMcgroin This, however, is incorrect. Since for 2r1b, 50% we will get 2r0b, or EV$10 (a situation where not betting will guarantee us $10) 50% we will get 1r1b, or EV$5 (a situation as above) Thus, NB will give EV$7.50.
Of course it's not incorrect. The EV of not betting is $6.67. Your calculations are incorrect.

You get 2r0b 1/3 of the time, not 50% of the time. You get 1r1b 2/3 of the time, not 50% of the time.

(1/3)*10 + (2/3)*5 = $6.67

as I said.

Quote:
 That is true for every situation where the remaining number of r > b. However, for other situations such as 1r3b, 2r4b, 1r4b, 2r5b, 1r5b, ... betting has a higher EV.
No, it's ALWAYS true that waiting has a higher EV than betting, even in those situations. That's why I wrote ALWAYS. If it didn't hold in those situations, I wouldn't have written ALWAYS. You made more mistakes. But what I stated should be obvious without any calculations. That was the whole point of the question: to see if the interviewee understands the fundamental principle that makes the answer obvious without any calculations.

Quote:
 If we look at 1r3b, we first have to look at 1r2b. For 1r2b, B gives us EV$2.33 (33% we win $9. 66% we lose $1) NB gives us EV$2.50 (50% we get a 0r2b situation, ie EV$0 [where NB would be optimal]. 50% we get 1r1b, ie EV$5 [as above])
You get 0r2b 1/3 of the time not 50%. You get 1r1b 2/3 of the time, not 50%.

EV of NB = (1/3)*0 + (2/3)*5 = $3.33

Quote:
 So, looking at 1r3b, B gives us EV$1.50 (25% we win $9. 75% we lose $1) NB gives us EV$1.25 (50% we get a 0r3b situation, ie EV$0. 50% we get a 1r2b situation, ie EV$2.50)
You get 0r3b 25% of the time, not 50%. You get 1r2b 75%, not 50%.

EV of NB = (0.25)*0 + (0.75)*3.33 = $2.50

Quote:
 ie the expected value of betting is $0.25 higher than the expected value of not betting.
The expected value of not betting is $1 higher than betting. But that was obvious without any calculations because the probability of getting red is 25% for both the next marble and the last marble, but on the last marble we don't have to bet $1, so the EV is $1 higher.
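All of these corrected numbers fall out of a two-line recursion. A sketch in Python with exact arithmetic, using the payoffs from this thread ($1 to bet, $10 on a red, free $10 on a red last marble); the state (r, b) is reds and blues remaining:

```python
# Recursion over states (r, b) for the single-bet game, with exact
# fractions.  nb() waits it out; bet() wagers $1 on the very next marble.
from fractions import Fraction
from functools import lru_cache

@lru_cache(maxsize=None)
def nb(r, b):
    """EV of never betting: ride it out, collect $10 if the last marble is red."""
    if r + b == 1:
        return Fraction(10 if r == 1 else 0)
    p_red = Fraction(r, r + b)
    return p_red * nb(r - 1, b) + (1 - p_red) * nb(r, b - 1)

def bet(r, b):
    """EV of betting $1 on the next marble right now: 9*P(red) - 1*P(blue)."""
    return Fraction(9 * r - b, r + b)

print(nb(2, 1), bet(2, 1))   # 20/3 vs 17/3  ($6.67 vs $5.67)
print(nb(1, 2), bet(1, 2))   # 10/3 vs 7/3   ($3.33 vs $2.33)
print(nb(1, 3), bet(1, 3))   # 5/2 vs 3/2    ($2.50 vs $1.50)

# In every state the gap is exactly $1: the dollar you did not wager.
assert all(nb(r, b) - bet(r, b) == 1
           for r in range(1, 8) for b in range(1, 8))
```

The constant $1 gap is the whole point: betting on marble k and betting on the last marble are the same wager, except that not betting refunds your dollar when the red shows up.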

Quote:
 EDIT: I'm very tired, so I may have made some errors in calculation, but I think the principles are correct. Happy to hear other thoughts on this. It was a very interesting question
The correct principles make the calculation unnecessary. Obviously all of the remaining marbles have the same probability of being red. Just like all the remaining cards in a deck have the same probability of being the ace of spades. On average, the ratio of red to total marbles after the next draw is always the same as the current ratio of red to total marbles. Since that's apparently still not obvious to some people, I will spell it out algebraically:

Let the current ratio of red to total marbles be r/(r+b). That is the probability that the next marble is red. If the next marble is red, the ratio will become (r-1)/(r+b-1). The probability that the next marble is blue is b/(r+b). If the next marble is blue, the ratio will become r/(r+b-1). Putting that together gives the expected value of the ratio after the next draw as

[r/(r+b) * (r-1)/(r+b-1)] + [b/(r+b) * r/(r+b-1)]

= [r(r-1) + br] / [(r+b)(r+b-1)]

= r(r-1+b) / [(r+b)(r+b-1)]

= r/(r+b)

So on average, the ratio of reds to total marbles after any draw will be the same as before the draw. ALWAYS. The average of this ratio is also called the probability that the marble on the draw following the current draw will be red. So the probability of getting a red on the current draw is the same as it will be on the next draw. So there is certainly no compelling reason to bet on the current draw rather than the next draw. But this is true for all draws. So the probability of a red marble on all subsequent draws is the same as it is on the current draw. So there is never a reason to bet on any marble because the EV of all of the following marbles is the same. But there is a reason to NOT bet on any marble, because not betting pays more when the last marble is red.
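The identity above is quick to confirm exactly over a grid of states, stdlib only:

```python
# Verify the algebra: the expected ratio of reds after one draw equals
# the ratio before it, for every state (r, b) with at least one of each.
from fractions import Fraction as F

for r in range(1, 20):
    for b in range(1, 20):
        before = F(r, r + b)
        after = F(r, r + b) * F(r - 1, r + b - 1) + F(b, r + b) * F(r, r + b - 1)
        assert after == before, (r, b)
print("E[ratio after a draw] == ratio before the draw, for all r, b tested")
```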

This concept of the average proportion of reds staying the same is the same as the True Count Principle in blackjack, which says that the average tendency of the count is to stay the same. Most people think that when the count is high, it is more likely that a high card will come out, so the true count should go down on average. Not so. A high card IS more likely to come out, but the average ratio of high cards to total cards stays the same. Just like a high ratio of reds to total marbles makes the next marble more likely to be red, but on average the ratio will stay the same.

That is the point of the problem, which you were expected to get on the spot. If you did, you got the job. Otherwise you have to work with your back like a mule, as my dad always used to say: "Study hard so you don't have to work with your back like a mule."
