Card Removal/Card Bunching Solved

04-19-2014 , 02:43 AM
Quote:
Originally Posted by tobakudan
Do you have any examples in mind of situation where some action is best if you were ignorant of range removal but if you were aware you could figure out that some other action is better?
Quote:
Originally Posted by TakenItEasy
Anything involving an ace will show a huge difference vs calculations that assume a random stub.

I need to get some sleep now. I'll try to include an example tomorrow.

Cheers!
Sorry it took me so long to get back to you on this. I got sidetracked when I found some errors in an unrelated part of the program that caused it to become corrupt, so I had to make sure that was fixed first.

Right now it is difficult to recalculate the odds using a weighted stub because you need to calculate with partial (non-integer) figures. I hope to be able to show the differences as part of my post-flop analysis in the training simulation.

Until then, you can treat the cards remaining in the stub as some percentage above or below average to estimate which cards work for you or against you, i.e. hit your opponent's range. You should also be looking for cards that hit your perceived range while likely missing your opponent's range.

After the flop, you can use these numbers to adjust the likelihood of winners/blanks/losers for your hand. The best example to illustrate this is your perceived outs when on a draw. If you see that the stub should favor your outs, then you can think of it as having extra outs. For example, if the two ranks you need to fill an open-ended straight draw are each 25% more likely to hit, then you may treat it like 10 outs instead of 8 and play accordingly.
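To put numbers on that "extra outs" heuristic (a quick Python sketch, not from the original post; the 10-out figure is the adjusted count described above):

```python
from math import comb

# Chance of hitting at least one out by the river, seen from the flop,
# assuming each of the 47 unseen cards is equally likely to come.
def hit_by_river(outs, unseen=47):
    return 1 - comb(unseen - outs, 2) / comb(unseen, 2)

p_random = hit_by_river(8)    # plain 8-out OESD vs a random stub: ~31.5%
p_bunched = hit_by_river(10)  # outs 25% heavier, so treat as ~10 outs: ~38.4%
```

That ~7-point swing is often the difference between a fold and a profitable call, which is the point of the adjustment.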
04-19-2014 , 07:12 AM
Quote:
Originally Posted by TakenItEasy
Sorry it took me so long to get back to you on this. I got sidetracked when I found some errors in an unrelated part of the program that caused it to become corrupt, so I had to make sure that was fixed first.

Right now it is difficult to recalculate the odds using a weighted stub because you need to calculate with partial (non-integer) figures. I hope to be able to show the differences as part of my post-flop analysis in the training simulation.

Until then, you can treat the cards remaining in the stub as some percentage above or below average to estimate which cards work for you or against you, i.e. hit your opponent's range. You should also be looking for cards that hit your perceived range while likely missing your opponent's range.

After the flop, you can use these numbers to adjust the likelihood of winners/blanks/losers for your hand. The best example to illustrate this is your perceived outs when on a draw. If you see that the stub should favor your outs, then you can think of it as having extra outs. For example, if the two ranks you need to fill an open-ended straight draw are each 25% more likely to hit, then you may treat it like 10 outs instead of 8 and play accordingly.
This may help show the difference between results with Range Removal and without Range Removal using the same example from before.

The top 3 rows show:
Row 1: the percentage adjustment ((RR - Rand)/Rand).
Row 2: the % breakdown after known Card Removal only.
Row 3: the % breakdown of cards in the stub after Range Removal.



Therefore, in this case, after taking Range Removal into account, you can expect to find 31% more aces in the deck than if you had assumed a random deck.

Unfortunately for the example given (BB 3B/Folded vs BTN 2B/4B with TT) the hand was folded pre-flop so I can't give a flop example. This is because a call from villain would change the Range Removal effect since his calling range would contain more aces than his folding range.
04-19-2014 , 09:42 AM
This is a very interesting take on a very complicated problem. In essence, you are trying to estimate the range of a ~10-16 card "hand" (full ring game). This is a good hypothesis but I have some observations.

A) The model doesn't take into account player styles. E.g. if one is playing at a tight table, there might be a lot of Ax discards from EP, MP, and even LP, overestimating equity.

B) I think this model has more value for 6-max and SH poker, as the discard range is made up of a smaller number of discards.

C) There are FR situational spots where this is useful. E.g. assume a fairly old-school NL style: a player opens early, a late-position player 3-bets, and the hero holds QQ. In this bunching scenario, it is more than likely that one of these players has an ace in his hand, and the equity of hero's hand is much greater than the proverbial "coin flip".

There are other mathematical modeling tools one could use to try to solve this problem. However, they are more practical for post-mortem analysis than for heat-of-battle estimation, IMO.

I'll keep my eye on this thread.
04-19-2014 , 09:59 PM
Thanks for your interest and feedback.

Quote:
Originally Posted by Gio
This is a very interesting take on a very complicated problem. In essence, you are trying to estimate the range of a ~10-16 card "hand" (full ring game). This is a good hypothesis but I have some observations.
Actually, this is not a hypothesis. It's been solved via algorithm and more recently verified through Monte-Carlo simulation, with results that match up extremely closely. (See post #21)

Quote:
Originally Posted by Gio
A) The model doesn't take into account player styles. E.g. if one is playing at a tight table, there might be a lot of Ax discards from EP, MP, and even LP, overestimating equity.
It doesn't need to take styles into account. As with any tool, results are based on the ranges the user inputs for each player. Accuracy depends on the accuracy of those input ranges, but that has always been and always will be the case.

Quote:
Originally Posted by Gio
B) I think this model has more value for 6-max and SH poker, as the discard range is made up of a smaller number of discards.
The value is in the amount of deviation from random as it occurs. Different action patterns will produce different amounts of deviation in different regions of the deck's ranks. For example, several folds to the button will mean the deck is rich in aces. With a raise and several callers, the middle and very top of the deck may be light, while smaller broadways and small cards are heavy.

While 6-max may have tighter folding ranges, the Raise/Call ranges are wider.

FR will have wider fold ranges but tighter raise/call ranges in general. Results are cumulative across all raise/call/fold ranges. Also, for full ring, there are more ranges that can add to this effect.

Quote:
Originally Posted by Gio
C) There are FR situational spots where this is useful. E.g. assume a fairly old-school NL style: a player opens early, a late-position player 3-bets, and the hero holds QQ. In this bunching scenario, it is more than likely that one of these players has an ace in his hand, and the equity of hero's hand is much greater than the proverbial "coin flip".
As I mentioned before, while we may expect some patterns to be more exaggerated based on action, there is much information to be found in even the simplest of hands, and many patterns can reveal new information.

Quote:
Originally Posted by Gio
There are other mathematical modeling tools one could use to try to solve this problem. However, they are more practical for post-mortem analysis than for heat-of-battle estimation, IMO.
I've searched fairly extensively for other solutions and I'm unaware of any other software that can solve for this effect. Neither of my solutions was easy or obvious to find, and both involved some very big hurdles that can't be cleared through traditional methods.

However, I can't say that I've checked out every feature of every tool. If you know of any tool that can do this, please let me know. I'd be very interested in checking it out.

As far as whether it's useful at the table: I'm creating a hand simulator for training that will include range removal as one of its many features. It won't take long to learn to associate certain betting patterns with the light and heavy areas of the stub, though quantities may vary.

I've been taking some aspects of range removal into account at the table for years even before solving it. You can make some better decisions by discounting or adding to certain ranges of cards to come.

I suspect that even for some of those who "knew it was coming", that "feeling" was probably subconscious recognition of some pattern that tended to have similar outcomes. For instance, ever notice that family pots rarely hit anyone very strong, and quite often it's a check-down with any small pair or even ace high taking it down? This goes against the notion that many hands in play should produce a larger number of big hands hit. The fact is that their ranges are usually blocking each other, as range removal shows very thin values in the middle ranges.

The action patterns are fairly limited for the majority of hands, so they are easy to pick up on even if you're not consciously aware of it.

Last edited by TakenItEasy; 04-19-2014 at 10:04 PM.
08-26-2014 , 08:38 AM
Somehow I never saw this thread!
Quote:
Originally Posted by TakenItEasy
Since factorials must use integers
I don't know if it would be useful here, but are you aware of the gamma function?

Today I'll read the thread more thoroughly.

Edit: just noticed the context of what I quoted above
Quote:
A good example would be even if I know that there will be 3.872 aces in the deck on average, try to get the number of AA combos
Yeah I wonder what the gamma function would say. But maybe it's fine to just say 3.872*2.872 / 2 = 5.56 combos. (Just like with integer combos, you can just say 4*3 /2.) And for the probability, do the same thing with the decimal number of cards left in the deck.

What was the "round about" method you ended up using?

Edit #2: Cool! Using the gamma function, I get 5.56 again!
gamma[4.872] / 2 / gamma[2.872] = 5.56

Edit #3: the answers are different at the 5th decimal place. So I guess I would trust the gamma method more than the first one I showed.
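The two methods above can be checked with a short script (a Python sketch, not part of the original post). Note that the two formulas are algebraically identical, since gamma(n+1) = n*(n-1)*gamma(n-1), so any difference is floating-point noise:

```python
import math

n = 3.872  # average number of aces left in the stub
direct = n * (n - 1) / 2                                # 3.872 * 2.872 / 2
via_gamma = math.gamma(n + 1) / (2 * math.gamma(n - 1))

# Both come out to ~5.56 "AA combos".
print(round(direct, 2), round(via_gamma, 2))
```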

Last edited by heehaww; 08-26-2014 at 09:03 AM.
08-26-2014 , 11:44 AM
Quote:
Originally Posted by heehaww
Somehow I never saw this thread! I don't know if it would be useful here, but are you aware of the gamma function?

Today I'll read the thread more thoroughly.

Edit: just noticed the context of what I quoted above. Yeah, I wonder what the gamma function would say. But maybe it's fine to just say 3.872*2.872 / 2 = 5.56 combos. (Just like with integer combos, you can just say 4*3 / 2.) And for the probability, do the same thing with the decimal number of cards left in the deck.

What was the "round about" method you ended up using?

Edit #2: Cool! Using the gamma function, I get 5.56 again!
gamma[4.872] / 2 / gamma[2.872] = 5.56

Edit #3: the answers are different at the 5th decimal place. So I guess I would trust the gamma method more than the first one I showed.
Actually, I thought of doing something like that, but instead I segregate the combos into pieces where 0-4 cards are removed, leaving me with integers for each case, get the number of each combo, then recombine by weighted average, like an EV calculation.
08-26-2014 , 01:44 PM
Wow looks really interesting, excited to read through this more carefully later. Thanks for sharing.
08-28-2014 , 09:16 PM
In to read later.
08-30-2014 , 02:40 PM
Quote:
Originally Posted by heehaww
Somehow I never saw this thread! I don't know if it would be useful here, but are you aware of the gamma function?

Today I'll read the thread more thoroughly.

Edit: just noticed the context of what I quoted above. Yeah, I wonder what the gamma function would say. But maybe it's fine to just say 3.872*2.872 / 2 = 5.56 combos. (Just like with integer combos, you can just say 4*3 / 2.) And for the probability, do the same thing with the decimal number of cards left in the deck.

What was the "round about" method you ended up using?

Edit #2: Cool! Using the gamma function, I get 5.56 again!
gamma[4.872] / 2 / gamma[2.872] = 5.56

Edit #3: the answers are different at the 5th decimal place. So I guess I would trust the gamma method more than the first one I showed.
Hey HeeHaww,

I tried out the gamma function for PP combinations and it does seem to match up fairly well. If speed becomes an issue, I may use it to replace my first-order approximation. Thanks for the tip.

I think most of the error in the first-order approximation comes from the average rank count being somewhat misleading, not from the gamma function itself.

Example:
Take a simple range of: AA,KK,AKo
We can see that on average we will have 3 As and 3 Ks left in the stub after removing this range.

Estimating remaining combos from the average number per rank (1st-order approximation):
Since we get an integer, we can just use the combin function in this case to approximate how many AA combos remain:

C(3,2) = 3

i.e., on average we may approximate that there are 3 AA combos remaining. In the case of a non-integer, I could use your suggested gamma function.

However, if we break it down by villain's actual holding, we see:
25%: 2 As & 4 Ks left (villain holds AA)
25%: 4 As & 2 Ks left (villain holds KK)
50%: 3 As & 3 Ks left (villain holds AKo)

So:

25%: C(2,2) = 1 AA combo left
25%: C(4,2) = 6 AA combos left
50%: C(3,2) = 3 AA combos left

or

.25 * 1 + .25 * 6 + .5 * 3 = .25 + 1.5 + 1.5 = 3.25 AA combos,

which is more accurate.

Therefore, I'm trying to break ranges down into clusters that satisfy specific conditions as a 2nd-order solution. That lets me work more directly with integers, then take the percentages and combine them as in the above example.

It's actually not quite that simple, but I think I've got it licked; I still need to iron out some bugs.
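The weighted-average breakdown above can be brute-forced by enumerating the range explicitly (a Python sketch of the idea, not the author's actual code):

```python
from itertools import combinations

aces = ['As', 'Ah', 'Ad', 'Ac']
kings = ['Ks', 'Kh', 'Kd', 'Kc']

# The range AA, KK, AKo as explicit two-card combos (6 + 6 + 12 = 24).
rng = (list(combinations(aces, 2)) + list(combinations(kings, 2))
       + [(a, k) for a in aces for k in kings if a[1] != k[1]])

# For each possible holding, count the AA combos left in the stub,
# then average over the (equally weighted) range.
def aa_left(holding):
    n = 4 - sum(card[0] == 'A' for card in holding)
    return n * (n - 1) // 2

expected = sum(aa_left(h) for h in rng) / len(rng)
print(expected)  # 3.25, versus the first-order estimate C(3,2) = 3
```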
08-30-2014 , 10:35 PM
You're right, that's the only way to do it. Multi-card combos are not distributed linearly, so the average # aces remaining is not helpful for two-card combos (it might have some kind of meaning, but I don't know what). The avg would tell us the probability of an ace when drawing one card.

I still have to read the full thread. I'll do so tomorrow.
08-31-2014 , 07:49 AM
Quote:
Originally Posted by heehaww
You're right, that's the only way to do it. Multi-card combos are not distributed linearly, so the average # aces remaining is not helpful for two-card combos (it might have some kind of meaning, but I don't know what). The avg would tell us the probability of an ace when drawing one card.

I still have to read the full thread. I'll do so tomorrow.
Thanks, I appreciate your checking it out.

Bear in mind when reading this thread that I started out using the term "Card Removal". However, to avoid confusion with "known card removal", I have since changed the term to "Range Removal".

I've made much progress since then and would say the thread mostly talks about my first order approximation which I completed some time ago.

I hope you'll understand that I'll need to limit my discussion of the methods behind RR calculations, at least for now, though I wish that didn't need to be the case. This is because I'm currently designing a tool with Range Removal as one of its main features.

The tool includes a pre-flop bot engine that can generate unlimited flops complete with action and ranges. This will allow you to associate pre-flop betting patterns with card distribution and combo distribution heat maps, as well as show differences in probabilities that arise, etc.

I'll be happy to discuss some effects that show up as well as run simulations for other theory threads when applicable, though I can only run 1st order predictions for now.

First order is accurate at finding the average number of each rank left in the stub. This will yield much improved accuracy when calculating the odds of improving your hand on any street, compared to the random-stub (known card removal only) assumption that most tools use.

Limitations of first-order calculations:
As we discussed, it's more of a rough calculation for combos. This makes sense, since estimating for a 52-card deck is easier than for 1326 combos. However, much of the error washes out, and it's a decent approximation that shows the overall patterns in the combo grid.

It also doesn't show discrepancies between suited and OS combos, treating them as paired and unpaired instead.

I would average any error back into the results for each range to ensure the total combos were correct for each range. For example:

C(50,2)=1225, C(48,2)=1128, C(46,2)=1035, etc.

I did this so that error would not accumulate over multiple ranges.
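That renormalization step might look something like this (a Python sketch; the hand-class names and counts are made-up illustrations, not the author's data):

```python
import math

# Rescale approximate per-class combo counts so their total matches the
# exact number of two-card combos in the stub, e.g. C(50, 2) = 1225.
def renormalize(approx_counts, stub_size):
    exact_total = math.comb(stub_size, 2)
    scale = exact_total / sum(approx_counts.values())
    return {hand: n * scale for hand, n in approx_counts.items()}

approx = {'AA': 3.1, 'AK': 12.4, 'other': 1205.0}  # hypothetical estimates
fixed = renormalize(approx, 50)
# sum(fixed.values()) now equals 1225 (up to floating point)
```

Spreading the correction proportionally across all classes keeps the error from piling up as more ranges are removed.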

For combos, it's still far better than assuming a random distribution of unknown cards, but I hope to achieve far better resolution, or even zero error, with 2nd-order calculations, removing all of the limitations of the first-order approximation.

I'll still be using first order when speed is an issue.




Above is where I calculate for Known Card Removal, i.e. Hero's hand, board, dead cards.



By comparison, this is the work area for Range Removal. The top-left corner is the card removal grid. First order requires 3 columns of grids per range; second order requires 15 columns of grids per range so far.
08-31-2014 , 11:14 AM
I think I'm up to speed now. Basically you're creating an equity tool that factors in range removal, and also lets you explore a set of built-in examples so that the user can gain a sense of the typical effect of range removal. Great idea, I'm surprised it hasn't been thought of before.
Quote:
Originally Posted by TakenItEasy
For combos, It's still far better than assuming a random distribution of unknown cards but I hope to achieve far better resolution or even 0 error with 2ND order calculations removing all of the limitations of first order approximations.
I gather that "second order" refers to use of the density function like you showed in your last post. In that case, yes, it ought to have 0 error unless you're forced to take time-saving shortcuts. I imagine some of the calculations might be slow.
Quote:
I'll still be using first order when speed is an issue.
If it's for speed, then never mind gamma. I don't know what you did, but the first thing I showed might be the fastest way (and gets basically the same answer as gamma). Restated below:

If there are N aces, and you want the AA combos, take N*(N-1) / 2
If you want Ax combos, and there are M non-aces, take N*M.
For Axx on the flop, take N*M(M-1) / 2
For AAA on the flop, take N(N-1)(N-2) / 6

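Those four formulas are trivial to express as code (a sketch with hypothetical function names; they accept the fractional rank counts discussed earlier in the thread):

```python
def aa_combos(n):        # AA combos from N aces: N*(N-1)/2
    return n * (n - 1) / 2

def ax_combos(n, m):     # Ax combos from N aces and M non-aces: N*M
    return n * m

def axx_flops(n, m):     # Axx flops (one ace): N * C(M,2)
    return n * m * (m - 1) / 2

def aaa_flops(n):        # AAA flops: C(N,3)
    return n * (n - 1) * (n - 2) / 6

# With the fractional ace count from earlier in the thread:
print(round(aa_combos(3.872), 2))  # 5.56
```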
Maybe your tool can have 2 settings to choose from: exact (2nd-order) and approximate (1st-order). Just like pokerstove has "enumerate all" vs "monte carlo". If the user thinks 2nd-order is taking too long, he can switch.

Will your tool have turn and river calculations too? For just the turn or just the river, the average is all you need since it's a one-card combo. 0 error, and fast.

Keep up the good work!
08-31-2014 , 05:43 PM
Thanks for this:
Quote:
Originally Posted by heehaww
If there are N aces, and you want the AA combos, take N*(N-1) / 2
If you want Ax combos, and there are M non-aces, take N*M.
For Axx on the flop, take N*M(M-1) / 2
For AAA on the flop, take N(N-1)(N-2) / 6
For the PPs I'll definitely be trying this out, thanks.
For the unpaired hands, that's currently what I use for first order approximation.

For the flop, currently I use an RNG to deal hands and board.

Table consists of programmable pre-flop bots and one is designated as the hero with hand shown.

Bots' actions are based on raise/call range tables (fold if neither qualifies), with rules assigned that determine which range to use. Everything can be changed manually to customize a bot, which can then be named and saved. Perhaps a feature could be used to program a bot based on a given player's stats.
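That decision rule is simple to sketch (Python; the ranges and function name here are hypothetical, just illustrating the raise-table-first lookup the post describes):

```python
# Check the raise range first, then the call range; fold if neither qualifies.
def bot_action(hand, raise_range, call_range):
    if hand in raise_range:
        return 'raise'
    if hand in call_range:
        return 'call'
    return 'fold'

# Hypothetical tight button ranges:
btn_raise = {'AA', 'KK', 'QQ', 'AKs', 'AKo'}
btn_call = {'JJ', 'TT', 'AQs'}
print(bot_action('TT', btn_raise, btn_call))  # call
```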

Currently I have defaults set based on Poker Strategy recommendations for loose, standard, or tight play.

The table can be adjusted for number of players, stack sizes, and hero position. You can freeze hero's position or let the button move each hand.

You can plug in a specific hand to review. Perhaps just loading a hand history may automate this but you would probably want to tweak some ranges for villains. All ranges are displayed for all last actions including folding ranges.

Or hands can be dealt automatically in auto-play or batch mode.

In auto-play mode you can let it deal until hero sees a flop or until hero and at least one villain hit the board with certain criteria for draws/made hands.

Hands will run with graphics turned on and you can set the speed it will replay. This will give the user a sense of frequency for the situations they finally analyse.

From there you can change hero to be any player to get different perspectives, modify stacks, or change position.

You can freeze the scenario and deal new random flops, constrained to fit all current ranges, positions, and previously set criteria in a random-like fashion. This will be more realistic in terms of frequencies of occurrence, and keeping it as real as possible will, I feel, be important to avoid introducing bias.

Or you can just allow it to deal to the next flop scenario facing a new villain.

In batch mode, no hero will be assigned so that flops can be generated more quickly for any 2 or more players and no graphics. Hero can be picked afterwards.

Also I may have it only save flops with at least 2 hands meeting some strength/draw criteria. This way 5 saved flops may be representative of a hundred hands played.

I may set up smart filters so you can filter through the saved flops and view flops that meet certain criteria or just click through them.

Hands will all be saved in a session DB which includes all the table and bot setup information.

I also should be able to create a Monte-Carlo simulation for equity calculations w/wo RR.

Another cool factor is that I think it should be possible to show flop equity for pre-flop hands by using bot play.

Quote:
Originally Posted by heehaww
Maybe your tool can have 2 settings to choose from: exact (2nd-order) and approximate (1st-order). Just like pokerstove has "enumerate all" vs "monte carlo". If the user thinks 2nd-order is taking too long, he can switch.
Yeah, I think it will default to first order so you can run through hands quickly, then once you see a flop worth looking at you can run 2nd order as well as all the probability calcs.

Quote:
Originally Posted by heehaww
Will your tool have turn and river calculations too? For just the turn or just the river, the average is all you need since it's a one-card combo. 0 error, and fast.
Agreed. When a flop is displayed, all probability calculations will run w/wo RR.

I'll be showing both odds to improve for hero's hand and a breakdown of villains range with the range displayed in a combo grid and pie charts to show relative frequencies.
Quote:
Originally Posted by heehaww
Keep up the good work!
Thanks!
09-01-2014 , 05:24 PM
This seems like cool work, but it also seems like you would have to have a very good idea of what "optimal" opening ranges look like before you started taking any conclusion that is based on them somewhat seriously. With that said, I could see it being useful for some specific situations, since you can prune out dominated pre-flop strategies to get fairly useful results.
09-01-2014 , 05:28 PM
Quote:
Originally Posted by heehaww
I think I'm up to speed now. Basically you're creating an equity tool that factors in range removal, and also lets you explore a set of built-in examples so that the user can gain a sense of the typical effect of range removal. Great idea, I'm surprised it hasn't been thought of before.
It has, just not to this extent. All of the equity tools out there (that I use, at least) already factor in range removal against your hand. The coolest part seems to be the potential this program has to "train" you and improve your real-time performance.

@TakenItEasy: Is the software for sale yet?
09-02-2014 , 02:01 AM
Quote:
Originally Posted by MT656
This seems like cool work, but it also seems like you would have to have a very good idea of what "optimal" opening ranges look like before you started taking any conclusion that is based on them somewhat seriously. With that said, I could see it being useful for some specific situations, since you can prune out dominated pre-flop strategies to get fairly useful results.
Results are independent of ranges being theoretically optimal, but if by that you mean knowing players' ranges, that's true to a degree.

However, this is always going to be the case when analyzing a hand. For example, EV calculations are based on presumed knowledge of a villain's range.

However, it will provide a reasonably good basis for seeing where the deck may be heavy or thin, with quantifiable results for the first time. You can substitute in various ranges to see how much the results change.

Certainly when ranges align in a single direction, results are going to be much more stable. Example: 7 open folds to the button or blinds, or 1 raise with several calls.

Often the action cancels out range effects, such as with some ratio of raises to folds, but again, you'll be able to see where this holds true as well. Even then, I believe greater accuracy should be possible from effectively shortening the stub. E.g., if the results end up flat, with all ranks reduced equally, I think just reducing the deck size will show a larger difference between turn and river odds, though I may be over-thinking this.

Quote:
Originally Posted by MT656
It has, just not to this extent. All of the equity tools out there (that I use, at least) already factor in range removal against your hand.
You may be confusing Card Removal of known cards with Range Removal. While calculators will first remove known cards before calculating, and Monte-Carlo sims do this by default, I'm pretty sure all other tools assume a random stub after that.

Quote:
Originally Posted by MT656
The coolest part seems to be the potential that this program has to "train" you and improve your real-time performance.

@TakenItEasy: Is the software for sale yet?
Thanks! Unfortunately, it could be a while.

Solving RR required using Excel. While I've made much progress developing a tool in Excel with VBA as a proof of concept, a user would need to own Excel first, creating a huge cost disadvantage.

I will need to convert the algorithms to another language to create a standalone executable, and I'd need to learn that language on the fly, as I did when first learning Excel and VBA. BTW, Excel is a much deeper application than I first thought. Besides ancient languages such as Fortran and Pascal, my only other language is AHK.

I'm a quick study and accustomed to being something of an autodidact. But even so, I'm sure a professional programmer with good experience could achieve this much faster.

Hiring a programmer could be an option but I would need to avoid payment out of pocket as much as possible and find some other equitable terms.

As an intermediate option, I'm also looking into an HTML conversion solution for a website for first order Range Removal calculations only. Even so, it may be too large a project for this to be practical.

I'd be happy to hear any suggestions towards providing a marketable tool sooner from any experts on either the business or technical side. I do foresee Range Removal as being something of a game changer.

I've considered a higher-priced model that keeps it in Excel format. This would enable a very large and flexible add-on market. Excel can be a powerful platform for creating something like a more adaptive research tool, or perhaps even a more interactive bot platform, which could be useful for recognizing localized maxima/minima scenarios where pre-defined constraints create unforeseen issues.

However, this could make it too powerful, perhaps even damaging the already fragile poker economy by enabling bots that could better exploit online poker.
09-02-2014 , 05:22 PM
Quote:
Originally Posted by TakenItEasy
Solving RR required using Excel. While I've made much progress developing a tool in Excel with VBA as a proof of concept, a user would need to own Excel first, creating a huge cost disadvantage.

If you do it with OpenOffice Calc (their spreadsheet thing), you won't have any of that cost disadvantage because OpenOffice is free...
09-03-2014 , 12:19 AM
Quote:
Originally Posted by MT656
Solving RR required using Excel. While I've made much progress developing a tool in Excel with VBA as a proof of concept, a user would need to own Excel first, creating a huge cost disadvantage.

If you do it with OpenOffice Calc (their spreadsheet thing), you won't have any of that cost disadvantage because OpenOffice is free...
Thanks for the tip, I'll look into it.
03-08-2015 , 11:49 PM
Conclusions?
03-10-2015 , 03:40 AM
Quote:
Originally Posted by Benni19
Conclusions?
Thanks for taking an interest. I would love to get back to this project but haven't been able to.

Unfortunately, I've been having problems with persistent hackers for the last 6 months and haven't been able to work on any projects on my PC. I'd explain more, but it's a long discussion and I don't want to derail this thread.

I may post about it in the computer tech forum.
