IDK if mods will let me keep this here or not, but I want you guys' opinions, and even though I haven't posted much here lately, other than BFI this is my home forum. Anyway, I am cross-posting my BFI post here
https://forumserver.twoplustwo.com/3...esign-1727320/:
IDK if this is the first time I have mentioned this or not, but I am building a mobile app for poker players that is primarily a "poker tracker," and am lucky to be able to work on it as a final project in my "Mobile Application Development Class."
I have built a lot of the app already and am pretty happy with the design. But I do have one area that is nagging me, and I am spread too thin to focus on it. So I thought I'd try to group-think it here and maybe in the PLO forum.
Anyway, one of the tabs is a "reports" tab, and one of the things we are doing is a histogram with count on the Y-axis and $ (or currency, or big blinds) on the X-axis.
So say you have a max win of $1000 and a max loss of $1000 with 100 entries. Picture a bell curve made of histogram bars that each span maybe $200. The first bar would be as tall as the number of sessions you have played that fall between -$1000 & -$800, the next between -$800 & -$600, etc.
We will probably put 1 and 2 standard deviation bars on there too.
The problem I have is I need a mathematical model (or at least a good idea) of how to create these groupings for the histogram.
So like obviously (I think) if the user has only made three entries, each histogram bar would be 1 count tall and would just be the net from each session.
But what is a good approach to divide a range (from max loss to max win) into the right (or ideal) groups? I feel like if they have entered 100 sessions there should be like 10 - 20 "groupings." So with max win = $1000 & max loss = -$1000, each histogram bar would represent a $100 or $200 range and be as tall as the number of sessions that fell within those values, e.g. -$1000 to -$800, etc.
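For what it's worth, statisticians have stock answers for this exact question (Sturges' rule, the Rice rule, Freedman-Diaconis, etc.). The Rice rule, k = ceil(2 * n^(1/3)), happens to land right in the ballpark described above: 100 sessions gives 10 bins, and 3 sessions gives 3 bins. Here's a rough sketch of the idea in Python (the `histogram_bins` helper and its shape are my own invention, just to illustrate the math, not anything from the app):

```python
import math

def histogram_bins(sessions):
    """Group session results into equal-width bins using the Rice rule.

    Returns a list of (bin_low, bin_high, count) tuples.
    """
    n = len(sessions)
    if n == 0:
        return []
    lo, hi = min(sessions), max(sessions)
    if lo == hi:
        # All sessions identical: one degenerate bin.
        return [(lo, hi, n)]
    # Rice rule: number of bins grows with the cube root of the sample size.
    k = math.ceil(2 * n ** (1.0 / 3.0))
    width = (hi - lo) / k
    counts = [0] * k
    for s in sessions:
        # Clamp so the max value falls into the last bin instead of index k.
        idx = min(int((s - lo) / width), k - 1)
        counts[idx] += 1
    return [(lo + i * width, lo + (i + 1) * width, counts[i]) for i in range(k)]

# 100 sessions -> ceil(2 * 100^(1/3)) = 10 bins; 3 sessions -> 3 bins.
```

You could also round each bin edge to a "nice" dollar amount ($50/$100/$200) after computing the raw width, so the axis labels read cleanly.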