Hey guys, I'm trying to write a program that tracks a gambler's luck on a standard-deviation scale over time, similar to the picture here: the player's actual performance is superimposed on bands showing the expected range of luck.
I'm stuck on one part though: for a gambler who uses many different bet sizes or odds (e.g. betting at 2:1, 5:1, 1:5, and so forth), how do you combine these Bernoulli processes? Is it as simple as taking a weighted average of everything, or is there something else to it?
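For what it's worth, here's one way I've been thinking about framing it (a minimal sketch, not a definitive answer): since the bets are independent, you can sum each bet's mean and variance of net profit rather than averaging anything, then express the actual total as a z-score. The function names (`bet_stats`, `luck_in_sd`) and the `(stake, odds, p_win)` parameterization are just my own invention for illustration.

```python
import math

def bet_stats(stake, odds, p_win):
    """Mean and variance of the net profit of a single bet.

    A win pays stake * odds in profit; a loss forfeits the stake.
    """
    win = stake * odds
    lose = -stake
    mean = p_win * win + (1 - p_win) * lose
    # Var(X) = E[X^2] - E[X]^2 for a two-outcome (Bernoulli-style) bet
    var = p_win * win**2 + (1 - p_win) * lose**2 - mean**2
    return mean, var

def luck_in_sd(results, bets):
    """How many standard deviations the gambler's total is from expectation.

    results: list of actual net profits, one per bet
    bets:    list of (stake, odds, p_win) tuples, one per bet
    """
    total_mean = sum(bet_stats(*b)[0] for b in bets)
    total_var = sum(bet_stats(*b)[1] for b in bets)
    actual = sum(results)
    return (actual - total_mean) / math.sqrt(total_var)
```

For example, four even-money coin flips (stake 1, odds 1:1, p = 0.5) each have mean 0 and variance 1; winning all four gives a total of +4 against a standard deviation of 2, i.e. a +2 sigma run. Whether this is the "right" model for your chart I'm not sure, but it at least avoids the question of weighting, since sums of independent variances need no weights.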