Quote:
Originally Posted by stinkypete
It's not only an assumption, but it's clearly a wrong assumption.
If it's clearly not uniform, then what is it? Which bias is more likely than others?
Quote:
There's a huge difference between having no information and knowing something is a uniform distribution, and treating them equally is mathematically incorrect
I fail to see a difference, and this is my intuition speaking, not my limited math background. When I say it's uniform, I'm saying I literally have no clue what the % is, and therefore, for all I know, each value is equally likely. To claim it's anything other than uniform is to claim you have information. You wouldn't say an X% bias is more likely than a Y% bias unless you had info not provided in the OP.
Quote:
It looks like the definitive paper was written on this the same year I got my master's.
Care to link or cite? Or summarize?
Quote:
Originally Posted by heehaww
A little bird has told me that besides uniform, one might instead use the Jeffreys prior.
Quote:
Originally Posted by stinkypete
I don't know what that means (and a quick google didn't clear it up with minimal effort)
The Beta(1/2, 1/2) distribution, whereas Uniform is Beta(1,1). I don't see why one would be allowed to use anything other than Uniform for this, but said bird is much more knowledgeable than I am. Perhaps the existence of more than one choice supports your argument that it's subjective. However, I still contend that not using Bayes would not avoid subjectivity.
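For the curious, the practical difference between the two priors is small and easy to see: with a Beta(a, b) prior and h heads / t tails observed, the posterior is Beta(h+a, t+b) and its mean is (h+a)/(h+t+a+b). A minimal sketch (the function name and the 7-heads/3-tails data are just my example, not from the thread):

```python
def posterior_mean(h, t, a, b):
    """Mean of the Beta(h + a, t + b) posterior for the heads probability,
    given h heads and t tails observed under a Beta(a, b) prior."""
    return (h + a) / (h + t + a + b)

# Uniform prior is Beta(1, 1); Jeffreys is Beta(1/2, 1/2).
h, t = 7, 3  # hypothetical data: 7 heads, 3 tails
print(posterior_mean(h, t, 1.0, 1.0))  # uniform:  (7 + 1)   / 12
print(posterior_mean(h, t, 0.5, 0.5))  # Jeffreys: (7 + 0.5) / 11
```

Both priors wash out as h+t grows; they only meaningfully disagree on the first few flips.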
Quote:
Originally Posted by robert_utk
if we started with .75 bias, would our potential profit be higher, given that we only get 10 trials at this?
You mean start with a pure guess? Your potential profit would be higher (by virtue of making a real bet on the 1st flip), but your EV wouldn't be, and definitely not your EG (expected growth).
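To make the EG point concrete: under a symmetric prior the first flip is effectively 50/50, so any nonzero bet on a pure guess has zero EV but strictly negative expected log growth. A quick sketch (function name is mine):

```python
import math

def expected_log_growth(f, p=0.5):
    """Expected log of the bankroll multiplier when betting fraction f
    of bankroll at even money with win probability p."""
    return p * math.log(1 + f) + (1 - p) * math.log(1 - f)

# At p = 0.5 the EV of the bet is zero, but expected growth is negative
# for every nonzero fraction, and it gets worse the more you risk:
for f in (0.1, 0.5, 0.75):
    print(f, expected_log_growth(f))
```

That's why guessing a 0.75 bias on flip 1 raises your best-case profit but hurts your growth rate.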
Quote:
Originally Posted by jukofyork
f = (h-t)/(2+h+t)
Nifty, nice work
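For anyone wondering where that comes from: with a uniform prior, the posterior mean of the bias after h heads and t tails is (h+1)/(h+t+2), and the even-money Kelly fraction is 2p − 1 (since E[log growth] is linear in p, plugging in the posterior mean is enough). That simplifies to exactly jukofyork's formula. A quick sanity check (function name is mine):

```python
from fractions import Fraction

def kelly_uniform_prior(h, t):
    """Even-money Kelly fraction using the posterior mean of the bias
    under a uniform (Beta(1,1)) prior: f* = 2*E[p] - 1."""
    p = Fraction(h + 1, h + t + 2)  # posterior mean of heads probability
    return 2 * p - 1                # algebraically (h - t) / (2 + h + t)

# Agrees with f = (h - t) / (2 + h + t) for any history:
for h, t in [(0, 0), (3, 1), (5, 5), (8, 2)]:
    assert kelly_uniform_prior(h, t) == Fraction(h - t, 2 + h + t)
```

Note it sensibly bets 0 before any flips (h = t = 0) and whenever heads and tails are tied.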
Quote:
Originally Posted by JoeC2012
Kelly serves as a lower bound for how much you should bet, but my revised answer is that you should bet something between Kelly and 100%, skewing more toward 100% as you get close to the end of the game.
Ralph Vince has written about exactly this. I don't know of any links or free material covering it, so sometime I'll show what his formula yields for this problem. He called it EACG (expected avg compound growth?) because he understandably didn't want to name it the "Vince Criterion". When n=1, Vince = 100%; as n→∞, Vince → Kelly.

I forget whether you decrement n after each completed bet, and that's an important detail, so hopefully it's in my notes, because I don't have his book with me in my travels. I'm guessing you don't decrement n; otherwise the formula would imply that your utility function becomes more linear after each completed bet, which to me doesn't make sense. If I'm 70 years old and about to make what I know is the last bet of my life, and my bankroll is 1M, I'm probably not risking 100% of it on a 55-45 coinflip. Losing that 1M probably impacts me more than doubling up would, unless there's something specific I want to buy costing closer to 2M.
In reality, since we're gamblers on a gambling forum, it's not like these 10 bets will be the only gambling/investing we do for the rest of our lives. It wouldn't be good to Vince-bet as though n=10 if in fact n=1000.
Also, like a few people ITT have said, $20 is a small fraction of our true bankroll IRL, so if we're limited to a $20 stake for this game, then the growth-optimal strat is identical to the max-EV strat until maybe the later bets.
Quote:
Originally Posted by jukofyork
Just been thinking about this some more and I wonder if it's really just a question of your own personal "utility of money" function [...] in which case there really is no correct answer; just a continuum of reasonable/sensible answers.
Agree