Quote:
Originally Posted by Calhoun137
Ok here's the best way to think about it.
Let's say we are flipping coins, and each time it lands heads you give me a dollar, and each time it's tails I give you a dollar.
Then let's keep track of how much money I have made or lost. On average I expect to make nothing, but in practice I will most likely lose or win a finite amount. In fact if we flip the coin N times, then I am most likely to win or lose somewhere around the square root of N dollars. Why is this true? It's actually easy to show.
Let's call my profit/loss M. Then M starts at zero, and the average change in M after one or more flips is zero.
Here is the clever trick: let's consider the average change in M squared. Well after one flip it's quite clear that this will be 1, since +1 and -1 squared are both equal to 1.
Notice that after one flip the average profit/loss is zero, but in reality I either gained or lost a dollar. M squared however is 1, so you can see it sort of measures the fluctuations in my profit/loss.
Call M(N) my profit/loss after N flips. Then what is M(N+1) squared? Since I can either gain or lose 1 dollar, we have either
M(N+1) = M(N) + 1 or M(N+1) = M(N) - 1
Let's square these two expressions and take the average, which I write as <M(N+1)^2>. Expanding, (M(N)+1)^2 = M(N)^2 + 2M(N) + 1 and (M(N)-1)^2 = M(N)^2 - 2M(N) + 1; the 2M(N) cross terms cancel when you average the two equally likely cases, so:
<M(N+1)^2> = <M(N)^2> + 1
Since M(0) = 0, this implies that after N flips the average of M squared will be N; therefore the square root of the average of M squared is root N, which is just a fancy way of saying the SD for flipping a coin N times is root N.
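You can check this claim numerically. Here's a quick simulation sketch (not from the original post; the trial counts are just illustrative choices) that averages M^2 over many runs of N flips and compares it to N:

```python
import random

def simulate(n_flips, n_trials, rng=random.Random(0)):
    """Estimate <M(N)^2> by averaging M^2 over many runs of N coin flips."""
    total_m2 = 0
    for _ in range(n_trials):
        m = 0  # profit/loss starts at zero
        for _ in range(n_flips):
            m += 1 if rng.random() < 0.5 else -1  # heads +$1, tails -$1
        total_m2 += m * m
    return total_m2 / n_trials

# The average of M^2 after 100 flips should come out close to 100,
# so the typical swing (the SD) is close to root 100 = 10 dollars.
print(simulate(100, 20000))
```

The estimate lands within a dollar or two of N, which is the "average of M squared equals N" claim above.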
(to answer your question about the "bell curve": the bell curve is called a Gaussian and it's really special; its SD is roughly half the width of the curve at half its maximum height (more precisely, the half-width at half maximum is about 1.18 SD). See the central limit theorem: for coin flips, as N becomes super large, the probability distribution of M looks like a bell curve)
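One hallmark of the bell curve is that about 68% of outcomes fall within one SD of the mean. A small simulation sketch (my own illustration, with arbitrary trial counts) shows coin flips behave the same way:

```python
import math
import random

def fraction_within_one_sd(n_flips=400, n_trials=10000, rng=random.Random(1)):
    """Fraction of runs whose final profit/loss M lands within one SD of zero.

    The SD after n_flips fair coin flips is sqrt(n_flips), per the
    derivation above.
    """
    sd = math.sqrt(n_flips)
    hits = 0
    for _ in range(n_trials):
        m = sum(1 if rng.random() < 0.5 else -1 for _ in range(n_flips))
        if abs(m) <= sd:
            hits += 1
    return hits / n_trials

print(fraction_within_one_sd())
```

The fraction comes out close to the Gaussian's 68% (slightly above it here, since M only takes discrete values), which is the central limit theorem at work.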
This is pretty awesome, ty.
So am I wrong to think of SD like this -
good luck = 0
bad luck = 1
over an infinite number of hands there would be equal numbers of 0's and 1's
and it would run out like - 01010101010101010101 (long term)
So if I'm running bad in a cash game the numbers would run like -
111111101111011110111 (short term)
If I'm running good the numbers would run like -
0000000101010000000101000000
But over a lifetime of hands played the good luck and bad luck equal out?
I'm not very academic so I'm trying to translate it to what I think in as easy terms as possible.
This is great too because I think people who understand standard deviation
come to terms with variance a lot easier.