Letter to FTP: Running bad for 50,000 sngs

09-24-2010 , 05:18 PM
lol, op is complaining about 300 buy-ins below EV over 50k SNGs... try 300 buy-ins below EV over 9,000 SNGs at 20x your avg stake
09-24-2010 , 05:22 PM
Quote:
Originally Posted by SayGN
Here it is with the X-Axis displayed in dollars:

[graph: results with the x-axis in dollars]
i mean, seriously? didn't you take algebra 1 at some point?
09-24-2010 , 05:36 PM
clearly i know the difference between the x-axis and the y-axis, and i made a mistake... i also had at least one typo in my OP ("tinfoil hate"), which was also pointed out in this thread.
09-24-2010 , 05:37 PM
In probability theory and statistics, the variance is used as one of several descriptors of a distribution. It describes how far values lie from the mean. In particular, the variance is one of the moments of a distribution. In that context, it forms part of a systematic approach to distinguishing between probability distributions. While other such approaches have been developed, those based on moments are advantageous in terms of mathematical and computational simplicity.
The variance is a parameter describing a theoretical probability distribution, while a sample of data from such a distribution can be used to construct an estimate of this variance: in the simplest cases this estimate can be the sample variance.

The variance of a random variable or distribution is the expectation, or mean, of the squared deviation of that variable from its expected value or mean. Thus the variance is a measure of the amount of variation within the values of that variable, taking account of all possible values and their probabilities or weightings (not just the extremes, which give the range). For example, a perfect die, when thrown, has expected value (1 + 2 + 3 + 4 + 5 + 6)/6 = 3.5 and expected absolute deviation 1.5 (the mean of the equally likely absolute deviations |1 − 3.5|, |2 − 3.5|, |3 − 3.5|, |4 − 3.5|, |5 − 3.5|, |6 − 3.5|, i.e. 2.5, 1.5, 0.5, 0.5, 1.5, 2.5), but expected squared deviation, or variance, of 17.5/6 ≈ 2.9 (the mean of the equally likely squared deviations 2.5², 1.5², 0.5², 0.5², 1.5², 2.5²).
As another example, if a coin is tossed twice, the number of heads is 0 with probability 0.25, 1 with probability 0.5, and 2 with probability 0.25. Thus the variance is 0.25 × (0 − 1)² + 0.5 × (1 − 1)² + 0.25 × (2 − 1)² = 0.25 + 0 + 0.25 = 0.5. (Note that in this case, where tosses of coins are independent, the variance is additive, i.e., if the coin is tossed n times, the variance of the number of heads will be 0.25n.)
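If anyone wants to sanity-check the arithmetic above, here is a minimal Python sketch (my own illustration, not part of the original text) that reproduces the die and coin numbers by direct enumeration:

Code:
from itertools import product

# Fair die: expected value, mean absolute deviation, variance.
faces = [1, 2, 3, 4, 5, 6]
mean = sum(faces) / 6                           # 3.5
mad = sum(abs(x - mean) for x in faces) / 6     # 1.5
var = sum((x - mean) ** 2 for x in faces) / 6   # 17.5/6 ≈ 2.92
std = var ** 0.5                                # ≈ 1.71 (see next paragraph)

# Two coin tosses: heads count is 0, 1 or 2; all 4 outcomes equally likely.
counts = [sum(o) for o in product([0, 1], repeat=2)]
m = sum(counts) / 4                             # 1.0
var2 = sum((c - m) ** 2 for c in counts) / 4    # 0.5 = 2 * 0.25

print(mean, mad, var, std, var2)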
Unlike expected deviation, the variance of a variable has units that are the square of the units of the variable itself. For example, a variable measured in inches will have a variance measured in square inches. For this reason, describing data sets via their standard deviation or root mean square deviation is often preferred over variance. In the dice example the standard deviation is √(17.5/6) ≈ 1.7, slightly larger than the expected deviation of 1.5.
The standard deviation and the expected deviation can both be used as an indicator of the "spread" of a distribution. The standard deviation is more amenable to algebraic manipulation, and, together with variance and its generalization covariance, is used frequently in theoretical statistics; however the expected deviation tends to be more robust as it is less sensitive to outliers arising from measurement anomalies or an unduly heavy-tailed distribution.
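To make the robustness point concrete, here is a small made-up example (hypothetical numbers, not from the text): a single measurement anomaly inflates the standard deviation far more than the mean absolute deviation.

Code:
def mad(xs):
    m = sum(xs) / len(xs)
    return sum(abs(x - m) for x in xs) / len(xs)

def std(xs):
    m = sum(xs) / len(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

dirty = [10.0] * 99 + [1000.0]   # one outlier among 100 measurements
print(mad(dirty))                # ≈ 19.6
print(std(dirty))                # ≈ 98.5, dominated by the single outlier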
Real-world distributions such as the distribution of yesterday’s rain throughout the day are typically not fully known, unlike the behavior of perfect dice or an ideal distribution such as the normal distribution, because it is impractical to account for every raindrop. Instead one estimates the mean and variance of the whole distribution as the computed mean and variance of n samples drawn suitably randomly from the whole sample space, in this example yesterday’s rainfall.
This method of estimation is close to optimal, with the caveat that it underestimates the variance by a factor of (n − 1)/n (when n = 1 the variance of a single sample is obviously zero regardless of the true variance), a bias which should be corrected for when n is small. If the mean is determined in some way other than from the same samples used to estimate the variance, then this bias does not arise and the variance can safely be estimated as that of the samples.
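The (n − 1)/n bias is easy to see by simulation. The sketch below (an assumed setup, purely for illustration) repeatedly draws n = 5 samples from Uniform(0, 1), whose true variance is 1/12 ≈ 0.0833, and averages the naive variance estimate:

Code:
import random

random.seed(1)
n, trials = 5, 200_000
total = 0.0
for _ in range(trials):
    xs = [random.random() for _ in range(n)]
    m = sum(xs) / n                              # mean from the same samples
    total += sum((x - m) ** 2 for x in xs) / n   # naive: divide by n
print(total / trials)                            # ≈ 0.0667 = (1/12) * (n-1)/n
# Dividing by n - 1 instead of n (Bessel's correction) removes the bias.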
The variance of a real-valued random variable is its second central moment, and it also happens to be its second cumulant. Just as some distributions do not have a mean, some do not have a variance. The mean exists whenever the variance exists, but not vice versa.

Suppose that the observations can be partitioned into equal-sized subgroups according to some second variable. Then the variance of the total group is equal to the mean of the variances of the subgroups plus the variance of the means of the subgroups. This property is known as variance decomposition or the law of total variance and plays an important role in the analysis of variance. For example, suppose that a group consists of a subgroup of men and an equally large subgroup of women. Suppose that the men have a mean body length of 180 and that the variance of their lengths is 100. Suppose that the women have a mean length of 160 and that the variance of their lengths is 50. Then the mean of the variances is (100 + 50) / 2 = 75; the variance of the means is the variance of 180, 160 which is 100. Then, for the total group of men and women combined, the variance of the body lengths will be 75 + 100 = 175. Note that this uses N for the denominator instead of N − 1.
In a more general case, if the subgroups have unequal sizes, then they must be weighted proportionally to their size in the computations of the means and variances. The formula is also valid with more than two groups, and even if the grouping variable is continuous.
This formula implies that the variance of the total group cannot be smaller than the mean of the variances of the subgroups. Note, however, that the total variance is not necessarily larger than the variances of the subgroups. In the above example, when the subgroups are analyzed separately, the variance is influenced only by the man-man differences and the woman-woman differences. If the two groups are combined, however, then the men-women differences enter into the variance also.
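For a numeric check of the decomposition, here is a short sketch with toy data of my own, chosen to match the stated means and variances (using the population convention of dividing by N, as the text notes):

Code:
def pvar(xs):                                # population variance: denominator N
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

men = [170.0, 190.0]                         # mean 180, variance 100
women = [160 - 50 ** 0.5, 160 + 50 ** 0.5]   # mean 160, variance 50

mean_of_vars = (pvar(men) + pvar(women)) / 2     # 75
var_of_means = pvar([180.0, 160.0])              # 100
print(pvar(men + women))                         # 175
print(mean_of_vars + var_of_means)               # 175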
09-06-2011 , 04:15 PM
bump. response yet?
09-06-2011 , 05:18 PM
Quote:
Originally Posted by metsfan88
In probability theory and statistics, the variance is used as one of several descriptors of a distribution. It describes how far values lie from the mean. [...]
this