Quote:
Quote:
variance: i still don't know what that means
I think you're staring it in the eyes
The variance of a list of numbers expresses how large the differences between the list elements are. It can be defined in several ways, for example by the following algorithm: compute the difference between every possible pair of numbers (pairing every number with every number in the list, including itself); square the differences; compute the mean of these squares; divide the result by 2. The resulting value is the variance. The squaring is done to treat negative and positive differences alike - they need to add up, rather than cancel each other out. In principle, this could also be done by taking absolute values (i.e., just dropping the signs), but squaring is more convenient for mathematicians, because the square function is differentiable for all real numbers, while the absolute value is not differentiable at zero.
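To make the recipe concrete, here is a minimal Python sketch of this first definition (the function name is my own, and it uses the all-pairs convention described above, where each number is also paired with itself):

Code:
def variance_pairwise(numbers):
    # Square the difference of every pair (each number paired with every
    # number in the list, including itself), average the squares, and
    # halve the result.
    squared_diffs = [(a - b) ** 2 for a in numbers for b in numbers]
    return sum(squared_diffs) / len(squared_diffs) / 2

print(variance_pairwise([1, 2, 3, 4]))  # 1.25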
The variance increases as the differences between the numbers increase; hence, it is a measure of dispersion. The same result can be obtained by another process, which is the second definition: compute the mean; subtract the mean from each number (the outcomes are called "deviations"); square the deviations; take the mean of these squares. This gives the same outcome as the first definition, with less work. The variance increases as the differences between the numbers and the mean increase. Hence, the variance can also be viewed as a measure of the size of the deviations from the mean. That is, it says how far away the numbers are from their mean. If the variance is small, then most numbers are close to the mean.
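And a sketch of this second definition in the same hypothetical style, which can be checked against the pairwise version above:

Code:
def variance_from_mean(numbers):
    # Subtract the mean from each number, square these deviations,
    # and take the mean of the squares.
    mean = sum(numbers) / len(numbers)
    deviations = [x - mean for x in numbers]
    return sum(d ** 2 for d in deviations) / len(deviations)

print(variance_from_mean([1, 2, 3, 4]))  # 1.25, same as variance_pairwise([1, 2, 3, 4])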
Looking at the first definition again with a concrete example makes it clear that something special happens. Suppose the numbers are simply 1, 2, 3, 4.