Quote:
Originally Posted by JinX11
So, I have a distribution of numbers that doesn't appear to be normal...as in it doesn't fit the conventional bell curve. Rather, the curve peaks much further to the left and trails off for a while out to the right.
To add some additional context, across a population of several million widgets, they have a min/avg/max of 1 / 2000 / 2.5 million. The calculated standard deviation for my example population is something like 37,000+, which is a pretty large number compared to the avg.
My very basic and dumb question is: is standard deviation even a useful metric worth reporting on when the distribution is non-normal? Would it be correct that the standard deviation becomes less and less useful as the normality of a distribution decreases?
Thanks, in advance!
Standard deviation and variance are meaningful for non-normal distributions, but their meaning is not as straightforward. I would recommend reading about Chebyshev's Inequality for an application to non-normal distributions: no matter what shape the distribution has, at least 1 - 1/k^2 of the values lie within k standard deviations of the mean.
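If it helps, here is a minimal sketch in Python of what that looks like in practice. The lognormal distribution and its parameters below are just an assumption on my part to mimic your "peaks left, long right tail" shape, not your actual widget data.

[CODE]
import numpy as np

# Illustrative only: a lognormal sample stands in for a heavily
# right-skewed distribution (peak on the left, long right tail).
rng = np.random.default_rng(42)
sample = rng.lognormal(mean=3.0, sigma=2.0, size=1_000_000)

mu = sample.mean()
sigma = sample.std()

for k in (2, 3, 5):
    # Chebyshev: at least 1 - 1/k^2 of ANY distribution lies within
    # k standard deviations of the mean, normal or not.
    observed = np.mean(np.abs(sample - mu) <= k * sigma)
    bound = 1 - 1 / k**2
    print(f"k={k}: observed {observed:.4f} >= Chebyshev bound {bound:.4f}")
[/CODE]

You will typically see the observed coverage come out well above the bound; Chebyshev is a loose guarantee, but it is distribution-free, which is why the standard deviation still tells you something even when the data are badly skewed.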