Quote:
Originally Posted by DcifrThs
prob an easy Q. seems logical to me but not sure how to prove it.
assume two distributions, distA and distB. they are defined by their mean and standard deviations.
both have identical first, third, and fourth moments. is it possible that they have different variances?
First, you have to be more specific about the meaning of the phrase, "they are defined by their mean and standard deviations." I will suppose that it means the following. There is a family A of cumulative distribution functions such that
- For each pair (m, s), where m is a real number and s is a nonnegative real number, there exists a unique distribution function F(x) in A with mean m and standard deviation s.
- If F(x) is in A and a and b are real numbers with a > 0, then F(ax + b) is in A.
The question is now: if F and G are in A, and F and G have identical first, third, and fourth moments, does it necessarily follow that F = G? The answer is yes.
To see this, let H in A be the distribution function with mean 0 and variance 1. Let X be a random variable whose distribution is H. If m is the mean of F and G, and s and t are the standard deviations of F and G, respectively, then Y = m + sX and Z = m + tX have distributions F and G, respectively.
Since F and G have the same fourth moments, we have E[(Y - m)^4] = E[(Z - m)^4], which gives

s^4 E[X^4] = t^4 E[X^4].
Note that E[X^4] = 0 would imply X = 0 almost surely, which would give X variance 0. Therefore, since X has variance 1, we must have E[X^4] > 0, so s^4 = t^4. Since s and t are nonnegative, this implies s = t.
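The scaling argument above is easy to check numerically. As a sketch (assuming, purely for illustration, that the family A is the normal location-scale family, so X is standard normal), the fourth central moments of Y = m + sX and Z = m + tX are s^4 E[X^4] and t^4 E[X^4], which can only agree when s = t:

```python
import numpy as np

rng = np.random.default_rng(0)

# X: the standardized member of the family (mean 0, variance 1).
# Standard normal is an illustrative assumption, not part of the argument.
x = rng.standard_normal(1_000_000)

m = 2.0          # common mean of F and G
s, t = 1.5, 0.7  # candidate standard deviations

y = m + s * x    # distributed as F
z = m + t * x    # distributed as G

# Sample fourth central moments.
m4_y = np.mean((y - m) ** 4)
m4_z = np.mean((z - m) ** 4)

# Dividing out s^4 and t^4 recovers the same quantity E[X^4] in both
# cases, so the fourth moments differ whenever s != t.
print(m4_y / s**4)
print(m4_z / t**4)
```

Both printed values estimate E[X^4] (which is 3 for a standard normal), while m4_y and m4_z themselves differ by the factor (s/t)^4, matching the conclusion that equal fourth moments force s = t.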
Last edited by jason1990; 05-06-2010 at 09:06 PM.