Quote:
Originally Posted by nickthegeek
In IQ testing, the average score is defined to be 100.
These days it is actually the median, not the mean, that is defined as 100, but as you say that doesn't make a difference to the OP's question.
Quote:
Originally Posted by nickthegeek
It's also well known that IQ scoring distributes like (more or less) a normal distribution.
More or less... in practice, each SD from the median is defined as 15 IQ points.
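As a minimal sketch of what that means in practice (my own illustration; real norming tables are more involved): a raw score is ranked against the norming sample, the rank becomes a percentile, and the percentile is pushed through the standard normal quantile so that one SD of rank equals 15 points.

    from statistics import NormalDist

    def deviation_iq(raw_score, norming_sample):
        """Map a raw test score to an IQ via its percentile rank in the norming sample."""
        n = len(norming_sample)
        below = sum(s < raw_score for s in norming_sample)
        equal = sum(s == raw_score for s in norming_sample)
        percentile = (below + 0.5 * equal) / n
        # clamp so the normal quantile stays finite for scores outside the sample range
        percentile = min(max(percentile, 0.5 / n), 1 - 0.5 / n)
        z = NormalDist().inv_cdf(percentile)   # standard normal quantile
        return 100 + 15 * z                    # one SD of rank = 15 IQ points

    # e.g. a score at roughly the 84th percentile of the norming sample maps to about 115.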
Quote:
Originally Posted by nickthegeek
if by "mean" we intend the expected value of the IQ distribution, than things are not much clear.
Quote:
Originally Posted by nickthegeek
- are the IQ scores of the sample modeled as iid's with 100 as average? In that case, the correct answer .... 101.
I think that would be 100% clear, as I said in an earlier post....
saying "the average is 100 for the population" is NOT the same as saying "the expected IQ for a person in the population is 100"
The distinction is an important one, but the OP was not ambiguous.
The reason I'm drawing the distinction (I'm not a life nit, promise!) is that the people who do this sort of calculation for a living (statisticians, actuaries) OFTEN make these errors in the real world, not because it's "close enough" but because they miss the significance.

If the OP's question had been on an actuarial exam, then 101 would most definitely have been wrong unless the answerer explicitly stated that he was assuming an infinite population and explained the impact of that assumption on the answer. The fact that the population was specified as 8th graders would actually have been a hint in an exam that the absolute population size was potentially relevant, as NoG alluded to in his post above.
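To make the two readings concrete, here is a sketch under an assumed setup (the OP's exact numbers aren't quoted in this post): a sample of 50 students, one of whom is known to score 150, which is the setup that makes the iid answer come out to exactly 101. Passing N = None models the iid / infinite-population reading; a finite N models a population whose average is exactly 100, sampled without replacement.

    def expected_sample_mean(N=None, sample_size=50, known_score=150, pop_mean=100):
        """Expected average of the sample, given one member's known score.

        N = None: scores are iid with expected value pop_mean (infinite population).
        Finite N: the population of N students has average exactly pop_mean, and the
        sample is drawn without replacement, so the known high score pulls the
        expected value of everyone else slightly below pop_mean.
        """
        others = sample_size - 1
        if N is None:
            other_mean = pop_mean                                 # iid: unaffected
        else:
            other_mean = (pop_mean * N - known_score) / (N - 1)   # rest of the population
        return (known_score + others * other_mean) / sample_size

    print(expected_sample_mean())             # 101.0    (infinite-population answer)
    print(expected_sample_mean(N=1000))       # ~100.951 (small finite population)
    print(expected_sample_mean(N=4_000_000))  # ~101.0   (a cohort of millions: barely moves)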
A final analogy: to most people the number of decks in a blackjack shoe is irrelevant, and it can be treated as an infinite shoe. But for those to whom it is relevant, it is sometimes EXTREMELY relevant.
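A quick illustration of the analogy (numbers of my own choosing): the chance that the next card is an ace after two aces, among ten cards total, have already been dealt.

    def p_next_ace(decks, aces_seen=2, cards_seen=10):
        """Probability the next card is an ace, finite shoe vs infinite-shoe approximation."""
        if decks is None:                    # infinite shoe: removal has no effect
            return 4 / 52
        aces_left = 4 * decks - aces_seen
        cards_left = 52 * decks - cards_seen
        return aces_left / cards_left

    print(p_next_ace(None))  # 4/52          ≈ 0.0769  (infinite shoe)
    print(p_next_ace(6))     # 22/302        ≈ 0.0728  (six-deck shoe)
    print(p_next_ace(1))     # 2/42          ≈ 0.0476  (single deck)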