Interpretation of childhood I.Q.

© 2004 Paul Cooijmans

Introduction

As written elsewhere, I am opposed to expressing I.Q.'s in relation to age peers (or any other limited group) and regard adult deviation scores as the true I.Q.'s. This implies that, in my view, the scores of children must be expressed by adult norms. As they rarely are, I describe below a rough method to convert childhood age-peer scores to adult scores. Notice that this method cannot be used when the score has already been expressed by adult norms!

Take the age-peer score and the child's age when the test was taken. Calculate the child's "mental age" as if the score were a mental/biological age ratio score, as in the past: the mental age equals the I.Q. divided by 100, multiplied by the biological age.

Now convert the mental age to a mental/biological age ratio score of a hypothetical person of age 16: divide the mental age by 16 and multiply by 100. This result is an approximation of the child's adult I.Q. at the moment the test was taken.
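
As a minimal sketch of this conversion in Python, the two steps could be written as follows; the function name adult_iq and the default reference age of 16 are my own labels for the procedure above, not an established implementation.

def adult_iq(age_peer_iq, age, reference_age=16):
    """Approximate an adult I.Q. from a childhood age-peer score.

    Treats the age-peer score as a mental/biological age ratio score,
    derives the mental age, and re-expresses it as a ratio score for a
    hypothetical person of the reference age (16 by default).
    """
    mental_age = (age_peer_iq / 100) * age   # e.g. a score of 170 at age 10 gives 17
    return round(mental_age / reference_age * 100)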

An example

A 10-year-old scores 170.

Calculate mental age: 1.70 times 10 equals 17.

Convert to a ratio score of a hypothetical person aged 16: 17 divided by 16 times 100 equals approximately 106.

So this 10-year-old has an adult I.Q. of 106.

Another example

A 5-year-old scores 275.

2.75 times 5 equals 13.75

13.75 divided by 16 times 100 equals approximately 86.

So this 5-year-old has an adult I.Q. of 86.

Another example

An 11.5-year-old scores 180.

1.80 times 11.5 equals 20.7.

20.7 divided by 16 times 100 equals approximately 129.

So this 11.5-year-old has an adult I.Q. of 129.
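
As a check, the sketch given after the method above reproduces the three worked examples (values rounded to whole numbers, as in the text):

print(adult_iq(170, 10))    # 106
print(adult_iq(275, 5))     # 86
print(adult_iq(180, 11.5))  # 129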

Conclusion

When I.Q. is seen this way, it must be understood as a property that increases during childhood and reaches its peak in adulthood. The steep increase of intelligence typical of childhood ends at about age 16. The further increase thereafter is shallow.