Converting GRE Scores to D&D Intelligence

CRGreathouse said:
Roman already gave that implicitly -- he mentioned that the distributions are normal.



First, standardize each score as a z-score (mean = 0, stdev = 1) by subtracting the test's mean and dividing by its standard deviation:
(A - G)/M
(B - H)/N
(C - I)/O

If you have only one score, you're done -- just use it, or convert it back into the other forms. If you have two or three, you can combine them into a weighted average. Say you have a standardized score of 2.1 on the GRE, -0.2 on the SAT, and 1.3 on an IQ test. If you think the SAT is about as meaningful as the GRE, and the two together are worth as much as the IQ test, the weighted average is 2.1 * 1/4 + (-0.2) * 1/4 + 1.3 * 1/2 = 1.125. This can be converted back into any form desired, for example a D&D Int score (with mean 10.5 and stdev around 2): 10.5 + 2 * 1.125 = 12.75, which rounds to 13.
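As a minimal sketch of that pipeline in Python (the function names and the rounding choice here are mine, not anything official):

```python
# Sketch: standardize scores, combine them with weights, map to D&D Int.

def standardize(score, mean, stdev):
    """Convert a raw test score to a z-score (mean 0, stdev 1)."""
    return (score - mean) / stdev

def to_dnd_int(z, mean=10.5, stdev=2.0):
    """Map a z-score onto the D&D Int scale (mean 10.5, stdev ~2)."""
    return round(mean + stdev * z)

# Standardized scores from the example above: GRE, SAT, IQ.
z_scores = [2.1, -0.2, 1.3]
# SAT weighted like the GRE; the two together worth as much as the IQ test.
weights = [0.25, 0.25, 0.5]

combined = sum(w * z for w, z in zip(weights, z_scores))
print(combined)              # 1.125
print(to_dnd_int(combined))  # 13 (from 10.5 + 2 * 1.125 = 12.75)
```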

This process is only difficult when you have different distributions.

Good idea on standardizing the distributions.

I still remember how to sum normal distributions: add the means to get the combined mean, and add the variances to get the combined variance, from which you can calculate the combined standard deviation. The problem is that this only works for independent normal distributions, yet my normals are not fully independent; they are positively correlated (and I have included the correlations for your perusal). I am not sure how to deal with the correlations in a mathematically correct way. I have tried adding the means normally and choosing one of the distributions as a base, leaving its variance intact, then multiplying the variance of the second distribution by one minus its correlation with the base distribution, and multiplying the variance of the third distribution by one minus its correlation with the base distribution and then by one minus its correlation with the second distribution.

Unfortunately, I have strong doubts about the statistical validity of this procedure; it was more of an intuitive way to deal with the problem, and I do not know whether it is actually sound. Furthermore, the results may differ depending on which distribution is chosen as the base. Hence my coming here to ask for help with combining the three correlated normal distributions.
 


Roman said:
Good idea on standardizing the distributions. [...] Hence my coming here to ask for help with combining the three correlated normal distributions.

I think you need a correlation coefficient. If I remember correctly, the standard deviations combine as sqrt(a^2 + 2rab + b^2), where r is the correlation coefficient. If the two are completely independent (r = 0), this reduces to sqrt(a^2 + b^2). If they are completely positively correlated (r = 1), you get a + b, and if they are completely negatively correlated (r = -1), you get |a - b|.

To get the correlation coefficients, however, you'll have to do some statistical analysis, or otherwise find a value of r for each pair of tests (GRE-SAT, GRE-IQ, SAT-IQ).
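For what it's worth, the same rule generalizes to three or more correlated variables through the covariance matrix: the variance of a weighted sum w1*X1 + ... + wn*Xn is the quadratic form w'Cw. A sketch with numpy, where the 0.7/0.6/0.8 correlations are made-up placeholders (the real GRE/SAT/IQ figures would have to come from published studies):

```python
import numpy as np

# Standard deviations of the three tests; all 1.0 once scores are standardized.
sd = np.array([1.0, 1.0, 1.0])

# Pairwise correlations -- placeholder values, not real GRE/SAT/IQ figures.
corr = np.array([
    [1.0, 0.7, 0.6],
    [0.7, 1.0, 0.8],
    [0.6, 0.8, 1.0],
])

# Covariance matrix: cov[i, j] = r_ij * sd_i * sd_j.
cov = corr * np.outer(sd, sd)

# Weights from the earlier example (GRE 1/4, SAT 1/4, IQ 1/2).
w = np.array([0.25, 0.25, 0.5])

# Standard deviation of the weighted sum; with all r = 0 this collapses
# to the independent case sqrt((w1*sd1)^2 + (w2*sd2)^2 + (w3*sd3)^2).
combined_sd = np.sqrt(w @ cov @ w)
print(combined_sd)
```

Note that a weighted sum of z-scores is no longer itself standardized; dividing the sum by combined_sd puts it back on a mean-0, stdev-1 scale before converting to D&D Int or anything else.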


You may want to double-check that, though -- it's been about a year since I took the class.
 



Thanee said:
Oh, I see. Didn't realize that the normal is that normal. :D

It did not occur to me that this could be confused with the other meaning of the word 'normal', since I was speaking about distributions. In hindsight, it might have been better if I had specified what kind of normal I was talking about.
 

The simplest way is to use percentiles. If your game world uses 4d6 drop the lowest, the distribution is as follows (each score paired with the percentage of rolls that fall below it):

Score   Percentile below
3       0.00
4       0.08
5       0.39
6       1.16
7       2.78
8       5.71
9       10.49
10      17.52
11      26.93
12      38.35
13      51.23
14      64.51
15      76.85
16      86.96
17      94.21
18      98.38

So if you test at the 98.38th percentile or above, you get an 18. If you test at the 45th percentile, you get a 12.
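Here's a quick Python sketch that rebuilds that table by enumerating all 6^4 = 1296 possible rolls and then maps a percentile to a score (the function names are just for illustration):

```python
from itertools import product

def four_d6_drop_lowest_cdf():
    """Percentage of 4d6-drop-lowest rolls strictly below each score."""
    counts = {score: 0 for score in range(3, 19)}
    for dice in product(range(1, 7), repeat=4):  # all 6^4 = 1296 outcomes
        counts[sum(dice) - min(dice)] += 1       # keep the best three dice
    below, total, cdf = 0, 6 ** 4, {}
    for score in range(3, 19):
        cdf[score] = 100.0 * below / total
        below += counts[score]
    return cdf

CDF = four_d6_drop_lowest_cdf()

def percentile_to_score(pct):
    """Highest score whose 'percentile below' does not exceed pct."""
    return max(score for score, p in CDF.items() if p <= pct)

print(percentile_to_score(45))     # 12
print(percentile_to_score(98.38))  # 18
```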

The catch, however, is the population that takes the GRE. I would surmise they are of higher than average intelligence to begin with, so you might want to adjust for that, for instance by adding 5 percentile points to your result.
 

