Charwoman Gene
Adventurer
This is quite technical. I think about this stuff. Sue me.
I am curious about researching the expected DMG point value of a "Standard" "4d6, drop the lowest" (4d6dl) D&D character. The trick is, I don't want to use statistics; I want to count up, using integer arithmetic, the actual point values for every possible roll of the dice, and divide by the number of possible ways the dice can be thrown.
Now, calculating the expected value for a single 4d6dl roll is fairly simple. I've done an Excel spreadsheet that does it for me. 12.24 is the result I get. Things get complicated because:
Point Value vs. Ability Score is nonlinear.
D&D skews scores higher by rejecting "worthless" characters.
The definition of "worthless" is +0 or lower total modifier, or no ability score above 14.
This complicates matters, as now we have to look at rolls for entire characters, not just individual 4d6dl rolls. Knowing the individual expected value is useless.
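To pin that rejection rule down in code, here's a minimal Java sketch of the test. The class and method names are mine, and it assumes the standard d20 modifier formula, floor((score - 10) / 2):

```java
public class WorthlessCheck {
    // Standard d20 ability modifier: floor((score - 10) / 2).
    // Math.floorDiv rounds toward negative infinity, so low scores
    // (3-9) get the correct negative modifiers.
    static int modifier(int score) {
        return Math.floorDiv(score - 10, 2);
    }

    // "Worthless" per the definition above: total modifier +0 or
    // lower, or no ability score above 14.
    static boolean isWorthless(int[] scores) {
        int totalMod = 0;
        int highest = 0;
        for (int s : scores) {
            totalMod += modifier(s);
            highest = Math.max(highest, s);
        }
        return totalMod <= 0 || highest <= 14;
    }
}
```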
I used a spreadsheet with 1,296 lines to calculate the expected value given above, using an increasing index with MOD 6 and DIV 6 to work out what each die needed to show. [A full character is 24 independent dice with 6 faces each, so 6^24 possibilities.] I'd need a 4,738,381,338,321,616,896-line spreadsheet to calculate this. (Excel stops at 65,536.) That is over 4 quintillion possibilities. Yay.
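For reference, here is that same MOD 6 / DIV 6 index trick as a short Java program for the single-roll case. It's just a translation of the 1,296-line spreadsheet, and it reproduces the 12.24 figure (15869/1296 ≈ 12.2446):

```java
public class Roll4d6dl {
    public static void main(String[] args) {
        long total = 0;
        for (int index = 0; index < 1296; index++) {  // 6^4 = 1,296 outcomes
            int n = index;
            int sum = 0, lowest = 6;
            for (int die = 0; die < 4; die++) {
                int face = n % 6 + 1;  // MOD 6 gives this die's face (1-6)
                n /= 6;                // DIV 6 moves on to the next die
                sum += face;
                lowest = Math.min(lowest, face);
            }
            total += sum - lowest;     // drop the lowest of the four dice
        }
        // Prints 12.2446..., the expected value of a single 4d6dl roll.
        System.out.println((double) total / 1296);
    }
}
```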
How to solve this problem?
I don't know enough hard-core (calculus-level) statistics to be comfortable using them in this case.
Okay, let's say I write a Java program to do this by brute force, and assume we can evaluate one combination every 20 actual CPU cycles. (I don't know how good a number that is; I'm not much of a hardware guy.) Crap. That's something like 15 million hours, which is about 1,700 years.
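For what it's worth, the arithmetic behind that estimate, assuming a roughly 1.75 GHz CPU (my guess, not a measured number):

6^24 evaluations / (1.75×10^9 cycles per second / 20 cycles per evaluation) ≈ 5.4×10^10 seconds ≈ 1.5×10^7 hours ≈ 1,700 years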
I guess I should look into distributed parallel computing, huh?
Donate your CPU cycles now!
Anyone see a way to get me out of this trap?