
D&D 5E Int 8, Wis 8, Cha 18 Sorcerer

Hriston

Dungeon Master of Middle-earth
In LibreOffice Calc (I don't have Excel and I have principles):
Column A = numbers 3-18
Column B = frequencies 1, 3, 6, 10, 15, 21, 25, 27 (and again backwards)
Column C = first cell "=(A1-10.5)^2*B1/206", copied all the way down to C16
C17 = "=SUM(C1:C16)"
C18 = "=SQRT(C17)"
C17 gives 9.1747572816
C18 gives 3.0289861805 ~ 3.03

I didn't calculate the SD myself. I remember doing some research to find out what seemed to be the right answer. But shouldn't you have divided by 216 instead of 206?

edit: On the other hand, I've definitely seen multiple occurrences of the SD of 3d6 being rounded to 3, so I don't think this is a matter of much contention.
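
For anyone who wants to double-check the divisor, here's a quick Python sketch (not the spreadsheet above, just an independent check) that builds the 3d6 frequency table and divides by all 216 outcomes:

from itertools import product

# Frequency of each 3d6 total (3-18) over all 6*6*6 = 216 ordered rolls.
freq = {}
for roll in product(range(1, 7), repeat=3):
    total = sum(roll)
    freq[total] = freq.get(total, 0) + 1

n = sum(freq.values())                                 # 216
mean = sum(t * f for t, f in freq.items()) / n         # 10.5
variance = sum((t - mean) ** 2 * f for t, f in freq.items()) / n
sd = variance ** 0.5

print(n, mean, variance, round(sd, 4))                 # 216 10.5 8.75 2.958

With 216 in the denominator the variance comes out to 8.75 and the SD to about 2.96, which still rounds to 3, so the spreadsheet's 3.03 isn't far off; it just divided by 206 instead of 216.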
 


MoonSong

Rules-lawyering drama queen but not a munchkin
I didn't calculate the SD myself. I remember doing some research to find out what seemed to be the right answer. But shouldn't you be dividing by 216 instead of 206?

mmm, my own wis penalty showing up... n_n
turns out you were totally correct.
 


So this is probably a stupid question (how coincidental), but I'm curious why those of you who like to equate the raw Intelligence score to IQ use that score to mark standard deviations and not the modifier? A character with an 8 Int, measured against one with an 11, is mechanically identical to comparing a 9 to a 10. So why not use the +/- modifiers to mark standard deviations instead? I assume it doesn't scale as well, but I was curious.

Because I'm a grognard, and my opinions on what stats mean in D&D predate 5E. In AD&D, there wasn't even any such thing as an "Int modifier." There was just a collection of capabilities: when your Int is 13 you have such-and-such a percentage chance to learn spells, but you max out at 6th-level spells; you need Int 18 to handle the hardest spells (9th level); Int 18 is the theoretical human maximum (unless you count aging modifiers, in which case it's Int 20); if you get Int 19 you can learn unlimited spells per level; etc.

I realize that 5E is a different game, but I'm operating under the assumption (or "preference" if you like) that it's supposed to at least theoretically support conversions of PCs and NPCs from AD&D => 5E, keeping the same stats and therefore the same meanings of those stats.

See here (http://www.frontiernet.net/~jamesstarlight/Statistics.html) for additional insight into how grognards think about stats.
 


Hriston

Dungeon Master of Middle-earth
So this is probably a stupid question (how coincidental), but I'm curious why those of you who like to equate the raw Intelligence score to IQ use that score to mark standard deviations and not the modifier? A character with an 8 Int, measured against one with an 11, is mechanically identical to comparing a 9 to a 10. So why not use the +/- modifiers to mark standard deviations instead? I assume it doesn't scale as well, but I was curious.

No reason, and it's actually a good idea. Grouping INT scores by modifier would provide the opportunity to slap on some Gygax-style IQ labels. I think it would only require using the mid-point between the two scores. I'll put up a table if I get some time to work one up.
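
If it helps, here's a rough Python sketch of the kind of table I mean. The 3d6 mean (10.5) and SD (about 2.96) come from the calculation earlier in the thread; mapping the resulting z-score onto an IQ-style scale with mean 100 and SD 15 is just an illustrative assumption, not anything official:

# Group Int scores into 5E modifier brackets, take each bracket's mid-point,
# and convert it to an IQ-style label. The IQ mean/SD below are assumptions
# for illustration only.
MEAN_3D6 = 10.5
SD_3D6 = 2.958          # sqrt(8.75), from the 3d6 calculation above
IQ_MEAN, IQ_SD = 100, 15

def modifier(score):
    """5E-style ability modifier."""
    return (score - 10) // 2

for low in range(2, 19, 2):             # brackets 2-3, 4-5, ..., 18-19
    high = low + 1
    midpoint = (low + high) / 2         # e.g. 8.5 for the 8-9 bracket
    z = (midpoint - MEAN_3D6) / SD_3D6
    iq = IQ_MEAN + IQ_SD * z
    print(f"Int {low}-{high} (mod {modifier(low):+d}): z = {z:+.2f}, ~IQ {iq:.0f}")

That puts the 10-11 bracket right at 100 and the 8-9 bracket around 90, for example; the labels themselves would still be a matter of taste.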
 

GameOgre

Adventurer
You guys have been great, and my player read all your posts. He does indeed want to play this for roleplaying reasons.

Thanks again!
 

While I shall concede that he would be better rolled up as a fighter or bard with those stats, take the Soldier background, give yourself a high Con score, and take mostly enchantment spells. Any origin works, though I'm thinking chaos sorcerer to really hammer in on the silliness. Thank me later.

 
