Numion
First Post
mmu1 said: *Sigh* Some of my basic math teachers are spinning in their graves, and they're not even dead yet...
So would mine .. university math professors, at that .. if I had made the mistakes you think I did.

In order for the feat to increase your chances of saving by around 10%, you'd have had to be able to save 90% of the time to begin with, which means saving on a 3 or better.
If you can save on a 15 or better (not uncommon at all for fighters making Will saves, for example), your chance of saving is 30%, so having the feat add +2 actually improves it by 33%. It adds 10% to your chance of saving, it doesn't increase your chance of saving by 10%.
Yeah, but that's not what I was talking about. I didn't mean the increase in chances, as that's not really telling us much about the usefulness of the feat. Usefulness (the topic of this thread) IMO means the cases where the feat actually made a difference. All your increases aside, the feat 'works' only when: a) you would've failed without it AND b) succeeded with it. That's approximately a band of 2 numbers on the d20 roll, because the feat gives +2.
You still following?
2/20 means the feat matters in 10% of cases. In the other 90% of rolls you would have failed regardless of the feat, or would have succeeded even without it. Basic probability math. Now, that 10% is indeed an approximation, since a natural 1 always fails and a natural 20 always succeeds, IIRC. It wouldn't change the numbers a lot.
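The "band of 2 numbers" argument above can be checked by brute force. This is a minimal sketch (not from the original thread), assuming the usual d20 rules: a natural 1 always fails, a natural 20 always succeeds, and otherwise roll + bonus must meet the target. The target numbers and `feat_matters` helper are hypothetical, chosen just to illustrate the point:

```python
# Count the d20 rolls where a +2 feat flips a failed save into a success.
# Assumed rules: natural 1 always fails, natural 20 always succeeds,
# otherwise roll + bonus >= target succeeds.

def saves(roll, bonus, target):
    if roll == 1:
        return False
    if roll == 20:
        return True
    return roll + bonus >= target

def feat_matters(target, base_bonus=0, feat_bonus=2):
    """Fraction of d20 rolls where the feat turns a miss into a save."""
    flipped = sum(
        1 for roll in range(1, 21)
        if not saves(roll, base_bonus, target)
        and saves(roll, base_bonus + feat_bonus, target)
    )
    return flipped / 20

print(feat_matters(15))  # mid-range target: exactly 2 of 20 rolls, i.e. 0.1
print(feat_matters(21))  # extreme target: only 1 roll flips, showing the
                         # natural-1/natural-20 approximation mentioned above
```

For most target numbers the answer is exactly 2/20 = 10%; only at the extremes (where the automatic-failure and automatic-success rolls eat into the band) does it shrink, which matches the "approximation" caveat.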
It might only come up on 1/10th of d20 rolls, but account for 1/4th of all successful saves
But that 1/4th is the conditional probability for the usefulness of the feat .. with the condition being that the save succeeds in the first place! Not very interesting, since you've conditioned out an important part (unsuccessful saves).
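Both numbers can coexist, which is the point being made: the same two flipped rolls are 10% of all rolls but 25% of the successful saves, because the second figure divides by a smaller, conditioned denominator. A sketch under the same assumed d20 rules, with a hypothetical target of 15 and no other bonuses:

```python
# Same event, two denominators: the feat flips 2 of 20 rolls (10% overall),
# but those 2 flips are 2 of the 8 saves made with the feat (25%).
# Assumed rules: natural 1 always fails, natural 20 always succeeds.

def saves(roll, bonus, target):
    if roll == 1:
        return False
    if roll == 20:
        return True
    return roll + bonus >= target

target = 15
rolls = range(1, 21)
successes = [r for r in rolls if saves(r, 2, target)]      # saves made with the feat
flips = [r for r in successes if not saves(r, 0, target)]  # saves owed to the feat

print(len(flips) / 20)              # unconditional: 2/20 = 0.1
print(len(flips) / len(successes))  # conditional on success: 2/8 = 0.25
```

So "1/4th of all successful saves" is true but answers a different question: it tells you how often a *made* save was due to the feat, not how often the feat changes the outcome of a roll.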