D&D 5E Precision Attack: How to calculate the hit chance increase

Assuming unlimited superiority dice, here are the probabilities of turning a miss into a hit under the following usage rules:

Using superiority dice only when missed by 1: 1
Using superiority dice only when missed by 1 or 2: 0.9375
Using superiority dice only when missed by 1 to 3: 0.875
Using superiority dice only when missed by 1 to 4: 0.8125
Using superiority dice only when missed by 1 to 5: 0.75
Using superiority dice only when missed by 1 to 6: 0.6875
Using superiority dice only when missed by 1 to 7: 0.625
Using superiority dice only when missed by 1 to 8: 0.5625
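
If you want to check these numbers, here's a minimal Python sketch (the function name is mine, and it assumes each miss amount in the band is equally likely, which holds on a uniform d20):

```python
# Probability that a d8 Precision Attack die converts a miss into a hit,
# given we only spend the die when the attack missed by 1..max_gap.
# Assumes each miss amount in that band is equally likely (uniform d20).

def conversion_probability(max_gap, die_size=8):
    # Average P(die >= gap) over gap = 1..max_gap
    return sum(die_size - gap + 1 for gap in range(1, max_gap + 1)) / (die_size * max_gap)

for max_gap in range(1, 9):
    print(f"missed by 1 to {max_gap}: {conversion_probability(max_gap):.4f}")
```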

This means your net increase in chance to hit would be (conversion probability times the fraction of d20 rolls covered):
100% * 5% = 5%
93.75% * 10% = 9.375%
87.5% * 15% = 13.125%
81.25% * 20% = 16.25%
75% * 25% = 18.75%
68.75% * 30% = 20.625%
62.5% * 35% = 21.875%
56.25% * 40% = 22.5%
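
Equivalently, in code (self-contained version of the sketch above; the 5%-per-face factor is just the chance of landing in each miss band on the d20):

```python
# Net hit-chance increase: conversion probability times the fraction of
# d20 outcomes (5% per face) on which the die is actually spent.

def conversion_probability(max_gap, die_size=8):
    return sum(die_size - gap + 1 for gap in range(1, max_gap + 1)) / (die_size * max_gap)

for max_gap in range(1, 9):
    net = conversion_probability(max_gap) * max_gap * 0.05
    print(f"missed by 1 to {max_gap}: +{net:.3%} to hit")
```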


Now I just need to calculate how much of that probability I lose to the chance that I see either fewer or more opportunities to use superiority dice than I have dice for (since I don't actually have infinite superiority dice).
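
One first-order way to model that (a sketch, not the final answer; the dice count and attacks-per-rest below are illustrative assumptions, not numbers from this thread): treat the number of qualifying misses between short rests as a binomial and compute the expected number of dice you actually get to spend.

```python
from math import comb

# Sketch: if we spend a die on every "miss by 1..max_gap" until we run out,
# opportunities per short rest are X ~ Binomial(n_attacks, max_gap/20) and
# the benefit scales by E[min(X, n_dice)] / E[X] versus the unlimited case.

def expected_dice_spent(n_attacks, p_opportunity, n_dice):
    return sum(min(x, n_dice) * comb(n_attacks, x)
               * p_opportunity**x * (1 - p_opportunity)**(n_attacks - x)
               for x in range(n_attacks + 1))

# Illustrative numbers only: 4 superiority dice, 12 attacks per short rest,
# spending on any miss by 1 to 8 (a 40% opportunity chance per attack).
n_attacks, n_dice, p = 12, 4, 0.40
spent = expected_dice_spent(n_attacks, p, n_dice)
print(f"expected dice spent: {spent:.2f} of {n_attacks * p:.1f} opportunities")
```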
 


Don't forget, though, that the die size goes up as well as the die count. At, IIRC, 10 and 18 it goes up to d10 and d12 respectively, widening the gap.

I still think that there's a degree to which player information helps to increase the efficiency beyond the "theoretical" maximum. If you only use it in those cases where it's likely to give you a benefit, and you pay careful attention to which numbers do and don't hit, you can have a halfway-decent (and improving) chance of making sure that the expended die does in fact make the difference between missing and hitting. Meaning, it turns most/all of the "close but no cigar" misses into actual hits, and it's a little hard to buy that, for a range of 6 to 8 faces of the d20, you can't get better than a 22.5% increase in total hits.

(Edit: ignore this section. I've redone math for it twice now, and both times made erroneous calculations. Evidently I can't math right now.)

Incidentally, I suspect that the reason you found "miss by 4" to be the sweetspot is because the average value of d8 is 4.5--so going from "missing by 4" to "missing by 5" would be where you flip from "a majority of misses turning into hits" to "a majority of misses remaining misses." This suspicion is easy to test--simply repeat your simulation using a d10 (including the "need 1," "need 1 or 2," "need 1 to 3," ... , "need 1 to 10" part) and then again for d12. If the sweetspot increases to 5 and then 6, my suspicion is accurate.
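
The crossover can actually be read straight off each die's distribution, no simulation needed. A quick sketch of that check (the d10/d12 cases are the extension suggested above, not something computed earlier in the thread):

```python
# For each die size, find the largest miss amount at which a majority of
# misses still convert, i.e. P(die >= gap) > 1/2.

for die_size in (8, 10, 12):
    sweetspot = max(gap for gap in range(1, die_size + 1)
                    if (die_size - gap + 1) / die_size > 0.5)
    print(f"d{die_size}: a majority of misses convert up to 'missed by {sweetspot}'")
```

Under that reading, the crossover lands at 4, 5, and 6 for the d8, d10, and d12 respectively, which would bear the suspicion out.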
 


The theoretical maximum assumes that 100% of monster ACs are immediately known. That will likely be close to true, but not quite: any new monster may take a couple of turns to deduce its AC.

So the maximum already takes perfect player knowledge into account.

Maybe we should approach the question from this perspective. If you added a flat +8, what would your chance to hit increase by? 40%. If you added a flat +6 to hit, what would it increase by? 30%. Is a flat +6 to hit considered better than +1d8 to hit? I think it's apparent that it should be. What about +5? I would say +5 is better too... I'd say +4.5 is the most you should ever see a d8 adding to your chance to hit (4.5 × 5% = 22.5%). However, it can add considerably less if you aren't adding one to every attack.

In fact, because we know how much we miss by, we can achieve nearly the same results without needing a d8 on every attack. We can definitely achieve the +22.5% by spending a d8 on only 40% of attacks, if we are smart about which 40% we apply it to. We can get really close to that 22.5% by applying a d8 to just 35% of all attacks, and we still remain pretty close even if we drop down to using it on only 30% of all attacks.
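
Here's a sketch of that trade-off (function name mine, same uniform-miss assumption as before): the total gain per attack versus how often a die is spent, plus the chance that each spent die actually flips a miss.

```python
# Trade-off as the "spend a die" window widens from "missed by 1" to
# "missed by 1 to 8": total gain rises, but each spent die flips fewer misses.

def conversion_probability(max_gap, die_size=8):
    return sum(die_size - gap + 1 for gap in range(1, max_gap + 1)) / (die_size * max_gap)

for max_gap in range(1, 9):
    usage = max_gap * 0.05          # fraction of attacks that spend a die
    conv = conversion_probability(max_gap)
    print(f"spend on {usage:.0%} of attacks -> +{conv * usage:.3%} to hit "
          f"({conv:.2%} chance each die flips a miss)")
```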
 

