D&D General How Saving Throws broke in modern D&D

AlexofBarbaria

Explorer
I've long felt that something broke with saving throws in the transition from 2e to 3e. I knew high-level saves were nerfed, leading to Rocket Tag, but today I had a shower thought that made me realize more clearly what happened.

In classic D&D, attack bonuses diverge with level for the different classes. E.g. the Fighter starts out with only a 0-1 point bonus on attack rolls compared to the Wizard, but this grows with level until, by 20, the difference is 9-10 points.

When the 2e Saving Throw tables were converted to Save Bonuses in 3e, the same pattern of diverging progressions was followed. E.g. a 1st level Cleric starts with a 2 point bonus to Will saves over Fighters, which grows to a 6 point difference at 20 (and more in practice, as the Cleric pumps their Wisdom). This was a mistake.

Because Saving Throws are rolled by the defender rather than the attacker, Save Bonus progressions also need to flip: they should converge, not diverge, with level.

This is because as we approach zero expected damage (0% chance to hit or 100% chance to save), the proportional difference in expected damage grows for the same linear difference in success chance. (If that doesn't make sense, hang with me).

Anybody who's ever worked out when to Power Attack in 3e will be familiar with this basic idea. Power Attack gets worse the harder your opponent is to hit, because as your chance to hit shrinks, attack mods become worth more than damage mods.
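(If you want to see that trade-off in numbers, here's a rough Python sketch with made-up figures - a +10 attack bonus, 8.5 average weapon damage, one-handed Power Attack trading attack for damage point-for-point - not any particular build:)

```python
# Rough sketch of the one-handed Power Attack trade-off: -X to hit, +X damage.
# The +10 attack bonus and 8.5 average damage are illustrative only; crits ignored.

def hit_chance(attack_bonus, ac):
    # d20 attack: natural 1 always misses, natural 20 always hits.
    return max(0.05, min(0.95, (21 + attack_bonus - ac) / 20))

def expected_damage(attack_bonus, ac, avg_damage, power_attack=0):
    # Trade 'power_attack' points of attack bonus for the same bonus to damage.
    return hit_chance(attack_bonus - power_attack, ac) * (avg_damage + power_attack)

for ac in (15, 20, 25):
    no_pa = expected_damage(10, ac, 8.5)
    pa_5 = expected_damage(10, ac, 8.5, power_attack=5)
    print(f"AC {ac}: no PA {no_pa:.2f} dmg/swing, PA 5 {pa_5:.2f} dmg/swing")
```

Against AC 15 the trade pays off; against AC 20+ it doesn't - exactly the "harder to hit, worse Power Attack" effect.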

The same relationship holds for saving throws: as we approach 100% chance to save, save bonuses become proportionally more valuable. Our Save bonus is the attacker's hit penalty.

To demonstrate, imagine a failed Saving Throw is worth 10 damage.

If we're high level and I have a 90% chance of making the save, I can expect only 1 point of damage on average from this attack. If you have a 70% chance, you can expect 3 damage. We have a difference of only 20%/+4 on a d20, but you take 3x as much damage on average!

But if we're low level and I have a 30% chance of saving, I can expect 7 damage. If you have a 10% chance, you can expect 9 damage. We have the same 20%/+4 difference as before, but now you only take 9/7 ≈ 1.29x as much damage on average.
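(If you want to check my arithmetic, here's a throwaway Python sketch of those two cases; the only assumption is the 10 damage on a failed save, negated on a success:)

```python
# A failed save costs 10 damage, a successful save negates it entirely.

def expected_damage(save_chance, damage=10):
    return (1 - save_chance) * damage

for high, low in [(0.90, 0.70), (0.30, 0.10)]:
    e_high, e_low = expected_damage(high), expected_damage(low)
    print(f"save {high:.0%} vs {low:.0%}: {e_high:.0f} vs {e_low:.0f} damage "
          f"-> {e_low / e_high:.2f}x as much")
```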

So, in contrast to Attack Bonuses, Save Bonuses should actually converge as characters level up (assuming we want the typical Save chance to increase with level; if we want to maintain a flat chance, class differences in Saves can stay constant with level).
 


Aren't saves generally for half damage? A 10-point spell deals 10 or 5 (generally). There is likely some math in save-and-take-5 versus fail-and-take-10.
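Here's a quick sketch of how save-for-half changes the OP's ratios, assuming a success takes 5 of the 10 damage (my illustrative numbers, not the OP's):

```python
# Same 20%/+4 gaps as the OP's example, but a successful save takes half damage
# (5 of 10) instead of negating it.

def expected_damage(save_chance, damage=10, fraction_on_save=0.5):
    return (1 - save_chance) * damage + save_chance * damage * fraction_on_save

for high, low in [(0.90, 0.70), (0.30, 0.10)]:
    e_high, e_low = expected_damage(high), expected_damage(low)
    print(f"save {high:.0%} vs {low:.0%}: {e_high:.1f} vs {e_low:.1f} damage "
          f"-> {e_low / e_high:.2f}x as much")
```

The high-level gap shrinks from 3x to about 1.18x, so half-damage saves blunt the effect considerably.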
 

You might take three times the damage, but the damage is still considerably less than it used to be. As the actual numbers grow ever smaller, the relative percentage keeps growing, but that percentage is not really relevant; the actual damage is (unless you believe the damage dealt by monsters was scaling with the percentage of damage saved by the characters, which I very much doubt).
 

Aren't saves generally for half damage? A 10-point spell deals 10 or 5 (generally). There is likely some math in save-and-take-5 versus fail-and-take-10.
And, on top of that, how do you analyze spells that don't do damage but cause loss of actions - the save or sit/die spells?

Looking at 3e saves, good saves were generally 1/2 level, bad saves 1/3 level. Good saves started with a bonus of +2, bad saves +0. The thing you're supposed to compare them with is the level-based piece of the save DC, specifically the highest-level spell a same-level caster can hit you with. The good save is generally ahead of the level factor, the bad save is behind, culminating in the good save at a base +12 being 3 points higher than the 9th-level spell's +9 to the save DC, and the bad save being 3 points lower than it. And with the DC starting at 10 + spell level, the target had roughly a 50% chance of saving: 50+15% for the maximum good save, 50-15% for the maximum bad save. From a design perspective, that's not a terrible place to start.
Then, as far as enhancing magic and feats go, defense is generally cheaper. Iron Will gives +2 to the Will save vs. Spell Focus's +1 to DCs. And resistance bonuses are cheaper than stat enhancement bonuses. Again, not bad principles to work with.
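If it helps to see the curves, here's a rough sketch of that progression, treating good saves as 2 + level/2, bad saves as level/3, and a full caster as gaining a new spell level at every odd caster level (a shorthand, not exact for every class):

```python
# 3e base save progression vs. the spell-level piece of the save DC (10 + spell level + stat).

def good_save(level):
    return 2 + level // 2      # good save: +2 at 1st, +12 at 20th

def bad_save(level):
    return level // 3          # bad save: +0 at 1st, +6 at 20th

def max_spell_level(level):
    return min(9, (level + 1) // 2)   # full caster, capped at 9th-level spells

for level in (1, 5, 10, 15, 20):
    print(f"level {level:2d}: good +{good_save(level)}, bad +{bad_save(level)}, "
          f"spell-level DC bump +{max_spell_level(level)}")
```

At 20 that's +12 and +6 against a +9 bump, the 3-points-above / 3-points-below spread described above.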

The problems ended up being emergent in play and probably not anticipated. 3e's stat bonuses and ability to increase stats incentivized caster PCs to dump as much of their available point buy as possible into their casting stat, then increase it further with stat boosters. The defense, however, had 3 stats to cover - some of which might not even match the PC's prime offensive stat and so got less love from the point buy. So there was a divide in incentivized behavior.
As a result, the bad saves really became pronounced Achilles' heels.
Then there was the issue of monster ability save DCs, which were typically based on 1/2 hit dice, combined with the fact that high-CR creatures often had considerably more than 20 hit dice - which tended to blow the +9 level-based increase to caster save DCs out of the water. 4e and 5e changed that behavior mainly by basing save DCs on CR rather than hit dice. And 5e's bounded accuracy tends to fix the problem of casters dumping all of their enhancements into their casting stat - since now they have to stop at about +5 (or a little over with some rare magical help).
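A back-of-the-envelope sketch of that divide at level 20, with purely illustrative assumptions (18 starting casting stat, +5 from level-ups, +6 enhancement item, an unboosted 10 in the save's ability):

```python
# Optimized 3.5e-style caster DC vs. a target's bad save at level 20 (illustrative numbers).

def stat_mod(score):
    return (score - 10) // 2

caster_stat = 18 + 5 + 6                 # pumped casting stat: point buy + level-ups + item
dc = 10 + 9 + stat_mod(caster_stat)      # 9th-level spell DC
bad_save = 20 // 3 + stat_mod(10)        # bad save with an unboosted ability score

# Natural 20 always saves, so the chance is floored at 5%.
save_chance = max(0.05, (21 + bad_save - dc) / 20)
print(f"DC {dc} vs bad save +{bad_save}: save chance {save_chance:.0%}")
```

Under those assumptions the bad save only succeeds on a natural 20, which is the Achilles' heel in action.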

I think something really needed to be done about saving throws compared to 1e/2e, because the way those worked, high-level save or sit/die spells became increasingly useless against level-appropriate opponents - they were simply more likely to save and waste the caster's time, since higher-powered targets had (mostly) better saves across the board (and don't get me started on how shafted thieves are by the AD&D saving throw tables). But 3e may have taken things too far in letting the save or sit/die spell become too good in the hands of min-maxed casters. 5e is somewhere in the middle: bad saves really do stay bad, but save DCs tend not to get so out of hand, and concentration rules/repeated saves each round significantly blunt the power of save or sit spells by making them easier to eventually end even if the initial save fails.
 

Oddly, I think saving throws are on their way to recovery. Save vs. Petrification seems pulled out of thin air, but Proficiency with Constitution saves seems more intrinsic. If D&D would just realize that defending takes just as much effort as attacking, I might consider a 7th ed campaign.
 

I'm not sure I understand what the issue is. Is there a widespread perception that saving throws should converge more at high levels? To me it makes more sense that they would diverge. Though that might be because it's just what I'm used to. However, is the current system not working? Is saving throw progression widely perceived as a problem in games as they are actually played? It's not something that I can recall ever coming up.
 

assuming we want the typical Save chance to increase with level
Insightful analysis, but I don't really want that result.

I know it used to be that as characters and monsters got stronger they just got better at resisting magic, and the source (caster, spell, etc.) of the magic rarely mattered at all (other than the occasional spell that said you saved with a -4 penalty or such). But...that isn't appealing to me. While it may be fun for a while that once I'm high level I rarely ever fail a save, I just think it would make opponents feel underwhelming. And playing a caster, well, the longer the game goes the less likely my spells become to work against our opponents--even the higher-level ones I'm getting.

The 3e+ style, where there is an ascending defensive skill vs. an ascending offensive one, as a parallel to ascending AC vs. ascending to-hit, just feels more natural and satisfying to me. (Although to be fair, other than 4e, AC has never kept up with attack bonuses.)

But again, good analysis, and I learned from it.
 


I know it used to be that as characters and monsters got stronger they just got better at resisting magic, and the source (caster, spell, etc.) of the magic rarely mattered at all (other than the occasional spell that said you saved with a -4 penalty or such). But...that isn't appealing to me. While it may be fun for a while that once I'm high level I rarely ever fail a save, I just think it would make opponents feel underwhelming. And playing a caster, well, the longer the game goes the less likely my spells become to work against our opponents--even the higher-level ones I'm getting.

The 3e+ style, where there is an ascending defensive skill vs. an ascending offensive one, as a parallel to ascending AC vs. ascending to-hit, just feels more natural and satisfying to me. (Although to be fair, other than 4e, AC has never kept up with attack bonuses.)
Looks like we're waiting to hear back from the OP, but here's what I read into it:

"Diverging saving throws are saving throws that improve faster for one class than another class."

On one hand, there's very little reason why level 20 in each class shouldn't end up with about the same hit points and the same saving throws - they're all superheroes anyway.

On the other, isn't being a little broken the nature of D&D? Why do you think the masses rejected the perfect Matrix 4th edition? Regardless, 6th ed. does not seem to have diverging saving throws, so I think the problem, as I mentioned earlier, is ceasing to be a problem.
 

adding half prof bonus to non-proficient saves is enough to even the playing field;
+1 at 1st level
+2 at 9th level
+3 at 17th level
This was a house rule we used for a while but eventually dropped. On the d20 scale, having a rule for occasional +1 or +2 bonuses just seemed too fiddly, so we let it go a couple of years back. To be clear, it is a good rule and a useful way of narrowing the gap (as you said), just not worth it to us.
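For anyone curious, the quoted numbers fall straight out of the standard 5e proficiency progression; a quick sketch:

```python
# Half proficiency to non-proficient saves, using the standard 5e proficiency curve.

def proficiency(level):
    return 2 + (level - 1) // 4    # +2 at 1st, +3 at 5th, ... +6 at 17th

for level in (1, 5, 9, 13, 17):
    print(f"level {level:2d}: proficiency +{proficiency(level)}, "
          f"non-proficient save bonus +{proficiency(level) // 2}")
```

That gives the +1/+2/+3 steps at 1st, 9th, and 17th mentioned above.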
 
