How Saving Throws broke in modern D&D

Another option would be for non-proficient saves to progress at proficiency bonus - 2, so they max out at +4 instead of +3 and you get no bonus at all in tier 1, where (as you're just starting out) I don't think non-proficient saves should get a bonus anyway.
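In case a concrete sketch helps, here's that progression in code, assuming the standard proficiency bonus steps (+2 at levels 1-4 up to +6 at 17-20); the function names are just mine:

```python
# Sketch of the "proficiency bonus - 2" progression for non-proficient saves.
# Assumes the standard 5e proficiency bonus steps; names are illustrative only.

def proficiency_bonus(level: int) -> int:
    """Standard proficiency bonus: +2 at levels 1-4, rising to +6 at 17-20."""
    return 2 + (level - 1) // 4

def non_proficient_save_bonus(level: int) -> int:
    """Non-proficient saves progress at proficiency bonus - 2, so +0 across tier 1."""
    return proficiency_bonus(level) - 2

for level in (1, 5, 9, 13, 17, 20):
    print(level, proficiency_bonus(level), non_proficient_save_bonus(level))
# Starts at +0 for levels 1-4 and tops out at +4 at levels 17-20.
```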


Just an alternative way to do it.
Better: proficiency bonus for non-proficient saves and proficiency bonus +2 for proficient ones, against a base DC of 10 instead of 8.

Has the added benefit of making it work exactly like all other passive scores.
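If I'm reading that right, a rough sketch of how it could look (the function names and where the base 10 goes are my own guesses at the intent, nothing official):

```python
# Sketch of the "works like passive scores" variant: every save adds the proficiency
# bonus, proficient saves add a further +2, and save DCs are built on a base of 10
# instead of 8. All names and details here are my own guess at the proposal.

def proficiency_bonus(level: int) -> int:
    return 2 + (level - 1) // 4

def save_bonus(level: int, ability_mod: int, proficient: bool) -> int:
    bonus = proficiency_bonus(level) + ability_mod
    return (bonus + 2) if proficient else bonus

def save_dc(caster_level: int, casting_mod: int) -> int:
    # Base 10 rather than 8, mirroring how passive scores are built.
    return 10 + proficiency_bonus(caster_level) + casting_mod
```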
 



Does it not? I haven't looked into 6e at all. 5e has diverging saving throws (each class has two saves that progress with the proficiency bonus while the rest don't). Does it work differently in 6e?
Saves are pretty similar (the same?) in 5e and 6e. But since all classes use the same saving throw advancement (proficiency + ability bonus) there's no divergence.

. . . unless "divergence" means "not all saves get the proficiency bonus." I'm fine with that - different classes are probably better at avoiding different things.
 

I've long felt that something broke with saving throws in the transition from 2e to 3e. I knew high-level saves were nerfed, leading to Rocket Tag, but today I had a shower thought that made me realize more clearly what happened.

In classic D&D, attack bonuses diverge with level for the different classes. E.g. the Fighter starts out with only a 0-1 point bonus on attack rolls compared to the Wizard, but this grows with level until, by level 20, the difference is 9-10 points.

When the 2e Saving Throw tables were converted to Save Bonuses in 3e, the same pattern of diverging progressions was followed. E.g. a 1st level Cleric starts with a 2 point bonus to Will saves over Fighters, which grows to a 6 point difference at 20 (and more in practice, as the Cleric pumps their Wisdom). This was a mistake.

Because Saving Throws are rolled by the defender rather than the attacker, Save Bonus progressions also need to flip and converge, not diverge, with level.

This is because as we approach zero expected damage (0% chance to hit or 100% chance to save), the proportional difference in expected damage grows for the same linear difference in success chance. (If that doesn't make sense, hang with me).

Anybody who's ever worked out when to Power Attack in 3e will be familiar with this basic idea. Power Attack gets worse the harder your opponent is to hit, because as your chance to hit shrinks, attack mods become worth more than damage mods.
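To make that concrete, here's a rough expected-damage sketch, assuming the 3.5e two-handed version of Power Attack (trade -1 attack for +2 damage); the specific numbers are purely illustrative:

```python
# Expected damage with and without a Power Attack trade (3.5e-style: -1 attack
# for +2 damage with a two-handed weapon). The numbers are purely illustrative.

def expected_damage(hit_chance: float, damage: float) -> float:
    return hit_chance * damage

base_damage, trade = 12.0, 5  # trade 5 points of attack bonus for +10 damage

for hit_chance in (0.90, 0.60, 0.30):
    normal = expected_damage(hit_chance, base_damage)
    power = expected_damage(max(hit_chance - trade * 0.05, 0.05), base_damage + 2 * trade)
    print(f"hit {hit_chance:.0%}: normal {normal:.1f}, Power Attack {power:.1f}")
# Power Attack wins when you hit easily (10.8 vs 14.3 at 90%) and loses as your hit
# chance shrinks (3.6 vs 1.1 at 30%): attack mods beat damage mods near the margin.
```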

The same relationship holds for saving throws: as we approach 100% chance to save, save bonuses become proportionally more valuable. Our Save bonus is the attacker's hit penalty.

To demonstrate, imagine a failed Saving Throw is worth 10 damage.

If we're high level and I have a 90% chance of making the save, I can expect only 1 point of damage on average from this attack. If you have a 70% chance, you can expect 3 damage. We have a difference of only 20%/+4 on a d20, but you take 3x damage on average!

But if we're low level and I have a 30% chance of saving, I can expect 7 damage. If you have 10% chance, you can expect 9 damage. We have the same 20%/+4 difference as before, but now you only take 9/7 = 1.29x more damage on average.
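Here's the same arithmetic in a few lines, in case anyone wants to try other numbers (nothing here beyond the example above):

```python
# Expected damage from a "save or take 10 damage" effect at various save chances.
DAMAGE_ON_FAILED_SAVE = 10

def expected_damage(save_chance: float) -> float:
    return (1 - save_chance) * DAMAGE_ON_FAILED_SAVE

for good, bad in ((0.90, 0.70), (0.30, 0.10)):
    ratio = expected_damage(bad) / expected_damage(good)
    print(f"{good:.0%} vs {bad:.0%} save chance: "
          f"{expected_damage(good):.0f} vs {expected_damage(bad):.0f} damage ({ratio:.2f}x)")
# The same +4 gap is a 3.00x damage swing at high save chances, but only 1.29x at low ones.
```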

So, in contrast to Attack Bonuses, Save Bonuses should actually converge as characters level up (assuming we want the typical Save chance to increase with level; if we want to maintain a flat chance, class differences in Saves can stay constant with level).
It seems to me that the much more effective way to address this is to do what 4e did: attacker always rolls, "saving throw bonuses" become set as defense numbers, "Saving Throws" become a duration mechanic.

This has several add-on benefits, such as streamlining the system (there's no need to remember whether an action induces a save or requires an attack roll) and opening up better, easier design space (because now you can have a buff spell that adds +1 attack or +1 defense, and it works equally well for every character, regardless of how they go about their attacks).

5e already uses saving throws as both a "does it happen" mechanic and as a "how long does it last" mechanic, so in that sense, we can retain the generic "Saving Throw" as 4e did and very little would change. It also unifies Death Saves with other kinds of saves: they all become a mechanic that says whether a negative status lingers or not. And it opens design space in the shape of the "Disease Track", which can do some cool stuff and permits slowly-worsening conditions. (E.g. "Petrify" generally isn't instant: fail the first save and you're Slowed, fail the second and you're Dazed, and only with the third failure are you Petrified.) This lets the PCs do something about a Very Very Bad Condition rather than simply suffering the "rocket tag" effect as you describe.
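Something like this, roughly; the stage names, DC, and structure below are just my own illustration of the track idea, not 4e's actual rules:

```python
# Minimal sketch of a worsening condition track, in the spirit of the petrification
# example above. Stage names, the DC, and the structure are illustrative only.
import random

PETRIFY_TRACK = ["Unaffected", "Slowed", "Dazed", "Petrified"]

def run_track(save_bonus: int, dc: int = 15, track: list[str] = PETRIFY_TRACK) -> str:
    """Each failed save pushes the victim one stage further along the track;
    a successful save ends the effect before the final stage is reached."""
    stage = 0
    while stage < len(track) - 1:
        if random.randint(1, 20) + save_bonus >= dc:
            break  # the save succeeds and the condition stops progressing
        stage += 1
    return track[stage]

print(run_track(save_bonus=5))  # e.g. "Unaffected" or "Slowed", depending on the rolls
```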

Turning "does it happen" into Attack rolls against non-AC defenses, and keeping saves (including Death Saves) as a "how long does it last"/"does it get better/worse" mechanic, would address all of your concerns and make it a lot easier to do other design space. It wouldn't be as traditional as the "saving throws" we're used to....but other than 1e->2e, every edition has done saving throws differently, so "tradition" is a bit of a weak argument here.
 

It seems to me that the much more effective way to address this is to do what 4e did: attacker always rolls, "saving throw bonuses" become set as defense numbers, "Saving Throws" become a duration mechanic.

Yeah, absolutely: flipping the saving throw into an attack roll rather than a defense roll makes more sense with math where the chance of failing tends to increase with level and saves diverge into increasingly strong/weak ones (like 3e & 5e). I can certainly see why 4e did that. This way the person rolling has a sense of progression with level, just like attack rolls. However, this leads to caster dominance and rocket tag without extensive modification to high-level spells/special abilities.

Personally, I'm not very attached to the old school saves themselves (the particular categories or exact numbers). I am attached to the classic spells/special abilities and how powerful they feel. Rolling a save or die, even if you only have a 15% chance of failing, is a tense roll! Similar odds to Russian Roulette :) So I prefer the old school save math (the general shape of it) simply because it works better with classic AD&D spells/SAs.
 

When the 2e Saving Throw tables were converted to Save Bonuses in 3e, the same pattern of diverging progressions was followed. E.g. a 1st level Cleric starts with a 2 point bonus to Will saves over Fighters, which grows to a 6 point difference at 20 (and more in practice, as the Cleric pumps their Wisdom). This was a mistake.

This was trivially fixed in 3E by using a standard progression table and making the classes give a +2 class bonus.
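For instance, something like this (my own sketch of the idea, not anything out of an actual 3E book; the level // 3 base is roughly 3E's poor-save progression):

```python
# Sketch of one shared save progression plus a flat +2 class bonus for "good" saves,
# instead of 3E's separate good/poor tables. Names and numbers are illustrative.

def base_save(level: int) -> int:
    """One progression for everyone (roughly 3E's poor save: level // 3)."""
    return level // 3

def save_bonus(level: int, ability_mod: int, good_save: bool) -> int:
    return base_save(level) + ability_mod + (2 if good_save else 0)

# The gap between a good and a poor save stays a constant 2 points (before ability
# scores), rather than growing from 2 at level 1 to 6 at level 20 as in 3E's tables.
print(save_bonus(1, 0, True), save_bonus(1, 0, False))    # 2 0
print(save_bonus(20, 0, True), save_bonus(20, 0, False))  # 8 6
```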
 

This was trivially fixed in 3E by using a standard progression table and making the classes give a +2 class bonus.
aka what 4e did.

The issue is simply that good saves continue to progress over time and bad saves do not... and so what starts out as maybe a 20% difference between the saves (say, +2 proficiency +2 ability vs. +0 at level 1 in 5e) can be like a 55-60% difference by the end (+6 proficiency +5 ability vs. a dumped -1 at level 20).

4e solved that by ensuring all the saving throw bonuses scaled together (or, to clarify, your saving defenses; saving throws were technically a different mechanic in 4e): you got an initial boost from your class (maybe a feat), but otherwise they all just got better at the same rate.
 
