D&D 5E Classes, and the structure of DPR

clearstream

(He, Him)
A very specific situation can be analyzed with probability distributions (including crits). However, the analysis could change significantly depending on the specific damage, hit rate and number-of-attacks parameters you use. IMO, that means that kind of analysis won't be generalized enough to be beneficial in regard to this question.
Ah well. Here is some of the argument on blur in another context. I've looked at it a few times since, and it is very efficient where a character has good AC to begin with. D&D 5E - Bladesinger - a criticism of its design [EDIT let's not get into bladesingers though! This is linked only to share with you where some previous discussion landed.]

This is only true if you are talking relatively. Absolute matters more given the number of hits we are talking about potentially receiving. +1 AC will cause the same number of hits to miss at AC 20 as at AC 10.
Another way to model this is to consider the reduction in the number of hits you are taking. If your AC is such that monsters hit you on, say, 18+, then gaining +1 means that for every three hits you were going to take, you instead take two - an improvement of a third, or 33%. Generally, it is more important to model defenses as a reduction in incoming damage than as a change to a base value, because it is the reduction in incoming damage that will be experienced at the table.

One character in my current campaign is an SnB battlemaster. The player isn't overly concerned with optimisation, but they did take commander's strike, which they can use to give the party rogue a second sneak attack each round. The battlemaster simply stands next to a target, and it will be in for two turns of sneak attack damage a round (presently +6d6 per sneak attack). The battlemaster has chainmail, a shield +1, and defense, for 20 AC. The character's weakness - as a dwarf - is their move.

In a recent encounter - in ToA - the monsters had +6 to hit, and there were nine of them with two attacks each. For every eight hits he would take at AC 19, he takes seven at AC 20. An improvement of 12.5%. Their damage was about 10 a hit, and due to specifics of ToA he has 144hp. On average it would take 41 of their attacks for them to kill him at AC 20, and 36 at AC 19. Again around a 12% improvement. More tellingly, if they dog-piled him they will on average kill him in two rounds, at AC 19, but will need three rounds at AC 20. If he had plate, the benefits of defense would be even more pronounced (about 17%, instead).
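
For anyone who wants to check that arithmetic, here is a minimal sketch. It assumes the figures above - +6 to hit, roughly 10 damage per hit, 144 hp - and ignores crits and damage variance, as the example does.

Code:
# Expected attacks needed to drop a 144 hp character, for attackers with
# +6 to hit doing about 10 damage per hit. Crits and variance are ignored.

def hit_chance(attack_bonus, ac):
    # Count of d20 faces that hit: roll + bonus >= AC, with a natural 1
    # always missing and a natural 20 always hitting.
    faces = 21 - (ac - attack_bonus)
    return max(1, min(19, faces)) / 20

def attacks_to_drop(hp, dmg_per_hit, attack_bonus, ac):
    return hp / (dmg_per_hit * hit_chance(attack_bonus, ac))

for ac in (19, 20, 21, 22):
    print(ac, round(attacks_to_drop(144, 10, 6, ac), 1))
# AC 19 -> 36.0, AC 20 -> 41.1, AC 21 -> 48.0, AC 22 -> 57.6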

These kinds of considerations are not obvious, and I guess bring me back to my theme that defenses are typically undervalued.
 


Exactly, that's why I think it is best to think of this as the baseline or reference structure. And - crediting other respondents - it is very important to call out that there is diversity. Think about it like this:
  1. The tiers are meaningful in 5E. Tier 2 characters will see a step (or step-like) increase in mechanical power over tier 1 characters, and tier 3 over tier 2, and so on.
  2. The simplest place this is visible is in the fighter's Extra Attack and the warlock's Eldritch Blast.
  3. Each class has been consciously designed to have its own approach, but the approaches can be simplified to - A) I attack many times, B) I attack fewer times but do more damage, C) I attack and also defend. Fighters are A. Rogues are B. Rangers are C. Note that 'defend' is where you see even more diversity because it often spills over into explore (e.g. ability to hide).
  4. This simplifies the task of design as it means a designer knows the approach a class is using, and they know how much power to build into its features at each level. You can observe this most easily I think in the designs for sub-classes such as fey wanderer, if you compare its features to other more straightforward classes.
The designers introduce enormous variation as offsets from those backbones, but the intents always ring through. Look at every ranger 11th-level feature from PHB, XGE and TCoE and compare those to every fighter 11th-level feature. Cost the barbarian d12 HD in ASIs, and consider that against the value of spell slots and spell levels. You will see the underlying skeleton bringing cohesion everywhere!
How can I get you to do a full analysis on all this? This is amazing!
 

HammerMan

Legend
You may have struck on the main MATH problem of 5e: not all classes match for damage, and the classes that do are not even when it comes to out-of-combat abilities...

Wizards (any caster really, but let's stick to basics) get upgrades to cantrips at 5, 11 and 17, so at will they keep up, but are a bit weaker than most martial classes... BUT each spell level is a HUGE boost of damage, and once we are talking 5th-11th level they have enough slots to not need those cantrips... BUT they also have access to every movement/social/exploration ability in one way or another with those slots...
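
For reference, the cantrip scaling being described - an extra damage die at levels 5, 11 and 17 - looks like this, using fire bolt (d10) as a stand-in example.

Code:
# Damage dice for an attack cantrip by character level (extra die at 5, 11, 17).
# Fire bolt (d10, average 5.5) as the example.

def cantrip_dice(level):
    return 1 + sum(level >= step for step in (5, 11, 17))

for level in (1, 5, 11, 17):
    dice = cantrip_dice(level)
    print(f"level {level}: {dice}d10 (avg {dice * 5.5})")
# level 1: 1d10 (5.5), level 5: 2d10 (11.0), level 11: 3d10 (16.5), level 17: 4d10 (22.0)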
 

This is only true if you assume there is never any advantage or disadvantage and everything is a straight roll with no modifiers.

Even when this is the case, when the number of hits is already low, this dramatically changes the damage taken. For example, going from a 19 required to hit to a 20 required to hit will cut the number of hits in half and - counting the doubled dice on crits - cut the amount of damage dice rolled against you by a full third. That is the difference between 100 damage and 67 damage on a series of attack rolls (and it would be less than 67 if some of that damage is a flat bonus and not from dice).

If you put disadvantage on top of that +1 it cuts the number of hits by 98% and the damage dice taken by 97%. That is the difference between 100 damage and 3 damage.
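
A small sketch of where those figures come from, if it helps: a natural 20 always hits and doubles the dice, and under disadvantage both d20s have to clear the target number.

Code:
# Chance to be hit and expected damage dice per attack, counting crits as
# double dice. 'need' is the lowest d20 roll that hits.

from fractions import Fraction

def hit_and_dice(need, disadvantage=False):
    p_at_least = lambda n: Fraction(max(0, 21 - n), 20)
    power = 2 if disadvantage else 1           # both dice must clear the number
    p_hit = p_at_least(need) ** power
    p_crit = p_at_least(20) ** power
    dice = (p_hit - p_crit) + 2 * p_crit       # a crit rolls the dice twice
    return p_hit, dice

for need, dis in [(19, False), (20, False), (20, True)]:
    p_hit, dice = hit_and_dice(need, dis)
    print(need, dis, float(p_hit), float(dice))
# need 19: hit 0.10, dice 0.15; need 20: hit 0.05, dice 0.10 (a third fewer dice);
# need 20 with disadvantage: hit 0.0025, dice 0.005 (~98% and ~97% reductions vs need 19)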
I think both perspectives are relevant.

Under 5e's bounded accuracy, relative increases are the main factor if you are attacked often by creatures with low to-hit values (many low-level crossbowmen or something).
If you are to be perforated by many bolts, being hit by 20% to 25% less means the difference between life and death, since your effective hp increases.

If you are facing one heavy hitter, probably with a status effect that takes you out of combat, effective hp are irrelevant and you just have a 5% better chance not to be taken out of combat.
 

clearstream

(He, Him)
Disadvantage has the largest absolute effect on reducing hits when you are at 50% to be hit. Why? Because the difference between h and h^2 - that is, h*(1-h) - is maximized on the scale of 0 to 1 when h = 0.5. *h = chance to be hit
Hopefully you'll see my post above, dealing with your other concerns. On this one, I can give a simple example of why the picture is not quite as you have it.

Consider when I am only hit on a 20. With disadvantage both dice have to come up 20, and 20^2 = 400, so the reduction in hits is from 1:20 to 1:400.

To give an idea of what that means experientially, I run my campaign weekly and we probably miss a dozen or so sessions a year: call it 40 sessions per year. 'About 5 rounds' is a fair approximation for length of combats. They can be longer. Seldom shorter. Characters are high-tier 2 and a fighter like our dwarf is probably attacked at least ten times a combat, seeing as many foes have multiattack.

So it is quite plausible to say that over a year, a character in my campaign is going to be attacked 400 times. If they had sufficiently high AC and blur, they would go from being hit once each fortnight to being hit once each year.

Of course, characters don't have that level of AC, however the point stands - and is hopefully illustrated above - that applying disadvantage to attackers when you have a high AC is really stronger than applying disadvantage when you have a low AC. As can be understood by seeing that 10^2 = 100 and 20^2 = 400: if a creature hits a character on ten numbers out of twenty, disadvantage makes it 100 combinations out of 400; if they can hit a character on one number out of twenty, disadvantage makes it 1 combination out of 400.
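
A quick sketch of that relative effect: with disadvantage the attacker must clear the number on both d20s, so a chance to be hit of h becomes h*h, and the rarer the hit already is, the bigger the relative drop.

Code:
# Chance to be hit with and without disadvantage, for a given number of
# d20 faces that hit.

def hit_prob(hit_faces, disadvantage=False):
    h = hit_faces / 20
    return h * h if disadvantage else h

for faces in (10, 1):   # hit on ten numbers out of twenty, or only on a natural 20
    h, h_dis = hit_prob(faces), hit_prob(faces, True)
    print(faces, h, h_dis, f"-> x{h_dis / h:.2f} as many hits")
# 10 faces: 0.50 -> 0.25 (half as many hits)
#  1 face:  0.05 -> 0.0025 (a twentieth as many: 1 in 20 becomes 1 in 400)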
 

HammerMan

Legend
Hopefully you'll see my post above, dealing with your other concerns. On this one, I can give a simple example of why the picture is not quite as you have it.

Consider when I am only hit on a 20. With disadvantage both dice have to come up 20, and 20^2 = 400, so the reduction in hits is from 1:20 to 1:400.

To give an idea of what that means experientially, I run my campaign weekly and we probably miss a dozen or so sessions a year: call it 40 sessions per year. 'About 5 rounds' is a fair approximation for length of combats. They can be longer. Seldom shorter. Characters are high-tier 2 and a fighter like our dwarf is probably attacked at least ten times a combat, seeing as many foes have multiattack.

So it is quite plausible to say that over a year, a character in my campaign is going to be attacked 400 times. If they had sufficiently high AC and blur, they would go from being hit once each fortnight to being hit once each year.

Of course, characters don't have that level of AC, however the point stands - and is hopefully illustrated above - that applying disadvantage to attackers when you have a high AC is really stronger than applying disadvantage when you have a low AC. As can be understood by seeing that 10^2 = 100 and 20^2 = 400: if a creature hits a character on ten numbers out of twenty, disadvantage makes it 100 combinations out of 400; if they can hit a character on one number out of twenty, disadvantage makes it 1 combination out of 400.
We had a DM ask a player to lower their AC because it was so high that if the DM threw out a monster or whatever that could hit him on a 16, it would hit most of us on 7s-8s and one of us on a 5 (yes, the two PCs - neither was me - had an 11-point AC difference). That was one of the last times we played 3.5. It wasn't the worst AC difference either, but it was the time the DM just said "Look, I could give everyone 2-3 magic items, or you can do this... I see no reason to play the escalation game."
The PC gave his +2 ring of defense to the lower-AC player, and traded his armor with the DM for one that was +1 less but had fire and acid resistance... and still had the highest AC at the table...

In 4e I asked the swordmage and ranger to trade out weapon expertise, because in a game with a tactics warlord their attack bonuses were too high... both did without question, but both were in the above game too, and we all kinda learned not to power game too much way back on that day.

Right now we occasionally find cool synergies that power us up, but we try to keep the party mostly in parity... and we do so as PCs and DMs.
 

jgsugden

Legend
There is a lot of White Room work here with cracks in the paint showing different colors.

If you're going to do this type of analysis, I suggest you prepare by doing the following exercise. When you run your PCs, track effective damage per PC. Effective damage is damage dealt to an enemy that brings their hit points down toward 0, disregarding any damage that would take them below 0. Track it per round of combat. Be sure not to disregard any results when you do this - if your GWM user doesn't get to attack in the last round, or if they find that they're just wrapping up individual weenies rather than facing a big threat, don't ignore those situations. It is enlightening to really pay attention to the details here.

GWM: The utility of GWM is highly variable from game to game. If you're fighting beasts all the time, then it is great against those low ACs. If you're fighting high-AC enemies all the time, it can be nearly a waste of a feat. However, most people focus on DPR and assume a 'middle of the road' AC distribution. Even when they do so, they fail to account for several handicaps of GWM. First - overkill. If an enemy has 7 hp and you're doing either 2d6+5 or 2d6+15, GWM is not valuable to you - and people do not discount the DPR value of GWM attacks for the reality that they inflict more overkill than lower-damage attacks. Second, reducing your chance to hit increases the variability in combat - which opens the door to more bad-luck streaks. I've seen far too many big-weapon fighters go down due to a 'run of bad luck' that would not have been so bad had they not been using GWM.
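
To make the overkill point concrete, here is a rough Monte Carlo sketch. The +8 to hit, AC 14 and 7 hp target are made-up example numbers, and crits are ignored; it counts only effective damage, capped at the target's remaining hit points.

Code:
# Effective damage (capped at the target's 7 hp) from one greatsword attack,
# with and without the GWM -5/+10. Assumes +8 to hit vs AC 14; crits ignored.

import random

def effective_damage(power_attack, trials=100_000):
    total = 0
    for _ in range(trials):
        to_hit, dmg_bonus = (3, 15) if power_attack else (8, 5)
        if random.randint(1, 20) + to_hit >= 14:
            dmg = random.randint(1, 6) + random.randint(1, 6) + dmg_bonus
            total += min(dmg, 7)   # anything past 0 hp is wasted overkill
    return total / trials

print("without -5/+10:", round(effective_damage(False), 2))   # ~5.25
print("with -5/+10:   ", round(effective_damage(True), 2))    # ~3.50
# Both versions kill the 7 hp target on any hit, so the +10 adds nothing
# effective while the -5 reduces the chance of landing the hit at all.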

Monk Damage: Monks are not intended to stay up to speed with other classes in damage dealing. They're just not. They have other special abilities that allow them to be effective, like stun, but they're intended to deal less damage than other classes - when concentrating on one foe. Their ability to split up their damage into three or four pools early on will reduce the damage lost to overkill, and keep them effective combatants from a damage perspective, but they are not intended to be high-damage PCs.
 

FrogReaver

As long as i get to be the frog
Another way to model this is to consider the reduction in the number of hits you are taking. If your AC is such that monsters hit you on, say, 18+, then gaining +1 means that for every three hits you were going to take, you instead take two - an improvement of a third, or 33%. Generally, it is more important to model defenses as a reduction in incoming damage than as a change to a base value, because it is the reduction in incoming damage that will be experienced at the table.
That's the model I referred to when I said 'relative'. Having a relative 1000% increase sounds great - unless it's going from 1 penny to 10 pennies. Taking 33% fewer hits sounds great until you realize that in most circumstances it means taking around one fewer hit (or less).

IMO, it's not the relative reduction players experience at the table; it's the absolute reduction in a given encounter/adventuring day that they experience.

In a recent encounter - in ToA - the monsters had +6 to hit, and there were nine of them with two attacks each. For every eight hits he would take at AC 19, he takes seven at AC 20. An improvement of 12.5%. Their damage was about 10 a hit, and due to specifics of ToA he has 144hp. On average it would take 41 of their attacks for them to kill him at AC 20, and 36 at AC 19. Again around a 12% improvement.
There's some important context missing. How many times did he actually get attacked in that encounter? My guess is that it was significantly lower than 36+ attacks - probably only 20 attacks (and maybe not even that many). In which case he would on average save one hit's worth of damage (or less): one prevented hit from enemies which, you said, would have been for about 10 damage. Preventing 10 damage in a single encounter on a 144 hp character isn't very impressive IMO. Still better than GWF, but not particularly impressive.
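
As a quick check of that estimate (using the +6 to hit and ~10 damage per hit from the example):

Code:
# Expected damage prevented by +1 AC (19 -> 20) over a 20-attack encounter.
attacks, dmg_per_hit = 20, 10
p_hit_ac19 = 8 / 20    # +6 to hit needs a 13+ against AC 19
p_hit_ac20 = 7 / 20    # and a 14+ against AC 20
print(attacks * (p_hit_ac19 - p_hit_ac20) * dmg_per_hit)   # 10.0 - about one hit's worth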
 

FrogReaver

As long as i get to be the frog
Hopefully you'll see my post above, dealing with your other concerns. On this one, I can give a simple example of why the picture is not quite as you have it.

Consider when I am only hit on a 20. With disadvantage both dice have to come up 20, and 20^2 = 400, so the reduction in hits is from 1:20 to 1:400.

To give an idea of what that means experientially, I run my campaign weekly and we probably miss a dozen or so sessions a year: call it 40 sessions per year. 'About 5 rounds' is a fair approximation for length of combats. They can be longer. Seldom shorter. Characters are high-tier 2 and a fighter like our dwarf is probably attacked at least ten times a combat, seeing as many foes have multiattack.

So it is quite plausible to say that over a year, a character in my campaign is going to be attacked 400 times. If they had sufficiently high AC and blur, they would go from being hit once each fortnight to being hit once each year.
So it's one encounter of 5 rounds per session, in which he gets attacked 10 times. Even if you are only hit on a 20, that means you are being hit less than once per session (about 1 hit every 2 sessions). How valuable is it really to go from being hit once every 2 sessions to being hit once every 40 sessions (due to disadvantage)? I don't think it's valuable at all.

Now consider the scenario where you are being hit on an 11+ (50% of the time). That means you get hit 5 times per session, and instead you go to being hit 5 times per 2 sessions (with disadvantage). IMO that's particularly valuable.

Why? Because 5 hits a session means you are taking significant damage. Halving significant damage is a significant result. Taking 1/20th of very little damage still means you are taking very little damage.

Of course, characters don't have that level of AC, however the point stands - and is hopefully illustrated above - that applying disadvantage to attackers when you have a high AC is really stronger than applying disadvantage when you have a low AC. As can be understood by seeing that 10^2 = 100 and 20^2 = 400: if a creature hits a character on ten numbers out of twenty, disadvantage makes it 100 combinations out of 400; if they can hit a character on one number out of twenty, disadvantage makes it 1 combination out of 400.
I understand the math. I disagree with your analysis of what that math means in relation to D&D tactics, because you are missing the 'absolute' context for the number of hits and amount of damage you are preventing. Thus, based on the context I expanded on in this post (above), I suggest that applying disadvantage to enemy attacks is most beneficial when you are in the middle AC ranges (50% chance to be hit, give or take a bit).
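
The 'absolute' version of the same math, using the ten attacks per session from the discussion above: the number of hits actually prevented per session is h - h*h = h*(1-h), which peaks when h = 0.5.

Code:
# Hits per session prevented by attacker disadvantage, at various chances to be hit.
attacks_per_session = 10

for hit_faces in (1, 5, 10, 15):        # numbers on the d20 that hit
    h = hit_faces / 20
    prevented = (h - h * h) * attacks_per_session
    print(f"hit on {21 - hit_faces}+: {h * attacks_per_session:.1f} hits/session, "
          f"{prevented:.2f} prevented by disadvantage")
# hit only on 20: 0.5 hits/session, ~0.48 prevented
# hit on 11+:     5.0 hits/session, 2.50 prevented  <- the largest absolute saving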
 

FrogReaver

As long as i get to be the frog
I think both perspectives are relevant.

Under 5e's bounded accuracy, relative increases are the main factor if you are attacked often by creatures with low to-hit values (many low-level crossbowmen or something).
If you are to be perforated by many bolts, being hit by 20% to 25% less means the difference between life and death, since your effective hp increases.

If you are facing one heavy hitter, probably with a status effect that takes you out of combat, effective hp are irrelevant and you just have a 5% better chance not to be taken out of combat.
Agree with much of this. I'd just add that PCs aren't dropping to 0 hp all that often, and if they aren't dropping to zero very often, then extra defense (AC or disadvantage on enemy attacks) is largely meaningless. It doesn't matter if you buff up to being able to take 1,000,000 eHP of damage if you could take 200 eHP before and were never (or rarely) dropped to 0 at that eHP level.
 

clearstream

(He, Him)
IMO, it's not the relative reduction players experience at the table; it's the absolute reduction in a given encounter/adventuring day that they experience.
For me it's more how it enables a character to be played. I find that typically in RPG, stacking toward extremes has a bigger impact on that than ameliorating middling capabilities.

There's some important context missing. How many times did he actually get attacked in that encounter? My guess is that it was significantly lower than 36+ attacks - probably only 20 attacks (and maybe not even that many). In which case he would on average save one hit's worth of damage (or less): one prevented hit from enemies which, you said, would have been for about 10 damage. Preventing 10 damage in a single encounter on a 144 hp character isn't very impressive IMO. Still better than GWF, but not particularly impressive.
I found in some previous discussions on probabilities that it helped people to frame them in terms of how often they would arise over their game sessions. The most apposite inquiry is - over how many sessions will a character likely be subject to 20 versus 400 attacks in your own campaign?
 

clearstream

(He, Him)
I understand the math. I disagree with your analysis of what that math means in relation to D&D tactics, because you are missing the 'absolute' context for the number of hits and amount of damage you are preventing. Thus, based on the context I expanded on in this post (above), I suggest that applying disadvantage to enemy attacks is most beneficial when you are in the middle AC ranges (50% chance to be hit, give or take a bit).
Possibly we agree that it is most importantly about how it enables a character to be played. I argue that - "typically in RPG, stacking toward extremes has a bigger impact on that than ameliorating middling capabilities." What do you find?
 

FrogReaver

As long as i get to be the frog
Possibly we agree that it is most importantly about how it enables a character to be played. I argue that - "typically in RPG, stacking toward extremes has a bigger impact on that than ameliorating middling capabilities." What do you find?
If you are working under the premise of enabling a playstyle then I am working under the premise of efficiency.

I agree with you that having a fairly high AC and adding more, or adding disadvantage, would create a somewhat different playstyle, whereas adding more AC or disadvantage to a middling AC wouldn't change your playstyle significantly.
 

FrogReaver

As long as i get to be the frog
GWM: The utility of GWM is highly variable from game to game. If you're fighting beasts all the time, then it is great against those low ACs. If you're fighting high-AC enemies all the time, it can be nearly a waste of a feat. However, most people focus on DPR and assume a 'middle of the road' AC distribution. Even when they do so, they fail to account for several handicaps of GWM. First - overkill. If an enemy has 7 hp and you're doing either 2d6+5 or 2d6+15, GWM is not valuable to you - and people do not discount the DPR value of GWM attacks for the reality that they inflict more overkill than lower-damage attacks. Second, reducing your chance to hit increases the variability in combat - which opens the door to more bad-luck streaks. I've seen far too many big-weapon fighters go down due to a 'run of bad luck' that would not have been so bad had they not been using GWM.
GWM has an often overlooked mechanic as well - when you drop an enemy to 0 hp you get a bonus action attack. So while the -5/+10 isn't good on enemies with low hp, that part of the feat is exceptionally good in that situation.

Monk Damage: Monks are not intended to stay up to speed with other classes in damage dealing. They're just not. They have other special abilities that allow them to be effective, like stun, but they're intended to deal less damage than other classes - when concentrating on one foe. Their ability to split up their damage into three or four pools early on will reduce the damage lost to overkill, and keep them effective combatants from a damage perspective, but they are not intended to be high-damage PCs.
Monks tend to be able to apply debuffs that increase party damage while still doing moderate damage themselves. Those debuffs never seem to get attributed to the monk's DPR. Which is a general tendency you'll see from many - ignoring the impact of something that isn't easy to calculate.
 

clearstream

(He, Him)
GWM has an often overlooked mechanic as well - when you drop an enemy to 0 hp you get a bonus action attack. So while the -5/+10 isn't good on enemies with low hp, that part of the feat is exceptionally good in that situation.
I noticed that, too. It's a significant factor if contemplating an argument that suggests GWM is 'wasting' damage by overkilling creatures.

Monks tend to be able to apply debuffs that increase party damage while still doing moderate damage themselves. Those debuffs never seem to get attributed to the monk's DPR. Which is a general tendency you'll see from many - ignoring the impact of something that isn't easy to calculate.
I also notice this. There is a general tendency to ignore complexities, and this is another good example. One thing I do in my theorycrafting is use a VTT to playtest cases, to see what I am overlooking. I also run a regular campaign for six players, but the focused playtests let me set up very specific cases, which my campaign might only rarely see.
 



clearstream

(He, Him)
How can I get you to do a full analysis on all this? This is amazing!
Are you familiar with the MtG mechanical colour pie, explained here by Rosewater? From breaking down the distribution of features over classes, it feels like something similar is going on.

Each class has a role, which appears to be articulated in terms of where it gains what (e.g. in some cases the norm for a subclass is to gain a defensive feature at level x, and additional damage at level y). As noted, each class opts into one or other basic approach to combat, on a skeleton of HD, ASIs, subclassing levels, DPR steps, and spellcasting.

I'm working on a sheet to lay this out visually so that it is easy to see.
 

Are you familiar with the MtG mechanical colour pie, explained here by Rosewater? From breaking down the distribution of features over classes, it feels like something similar is going on.

Each class has a role, which appears to be articulated in terms of where it gains what (e.g. in some cases the norm for a subclass is to gain a defensive feature at level x, and additional damage at level y). As noted, each class opts into one or other basic approach to combat, on a skeleton of HD, ASIs, subclassing levels, DPR steps, and spellcasting.

I'm working on a sheet to lay this out visually so that it is easy to see.
Amazing man, can't wait to see it!
 

clearstream

(He, Him)
Over four posts I'll present my class deconstruction, with a few notes as to what I think is going on (by design intent or otherwise). I focus on mechanics and - predominantly - combat. I've divided the deconstruction into -
  1. Sustain (aka defenses) - class features that keep a character in the combat
  2. Offence - broadly speaking, D&D combat is concluded by decrementing foe hit points
  3. Exploration - this covers making progress in the wider game world, including both explore and social pillars
  4. Sub-classes - this looks at how sub-classes map to classes
The yardstick I use to evaluate features is an ASI. Half an ASI is 1pt, a whole ASI is 2pts, and a double ASI is 4pts. This is approximate, somewhat opinionated, and in places I suspect the design intent was a feature at more or less value than what is actually on offer. Speaking of design intent - a fundamental assumption is that classes represent intentional design - not chance or accident, but it is not supposed that the designers aimed (or even could have hoped) for perfect balance. Hypothetically, the classes will be roughly balanced in value, i.e. worth a similar number of ASIs, by intent.
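
To make the yardstick concrete, here is a trivial sketch of the scoring. The feature names and values below are hypothetical placeholders, not the ones used in the sheets that follow.

Code:
# ASI-denominated costing: half an ASI = 1pt, a whole ASI = 2pts, a double ASI = 4pts.
POINTS_PER_ASI = 2

# Hypothetical example entries - the real values are in the sheets.
example_sustain_features = {
    "hit die upgrade": 0.5,        # feature judged worth half an ASI
    "some defensive feature": 1.0,
    "some capstone defense": 2.0,
}

total_asis = sum(example_sustain_features.values())
print(f"{total_asis} ASIs -> {total_asis * POINTS_PER_ASI} pts")   # 3.5 ASIs -> 7.0 pts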

Starting here then, with sustain -

[Image: Class deconstruction - sustain.png]


Patterns of note
  • Taking a d6 hit die to be free, d8, d10 and d12 are costed against the Tough feat, taking into account that they provide healing - via spending HD in rests - as well as hit point maximum
  • Bard, cleric, druid and warlock are all what I think of as d8-caster classes, and they are united in other important ways also; they invest 16-19pts in sustain
  • Sorcerer and wizard make the minimum possible investment into sustain, preserving their points (ASIs) for other things; they invest 4-5pts in sustain
  • The other classes all make moderate investments in sustain - 22-33pts - with monks and rogues at the lower end (I might be undervaluing uncanny dodge and evasion)
  • Barbarian stands out with 54pts invested in sustain; notice that red 'relentless' - more on that later.
So this is post one. Three to go.

[Footnote: I stop at 11th because I did this work for my upcoming 5E E6 campaign, which caps at level 6+5. If this work proves fruitful I might complete tier 3 later.]
 
