D&D 5E (2014) Tidbit for monster design

Goodness. I had meant to reply some time back but I must have gotten distracted by something along the way. Sorry about that.

For a while, my thinking was like yours. I assumed WotC was calculating monster XP values and then using those to assign CRs, but now I'm not so sure. My uncertainty comes from looking back at monsters from the D&D Next playtest.

Throughout many of the early D&D Next playtest packets, monsters were assigned levels as well as XP values. Importantly, those XP values weren't fixed within a given level: one level 1 monster might be worth 100 XP and another might be worth 150 XP. In essence, this could give better resolution, i.e., more data points, for working out how WotC calculates XP.

Now, from reading through the playtest packet release notes it's clear that the XP formula changed a few times over the course of the playtest. But my hope/expectation was that the formula would converge towards my calculation over time. That's not quite what the data showed.

To cut to the chase, the figure below summarizes my findings. It plots monster XP divided by the product of monster HP and DPR against the sum of their armor class and attack bonus (averaged by CR or level). In terms of my XP equation, it shows how the part in parentheses changed over time.
[Figure: XP / (HP x DPR) versus AC + attack bonus, averaged by CR or level, for playtest packets PT4-PT6, the DMG's default monster stats, and published 5e monsters.]
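For anyone who wants to reproduce the plot from a monster table, the plotted quantity is straightforward to compute. The sketch below assumes a CSV with hypothetical column names (cr, xp, hp, dpr, ac, attack_bonus) and a placeholder file name; adjust to whatever your data actually uses.

```python
# Sketch of the plotted quantity, assuming a monster table saved as a CSV with
# hypothetical column names: cr, xp, hp, dpr, ac, attack_bonus.
import pandas as pd

monsters = pd.read_csv("monsters.csv")  # placeholder file name

# Per monster: XP divided by the product of HP and DPR, and AC + attack bonus.
monsters["xp_ratio"] = monsters["xp"] / (monsters["hp"] * monsters["dpr"])
monsters["ac_plus_ab"] = monsters["ac"] + monsters["attack_bonus"]

# Average both within each CR (or level), then plot one against the other.
by_cr = monsters.groupby("cr")[["xp_ratio", "ac_plus_ab"]].mean()
print(by_cr)
```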

In the earliest playtest I have data for, labeled PT4 in the above plot, the sensitivity to changes in armor class and attack bonus was about twice as strong as it ought to be. That got worse in PT5, but it improved in PT6. The default monster stats in the DMG improved on this again, with a final sensitivity that was very close to my theoretically derived values. This is what I was expecting to see.

However, much to my surprise, when I looked at published monsters in 5e the trend was very different. For official 5e monsters below CR 20 there is virtually zero sensitivity to changes in armor class and attack bonuses. At least, on average across CRs (within each CR the sensitivity still exists). Then suddenly, above CR 20, the expected sensitivity from the DMG shows back up.

It took some time to wrap my head around what was happening, but it looks like they changed their formula to use relative scaling instead of absolute scaling. In my formula, as well as in the earlier playtests, the average adjustment for armor class and attack bonus increased with monster level/challenge rating because the baseline values of those stats also increase. However, for official 5e monsters this value only increases when the monster's stats are stronger than intended for their CR, which is based around a 65% chance of hitting or being hit when facing level appropriate PCs. The reason the expected scaling returns for CR 20+ monsters is because PCs stop at level 20.
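To make the distinction concrete, here is a rough sketch of the two scalings. The numbers and the expected-AC function are purely illustrative stand-ins, not the DMG table or whatever WotC actually uses; the point is only the shape of the two adjustments.

```python
# Illustrative contrast between absolute and relative scaling of the AC
# adjustment (the attack-bonus adjustment works the same way).

FIXED_BASELINE_AC = 13  # absolute scaling: one reference value for all monsters

def expected_ac_for_cr(cr):
    # Stand-in for a per-CR expected AC (e.g. a DMG-style table); these are
    # made-up values that simply increase with CR.
    return 13 + cr // 4

def absolute_adjustment(ac):
    # Grows with CR on average, because published AC grows with CR.
    return 0.65 + 0.05 * (ac - FIXED_BASELINE_AC)

def relative_adjustment(ac, cr):
    # Stays at 0.65 on average across CRs, because the reference grows too;
    # only monsters over- or under-statted for their CR move away from 0.65.
    return 0.65 + 0.05 * (ac - expected_ac_for_cr(cr))
```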

This kind of relative scaling strongly suggests they're using a system, not unlike the one outlined in the DMG, to determine monster CR rather than calculating XP values directly and then assigning a CR based on that. Such an approach isn't inherently bad, but it does have side effects. For example, low CR monsters will be weaker than expected based on their XP values, especially when pitted against higher level PCs.

This is actually something that's come up in my analysis for some time now, but without any good explanation as to why until now.
 


For a while, my thinking was like yours. I assumed WotC was calculating monster XP values and then using those to assign CRs, but now I'm not so sure. My uncertainty comes from looking back at monsters from the D&D Next playtest.

Looks like we both ended up reversing our interpretations, lol!

It took some time to wrap my head around what was happening, but it looks like they changed their formula to use relative scaling instead of absolute scaling. In my formula, as well as in the earlier playtests, the average adjustment for armor class and attack bonus increased with monster level/challenge rating because the baseline values of those stats also increase. However, for official 5e monsters this value only increases when the monster's stats are stronger than intended for their CR, which is based around a 65% chance of hitting or being hit when facing level appropriate PCs. The reason the expected scaling returns for CR 20+ monsters is because PCs stop at level 20.

This kind of relative scaling strongly suggests they're using a system, not unlike the one outlined in the DMG, to determine monster CR rather than calculating XP values directly and then assigning a CR based on that. Such an approach isn't inherently bad, but it does have side effects. For example, low CR monsters will be weaker than expected based on their XP values, especially when pitted against higher level PCs.

This is actually something that's come up in my analysis for some time now, but without any good explanation as to why until now.

I have a few thoughts.

My first thought is that I'm not ready to abandon XP-based CR assignment yet, because with my basic test monsters the accuracy of the results (using any version of the XP-based approach I've tried) blows everything related to the DMG method out of the water. Of the four main formulas I've played with (your exponential calculation, your linear calculation, your linear calculation with -1 instead of -2, and my calculation that I describe below), one gives the wrong CR for 2 of the 45 monsters, two give the wrong CR for only 1, and one gives perfect results. The DMG guidelines give the wrong result 18 times, and my revision of them 15 times. That's not even in the same ballpark.
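For what it's worth, here's the shape of the test I'm running, as a sketch. The xp_formula argument stands in for whichever calculation is being tested, and the XP-to-CR mapping (highest CR whose standard XP value the computed XP reaches) is my own assumption rather than anything confirmed.

```python
# Sketch of the accuracy test: compute XP for each test monster, map the result
# to a CR, and count mismatches against the published CR.

# Standard XP values for the lower CRs (2014 rules); extend as needed.
XP_BY_CR = {0: 10, 0.125: 25, 0.25: 50, 0.5: 100, 1: 200, 2: 450,
            3: 700, 4: 1100, 5: 1800, 6: 2300, 7: 2900, 8: 3900}

def xp_to_cr(xp):
    # Assumed mapping: the highest CR whose standard XP value the estimate reaches.
    reached = [cr for cr, value in XP_BY_CR.items() if xp >= value]
    return max(reached) if reached else 0

def count_mismatches(monsters, xp_formula):
    """monsters: list of dicts with hp, ac, dpr, ab, and the published cr."""
    wrong = 0
    for m in monsters:
        predicted = xp_to_cr(xp_formula(m["hp"], m["ac"], m["dpr"], m["ab"]))
        if predicted != m["cr"]:
            wrong += 1
    return wrong
```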

My second thought is actually a set of questions about some of the methodology I read on your site a while ago, which I need some clarification on. You're clearly very aware that the monster calculations take account of modifiers to effective AC, HP, attack bonus, and DPR, but what I wasn't clear on is whether you applied all of those factors to the monster stats before running them through the calculations. At one point it seemed like you were acknowledging those factors mattered but not really calculating them, because the results were pretty close even without them, though I may very well have been misinterpreting what you were saying. But this matters enough that it might explain why the absolute scale works with your math at lower levels (where these factors are less common) but not at higher levels (where they are ubiquitous and an important part of a monster's features).

As one really simple example of how this could matter, take saving throw proficiency. The DMG says to give +2 or +4 to effective AC if a monster has a certain number of proficiencies, which works okay at mid levels but less well at low or high levels (something I noticed about the math in the DMG in general). The spreadsheet they actually use, which I posted a messy reconstruction of from screenshots back on page 2, includes "Saving Throw Bonuses" as an input field. Whether that phrasing means you're supposed to tally the actual bonuses (i.e., number of save proficiencies times the expected proficiency bonus) or just enter the number of proficiencies, it's likely that their formula takes the tallied bonuses, divides them by 6, and adds the result to effective AC, because that's how it should work mathematically. A low-CR creature with 2 save proficiencies (like the banshee) should add only 0.67 to its effective AC, but a high-CR creature with 4 save proficiencies (like the empyrean) should be adding 4.67 points of effective AC. That's a pretty big deal at high levels (and a pretty big difference from low levels).
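Just to show the arithmetic behind those two numbers (using the standard proficiency bonuses for those CRs, +2 at CR 4 and +7 at CR 23), the divide-by-6 conjecture looks like this:

```python
# The divide-by-6 conjecture from above: tally the actual save bonuses, then
# spread them across the six ability scores.

def effective_ac_bonus_from_saves(num_save_proficiencies, proficiency_bonus):
    return (num_save_proficiencies * proficiency_bonus) / 6

print(effective_ac_bonus_from_saves(2, 2))  # banshee (CR 4):   4/6  ~ 0.67
print(effective_ac_bonus_from_saves(4, 7))  # empyrean (CR 23): 28/6 ~ 4.67
```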

And that's a calculation where we can guess pretty closely what should be done, and potentially isolate variables and test it. There are other calculations for which I don't have enough information to even determine how we should do them. For instance, we know from Jeremy Crawford (and the tidbits in the accidentally released Adventurers League doc a few years ago, which was apparently for their business partners rather than consumers, since it makes references to the spreadsheet and instructions on how to use it) that conditions count as effective damage. Jeremy Crawford tells us a bit about how to determine this, and gives us the example that scorching ray is the damage value they use for turn-denial conditions. However, we don't have any other hard identifications of how much effective damage other conditions count as.

We also don't know exactly how they handled secondary damage, but (and this is a complete aside) from the entry fields on the spreadsheet it looks like they did it wrong. To accurately value the wyvern's poison damage, which only takes effect after you are hit and then fail a save, we have to multiply its damage value by the expected chance of failing the save, which on some monsters will be quite different from the expected chance of being hit by the delivering attack. 2024 monsters don't have to account for that, since their secondary damage always applies, but 2014 ones do, and unless the full usage instructions for the spreadsheet tell people how to calculate that before entering the data (which I find unlikely, unless they say to just multiply secondary damage by 0.65 and hope it comes out close enough), their results entirely omit that critical step.
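In case it's useful, here's the expected-value calculation I mean, with made-up numbers rather than the wyvern's actual stat block (an attack that deals 10 on a hit plus 21 more if the target then fails a save):

```python
# Expected damage for an attack with a save-gated rider: the rider only matters
# on a hit, and then only on a failed save.

def expected_attack_damage(primary_avg, secondary_avg, hit_chance, save_fail_chance):
    return hit_chance * (primary_avg + save_fail_chance * secondary_avg)

# The "multiply by 0.65 and hope" shortcut treats the save-fail chance as if it
# were the same as the hit chance:
print(expected_attack_damage(10, 21, 0.65, 0.65))  # ~15.4

# Against PCs whose save bonus makes the fail chance 0.45 instead:
print(expected_attack_damage(10, 21, 0.65, 0.45))  # ~12.6
```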

Back from the aside, the point I was trying to get at is that I'm not sure we can determine how well any of our formulas work for non-basic monsters without understanding how these other variables apply. That's why I've limited myself purely to the 45 basic monsters that I know are unaffected by anything beyond actual AC, HP, attack bonus, and DPR, and that have no secondary damage attacks or imposed conditions. I even left out monsters that could fly, just in case it somehow matters.

Anyway, trying to rein in my excitement to talk about all these variables and get back on task... were you able to include any of that in your comparisons of monsters to your projections? And if you were able to determine how any of it should apply, I'd be very grateful if you could share some of those rules, because I haven't even started on them yet.

My third thought is the formula I came up with that I said I'd share earlier. When I was messing around trying to find a formula to match all of the basic monsters, I tried this:

XP = ((HP / 4) / (0.65 + 0.05 * (13 - AC))) * ((DPR / 0.65) * (0.65 + 0.05 * (AB - 3)))

The results for that one were as good as the exponential calculation: they both failed on only 1 monster. In this case it was the stegosaurus, which is not the same monster the exponential one failed on (that one doesn't work on the cyclops) but is one of the ones the linear version fails on. I'm a little concerned about how it might work at higher levels because it seems to have a shallower slope than the other formulas, but it's possible the standard values I chose (AC 13, AB 3) are slightly off. However (and forgive my messy notation; math is not my area), you should be able to see what I'm doing with it. It doesn't introduce any approximations; it just applies the AC-based modification to HP and the AB-based modification to DPR individually before multiplying HP and DPR together. I just now tested it by tweaking the AC and AB numbers. None of the tweaks (I tried bumping each value one point up or down in various combinations) made it work perfectly, but half of them gave me the same results: only the stegosaurus was off (the others gave a higher number of bad results, some of them by quite a bit).
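Written out as code, with made-up stats just to show it running (AC 15, HP 60, +5 to hit, 20 DPR; not any particular monster):

```python
# Direct transcription of the formula above; base_ac and base_ab are the
# standard values it assumes (AC 13, attack bonus +3).

def xp_estimate(hp, ac, dpr, ab, base_ac=13, base_ab=3):
    effective_hp = (hp / 4) / (0.65 + 0.05 * (base_ac - ac))
    effective_dpr = (dpr / 0.65) * (0.65 + 0.05 * (ab - base_ab))
    return effective_hp * effective_dpr

print(xp_estimate(hp=60, ac=15, dpr=20, ab=5))  # ~629
```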

I had started to think that perhaps applying relative scaling here could make it work. Since there is no field in the spreadsheet for entering CR, we know the formula doesn't rely on you knowing it ahead of time (though you have to guess for certain portions, like damage resistance). But it's quite possible that the formula starts by multiplying the raw HP and DPR without modifying for AC and AB, and then uses that result to pick a scaling tier, which in turn decides which AC and AB baselines it uses. That's easy to do in a spreadsheet. However, increasing the values as you go up, as per the DMG, actually makes the problem worse rather than better with my formula. You'd have to adjust them in the opposite direction from the DMG to fix these values, which seems very odd.
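Something like this, mechanically speaking; the tier boundaries and baselines below are pure placeholders, not values from the DMG or the spreadsheet:

```python
# Pick a tier from raw HP * DPR first, then let the tier decide which AC/AB
# baselines feed the adjustment. All numbers here are illustrative placeholders.

TIERS = [
    # (raw HP*DPR upper bound, baseline AC, baseline attack bonus)
    (500,          13, 3),
    (2000,         14, 5),
    (8000,         16, 7),
    (float("inf"), 18, 9),
]

def baselines_for(hp, dpr):
    raw = hp * dpr
    for upper_bound, base_ac, base_ab in TIERS:
        if raw <= upper_bound:
            return base_ac, base_ab

# These baselines would then replace the fixed AC 13 / +3 in the formula above.
```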

A couple of other thoughts occurred to me while messing with the formulas as I was writing this. One is that the stegosaurus stands out as a problem. The only formulas that can deal with it are your exponential one (which has a steeper slope, and comes out too high on the cyclops) and your linear one with a -1 rather than -2 (which somehow magically works for everything, even though, if I understand correctly, it relies on two approximations in its derivation). Every other formula fails on it. I'm starting to believe that maybe a mistake was made in the data entry for the stegosaurus when they were making Volo's Guide. In this kind of analysis I'm loath to dismiss something as an anomaly, since it could just as easily be evidence that the formulas are all wrong, but I'm at least considering the possibility. If the developer had entered a 36 instead of a 26, or a 9 instead of a 7, or a 15 instead of a 13, or a 96 instead of a 76, it would give us the listed CR 4. Going forward, I may flag the stegosaurus as a potential error in the published math and test out formulas that work on everything except the stegosaurus to see how they hold up with more complicated monsters.
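Here's a quick way to check that single-entry hypothesis with the formula from above, using the stegosaurus's printed numbers from the post (AC 13, HP 76, +7 to hit, 26 damage per round, listed CR 4) and swapping exactly one entered value at a time. With the standard 1,100 XP value for CR 4, the printed stats come out just short, while each substitution pushes the estimate over the line:

```python
# Check of the single-entry typo hypothesis: recompute the XP estimate with the
# printed stegosaurus stats and with each candidate substitution.

def xp_estimate(hp, ac, dpr, ab, base_ac=13, base_ab=3):
    return ((hp / 4) / (0.65 + 0.05 * (base_ac - ac))) * \
           ((dpr / 0.65) * (0.65 + 0.05 * (ab - base_ab)))

candidates = {
    "as printed":    dict(hp=76, ac=13, dpr=26, ab=7),   # ~994
    "dpr 26 -> 36":  dict(hp=76, ac=13, dpr=36, ab=7),   # ~1376
    "ab +7 -> +9":   dict(hp=76, ac=13, dpr=26, ab=9),   # ~1111
    "ac 13 -> 15":   dict(hp=76, ac=15, dpr=26, ab=7),   # ~1175
    "hp 76 -> 96":   dict(hp=96, ac=13, dpr=26, ab=7),   # ~1255
}
for label, stats in candidates.items():
    print(label, round(xp_estimate(**stats)))
```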

I have already forgotten my other thought, but if you could fill me in on whether and how you included the variables that can adjust effective AC/AB/HP/DPR in your analysis of the fit of the MM monsters against your projections, that would help out a lot.

Thanks!
 
