Goodness. I had meant to reply some time back but I must have gotten distracted by something along the way. Sorry about that.
For a while, my thinking was like yours. I assumed WotC was calculating monster XP values and then using those to assign CRs, but now I'm not so sure. My uncertainty comes from looking back on monsters from the DnD Next playtest.
Throughout many of the early DnD Next playtest packets, monsters were assigned levels as well as XP values. Importantly, those XP values weren't fixed within a given level: one level 1 monster might be worth 100 XP and another 150 XP. In essence, this gives better resolution, i.e., more data points, for working out how WotC calculates XP.
Now, from reading through the playtest packet release notes it's clear that the XP formula changed a few times over the course of the playtest. But my hope/expectation was that the formula would converge towards my calculation over time. That's not quite what the data showed.
To cut to the chase, the figure below summarizes my findings. It plots monster XP divided by the product of monster HP and DPR against the sum of their armor class and attack bonus (averaged by CR or level). In terms of my XP equation, it shows how the part in parentheses changed over time.
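To spell that out a bit (the exact constants and the precise functional form of the adjustment don't matter here), my equation has the general shape XP ∝ HP × DPR × f(AC + AB), where AB is the attack bonus. So plotting XP/(HP × DPR) against AC + AB traces out f, the part in parentheses.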
In the earliest playtest I have data for, labeled PT4 in the above plot, the sensitivity to changes in armor class and attack bonuses was about twice as strong as it ought to be. That got worse in PT5, but improved for PT6. The default monster stats in the DMG improved on this again, with a final sensitivity very close to my theoretically derived values. This is what I was expecting to see.
However, much to my surprise, when I looked at published monsters in 5e the trend was very different. For official 5e monsters below CR 20 there is virtually zero sensitivity to changes in armor class and attack bonuses, at least on average across CRs (within each CR the sensitivity still exists). Then suddenly, above CR 20, the expected sensitivity from the DMG shows back up.
It took some time to wrap my head around what was happening, but it looks like they changed their formula to use relative scaling instead of absolute scaling. In my formula, as well as in the earlier playtests, the average adjustment for armor class and attack bonus increased with monster level/challenge rating because the baseline values of those stats also increase. However, for official 5e monsters this adjustment only increases when the monster's stats are stronger than intended for their CR, with "intended" based around a 65% chance of hitting or being hit when facing level-appropriate PCs. The expected scaling returns for CR 20+ monsters because PCs stop at level 20: the baseline stops growing there, so higher CR monsters increasingly exceed it and the adjustment starts climbing with CR again.
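To make the two scaling behaviors concrete, here's a minimal sketch. The per-point factor and the baseline totals are placeholders I made up for illustration; they aren't WotC's numbers or the actual form of my adjustment.

```python
# Illustrative only: the per-point factor and baselines are made-up placeholders,
# not the actual DMG or playtest numbers.

PER_POINT = 1.05  # hypothetical multiplicative adjustment per point of AC + attack bonus

def absolute_adjustment(ac_plus_ab):
    """Adjustment tracks the raw AC + attack bonus total, so it grows
    with level/CR simply because baseline stats grow."""
    return PER_POINT ** ac_plus_ab

def relative_adjustment(ac_plus_ab, cr_baseline):
    """Adjustment tracks only the deviation from the CR's expected total
    (the ~65% hit chance target against level-appropriate PCs), so an
    on-baseline monster gets no adjustment at any CR."""
    return PER_POINT ** (ac_plus_ab - cr_baseline)

# Two hypothetical on-baseline monsters, one low CR and one high CR:
for label, total, baseline in [("low CR", 16, 16), ("high CR", 28, 28)]:
    print(label,
          round(absolute_adjustment(total), 2),            # grows with CR
          round(relative_adjustment(total, baseline), 2))   # stays at 1.0
```

Under absolute scaling the adjustment climbs with CR even for monsters whose stats are exactly on target; under relative scaling it only moves when a monster's stats deviate from its CR's baseline, which is what the published 5e data below CR 20 looks like.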
This kind of relative scaling strongly suggests they're using a system, not unlike the one outlined in the DMG, to determine monster CR, rather than calculating XP values directly and then assigning a CR based on that. Such an approach isn't inherently bad, but it does have side effects. For example, low-CR monsters will be weaker than expected based on their XP values, especially when pitted against higher-level PCs.
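As a rough illustration of that last point: put a low-CR monster up against a high-level PC and both sides of the 65% baseline its XP assumes fall apart, so its effective HP and damage output both drop below what its XP value implies. The bonuses and ACs below are ballpark assumptions I picked for illustration, not taken from any particular stat block.

```python
# Rough illustration of why a low-CR monster underperforms its XP against
# high-level PCs. All stat numbers here are ballpark assumptions.

def hit_chance(attack_bonus, target_ac):
    """Chance a d20 attack roll hits: need (roll + bonus) >= AC,
    clamped to the 5%-95% range for natural 1s and 20s."""
    p = (21 + attack_bonus - target_ac) / 20
    return min(max(p, 0.05), 0.95)

# The CR system assumes roughly 65% both ways at-level.
monster = {"attack_bonus": 3, "ac": 13}    # hypothetical low-CR monster
pc      = {"attack_bonus": 11, "ac": 19}   # hypothetical high-level PC

print("monster hits PC:", hit_chance(monster["attack_bonus"], pc["ac"]))   # well under 65%
print("PC hits monster:", hit_chance(pc["attack_bonus"], monster["ac"]))   # well over 65%
```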
This is actually something that's been showing up in my analysis for some time now, but without any good explanation as to why until now.