D&D General Explain Bounded Accuracy to Me (As if I Was Five)

Furthermore, "accuracy" isn't just your bonus to attack in proportion to the AC you're trying to hit.

It's also your bonus to skill usage in proportion to the skill DCs you're trying to hit, and any other parts of the system which express growth. And those parts have been compressed or flattened as well--but in ways that, frankly, kind of end up being the worst of both worlds.

A super-expert character is rolling with Advantage, Expertise, and a magical bonus or two (since magic is allowed to break the "no fiddly bonuses" rule whenever and wherever it feels like). That's going to end up as something like (2d20k1)+12+5+1d4, or something to that effect, which is relatively likely to land in the 34+ range (as in, more than 50% of results are between 34 and 41), meaning stuff that's supposed to be damn near "going beyond the impossible". Yet Peter Paladin is rolling Stealth at the exact same 1d20 (with disadvantage)-1 that he was rolling back at level 1. An across-the-board nat-1 (a less than 1 in 1600 event) is identical to the Paladin rolling a crit.

So we still have the problem that a super-focused expert can achieve some stupidly high results...as in, comparable to what ultra-expert 4e characters could achieve, even though the whole point was to get away from that, while at the same time these rules have created a situation where the Paladin not only sucks at stealth but gets progressively worse at it due to the party naturally focusing on tougher enemies over time.
This isn't a response to this post, but I forgot to respond to your other post. On static bonuses (+1, +2, +3, etc.), I got buy-in from my players by presenting it like this. I let them know that advantage was too big a boost for a lot of instances where I think there should be a bonus, so I can give them no bonus at all, or I can give them a lower static bonus instead. They wanted the static bonuses. By presenting the bonuses as additive instead of a nerf to advantage, I got buy-in.

Maybe that will help or maybe not, but I wanted to put it out there. :)
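
For what it's worth, the math backs up the "advantage is too big" intuition: advantage is worth roughly a +5 flat bonus when you need an 11 and tapers off toward the extremes, so a flat +2 really is the smaller boost. A quick Python sketch of my own, just to illustrate (nothing here is from the books):

# Compare a flat bonus against advantage on a d20 roll-over check.
def p_flat(target, bonus=0):
    # chance that d20 + bonus meets the target number
    return max(0.0, min(1.0, (21 - (target - bonus)) / 20))

def p_advantage(target):
    # chance that the higher of two d20s meets the target number
    return 1 - (1 - p_flat(target)) ** 2

for target in (5, 11, 16):
    print(target, p_flat(target, 2), p_advantage(target))
# target 5:  0.90 vs 0.96   (advantage ~ a +3 flat bonus)
# target 11: 0.60 vs 0.75   (advantage ~ a +5 flat bonus)
# target 16: 0.35 vs 0.4375 (advantage ~ a +4 flat bonus)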
 


That's deceptive. It's like saying your chances of a heart attack are doubled if you eat peanuts!!!! You've doubled that .01% chance. I just showed you objectively that 3e was almost twice as fast, and yet you try to tell me that doubling .01% is faster. It's not. The true accuracy growth of 3e was much faster, even if the fractional increase is not. There is more to accuracy than a fraction.
I'm not talking about the full bonus but the rate of growth.

2e's rate of accuracy growth was not slow. It was faster than 5e's. You feel your accuracy growth a lot more too.
 

Furthermore, "accuracy" isn't just your bonus to attack in proportion to the AC you're trying to hit.
They were only addressing the attack tables/THAC0, etc. involving attacks, so I didn't bring up "accuracy" in terms of skill checks.

It's also your bonus to skill usage in proportion to the skill DCs you're trying to hit, and any other parts of the system which express growth. And those parts have been compressed or flattened as well--but in ways that, frankly, kind of end up being the worst of both worlds.
Yeah, I've never been happy with "bounded accuracy" in terms of skills.

A super-expert character is rolling with Advantage, Expertise, and a magical bonus or two (since magic is allowed to break the "no fiddly bonuses" rule whenever and wherever it feels like). That's going to end up as something like (2d20k1)+12+5+1d4, or something to that effect, which is relatively likely to land in the 34+ range (as in, more than 50% of results are between 34 and 41), meaning stuff that's supposed to be damn near "going beyond the impossible". Yet Peter Paladin is rolling Stealth at the exact same 1d20 (with disadvantage)-1 that he was rolling back at level 1. An across-the-board nat-1 (a less than 1 in 1600 event) is identical to the Paladin rolling a crit.
Yeah, such an insane expert (certainly possible at 17+ levels) has about a 75% chance to do the "Nearly Impossible". But for many players, a tier 4 super expert with advantage should be able to hit DC 30 with the much loved 65+% chance of success which runs rampant through 5E from early on.

And those same players prefer that a PC in heavy armor with DEX 8 and no stealth proficiency SHOULD fail at stealth most of the time. Level doesn't matter at that point. Now, if Peter Paladin wants to be better at stealth, proficiency goes a decent way to increasing the odds, especially at higher levels.

Peter Paladin only has about a 6% chance to beat DC 15 without proficiency. Give him a +2 proficiency bonus and it doubles to over 12% (still crappy...). Give him guidance and advantage, like Edgar Expert has (the advantage cancels his armor's disadvantage), and now Peter Paladin has a 47% chance of beating DC 15. Make Peter Paladin tier 4 and he even has a 42.5% chance to beat DC 20.

So, Peter Paladin in heavy armor, dumped DEX, and no proficiency in stealth will suck at DEX (Stealth) checks, regardless of level, because no resource has ever been put into it. But that's pretty much how it should be, isn't it?
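
If anyone wants to double-check those figures, here's a small Python enumeration (my own sketch; I'm reading "guidance and advantage" as the advantage cancelling the armor's disadvantage, which is exactly what produces the 47% and 42.5%):

from itertools import product

D20 = range(1, 21)
D4 = range(1, 5)

def p_disadvantage(mod, dc):
    # two d20s, keep the lower (heavy armor), plus a flat modifier
    return sum(min(a, b) + mod >= dc for a, b in product(D20, repeat=2)) / 400

def p_straight_guidance(mod, dc):
    # advantage cancels the disadvantage -> straight d20, plus guidance (1d4)
    return sum(d + g + mod >= dc for d, g in product(D20, D4)) / 80

def p_advantage_d4(mod, dc):
    # two d20s, keep the higher, plus a 1d4 (the expert's magic bonus)
    return sum(max(a, b) + g + mod >= dc
               for a, b, g in product(D20, D20, D4)) / 1600

print(p_disadvantage(-1, 15))        # 0.0625 -- Peter, DEX 8, no proficiency
print(p_disadvantage(1, 15))         # 0.1225 -- Peter with +2 proficiency
print(p_straight_guidance(1, 15))    # 0.475  -- Peter with guidance + advantage
print(p_straight_guidance(5, 20))    # 0.425  -- tier-4 Peter (+6 proficiency) vs DC 20
print(p_advantage_d4(17, 30))        # ~0.77  -- the (2d20k1)+17+1d4 expert vs DC 30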

So we still have the problem that a super-focused expert can achieve some stupidly high results...as in, comparable to what ultra-expert 4e characters could achieve, even though the whole point was to get away from that, while at the same time these rules have created a situation where the Paladin not only sucks at stealth but gets progressively worse at it due to the party naturally focusing on tougher enemies over time.
Well, we're dealing with extreme cases here. I didn't think bounded accuracy was really meant to stop a super-focused expert from being really great at the stuff they are, well, supposed to be really great at. And of course the Paladin isn't getting worse, it is that his opponents are paying better attention. ;)

What I thought bounded accuracy was supposed to do with skills was to keep reasonably hard DCs still a bit of a challenge later on?

I'm not talking about the full bonus but the rate of growth.

2e's rate of accuracy growth was not slow. It was faster than 5e's. You feel your accuracy growth a lot more too.
Again, because in AD&D it sucks to begin with. In the end, you're about the same place you are in 5E. But in 5E you start there and stay there. Now, that assumes we are discussing level-appropriate opponents. In both cases, with lower level stuff, your chance of hitting increases. Which in AD&D is nice, since before you were lucky to hit a low-level creature, but later on you are 50-50 or better. However, in 5E, you are going from already good (65% or so) to ludicrously good (80%+) when encountering low-level creatures at higher levels.
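
To put rough numbers on that flatness, 5E hit chances reduce to a one-line formula. A toy sketch (the attack bonuses and ACs here are illustrative assumptions, not book values):

def hit_chance(attack_bonus, ac):
    # chance that d20 + attack_bonus meets or beats the AC,
    # keeping the natural-1 auto-miss and natural-20 auto-hit
    return min(0.95, max(0.05, (21 + attack_bonus - ac) / 20))

print(hit_chance(5, 13))   # 0.65 -- level 1 vs a level-appropriate AC 13
print(hit_chance(11, 19))  # 0.65 -- level 20 vs a level-appropriate AC 19
print(hit_chance(11, 13))  # 0.95 -- level 20 revisiting that AC 13 creature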
 

This isn't a response to this post, but I forgot to respond to your other post. On static bonuses (+1, +2, +3, etc.), I got buy-in from my players by presenting it like this. I let them know that advantage was too big a boost for a lot of instances where I think there should be a bonus, so I can give them no bonus at all, or I can give them a lower static bonus instead. They wanted the static bonuses. By presenting the bonuses as additive instead of a nerf to advantage, I got buy-in.

Maybe that will help or maybe not, but I wanted to put it out there. :)
Sure, but this is (more or less) just chucking Advantage altogether. And as much as I may dislike the way 5e uses it, I don't think it's a bad mechanic. E.g., I thought it was quite nice back when 4e "invented" it...as the Avenger "damage" bonus via accuracy.

The problem is finding a way to integrate a more diverse bonus structure without totally binning what Ad/Dis actually bring to the table. Doing that is damn near impossible without ripping up the rules by their roots and rewriting a lot of small rules across the entire player-facing surface of the game. A frankly Herculean task.

Hence why I said that it's easier to remove something in this case ("just treat all bonuses as sources of Advantage, and they don't stack") than it is to try to add something that isn't there. Because the key addition isn't that something other than Advantage appears in one place or another; the key addition is the depth, which is a property that necessarily supervenes on the whole of the system, not on any individual rule. (Much as "texture" is something that individual atoms and molecules don't have, but which frex table surfaces do have, even though atoms and molecules are what comprise the things we call "table surfaces.")

You can polish a textured surface to a mirror shine. It's much harder to add a textured surface to something already polished to a mirror shine.
 

Just one of many reasons why I say 5e may have started with a lot of design ideas, but it had few to no actual design goals, and the further the design went along, the further it moved away from having them. Indeed, many of the design ideas it actually had ended up either not panning out, or getting aggressively eliminated before they could come into their own because of the ridiculous "if it doesn't please 70% of the fanbase immediately we can't use it" standard. (A standard, notably, they had to break once the playtest went private in the lead-up to publication, because they'd dithered for so long that they no longer had time to iterate.) That's why they repeatedly went back to try to fix the Ranger, because it wasn't up to par, and 5.5e is going back and addressing (IMO inadequately, but addressing nonetheless) other problematic elements like Berserkers and Champions, which have high usage rates but low satisfaction rates relative to the rest of the game.

And more than once, they got caught putting all their eggs in one basket (e.g. Specialties), only to have that fall through and leave them scrambling for an alternative, quietly ditching elements they'd actually spoken positively of before (like martial healing, which Mearls at one point tweeted would be in the game, and that DMs who didn't like it could simply tell players they weren't allowed to take that option).


I have often said things to the effect that calling the D&D Next surveys and polls low-quality would be insulting to low-quality statistics. I wouldn't be at all surprised if they did absolutely nothing to check whether the data they'd collected actually came from representative samples. (Given their "math is easy, feel is what's hard" attitude, I strongly suspect most WotC employees at the time viewed statistics as boring and pointless. It would help explain why several obvious mathematical issues--such as the ghoul surprise, where the issue with saving throws went unheeded by the designers up until it bit them in the ass, unexpectedly, during a live demonstration game--went completely overlooked until the designers literally couldn't look away.)
5e's design goal is to be popular, with as little objectionable content (to anyone) as possible. Unfortunately, being straightforward about that goal would have in all likelihood damaged its success. They have, I believe, succeeded admirably in that goal. How we feel about that is irrelevant.
 

They were only addressing the attack tables/THAC0, etc. involving attacks, so I didn't bring up "accuracy" in terms of skill checks.


Yeah, I've never been happy with "bounded accuracy" in terms of skills.


Yeah, such an insane expert (certainly possible at 17+ levels) has about a 75% chance to do the "Nearly Impossible". But for many players, a tier 4 super expert with advantage should be able to hit DC 30 with the much loved 65+% chance of success which runs rampant through 5E from early on.

And those same players prefer that a PC in heavy armor with DEX 8 and no stealth proficiency SHOULD fail at stealth most of the time. Level doesn't matter at that point. Now, if Peter Paladin wants to be better at stealth, proficiency goes a decent way to increasing the odds, especially at higher levels.

Peter Paladin only has about a 6% chance to beat DC 15 without proficiency. Give him a +2 proficiency bonus and it doubles to over 12% (still crappy...). Give him guidance and advantage, like Edgar Expert has (the advantage cancels his armor's disadvantage), and now Peter Paladin has a 47% chance of beating DC 15. Make Peter Paladin tier 4 and he even has a 42.5% chance to beat DC 20.

So, Peter Paladin in heavy armor, dumped DEX, and no proficiency in stealth will suck at DEX (Stealth) checks, regardless of level, because no resource has ever been put into it. But that's pretty much how it should be, isn't it?
Except that he sucks at stealth checks against level 1 enemies. He sucked at them even when the characters were first level. Now that they're 20th, he's effectively getting guaranteed failures against level-appropriate threats--and he'd find he'd learned absolute bupkis if he were to go back and deal with weaker ones.

That was not true in 4e, and the universality of the half-level bonus is precisely why. When the party is going out and doing things that are within or just beyond their comfort zone, his Stealth will not be great. (I would know, I have played such a paladin, though his name was Seth.) But if at, say, level 15 (analogous to 5e level 10), he were to need to go sneaking through areas populated with the kinds of threats he'd faced at level 1? He would be better at stealth than before. He would, in fact, have actually learned a thing or two. It wouldn't be enough to make him genuinely good at it (a total bonus of +7 against level-one threats is okay but not great), but it would mean he'd have solid chances to sneak past such things. (This, I must admit, I have not seen, but that is only because my 4e games have been curtailed more than once by DMs having IRL issues that pulled them away from TTRPGing for the foreseeable future.)

Well, we're dealing with extreme cases here. I didn't think bounded accuracy was really meant a super-focused expert from being really great at the stuff they are, well, supposed to be really great at. And of course the Paladin isn't getting worse, it is that his opponents are paying better attention. ;)

What I thought bounded accuracy was supposed to do with skills is make reasonably hard DCs still a bit of a challenge later on?
The explicit aim was to reduce the size and number of bonuses characters could receive, so that the numbers would be lower (and thus easier to do math with) and progression more tightly controlled. A PC getting more than +25 by level 18 in 5e is equivalent to a 4e character having +50 by level 27, something even ultra-experts hyper-specialized in one and only one skill would struggle to achieve. A much more typical skill bonus for many characters would be...well, about equal to their character level (half from the half-level bonus, the other half from training, ability score, items, etc.), so around +27. Meaning, for a 5e character, roughly Expertise with a +0 modifier or Proficiency with a +5 modifier, and nothing more.

Hence, despite explicitly trying to curtail extreme bonuses and keep numbers within a neat, tidy, narrow range...bounded accuracy has actually not done all that much to bound accuracy. Instead, what it bound was off-label stuff. That stuff barely moves, and may even stay essentially flat across a character's career. Your weaknesses never get less weak, unless you radically refocus your character to address them, paying a steep price to do so. Meanwhile, your enemies get stronger; hence, instead of a treadmill, we have people straight-up losing a Red Queen's race.
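
For concreteness, here's a rough side-by-side of the two progressions (a sketch under simplifying assumptions of mine; I'm holding 4e ability and item growth flat, which real characters would add several more points to at high levels):

def skill_bonus_4e(level, stat_mod=4, trained=5):
    # half-level bonus plus training plus ability modifier
    # (stat bumps and item bonuses would push this higher at high levels)
    return level // 2 + trained + stat_mod

def skill_bonus_5e(level, stat_mod=4, expertise=False):
    prof = 2 + (level - 1) // 4   # 5e proficiency-bonus progression
    return (2 * prof if expertise else prof) + stat_mod

for lvl4, lvl5 in ((1, 1), (15, 10), (27, 18)):
    print(f"4e L{lvl4}: +{skill_bonus_4e(lvl4)} | 5e L{lvl5}: +{skill_bonus_5e(lvl5)}")
# 4e L1: +9   | 5e L1: +6
# 4e L15: +16 | 5e L10: +8
# 4e L27: +22 | 5e L18: +10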

Again, because in AD&D it sucks to begin with.
You probably wouldn't be surprised to know that I feel that 5e bonuses and effects suck to begin with (seriously, "competence" is now apparently succeeding about 15 percentage points more often!) So if AD&D is supposed to suck when treating 5e as one's baseline...

5e's design goal is to be popular, with as little objectionable content (to anyone) as possible. Unfortunately, being straightforward about that goal would have in all likelihood damaged its success. They have, I believe, succeeded admirably in that goal. How we feel about that is irrelevant.
Popularity isn't a design goal any more than sales are a design goal.

You cannot point to a part of the design and say, "This is what causes popularity." Popularity necessarily comes much further down the chain of cause and effect than the goals phase of game design. Designers certainly hope that their designs will be popular. They will, in all likelihood, ask players how they liked various things. But popularity itself is not and cannot be a design goal.
 


So, Peter Paladin in heavy armor, dumped DEX, and no proficiency in stealth will suck at DEX (Stealth) checks, regardless of level, because no resource has ever been put into it. But that's pretty much how it should be, isn't it?
Yes and no.

Mechanically it makes sense.

Experientially, it suggests that, despite all his time adventuring, Peter Paladin has somehow learned nothing about stealth, as if he has made a conscious effort to glean no insights from his life experience outside of a very narrow scope.
 

Sure, but this is (more or less) just chucking Advantage altogether. And as much as I may dislike the way 5e uses it, I don't think it's a bad mechanic. E.g., I thought it was quite nice back when 4e "invented" it...as the Avenger "damage" bonus via accuracy.
Pathfinder invented it as a step above Skill Focus.
 

Well, as I wrote, you can cut hit points in half (or make people roll and apply CON to level 1 only), etc. to reduce the bloat and it plays great IMO.

Now, there are caveats, of course. The biggest being that you have to accept spells and other damage being much more lethal. However, since it applies equally to PCs and opponents, it works both ways.

A perfect example is the sleep spell. Affecting 5d8 hit points worth of creatures, it is impossible RAW in 5E to put an ogre to sleep; you have to upcast to 5th level (!) before you have a 50-50 chance to put an ogre to sleep! Compare that to AD&D, where sleep carried a 50-50 chance to put the ogre to sleep.

So, cut the hit points in half for the ogre, and you get 29 (rounding down, as is the default in 5E). Now, a 1st-level sleep has about a 1 in 8 chance to put the (29 hp) ogre to sleep. Not 50-50, but much better! And a 2nd-level sleep spell has better than 50-50.

Fireball is another great example. In AD&D, your base 5d6 fireball has about a 30% chance (after factoring in the saving throw) of taking out a 19 hp ogre. But in 5E the base 8d6 fireball has no chance to take out a 59 hp ogre, while a reduced 29 hp ogre has about a 35% chance (after the saving throw) of being taken out. Much closer to the AD&D days. FWIW, in 5E you'd have to upcast fireball to about 7th level (!) to have that 30ish % chance to take out the 59 hp ogre.

Combat goes faster as well, obviously, so you never really feel bogged down in a slog (if you ever did...). An orc with 7 hp instead of 15 hp can easily be taken down with one hit, but since damage remains unchanged, it is still a very dangerous opponent. Healing is, of course, more effective as well since hit points are reduced.

Overall, it has a much better "feel" IMO.
During D&D Next (before 5e officially came out), we used a house rule where your size category determined your maximum HD.

Tiny = 1 HD
Small = 3 HD (Halflings; they'd get other benefits)
Medium = 6 HD (the rest of the standard playable races)
Large = 10 HD
Huge = 15 HD
Gargantuan = 21+ HD

Creatures might also have had some damage prevention, depending on the monster; e.g., you could give dragons a version of Heavy Armour Mastery for their scales, etc.
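
For reference, the halved-hp sleep odds quoted above check out. A minimal Python sketch of my own that convolves the dice exactly (hp values as in the quote; 5E sleep is 5d8, plus 2d8 per slot level above 1st):

def p_sum_at_least(n_dice, sides, target):
    # exact probability that the sum of n_dice dice reaches the target,
    # built up by convolving one die at a time
    dist = {0: 1.0}
    for _ in range(n_dice):
        new = {}
        for total, p in dist.items():
            for face in range(1, sides + 1):
                new[total + face] = new.get(total + face, 0.0) + p / sides
        dist = new
    return sum(p for total, p in dist.items() if total >= target)

print(p_sum_at_least(5, 8, 59))    # 0 (max roll is 40) -- 1st-level sleep vs a 59 hp ogre
print(p_sum_at_least(13, 8, 59))   # 0.5   -- 5th-level sleep vs 59 hp: exactly 50-50
print(p_sum_at_least(5, 8, 29))    # ~0.12 -- 1st-level sleep vs a halved 29 hp ogre
print(p_sum_at_least(7, 8, 29))    # ~0.69 -- 2nd-level sleep vs 29 hp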
 
