D&D 5E L&L: Mike Lays It All Out

If your character just died, he or she had bigger concerns than a wasted feat slot.

The solution is simple: pick a feat at first, and then, when you get to choose an ability score increase later, retrain the feat into an ability score increase. Problem solved. Most people who are playing the basic version aren't going to be worried about this stuff anyway, so it's no big deal if they don't do this.

Anyhow, I had a thought about feats: one could use the right feats to simulate AEDU if some feats granted, say, encounter powers, the way some feats grant at-will cantrips.
 


It depends on the range of ACs.

If I am trying to hit a fixed AC, then the greater the value of X, the less difference (proportionally) the increase from +X to hit to +(X+1) to hit makes to my expected damage output.

So if "bounded accuracy" means "enemies' ACs don't grow by very much over the course of the game", then increased bonuses to hit deliver ever-diminishing returns.

The reason that a +1 to hit is so good in 4e is that accuracy is constant across the game, due to scaling: the chance to hit is always around 10/20 to 15/20 (depending on details of build, level and role of opponent faced, etc.) and so a +1 to hit is adding as much as 10% to the expected output and the expected chance of landing powerful effects (forced movement, debuffs, etc.).

But D&D Next so far seems to have fewer powerful effects (at least for martial types, who are slated to get more of these feat/stat boosts), and bounded accuracy may mean that as you go up in level your chance to hit approaches 18 or 19 in 20, at which point the benefit of another +1 to hit is pretty marginal.
So, in every edition except 4e, your chance to hit generally approached 95% by dint of leveling. 3e has a couple of wacky interactions there, with both full attacks and power attack, as well as the ability to swing AC abruptly by 20+ (e.g. 'the dragon casts shield, improved mage armor, haste, and cat's grace before the combat'; 'the pit fiend has a level of barbarian, so is in mithral full plate +5'; etc.).

Even in 4e, it's actually pretty easy to push your chance to hit steadily higher if you plan around it.

In this particular system, bonuses to hit are very few and far between, while ACs are pretty static (and too low, but we'll see whether that changes or not). So you might be looking at a stretch of 10 levels in which your chance to hit is only expected to go up by 2 points on the die. Getting a +1 from a stat or feat is much more of a big deal in that case, because it's the only opportunity for improving accuracy.

Of course, it could all change if they decide that everyone will get to hit 95% of the time, whee.
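A quick back-of-the-envelope check of that diminishing-returns point, as a minimal Python sketch (the AC and bonus values are arbitrary illustrations, not playtest numbers):

[CODE]
# Relative value of a +1 to hit against a fixed AC on a flat d20 roll.
# Assumes a plain "d20 + bonus >= AC" attack, with natural 1/20 as auto-miss/auto-hit.

def hit_chance(bonus, ac):
    need = ac - bonus                      # minimum natural roll required
    return min(max(21 - need, 1), 19) / 20

AC = 15                                    # arbitrary fixed AC
for bonus in (3, 5, 8, 12):
    before = hit_chance(bonus, AC)
    after = hit_chance(bonus + 1, AC)
    gain = (after - before) / before       # relative gain in expected damage
    print(f"+{bonus} -> +{bonus + 1}: {before:.0%} to {after:.0%} "
          f"(expected damage up {gain:.1%})")
[/CODE]

The absolute bump is always 5 percentage points, but the relative boost to expected damage falls from roughly 11% at a 45% hit chance to about 6% once you're hitting 90% of the time, which is the diminishing-returns pattern described above.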
 

Mike Mearls has suggested that the system will be designed with the expectation that characters have a 16 in their primary stat.

Also take advantage/disadvantage into account. If you're playing a character who has a primary stat of 16 but can easily gain advantage because of a feat, versus a person with a primary stat of 20 who almost never gets advantage, that should just about equal out.
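For anyone who wants to eyeball that trade-off, here's a minimal sketch (assuming a plain d20 + ability modifier attack, with advantage handled as roll twice, keep the higher; the ACs are arbitrary):

[CODE]
# Compare a 16 (+3) that usually has advantage with a 20 (+5) that never does.

def p_hit(bonus, ac):
    need = ac - bonus
    return min(max(21 - need, 1), 19) / 20

def p_hit_adv(bonus, ac):
    miss = 1 - p_hit(bonus, ac)
    return 1 - miss * miss                 # you miss only if both rolls miss

for ac in (12, 15, 18):
    adv16 = p_hit_adv(3, ac)               # 16 in the stat (+3), with advantage
    flat20 = p_hit(5, ac)                  # 20 in the stat (+5), no advantage
    print(f"AC {ac}: 16 with advantage {adv16:.0%} vs. 20 without {flat20:.0%}")
[/CODE]

How close the two actually come out depends entirely on how often that advantage really shows up at the table.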
 

If you change how ability scores and modifiers work, you're going to mess up the rules framework a lot, and all this just to solve the very marginal problem of someone who's worried about dying before the next bump? :uhoh:

If you don't want to put a +1 into an even score, don't do it! Bump another (odd) score instead of your highest. Or take a feat.

This "problem" was there already before they announced to bundle ability score bumps into feats, it was there because of the +1 from races, it was there because of the +1 from classes (not always you can get these two on the same stats), and it was there because of the level-based ability bump (unless you had at least 2 odd scores). I don't think we had the same level of complain about these, and they were already there...

If gamers are so offended by this even-score bumping, maybe it would just be better to eliminate every single ability score bump in the whole game. How about that? Making them feats gives you the privilege of choice, so you never have to bump an even ability score if you don't want to. And people are complaining...
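For reference, the whole even/odd issue comes down to the usual d20-era modifier formula, (score - 10) / 2 rounded down; a tiny sketch of why a +1 into an even score feels wasted:

[CODE]
# Only every other point of an ability score moves the modifier.
def mod(score):
    return (score - 10) // 2   # standard d20-style ability modifier

for score in range(13, 19):
    print(f"{score}: {mod(score):+d}")
# 13:+1  14:+2  15:+2  16:+3  17:+3  18:+4
# Bumping 14 -> 15 changes nothing; bumping 15 -> 16 is worth a full +1.
[/CODE]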
 

It depends on the range of ACs.

If I am trying to hit a fixed AC, then the greater the value of X, the less difference (proportionally) the increase from +X to hit to +(X+1) to hit makes to my expected damage output.

So if "bounded accuracy" means "enemies' ACs don't grow by very much over the course of the game", then increased bonuses to hit deliver ever-diminishing returns.

The reason that a +1 to hit is so good in 4e is that accuracy is constant across the game, due to scaling: the chance to hit is always around 10/20 to 15/20 (depending on details of build, level and role of opponent faced, etc.) and so a +1 to hit is adding as much as 10% to the expected output and the expected chance of landing powerful effects (forced movement, debuffs, etc.).

But D&D Next so far seems to have fewer powerful effects (at least for martial types, who are slated to get more of these feat/stat boosts), and bounded accuracy may mean that as you go up in level your chance to hit approaches 18 or 19 in 20, at which point the benefit of another +1 to hit is pretty marginal.


Right, but you also have to keep in mind that D&D uses a flat roll: 1d20. As such, even with flatter math, a +1 in D&D tends to be a bigger deal than in a system with a bell curve. I agree that there comes a point where another +1 isn't as important for attacks, but I feel I'd need to see the full game in motion before being able to determine how many +1s I can give up.

It's also interesting to note that, supposedly, not all classes will have the same number of feats. In my mind, this brings up a question concerning how things could potentially be abused once multi-classing comes into play. Could I start as a fighter so that I could quickly pump up a mental stat before switching over to a casting class? Sure, that means I'm sacrificing a few levels of spellcasting, but it also means I may end up with more feats to play with, which I could use to modify my spellcasting ability.
 

Right, but you also have to keep in mind that D&D uses a flat roll: 1d20. As such, even with flatter math, a +1 in D&D tends to be a bigger deal than in a system with a bell curve.
Not quite true. On a bell curve distribution, a +1 is more valuable if your chances of success are near 50%, and less valuable if your chances are either very high or very low.
 

Not quite true. On a bell curve distribution, a +1 is more valuable if your chances of success are near 50%, and less valuable if your chances are either very high or very low.

Exactly; my point was that diminishing returns are more thoroughly showcased in such a system. You can clearly see how the value of the +1 diminishes as the game progresses. Let's say I'm playing a roll-under system which uses 3d6, and I set 10 as the 50% mark via my choice of game mechanics. A person who is untrained at a skill and has a low default of skill 5 has less than a 10% chance of rolling what they need; skill 6 is better, but still less than 10%. Likewise, on the high end of the skill spectrum, a skill 15 character has somewhere around a 95% chance of success; adding another +1 to that character (for a 16) bumps him up to around 98%... not a huge difference.
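Those 3d6 numbers check out; here's a minimal sketch that reproduces them (roll-under on 3d6, using the skill values from the example above):

[CODE]
from itertools import product

# P(3d6 <= target) for a roll-under skill system.
def p_roll_under(target):
    rolls = list(product(range(1, 7), repeat=3))
    return sum(1 for r in rolls if sum(r) <= target) / len(rolls)

for skill in (5, 6, 10, 15, 16):
    print(f"skill {skill:>2}: {p_roll_under(skill):.1%}")
# skill 5: ~4.6%, 6: ~9.3%, 10: 50.0%, 15: ~95.4%, 16: ~98.1%
# Near the middle a +1 is worth about 12 points of probability; near either end, only 2-5.
[/CODE]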

With a flat roll, I have an equal chance of rolling every number on the die each time I roll. Granted, the range of what I can roll and still succeed or fail is going to change depending upon the challenge at hand. D&D levels tend to assume I get a certain amount better over time, and it's that part I have some concerns about. Even considering the flatter math, how much will the full game assume I take a certain number of +1 bonuses over time? How far can I get from that assumed power level and still function? This leads to other considerations as well:

As Pem said (and rightly so), 4th Edition made an attempt to keep to-hit right around 50% regardless of level. If Next were to keep that model, +1s would remain important throughout the game. If Next does not do that and PCs simply get better as they level (with their to-hit threshold becoming wider), it makes +1s less relevant at higher levels, but it brings up other problems.

One of those problems is that D&D has static defenses; as such, if everyone is running around with virtually no chance of missing, we end up with combat turning into a contest to see who can take the most HP* the quickest. Assuming that is the case and monsters do not keep pace (by virtue of being built differently than PCs and running on a different system), then we run into a problem similar to what early 4th Edition had, in which the PCs were so much better at combat than their adversaries that it prompted a math rehash via Monster Manual 3 to fix the game. Assuming instead that monsters do keep pace, then I feel there is a risk of running into some of the problems that high-level 3rd Edition had, in which combat became something akin to rocket tag: if you could win initiative and go first, you could attack and kill the enemy before they had a chance to do anything, because their chance to defend against what you were doing was so low.

I'm sure there are plenty of ways to avoid that. As I said before, I'm in no way claiming to be an expert. I'm simply discussing what my perception and thoughts currently are. It may be that I'm completely misguided in my thinking. Until I see parts of the game which aren't available for me to see yet, it's hard to say.


*Edit: That was worded somewhat poorly. Combat usually is won by whoever can take the most HP the quickest. What I meant was that, in a game where everyone is assumed to hit and everyone also has static defenses (meaning they do get hit)... I'll be honest. I know what I want to say, but I'm unsure how to word it. Think of it this way: imagine you were watching a soccer game in which neither team was allowed to play defense and they simply took turns scoring until time ran out.
 

Splitting to-hit and damage across even and odd scores smooths out the jagged DPR steps without affecting the expected to-hit percentages one bit. I.e., you can have your math and eat it too.
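Here's a rough sketch of what that could look like (the mechanic is purely hypothetical: to-hit keeps the usual (score - 10) / 2 rounded down, while the damage bonus uses (score - 9) / 2, so even points feed accuracy and odd points feed damage; AC 15 and a d8 weapon are arbitrary):

[CODE]
# DPR under two schemes against a fixed AC with a d8 weapon (illustrative numbers only).

def p_hit(bonus, ac=15):
    return min(max(21 - (ac - bonus), 1), 19) / 20

def dpr_standard(score):
    m = (score - 10) // 2                  # both to-hit and damage step on even scores
    return p_hit(m) * (4.5 + m)

def dpr_split(score):
    to_hit = (score - 10) // 2             # even points improve accuracy
    dmg = (score - 9) // 2                 # odd points improve damage
    return p_hit(to_hit) * (4.5 + dmg)

for s in range(14, 21):
    print(f"{s}: standard {dpr_standard(s):.2f}  split {dpr_split(s):.2f}")
# standard: 2.60 2.60 3.38 3.38 4.25 4.25 5.23  (flat on every odd score)
# split:    2.60 3.00 3.38 3.83 4.25 4.75 5.23  (every point does something)
[/CODE]

The to-hit bonus is identical in both schemes at every score, so the accuracy math is untouched; the odd points just stop being dead levels for damage.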

They just need to open up pages 8-13 of the AD&D 2nd Edition PHB for inspiration on how to achieve even/odd value. Even-only bonuses were an innovation of 3e that simplified things and spread the bonuses out by only three points (15 -> 12), but it produces artifacts of its own. All they need is more columns that go up per bump on each stat. Problem solved. Actually, 15 -> +1, 16 -> +2, etc. rather than 12 -> +1, 14 -> +2 solves a lot of the issues that people complain about with the 3d6 or 4d6-drop-lowest die rolling methods. I think AD&D probably had it right. In a flat-math system, why is it so important that someone who has a 12 Strength gets a +1 to hit? Making bonuses and penalties only kick in below 7 or above 14 makes more highly variant die rolling methods work much better.
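A tiny sketch of the two mappings side by side (my reading of the proposal, not anything from the playtest: 3e-style modifiers versus a flat middle where bonuses only start above 14 and penalties only below 7):

[CODE]
# Two illustrative ways to turn a rolled score into a to-hit modifier.

def mod_3e(score):
    return (score - 10) // 2          # 12 -> +1, 14 -> +2, ...

def mod_flat_middle(score):
    if score < 7:
        return score - 7              # penalties only for very low scores (assumed scale)
    if score > 14:
        return score - 14             # 15 -> +1, 16 -> +2, ...
    return 0                          # the broad middle is simply "average"

for s in (5, 8, 12, 14, 15, 17):
    print(f"{s:>2}: 3e-style {mod_3e(s):+d}, flat-middle {mod_flat_middle(s):+d}")
[/CODE]

With the flat middle, a swingy rolling method mostly shuffles scores around inside the "no modifier" band, which is presumably why it plays nicer with high-variance rolling.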
 


[MENTION=58416]Johnny3D3D[/MENTION], I can't XP your post, but it was a good one. And the bit you were worried was unclear was clear to me (and your soccer analogy also makes perfect sense).
 
