
Game design trap - Starting too close to zero.

Minigiant

Legend
Supporter
What you're discussing then is not about starting away from 0, but the size of the interval between what we consider baseline and proficient. There's no reason that +0 is the baseline, and indeed it isn't in many cases. (IIRC a 3.5 individual with average strength, no BAB and lacking weapon proficiency is looking at -4 to hit. A 1st-level fighter with weapon focus is probably around +6.)

The size of this interval varies quite a bit depending on which edition you're referencing and which task specifically. If anything, I would say the range of starting values for 4th edition skills is too high. Bonuses over +10 are not uncommon for 1st-level characters (a good stat, training, and any bonus from background or race gets you there easily), but neither is a -1 modifier. How much it should vary is subjective.

My point is the mentality of seeing 0's. Players see zeroes and they don't even bother trying. A +1 or +2 might get a player to have their character try something the PC isn't great at. One of my most enjoyable times as a DM was a 4E game where the players hit level 6 and started to touch those dump skills without me forcing it or them waiting for retries. Sure, they almost always failed, but they were willing to try.

Something about zeroes. They affect how people think. Zeroes affect how players and DMs expect to see things. It'll affect their views on races, on classes, on skills, on proficiency range, on HP... without their thinking about the consequences of their newly formed preferences. They might want everything to start at zero but never compare the result to the difficulty classes they dreamed up. Then bam, the high target of 18 is too high, because focus is placed on the upgraded zeroes instead of starting at the +8 and discovering how to get there.

But zeroes screw with your brain. Unless the player is a method actor or a goofball, I've seen zeroes produce this love-it-before-the-game, hate-it-during-the-game mentality.
 


What you're discussing then is not about starting away from 0, but the size of the interval between what we consider baseline and proficient. There's no reason that +0 is the baseline, and indeed it isn't in many cases. (IIRC a 3.5 individual with average strength, no BAB and lacking weapon proficiency is looking at -4 to hit. A 1st-level fighter with weapon focus is probably around +6.)

The size of this interval varies quite a bit depending on which edition you're referencing and which task specifically. If anything, I would say the range of starting values for 4th edition skills is too high. Bonuses over +10 are not uncommon for 1st-level characters (a good stat, training, and any bonus from background or race gets you there easily), but neither is a -1 modifier. How much it should vary is subjective.

It is true, but the difference is pretty trivial for any likely numbers. 6.0/6.02 is very close to 1. (Referencing your other post; sorry, I forgot to quote it.)

Indeed for linear situations like to-hit the only question is a matter of how big the range is.

I think CJ is quite right on the damage (and other things) aspect though. You would really want to be away from zero such that you can have both a reasonable absolute difference and a smaller relative difference between best and worst.

Personally I like the idea of stat bonuses starting with 1 and just being 1/2 your stat. I know many people will hate the average +5 to +9 you'll get in a typical stat range, but for things like to-hit you're ALREADY in that range with 4e. So just eliminate some of the other not very useful bonuses, like proficiency. If there's very little bonus for level you could easily end up with a fairly narrow range. Let's say you have a +1/3 levels to-hit and a total level range of 18 levels. That's only +6 for level, max. Now maybe there's a few ways you can get a static bonus (say a really special weapon could give a +1, maybe some classes get +1, etc). So you might have say +3 max there. You're only at a range now from +5 worst-case (level 1 guy with 10 STR) to 9+6+3 = +18 for a level 20 guy with everything maxed and 18 STR (and let's just ditch the stat boosts beyond that, eh). Even if a guy can get a 22 STR with some sort of magic that would still max the range at +20. I think that works and keeps things simple.

The nice thing here is the wizard's to-hit with his dagger doesn't go down relatively to the fighter at higher levels, except maybe a couple points. Now, when you go to DAMAGE the same bonuses can apply, so the wizard's 1d4+5 is relatively not super far off from the fighter's 1d8+9 at the same level. Attacking with the dagger is a viable option, unlike in 4e where it is just basically laughable. You can now scale defenses and hit points around these numbers easily enough.
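As a sanity check on the proposed numbers, here's a minimal sketch of that house rule (the function name and the +1/3-levels and static-bonus figures are just the proposal above, nothing official):

```python
# Proposed house rule: to-hit = (stat // 2) + (level // 3) + static bonuses.
# All figures are from the post above, not from any published edition.
def to_hit(stat, level, static=0):
    return stat // 2 + level // 3 + static

print(to_hit(10, 1))      # worst case, level 1 with 10 STR: +5
print(to_hit(18, 20, 3))  # level 20, 18 STR, +3 in static bonuses: 9 + 6 + 3 = +18
print(to_hit(22, 20, 3))  # with a magical 22 STR: +20
```

So the full to-hit range under these assumptions really does run from +5 to +20, as claimed.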
 

Crazy Jerome

First Post
Personally I like the idea of stat bonuses starting with 1 and just being 1/2 your stat. I know many people will hate the average +5 to +9 you'll get in a typical stat range, but for things like to-hit you're ALREADY in that range with 4e. So just eliminate some of the other not very useful bonuses, like proficiency. If there's very little bonus for level you could easily end up with a fairly narrow range. Let's say you have a +1/3 levels to-hit and a total level range of 18 levels. That's only +6 for level, max. Now maybe there's a few ways you can get a static bonus (say a really special weapon could give a +1, maybe some classes get +1, etc). So you might have say +3 max there. You're only at a range now from +5 worst-case (level 1 guy with 10 STR) to 9+6+3 = +18 for a level 20 guy with everything maxed and 18 STR (and let's just ditch the stat boosts beyond that, eh). Even if a guy can get a 22 STR with some sort of magic that would still max the range at +20. I think that works and keeps things simple.

I'd be ok with that if there are plenty of viable options for starting at lower stat totals, but still capping them out as you had listed--either in amount or modifier. In other words, somewhere between level 1 and level max, a fighter is going to get around 20 Str, but he might start anywhere from around 13 (Basic style) to 16 or 18. No matter where he starts, getting to 20 is about the best he can do.


That allows room in the math for:
  • Stats are pretty lousy at first, your attack bonuses start around +5 or +6, and you grow steadily throughout the career. You get stat boosts occasionally and then the normal but slight growth from the other factors.
  • Stats are great at first, and you are pretty much stuck with what you start with. You've got a heftier mod up front, but you now grow more slowly through the career, being limited to those other factors.
In that game, some people might even be happy mixing both. You roll for stats, and how poorly or well you roll only matters initially. Every level, it matters less, because the guy that started behind steadily closes the gap.
Of course, with the right kind of ability score caps and increases, you can get that kind of effect whether you get the base numbers off of zero or not.
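A quick illustrative sketch of that gap-closing effect, with assumed numbers (the starting values, a +1 boost every 4 levels, and a hard cap of 20 are all placeholders, not anyone's actual proposal):

```python
# Two fighters, one rolling a lousy Str 13 and one a great Str 18.
# Both gain +1 Str every 4 levels and both cap at 20, so the lucky
# roller hits the ceiling early and the gap steadily narrows.
def strength_at(start, level, cap=20, boost_every=4):
    return min(cap, start + (level - 1) // boost_every)

for level in (1, 9, 17, 20):
    print(level, strength_at(13, level), strength_at(18, level))
```

At level 1 the gap is 5 points; by level 9 the 18-Str character is already capped at 20 while the other is still climbing, so the gap shrinks to 3 by level 20.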
 

Lanefan

Victoria Rules
On the skill point thing, context, context! I mentioned it in passing as an example from 3E of the kind of problem. It was disputed. I then elaborated. None of the rest of the things I have discussed have anything in particular to do with 3E. So what have you read here that makes you think I'm stuck on 3E?
Sorry, I read the elaboration and took it to be the main discussion.

As for the hit point example and how long it takes to put a foe down, it is true that changing the starting damage alone won't get you there. The smaller range and generally lower hit points of early D&D is also a huge help--really the main help. Changing the lower bound is more about truncating the extremes. After all, you could have 20 hit points, do 1d20, and get a range from 1 to 20 rounds. I think most people would find that too far the other way.

And of course how much randomness we want in that process is somewhat of a playstyle question. The difference between 1d4+1 and 1d6 brings up the same issue. Ideally, the ranges chosen would support some variation.

Let's consider your examples for the d4 guy.
The example I used was straight d8, not d4...but everything else is right. :)
The one with straight d4 versus 20 hit points will take from 5 to 20 rounds, an average of 8 rounds. (I realize that the extremes in the 5 to 20 are extremely unlikely here.) Meanwhile, our d4+10 guy versus 70 hit points is an average of just under 6 rounds, for an extreme range of 5 to 7. Pretty tight--probably too much.

Of course, we are assuming in this example that the d4 guy is going to get that same +10. (He probably will not.) And we are assuming that the 5 round average was important enough to preserve, so that the hit points scaled from 20 to 70. Maybe the 70 hit points should be somewhat less, the d8+10 guy is still thus seeing more variance, and his nice d8+10 attack is putting down opponents a bit faster than 5 rounds on average.

No matter where you set it, however, if you make the die all important for long periods in the starting range, the fact is that d4s might as well not exist.
I'd say quite the opposite. As soon as the bonus number gets bigger than the die size being rolled the die roll starts becoming less and less relevant. Using your example of someone doing d4+10 per hit thus giving a possible result range of 11-14, it's easy to see that what matters is the constant +10 rather than the extra 1-4 tack-on.

And this largely defeats the purpose of a dice-based game in which random chance is intended to play a significant role.

And maybe that would be a better answer for the kind of game you want. I think that was mentioned earlier. Maybe get rid of d4s and d6s as common attacks, and go with things like d8 as the starting point, with d10, 2d6, etc. occurring more often. That puts the average damage well away from zero, and you get the same nice effects as bumping the base mod to +4.
The base mod. doesn't need to be +4 if the foe has fewer h.p. to begin with.
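For anyone who wants to check the rounds-to-drop arithmetic quoted above, here's a rough sketch (the "average" is simply hit points divided by mean damage, ignoring overkill on the final hit):

```python
import math

# Best case assumes every roll is maximal; worst case assumes every roll
# is minimal; the average is hp / mean damage per hit.
def rounds(hp, die, bonus=0):
    best = math.ceil(hp / (die + bonus))
    worst = math.ceil(hp / (1 + bonus))
    avg = hp / (die / 2 + 0.5 + bonus)
    return best, worst, avg

print(rounds(20, 4))      # (5, 20, 8.0)
print(rounds(70, 4, 10))  # (5, 7, 5.6)
```

This reproduces both examples: straight d4 vs. 20 hp runs 5 to 20 rounds (average 8), while d4+10 vs. 70 hp is pinned to 5-7 rounds (average 5.6).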
Minigiant said:
And we get druids who can't build themselves a shelter to wait out a rainstorm or a fighter with a ~60% chance of missing the orc. And that's why I hate low level. You can't do jack reliably unless you powergame because all the bonuses are +0 and +1 but the DCs are much higher.
Call it a severe playstyle or preference difference if you like, but I just can't agree with the assumption that a PC has to always be successful at what it does. Yes, a low-level fighter might miss the orc 60 or 70 or even 80% of the time. So what? What makes the fighter a hero is that she's out there fighting the orc at all, where lesser people hide in their homes behind barred doors.
Minigiant said:
Something about zeroes. They affect how people think. Zeroes affect how players and DMs expect to see things. It'll affect their views on races, on classes, on skills, on proficiency range, on HP... without their thinking about the consequences of their newly formed preferences.
Here again it comes down to player (and DM) attitude on this one point and speaks to a larger issue of entitlement and expected success. I can't help you there.

Lan-"if nobody ever made mistakes the game - and real life - would be pretty dull"-efan
 

Pyromantic

First Post
It is true, but the difference is pretty trivial for any likely numbers. 6.0/6.02 is very close to 1. (Referencing your other post; sorry, I forgot to quote it.)

Indeed for linear situations like to-hit the only question is a matter of how big the range is.

That post was intended in part to show the real complexity behind the study of probability. The difference between 6 and 6.02 is very minor, but that particular example is very narrow since it will almost always require 6 attacks.

d4 damage per attack to do 5 or more damage is a better example to show how far off the method can be. Just taking 5 divided by the average damage of a d4 (2.5) would suggest 2, but the actual number of expected hits is around 2.44.
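That 2.44 figure can be reproduced exactly with the recursion E[h] = 1 + (E[h-1] + E[h-2] + E[h-3] + E[h-4]) / 4, with E[h] = 0 for h ≤ 0:

```python
from functools import lru_cache

# Exact expected number of d4 hits needed to deal at least `hp` damage.
# Each hit contributes 1 attack plus the expected remainder, averaged
# over the four equally likely damage rolls.
@lru_cache(maxsize=None)
def expected_hits(hp, die=4):
    if hp <= 0:
        return 0.0
    return 1.0 + sum(expected_hits(hp - d, die) for d in range(1, die + 1)) / die

print(round(expected_hits(5), 2))  # 2.44, not the naive 5 / 2.5 = 2
```

The naive division undercounts because the last hit usually wastes damage past the target.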

I think CJ is quite right on the damage (and other things) aspect though. (snip...)


As I said before I'm not opposed to the idea in principle, but I think we should first establish how much variance you want to see in a quantity like damage. While I tend to agree d4-1 for the wizard and d10+4 (as examples) for a fighter might be too big a range in melee damage potential, there are other ways to potentially narrow that range. These are some ideas off the top of my head that could be implemented as standard or readily accessible options:
  • Weapon proficiencies for weapons with better damage types.
  • At-wills, so that even if they aren't doing better than d4-1 damage in melee, they can routinely do more damage on their turn anyway.
  • Attacks based on stats other than strength, either in whole or in part. You could allow everyone to add their best stat modifier to damage in addition to strength for example.
Any of these are potentially options, and IMO probably better than increasing all stat modifiers.
 

Chris_Nightwing

First Post
My issue is not that the chances are identical; it's that the chances of the talented, the skilled, and the untrained are practically identical, making the odds not match the fluff.

Over many threads here and in other places, I've seen many people suggest that an untrained character has +0 and a trained one a mere +2, granting only a 10% increase in their chances.

Part of this is because people think Zero when they think untrained. But when these same people are asked what trained is and you offer a +5, they go "TOO HIGH!" and opt for the +1 or +2 option.

Then when you ask for the DC, they come up with high values of 15 and up.

And we get druids who can't build themselves a shelter to wait out a rainstorm or a fighter with a ~60% chance of missing the orc. And that's why I hate low level. You can't do jack reliably unless you powergame because all the bonuses are +0 and +1 but the DCs are much higher.

I think it's pretty easy to retrain people to think about DC. The reason people reject +5 is because they then think, and realise that there's already a +4 difference from stats between the best and worst person in their party, and they don't want that to become +9. I'm not sure what the consensus is on natural vs. learned skill ability, but clearly the balance isn't right for that many people to complain. It might be poll-worthy to find out what range of differences people prefer in skill checks, especially with the D&DN playtest suggesting the raw stat would be the skill modifier (which doubles the range of ability difference from 3/4e).
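For reference, the success chances being argued about, assuming the usual roll of d20 + bonus against a DC:

```python
# Chance that d20 + bonus meets or beats the DC, clamped so that
# only natural rolls of 1-20 count.
def success_chance(bonus, dc):
    return max(0, min(20, 21 - (dc - bonus))) / 20

for bonus in (0, 2, 5):
    print(bonus, success_chance(bonus, 15))  # 0.3, 0.4, 0.55 vs. DC 15
```

Against a DC 15 the untrained +0 character succeeds 30% of the time, the +2 trained version 40%, and the rejected +5 version only 55% — which puts some numbers on why the +1/+2 training bonus feels so thin.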
 

I'd be ok with that if there are plenty of viable options for starting at lower stat totals, but still capping them out as you had listed--either in amount or modifier. In other words, somewhere between level 1 and level max, a fighter is going to get around 20 Str, but he might start anywhere from around 13 (Basic style) to 16 or 18. No matter where he starts, getting to 20 is about the best he can do.


That allows room in the math for:
  • Stats are pretty lousy at first, your attack bonuses start around +5 or +6, and you grow steadily throughout the career. You get stat boosts occasionally and then the normal but slight growth from the other factors.
  • Stats are great at first, and you are pretty much stuck with what you start with. You've got a heftier mod up front, but you now grow more slowly through the career, being limited to those other factors.
In that game, some people might even be happy mixing both. You roll for stats, and how poorly or well you roll only matters initially. Every level, it matters less, because the guy that started behind steadily closes the gap.
Of course, with the right kind of ability score caps and increases, you can get that kind of effect whether you get the base numbers off of zero or not.

I guess you might call it a personal preference, but I'll be quite happy to see stat boosts go the way of the dodo. Maybe there's a way to get a SMALL one, but I'd rather see stats reflect your basic innate talent and let other things increase. It is related to RP. If I am a dumb guy or a clumsy guy I'd rather stay that way. I can see getting a bit stronger or quicker or something perhaps, but IMHO keep it in the 3-18 range, or maybe 3-20 say.

One of the issues with 4e has always been that you have these couple of stats you can boost, and it just opens up a large chasm between PCs as you level up. This relates to the skill bonus variation too, which if it is somewhat problematic at 1st level is just gigantic at high levels.
 

Crazy Jerome

First Post
As I said before I'm not opposed to the idea in principle, but I think we should first establish how much variance you want to see in a quantity like damage. While I tend to agree d4-1 for the wizard and d10+4 (as examples) for a fighter might be too big a range in melee damage potential, there are other ways to potentially narrow that range. These are some ideas off the top of my head that could be implemented as standard or readily accessible options:
  • Weapon proficiencies for weapons with better damage types.
  • At-wills, so that even if they aren't doing better than d4-1 damage in melee, they can routinely do more damage on their turn anyway.
  • Attacks based on stats other than strength, either in whole or in part. You could allow everyone to add their best stat modifier to damage in addition to strength for example.
Any of these are potentially options, and IMO probably better than increasing all stat modifiers.

I'm not terribly concerned about the exact method used, as long as the damage expression gets further off of zero. For that matter, moving the average sufficiently away from zero would probably give most of the benefits.

That is, for my purposes, the dagger in the 1st level, relatively low Str hands, doing around 1d4+4 or 1d6+3 or 1d8+1 or even 1d10 are all good. Or perhaps 1d4+3 or 1d6+2 or 1d8 would be sufficient. Either way, you then adjust more skilled 1st level attacks and higher level attacks from there (slowly). If I understand Lanefan's stated preference, if forced to adapt to something like this, he'd probably prefer one of those later numbers than the earlier ones. I could easily go with any of them for a particular campaign. The mod of around +4 is merely a convenient way to talk about this.

I don't agree that providing at-wills based on other stats fully solves this issue. It sort of does the minimum, in that now everyone has a decent damage expression, so they can function. But if that moves certain flavorful options--such as 1st level wizard with dagger--from "might do occasionally when pressed" to "would rather play dead than try, ever"--you've effectively removed that option from the game. There's a fine line between "lousy but possible" and "don't even try"--and I'd like the boundaries of that line to encompass as much territory as it reasonably can. :D

The other things you listed are certainly valid approaches to solving the whole shebang.
 

JohnSnow

Hero
I think what the OP is getting at here is not really so much about the escalating numbers, but about making sure that there's enough design space at the low end of the system.

Taking the combat example, if a "trained" fighter gets a bonus to hit at 1st-level of +5, then there's theoretically 4 levels of improvement that are better than no bonus, but less than that of someone trained. Let's say an average strength 1st-level commoner starts at +0 to hit. The fighter's bonus needs to be enough above that to allow the cleric, the rogue, and the wizard (and all the other classes) to fit "between."

The degree of granularity you can accomplish in the system is limited by the availability of whole numbers. If the system starts with a trained fighter getting only a +1 bonus relative to the commoner, there's not enough design space to fit the other classes "in-between" the two. I need the fighter to have at least a +2 bonus, at which point, I could put the wizard at 0 (or even -1) the commoner at 0, and the thief and cleric at +1.

Theoretically, of course, this can be accomplished with any appropriate range of whole numbers. But the wider the spread of numbers is, the more design space exists. Moreover, ratios can be important. By starting with larger numbers, we can theoretically preserve a decent ratio, while still allowing for enough design space to solidly differentiate the to-hit bonuses or hit points of the classes (or weapon damage codes, skill training levels, or whatever).

Take the way health is assigned in the Dragon Age RPG, for example. The Mage starts at 20 + Con (roughly equivalent in magnitude to a Con bonus in D&D) + 1d6. The rogue starts 5 higher than the mage and the warrior starts 5 higher than THAT. From there, each class gains the same number of health (minus differences in Con) at each level.

What this means is that the warrior starts out with slightly less than 50% more health than the mage, and the rogue is between the two. And, due to the relative emphasis each class places on their Con score, that distinction remains roughly consistent as the characters go up in level (without getting too far into the statistics). So, now we have enough design space for three classes, without them being too far apart, and which allows for each one to take more than a single hit in combat before dropping.
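A back-of-the-envelope check of those Dragon Age numbers (the Con values here are illustrative assumptions, and 3.5 stands in for the average of 1d6):

```python
# Average starting health per the description above: mage = 20 + Con + 1d6,
# rogue +5 over the mage, warrior +5 over the rogue. Con values are made up
# to reflect each class's typical emphasis.
def avg_health(class_offset, con):
    return 20 + con + 3.5 + class_offset  # 3.5 = average of 1d6

mage = avg_health(0, 1)
rogue = avg_health(5, 2)
warrior = avg_health(10, 3)
print(mage, rogue, warrior, round(warrior / mage, 2))
```

With these assumed Con scores the warrior-to-mage ratio comes out around 1.49, matching the "slightly less than 50% more health" figure.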

There would be similar space (3 levels) in making the range of hit dice run from d4 to d8, or d6 to d10, (or even d8 to d12), but there's not a lot of distinction between them. And, moreover, the closer you start to 0, the more the first couple levels of competence gained end up mattering. For example, if a 6th-level character has 6 times the hit points of a 1st level one, that says something very different about the system than if he has twice the amount.

Sure, you could achieve the same result by having each class gain, say, 1, 2, or 3 hit points as they level up (or any three numbers), but by starting the numbers larger, you open design space for different things to be accounted for (whole numbers, remember?), while still producing a flatter power curve.

Make sense?
 

I think what the OP is getting at here is not really so much about the escalating numbers, but about making sure that there's enough design space at the low end of the system.

Taking the combat example, if a "trained" fighter gets a bonus to hit at 1st-level of +5, then there's theoretically 4 levels of improvement that are better than no bonus, but less than that of someone trained. Let's say an average strength 1st-level commoner starts at +0 to hit. The fighter's bonus needs to be enough above that to allow the cleric, the rogue, and the wizard (and all the other classes) to fit "between."

The degree of granularity you can accomplish in the system is limited by the availability of whole numbers. If the system starts with a trained fighter getting only a +1 bonus relative to the commoner, there's not enough design space to fit the other classes "in-between" the two. I need the fighter to have at least a +2 bonus, at which point, I could put the wizard at 0 (or even -1) the commoner at 0, and the thief and cleric at +1.

Theoretically, of course, this can be accomplished with any appropriate range of whole numbers. But the wider the spread of numbers is, the more design space exists. Moreover, ratios can be important. By starting with larger numbers, we can theoretically preserve a decent ratio, while still allowing for enough design space to solidly differentiate the to-hit bonuses or hit points of the classes (or weapon damage codes, skill training levels, or whatever).

Take the way health is assigned in the Dragon Age RPG, for example. The Mage starts at 20 + Con (roughly equivalent in magnitude to a Con bonus in D&D) + 1d6. The rogue starts 5 higher than the mage and the warrior starts 5 higher than THAT. From there, each class gains the same number of health (minus differences in Con) at each level.

What this means is that the warrior starts out with slightly less than 50% more health than the mage, and the rogue is between the two. And, due to the relative emphasis each class places on their Con score, that distinction remains roughly consistent as the characters go up in level (without getting too far into the statistics). So, now we have enough design space for three classes, without them being too far apart, and which allows for each one to take more than a single hit in combat before dropping.

There would be similar space (3 levels) in making the range of hit dice run from d4 to d8, or d6 to d10, (or even d8 to d12), but there's not a lot of distinction between them. And, moreover, the closer you start to 0, the more the first couple levels of competence gained end up mattering. For example, if a 6th-level character has 6 times the hit points of a 1st level one, that says something very different about the system than if he has twice the amount.

Sure, you could achieve the same result by having each class gain, say, 1, 2, or 3 hit points as they level up (or any three numbers), but by starting the numbers larger, you open design space for different things to be accounted for (whole numbers, remember?), while still producing a flatter power curve.

Make sense?

That would seem to be pretty much the gist of it. The long and short is that both the absolute difference and the RATIO matter. If, say, fighters have 3x the hit points of wizards, then if wizards start with 2 hit points and fighters start with 6 and a sword does d8 damage, the fighter is really not 'tougher' than the wizard to any appreciable degree, because the absolute difference is fairly (but not quite) trivial. Either PC is pretty likely to go down in one hit. Drop 10 more points on each one's hit point total, though, and the absolute difference is the same (4 points) but the ratio is now 3:4, making the two much closer in toughness. In either case the difference at level 1 is small, but in the latter case, assuming the fighter adds 6 points per level and the wizard 2, they will diverge more slowly in relative terms, which is probably good (not that I'm advocating any specific numbers or ratios, mind you).
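A quick sketch of that divergence, using the example's numbers after the +10 shift (wizard 12 hp gaining 2 per level, fighter 16 hp gaining 6 per level; purely illustrative):

```python
# Linear hit point growth: base at level 1 plus a fixed gain per level.
def hp(base, per_level, level):
    return base + per_level * (level - 1)

for level in (1, 5, 10):
    wiz, ftr = hp(12, 2, level), hp(16, 6, level)
    print(level, wiz, ftr, round(wiz / ftr, 2))  # ratio: 0.75, 0.5, 0.43
```

The wizard starts at three-quarters of the fighter's total and drifts down gradually, rather than starting at one-third and being one sword swing from the floor.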
 
