mmadsen said:
Well, it's a question of who's attacking and who's defending. For a skilled fighter who's already 90% likely to hit his opponent, a +1 means he's 95% likely to hit his opponent. As I pointed out, that "5% increase" means he hits 1.06 times as often -- which we might call a "6% increase" in hits. For an unskilled combatant who's only 5% likely to hit that same opponent, a +1 means he's now 10% likely to hit. That same "5% increase" means he hits twice as often -- which we might call a "100% increase" in hits.
Which is what you said earlier, and which isn't really the point. In both cases, +1 equates to an additional 5% chance of hitting, as you point out. The proportional increase is irrelevant, at least to the point I'm making. The original poster said that there's a point in D&D where a +1 bonus doesn't mean anything, and that's simply not true. A +1 bonus always means something, because a d20 is linear.
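If you want to see it in numbers, here's a quick Python sanity check. It assumes d20-style "natural 1 always misses, natural 20 always hits" rules (my assumption for the sketch; the exact auto-miss/auto-hit handling varies by edition and mechanic):

```python
# Sanity check: on a d20, a +1 bonus moves exactly one face of the die
# across the hit/miss line -- a flat 5% -- no matter where you start.

def hit_chance(attack_bonus, ac):
    """Chance that d20 + attack_bonus >= ac, with nat 1 missing and nat 20 hitting."""
    hits = sum(1 for roll in range(1, 21)
               if roll == 20 or (roll != 1 and roll + attack_bonus >= ac))
    return hits / 20

# A commoner, an epic fighter, a long-shot attack, and an easy one
for bonus, ac in [(0, 10), (37, 47), (5, 25), (10, 12)]:
    base = hit_chance(bonus, ac)
    plus_one = hit_chance(bonus + 1, ac)
    print(f"+{bonus} vs AC {ac}: {base:.0%} -> {plus_one:.0%} "
          f"(gain {plus_one - base:.0%})")
```

The gain is 5% in every case except the last one, where the attacker is already hitting on everything but a natural 1 -- that's the "deep into auto-hit territory" exception, and the only place the +1 does nothing.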
mmadsen said:
My point is merely that "5% increase" doesn't necessarily mean 5% more hits, and that the actual multiplier varies depending on the original to-hit probability.
I'm not talking about an increase in the number of hits. Not to mention, your "90% vs 5%" example applies to all characters regardless of skill level. There are times when a skilled fighter will have a 5% chance to hit, and times when an unskilled commoner will have a 90% chance. The value of the bonus as an addition in a given situation remains the same regardless: a 5% shift.
mmadsen said:
Even without a bell-curve probability distribution, that +1 has much, much less effect for an attacker with a high bonus than for another attacker with a lower bonus (assuming the same opponent).
It has the same effect. It increases their chance to score a hit by 5%. That this 5% might not matter much when a skilled fighter is attacking a bound-and-gagged opponent or when a commoner tries to swing at Orcus (rough equivalents of the example you've set up) doesn't really speak to how a +1 bonus affects die rolls on a flat distribution.
mmadsen said:
An attack bonus of +37 vs an AC of 47 is the same as an attack bonus of +0 vs an AC of 10 regardless of what dice you roll (as long as it's dice plus bonus vs. AC).
Yes, 50% = 50%. That's not my point. The point of my example was to show you that in both these cases -- i.e., a lowly commoner and an epic-level warrior -- the +1 bonus means the same thing: a 5% addition to their chance to score a hit with a single attack. Ditto if either was in a situation where their chances were 5%, 25%, or 75%. Neither character is going to choose to forego the bonus, because the bonus matters. The only time the bonus doesn't matter is when you're deep into auto-hit/auto-miss territory (unless you're talking about mechanics that don't use that, like skill checks, in which case it always matters).
In a curved distribution, however, the closer to the narrow ends of the curve, the less value the bonus has. In a 3d6-roll-under system, the +1 adds only about 0.5% (1/216) at the extremes, which isn't even worth the bother. In the middle, it adds as much as 12.5% (27/216), very much worth the bother.
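The contrast with the d20 is easy to verify. A minimal Python sketch of a 3d6-roll-under check (success on 3d6 <= target), measuring what a +1 to the target is worth at different points on the curve:

```python
from itertools import product

# Distribution of 3d6: counts[s] = number of the 216 ways to roll sum s
counts = {}
for dice in product(range(1, 7), repeat=3):
    s = sum(dice)
    counts[s] = counts.get(s, 0) + 1

def success_chance(target):
    """Roll-under: succeed if 3d6 <= target."""
    return sum(c for s, c in counts.items() if s <= target) / 216

# Marginal value of a +1 (raising the target by one) at the low end,
# the middle, and the high end of the curve
for target in [2, 10, 17]:
    gain = success_chance(target + 1) - success_chance(target)
    print(f"target {target:2d}: {success_chance(target):6.2%} -> "
          f"{success_chance(target + 1):6.2%} (gain {gain:.2%})")
```

At the tails the +1 is worth 1/216 (about 0.46%); in the middle it's worth 27/216 (12.5%). On a flat d20 the same loop would print 5% every time.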
Now, you could argue that a
+1 sword is of little use at very high levels, when you take into account things like the immunities possessed by high-CR monsters, or that it's going to look shabby next to the +5 flaming burst ghost touch vorpal greatswords PCs can afford by then... but that doesn't really have anything to do with the value of a +1 bonus. If I'm that +37 fighter battling that AC 47 demon, I'll take all the bonuses I can get. That's why PCs don't stop flanking opponents just because they're 19th level.
D&D/d20 is linear. It scales. The system knows this. There is no point in level advancement where bonuses become less useful. There are only
situations (and generally pretty extreme ones) where they are more or less useful, and such situations are available at all levels of advancement.