Stalker0 said:
Can you explain how you did the math on this one? Basically what I did was assume that a +1 to attack increases your average damage by (base) x .05... basically over the long run you are hitting 5% more often. Once the base is high enough that the average damage increases by 1, then it's equal to damage +1... which is 20 in this case.
It's a little lengthy, but I'll try...
There are two ways of looking at percentages. One is the easy way. One is the insanely difficult, aneurysm-inducing way. The way you're doing it isn't the easy way. It can be done that way, but it's much, much more difficult. In fact, I might not even be able to do it that way without a few days to kill. The way I look at it, a d20 roll of 11 is not 5% higher than a 10; it is 10% higher, because 11 is 110% of 10.
Also, there are two different types of "damage" you have to keep in mind: average damage, and average damage per hit. I did not distinguish between those two things in my post above, and I hope that doesn't cause confusion. Average damage per hit is what most people think of: 1d8+6 has an average damage per hit of 10.5. Actual average damage, though, depends on that and on your chance of actually hitting, or:
Average Damage = (Average Damage per Hit) x (% chance of hitting).
Chance of hitting is not a static number like average damage per hit is. It depends on the difference between your attack bonus and the enemy's AC, i.e. what you need to roll on a d20 to hit. Your average damage will be much less against something you need a 19-20 to hit than against something you need a 2-20 to hit.
Therefore, whether an increase to damage or to-hit is more beneficial (i.e., increases average damage more) depends on your average damage per hit and on the d20 roll needed to hit.
Each factor is equally important to average damage. I.e., doubling average damage per hit and leaving chance to hit alone is the same thing as doubling chance to hit and leaving average damage per hit alone, keeping in mind that the chance of hitting can never go above 95%.
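If it helps, here's a rough Python sketch of that formula. The function names are mine, not anything official, and I'm assuming the usual "natural 1 always misses, natural 20 always hits" clamping, which is where the 95% cap comes from:

```python
# Minimal sketch of: Average Damage = (Average Damage per Hit) x (% chance of hitting)

def hit_chance(attack_bonus, ac):
    """Chance to hit when you need (AC - attack bonus) or better on a d20."""
    min_roll = ac - attack_bonus
    min_roll = max(2, min(20, min_roll))   # nat 1 always misses, nat 20 always hits
    return (21 - min_roll) / 20

def average_damage(avg_damage_per_hit, attack_bonus, ac):
    """Average damage = average damage per hit x chance of hitting."""
    return avg_damage_per_hit * hit_chance(attack_bonus, ac)

# Example: 2d6+3 (10 per hit on average), +15 to hit vs AC 26 -> need an 11 -> 50%
print(average_damage(10, 15, 26))   # 5.0
```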
Now, the math that I used to come up with my above post is a bit long, and would be near unreadable in a message board format, so what we'll do instead is two case studies: the break-even point (where +1 to damage is the same as +1 to attack), and one demonstrating the trend, where there is a difference in importance between the two.
BREAK EVEN POINT
Okay, let's say your attack bonus is +15, and the enemy's AC is 26. Your weapon does 2d6+3 damage. You need to roll an 11 or higher to hit. Your average damage per hit is 10.
(21 - min roll to hit) = (21 - 11) = 10. [10/20 => you have a 50% chance of hitting.]
Average damage = (average damage per hit) x (chance of hitting) = 10 x 50% = 5.
Now, this is the break-even point, where +1 to hit and +1 to damage are worth the exact same.
+1 to hit => You now hit on a 10-20.
(21-min roll to hit) = 21-10 = 11. [11/20 => you have a 55% chance of hitting]
Average damage = 10 x 55% = 5.5
+1 to damage => Your average damage per hit is now 11.
Average damage = 11 x 50% = 5.5 (the same average damage reached by increasing to-hit by 1).
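If you'd rather not trust my arithmetic, here's a quick self-contained check of the break-even numbers (the variable names are just my own labels):

```python
# Break-even case: 2d6+3 (10 per hit on average), need an 11 or better to hit
base        = 10 * (21 - 11) / 20   # no bonus: 50% to hit
plus_to_hit = 10 * (21 - 10) / 20   # +1 to hit: now need a 10 -> 55%
plus_to_dmg = 11 * (21 - 11) / 20   # +1 to damage: 11 per hit, still 50%

print(base, plus_to_hit, plus_to_dmg)   # 5.0 5.5 5.5
```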
SHOWING TREND
Okay, let's say your attack bonus is +15, and the enemy's AC is 26. Your weapon does 2d6+8 damage. You need to roll an 11 or higher to hit. Your average damage per hit is 15.
+1 to hit => You now hit on a 10-20.
(21-min roll to hit) = 21-10 = 11. [11/20 => you have a 55% chance of hitting]
Average damage = 15 x 55% = 8.25
+1 to damage => Your average damage per hit is now 16.
Average damage = 16 x 50% = 8.00. In this case, +1 to hit is worth slightly more than +1 to damage. This holds true with what I said:
If (21 - min roll to hit) = average damage per hit: + to attack and + to damage are equal.
If (21-min roll to hit) < average damage per hit: + to attack is worth more than + to damage.
If (21-min roll to hit) > average damage per hit: + to damage is worth more than + to attack.
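And if you want to see the rule (and the 2d6+8 example) checked without doing the arithmetic by hand, here's a rough sweep in Python. The helper name avg_damage and the particular per-hit values I loop over are just my own choices for illustration:

```python
# At a fixed minimum roll to hit, compare +1 to hit against +1 to damage
# for a range of average-damage-per-hit values.

def avg_damage(per_hit, min_roll):
    return per_hit * (21 - min_roll) / 20

min_roll = 11   # need an 11 or better, as in both case studies above
for per_hit in (5, 10, 15, 20):
    to_hit = avg_damage(per_hit, min_roll - 1)    # +1 to hit
    to_dmg = avg_damage(per_hit + 1, min_roll)    # +1 to damage
    winner = ("+1 to hit" if to_hit > to_dmg
              else "+1 to damage" if to_dmg > to_hit
              else "equal")
    print(per_hit, round(to_hit, 2), round(to_dmg, 2), winner)

# With (21 - min roll) = 10, a per-hit average of 10 is the break-even point;
# below 10, +1 to damage wins, and above it (e.g. the 2d6+8 = 15 case), +1 to hit wins.
```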
Hopefully this was somewhat understandable.