System Legacy Issues (from SoD: Y or N)

Reynard

I think I agree with a fair bit of what Ariosto is saying -- namely, that pre-2nd edition versions of D&D presupposed quick and fairly frequent generation of PCs, and that keeping parts of the game system in 3E (like save-or-die) while changing other parts (quick PC creation, hp totals, damage caps, etc.) can reduce the game's ability to provide satisfactory play to many players -- both those looking for old-style play and those looking for story/PC-intensive play.

This is something I have been thinking a lot about as I run my PF game and create adventures (many of which might be described as "old school").

There are a lot of elements in D&D 3.x and PF that hark back to earlier editions, but don't necessarily quite fit because of other changes to the systems -- D&D-isms that don't really fit D&D anymore, if that makes sense.

Before I go too far, though, I want to acknowledge that 4E is actually better about this. While a lot of the terminology is the same, the actual system elements are not, and much of 4E was apparently built from the ground up. It is one of the reasons some of us don't like 4E much, but I want to give credit where credit is due: because of the "re-imagining D&D" nature of 4E design, it has fewer legacy mechanic issues.

Anyway, as an example let's look at the lowly dagger. In another thread, Bullgrit asked how often your AD&D or B/X wizard stabbed things with a dagger, either after his one or two spells were cast or in order to save them for the "big fight". Fire-and-forget magic-users aside, one legacy mechanic issue this brings up is what 1d4 points of damage means between editions. In AD&D and B/X D&D, 1d4 was a significant amount of damage, and not just for a single level of play. Creatures rolled their hit points too, and those hit points were lower in general, so the possibility that a spell-spent mage whipping darts or daggers could still contribute to combat success was very real. Goblins were weak, and were "level appropriate" enemies for months or years of play. 2E started the initial hit point inflation without a commensurate increase in weapon damage, and 3E took that ball and ran hard with it. Suddenly the dagger-wielding mage wasn't just desperate; he was a danger to himself and others and a liability to the party.
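To put rough numbers on the "what does 1d4 mean" point, here is a quick sketch comparing average dagger damage against typical monster hit point totals. The stat lines are approximate edition averages chosen for illustration, not exact quotes from any book:

```python
# Rough illustration: how many average dagger hits (1d4, avg 2.5) it takes
# to drop a typical foe as hit point totals climb between editions.
# Foe stat lines below are approximations for illustration only.

def avg_roll(dice, sides, bonus=0):
    """Average result of rolling `dice` d`sides` + bonus."""
    return dice * (sides + 1) / 2 + bonus

dagger_avg = avg_roll(1, 4)  # 2.5

foes = {
    "B/X goblin (1d8-1 hp)": avg_roll(1, 8, -1),   # ~3.5 hp
    "AD&D ogre (4+1 HD)":    avg_roll(4, 8, 1),    # ~19 hp
    "3.5E ogre (4d8+11 hp)": avg_roll(4, 8, 11),   # ~29 hp
}

for name, hp in foes.items():
    print(f"{name}: ~{hp:.1f} hp, ~{hp / dagger_avg:.1f} average dagger hits to drop")
```

A goblin drops in one or two average hits, so a dagger-armed mage plausibly contributes; against the fatter hit point totals of later editions, the same 1d4 is a rounding error.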

Similar things can be said about all weapons and characters engaging in melee, of course, and by extension a lot of spells that cause damage. Magic Missile was a powerful spell, especially against "minions" (note that it did 1d6+1 in B/X and 1d4+1 in AD&D). Spiritual Hammer was a solid offensive cleric spell for much the same reason. Unfortunately, except for the big-ass two-handed weapons, weapon and spell damage remained static from edition to edition as hit points increased.

A similar example of legacy mechanics is the poor shield. In earlier versions of the game, where a +1 to hit was a rare and valuable bonus, a +1 to AC was equally valuable. Moreover, the more abstract nature of combat -- prior to maneuvers and feats and the like -- meant that the shield's AC bonus was "complete". That is, +1 AC effectively modeled a shield insofar as the combat system itself was concerned. But the combat system grew more complex and more granular as time went on, and the shield remained a simple, relatively small AC bonus, reducing its utility (especially relative to the previously noted two-handed weapons). In addition, its defensive value was not carried throughout the combat system. (In PF, I let a proficient character's shield bonus increase his CMD and Reflex saves versus area effect damage attacks; shields are the most commonly employed arm in the history of mankind, across all cultures, for a reason.)

There are lots of other areas where tradition, legacy mechanics and "sacred cows" remain inherent in the game without taking into account all the associated systemic changes.

Further examples? Thoughts? Counter arguments?
 


The hit point inflation in 3E seemed designed to adjust the relative effectiveness of fighter types and spell wielders.

Consider that spell damage suffered under 2E (damage dice were capped for magic missile, fireball, et al.).

3E hugely inflated hit point totals, both PC and creature (compare the maximum 88 hp ancient red dragon with 3E's 660 hp average, for example).

Fighter-types got a few stacking increases to damage -- improved stat modifiers, Power Attack, stacking bonuses, etc. In my current 3.5 game, the dwarven fighter has dealt over 200 points of damage with a single blow and quite commonly deals 30-50 points per blow.

Spell wielders got an increase in depth (more spells per day) and a slight reduction in penetration (the prevalence of spell resistance seemed to increase, and saving throws generally got easier to make -- especially against low-level spells). Damage remained identical to 2E, and those few spells with hit point/hit dice caps (cf. Power Word: Kill) did not have their caps increased.

This shift in relative damage capability has led spell-wielders to focus far less on damage-dealing spells in favour of "ignore hp" spells.
 

A further example would be the shift toward save-or-lose spellcasting in 3E.

As you have pointed out, in 3E hit point totals skyrocketed while damage--spellcasting damage especially--stayed about where it was. At the same time, the core saving throw mechanic underwent a major shift.

Pre-3E, there was no concept of a "save DC." You just rolled a saving throw, and you either made it or didn't based on your class, level, and the type of save. It didn't matter whether the guy targeting you was a world-class archmage or a bumbling apprentice; your chances of making your save against the spell were the same either way.

As a result, by the time you had access to spells like finger of death, the chance of their actually working on a level-appropriate target was fairly small. In 3E, however, they kept all the old save-or-lose spells but implemented a new system where save DCs kept pace with enemy saving throws.

Result: Save-or-lose spells got a massive power boost right at the same time that direct-damage spells were getting the hell nerfed out of them. Cue the rise of the Batman wizard who wouldn't be caught dead throwing fireballs.
 

A further example would be the shift toward save-or-lose spellcasting in 3E. ... Save-or-lose spells got a massive power boost right at the same time that direct-damage spells were getting the hell nerfed out of them.


I generally agree, though I think the chance a target failed its save in the older editions was higher than against an unoptimised spellcaster in the newer editions.

I'm away from my reference material so my numbers may be off a couple of points.

Consider the ancient red dragon again. It had 11 hit dice, and monster saving throws improved by 1 per 2 hit dice IIRC. So it had a total +5 on its save compared to, say, an orc. Save versus spell was a tough category -- 19 for the orc, so about 14 for the dragon. The orc fails 90% of the time versus any spell and the dragon fails 65% of the time.

The red dragon of 3.5E has 40 HD and all favoured saves, so its saving throw bonus is about 20-25 points higher than a base orc's. Saving throw DCs are based on spell level and attribute modifiers. The orc, against a 1st level spell cast by a beginning character, will face a DC around 15, probably gets a +1 or better bonus on the save, and fails 65% of the time. The dragon is facing a 9th level spell cast by a high level character with about a +12ish attribute modifier, for a DC of 31. The dragon needs to roll between 6 and 11 to save (assuming it has no other bonuses) and fails about 25-50% of the time.
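The arithmetic above can be sanity-checked with a short script. All the target numbers, bonuses, and DCs are the approximate figures from this thread, not exact stat-block values:

```python
# Quick check of the save-or-die arithmetic above, using the thread's
# approximate numbers (not exact stat-block values).

def fail_chance(target_number):
    """Chance of failing a d20 save that needs `target_number` or higher.
    Clamped so a natural 20 always saves and a natural 1 always fails
    (a 3E rule, and a common house rule earlier)."""
    target = max(2, min(20, target_number))
    return (target - 1) / 20

# Old-school: fixed save targets, determined only by the defender.
print(f"orc, save vs. spell 19:     fails {fail_chance(19):.0%}")
print(f"dragon, save vs. spell 14:  fails {fail_chance(14):.0%}")

# 3E-style: roll d20 + save bonus against a caster-driven DC.
def fail_chance_dc(save_bonus, dc):
    return fail_chance(dc - save_bonus)  # minimum d20 roll to succeed

print(f"orc, +1 save vs. DC 15:     fails {fail_chance_dc(1, 15):.0%}")
print(f"dragon, +25 save vs. DC 31: fails {fail_chance_dc(25, 31):.0%}")
print(f"dragon, +20 save vs. DC 31: fails {fail_chance_dc(20, 31):.0%}")
```

This reproduces the percentages quoted above: 90% and 65% failure under the fixed-target scheme, versus 65% for the orc and 25-50% for the dragon once DCs scale with the caster.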
 

Further examples? Thoughts? Counter arguments?

I think I generally agree with you, but don't feel that your examples are particularly strong. The general nerfing of fireball, and the consequent rise of save-or-suck and buffing spells that others have mentioned, is probably the best example.

Anyway, as an example let's look at the lowly dagger...one legacy mechanic issue this brings up is what 1d4 points of damage means between editions. In AD&D and B/X D&D, 1d4 was a significant amount of damage, and not just for a single level of play. Creatures rolled their hit points too, and those hit points were lower in general, so the possibility that a spell-spent mage whipping darts or daggers could still contribute to combat success was very real. Goblins were weak, and were "level appropriate" enemies for months or years of play.

This is all to some extent true, depending on what the house rules were at your table; 1d4 damage could indeed kill many lesser creatures. However, it wasn't a particularly significant amount of damage, and its contribution to success relative to just about anything else was small. It would be a long time before a M-U could take on an ogre or even a bugbear with a lowly dagger, and generally speaking, if that was the M-U's only combat option he'd be better off fleeing. You threw darts at a target not because it was significant but because you had nothing better to do.

What I think you are missing is the effect of magic items on this calculation. Even a dagger +2 can triple or more the expected damage output and transform your M-U into something that can, if necessary, go toe-to-toe with orcs and hobgoblins. It wouldn't be your first option, but sometime between 3rd and 6th level it became reasonable.
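The "triple or more" claim follows from the bonus applying to both the to-hit roll and the damage. Here is a sketch of that expected-damage calculation; the baseline to-hit number is an assumption for illustration, not a specific stat block:

```python
# Illustrative sketch: a +2 dagger improves both the chance to hit and the
# damage per hit, so expected damage per attack can jump by 2-3x.
# The baseline to-hit roll below is an assumption, not a stat-block value.

def expected_damage(roll_needed, avg_damage, to_hit_bonus=0, dmg_bonus=0):
    """Expected damage of one attack: P(hit) * average damage on a hit.
    Clamped so a natural 20 always hits and a natural 1 always misses."""
    needed = max(2, min(20, roll_needed - to_hit_bonus))
    p_hit = (21 - needed) / 20
    return p_hit * (avg_damage + dmg_bonus)

avg_d4 = 2.5
needed = 16  # assumed d20 roll a low-level M-U needs against, say, an orc

plain = expected_damage(needed, avg_d4)
magic = expected_damage(needed, avg_d4, to_hit_bonus=2, dmg_bonus=2)
print(f"plain dagger: {plain:.3f} expected damage per attack")
print(f"dagger +2:    {magic:.3f} expected damage per attack")
print(f"ratio:        {magic / plain:.1f}x")
```

With these assumed numbers the +2 weapon multiplies expected output by roughly 2.5x, and the multiplier grows the harder the target is to hit in the first place.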

At low levels, I don't really think that 3e inflated hit points that much. Hobgoblins are still going to have 1d8+1 hit points. Kobolds still have 1d4. At the levels we are comparing, the effectiveness of a dagger is still pretty high, and the 3e Wizard has a better 'to hit' progression than the 1e M-U as well. The low level Wizard with a magic staff or dagger is comparable in effectiveness to his low level 1e ancestor. It's not until you start seeing big Con bonuses that the Wizard's melee damage is nerfed, and not significantly more than M-U melee damage was at higher levels relative to the foes. You don't try to take on a 12 HD hydra with a dagger.

Magic missile has never been a good low level spell in the sense of being 'optimal'. In 1e AD&D it did 1d4+1 damage, usually not enough to kill a hobgoblin or even an orc. Compare 'Sleep', which would and did frequently take down whole groups of orcs and goblins. I don't ever recall Spiritual Hammer being anything but a way for the cleric to be less bored in between uses as a hit point battery.

And the lowly shield has always had poor utility compared to a halberd, glaive-guisarme, or two-handed sword (especially factoring in the weapon vs. AC tables). D&D has always modeled the shield poorly. The only reason, then and now, to wield a shield was and is magic. A starting shield was +1 to AC and reduced expected damage by about 5%. You were always better off with a two-handed weapon. But a +5 shield was +6 to AC, reducing expected damage by a huge percentage and, importantly, getting your AC into the region where almost everything, regardless of HD, needed a 20 to hit you. It's also worth noting that in 1e, a shield did under certain conditions give you a bonus to 'reflex saves' (like vs. breath weapons). 3e's 'large shield', with its +2 bonus, actually has a lot more utility than 1e's shields ever did outside of obscure house rules.

There are, I think, such legacy artifacts, and I'm suffering through the '1d6 is no longer a large amount of damage' problem with the current version of my house rules myself, but I think that you overstate the change over time here. My house rules have this problem only because I'm inflating starting hit points relative to 3e. I never encountered it in stock 3e.
 

I generally agree though I think the chance a target failed its save in the older editions was higher than an unoptimised spellcaster in the newer editions.

I think that they were about the same in the case of unoptimized spellcasters, but that such a thing in practice didn't exist.

Consider the ancient red dragon again. It had 11 hit dice and monster saving throws increased +1 / 2 hit dice IIRC. So it had a total +5 on its save compared to say an orc.

Bad example here, because the dragon gained saving throws by hit points rather than HD IIRC. An 11 HD ancient dragon saved as if it had around 19 HD. But let's assume your numbers are roughly accurate for an 11 HD adult.

Also, just for the record, an orc saved vs. spell on a 17; 19 was for kobolds and the like. So the 11 HD ancient dragon was saving on something like an 8 (assuming I'm not forgetting any more of its special rules), and that would have likely been its worst save.

The big difference in my experience is that in 3e there was no expectation that you'd actually get better at making saving throws as you leveled (as there was in 1e); in fact, you generally expected to fail more of them as you got higher in level!
 
