D&D 5E: What is it about 2nd Ed that we miss?

To give just one illustration of how inane the desire for your PC modelling tools to also be a world simulator is: because of this desire for consistency, it is hardcoded into the rules of D&D 3.X that an equally skilled lawyer and barmaid will earn equal amounts of money, because that's how the Profession skill works.
Actually, that skill does make special note of circumstances as heavily modifying the outcome.

And even if it did work as you say it does, so what? The results don't need to be realistic. This is a fantasy world with elves and dragons. It's not supposed to be realistic. The important thing is that the rules are consistent, and they must be consistent if we are to use them to model anything at all!

And if you (as DM) do not like the rules in any given area - if you think it's silly for a barmaid to make as much money as a barrister - then you can swap them out for a more detailed model. The rules in the book are a simplified model of the world that takes certain assumptions into account, but that's not the same thing as being abstract; the rules do hold under the normal circumstances that are likely to apply, and when those assumptions break down (as in your world-curvature example), that's why we have a DM there to intervene.

What would be insane would be to ascribe different realities - not just different scales of resolution, reflecting the same realities - to different characters, based not on anything within their world that could be a real distinction between them, but based on who is controlling them at the table.

If a wide variety of individuals with +20 to hit can each take between 10 and 30 arrow wounds without dropping, and even an unskilled novice with only +5 to hit can still take 3 arrows and keep going, then it would be insane to suggest that someone else with +20 to hit would die from a single arrow.

It doesn't need to be an OotS style of causal relation with actual levels, but the ability to hit and take a hit are clearly both linked to something that actually exists within the world (such as skill at arms, or actual combat experience), and whatever complex formula actually governs that stuff simply does not allow for anyone to drop from a single arrow hit.
 


What would be insane would be to ascribe different realities - not just different scales of resolution, reflecting the same realities - to different characters, based not on anything within their world that could be a real distinction between them, but based on who is controlling them at the table.
I don't think D&D has ever quite gone there. I mean, the status quo style does, sorta, but it's a legitimate style nonetheless.

"different scales of resolution, reflecting the same realities" though, works fine.


It doesn't need to be an OotS style of causal relation with actual levels, but the ability to hit and take a hit are clearly both linked to something that actually exists within the world (such as skill at arms, or actual combat experience), and whatever complex formula actually governs that stuff simply does not allow for anyone to drop from a single arrow hit.
But arrows can quite efficiently kill someone. And they do in fiction. Just not typically the hero or the important-to-the-plot villain. Unless your fiction is being written by GRR Martin instead of JRR Tolkien or MAR Barker. ;P
 

Not that it should need to present a consistent scientific-style model of 'how the world works,' when the world is both imaginary and fantastic, but, really, you're just approaching the model from the wrong angle, as if there were some objective reality being modeled. There isn't.
If it can't model an objective reality, then it marks a huge step back from every previous edition of the game. The point of an RPG is that it provides an objective model. If I wanted to tell a story, then I would play FATE or Savage Worlds or something.

So back on topic, this is something that 2E definitely handled better than 4E or 5E.

A good blacksmith doesn't have to be a high level character in 5e. Tool proficiency and a high stat would do it. Expertise if you want to be really good. What's odd about 5e is that the best blacksmiths - the best everythings - are Rogues and Bards.
That gets back to your definition of "good", though. Is +5 good enough to qualify? I mean, you can hit a DC 25 with that, which most people cannot, though you might still fail a DC 10 check (about 20% of the time). If you set the bar at never failing a DC 15 check, then you would need Expertise and a proficiency bonus of +5, in addition to a stat of 18 or higher.

(And because of the way 5E measures proficiency bonus, relative to CR rather than Hit Dice, this theoretical blacksmith would probably have around 500 HP in order to generate enough defensive CR to make up for its lack of offensive CR.)
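For anyone who wants to double-check that math, here's a quick throwaway Python sketch (the chance() helper is just mine for illustration; note that ability checks have no automatic success or failure on a natural 20 or 1):

    # Probability that d20 + bonus meets or beats a DC.
    def chance(bonus, dc):
        return sum(1 for roll in range(1, 21) if roll + bonus >= dc) / 20

    print(chance(5, 10))   # 0.80 -> a +5 check fails DC 10 about 20% of the time
    print(chance(5, 25))   # 0.05 -> with +5, only a natural 20 reaches DC 25
    print(chance(14, 15))  # 1.00 -> +14 (Expertise +10 plus a +4 stat) never fails DC 15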
 

But arrows can quite efficiently kill someone. And they do in fiction. Just not typically the hero or the important-to-the-plot villain.
In some worlds, arrows can kill someone in one hit. This is true in our own reality, as well as in Middle-Earth. It's also true of Forgotten Realms, pre-Spellplague.

In worlds that reflect the 4E ruleset, a single arrow is only likely to kill those NPCs who are afflicted with minion disease, by which any wound from any weapon is instantly fatal (as would be a fall from more than ten feet). For anyone not suffering that condition, it would take an exceptionally lucky shot to fell even a kobold, and even a greenhorn should expect to withstand two arrows without dropping.
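To put rough numbers on that (purely illustrative; the 1d10+3 arrow and the 20 HP figure are my assumptions, not stats from any book):

    import math

    # Assume an arrow deals 1d10+3 (average 8.5) and a non-minion
    # greenhorn has about 20 HP.
    avg_arrow_damage = (1 + 10) / 2 + 3
    greenhorn_hp = 20

    print(math.ceil(greenhorn_hp / avg_arrow_damage))  # 3 hits to drop, so two average arrows aren't enough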
 

I'd actually say that 5e tracks a lot closer to 4e than you give it credit for. When I look at the stat blocks, it makes me think of 4e NPC design. It is incredibly easy to take something which exists and refluff it. There is nothing about 5e NPCs that needs them to have feats, backgrounds, or skills any more than a 4e NPC does. Calculating the CR can be a challenge, though.

That's been my impression of the NPCs. I expected the veteran to be a fully statted-out fighter. But it's more like a DM took the stats of a guard, added a 2nd attack, doubled or tripled the HP, increased AC a bit, tweaked a few stats and DONE. It's a 1-minute job max, totally off the cuff.

... in reality it was probably given a bit more thought. But a 1-minute rush job would have been almost identical. When I saw that, I felt I was allowed not to "try too much" when statting enemies...
 

The Sliding Scale of Freeform-NPC-Building versus Structured-NPC-Building
Freeform <-> 4e <-> 1e <-> 2e <-> 5e <-> 3e <-> Structured

I'd actually say that 5e tracks a lot closer to 4e than you give it credit for. (...) There is nothing about 5e NPCs that needs them to have feats, backgrounds, or skills any more than a 4e NPC does. Calculating the CR can be a challenge, though.

You have a point. I suppose 2e and 5e could be swapped. I suppose 1e and 4e could be swapped too.

The main reasons I put 5e farther along the NPC Structure Scale are:
(a) I strongly dislike Challenge Rating in general
(b) 5e has subclasses and skills in the core books, whereas 2e didn't have kits or NonWeaponProfs until well after core
(c) Character building in general was much simpler in 2e, due to the sheer lack of options in 2e core, whereas in the 5e era players "expect" a lot from characters in general. So there is a chance that 5e people will mix char-gen rules with NPC-gen rules.

But you're right in that 5e directly advertises "Use Rulings instead of Rules" (which is the designers' way of saying Buy the Rules but Feel Free to Ignore Them).
So could be:
Freeform <-> 1e <-> 4e <-> 5e <-> 2e <-> 3e <-> Structured
 

I'm gonna try to wade through the loaded discussion of "leveling mechanics" and "suspension of disbelief" and OotS and Modelling. Hope my constitution modifier is high enough :-)

The 4e approach is to start with the world then layer the mechanics over the top.

I would go farther and say that 4e starts with the STORY. My understanding is that 4e uses a story-first approach and "spotlight" mechanics. 4e used a lot of abstractions - hit points, hit-point recovery, healing surges, generic "attacks", turn-based combat, initiative, powers that only worked "in combat", minions, solos, controllers - and of course artwork that sometimes went well beyond realism.

I'd say 4e's goal was to be story-first, and function as a storytelling tool. I really like 4e for this approach.

If it can't model an objective reality, then it marks a huge step back from every previous edition of the game. The point of an RPG is that it provides an objective model. If I wanted to tell a story, then I would play FATE or Savage Worlds or something.

So back on topic, this is something that 2E definitely handled better than 4E or 5E.

I would argue that the point of RPGs is to (1) express the author's thoughts and (2) make money :-)

But I think you're right that 2e tried to be more, and that 2e was used as a "reality model" more than 4e was. At the dawn of modern entertainment, creators went with what was acceptable and wanted to avoid "exceptions". 2e authors wanted to be compatible with existing homebrew settings and existing medieval fiction. So there was more of an emphasis on "believability" and "compatibility with existing culture" during the 2e era.

I think you are mixing up "objective modelling" with "exception-based design", though. IMO no D&D version is an acceptable reality model, what with HP and turn-based combat and such. But I agree 2e was much better about consistency across tables, simplicity of character creation, and avoiding exception-based design. After all, "Exception-Based Design" was 4e's middle name!
 

5e has subclasses and skills in the core books, whereas 2e didn't have kits or NonWeaponProfs until well after core
Actually, the 2E PHB had two options for how to handle skills - backgrounds and Non-Weapon Proficiencies.

If you had a background in something, then the DM would just let you automatically perform tasks associated with that background. Or sometimes you might be allowed a check, if the DM thought your background experience was relevant to a task at hand.

Non-Weapon Proficiencies were an alternative to backgrounds, where everyone had a number of proficiency slots based on class and level, with the ability to spend one slot on Cooking or two slots on Literacy.

Later books expanded on the NWP system such that you could, for example, spend three slots on a Banshee Wail that would instantly kill all living creatures within 30 feet (or whatever). When Skills & Powers came out, it replaced the slots with a point system.

If you want a game that doesn't have anything like background skills, you have to look earlier than 2E.
 

If a wide variety of individuals with +20 to hit can each take between 10 and 30 arrow wounds without dropping, and even an unskilled novice with only +5 to hit can still take 3 arrows and keep going, then it would be insane to suggest that someone else with +20 to hit would die from a single arrow.
Divorcing this from any analysis of RPG design, and just looking at it on its own terms, it makes no sense.

It makes no sense in terms of the history of the game. In AD&D, for instance, a high level druid may well have more hit points than a high level ranger (more d8s, and both may have no better than 16 CON), yet the ranger will have a higher to hit bonus (better table, more likely to have 17 STR). And thieves and clerics have comparable hit points per XP earned (d6 vs d8, but thieves are on a much more generous XP table and go to 10d6 at 160,000 XP rather than 9d8 at 225,000 XP). But clerics have better to hit bonuses (mostly because they start with 10 rather than 11 to hit AC 10). 0-level mercenaries have d4+3 hp, which is the same average as a fighter (d10), yet attack as 0-level humans and so have a comparative -1 to hit.
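To put rough numbers on the thief/cleric comparison (a quick Python sketch using only the figures above; the average of one dS is (S + 1) / 2, and CON bonuses are ignored):

    # Hit points earned per XP, from average hit dice.
    def avg_hp(num_dice, sides):
        return num_dice * (sides + 1) / 2

    print(avg_hp(10, 6) / 160_000)  # thief:  35.0 HP / 160,000 XP ~= 2.2e-4
    print(avg_hp(9, 8) / 225_000)   # cleric: 40.5 HP / 225,000 XP ~= 1.8e-4

Comparable, as claimed, with the thief actually coming out slightly ahead.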

And that's not even getting onto the monster to-hit tables, which are incredibly ramped up at low HD before tapering off (in comparison to fighters) at the upper end. Nor onto the fact that NPC half-orcs attack on the monster table rather than the character class tables.

And it makes no sense in terms of scientific method: just as, in special relativity, space and time are relative but the space-time interval is constant, so in the model you're now discovering there is no constant variation of hit points relative to attack bonus, but there is a generally constant relationship between hit points, damage dealt and attack bonus. (In other words, high level minions have lower hit points but better attack bonus than their lower-level analogues, and deal somewhat comparable damage; elites hold attack bonus constant but double both hit points and damage; etc.)
 

