EzekielRaiden
Follower of the Way
But that isn't the only way to create party interdependence. 4e was--by far--the best edition yet made for actually inducing party interdependence, and it had many things you reject, while avoiding many things you claim are necessary.

Two words. Party interdependence.
Assuming the default style of play to be a group of characters going out on adventures (an assumption that doesn't seem too controversial) then doesn't it make sense to design in such a way as to encourage and support that style?
If each character is good at a quarter of the things they can do, then put four of 'em together in the right combination and between them they're good at everything.
That, and I'm quite fine with parties bigger than four even if there's only four players.
So, again, we're left with something that doesn't actually seem to make a difference: a past edition of D&D successfully achieved the goal you seek without applying the harsh restrictions you're claiming are required.
I mean, I think there are three components here.

"Most of the people most of the time" still isn't "everyone all the time", which seems to be what some want.
1: Everyone should actually be getting to participate in some kind of meaningful way at...pretty much all occasions. "Meaningful" doesn't mean "powerful", but it does mean that having you there is contributing, as opposed to being an irrelevant lump or an active hindrance. (Note: active hindrance. It's fine if attempted things fail and thus create hindrances; it's not fine if your primary "contribution" is to make things worse in most cases.)
And IME, when you force players into ineptitude in most cases, they just get very frustrated. Yes, in a minority of cases, they'll genuinely find something clever and creative to help contribute, but in the majority, either their creativity is simply not up to such a daunting task, or no amount of creativity could ever be up to such a task. Either one makes the player feel like, at best, a mere booster, and at worst useless or even an active hindrance to their friends.

That said, IME even when a character is mechanically inept in a situation (e.g. an Illusionist in a party facing a bunch of illusion-immune undead) its player still finds ways to participate in-character and have the character do things.
This is a leisure-time activity. Folks shouldn't be sitting around waiting for the few moments where they get to contribute, nor should it be Olympic hurdle-jumping for them to even find ways to contribute. But, again, meaningful contribution is not the same as power--it just means you are, in some observable way, able to really help, up to the limit of randomness and player skill.
It's still dilution, something you have otherwise been utterly adamant about except in this one case...with the one class that is the most susceptible to usurping others' roles. Do you not see how your absolute "NO healers except Cleric" stance clashes with your "Oh, Wizards can be artillery too, that's fine" stance?

Terminology issue, my bad. When I say combat I usually default to melee; and a wizard who wants to start dropping artillery into melee combats won't be around for long once the survivors among his allies get to him.
Further, artillery can also be useful against structures. Foes hiding in a wooden building? Fireball it, and see how long they stay in there.
The question is not whether it can be stronger, let alone perfect. The question is why this level of hyper-restriction is required in order to achieve your stated goal of party interdependence. Again, 4e had extremely high party interdependence, despite characters being hardier than 3e or 5e at low levels.* Yet 4e included Swordmage, a class with magical and combat abilities. It included Shaman, a class that could heal like Clerics can. If this is so, what reason forces us to choose such incredibly strident hyper-limited classes--where characters will be frequently left with little to nothing to contribute beyond some well-spoken roleplay--in order to achieve the stated goal of party interdependence? We have examples that seem to show otherwise, so what's going on?

It's also a question of defining each class' niche. If for example divination spells could be entirely removed from arcane casters and given to divine casters, they then become part of the divine casters' niche. If charm-dominate effects could be removed from generic wizards and given solely to Illusionists, they become part of the Illusionists' niche. Lather, rinse, repeat until generic wizards are left with artillery and spot damage, buffs (e.g. Fly, Invisibility, etc.), and some oddball stuff like Identify at one end and Wish at the other. And artillery becomes their niche; as such, Clerics lose spells like Flamestrike, Call Lightning, etc.
Even with that, niche protection will never be perfect. But here, perfect is the enemy of good enough, and it can still be made a lot stronger than it is right now.
*Believe it or not, because 5e adds the Con modifier to hit points gained at every level, 5e characters can easily overtake their 4e counterparts. I have done the math but it's long-winded, so TL;DR: at merely decent Con (+2 to +3), 4e characters get overtaken by their 5e equivalents relatively quickly, and only pull ahead again because 5e stops at level 20 while 4e runs to 30. At 5e's maximum Con (+5 modifier), 5e characters eventually always exceed their 4e counterparts--sometimes substantially, like by 15% or more. But this is off-topic.
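For the curious, a rough sketch of that math. This assumes a Fighter on both sides and static hit point gains (4e: 15 + Con *score* at level 1, a flat +6 HP per level; 5e: d10 hit die, so 10 + Con *modifier* at level 1 and 6 + Con modifier per level taking the fixed average). Other classes shift the exact crossover level but not the overall shape.

```python
# Rough HP comparison, 4e Fighter vs. 5e Fighter, assuming static per-level gains:
#   4e: 15 + Con SCORE at level 1, then +6 HP per level (no Con mod per level)
#   5e: 10 + Con MOD at level 1, then 6 (fixed average of d10) + Con mod per level

def hp_4e(level: int, con_score: int) -> int:
    return 15 + con_score + 6 * (level - 1)

def hp_5e(level: int, con_mod: int) -> int:
    return 10 + con_mod + (6 + con_mod) * (level - 1)

def crossover_level(con_score: int) -> int:
    """First level at which the 5e Fighter's HP exceeds the 4e Fighter's."""
    con_mod = (con_score - 10) // 2
    level = 1
    while hp_5e(level, con_mod) <= hp_4e(level, con_score):
        level += 1
    return level

print(crossover_level(16))  # Con 16 (+3): 5e pulls ahead at level 8
print(crossover_level(20))  # Con 20 (+5): 5e pulls ahead at level 6
```

The intuition: 4e front-loads the whole Con score into level 1 HP, but 5e gains the Con modifier every level, so a +3 or better modifier closes the starting gap within the first tier of play.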