D&D 4E: What AI thinks about 4th Edition

And although we now know that WotC folks in-house did have marching orders to pull in MMO players, per @mearls, it's not like D&D didn't have these de facto roles before. I remember very clearly, during the peak of 3E character optimization, that having wizards control the battlefield while leaving the damage dealing to other classes was always seen as the optimal approach to the game.
I think this is a really good point. The roles in the game largely came out of very early play. I remember being told, "Stay behind the meat shield!" ... in the 1970s. The biggest role that we didn't really see until later was the striker. Sure, certain classes could do more damage (Rangers were a good example), but I think it took until 3E (or possibly 2E Skills & Powers, I don't remember), when the Rogue got sneak attack instead of backstab, for that role to get assigned to them.
 


I think this is a really good point. The roles in the game largely came out of very early play. I remember being told, "Stay behind the meat shield!" ... in the 1970s. The biggest role that we didn't really see until later was the striker. Sure, certain classes could do more damage (Rangers were a good example), but I think it took until 3E (or possibly 2E Skills & Powers, I don't remember), when the Rogue got sneak attack instead of backstab, for that role to get assigned to them.
While that's somewhat fair, I think there's still room to argue that for combat, Rogues were about hitting enemies precisely, which is a Striker-role aspect. It isn't the only aspect, to be sure, but it's something you see in several specific Strikers (Rogues, Rangers, Avengers, to a certain extent Monks).
 

Having played EverQuest and World of Warcraft, I think that although the terminology largely borrows from soccer, as you say, @EzekielRaiden, it's striking how close the set-up is to EverQuest, where controllers (in the form of the obligatory Enchanter class, at least around the time 4E was developed) are very much a thing. But even at its peak, EverQuest was played by far fewer people than WoW, so it rarely came up in the 4E conversation that I saw (but I also was playing Castles & Crusades during that period, so I may have missed it).
It is worth noting that, while "Controller" isn't strictly its own role in WoW, it has its place--or at least it did up through when I stopped playing (late Cataclysm). That is, Mages could sheep, Hunters had a trap (sleep, maybe?), Rogues could Sap, I think Warlocks could "Banish" certain types of foes, etc. This CC wasn't strictly mandatory, but I remember running The Vortex Pinnacle and observing how significant it was to CC specific mobs in most packs.

I do think you're correct that there's a strain of old-school MMORPG design that moves in the same direction, but I would argue that that old-school MMO design specifically came from D&D, rather than the other way around. A lot of the design lessons MMOs have learned over the years...pretty much do come from needing to shed excessive adherence to old-school D&D design elements that were not super well-liked.

Now, conversely, many computer games today, whether they are single-player or multiplayer (particularly the one I play, FFXIV), are grappling with the reverse issue. They've smoothed out the experience so much that it becomes boring for anyone who isn't extremely casual, unless you do the bleeding-edge, highly difficult content. Such "midcore" players have been left in the dust, even though they actually make up a sizable portion of the playerbase.

The correct response is to find a midpoint, not to conclude "ah, so because Elden Ring sold well and people are complaining about things being too easy, obviously we must make everything a meatgrinder!" That's not productive. Instead, what is productive is finding a better midpoint for the needs of current and future audiences. Finding where the dividing line lies between "frustrating difficulty" and "rewarding challenge." Finding the opposite side's dividing line as well, between "digestible and approachable" and "nothing to learn and no value to be gained." Working to fill the space between--approachable but rewarding challenges, digestible but still somewhat demanding experiences.

Designers are beginning to realize that it's not bad to have a game that players need to learn to play. Unnecessary impediments and unproductive difficulty should be addressed! But removing everything that resists player action results in a bland, dull experience that is unfulfilling and far too easily dumped for the next all-too-easy experience.

And although we now know that WotC folks in-house did have marching orders to pull in MMO players, per @mearls, it's not like D&D didn't have these de facto roles before. I remember very clearly, during the peak of 3E character optimization, that having wizards control the battlefield while leaving the damage dealing to other classes was always seen as the optimal approach to the game.
Correct, though I must beg your pardon for taking some of what Mearls says with a grain of salt. He has...done and said things which have eroded my trust in his judgment and, more importantly, revealed his biases.

(And, I'd argue, all of these roles still apply in 5E and OSR games, although there's a lot more wiggle room since the math is less strict and not every class is so finely tuned to carry out their functions.)
Oh, most assuredly. There is no Fighter that doesn't have a fairly strong ability to stand in the front and act as a meatshield. I disagree that strictness of math is even relevant here, though, and would need to know what you mean by "so finely tuned"--being a Leader pretty much just boils down to "can heal a few times per combat, and can support allies in other ways, like with saving throws, condition removal, or granting attacks". Being a Defender pretty much just boils down to having the ability to punish those who choose to flout your marks, and being able to take a few more hits than others (and shrug off more attacks than others).

Which is one of the reasons why it's pretty infuriating the way folks talk about 4e's roles. Steering players toward these roles was straight-up something WotC did all the time in 3e, including specific advice for ways to play the various classes along these lines. You always--in both 3e and 4e--have the ability to ignore your base features if you want, and you always have the ability to branch out or grow in new directions. Some of those directions are harder than others, e.g. a Fighter who wants to do Leader things is going to have a long road ahead of them, but that's no different in 5e (indeed, arguably worse, since 5e offers so little in the way of alternate options, and the Battle Master is a piss-poor substitute for an actual Warlord with actual healing--something Mearls explicitly said we would get and then reneged upon, hence my comments above).
 

The roles aren't MMO-inspired. They come from soccer. That's why support characters are called "Leaders", a term that isn't ever used in MMOs (except maybe Neverwinter...because it's 4e-based) but which is used in soccer to refer to supportive players who run ahead of the ball to help line up shots for the player with the ball...who is known as a "Striker." Likewise, "marking" is a soccer concept too, meaning that despite the many (many, many, many, many) complaints about it being "martial mind control", it actually is an IRL thing! Further, Defenders aren't "tanks". They are good at managing enemy attention, but they can't afford to take all the attacks. Attacks need to be distributed across the group; otherwise even the Defender will die.

The skill system thing is simply wrong. "Proficiency" is a 5e thing.

The whole thing about Ritual Casting is a blatant lie. Nothing was "nerfed"--ironically, a word that actually is from MMOs!--but rather it was made widely accessible and not tied to daily resources, instead costing either money or materials. Traditional spellcasting classes (Artificer, Bard, Cleric, Druid, Psion, Wizard) all got Ritual Caster for free, so it simply is not true that any "nerfing" occurred...unless you are specifically biased against 4e.

Several things are presented in a relatively neutral light, but the above are pretty clear examples of anti-4e bias leaking into its "summary".

I will, at least, give it credit for not giving any of the wildly biased accounts of how Healing Surges work. I think it's a bit of a shame that it mentioned nothing about the most important benefits of Surges, but at least it didn't say anything outright false about them. Less an example of "bias" and more an example of ignorance--explaining only the barest minimum of detail when a little bit more would have been much more informative and useful.
Yup, good summary. Building on it a bit...

4E "Trained" for skills giving a flat +5 bonus is a pretty different concept from 5E Proficiency. Where in 5E being proficient is an on/off switch that scales with level IF it's "on", 4E instead has Trained for skills, which gives a substantial bonus but that bonus shrinks in importance as characters level up. It's still significant, but the general half level bonus everyone gets means that at high levels characters are broadly competent, with Trained characters in a skill being better, but the differentiation not being as stark. Whereas in 5E if you never acquire Proficiency with a saving throw, weapon, or skill you are generally way behind those who are (and it gets worse at higher levels), due to there being no other level-based increase.

4E does have a proficiency bonus for weapons, but it's basically a flat +2 or +3, depending on the weapon, if you are proficient with it.

A tangent of this, for me, is the (IMO) significant flaw in the 3E and 5E saving throw systems: the big disparity between strong saves and weak saves on higher-level characters. In 5E you can easily have a 14th-level character with a +0 total bonus to a particular save while another has a +10, and in 3E you could commonly see a similar +15 to +5 disparity. This is another way casters in these editions can be even more powerful, due to the ability to target weak saves and the difficulty of covering all bases (even covering most of them normally requires some min/maxing).
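To put numbers on the 5E case, here's a quick sketch with two hypothetical 14th-level characters (my own example stats; only the standard proficiency progression is assumed):

```python
# Hedged example: Dex saves for two hypothetical 14th-level 5E characters.
def save_5e(level: int, ability_mod: int, proficient: bool) -> int:
    proficiency = 2 + (level - 1) // 4   # +5 at 14th level
    return ability_mod + (proficiency if proficient else 0)

strong = save_5e(14, ability_mod=5, proficient=True)    # Dex 20, proficient -> +10
weak   = save_5e(14, ability_mod=0, proficient=False)   # Dex 10, not proficient -> +0
print(strong, weak)  # a 10-point gap against the same spell DC
```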

This is an issue 4E mitigated quite a bit, between the half level bonus to defenses and deriving the ability bonus from the better of two different ability scores for each of Fort, Ref, and Will. The latter feature gives greater flexibility in character builds and mitigates the issue of Dex and Wis (and to some extent Con) being "god stats" in 5E.
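And the 4E side of the same comparison, as a rough sketch (again my own example numbers; class, feat, and item bonuses omitted):

```python
# Hedged sketch of 4E non-AC defenses: 10 + half level + better of two ability mods.
def defense_4e(level: int, mod_a: int, mod_b: int) -> int:
    return 10 + level // 2 + max(mod_a, mod_b)

# Hypothetical 14th-level characters' Will defense (better of Wis/Cha):
dumped_wis = defense_4e(14, mod_a=-1, mod_b=2)   # Wis -1, Cha +2 -> 19
built_wis  = defense_4e(14, mod_a=5,  mod_b=0)   # Wis +5, Cha +0 -> 22
print(dumped_wis, built_wis)  # a 3-point spread instead of a 10-point one
```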

TSR editions based saves primarily on character level, secondarily on magic gear, and only to a limited extent on ability scores. Which also meant the disparity in saves was less of an issue than what we see in 3E and 5E.
 


A tangent of this, for me, is the (IMO) significant flaw in the 3E and 5E saving throw systems: the big disparity between strong saves and weak saves on higher-level characters. In 5E you can easily have a 14th-level character with a +0 total bonus to a particular save while another has a +10, and in 3E you could commonly see a similar +15 to +5 disparity. This is another way casters in these editions can be even more powerful, due to the ability to target weak saves and the difficulty of covering all bases (even covering most of them normally requires some min/maxing).
I think this is part of the knowledge that was lost after 4th Edition. In 3X, there were big differences in saves, but you could at least mitigate them somewhat. 4E addressed this problem with its defenses: you would still have stronger and weaker ones, but the differences were much tighter.

5E ... if you don't get that save to start with, you're stuck with that bonus for the campaign, all while your enemies' spell DCs keep increasing. So, unless you're going to spend a feat--an incredibly important resource--I guess you'll just fail. In my opinion (which all of this is, of course), it's one of the worst-designed systems in the edition.
 

I think this is part of the knowledge that was lost after 4th Edition. In 3X, there were big differences in saves, but you could at least mitigate them somewhat. 4E addressed this problem with its defenses: you would still have stronger and weaker ones, but the differences were much tighter.

5E ... if you don't get that save to start with, you're stuck with that bonus for the campaign, all while your enemies' spell DCs keep increasing. So, unless you're going to spend a feat--an incredibly important resource--I guess you'll just fail. In my opinion (which all of this is, of course), it's one of the worst-designed systems in the edition.
Fair point that 3E at least expected you to acquire a save-boosting item (typically a cloak of resistance) giving a flat, across-the-board bonus to all saves. So by 14th you'd probably have a +3 item anyway; the disparity would still be 10 points, but the floor would be higher. But 5E keeps the disparity while ditching the expected magic items.
 
