Wasn't automatically gaining spells on leveling up an optional rule in the AD&D DMG for specialist wizards? I could be wrong though, it's literally been a decade and a half(?).
I'm actually starting to think gaining a new spell when you gained a new spell level went back to 1e, though maybe it was optional. In 2e, it was a spell every level, and I'm pretty sure that's in the PH.
Yeah, but the GM explicitly has fairly total control over what you can buy, especially as ye olde magical shoppe was not part of the setting, so you don't have that same menu effect as in 3.5. There is no expectation that he allows players to buy stuff or pick anything, so spell access is 100% up to the DM.
Yep, in 3.5 the pendulum had swung all the way over to player entitlement with wealth/level and make/buy expected. All PCs benefited - casters could spend feats to essentially trade exp for gp when making items, but, if anything, non-casters benefited more from being able to custom-choose their items, since it made expected-for-the-level supernatural abilities and power combos readily available.
These two are interesting, and worth further discussion. I think that the relaxing of spell-casting restrictions has helped those classes a lot.
The pendulum metaphor can be applied along different dimensions. Player entitlement and DM empowerment are rather broad ones. Narrow it to restrictions/limitations on PC spellcasting and you do get a very different picture. Spellcasting has gotten easier, safer and less restrictive with every edition of D&D. No pendulum, it's all downhill, a well-greased slippery slope. In 4e, casting a ranged spell was exactly as easy/safe/unrestricted as using a bow. I honestly thought we were looking back up at that slope from the middle of the valley, that it couldn't possibly have gotten any easier on casters. I was wrong. In 5e, casting a ranged spell right in someone's face is perfectly OK - there's no AoO, no advantage on their save, nothing. Using a bow in melee, OTOH, is at disadvantage. It's actually easier to cast the spell than to use the weapon. I can't imagine how 6e could top that, but I suspect it just might try.
What's been less consistent is the degree to which spell power was reined in to match the increasing facility of spell use. 2e actually did pull in some spells a bit, for instance, while also making them a little easier. 3e made casting much easier via maxing the Concentration skill, reined in some spells, but also made spells more potent overall by slowing the improvement of saves and scaling the DCs with spell level - casting became both much easier and, net, more powerful. In 4e, spells/day and spell power were radically reduced, bringing them almost into line with non-caster abilities - casting became even easier, but much less powerful. 5e made casting yet easier, and also more available, with more spells/day and greater flexibility, and made spells individually more powerful; many saves don't scale at all, while DCs scale with the caster's character level (not even 'caster level').
So you can see how there's both a continuous trend and a pendulum at work.
Bottom line, I think we can see overall that the pendulum empowers players of spellcasters much more than players of non-spellcasting classes.
I'm not saying the pendulum has not swung, just that players of spell casting classes are significantly more empowered than players of non spellcasting classes in 5E.
Sure.
Yes, viable is different but more or less equally subjective. If the difference is important to you, then I will state it differently. Making sub-optimal choices was quite possible in 4e, and it could lead to combinations that were much less viable than in 5e.
There were optimal and sub-optimal choices in every edition. In 3.5 the gap between them was enormous: even less-optimally power-gamed PCs could be rendered non-viable alongside more skillfully optimized ones, there were builds so broken that even optimizers turned away from them, and an un-optimized character could be outright non-viable. In 4e it was modest - the un-optimized PC was quite viable even alongside an optimized one, and the occasionally broken builds were 'updated.'
In 5e, it's irrelevant: You can optimize your PC or not, it won't often matter compared to the decisions your DM makes. There are a few fairly explicit things you /can/ optimize for - DPR, breaking the action economy, general Tier-1-versatility toe-stomping - not to the degree of 3e, but enough so's you'd notice - but even then, the DM is Empowered to cut the legs right out from under your build. 3.5 or 4e you could optimize to 'win' vs the expected CR/EL. In 5e, in service to fast combat, that expectation is set so low you don't need to, but the DM is left with so much latitude, it doesn't matter if you do.
Not a misconception actually
It absolutely is. It's based on the claim that the rules cause the exact same task to have a DC that scales with the level of the character performing the task. That is false.
Period.
As I am sure you are aware, 4e was based around "level appropriate" challenges, specifically encounters.
4e, like 3e & 5e, had encounter guidelines. They worked rather more predictably, but they were still guidelines, not rules. No matter how slavishly you chose to follow them, they did not make the DC of the exact same task scale with the level of the character performing it; they simply recommended more difficult tasks to provide the same level of challenge to a higher level character.
Because the character had, in fact, advanced.
I have no idea where the 8 comes from.
Theoretical gap between max'd and neglected.
Then your comparison of expecting a ranger not to cast spells - really? You can't see the difference between a situation wherein reading the class description makes it absolutely clear (a whole subsection on spellcasting) that rangers use magic, yet somehow "expecting them not to," and one where reading the class description tells you that you are proficient with a weapon, and thinking using one might be a "viable" option?
I think the comparison is perfect, yes. In the case of the ranger, you assume the player reads the whole class description, to note that it uses magic, rather than getting as far as weapon proficiencies and jumping to the conclusion that a weapon-user must not use magic at all. In the case of the fighter, you assume that the player reads weapon proficiencies, and jumps to the conclusion that the character will be equally capable with all weapons, rather than finishing his read of the class and seeing what its features actually make it good at.
On a non-absurd note, non-casters have not been erased, people still want to play them, and I have no idea what "deserving choices" has to do with anything I said.
You denied that there was a non-caster class distinction in 5e. Now, clearly, there are casters in 5e. The only way to take what you said is that there are no non-casters. Thank you for coming around to the fact that there are. They are technically sub-classes, and they all specialize heavily in DPR and are notably lacking in player 'empowerment' (less 'entitled') compared to casters, but they clearly still exist.
The reality is that the martial/spellcaster dichotomy does not exist
You just insisted that it still does. There are 5 non-magic-using sub-classes in the PH. Two of them are indisputably martial. Two more virtually so. The berserker, perhaps, is debatable. That 5 is less than 30+ is hardly at issue. That's fewer choices, right there.
I don't think that second sentence is true.
The ability to make a "bad" character is a result of the system requiring "building", where different elements of a PC's build interact (eg stats, skills, class abilities, spells, etc) to produce the character's overall mechanical effectiveness, which is itself applied in relatively complex action resolution systems (eg init, to hit vs AC, damage, action economy, etc).
A system that lacks that sort of intricacy - say, where all you do is stick numbers against descriptors - won't let you build a bad character.
It's possible to anti-power-game, or counter-optimize - intentionally create an execrable character - in any system where it's possible to optimize. And you can optimize in any system where you have any choice at all. The only questions are how much choice there is, how well-balanced those choices are, and where the base-line un-optimized/not-counter-optimized ordinary character is going to fall.
Classic D&D, especially before 2e, had relatively few player choices, more or fewer depending on how the DM chose to run things, as well. In some cases, you might roll your stats randomly & in-order, so class & race were your only choices (in some variants, you even rolled race), in others you might have a 'random' generation system with a lot of latitude in placing and adjusting the results. Aside from that, class was the main choice you made. If you were a caster you chose which spells to memorize, and everyone chose gear. That was about it. Characters died or became effective or 'broken' based on the luck of the dice, skill of the player, and whim of the DM. Mostly the last. Players may have been un-entitled/dis-empowered, but DMs were empowered six ways to Sunday. The way games were described back in the day - Monty Haul, Killer - were entirely based on DM proclivities. (It seemed to me at the time that both running and playing were a lot of fun, but part of that may have been the thrill of discovery. After the first 5 years or so, I settled into mostly running. I think maybe there was just something unique/special about that period of the hobby, the 70s, which I missed, and 80s, that hasn't been possible since.)
In 3.x/PF, you have a tremendous number of choices, more than any other edition, by no small margin. Many of them are traps, and a (relatively) small (still pretty substantial) subset of the millions of possible combinations of those choices are sufficiently powerful to render the rest non-viable by comparison. It's very player-entitled (empowering), and system-mastery-rewarding, and a PitA to DM.
4e didn't stick around long enough to accumulate as many choices, but it still had tons; they were more neatly balanced and tightly structured, there weren't so many possible combinations (probably by orders of magnitude), and the base-line un-optimized PC was perfectly viable, even alongside all but the most broken (and fairly swiftly 'updated'/errata'd/'nerfed') of optimized builds. There weren't nearly so many 'traps' but still a lot of 'chaff,' and a few annoying 'must haves,' especially in feats. For the DM, the saving grace in the face of all that player entitlement was that it was at least easy to run, even if to the point of being easy to fall into a formulaic 'rut.'
5e has far fewer choices, not since early 2e have players been so limited (disempowered/un-entitled), and choice of concept restricts choice of (sub)class, which grant differing levels of entitlement (empowerment/agency/whatever), and balance isn't even really a thing at the level of player-mediated choices. OTOH, it's DM-empowering as all get-out. For a DM with the skills/talents to take advantage of that empowerment, it's fun & easy to run (and not prone to formulaic ruts, since formulae will fail you), even if you need to 'grow' into it, I'm sure it can be a very positive experience. (And, yes, 5e is the closest post-1e D&D has come to re-capturing that 'something' that made it so remarkable in the 80s.)
In D&D, suppose that all you had to do in building your character was stick a number (say, from an array) against each ability name, and pick either one weapon category or one school of magic. And resolution was all opposed checks, where you rolled the appropriate stat, and you could get advantage on the roll if, in the fiction, your weapon or your school of magic seemed like it would help you out.
In that system, you couldn't accidentally make a bad character. It would be completely transparent where you were placing your strengths and weaknesses, and you would get exactly what you built for.
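To make that hypothetical concrete, here's a minimal Python sketch of the system described above. Everything here is illustrative, not from any published rules: the ability names, the fixed array of numbers, and the d20-style opposed roll with advantage are all placeholder assumptions.

```python
import random

# Illustrative assumptions: a fixed array of numbers, six ability names,
# and one descriptor (a weapon category or a school of magic).
ARRAY = [16, 14, 13, 12, 10, 8]
ABILITIES = ["Str", "Dex", "Con", "Int", "Wis", "Cha"]

def make_character(ordering, descriptor):
    """Building a character is just assigning the array to abilities
    in some order and picking one descriptor - no hidden interactions."""
    return {"stats": dict(zip(ordering, ARRAY)), "descriptor": descriptor}

def check(character, ability, descriptor_applies=False, rng=random):
    """Roll d20 + stat; advantage (roll twice, keep the higher die)
    if, in the fiction, the descriptor would plausibly help."""
    roll = rng.randint(1, 20)
    if descriptor_applies:
        roll = max(roll, rng.randint(1, 20))
    return roll + character["stats"][ability]

def opposed(attacker, defender, ability, adv=False, rng=random):
    """All resolution is opposed checks; higher total wins, ties go
    to the defender."""
    return check(attacker, ability, adv, rng) > check(defender, ability, False, rng)
```

Because the only build decisions are where the fixed numbers go and which single descriptor you take, every strength and weakness is visible at a glance - there's no combination of elements that can quietly undermine the character, which is the point being made above.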
I guess if the checks you used for critical things always mapped to your better stats, it could work out that way, yes.
This is really no comparison to how nerfed a fighter with a bow would be in 4e, nor in the "trap" of "playstyle". A 5e fighter can play either way in both comparisons and still be "viable", 4e not so much.
The problem with that comparison is that the 5e fighter is about it if you have a 'martial' concept in mind. In 4e, if you had a decidedly 'martial' concept (one that didn't immediately scream 'rogue'), you had more classes to choose from. You didn't need to automatically go Fighter, then apply extreme (3.5) or minimal (5e) system mastery to customize the fighter into a viable take on your concept. You'd choose from Fighter, Warlord, Ranger or Rogue. If you wanted to play an archer, well, the Ranger alone made an Archery build a pretty obvious choice.
Mind you, I love the way 5e finally handled STR vs DEX weapons/builds in a way that wasn't unduly restrictive and superfluous-class-spawning. A 5e fighter can be STR-based or DEX-based with minimal application of system mastery. Not /none/, and not the kind of contra-optimization that this discussion has edged into, but it can quite smoothly be done, without having to dig up some special option or other class that gets a big AC bonus unarmored or a huge damage bonus with a rapier or anything /weird/.