Kinda yes to both, but it still produces some wonderful play. (Being married to the one player helps too!)
Oh, no question. I'm certain you two have had a blast. But the main point was just that the baseline assumption of D&D play is collaboration and cooperation--true "solitaire" play, or one-DM/one-player play, requires a bit of elbow grease. Not impossible, by any means, but it takes a meaningful effort to make it work more or less equivalently to regular play.
Adventuring NPCs (nowadays called DMPCs as, I think, a largely derogatory term) are a near-constant in our games/parties anyway, so nothing new there. Henches and hirelings are certainly another, but for some reason have never been as common as one might expect (I suspect because people don't like having to pay them!). I try to avoid giving resources etc. that a normal party wouldn't receive; for example, if I'm running a single player through a canned module I don't adjust the treasure - or the opponents! - because of that.
Interesting--though if you've been using both "player plays multiple PCs" and (to use your less-pejorative term) "Adventuring NPCs" (ANPCs), then I think you're really pretty close to playing the game as "intended." E.g. if there's a beefy Fighter ANPC, and the player is playing both a cunning sneak-Thief and a well-prepared Magic-User, and they occasionally hire the services of a local Cleric for delves they expect to be particularly dangerous, then you've basically just run the game as-is, just with the DM playing one character, the player playing two, and the two of them more or less splitting responsibility for the third. (That is, the player giving orders and probably handling much of the ordinary interaction for the hireling, but the DM presumably keeping tabs on whether the player has gone too far, upset them, or the like.) Hence: recreating the basic method of play, but with both participants taking on significantly more responsibility than they normally would--a meaningful effort to "make it work."
More or less, I think we can agree that if both you and your wife had tried to play it completely unchanged--no multiple PCs, no ANPCs, no hireling coordination, etc.--there probably would have been a lot of difficulties, and the experience would have been less enjoyable for you than with the tweaks you've gone for, even if they are not as deep or complex as other tweaks could be.
Indeed; I was thinking of the sum total of a system rather than individual parts; and I believe that sum total does sit on a sliding scale somewhere between zero and overkill.
Alright. My issue, then, is that the variance between "subsystems," to use a common albeit loose bit of jargon, can be absolutely wild, which makes the average rating across the entire system not particularly helpful. Doubly so if it's solely meant as a personal "eyeballing it" metric, since (as I'll discuss more below) a lot of that can depend on someone actually having rejected significant portions of one system (due to familiarity and comfort) while evaluating the absolute and thoroughgoing entirety of another. Such evaluations will then be really unhelpfully biased, since we kind of run into a Ship of Theseus problem of how much removal is "too much" vs "completely acceptable."
Well, there's a reason why many tables dropped weapon speed and some other over-complicated bits. And some tables (like ours) added some complexity to a few bits e.g. hit points, to give them a bit more detail and imply that yes not all injuries are the same.
So, this then leads to the question: are you actually judging the whole game as being on the low end of mechanics, or only the game you play(ed)? Because, if I were a betting man, I would consider putting good money on your rating a system in its entirety if it's one you sat down to read and review with only a limited (say, 3-4 session) playtime experience, vs. rating a system you've run for however many years.
You're close-ish, but not quite on.
The way I see it, early-era D&D put its detail-abstraction level in direct inverse to what could be done at the table through roleplay. We can't live-roleplay combat, it has to be completely abstracted and so that part got highly detailed rules. Exploration needed some abstraction and thus it got some rules, but not as detailed as combat as the players could often just describe what they do and how they do it, in a more useful way than for combat. Social interactions got very few rules as there wasn't much abstraction needed: you just live-played it out at the table.
Interesting--not a theory I'd heard before, but a solid one; I'll definitely be keeping it in mind going forward. My only issue with this (at least thus far, since I'm splitting your thoughts here) is that it seems like a non sequitur. That is, for the purpose of this discussion only, I don't necessarily care why one thing got lots of mechanical detail and another got almost none. I only care that they got different levels of detail, and thus glossing over it as "it's on the low end for mechanics" seems unhelpful when several important parts, intended to be commonly used (even if people chose not to), were in fact extremely detailed and finicky.
I'm not sure where you saw the 5-6 on d6 model for resolution; that one's new on me. What I'm more used to as a fallback resolution system where nothing else applies is to roll under a relevant stat on d20. So if you're trying to run along a narrow ledge you'd roll under your Dexterity score (with the roll perhaps situationally modified e.g. a bonus if it's a wide ledge or a penalty if you're in heavy armour) to not fall off, that sort of thing; and it's IME a very elegant and simple mechanic.
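That fallback is simple enough to sketch in a few lines. Everything here (the function name, how the situational modifier is applied to the target) is my own illustration of the general "roll under the stat on a d20" idea, not anything lifted from a rulebook:

```python
import random

def roll_under_check(stat, modifier=0, rng=random):
    """Generic roll-under resolution: succeed if d20 <= stat + modifier.

    `modifier` is a situational adjustment: positive for easier
    tasks (a wide ledge), negative for harder ones (heavy armour).
    Returns (success, roll) so the table can see the die result.
    """
    roll = rng.randint(1, 20)
    return roll <= stat + modifier, roll

# e.g. running along a narrow ledge with Dexterity 14, in heavy armour (-2):
success, roll = roll_under_check(stat=14, modifier=-2)
```

One nice property of roll-under is that the stat itself is the success chance in twentieths, so no lookup table is needed.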
IIRC it was a discussion of the very early days of D&D, when few of the messy/idiosyncratic subsystems people came to know (and sometimes love) existed yet. I do know that "roll-under" checks were quite common a bit later, once more of the aforementioned subsystems came online.
The thing with "play to find out what happens" is that, in the context it usually appears, it's supposed to apply to the GM as well as the players; whereas in my view, while it's a great principle for the players, it's not for the GM, who should already know what could* happen and ideally be thinking a few steps ahead of any likely outcome in order to best deal with it if it arises.
* - note 'could', not 'will'.
I mean, realistically, the GM in Dungeon World (or any PbtA game) does know a good deal about what could happen. That's what prep is for; indeed (not to go too deep into DW stuff), that's explicitly how Fronts are constructed. TL;DR, avoiding DW jargon: for both campaigns as a whole and for sessions (usually 1-3 at a time), you're supposed to come up with dangerous/threatening things, a Really Bad Thing that will happen if a given danger/threat isn't prevented/stopped/forestalled/etc., and various alerting events (signs, symbols, calamities, etc.) that will happen before the Really Bad Thing unless it's prevented. (In jargon terms, every Front has 2-3 Dangers, each Danger has an Impending Doom, and every Impending Doom has 1-3 Grim Portents if it's an adventure Front, or 3-5 Grim Portents if it's a campaign Front. Examples are given for what Dangers, Impending Dooms, and Grim Portents can be like, but it's best to come up with your own.)
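That prep structure is concrete enough to sketch as plain data. The field and class names below are my own shorthand for the DW terms, and the counts (2-3 Dangers per Front; 1-3 or 3-5 Grim Portents per Danger) are the ones described above:

```python
from dataclasses import dataclass, field

@dataclass
class Danger:
    name: str
    impending_doom: str   # the Really Bad Thing if this goes unchecked
    grim_portents: list   # warning events that fire before the Doom

@dataclass
class Front:
    name: str
    kind: str             # "adventure" or "campaign"
    dangers: list = field(default_factory=list)

    def is_well_formed(self):
        """Check the suggested counts: 2-3 Dangers per Front, with
        1-3 Grim Portents each for an adventure Front and 3-5 each
        for a campaign Front."""
        lo, hi = (1, 3) if self.kind == "adventure" else (3, 5)
        return (2 <= len(self.dangers) <= 3
                and all(lo <= len(d.grim_portents) <= hi
                        for d in self.dangers))
```

Nothing here is mechanics, just bookkeeping; the point is that "play to find out" still leaves the GM holding a small, well-defined pile of known threats.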
This way, you reap the best of both worlds: as play advances, new Fronts may emerge (again, the analogy of "fighting on multiple fronts" really does apply extremely well here), whether because of player choices or simply because it makes sense for them to become real threats; and yet, at the same time, there's still a sense of the world existing, as there are real threats that the GM knows will be a problem...unless those meddling heroes get their noses all up into it. Improvisation and recognition of the players pushing the story forward is central, but there are still known things. After all, you can't "draw maps, leave blanks" if there isn't enough world to draw maps of, y'know? It just means that there's the known stuff, and yet also ʜɪᴄ ꜱᴜɴᴛ ᴅʀᴀᴄᴏɴᴇꜱ. Likewise, you can't "think offscreen too" if nothing ever happens unless it's on the proverbial screen.
I've never watched B5 and am not very interested. But Battlestar Galactica was similar; and while it was engaging to watch as a show I can't imagine porting it over to an RPG, because once you know the ending, what's the point?
Well, that's sort of the thing. You don't! You (as GM) know what would happen if the PCs just up and disappeared one day--and that it would (generally) be pretty bad.*
OK, let's take this one step further. We've got the beans class and the rice class. Each has a clear benefit and a clear weakness. Each of them would like to become the mythical "meat" class that can do everything, but that would bugger up game balance along with destroying niches; and the closer both get to becoming meat the less distinct they are, to the eventual point of their becoming both more powerful overall and indistinguishable from each other.
Now, let's say we want to introduce a "potatoes" class that's as good as rice but has some added benefit (I don't know what - my knowledge of nutrition would fit into a teaspoon, with about a teaspoon's volume of space left over). Leaving it as-is will unbalance things, as potatoes become a clearly superior option to rice and maybe to beans as well. So, we can either find some way of powering up rice and beans to compensate, or - and this is my preferred option - we can find a way to tone down potatoes to bring them in line with what already exists.
Sure. This is why class design is challenging. Generally, you avoid this by not doing what beans, potatoes, and rice do--namely, each of them covers lots of things and is only really deficient in one or two areas. (Ironically, in this sense, potatoes are actually mostly worse than other staple products, being weak but not quite deficient in three different amino acids, as opposed to genuinely deficient in just one, so you still have to pair them with something else--sometimes two things!)
Instead, with class design, you make it so each class only brings (by default) a small set. If there are ten things that every adventuring party should really be doing, then no class should bring more than (say) three of them. Even if a character expends resources in order to get better at more things, they should never cover more than about half--and the opportunity cost of their choice is that they don't become absolute masters of whatever they started out being good at. For this purely "positive" design, opportunity cost is the primary drive.
Spells specifically, no; but (ideally*) you've already paid for them by becoming a member of a class who can cast them and in so doing have lost out on a bunch of other things you could have done instead e.g. fight worth a damn.
* - ideally, though in practice this payment is becoming smaller all the time.
Alright, but this would seem to concede the core point: there are broad swathes of game design, at least as far as D&D-like games are concerned, where it would be genuinely impossible to design exclusively by the rule "absolutely every single beneficial thing MUST be matched with a detrimental thing." Instead, you can have differential access to abilities, such that class A starts off with abilities 1 and 2, class B starts off with 3 and 4, class C with 5 and 6, etc. Then other classes can mix things up, bringing 1 and 6, or 2 and 4, or whatever. And as characters grow, they can become 1++++ and 2, or 1++ and 2++, or 1++ and 2+ and pick up 3, etc. Again, these things are purely "positive" in the technical sense that they only add more things, but because you're limited in how much you can add, you can't get everything.
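One way to read this "positive-only, but budgeted" idea mechanically: a class is just a set of abilities plus ranks in each, and the only constraint is a total budget. A toy sketch, with all names and numbers my own rather than from any published game:

```python
# Toy model of "positive-only" class design: characters only ever
# gain abilities or ranks, but a fixed budget caps total coverage.
ALL_ABILITIES = set(range(1, 11))   # the "ten things a party should cover"
BUDGET = 6                          # total ranks any one character may hold

def is_legal(build):
    """A build maps ability -> rank, e.g. {1: 4, 2: 1} is '1++++ and 2'."""
    return sum(build.values()) <= BUDGET

def coverage(build):
    """Fraction of the ten party jobs this character touches at all."""
    return len(build) / len(ALL_ABILITIES)

specialist = {1: 4, 2: 2}               # deep but narrow
generalist = {1: 2, 2: 2, 3: 1, 4: 1}   # broader, shallower
```

Under any budget like this there is no "potato build": every rank spent broadening is a rank not spent deepening, and covering all ten jobs at even rank 1 already busts the budget.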
To use 4e as an analogy, both Paladins and Fighters are Defender classes: this means they start off with features that pursue the same end (punishing enemies for attacking the party, and drawing enemy fire/position to places advantageous to the party). However, the Paladin starts off with a minor in Leader (Lay on Hands, a Channel Divinity to help cleanse an ally), while the Fighter starts off with a minor in Striker (mastery of weapons, and powerful weapon attacks to punish enemies). Either one can easily choose to specialize in Defender things, through feats, powers, PPs, EDs, and items, making it so their enemies constantly face a lose/lose proposition in battle. Either can choose to pick up Leader-like features, and for the Paladin, that can effectively make them almost as good as actually being a Leader--but that's a double opportunity cost, as they haven't improved their defensive abilities, and (generally) can't do both Defender-y things and Leader-y things simultaneously. Meanwhile, the Fighter can absolutely go full Striker, becoming an absolute terror to her enemies--risk her mark punishment and she'll end you, but don't risk it and you've just made her job so much easier, as long as she can take the hits.
But neither of these characters can do everything. Indeed, neither of them ever gets much more than about halfway there. The "I'm almost as much Leader as Defender" Paladin will definitely fall if forced to stand alone, because for all that healing, he can't dish out the punishment, and healing just delays the inevitable. The mega-damage Fighter may be a force of nature when dealing with an enemy squad, but what can she do against a veritable horde of little pissers? She can only cleave so much flesh on any given turn, and they are so very ready with their own weapons.
One could extend a classic phrase: it is not just that "no man is an island"; no man is a continent, either. Even if you manage to claim another nearby island, it's never going to be enough on its own. You need your allies. There can never be a "potato class" as you've described, because you don't make classes that do "everything bean-class does, and also three other things."
*In my home game, a black dragon would take over the city; an assassin-cult would finish its civil war with the "yes we really do just want to murder when we like" faction winning and making a major decapitating strike against the main religious authorities that have spurned them; a cult, semi-unknowingly worshiping an elder orb beholder, would do horrendous damage to the planet in order to allow said elder orb to escape to the cosmos beyond; and a heretical sect of death-worshipping druids would begin transforming the region into a fetid swamp where their mushroom-hive-mind could live forever and become incredibly powerful.
Name one rule you can make that I can't break and use to be a bad DM for some group out there?
Point to the post where I said rules were perfect and I will be happy to do exactly as you have requested.