> XP for killing enemies isn't particularly "realistic" either.

When your level primarily measures how good you are at fighting, getting better at fighting by surviving fights is a pretty realistic mechanic.
Why would you design your adventure that way if you’re using gold as XP? It’s entirely possible to distribute gold gradually throughout a campaign the same way it is to distribute monsters.
> This encourages the players to favor cunning over brawn and leads to better play. I'm given to understand that this feature is not unique to GLOG but is how it was done in older editions (I started playing with 2nd ed).

XP for gold was an option in 2e as well, IIRC, but yes, it was pretty standard before that. In 3e, ironically, you don't get XP for gold, but you can burn XP, via item creation, to save gold, and there was expected wealth-by-level, so there was still a link between gold and XP, just an indirect one...
> I really do agree that killing the monsters shouldn't be the objective. And I really think that cunning over brawn is *good*. There is no need to force battle, it will happen sooner or later anyway. Monsters as XP can really distort the game.

You can give XP for avoiding monsters, or even not give XP (or give less) for fighting them. It doesn't make tons of sense, but little about XP does.
> I'm kind of bothered by gold as the source of XP, because it too can distort the game! It encourages PCs stealing and hiding treasure from each other - if you palm that golf-ball-sized diamond and don't share it, you might have just gone up 2 levels.

Yes. That's "favoring cunning over brawn," right there.
> It can also lead to logic-defying situations. If two groups go into an identical barrow, and at the end of one there is a small copper bowl worth 5 gp, the party made XP... but if the other group - having faced the same traps and the same monsters - finds at the end a 50-pound bejeweled golden bowl worth 10,000 XP, they somehow learned 2,000 times more than the unlucky people who found the dinky bowl? A group of heroes that repels a week-long zombie siege in an abandoned tower might gain nothing, while others who rob a fat merchant might get 500 XP for a lazy heist. This isn't right.

It's right if the system is about encouraging/rewarding acquiring treasure, not fighting. You would want to escape from the zombie siege and loot the ruins of the towns the zombies had already passed through, for instance, while the zombies go eat somebody else; you could then follow the zombies on their rampage, looting as they go. Heck, zombies are slow - you could precede them, sell fake 'protection from zombie' scrolls to the locals, and case any choice loot they may have left after buying them, to pick up after the zombies eat their brains.
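To put numbers on the barrow comparison quoted above, here is a minimal sketch assuming the classic 1 gp = 1 XP conversion (the rate itself is an assumption; the comment doesn't specify one, and it varies by edition and table):

```python
# Gold-as-XP under an assumed 1 gp = 1 XP rate (illustrative, not canonical).
XP_PER_GP = 1

copper_bowl_gp = 5        # the dinky copper bowl
golden_bowl_gp = 10_000   # the 50-pound bejeweled golden bowl

xp_unlucky = copper_bowl_gp * XP_PER_GP   # 5 XP
xp_lucky = golden_bowl_gp * XP_PER_GP     # 10,000 XP

# Same traps, same monsters, yet one party "learned" 2,000 times as much.
print(xp_lucky / xp_unlucky)  # 2000.0
```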
> Lastly, it can put odd constraints on the GM, as the power and advancement of the heroes is now directly tied to monetary reward. If the GM wants to run some kind of gritty game with low monetary reward where the heroes are constantly poor... they won't level up. Conversely, if the party is going to find a huge sum for plot reasons... probably a bad idea too.

In the past, it was 'find & retain for a time,' so you could have the huge sum blow right by them - stolen by other adventurers, required to raise half the party, or whatever other money pit occurs to you. Slower-acting or growing-equity money pits could keep the party poor while letting them level up.
> Not all adventuring should be about money. And what you learn from an adventure isn't just about the reward.

Old-school, you got XP for combat as well as treasure, and it was not unusual to award combat XP for avoiding a combat (back in the day, I did half XP for avoiding a fight, then if you avoided it again, 1/4, etc... with the balance if you ever finally ganked 'em).
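A minimal sketch of that diminishing-award house rule, under the reading above (the function names and the 400 XP figure are hypothetical, not from any rulebook):

```python
# Each time the party avoids the same encounter they get a shrinking fraction
# of its XP (1/2, then 1/4, ...); killing it later pays out the balance.

def avoidance_award(base_xp: int, times_avoided_before: int) -> int:
    """XP for avoiding an encounter: 1/2 of base the first time, 1/4 the next, etc."""
    return base_xp // (2 ** (times_avoided_before + 1))

def kill_award(base_xp: int, already_awarded: int) -> int:
    """If the party finally ganks the monster, they collect whatever XP remains."""
    return max(base_xp - already_awarded, 0)

# Example: a 400 XP ogre the party sneaks past twice, then ambushes.
base = 400
first_avoid = avoidance_award(base, 0)                     # 200
second_avoid = avoidance_award(base, 1)                    # 100
final_kill = kill_award(base, first_avoid + second_avoid)  # 100
print(first_avoid, second_avoid, final_kill)               # 200 100 100
```

One nice property of this scheme is that the total award never exceeds the encounter's base XP, however the party eventually resolves it.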
So... that's why I don't like it. What I would like to learn is whether there are good "OSR/old-school appropriate" alternatives to gold as XP out there.
> When your level primarily measures how good you are at fighting, getting better at fighting by surviving fights is a pretty realistic mechanic.

Except for the part where you see no improvement at all for huge stretches of time and then suddenly become significantly better all at once. Or the part where punching enough boars can somehow make you better at playing the lute (but actually playing the lute can't).
> But that's the thing, it *forces* the GM to do this. It's adding a constraint.

No more than XP for combat forces the GM to design campaigns to have consistent opportunities for violent conflict. All rules systems constrain design, it's just a matter of what constraints are right for your purposes. Which is why, as I say, "what am I going to award XP for?" is as important a question as "what are the themes of this campaign?"
> Except for the part where you see no improvement at all for huge stretches of time and then suddenly become significantly better all at once.

That would be fairly realistic, with any skill. Most people don't improve significantly - to the point where you might notice it - on a day-to-day basis. You only really notice improvements when you consider a longer time scale.
> All game mechanics are necessarily abstract and break down past a certain point.

Past a certain point, of course. In order to have a model which is as realistic as reality, it would have to be equally complex, and thus entirely unusable. A good model is one which provides useful data relative to its complexity.
> Or the part where punching enough boars can somehow make you better at playing the lute (but actually playing the lute can't).

If you assume that a character practices the lute in proportion to how many boars they punch, then it actually is reasonable to use boar conquest as a metric for lute skill advancement.
> That would be fairly realistic, with any skill. Most people don't improve significantly - to the point where you might notice it - on a day-to-day basis. You only really notice improvements when you consider a longer time scale.

But D&D characters don't improve on "a longer timescale." They improve literally overnight and all at once.
> That the game mechanics only reflect significant improvements is a testament to efficiency in design. Modeling insignificant improvements would be a waste of complexity.

I'm not saying it isn't smart design, I'm saying it isn't realistic. And that's ok! My entire point is that game mechanics don't need to be realistic to be good.
> Past a certain point, of course. In order to have a model which is as realistic as reality, it would have to be equally complex, and thus entirely unusable. A good model is one which provides useful data relative to its complexity.

Yes, and what is "useful data" depends on your purposes. XP for gold can be a useful model, even a more useful one than XP for enemies defeated, depending on what sort of gameplay you want to foster.
> If you assume that a character practices the lute in proportion to how many boars they punch, then it actually is reasonable to use boar conquest as a metric for lute skill advancement.

Why would you assume that when that isn't happening in the game, though? Under XP for enemies defeated, a character can easily end up leveling up during the course of a single dungeon delve, before they've had a chance for downtime in which to practice non-combat skills.
> In case you've forgotten, the idea that you're using all of your skills on a regular basis is part of the fundamental basis for level-based skill progression. If that assumption doesn't hold, then it's incorrect of you to try and apply that model.

You can talk about what ideas are an assumed part of the progression till you're blue in the face, but if it isn't reflected in the gameplay, it doesn't matter. I could come up with a set of "assumptions" that handwave away the "unrealistic" parts of any system, it doesn't change the fact that it's unrealistic. Again, the question comes down to what unrealistic abstractions you're willing to suspend disbelief for.
> But D&D characters don't improve on "a longer timescale." They improve literally overnight and all at once.

No, you're improving constantly over the period between gaining levels. Gaining a level simply marks the breakpoint where your improvement is significant, relative to the previous breakpoint.
> You can talk about what ideas are an assumed part of the progression till you're blue in the face, but if it isn't reflected in the gameplay, it doesn't matter. I could come up with a set of "assumptions" that handwave away the "unrealistic" parts of any system, it doesn't change the fact that it's unrealistic. Again, the question comes down to what unrealistic abstractions you're willing to suspend disbelief for.

That sounds like you're playing it wrong, then. The designers assume that you're actually using the skills along the way, and the advancement is meant to reflect that usage. If you aren't using those skills, then... I dunno, your DM will figure it out.