
Why is it so important?

Raven Crowking said:
(1) Some resources were intrinsically hazardous to use. This includes spells that age you, System Shock, and the way potions mixed if you attempted to use two at once.

(2) Wandering monsters were intended to create a time constraint. If you sat around camping, or spent too much time searching an area, you ran a risk of encountering something else that might sap (or overwhelm!) your resources.

(3) Limitations to what one can do within a round. You can attack or cast a healing spell, for example.

Well, we know that 3.X gutted (1) from the game, with very few exceptions. Those sorts of cost/risk assessments were apparently "unfun". We know that the WotC site has run an adventure design article, widely discussed on this forum at one time, about cutting (2) from games because, again, they are "unfun". We also know that 4e is designed to ensure that you can attack while, say, healing your companions, because the types of decisions required by (3) are "unfun".
I sincerely doubt that 3 is being completely removed from the game. Rather, from what I've read, certain healing abilities are integrated with certain classes in such a way that the character's first role is no longer one of support/the walking band-aid.

It is the costs/risks associated with any given choice that make moderation a worthwhile option. Most players are smart enough to recognize this when making tactical decisions, even if, like the designers, they are not cognizant of why the game is becoming less fun, and fall under the mistaken belief that going further down the road of "no/reduced costs/risks" will somehow alleviate the problems that walking down that road has caused.


RC
Ah, but that's not what seems to be happening. I honestly don't see the designers reducing costs/risks so greatly, but instead refocusing them so the costs/risks are mostly decided upon in light of making the only encounter the PCs are guaranteed to have (the one they're currently in) as interesting and fun as possible. I've said it before, but the way things are being described, I think the designers might purposefully design per-encounter abilities which, if used with abandon, will leave players with reduced options by the time an encounter is over.

Take the barbarian, for example. My thought would be the barbarian gaining some sort of "mini-rage" which won't last for the whole encounter, but will be usable in any encounter, so the barbarian will have to time when to use that rage very carefully. In the current per-day system, the barbarian's decision to rage is a binary "yes/no" for a given encounter. If an encounter is dangerous enough to merit raging, then he should rage on the first possible round, since the rage will last the whole combat. In essence, if you run fewer combats than the barbarian has rages (or at least tend to), there's no decision on the barbarian's part: rage, rinse, repeat.

Spellcasters are similar. As it stands now, the only reason for a spellcaster not to use his biggest spells first is the threat of a possible later encounter that will require them. And smart nova casters know how to get around this with spells like rope trick, teleport, plane shift, etc. But that's a digression. Spellcasters are forced to focus not on the encounter at hand, but rather on an encounter that may never happen, and the system rewards them for it. It fails to account for the fact that there is only one encounter that is guaranteed, and all others are only maybes that could become broken promises to the spellcaster who budgeted his magic wisely for an encounter that never comes. Players take a risk for essentially no reward, and they have very little control over whether or not they'll get that reward.

In fact, early encounters tend to be less tactical precisely because the spellcaster is at full power. They can blast with impunity, crossing off only one of three fireballs from their spell list while still maintaining a majority of their power. It's only later in the day that managing those remaining resources even comes up, if ever.
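
To put some purely made-up numbers on that "risk for no reward" point, here's a quick Python sketch (the values are invented for illustration and aren't from any edition's rules):

[code]
# A rough expected-value sketch of the "risk for no reward" point above.
# All numbers are made up for illustration; nothing here is from the rules.

def ev_cast_now(value_now: float) -> float:
    # Casting in the current encounter always pays off: that fight is happening.
    return value_now

def ev_hold(p_later: float, value_later: float) -> float:
    # Holding the spell only pays off if a later encounter actually occurs.
    return p_later * value_later

value = 10.0  # assume the spell is worth about the same in either fight
for p in (0.25, 0.50, 0.75, 1.00):
    print(f"P(later fight) = {p:.2f}: cast now = {ev_cast_now(value):.1f}, "
          f"hold = {ev_hold(p, value):.1f}")

# Unless the later fight is certain (p = 1.0), holding back is strictly worse
# in expectation -- yet the per-day system keeps nudging you to hold anyway.
[/code]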

But, if the system is designed around the idea that this one encounter is everything, and resources are designed accordingly, then it will focus players on the tactical fun of decisions within this encounter. Rage this round, or later, when the enemy closes? Do you use your only lightning bolt for the encounter now, or wait for the enemy to try to come through the choke point? Should the cleric go mix it up alongside the fighter, knowing that the fighter will need healing soon and can only benefit from that area heal if the cleric is close enough and actually in combat? Should the rogue disappear into the shadows this round, or stick it out for another few rounds and save that trick in case the fighter needs someone to sneak around and help him whittle down the BBEG with a nice sneak attack?

There might not be another encounter after this one, so why not make the rules for this one as much fun as possible?
 


Jackelope King said:
I sincerely doubt that 3 is being completely removed from the game. Rather, from what I've read, certain healing abilities are integrated with certain classes in such a way that the character's first role is no longer one of support/the walking band-aid.

Given that we are told that a PC will be able to heal his comrades and attack at the same time, this is something where you and I are going to have to agree to disagree until there is something more substantial to examine.

As it stands now, the only reason for a spellcaster not to use his biggest spells first is the threat of a possible later encounter that will require them. And smart nova casters know how to get around this with spells like rope trick, teleport, plane shift, etc.

And yet you doubt, if there is no threat of a possible later encounter, that the spellcaster will use his biggest spells first? :confused:


EDIT: BTW, I consider spellcasters looking towards long-term ramifications to be a good thing, that enhances the game experience. Obviously we differ here.
 

Raven Crowking said:
Given that we are told that a PC will be able to heal his comrades and attack at the same time, this is something where you and I are going to have to agree to disagree until there is something more substantial to examine.
Yes we will. Not enough information for anything definitive.

And yet you doubt, if there is no threat of a possible later encounter, that the spellcaster will use his biggest spells first? :confused:
I'm saying that under the current per-day system, the game rewards players who know how to tip the scales towards their spellcasters by limiting themselves to the smallest number of encounters in a row possible, so that they can approach each encounter at full-power. Remember that in 3e, they're essentially given a goodie bag of spells and are told, "Now these need to last you all day through about four encounters, so be smart when you use them!"

So what does a crafty caster do? Use the bag as if there's only one encounter, and then go rest and recover right away. Once he discovers that it is indeed possible for a spellcaster to regulate the encounters a party faces in a day, he realizes that he can essentially control the button that resets his spells. So rather than use that goodie bag that was meant to be spread out over about four encounters, the spellcaster uses all the best stuff in the first one, goes off to rest, and then repeats later.

On the other hand, if the designers make it so that the goodie bag is only good for one encounter, now the spellcaster is in line with the other non-casters. The spellcaster can't "go nova" anymore by "borrowing" spells that were slated to be used in later encounters, because his resources are no longer allotted that way. Now the focus is on the only encounter that is guaranteed to happen, and the spellcaster really can't abuse the system anymore to always go in with nova tactics.
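
Here's a back-of-the-envelope Python sketch of the same point. The numbers (an 8-spell goodie bag, 4 encounters per day) are invented purely for illustration:

[code]
# Made-up numbers, just to show why the rest-and-reset cycle pays off under a
# per-day "goodie bag" and stops paying off under a per-encounter budget.

DAILY_SPELLS = 8        # assumed size of the per-day goodie bag
ENCOUNTERS_PER_DAY = 4  # the pacing the designers assume

intended = DAILY_SPELLS / ENCOUNTERS_PER_DAY  # 2 spells per fight, as intended
nova_and_rest = DAILY_SPELLS / 1              # 8 spells in the one fight you allow per "day"
per_encounter = DAILY_SPELLS / ENCOUNTERS_PER_DAY  # a budget that simply resets each fight

print(f"intended pacing     : {intended:.0f} spells per encounter")
print(f"nova, then rest     : {nova_and_rest:.0f} spells per encounter")
print(f"per-encounter budget: {per_encounter:.0f} spells per encounter, every encounter")

# Under per-day rules, controlling when you rest quadruples your per-fight
# output; under a per-encounter budget, there's nothing left to "borrow" from.
[/code]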

EDIT: BTW, I consider spellcasters looking towards long-term ramifications to be a good thing, that enhances the game experience. Obviously we differ here.
It's not bad if they look towards long-term ramifications. But it is bad if the system forces you to do that at the expense of enjoying the current (guaranteed) encounter. Indeed, it would be just as inaccurate for me to muse, "I consider spellcasters thinking tactically towards winning the current encounter to be a good thing, that enhances the game experience. Obviously we differ here."

I enjoy the long-term planning. What I don't enjoy is when the system makes it more difficult to enjoy the encounters you are facing because you have to save resources or be absolutely slaughtered in some later battle that may never come, so you're forced to sit out of the current one to "be prepared".

Put another way, there's nothing wrong with being prepared and always keeping a first-aid kit handy and staying on your guard, but you might want to start questioning the wisdom of hoarding all your resources and flinging pebbles at the enemy just so you'll still have bullets in case "THEY" come around the corner before you start making tin-foil hats for yourself ;)
 

Jackelope King said:
I'm saying that under the current per-day system, the game rewards players who know how to tip the scales towards their spellcasters by limiting themselves to the smallest number of encounters in a row possible, so that they can approach each encounter at full-power. Remember that in 3e, they're essentially given a goodie bag of spells and are told, "Now these need to last you all day through about four encounters, so be smart when you use them!"

So what does a crafty caster do? Use the bag as if there's only one encounter, and then go rest and recover right away. Once he discovers that it is indeed possible for a spellcaster to regulate the encounters a party faces in a day, he realizes that he can essentially control the button that resets his spells. So rather than use that goodie bag that was meant to be spread out over about four encounters, the spellcaster uses all the best stuff in the first one, goes off to rest, and then repeats later.

Absolutely agree.

But I also would say that

A game rewards players who know how to tip the scales towards their characters by limiting themselves to the smallest number of encounters in a row possible, so that they can approach each encounter at full power. So long as there is a goodie bag intended to last all day, a crafty player will use the bag as if there's only one encounter and then go rest and recover right away. So rather than use that goodie bag that was meant to be spread out over a day's worth of encounters, the character uses all the best stuff in the first one, goes off to rest, and then repeats later.

On the other hand, if the designers make it so that the goodie bag is good for every encounter, the character can "go nova" in each encounter because he can reset after every encounter. However, you are correct in saying that the character can't abuse the system anymore, because it can be balanced to assume that the characters always go in with nova tactics. This is why the mechanical threshold of significance is reduced; where previously it might have included any encounter that could affect the PC's resources, now it is only the narrow range that is challenging when the PCs go nova.

Of course, what 4e promises is a combination of these two systems. It will still be true that there is a "goodie bag" meant to be spread over one day's encounters, so the problem created by that type of resource management will still exist.

This will definitely lead some to balance their encounters to assume that the characters always go in with nova tactics, using both the per-day and the per-encounter goodie bags. This reduces the mechanical threshold of significance even more; where previously it might have included any encounter that could affect the PC's per-encounter resources, now it is only the narrow range that is challenging when the PCs go nova with both per-encounter and per-day resources.

The only thing that prevents going nova with all resources is some form of risk or cost associated with doing so.
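
To put my "threshold of significance" point in rough numbers, here is a purely hypothetical Python sketch; the encounter costs and the budget are invented for illustration:

[code]
# Hypothetical numbers to illustrate the "threshold of significance" point:
# five encounters with different resource costs, against an 8-point budget.

costs = [1, 2, 3, 6, 8]
BUDGET = 8

# Per-day pool: every cost, however small, is felt for the rest of the day.
remaining = BUDGET
for cost in costs:
    remaining = max(remaining - cost, 0)
    print(f"per-day: an encounter costing {cost} leaves {remaining} for the rest of the day")

# Per-encounter pool: the budget resets, so only fights whose cost approaches
# the full budget register mechanically at all (threshold chosen arbitrarily).
for cost in costs:
    matters = cost >= 0.75 * BUDGET
    print(f"per-encounter: an encounter costing {cost} is "
          f"{'significant' if matters else 'erased by the reset'}")
[/code]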


RC
 

Actually, the only thing that prevents a player from "going nova" is to partition resources in such a way that players don't have access to them "earlier" than they should, so you can't go nova in encounter 1 by stealing resources that the designers assumed wouldn't be spent until encounters 2, 3, and 4.

Your confusion seems to stem from using the term "nova" incorrectly. "Going nova" was unbalanced because it meant using several encounters' worth of resources to win. If resources aren't structured that way, then going nova is impossible. It means burning resources that the designers assumed you'd use later in the day earlier than expected, and thus being more effective than intended.

You are still failing to see that your definition of "mechanical threshold of significance" is faulty. The one I presented above shows that there are still meaningful thresholds when you look at the encounter itself and the risks and challenges faced in any given encounter. Your definition makes the expenditure in and for a given encounter irrelevant, and only looks at the net sum of the effects of multiple encounters on resources.

As I described in post #1070, you continue to overlook and define away the only mechanical threshold of significance that a designer can realistically assume. It's folly to design a resource system around a long term that might never happen, to the detriment of the encounter that is definitely happening.
 

Jackelope King said:
Actually, the only thing that prevents a player from "going nova" is to partition resources in such a way that players don't have access to them "earlier" than they should, so you can't go nova in encounter 1 by stealing resources that the designers assumed wouldn't be spent until encounters 2, 3, and 4.

Well, here we differ philosophically, because I believe that the designers have no business deciding what resources the players use in each encounter, out of the resources available. I am of the opinion that it is for the DM/designers to offer meaningful choices, which are meaningful because they have both context and consequence, and that it is for the players to then make decisions on that basis. I am therefore against anything that removes either context or consequence as a matter of principle.

In any event, though, I have already agreed that a full per-encounter system eliminates the short adventuring day problem, so long ago that I'm shocked that this is still somehow "evidence" against my general position.

Your confusion seems to stem from using the term "nova" incorrectly. "Going nova" was unbalanced because it meant using several encounters' worth of resources to win.

You seem to think that going "nova" is anything short of using up all or most of your significant resources within a given encounter. Going "nova" isn't generally a problem within a strictly per-encounter system, as I said in the post you are responding to (the portion of which you apparently missed): while going "nova" is possible, the reset is so short that the subsequent burn-out is generally not important.

If you truly believe that going nova is impossible in such a game, however, I challenge you to extend an encounter in such a system after the PCs have burned through their resources by throwing more mechanically significant opponents at them. If they don't complain, don't call it unfair, and don't feel nova-burn, then you'll have made your point.

You are still failing to see that your definition of "mechanical threshold of significance" is faulty.

You are still failing to demonstrate that it is so, or that you have understood my arguments to this point. Again, this may be my fault. However, as what is now occurring is a monumental waste of both of our times, let us simply agree to disagree until a year after the three core 4e books have released, and then we can examine whether or not the 9-9:15 adventuring day problem still exists.


RC
 

Raven Crowking said:
Well, here we differ philosophically, because I believe that the designers have no business deciding what resources the players use in each encounter, out of the resources available. I am of the opinion that it is for the DM/designers to offer meaningful choices, which are meaningful because they have both context and consequence, and that it is for the players to then make decisions on that basis. I am therefore against anything that removes either context or consequence as a matter of principle.
One of our divides. Designers cannot reliably make those choices meaningful because they can't assume and design around multiple encounters that might not happen... their time would be better spent making that one guaranteed encounter the best it can be.

In any event, though, I have already agreed that a full per-encounter system eliminates the short adventuring day problem, so long ago that I'm shocked that this is still somehow "evidence" against my general position.
My apologies.

You seem to think that going "nova" is anything short of using up all or most of your significant resources within a given encounter. Going "nova" isn't generally a problem within a strictly per-encounter system, as I said in the post you are responding to (the portion of which you apparently missed): while going "nova" is possible, the reset is so short that the subsequent burn-out is generally not important.

If you truly believe that going nova is impossible in such a game, however, I challenge you to extend an encounter in such a system after the PCs have burned through their resources by throwing more mechanically significant opponents at them. If they don't complain, don't call it unfair, and don't feel nova-burn, then you'll have made your point.
I am perfectly aware of what going nova is. What you're describing simply isn't going nova. Is it still a problem? Sure it is... draining characters of resources and hitting them while they're weak is a problem for any system where resources can be lost. But that isn't going nova. Going nova is a very voluntary action where you use more resources than normally allotted to you in a given time-frame to become temporarily more powerful than is otherwise possible. That's what "going nova" means. Not just "being powerful", but "being more powerful than you should be because you're using resources more quickly than the system accounts for".

You are still failing to demonstrate that it is so, or that you have understood my arguments to this point. Again, this may be my fault. However, as what is now occurring is a monumental waste of both of our times, let us simply agree to disagree until a year after the three core 4e books have released, and then we can examine whether or not the 9-9:15 adventuring day problem still exists.

RC
Fair enough.
 

It doesn't matter if the character can expend all of his per-encounter resources if his per-encounter resources under 4e are 1/4 of his per-day resources under 3.5. For a number of different reasons it's likely to be closer to 1/3 or so. Will that assuage your worries about 'nova', Raven?

In general, 4e seems to be 'disabling' the ability to borrow resources, either forward or back in time. Classes do this by limiting power at one end or the other of the power curve (usually giving up power at lower levels for extra power at higher levels). In encounters, this leads to most of a character's power being available at all times, but without the ability to 'save' power forward. Currently, encounter design has to assume that characters can optionally save power from previous encounters, or borrow power from following encounters. It's another example of the unpredictability of party power level. In this case the most predictable encounter is the first one in a day - as an adventure designer you can assume that the party is at 100% (though because of the variability introduced by the skills system and by the multiclassing system you can't know what every party's 100% power level is). After that first encounter of the day you cannot predict what the party's power level is. After the first encounter of the adventure, you can't even predict what the first encounter of the day is without heavy railroading.

80% of a character's power being in per-encounter/at-will abilities is an answer to the question "what is the party's power level for this specific encounter?" The answer is "somewhere between 80% and 100% of the party's maximum power".
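
In rough numbers (a quick Python sketch; the 80/20 split is just the illustrative assumption above, not an official figure):

[code]
# Illustrative split only -- not official 4e numbers.
PER_ENCOUNTER_SHARE = 0.80  # assumed share of power that refreshes every fight
PER_DAY_SHARE = 1.0 - PER_ENCOUNTER_SHARE

floor_power = PER_ENCOUNTER_SHARE                    # all dailies already spent
ceiling_power = PER_ENCOUNTER_SHARE + PER_DAY_SHARE  # nothing spent yet

print(f"designer plans for {floor_power:.0%} to {ceiling_power:.0%} of max party power")

# Flip the split (20% per-encounter, 80% per-day) and the band becomes
# 20% to 100% -- far too wide to build any single encounter around.
[/code]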

The more I think about 4e design, the more I see what the fundamentals of 4e are going to be, and the more I like them. The goal is predictability of design. No matter where you look, no matter what segment of play you're examining, the goal is to cut off the far right and left of the bell curve, and flatten the top of the curve to spread it out. The opposing goal is to prevent everything from looking and playing the same - this is a balancing act, and it will require some compromise.

Raven Crowking said:
Well, here we differ philosophically, because I believe that the designers have no business deciding what resources the players use in each encounter, out of the resources available. I am of the opinion that it is for the DM/designers to offer meaningful choices, which are meaningful because they have both context and consequence, and that it is for the players to then make decisions on that basis. I am therefore against anything that removes either context or consequence as a matter of principle.
The problem with this is the people who cannot prep before running for whatever reason, and have to use commercial-off-the-shelf modules with no editing whatsoever. If you have time to prep, of course it's trivial to design your own adventures to target your own party and its tactics, etc. In a COTS module, the module designer cannot know the capabilities of the party being run through the adventure. In that case the predictability of party design is a big gain. And at the same time, if the GM is running a COTS module, the party has no insight into the designers' thinking or design theory, and cannot 'budget' their power. And you don't get to say that's not a viable way to play; the designers have to support the no-prep game style, or people fall away because they don't have time to prep and play both.

Adventure design has to be decoupled from party design, and the way the designers have chosen to do this is to ensure that, at every 'level' of design theory (from character design to advancement to monster placement, etc.), the 'curve' of variability in party capability has been cut off at both ends and flattened. Characters will all be MUCH closer in power level to each other at each level and each encounter. This is for the benefit of adventure designers primarily, not for players.
 

Jackelope King said:
But that isn't going nova. Going nova is a very voluntary action where you use more resources than normally allotted to you in a given time-frame to become temporarily more powerful than is otherwise possible. That's what "going nova" means. Not just "being powerful", but "being more powerful than you should be because you're using resources more quickly than the system accounts for".


Ah, then. Obviously the problem doesn't exist. I can't use more per-day resources within a given encounter than allotted to me in a given day either.
 

IanArgent said:
The problem with this is the people who cannot prep before running for whatever reason, and have to use commercial-off-the-shelf modules with no editing whatsoever.

I think that my point is even more true under these circumstances. A COTS module should not decide what resources should be used in any given encounter, if for no other reason than that the writer cannot have enough information to make this determination. A COTS module should present interesting decisions, with context and consequence built into them.


RC
 
