Is it... too simple? (Related experience inside)

I guess I am looking at it from the perspective that once I spend Effort, I consider it a "sunk" cost... so why do I care whether I succeeded purely by effort or not?

I'm inclined to agree, but still think that misses the point of the OP. The point is that effort is a situational enhancement to your chance of success, as opposed to something like skill ranks that consistently enhances your chance of success. Everything else is just talking around that point.

Let's use your example of a spell, perhaps Bless (+1 on attack rolls and +1 morale bonus vs. fear). I expend a spell to get a +1 bonus... which, according to this logic, is only worth something if whoever I cast it on rolls within the small window where they are 1 under what they needed.

Yes, exactly.

How does a cleric decide whether expending the spell is worth it for the buff? Isn't this the same issue being claimed with Effort?

In my experience, by not casting Bless very often. Bless is not generally seen as a reliable and potent spell (unless you've got 50 1st level allies you are trying to buff), and generally a waste of a spell slot IME. The reliable benefits of something like Cure Light Wounds are seen as being vastly more important than Bless. I can probably list a half-dozen 1st level clerical spells that are relied on more than Bless precisely because it only helps about 1 in 20 attacks.
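The "1 in 20" figure checks out with a little arithmetic: on a d20, a flat +1 flips the outcome only when the unmodified roll lands exactly one point short of the target number. A quick sketch (plain d20 math, ignoring natural-1/natural-20 rules; the function name is just for illustration):

```python
def flip_chance(target, bonus=1, die=20):
    """Chance that adding `bonus` turns a failing d20 roll into a success."""
    without = sum(1 for r in range(1, die + 1) if r >= target) / die
    with_b = sum(1 for r in range(1, die + 1) if r + bonus >= target) / die
    return with_b - without

# For any target number the die can already reach, a +1 moves exactly one face:
print(round(flip_chance(11), 3))  # 0.05 -- 1 roll in 20
print(round(flip_chance(18), 3))  # 0.05 -- the window is always one face wide
```

The target number drops out entirely: no matter how hard the attack is, the +1 only matters on the single face that was one short, which is why the spell reads as helping "about 1 in 20 attacks."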

I don't think I've ever seen someone call out a buff to hit/skill check/AC/etc. as useless because the roll wasn't always exactly in the range where the buff made the difference, and I've never seen agonizing over when to expend buff spells similar to what the OP seems to be saying about Effort (though Effort is magnitudes easier to replenish than a spell)... all Effort is is a self-buff (consider it a spell) for anything.

No, but I have seen players argue that a buff that doesn't grant a sufficiently large edge isn't worthwhile to spend an action on, and more to the point, that it is not worth spending resources on anything unless you can become reliably good at that thing. For example, they don't spread skill points around among many categories, and instead focus on a few categories they intend to reliably succeed in. If the DC is 25, for example, having 3 skill ranks in 5 things is pretty meaningless compared to having 15 ranks in one thing, and that's generally true, if you can be proactive, even if (and maybe especially if) the DC is 15.


Again I'm confused about this... how exactly would a buff work that mechanically reflects the differences in the character making the roll, beyond only mattering when he rolls within the range the buff creates?

Effort also isn't something you'd use on every roll....

The bolded part is the answer to your own question.

I'm not sure you understand what Edge is exactly. It allows you to conserve your points from pools when using Effort... I'm not sure how Edge would be more common or more central to the system... could you explain?

That's pretty much what I thought Edge was. Basically, back when the system was just being previewed, from my casual reading I thought that characters would be in large part defined by their Edges, so that a character with Edge was reliably advantaged, without spending scarce narrative resources, compared to one that wasn't. What I was thinking of was a situation similar to GURPS, where spellcasters are defined by the spells that they can cast with sufficient skill to avoid spending fatigue/spell points. So for example, I thought you could define a fighter as having an Edge in combat, so that they would reliably be a better fighter without spending Effort, or that you could define a rogue by having an Edge in sneakiness, so that they would be reliably better at sneaking without having to spend Effort at it.

But now that the system is out, that doesn't seem to be how it works. It doesn't seem like you have specific Edges like that - that's the province of skills, apparently. And you have to have a lot of Edge to get free Effort, so it's more of a 'level up' sort of thing than initial character differentiation.

The 'randomness' of the system that the OP is talking about seems to be a very common complaint, and it's resonating for me because I just took Baldur's Gate out for a spin, since I never played it back when it was out and had heard so many good things about it. Frankly, the 2e mechanics feel really clunky to me compared to modern cRPG mechanics (or even Neverwinter Nights' 3e-derived mechanics).

Basically, in a cRPG I've gotten used to the idea over the last 10-15 years that if you die, it's because you made a tactical mistake, and you start over and do something different in order to progress past that point. Heck, I had that idea back in Ultima IV. But that's not the experience I'm having with Baldur's Gate, where death just seems to come out of the blue almost completely randomly (3 kobolds landing 3 successive ranged attacks on my highest-AC character from across the screen, doing 21 damage, for example). Instead, Baldur's Gate seems to encourage more of an 'if at first you don't succeed, it's probably just because the enemy rolled a critical hit before you did, so try, try again...' mentality.

Now, I'm a bit frustrated because I can tell that BG is even more random than 1e/2e due to 'tweaks' they've made, but the fact remains that right now I'm really feeling that sort of randomness is primitive (and it's worth noting, I'm running a 3e game right now that hasn't felt really random very often). If Numenera ends up just being 2e D&D meets FATE, I'm not sure I'm that interested.
 


I'm inclined to agree, but still think that misses the point of the OP. The point is that effort is a situational enhancement to your chance of success, as opposed to something like skill ranks that consistently enhances your chance of success. Everything else is just talking around that point.

If you say so...


In my experience, by not casting Bless very often. Bless is not generally seen as a reliable and potent spell (unless you've got 50 1st level allies you are trying to buff), and generally a waste of a spell slot IME. The reliable benefits of something like Cure Light Wounds are seen as being vastly more important than Bless. I can probably list a half-dozen 1st level clerical spells that are relied on more than Bless precisely because it only helps about 1 in 20 attacks.

I wasn't really trying to debate the value or non-value of the Bless spell, it was simple and similar in mechanics so I chose it as an example...


No, but I have seen players argue that a buff that doesn't grant a sufficiently large edge isn't worthwhile to spend an action on, and more to the point, that it is not worth spending resources on anything unless you can become reliably good at that thing. For example, they don't spread skill points around among many categories, and instead focus on a few categories they intend to reliably succeed in. If the DC is 25, for example, having 3 skill ranks in 5 things is pretty meaningless compared to having 15 ranks in one thing, and that's generally true, if you can be proactive, even if (and maybe especially if) the DC is 15.

What is "sufficiently" large though? Because having 15 ranks is meaningless compared to having 40 against a DC of 50...


The bolded part is the answer to your own question.

In what way?



That's pretty much what I thought Edge was. Basically, back when the system was just being previewed, from my casual reading I thought that characters would be in large part defined by their Edges, so that a character with Edge was reliably advantaged, without spending scarce narrative resources, compared to one that wasn't. What I was thinking of was a situation similar to GURPS, where spellcasters are defined by the spells that they can cast with sufficient skill to avoid spending fatigue/spell points. So for example, I thought you could define a fighter as having an Edge in combat, so that they would reliably be a better fighter without spending Effort, or that you could define a rogue by having an Edge in sneakiness, so that they would be reliably better at sneaking without having to spend Effort at it.

But now that the system is out, that doesn't seem to be how it works. It doesn't seem like you have specific Edges like that - that's the province of skills, apparently. And you have to have a lot of Edge to get free Effort, so it's more of a 'level up' sort of thing than initial character differentiation.

Well, by tier 2 you can have enough Edge (3), if you specialize in one stat, to pay for a free level of Effort in anything that relates to that stat (Might/Speed/Intellect). There are also certain abilities from some focuses that grant boosts to Edge under certain circumstances, as well as cyphers that can increase Edge... so I do think it's a little more complex than you seem to be presenting it above. But hey, we all have our own views on the game, and while I enjoy its mechanics I can see why some may not.
 

so if your chance of success is 0% and you use effort it increases it by 15%... if your chance of success is 30% spending one level of effort increases that chance by 15% to 45%. I get what you are saying but it doesn't change the fact that you have increased your effectiveness by +15%.
Well, how do you define 'effectiveness'? If my normal chance of success is 15%, then an increase of 15% means I will succeed twice as often. That's what I'd call doubling my effectiveness, i.e. the increase in effectiveness is relative to my normal chance.
If my normal chance of success is 30%, then increasing the chance by 15% means I'll succeed 1.5 times as often (on average). If my normal chance is 0%, it's a non-decision, since I can only ever succeed if I take the 15% bonus. If my normal chance is 90%, I normally wouldn't bother improving the chance by 15%.

All of this is of course assuming I actually consider the check important enough that I want to succeed. There are also situations in which it might be more important _not to fail_. Taking my last example of a base chance of 90%: If my character's life depended on that single roll, I'd definitely take the 15% bonus to ensure automatic success.
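The absolute-vs-relative distinction above can be made concrete with a few lines of arithmetic. The 15% figure and the cap at certain success follow the discussion; the function name and breakpoints are just for illustration:

```python
def gains(base, bonus=0.15):
    """Absolute and relative improvement from a flat success-chance bonus."""
    boosted = min(base + bonus, 1.0)   # chance can't exceed certain success
    absolute = boosted - base          # the same flat bump for everyone
    relative = boosted / base if base else float('inf')  # ratio of successes
    return absolute, relative

for base in (0.15, 0.30, 0.60, 0.90):
    a, r = gains(base)
    print(f"base {base:.0%}: +{a:.0%} absolute, x{r:.2f} as many successes")
```

The same +15% doubles the success rate at a 15% base, multiplies it by 1.5 at 30%, and buys only +10% (but guaranteed success) at 90% -- which is exactly why the two posters can both be right while measuring different things.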
 

Well, how do you define 'effectiveness'? If my normal chance of success is 15%, then an increase of 15% means I will succeed twice as often. That's what I'd call doubling my effectiveness, i.e. the increase in effectiveness is relative to my normal chance.
If my normal chance of success is 30%, then increasing the chance by 15% means I'll succeed 1.5 times as often (on average). If my normal chance is 0%, it's a non-decision, since I can only ever succeed if I take the 15% bonus. If my normal chance is 90%, I normally wouldn't bother improving the chance by 15%.

All of this is of course assuming I actually consider the check important enough that I want to succeed. There are also situations in which it might be more important _not to fail_. Taking my last example of a base chance of 90%: If my character's life depended on that single roll, I'd definitely take the 15% bonus to ensure automatic success.

As I said earlier, I understand what you are saying... but this seems to be semantics based around the word "effectiveness"? I'm still not seeing how your choosing to use relative effectiveness as the measuring stick proves what I said was "wrong". You are still increasing your effectiveness by 15%. I didn't say relative effectiveness, which seems to be what you are speaking to.
 

I'm not sure at this point that anyone is saying you are wrong per se, Imaro. I think what I'm seeing is a lot of explaining of different perceptions.

This is a problem that game designers have to face all the time: how a mechanic "feels" versus how the math actually works. It seems like this mechanic feels just fine to you, but some of us are trying to explain why it doesn't feel right to us. I will admit that I am coming from a less D&D-focused background. I generally dislike ALL 1d20 resolution mechanics, and whenever I am playing d20 games I try to build in rerolls or other ways to get around the reliance on the d20. (My main 4E character was an Elven Avenger of Tymora, if that gives you an idea.) I spend most of my time in Savage Worlds, where the players (and some villains) get two chances to succeed on every roll.

To give an example that is similar but opposite to this discussion, I hear so much talk on forums and podcasts about Apocalypse World and its derivatives. Much of this talk surrounds the "succeed with consequence" result, as if that's the entire focus of the game. When you look at the mechanics, though, that range is pretty small. It is more likely that a player will either straight-up succeed or fail than "succeed with consequence." Despite this narrow range, the very existence of that result seems to energize players of the system. It's almost all they talk about.

So while the range of results we are talking about is indeed narrow, and the resource may not be as much an issue in play as it might seem in reading, nevertheless it can cause an issue of perception.
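The claim about the 7-9 band is easy to verify by enumerating 2d6 outcomes (this assumes the standard PbtA bands and a +0 modifier; positive modifiers shift weight toward 10+):

```python
from itertools import product

def band_odds(mod=0):
    """Probability of each PbtA result band on 2d6 + mod."""
    totals = [a + b + mod for a, b in product(range(1, 7), repeat=2)]
    n = len(totals)  # 36 equally likely outcomes
    return {
        'miss (6-)':     sum(t <= 6 for t in totals) / n,
        'partial (7-9)': sum(7 <= t <= 9 for t in totals) / n,
        'full (10+)':    sum(t >= 10 for t in totals) / n,
    }

print(band_odds(0))  # partial ~41.7%; miss and full together ~58.3%
```

At +0 the "succeed with consequence" band covers 15 of 36 outcomes, so a straight success or failure (21 of 36) really is more likely than a partial, as the post says -- yet the partial band dominates the conversation.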
 

To give an example that is similar but opposite to this discussion, I hear so much talk on forums and podcasts about Apocalypse World and its derivatives. Much of this talk surrounds the "succeed with consequence" result, as if that's the entire focus of the game. When you look at the mechanics, though, that range is pretty small. It is more likely that a player will either straight-up succeed or fail than "succeed with consequence." Despite this narrow range, the very existence of that result seems to energize players of the system. It's almost all they talk about.

So while the range of results we are talking about is indeed narrow, and the resource may not be as much an issue in play as it might seem in reading, nevertheless it can cause an issue of perception.

I agree.

And I'm going to keep hammering this home until it sticks:

How you think about a system, how you prepare to play a system, and how you approach the game are more important than the system.

A lot of indie-inspired games seem to me more revolutionary in how they shape how players think about the game than in their actual mechanics. "Succeed with consequences" or "failing forward" aren't new ideas, or even particularly novel results of process simulation, but underlining them in the rules as designed, intended, and important mechanical results changes the way people approach the game, both as players and GMs. What you see is people internalizing and examining things that were always there to use, but which many people (maybe even most people) didn't consciously understand or clearly see.
 

Another important point is that if you fail a skill check, you need to spend effort to try again:

If a character fails a task (whether it's climbing a wall, picking a lock, trying to figure out a mysterious device, or something else) she can attempt it again, but she must apply at least one level of Effort when retrying that task. A retry is a new action, not part of the same action that failed, and it takes the same amount of time as the first attempt did.

Thus characters with a high pool will succeed more often because they are able to retry more, whereas a character with a low pool will not be able to retry as easily.
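A rough simulation illustrates the pool-and-retry point. The specific numbers here (3 pool points per retry, +15% per level of Effort) are simplifying assumptions rather than exact Numenera costs, since Edge and escalating Effort costs are ignored:

```python
import random

def eventual_success(base_chance, pool, retry_cost=3, trials=10_000):
    """Estimated chance of ever succeeding, retrying while pool points last."""
    wins = 0
    for _ in range(trials):
        points = pool
        if random.random() < base_chance:             # first attempt is free
            wins += 1
            continue
        while points >= retry_cost:                   # each retry spends Effort
            points -= retry_cost
            if random.random() < base_chance + 0.15:  # retry includes one level
                wins += 1
                break
    return wins / trials

random.seed(0)
print(eventual_success(0.30, pool=12))  # deep pool: several retries available
print(eventual_success(0.30, pool=3))   # shallow pool: at most one retry
```

With a 30% base chance, a character with 12 points to burn ends up succeeding far more often than one with 3, purely because they can keep paying for retries.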
 
