Replacing 1d20 with 3d6 is nearly pointless
Esker (post 7892710) wrote:

Ok, we evidently need a little primer on probability.

I'm going to put it in a spoiler, because it got a little long:

[SPOILER]
When we talk about success chances, we are measuring the chance of some event occurring, right? In this case, the type of event we're interested in is "our roll against DC X is successful". In vanilla 5e, a roll against DC X is successful if and only if the number we get on the die plus a modifier equals or exceeds X. So we're talking about the event R + M >= X, and we care about P(R + M >= X). For simplicity, we can move M to the other side and lump it in with the DC -- the event R >= X - M is true for exactly the same rolls that R + M >= X is, so we're not changing anything by doing that.

Now if I just have a mathematical statement, like R >= 3, I can write down a list of all the values of R for which that statement holds. If R comes from a d20, the list is 3, 4, 5, ..., 20. Since R can only take on one value, the sub-events R=3, R=4, R=5, ..., R=20 are non-overlapping, and so the probability that one of them occurs is the sum of the probabilities of each one.

Ok, but you knew that. Now what happens if I decide that I'm going to double the roll I get on the die? Now my original event is 2R + M >= X, but we can still move M over, to get 2R >= X - M; no problem there. Just to simplify the notation, I'm going to define Y = X - M, so I can write things like 2R >= Y instead. Alright. So now I can either write down the list of values that 2R can take, or I can just write down the list of values that R can take, and for each one decide whether it makes the event true or not.

So, on a d20, 2R can only take on even numbers: 2, 4, 6, ..., 40, and each of these has a 5% chance of occurring, because each one corresponds to exactly one value on the original d20, which (we presume) are all equally likely. So, how do I determine P(2R >= Y)? Well, if Y is, say, 10, I add up P(2R = 10) + P(2R = 12) + ... + P(2R = 40). So far so good.

What if Y is 11? Well, 2R can't actually *equal* 11, but we didn't ask for the probability that 2R = Y; we asked for the probability that 2R >= Y. What values of 2R satisfy that inequality? They're almost the same ones as before, except we have to throw out 10. How about Y = 12? It turns out to be the same set of values as for Y = 11 -- we would throw out 11 if 11 were a possible roll (if we're being very formal about it, we still do get rid of P(2R = 11), but this is zero, so subtracting out its probability doesn't do anything). So we still have P(2R >= 12) = P(2R = 12) + P(2R = 14) + ... + P(2R = 40), the same as P(2R >= 11).

How about Y = -74? Well now the list of values of 2R that satisfy 2R >= Y expands to... well... all of them. So we get P(2R >= -74) = P(2R = 2) + P(2R = 4) + ... + P(2R = 40). Did we do anything wrong or sneaky, or break anything, in either the Y = 11 or the Y = -74 case? What about Y = 3.14159? That's fine too. The event and the probability are perfectly well defined; we just have to consider the set of possible rolls that make the statement true (4 and up in the case of pi), and add up the probabilities of each of those rolls. If Y goes "out of bounds" to the left we get 1, since every roll makes the event true; if Y goes out of bounds to the right, we get 0, since none of them do. And if Y sits in between two possible rolls, there is still a set of possible rolls to the right of it, so we just look at those; that just means the success chance is flat between each pair of successive rolls.

If we made a function, f(Y) = P(2R >= Y), it would actually be well defined at every real-number value of Y; it would just be flat between values of 2R, since only when Y crosses those values do we actually add anything to the set of outcomes that make the event true.

That's why the first graph in my last post had that saw shape with the flat bits. I can still talk about a DC 11 check even if I can't roll an 11; it just has the same difficulty as a DC 12 check unless I make some adjustment. Not being able to roll 11 doesn't make the chance of succeeding at a DC 11 check 0. The important distinction, which I think is what's causing the confusion, is that the x-axis in the graph isn't showing rolls; it's showing DCs. Well, adjusted DCs, where we've moved the modifier over to the DC side instead of the roll side.

This sort of thing happens with RAW too, by the way, just to reassure you that I'm not introducing any voodoo. Take a rogue with a stealth modifier of +11, trying to sneak around a monster with a passive perception of 10. The unadjusted DC is 10, but the adjusted DC is -2. I can talk about the probability that that rogue succeeds at that check, and I can talk about it as the probability that the d20 roll is -2 or better. Of course that winds up being the same as the probability that the d20 is 1 or better, but it's not wrong to write P(R >= -2).

Ok, now what about the confirmation correction I keep talking about? We could perfectly well just use 2*3d6 - 10 and be done with it, and get the step-shaped graph I posted for our success probabilities. The game would run fine, albeit with a loss of granularity in the DC distinctions that actually matter. But it's not only aesthetically unsatisfying that the graph is jagged like that, nor is it only aesthetically unsatisfying that the 2*3d6 - 10 curve is above the d20 curve more often than it is below it; it's also a worse approximation than if we make an adjustment. And in this case it's a worse approximation because the 2*3d6 - 10 distribution has (slightly) the wrong mean.

So how does the confirmation correction fix this? It isn't just a smoothing mechanic; it fixes the whole die-roll distribution to have the right mean. As I said, you can really think of the confirmation correction as having a 50% chance of subtracting 1 from *every* roll; in essence subtracting 1/2 on average from every roll (in the sense that the number of heads in a single coin flip is "on average" 1/2), and therefore subtracting 1/2 from the mean. But because I wanted a system that was not only *physical* but practically *efficient*, I noted that you don't care whether or not you subtract 1 from your roll unless doing so changes the outcome, and this will only happen if your roll was exactly equal to the adjusted DC. So though I can see why this makes it *seem* like I'm adjusting one point in an *ad hoc* fashion, it only seems that way because I'm ignoring meaningless rolls.

I didn't do that, precisely because I wanted the system to correspond to something *physical* -- something you could actually implement.

Let's clarify something else here: the graphs of success probabilities aren't distributions at all; they're CDFs. The graphs (or the quantities we're depicting on the axes) don't have means or variances, since neither the DC nor the success chance is a random variable. So when I say we want to center a distribution at 10.5, I'm not talking about the DCs that are on the graph; this time I'm talking about the actual rolls. As it happens, if a symmetric distribution is centered at 10.5, then it also has its median at 10.5, meaning we are equally likely to get a value above the mean as below it.

The OP's original observation is that we could match the first two moments (the mean and variance) of the two roll distributions. I realized that since we started out noting that we wanted to double the 3d6 roll to match the variances, and since shifting by 10.5 was functionally identical to shifting by 11 as far as success probabilities go, we'd need to do something to "declump" the distribution in order to properly center it. The confirmation-roll mechanic effectively turns the discrete roll distribution into a continuous one, making it easier to work with from a centering and scaling perspective (we could use percentile dice for the confirmation roll to let us set any fractional DC to a precision of 0.01, but that would be a little silly).

When I graph the success probability with the confirmation die factored in, I'm not just interpolating or smoothing; I'm actually showing you the probability of success at each DC (odd and even). Again, to find the success probability for DC 11, we can look at the rolls that satisfy 2R - 10 - (d2 - 1) >= 11. We can satisfy this if 2R - 10 >= 12 -- that is, if 2R >= 22 (that is, if 2R = 22, 2R = 24, ..., 2R = 36) -- since for these rolls, subtracting (d2 - 1) at worst leaves us with 11, which is still a success. And that's actually the only way we can do it, since we can't get 2R - 10 = 11, even though if we did, d2 - 1 could be 0, satisfying the event. But if the DC is even (12, say), then there are two ways to get a success: either 2R - 10 >= 13, regardless of the d2, or 2R - 10 = 12 and the d2 comes up 1 (so nothing is subtracted).

This is a common mistake: you're conflating joint probabilities with conditional probabilities. The DC isn't really a random variable, so talking about the probability P(DC = 12 & modified roll >= 12) isn't really meaningful. We care about the *conditional* probability, P(modified roll >= 12 | DC = 12). But if you don't believe me, apply your own calculation to a d20 roll. What are the odds that you need a 12 and roll one? By your reasoning, it would be (1/16)*(1/20), or 0.003 (that is, 0.3%). But that's not what we care about when we talk about the likelihood of rolling a 12.

Again, by the way, if the confusion is due to my suggestion that we only roll to confirm when the roll is exactly equal to the DC, that was only to avoid pointless rolls. If you roll the d2 on every roll, then the probability of rolling an 11 is P(roll 12) * P(d2 = 2), since a 2 on the d2 subtracts 1. On 2*3d6 - 10, that's P(3d6 = 11) * 1/2, or 0.125 * 0.5 = 0.0625. Pretty close, actually, to the 0.05 chance you have on a d20.

Dude, read what I post if you're going to reply. I untruncated the tails for you. And I've been explaining at great length why nothing is misaligned.
[/SPOILER]
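The step function described in the post -- f(Y) = P(2R >= Y), defined at every real Y and flat between attainable values of 2R -- can be checked by brute-force enumeration. A short Python sketch (the helper names are mine, not from the thread):

```python
from itertools import product
from fractions import Fraction

def roll_dist(dice, scale=1, shift=0):
    """Exact distribution of scale * (sum of dice) + shift, as {value: probability}."""
    counts = {}
    for faces in product(*(range(1, s + 1) for s in dice)):
        v = scale * sum(faces) + shift
        counts[v] = counts.get(v, 0) + 1
    total = sum(counts.values())
    return {v: Fraction(c, total) for v, c in counts.items()}

def success_chance(dist, adjusted_dc):
    """f(Y) = P(roll >= Y): sum the probabilities of every outcome at or above Y.

    Y need not be an attainable roll, or even an integer; out-of-bounds
    Y gives 1 (far left) or 0 (far right), exactly as argued in the post."""
    return sum(p for v, p in dist.items() if v >= adjusted_dc)

doubled_d20 = roll_dist([20], scale=2)              # 2R: even values 2, 4, ..., 40
stepped = roll_dist([6, 6, 6], scale=2, shift=-10)  # 2*3d6 - 10, no confirmation die
```

With this, `success_chance(doubled_d20, 11)` equals `success_chance(doubled_d20, 12)` (the flat step between 10 and 12), `success_chance(doubled_d20, -74)` is 1, and a DC of pi works the same as a DC of 4, just as the post describes.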
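The confirmation-die arithmetic in the post (mean 10.5, a 0.0625 chance of an effective 11, and the odd/even DC success chances) can likewise be verified by enumerating 2*3d6 - 10 - (d2 - 1) with the d2 rolled every time. A brute-force sketch under that reading of the mechanic, with the function name being mine:

```python
from itertools import product
from fractions import Fraction

def confirm_dist():
    """Distribution of 2*3d6 - 10 - (d2 - 1), rolling the d2 on every roll.

    A 2 on the d2 subtracts 1; a 1 subtracts nothing. 432 equally
    likely outcomes (6 * 6 * 6 * 2)."""
    dist = {}
    for a, b, c, d2 in product(range(1, 7), range(1, 7), range(1, 7), (1, 2)):
        v = 2 * (a + b + c) - 10 - (d2 - 1)
        dist[v] = dist.get(v, Fraction(0)) + Fraction(1, 432)
    return dist

dist = confirm_dist()
mean = sum(v * p for v, p in dist.items())            # 10.5, matching the d20 mean
p11 = dist[11]                                        # effective roll of exactly 11
dc11 = sum(p for v, p in dist.items() if v >= 11)     # success chance vs adjusted DC 11
dc12 = sum(p for v, p in dist.items() if v >= 12)     # success chance vs adjusted DC 12
```

The enumeration reproduces the post's numbers: the corrected mean is 10.5 (versus 11 without the d2), P(effective roll = 11) is 1/16 = 0.0625, the DC 11 success chance is exactly 0.5 (as on a d20), and the DC 12 chance is 7/16 = 0.4375, close to the d20's 0.45.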