RigaMortus
First off, let me say that I could be completely wrong here and I'm willing to accept that. But...
Got into a slight tiff with a groupmate today when I tried to explain the difference between a 25% chance from rolling 1d4 and a 25% chance from rolling 1d100.
Now, like I said, I am not a statistics major, so I could be completely off (coincidentally enough, the person I had the dispute with is a math major of some sort). I remember numerous times on this site when people who were really good with statistics could break down average damage (for example).
Anyway, we had a 25% chance to succeed at something. My argument (although I couldn't articulate it well) was that rolling a 1 on 1d4 is not the same as rolling 1-25 on 1d100, because there is a larger margin. All things being equal, it would take you 1 out of 4 tries to roll a 1 on 1d4, but 1 out of 25 tries to roll anything between 1 and 25 on 1d100. That is a lot more rolls. Anyway, I was hoping someone here can prove or disprove (hopefully prove) my theory.
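For anyone who wants to test this empirically rather than argue it out, here is a minimal simulation sketch in Python. It assumes fair dice (Python's random module standing in for the physical rolls), and the trial count of 1,000,000 is an arbitrary choice:

```python
import random

TRIALS = 1_000_000  # arbitrary sample size; more trials means less noise

# Success on 1d4: rolling a 1 (1 face out of 4).
d4_hits = sum(1 for _ in range(TRIALS) if random.randint(1, 4) == 1)

# Success on 1d100: rolling anywhere from 1 to 25 (25 faces out of 100).
d100_hits = sum(1 for _ in range(TRIALS) if random.randint(1, 100) <= 25)

print(f"1d4   success rate: {d4_hits / TRIALS:.4f}")
print(f"1d100 success rate: {d100_hits / TRIALS:.4f}")
```

With fair dice, both rates should settle near 0.25 as the trial count grows, so the printout gives a direct empirical check of whether the two methods really behave differently.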