Understanding the Edition Wars (and other heated arguments)

As someone into psychology, I found this article interesting:

5 Logical Fallacies That Make You Wrong More Than You Think | Cracked.com

It's not about D&D or roleplaying, so apologies and fair warning on that front.

However, it's pretty funny (being from Cracked), and it points to some interesting, normal psychology behind why people resist changing their position during an argument and how they view the "other side."


I thought it'd be interesting to share here, and it might be helpful to reflect on the way we ourselves post (especially about edition-related issues) and how we might view other posters.
 


Only somebody incredibly fat and lazy would post such drivel. I'm not emotional when I argue! That's nonsense! And you're a stupid-head who spits acid! :)

Seriously, cool article. I don't think I've ever seen a Cracked article tackle a serious topic while keeping their usual style of humor. Pretty interesting to know that when I defend my favorite RPG, I'm defending my right to breed!!!
 

Interesting subject. I think another reason edition wars get so heated is that they are debates over preferences, not facts (facts get brought in to bolster opinions, but you are really dealing with taste); they are "a Lincoln isn't better than a Cadillac" type arguments. I like to compare edition wars to local debates (here in Boston) over Greek-style pizza versus Italian-style pizza. People argue over the things they attribute their like/dislike to. But if you disprove a Greek pizza hater's assumptions, it is still going to taste bad to him.
 



Interesting take on arguments, but the guy is wrong about, for example, the logic of spending money to prevent rare causes of death. The problem is that preventing common causes of death usually falls into the class of wicked problems. This is actually a point the author raises in the rebuttal, when he says, "Unfortunately, running a government or an economy is a little more complicated, and we're still stuck in "Bear = Run Away" mode." This same point can be used to argue, for example, that it would be cheaper per death prevented to try to stop deaths by bears - even though these are statistically rare - than to stop deaths from household accidents like slips and falls (which are far more common). How you attack the problem of stopping deaths from bears (shoot bears) is conceptually easier and more straightforward than actually injury-proofing everyone's home - to say nothing of being less intrusive (most people don't live with bears). This in no way shows that killing bears is money well spent, but it does suggest there may be more logic in the argument than a mere failure to understand probabilities. The logic may actually be, "This event is unlikely, but at least the answer seems tractable. Whereas I don't even have a clue what to do about this more immediate and likely problem, so I'll have to ignore it whether I want to or not." The ultimate example of this is the problem of old age, which is 100% likely to kill you if nothing else does, but which may not be worth spending a lot of money on right away, because there may not be anything we can do about it no matter how much money we spend.

There are actually two other related common-sense fallacies tied up with that. One of them is the 'man-hour' fallacy, which basically says that if a problem requires 10 years to figure out, then hiring twice as many people to work on it would necessarily cut that to 5 years, and so forth. So if you spent 1,000 times as much on a problem that took 10 years to figure out, you could get it done in 3 or 4 days, right? The answer is: not necessarily, and indeed, usually not. Related to this in American discourse is the 'Manhattan Project' fallacy, which says that for any seemingly hard and intractable project, an answer could be readily supplied if only enough will existed to accomplish it. The problem with this fallacy is that at the beginning of the Manhattan Project, the science and engineering of building an atomic weapon were actually well understood. We had a pretty good idea of how to go about it, the math was worked out, and worst come to worst we knew of a brute-force approach that would eventually solve the problem. None of this is generally true of wicked and intractable problems, so comparing, for example, solving the energy crisis or building a car that runs on water to the Manhattan Project is more a sign of poor education in engineering than anything else.

Or in short, while there is no doubt that we mismanage our priorities, it is certainly not true that merely looking at the likelihood of events and other simple statistics would result in proper prioritizing. Indeed, as far as I can tell, the author in the second section is using some logical fallacies of his own to insist on the logic of some of his pet causes. One sign of this is that he's admirably even-handed in attacking both sides of a political position in every other section, but trivially dismisses a series of related political positions in this one section only. That is a clear indication to me that, at least in this section, he'd stopped thinking, "How do I show we are all biased?", and started thinking, "Aha, this is how I can win an argument!"

As for #3, I don't think I've ever thought anyone was being intentionally dishonest. In fact, I more often see the issues he raises in #3 being used as a fallback position, a way of dealing with uncomfortable facts. That is to say, when presented with facts, there is a tendency to cite sincerity of belief as a defence against truth, either in the form of, "I sincerely believe X, therefore my opinion is just as valid as yours," or else, "Even though you have evidence, your point can be dismissed because it associates something I like with something I don't want it associated with; therefore it's ad hominem and you aren't allowed to point it out." Or in short, even though it's probably true that no one in the debate is being intentionally dishonest, that isn't much help even if both sides know it. The honesty and sincerity itself can be used as the basis of logical fallacies.
 

In edition wars, the most motivated debaters are the ones who strongly identify with a team (an edition) and want to win the fight against the enemy.

Very, very few of us want to look at the strengths and weaknesses of different editions, note how we could keep and combine those strengths from various editions without the weaknesses, etc.
 

In edition wars, the most motivated debaters are the ones who strongly identify with a team (an edition) and want to win the fight against the enemy.

Very, very few of us want to look at the strengths and weaknesses of different editions, note how we could keep and combine those strengths from various editions without the weaknesses, etc.

My experience is that, in fact, most of us are just fine looking at the strengths and weaknesses of different editions, and so on. Most of us are not so deeply invested in a particular edition as to make a fuss over it. But when we have discussions in a public venue, someone who has an axe to grind tends to take exception. Those who are actually warring are the minority, and they tend to screw things up for the rest of us.
 

Can someone, please, XP Umbran for me? I cannot recall the last time I XP'd him, but I am being told that I need to spread the XP around before I can XP him again.
 

In edition wars, the most motivated debaters are the ones who strongly identify with a team (an edition) and want to win the fight against the enemy.

Very, very few of us want to look at the strengths and weaknesses of different editions, note how we could keep and combine those strengths from various editions without the weaknesses, etc.

I'm going to call this out as an example of arguing on the basis of "the other guy has something wrong with him." It's a logical fallacy and a subtle ad hominem attack. Basically it says that the other guy has no logical basis for his beliefs, and that he is passionate for dishonest, blindly partisan, or downright insane reasons.

I really honestly don't think that is ever the case. I think that pretty much everyone is evaluating the editions - or anything else - on the basis of the information they have, looking at the strengths and weaknesses of the two, and making a critical judgment. Furthermore, I think there is a cultural fallacy involved in the 'compromise meme' you've just advanced, namely, that for any given problem you can take answers from both sides and come up with some sort of compromise or halfway position that would be stronger than both and make more people happy. I think it's pretty rare that that is actually true, because there are usually legitimate tradeoffs where it is hard to produce 'win/win' for everyone, or even most everyone. And I likewise reject the idea that it is only intransigence, irrationality, and stupidity that keep us from recognizing these magic compromise positions.

If we put together both of your claims, we see a very strong preference for 'moderation' and a very strong attack on what you see as 'extremism.' Or in other words, you are staking out territory for your 'moderate' team against the teams you see as being to either side of you. And I dare say there is a whiff of you implying that the 'moderate' team is the rational, reasoning, considerate one, and that the other teams are composed of unreasoning, wild-eyed fanatics who are just ruining everything for you. Now, there are probably some good reasons for preferring moderation, and by no means am I saying that you are being dishonest or disingenuous. I'm merely pointing out that your claim that the other guys are merely fighting for their team and demonizing the other side as 'the enemy' is, ironically, a charge that can be directed right back at your post.
 

