
Is any one alignment intellectually superior?

Which alignment is intellectually superior?

  • Any Good: 14 votes (4.3%)
  • Any Evil: 1 vote (0.3%)
  • Any Neutral: 8 votes (2.4%)
  • Any Lawful: 15 votes (4.6%)
  • Any Chaotic: 2 votes (0.6%)
  • Lawful Good: 12 votes (3.6%)
  • Lawful Neutral: 24 votes (7.3%)
  • Lawful Evil: 21 votes (6.4%)
  • Neutral Good: 35 votes (10.6%)
  • (True) Neutral: 35 votes (10.6%)
  • Neutral Evil: 6 votes (1.8%)
  • Chaotic Good: 9 votes (2.7%)
  • Chaotic Neutral: 6 votes (1.8%)
  • Chaotic Evil: 2 votes (0.6%)
  • None: 132 votes (40.1%)
  • Other: 7 votes (2.1%)

  • Poll closed.
gizmo33 said:
Before, I decided (with my monkey brain) that what I was reading was journalism, and not science (as that term is defined by the community that self-applies the label).

Is this kind of article from the October 14, 2004 issue of Neuron (Vol 44) more to your liking?

http://www.csbmb.princeton.edu/~jdgreene/Greene-WebPage_files/Greene-etal-Neuron04.pdf

I chose the LA Times article because it's "layman friendly", not because it's playing fast and loose with the facts.

gizmo33 said:
I guess I don't know what you're talking about here. My comment stands whether or not you equate these two. Take either, or both, and define what is "superior" by either criterion. What I'm saying is that you can't define, universally, what is "intellectually superior" or "morally superior" without talking about what you expect as a result of either intellectually or morally based decisions.

I don't disagree with that. But when one asks which alignment is "intellectually superior", the answer many people will give is the answer that they think a purely intellectual response will usually produce. And as the article above says, "The present results indicate that brain regions associated with abstract reasoning and cognitive control (including dorsolateral prefrontal cortex and anterior cingulate cortex) are recruited to resolve difficult personal moral dilemmas in which utilitarian values require "personal" moral violations, violations that have previously been associated with increased activity in emotion-related brain regions."

In other words, I think many people are aware (intuitively if not consciously) that when their brain makes moral decisions, the abstract-reasoning and cognitive-control portions of it (which I certainly think of as the "intellectual" as opposed to "emotional" parts) contribute ruthlessly utilitarian opinions that are willing to violate their personal moral codes. If you ask them to assess which form of morality best matches purely intellectual moral reasoning, it's not surprising that a lot of people point to ruthlessly utilitarian alignments. And because that's only part of how people make moral decisions (emotion contributes as well as logical reasoning), that answer may have absolutely nothing to do with how they ultimately make their own moral decisions.

gizmo33 said:
Even something as simple as an IQ Test is burdened, I think legitimately, with questions as to whether or not the thing it seeks to measure even makes sense as a universal. Even if you know what YOUR particular criterion is, casual use of these terms will confuse other people who do not hold your definition.

The articles are talking about structures of the brain that contribute known elements to a person's thinking. In fact, one article I've read (perhaps the LA Times article that I pointed out earlier) points out that people with damage to some of these brain structures don't make certain types of moral decisions quickly or effectively.

gizmo33 said:
They would only know which region (IF ANY) of my brain was active at the time I made this decision. I would not expect anyone with scientific training to interpret the results using IMO vague terms like "morality" and "reason". I'd leave that to journalists (whose job, admittedly, is to make things interesting).

Read the article from the academic journal that I provided a link to above if you want. If you don't like the use of terms like "moral judgement" or "moral violations" in the article, you can take that up with Dr. Greene at Princeton or one of the other authors.
 


John, your arguments really don't coincide with your examples. Biological illustrations do nothing but reinforce the idea that emotion and intellect are intrinsically tied to moral action. In fact, it can then be considered completely intellectual...unfortunately. Re-read your articles. It's not a matter of 'when' and 'how' so much as 'if'. That's why I just discard the biological argument--it muddles good philosophy up.
 

Wild Gazebo said:
John, your arguments really don't coincide with your examples.

Can you be more specific? If I could spot the problem you're talking about without help, I wouldn't be making it in the first place (assuming it really is a problem).

Wild Gazebo said:
Biological illustrations do nothing but reinforce the idea that emotion and intellect are intrinsically tied to moral action. In fact, it can then be considered completely intellectual...unfortunately.

I have been using the term "intellectual" as a contrast to "emotional". See Merriam-Webster's first definition of "intellectual":

"of or relating to the intellect or its use b : developed or chiefly guided by the intellect rather than by emotion or experience : RATIONAL c : requiring use of the intellect"

I suspect that's not how everyone is using it, but I also suspect some people other than myself in this thread are using it that way. That may very well be why some people assume the question is shorthand for "which alignment makes the most sense" while others assume it is asking which alignment best reflects cold, logical reasoning.

Wild Gazebo said:
Re-read your articles. It's not a matter of 'when' and 'how' so much as 'if'. That's why I just discard the biological argument--it muddles good philosophy up.

I don't understand what you are saying here.
 


By illustrating the biological process through which a moral dilemma is registered and potentially acted upon, you are creating a new definition of intellect encompassing a rational structure of biological needs. By the comparison of a biological function you seem to be trying to separate alignment and intellect. But in fact it only intrinsically connects the two. I know you are only wanting to separate emotion and intellect...but it really doesn't work that way. We cannot objectively ascribe emotion inherently to alignment. Even your examples demonstrate a relegation between the two, compromising one or the other, to attain a certain environmentally sound outcome. You see, your arguments are counter-productive. I wouldn't say anything normally...but I happen to agree with you. I feel intellect and alignment are completely separate--at the instant of cognition. The actual action or application of intellect I feel can be hindered by alignment, but not the pure state of intellect.

The 'when' and 'how' I am referring to is my assumption of why you brought your examples into play. You stretched for some sort of tangible substantiation of your ideas to justify your argument. You looked at when a certain part of the brain excites versus how other parts of the brain excite, and inferred from our vague knowledge about those parts of the brain a sense of the creation or activity during a moral dilemma. This description fails to illustrate whether an ethos caters to or hinders intellectual aptitude...it simply states it is part of a biological design. The 'if' is: what sort of relationship do alignment and intellect share?

I would argue that intellect, defined as the cognition of environment from and including self, can be present and equal across any culture, any ethos, and any religion. For intellect has no boundaries. The perception of a world beyond, of our environment, of subjective experience, of limitlessness begs a type of dominance over any sort of moral structure. Because any sort of moral structure must be learned or created. Now, the action upon our morals or intellect can easily be curbed by one or the other...but I don't consider that a fair comparison...or even all that interesting. And I think that is where (most of) the real misconceptions of these arguments stem...the difference between acting upon an intellectual goal and simply processing the possibilities internally.

Hope that clears things up.
 

Wild Gazebo said:
By illustrating the biological process through which a moral dilemma is registered and potentially acted upon, you are creating a new definition of intellect encompassing a rational structure of biological needs. By the comparison of a biological function you seem to be trying to separate alignment and intellect. But in fact it only intrinsically connects the two.

Yes, because that's what I think many of the people answering with Neutral or Evil alignments (on the Good-to-Evil scale) are trying to do, and I do think it makes some sense.

Wild Gazebo said:
I know you are only wanting to separate emotion and intellect...but it really doesn't work that way. We cannot objectively ascribe emotion inherently to alignment.

Actually, I think we can in many cases. For example, empathy seems to be fairly strongly tied to the Good and Evil axis. Sociopaths are characterized by deficient empathy, and highly empathic people are inclined to treat the welfare of others as equal to or more important than their own welfare. Empathy is generally experienced as an emotional response rather than a rational response, and it's the absence of that emotional and intuitive response that characterizes sociopaths.

Wild Gazebo said:
Even your examples demonstrate a relegation between the two, compromising one or the other, to attain a certain environmentally sound outcome. You see, your arguments are counter-productive.

Counter-productive in what way? I'm trying to understand the question and responses more than I'm trying to "win" anything.

Wild Gazebo said:
I wouldn't say anything normally...but I happen to agree with you. I feel intellect and alignment are completely separate--at the instant of cognition. The actual action or application of intellect I feel can be hindered by alignment, but not the pure state of intellect.

That's the perspective I took when I answered "none". Pure abstract intellect can be applied toward any moral end. But I can also understand why people associate intellect, in the absence of emotion, with certain alignments (often Neutral or Evil on the Good/Evil axis) and I don't think that has anything to do with their own personal alignment preference or which alignment they think is best.

Wild Gazebo said:
The 'when' and 'how' I am referring to is my assumption of why you brought your examples into play. You stretched for some sort of tangible substantiation of your ideas to justify your argument. You looked at when a certain part of the brain excites versus how other parts of the brain excite, and inferred from our vague knowledge about those parts of the brain a sense of the creation or activity during a moral dilemma.

Actually, what the article describes pretty much matches my own internal sense of there being discrete components (or internal arguments) from which a moral decision is derived. And I don't think our knowledge of how those parts of the brain work is as vague as you think it is. This article discusses how brain damage can illustrate the role played by each of those portions of the brain:

http://www.csbmb.princeton.edu/~jdgreene/Greene-WebPage_files/Greene-Haidt-TiCS-02.pdf

Wild Gazebo said:
This description fails to illustrate whether an ethos caters to or hinders intellectual aptitude...it simply states it is part of a biological design. The 'if' is: what sort of relationship do alignment and intellect share?

That's not the relationship I'm trying to illustrate. The relationship I'm considering is what sorts of moral decisions might be made purely by reason rather than emotion. Given the author's clarification, that's the question that was being asked.

This research suggests that the rational or logical component is often ruthlessly utilitarian rather than empathetic or romantic. Empathy is probably an important component in producing Good or Evil moral decisions, thus I think it makes a lot of sense that many people in this thread are assuming that purely intellectual decisions will be ruthlessly utilitarian and quite possibly Evil. And I think that may have no bearing on which alignment they personally favor the most.

Wild Gazebo said:
I would argue that intellect, defined as the cognition of environment from and including self, can be present and equal across any culture, any ethos, and any religion.

But that's not how the author of the question defines "intellect". FreeTheSlaves, in clarification, wrote:

"I would define intellectually superior in this context to mean which (single or groups of) alignment hold greater rational reasons, rather than emotive reasons, to warrant being adhered to over the other alignments. "Greater" means the sum of it's quantity and the weight of it's quality of (rational) reasons combined."

In other words, FreeTheSlaves is, in fact, looking to separate the rational component from the emotional component.

Wild Gazebo said:
For intellect has no boundaries. The perception of a world beyond, of our environment, of subjective experience, of limitlessness begs a type of dominance over any sort of moral structure. Because any sort of moral structure must be learned or created.

I don't think that's true. Both humans and chimpanzees predictably respond to certain moral tests in the same way (which contradicts what game theorists predict as rationally optimal behavior). That suggests to me that certain moral structures exist independently of being learned or created, even though their application may be flexible and the emotional response can be suppressed. Similarly, the regularity with which sociopaths are deficient in empathy suggests that being Good or Evil (in the D&D alignment sense) may hinge on a person's capacity to feel empathy for others. The article linked to above illustrates how various brain defects can produce predictable moral defects, so it's also not a stretch to imagine that a person's normal brain plays a role in producing a normal range of morality and that we have limits to our morality that we don't even notice.

Wild Gazebo said:
Now, the action upon our morals or intellect can easily be curbed by one or the other...but I don't consider that a fair comparison...or even all that interesting. And I think that is where (most of) the real misconceptions of these arguments stem...the difference between acting upon an intellectual goal and simply processing the possibilities internally.

I think that may be the problem at an abstract level. But I'm looking to explain the details. I think the research points to many of the rational possibilities a person internally processes being ruthlessly efficient and utilitarian, even if people often reject those possibilities for emotional reasons in practice. As such, I think many people experience the purely rational component of their moral decisions as ruthlessly utilitarian, and as a result they consider rational, emotionless decisions to be ruthlessly utilitarian.

When asked which alignment is intellectually superior, and reading that as the author intended (which alignment is supported by rational as opposed to emotional reasoning), I think many people are thinking of entirely rational moral decisions as ruthlessly utilitarian. Most people assign ruthlessly utilitarian decisions to alignments that are Evil or Neutral (read the individual responses to see this in action). As a result, when asked "which alignment is intellectually superior", they are responding with "which alignment is most ruthlessly utilitarian". And my argument is that those answers may have absolutely nothing to do with the alignment a person self-identifies with, admires, or would be classified as in real life.

Wild Gazebo said:
Hope that clears things up.

A bit. Thanks for the patience. :)
 

Lasher Dragon said:
Just think, with all the time and energy invested in this thread someone could've submitted an entire neighborhood into the "World's Largest City". :p

If I were really interested in being productive, I could be doing any number of things that could help earn me money rather than wasting time on an Internet message board.

As I've said elsewhere, all of these abstract alignment discussions are helping me sort out how I want to handle the issue when I revise my setting and the next time I run D&D.
 

Oops, I seem to have missed FreeTheSlaves' clarification. Sorry for the derailment...wait, no I'm not.

John, don't you think that there is a strong chance that herd survival instincts and moral quandaries elicit similar responses in the brain? One could even be the product of the other. Don't you see the potential for serious misconceptions? Isn't there a strong chance that there can be separate cognitive functions displaying very similar patterns? One could even argue that emotion is our poor attempt to quantify our instinctual motives. Ahhhh...I see.

You see alignment as a set of internal responses rather than a world view. I get it. We keep on crossing our planes but never in sync so that we can step across.

Nothin' to see here...keep on moving.
 

John Morrow said:
But when one asks which alignment is "intellectually superior", the answer many people will give is the answer that they think a purely intellectual response will usually produce.

Thanks for the Neuron article, it did make more sense than the other one (and IMO was far less ambitious in its conclusions than the news article). IMO this is getting beyond what's useful to RPGs, though, so I apologize if I'm not able to address everything you've said. I'll try to keep it to this -

DnD alignment covers a greater scope of decisions than what (IIRC) the Neuron article describes. Also, the article uses "cognitive" in a certain way that excludes what I was talking about earlier. My assertion was that people's cognitive processes and culture contribute to their emotional responses. So telling me that such and such a decision occurred in the emotional center of the brain does not rule out the (IMO strong) possibility that cognitive processes played a role in creating that emotional response. I think psychotherapy is based on this idea.

IIRC, your position can be summarized as: "emotional and rational parts of the brain both play a part in moral decisions." If so, then I misunderstood what you were saying earlier. I think the Neuron article supports this, and with some generalization you can apply it (qualified) to the DnD alignment system.
 

John Morrow said:
As a result, when asked "which alignment is intellectually superior", they are responding with "which alignment is most ruthlessly utilitarian". And my argument is that those answers may have absolutely nothing to do with the alignment a person self-identifies with, admires, or would be classified as in real life.

I agree that this could be true for some, but not across the board. IMO the "rational=utilitarian=evil" equation ignores the presence of rational constructs (i.e. philosophies) that support non-utilitarian goals (or even define what "utilitarian" means). If I believe that my tribe's welfare is more important than my own, then what is "utilitarian" to me is affected by that.

Also, if I believe in a universe ruled by a Lawful Good god, then it makes no sense to be Chaotic Evil, and vice versa.

If a person "reasons" their way to Neutral Evil being the most "intellectually superior" I still maintain that it says *something* about their world view/cognitive beliefs, even if that doesn't mean that person is Neutral Evil himself.
 
