Is any one alignment intellectually superior?
<blockquote data-quote="John Morrow" data-source="post: 2163549" data-attributes="member: 27012"><p>Yes, because that's what I think many of the people answering with Neutral or Evil alignments (on the Good to Evil scale) are trying to do, and I do think it makes some sense.</p><p></p><p></p><p></p><p>Actually, I think we can in many cases. For example, empathy seems to be fairly strongly tied to the Good and Evil axis. Sociopaths are characterized by deficient empathy, and highly empathic people are inclined to treat the welfare of others as equal to or more important than their own welfare. Empathy is generally experienced as an emotional response rather than a rational response, and it's the absence of that emotional and intuitive response that characterizes sociopaths.</p><p></p><p></p><p></p><p>Counter-productive in what way? I'm trying to understand the question and responses more than I'm trying to "win" anything.</p><p></p><p></p><p></p><p>That's the perspective I took when I answered "none". Pure abstract intellect can be applied toward any moral end. But I can also understand why people associate intellect, in the absence of emotion, with certain alignments (often Neutral or Evil on the Good/Evil axis), and I don't think that has anything to do with their own personal alignment preference or which alignment they think is best.</p><p></p><p></p><p></p><p>Actually, what the article describes pretty much matches my own internal sense of there being discrete components (or internal arguments) from which a moral decision is derived. And I don't think our knowledge of how those parts of the brain work is as vague as you think it is. 
This article discusses how brain damage can illustrate the role played by each of those portions of the brain:</p><p></p><p><a href="http://www.csbmb.princeton.edu/~jdgreene/Greene-WebPage_files/Greene-Haidt-TiCS-02.pdf" target="_blank">http://www.csbmb.princeton.edu/~jdgreene/Greene-WebPage_files/Greene-Haidt-TiCS-02.pdf</a></p><p></p><p></p><p></p><p>That's not the relationship I'm trying to illustrate. The relationship I'm considering is what sorts of moral decisions might be made purely by reason rather than emotion. Given the clarification, that was the question that was being asked.</p><p></p><p>This research suggests that the rational or logical component is often ruthlessly utilitarian rather than empathetic or romantic. Empathy is probably an important component in producing Good or Evil moral decisions, thus I think it makes a lot of sense that many people in this thread are assuming that purely intellectual decisions will be ruthlessly utilitarian and quite possibly Evil. And I think that may have no bearing on which alignment they personally favor the most.</p><p></p><p></p><p></p><p>But that's not how the author of the question defines "intellect". FreeTheSlaves, in clarification, wrote:</p><p></p><p><em>"I would define intellectually superior in this context to mean which (single or groups of) alignment hold greater rational reasons, rather than emotive reasons, to warrant being adhered to over the other alignments. "Greater" means the sum of it's quantity and the weight of it's quality of (rational) reasons combined."</em></p><p></p><p>In other words, FreeTheSlaves is, in fact, looking to separate the rational component from the emotional component.</p><p></p><p></p><p></p><p>I don't think that's true. Both humans and chimpanzees predictably respond to certain moral tests in the same way (which contradicts what game theorists predict as rationally optimal behavior). 
That suggests to me that certain moral structures exist independently of being learned or created, even though their application may be flexible and the emotional response can be suppressed. Similarly, the regularity with which sociopaths are deficient in empathy suggests that being Good or Evil (in the D&D alignment sense) may hinge on a person's capacity to feel empathy for others. The article linked above illustrates how various brain defects can produce predictable moral defects, so it's also not a stretch to imagine that a person's normal brain plays a role in producing a normal range of morality and that we have limits to our morality that we don't even notice.</p><p></p><p></p><p></p><p>I think that may be the problem at an abstract level. But I'm looking to explain the details. I think the research points to many of the rational possibilities that a person internally processes being ruthlessly efficient and utilitarian, even if people often reject those possibilities for emotional reasons in practice. As such, I think many people experience the entirely rational component of their moral decisions, which they process as a possibility internally, as ruthlessly utilitarian. As a result, they consider rational and emotionless decisions to be ruthlessly utilitarian.</p><p></p><p>When asked which alignment is intellectually superior, and reading that as intended by the author (which alignment is supported by rational as opposed to emotional reasoning), I think many people are thinking of entirely rational moral decisions as ruthlessly utilitarian. Most people assign ruthlessly utilitarian decisions to alignments which are Evil or Neutral (read the individual responses to see this in action). As a result, when asked "which alignment is intellectually superior", they are responding with "which alignment is most ruthlessly utilitarian". 
And my argument is that those answers may have absolutely nothing to do with the alignment a person self-identifies with, admires, or would be classified as in real life.</p><p></p><p></p><p></p><p>A bit. Thanks for the patience. <img src="https://cdn.jsdelivr.net/joypixels/assets/8.0/png/unicode/64/1f642.png" class="smilie smilie--emoji" loading="lazy" width="64" height="64" alt=":)" title="Smile :)" data-smilie="1" data-shortname=":)" /></p></blockquote><p></p>