D&D and the rising pandemic
NotAYakk said:

I'm not certain what you mean by logic here. If you mean "anything that isn't a reflex", well, sure.

I'm talking about logic as in correct lines of reasoning from explicitly assumed facts, following assumed-correct rules, resulting in a sound conclusion.

I know people who did PhDs in proof theory, where they try to make formal mathematical proofs [B]actually sound[/B], and it is hard. Even in the strange atmosphere of formal mathematics, basically [B]everyone uses shortcuts[/B] and cheats with heuristics and skipped steps.

Sometimes those skipped steps are valid, and sometimes they are not.

Bubbling up from there, you can "rationally" decide X or Y, but that "rational" decision is at [I]best[/I] [B]rationalized[/B]. That is, you can produce a "rational justification" for your decision, but [B]the decision wasn't made by logical deduction[/B].

And, at [I]best[/I], because your self-image is "I am rational", someone else presenting an argument in the language of rationality will generate cognitive dissonance, make you uncomfortable with your decision, and you might accept [B]their[/B] "rational justification" to change your behavior.

But you almost certainly [B]did not[/B] determine whether their argument was actually sound, because I've seen what it takes to determine whether a chain of logic is sound, and I don't believe you are doing that.

At best, you [I]rationalized[/I] that it was sound. You were convinced to build an argument to yourself, in the language of rationality, that justified your change of position.

Believing you are rational, and believing that rational argument can change your actions, means you are predisposed to listen to arguments framed as rational, and if you ignore them you may experience cognitive dissonance. So it isn't [B]nothing[/B]. But it doesn't mean "your actions are based on cold, clear logic". Almost all of your actions are based on heuristics and feelings; [B]at best[/B], those heuristics and feelings can be modified by certain kinds of rational-language arguments and self-discipline, and you can generate a plausible "rationalization" for them after the fact.

And even if you decide to turn your decisions in some area into an algorithm (say, take a bunch of resumes, score them against criteria as close to objective as you can manage, enter the results in a spreadsheet, and calculate points; see the sketch below), the criteria and calculation choices you make almost certainly aren't based on pure cold logic either. And if they are, then the basis for those in turn won't be.

Actually building a pure cold-logic chain to make even the simplest decision in the most constrained environment is insanely hard. And what you get out of it isn't "this is true", but a conditional claim that you have to use heuristics to map over to "pretty much true", like "assuming model X is consistent, and my association between the formal symbols and what I consider counting numbers is sound, then there are infinitely many primes".

So no, nobody is smart enough to do that for their actions. You can use the self-image of "rationality" to iterate on your heuristics and feelings, but the cost of actually making decisions and acting on pure logic is crazy.

---

This does mean it is possible to reason a person out of a position they did not come to by reason, [B]if they consider themselves to be reasonable[/B]. But it isn't easy, because every rational, reason-based argument you have ever made is full of holes; every such argument is. And the ones that aren't are so large that you can't hold them in your head all at once, [B]and[/B] they contain things that look like holes but aren't.

Rational arguments are arguments some people are predisposed to listen to. You can agree on the value of rational arguments [B]without[/B] believing that you yourself are rational.

The problem with believing you are rational is that it implies your actions are rational, which can sort of excuse you from being responsible for them. Engineer's disease is when you are an expert in one area and conclude "I'm smart, so my decisions must be right and rational" about anything you've convinced yourself of (often outside your area of expertise).

That is, the trap can look like this: as a rational person, your rational decision that racism isn't real (or whatever) is rational. And as a smart person, your rational belief is more right than other people's. People arguing the other side just aren't as smart and rational as you.

If you instead decide "I am not rational", you can still believe you should listen to rational arguments in order to improve yourself, while accepting that many of your actions aren't going to be rational. When someone comes at you with a poisonous rational argument, you can explicitly remind yourself "there is a danger in exposing my OODA loop, and this person might be attacking me through it", and try to avoid the trap.

For example, if someone gave me an unassailable rational argument to do something particularly horrible, I wouldn't judge it only on the soundness of the argument. I'd consider the possibility that my ability to understand the argument is imperfect (because I'm not a cold-logic machine, and as a human I suck at it), and consider the possibility that someone is weaponizing rational arguments against me.
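The resume-scoring algorithm mentioned in the post can be made concrete with a short sketch. This is purely illustrative and not taken from the post: the criteria names, the weights, and the 0-10 point scale are all invented for the example. It shows the point being made there: the calculation itself is perfectly mechanical, but every criterion and weight is a judgment call made by a human before any "logic" runs.

[CODE=python]
# Minimal sketch of "turning a hiring decision into an algorithm".
# The scoring below is mechanical and reproducible, but every entry in
# WEIGHTS and every criterion name is a subjective choice made up front;
# none of it falls out of pure logic.

# Hypothetical criteria and weights (invented for illustration).
WEIGHTS = {
    "years_experience": 2.0,   # why 2.0 and not 1.5? a judgment call
    "relevant_projects": 3.0,  # why weight projects over experience? ditto
    "referral": 5.0,           # is a referral really worth this much?
}

def score_resume(resume: dict) -> float:
    """Weighted sum of per-criterion points (each criterion scored 0-10)."""
    return sum(WEIGHTS[k] * resume.get(k, 0) for k in WEIGHTS)

if __name__ == "__main__":
    candidates = {
        "A": {"years_experience": 7, "relevant_projects": 2, "referral": 0},
        "B": {"years_experience": 2, "relevant_projects": 6, "referral": 10},
    }
    # The ranking is "objective" given the weights, but the weights
    # themselves were never justified by anything like a proof.
    for name, resume in sorted(candidates.items(),
                               key=lambda kv: score_resume(kv[1]),
                               reverse=True):
        print(f"{name}: {score_resume(resume):.1f}")
[/CODE]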
[QUOTE="NotAYakk, post: 8167173, member: 72555"] I'm not certain what you mean by logic here. If you mean "anything that isn't a reflex", well sure. I'm talking about logic, like correct lines of reason from explicitly assumed facts following assumed correct rules resulting in a sound conclusion. I know people who did PHDs in proof theory, where they try to make formal mathematical proofs to be [B]actually sound[/B], and it is hard. Even in the strange atmosphere of formal mathematics, basically [B]everyone uses shortcuts[/B] and cheats with heuristics and skipping steps. Sometimes those skipped steps are valid, and sometimes they are not. Bubbling up from there, you can "rationally" decide X or Y, but that "rational" decision is at [I]best[/I] [B]rationalized[/B]. Ie, you can produce a "rational justification" for your decision, [B]that decision wasn't made based off a logical deduction[/B]. And, at [I]best[/I], because your self image is "I am rational", someone else presenting an argument using the language of rationality will generate cognitive dissonance, and make you uncomfortable with your decision, and you might accept [B]their[/B] "rational justification" to change your behavior. But you almost certainly [B]did not[/B] actually determine if their argument was actually sound or not, because I've seen what it actually takes to determine if a chain of logic is sound, and I don't believe you are doing that. You [I]rationalized[/I] it was sound at best. You where convinced to build an argument to yourself in the language of rationality that justified your change of position. Believing you are rational, and believing that rational argument can change your actions, means that you are predisposed to listen to arguments framed as rational, and if you ignore them you may experience cognitive dissonance. So it isn't [B]nothing[/B]. But it doesn't mean "your actions are based off cold, clear logic". Almost all of your actions are based off heuristics and feelings; [B]at best[/B] those heuristics and feelings can be modified by certain kinds of rational-language arguments and self discipline, and you can generate a plausible "rationalization" for your heuristics and feelings after the fact. And even if you decided to turn your decision in an area into an algorithm -- say, take a bunch of resumes and score them using as close to objective criteria as you can, enter the results in a spreadsheet, and calculate points -- the criteria and calculation choices you make in turn aren't going to be based off pure cold logic almost certainly. And if they are, then the base for those in turn won't be based off pure cold logic. Actually building a pure cold logic chain to make even the simplest decision in the most constrained environment is insanely hard. And what you get out of it isn't "this is true", but a conditional claim which you have to use heuristics to map over to "pretty much true", like "assuming model X is consistent, and my association between the formal symbols and what I consider counting numbers is sound, then there is an infinite number of primes". So no, nobody is smart enough to do that for their actions. You can use the self image of "rationality" to iterate on your heuristics and feelings, but the cost of actually making decisions and acting on pure logic is crazy. --- This does mean it is possible to reason a person out of a position they did not come to by reason, [B]if they consider themselves to be reasonable[/B]. 
But it isn't easy, because every rational reason-based argument you have ever made is full of holes, because every such argument is full of holes. And the ones that aren't are so large that you can't hold them in your head all at once, [B]and[/B] contain things that aren't holes that look like holes. Rational arguments are arguments some people are predisposed to listen to. Agreeing on the value of Rational arguments can be done [B]without[/B] believing that you are rational. The problem with believing you are rational is that it means your actions are rational, which can sort of excuse you from being responsible for your actions. Engineers disease is when you are an expert in one area, and you hold "I'm smart, so my decisions must be right and rational" once you convince yourself of something (often outside of your area of expertise). Ie, the trap can look like this: As a rational person, your rational decision that racism isn't real (or whatever) is rational. And as a smart person, your rational belief is more right than others. People arguing the other side just aren't as smart and rational as you. If you instead decide "I am not rational", believe that you should listen to rational arguments in order to improve yourself, but accept that many of your actions aren't going to be rational. When someone comes at you with a poisonous rational argument, you can explicitly know "there is a danger in exposing my OODA loop, and this person might be attacking me with this", and try to avoid the trap. For example, if someone gave me an unassailable rational argument to do something particularly horrible, I wouldn't judge it only on the soundness of the argument. I'd consider the possibility my ability to understand the argument is imperfect (because I'm not a cold logic machine, and I as a human suck at it), and consider the possibility that someone is weaponizing rational arguments against me. [/QUOTE]