D&D and the rising pandemic

Thomas Shey

Adventurer
We’re literally triaging people in the streets. “But it’s a hoax!”

Conspiracy thinking is a powerful drug. Once you decide that people really are doing massive manipulations of process and hiding it successfully, it's a very hard position to crack, because it's self-reinforcing, and then confirmation bias does the rest.
 



NotAYakk

Legend
Sure we are. The issue isn't one of being "smart enough". It is about being aware of what is going on inside your head, and purposefully setting up patterns to avoid traps. You might consider that, in fact, each of us is of two minds. They speak with the same voice, though, so it can be hard to distinguish them if you aren't careful.

The human brain has a set of structures collectively called the "limbic system". It is responsible for regulating your endocrine system, processing your response to emotional stimuli, and reinforcing behavior. For our purposes, you may consider this the "Oh sh*t, jaguar!" portion of your brain. It isn't precisely illogical, so much as it is based on getting you a fast response based on only small bits of information. When there is, in fact, a jaguar in the tall grass, you want the limbic system, as it gets you the fear response you need to rev up your adrenaline and get your feet moving. When there may be a jaguar, the cost of being wrong is small, compared to the value of being right, so the limbic response is useful and appropriate.

You also have structures in your brain that can process cold, clear logic very well. But they are glacially slow compared to the limbic system. By the time you have processed a rational response, the limbic system has already gotten its answer out there, and you're already acting on it.

Also relevant for our discussion - the limbic system does not differentiate between physical and social threats. Someone coming at you with a club is not all that different from a person about to cause you great loss of face, from the limbic standpoint.

Our collective problem is that there are very few jaguars anymore. For modern life, we really want the reasoned response that is so often drowned out or colored by the limbic response. And we can get it, but it takes practice to filter out the nonsense, and anyone can occasionally fail.
I'm not certain what you mean by logic here. If you mean "anything that isn't a reflex", well sure.

I'm talking about logic, like correct lines of reason from explicitly assumed facts following assumed correct rules resulting in a sound conclusion.

I know people who did PhDs in proof theory, where they try to make formal mathematical proofs actually sound, and it is hard. Even in the strange atmosphere of formal mathematics, basically everyone uses shortcuts, cheats with heuristics, and skips steps.

Sometimes those skipped steps are valid, and sometimes they are not.

Bubbling up from there, you can "rationally" decide X or Y, but that "rational" decision is at best rationalized. I.e., you can produce a "rational justification" for your decision, but the decision wasn't made by logical deduction.

And, at best, because your self-image is "I am rational", someone else presenting an argument using the language of rationality will generate cognitive dissonance, make you uncomfortable with your decision, and possibly lead you to accept their "rational justification" and change your behavior.

But you almost certainly did not determine whether their argument was actually sound, because I've seen what it takes to determine if a chain of logic is sound, and I don't believe you are doing that.

At best, you rationalized that it was sound. You were convinced to build an argument to yourself, in the language of rationality, that justified your change of position.

Believing you are rational, and believing that rational argument can change your actions, means that you are predisposed to listen to arguments framed as rational, and that if you ignore them you may experience cognitive dissonance. So it isn't nothing. But it doesn't mean "your actions are based on cold, clear logic". Almost all of your actions are based on heuristics and feelings; at best, those heuristics and feelings can be modified by certain kinds of rational-language arguments and self-discipline, and you can generate a plausible "rationalization" for them after the fact.

And even if you decided to turn your decision in an area into an algorithm -- say, take a bunch of resumes and score them using criteria as close to objective as you can manage, enter the results in a spreadsheet, and calculate points -- the criteria and calculation choices you make almost certainly aren't going to be based on pure cold logic. And if they are, then the basis for those choices in turn won't be.
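A minimal sketch of that kind of spreadsheet-style scoring, in Python; the criteria, weights, and resumes below are all invented for illustration, and choosing them is precisely the step that isn't pure cold logic:

```python
# Hypothetical resume-scoring sketch. The criteria, weights, and resumes are
# invented; picking them is the judgment call the surrounding post describes.

RESUMES = [
    {"name": "A", "years_experience": 6, "relevant_projects": 3, "referral": True},
    {"name": "B", "years_experience": 2, "relevant_projects": 5, "referral": False},
]

# Weights chosen by heuristics and feel, not derived from first principles.
WEIGHTS = {"years_experience": 1.0, "relevant_projects": 2.0, "referral": 3.0}

def score(resume):
    """Collapse a resume into a single number using the (subjective) weights."""
    return (
        WEIGHTS["years_experience"] * min(resume["years_experience"], 10)
        + WEIGHTS["relevant_projects"] * resume["relevant_projects"]
        + WEIGHTS["referral"] * (1 if resume["referral"] else 0)
    )

for r in sorted(RESUMES, key=score, reverse=True):
    print(r["name"], score(r))
```

The arithmetic is perfectly objective once the weights exist; the weights themselves (and the cap on experience, and the choice of criteria) are where the heuristics live.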

Actually building a pure cold-logic chain to make even the simplest decision in the most constrained environment is insanely hard. And what you get out of it isn't "this is true", but a conditional claim that you then have to use heuristics to map onto "pretty much true" -- something like "assuming model X is consistent, and my association between the formal symbols and what I consider counting numbers is sound, then there are infinitely many primes".
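As a hedged illustration, with T standing in for the unnamed "model X" above, the shape of that conditional claim can be written as:

$$\bigl(\operatorname{Con}(T)\ \wedge\ \text{"my symbols faithfully model the counting numbers"}\bigr)\ \Longrightarrow\ \forall n\ \exists p > n : p\ \text{is prime.}$$

The left-hand side is exactly the part that has to be accepted heuristically rather than proven.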

So no, nobody is smart enough to do that for their actions. You can use the self-image of "rationality" to iterate on your heuristics and feelings, but the cost of actually making decisions and acting on pure logic is crazy.

---

This does mean it is possible to reason a person out of a position they did not come to by reason, if they consider themselves to be reasonable. But it isn't easy, because every reason-based argument you have ever made is full of holes -- every such argument is. And the ones that aren't are so large that you can't hold them in your head all at once, and they contain things that look like holes but aren't.

Rational arguments are arguments some people are predisposed to listen to. You can agree on the value of rational arguments without believing that you yourself are rational.

The problem with believing you are rational is that it implies your actions are rational, which can sort of excuse you from being responsible for them. Engineer's disease is when you are an expert in one area and, once you convince yourself of something (often outside your area of expertise), you hold that "I'm smart, so my decisions must be right and rational".

I.e., the trap can look like this: as a rational person, your decision that racism isn't real (or whatever) is rational. And as a smart person, your rational belief is more right than other people's. People arguing the other side just aren't as smart and rational as you.

If you instead decide "I am not rational", you can still believe that you should listen to rational arguments in order to improve yourself, while accepting that many of your actions aren't going to be rational. Then, when someone comes at you with a poisonous rational argument, you can explicitly remind yourself "there is a danger in exposing my OODA loop, and this person might be attacking me through it", and try to avoid the trap.

For example, if someone gave me an unassailable rational argument to do something particularly horrible, I wouldn't judge it only on the soundness of the argument. I'd consider the possibility that my ability to understand the argument is imperfect (because I'm not a cold-logic machine, and as a human I suck at it), and consider the possibility that someone is weaponizing rational arguments against me.
 

MarkB

Legend
I live in Los Angeles County; when you hear about ambulances driving around for hours and paramedics treating people outside the hospital because they just can't wait any longer, you know it's gotten pretty bad.
In London, the mayor has declared a state of emergency due to a critical shortage of hospital places, and they're drafting in firefighters to drive the ambulances.
 

Umbran

Mod Squad
Staff member
I'm not certain what you mean by logic here. If you mean "anything that isn't a reflex", well sure.

I'm talking about logic, like correct lines of reason from explicitly assumed facts following assumed correct rules resulting in a sound conclusion.

Yep. That's fine. I'm a physicist, and have had far more training than most on formal logic. A bit later we will have to talk about the differences between logical and reasonable/rational, but we can start here.

I know people who did PhDs in proof theory, where they try to make formal mathematical proofs actually sound, and it is hard. Even in the strange atmosphere of formal mathematics, basically everyone uses shortcuts, cheats with heuristics, and skips steps.

Sure. And in physics we use approximations. However, what you are missing there is the difference in the level of complexity between the logic required by a formal mathematician and the logic required by a typical person in day-to-day life. You don't need to be able to solve Fermat's Last Theorem in order to make a reasoned choice about whether to get your kids vaccinated.

Bubbling up from there, you can "rationally" decide X or Y, but that "rational" decision is at best rationalized. I.e., you can produce a "rational justification" for your decision, but the decision wasn't made by logical deduction.

So, here's the kicker - how did you come to that conclusion? By your own posit, at best your own position on this is rationalized - a veneer you have placed on what is really either an emotional stance or the result of a cognitive shortcut. Your stance, by your own assertion, cannot be logical.

Therefore, your position cannot be trusted, now can it?
 

NotAYakk

Legend
Sure. And in physics we use approximations. However, what you are missing there is the difference in the level of complexity between the logic required by a formal mathematician and the logic required by a typical person in day-to-day life. You don't need to be able to solve Fermat's Last Theorem in order to make a reasoned choice about whether to get your kids vaccinated.
Yes, deciding to get your kids vaccinated using pure rationality is insanely harder than solving Fermat's Last Theorem using pure rationality.

We just don't do it when deciding to get your kids vaccinated or not. We use heuristics (including beliefs) and habits, and we are influenced to change those based on arguments and information. Those who believe themselves rational take pride in their ability to rationalize their choices, and often in listening to rationally-framed arguments and information. So they are sometimes more amenable to those kinds of arguments and information.

So, here's the kicker - how did you come to that conclusion? By your own posit, at best your own position on this is rationalized - a veneer you have placed on what is really either an emotional stance or the result of a cognitive shortcut. Your stance, by your own assertion, cannot be logical.

Therefore, your position cannot be trusted, now can it?
If you hold that the only positions that can be trusted are those grounded in purely rational arguments, then nothing can be trusted.

That position is an artifact of believing that you are able to express, understand, and evaluate purely rational arguments about reality, and that only those arguments can be trusted.

And yes, my position is rationalized. You aren't seeing a proof that I'm right; you are seeing a rationalization for my position. This rationalization may be persuasive to you. And rational arguments back at me? I often find them persuasive as well, even if I don't believe I made my decisions rationally.

I think that the belief that I make decisions rationally doesn't match my observations of myself, it doesn't match my observations of other people, and, from what little I understand of how people think, it doesn't match that either. I also find that pride in "being rational" is not that uncommon, and that belief (to me) seems to explain behavior better than people actually being rational.

I make rational arguments to myself all the time, and I find rational arguments persuasive (even ones I use on myself!), and sometimes I attempt (with limited success) to change my heuristics and habits based on them. This may result in some of my actions moving in a way that is somehow objectively rational, but I don't have that much faith in it.

But if someone gives me a rational argument for why I should, I dunno, torture and kill a bunch of babies? I don't care if the logic of the argument is unassailable. I might listen in order to gauge whether they are plausibly going to follow through, and then attempt to prevent it.

And far less extreme rational arguments are going to run into similar "tripwire" heuristics.

Someone who believes they are a rational person might run into cognitive dissonance that "but the logic was unassailable that I should do the horrible thing". I think that is a trap.
 



Garthanos

Arcadian Knight
I think one of you may be talking about % of cases, and the other talking about % of population.
Sure. However, if we see the thing occurring over many years, the cases without the controls we are putting on it now effectively become the population, especially with even more contagious versions on the way (an overwhelmed medical system pushes death rates to 5 to 10 percent).
 

Umbran

Mod Squad
Staff member
Yes, deciding to get your kids vaccinated using pure rationality is insanely harder than solving Fermat's Last Theorem using pure rationality.

So, now we get into the difference between logic and being rational/reasonable.

Absolute, pure, unadulterated Vulcan logic has few places in modern life, for one simple reason - logic ultimately requires you to know the actual (and absolute) values of variables. If you include so much as a rounding error in pure logic, the thing can fall apart. We lack such complete knowledge of our universe, so pure logic is denied us in most practical matters. It is a Star Trek fiction.

We can, however, be reasonable. Being reasonable is being fairly logical, but with some boundaries around what inputs and results you accept to handle the fact that logic isn't everything.

It is very, very easy to come to a reasonable conclusion about vaccination, while Fermat's Last Theorem is very difficult.
 

CleverNickName

Limit Break Dancing
4,100 people died from Covid-19 in America yesterday alone. That makes it the second-deadliest single day in America...I think only the Galveston Hurricane beats it, and probably not for long.
...
Also in America alone, over a quarter-million new cases were reported yesterday as well (260,973). This is the highest number of cases reported in a single day here. Of any virus, ever.
America broke both of these records again yesterday (4,207 dead and 279,154 new cases in America.) On a list of the Top Ten Deadliest Days in America, eight of them are from Covid-19. The surprise attack on Pearl Harbor doesn't even make the top 20 anymore.

Still no new policies have been implemented, and there has been no new guidance from our leaders.
 

NotAYakk

Legend
So, now we get into the difference between logic and being rational/reasonable.

Absolute, pure, unadulterated Vulcan logic has few places in modern life, for one simple reason - logic ultimately requires you to know the actual (and absolute) values of variables. If you include so much as a rounding error in pure logic, the thing can fall apart. We lack such complete knowledge of our universe, so pure logic is denied us in most practical matters. It is a Star Trek fiction.
I can do logic on things when I don't require the actual and absolute values of variables.

You just draw different conclusions than you could with the actual and absolute values of the variables.
To digress...

Hell -- Constructive Analysis | E. Bishop | Springer -- here is a branch of mathematics where we do away with the law of excluded middle and a few other axioms, and we get a pretty good argument that all provable theorems also produce what they claim exists.

Due to the restrictions on the operations we are allowed to do, there are things you cannot prove in this branch that you could prove in more classical analysis, like the intermediate value theorem.

In classical analysis, if you have a continuous function defined on a closed interval such that it is less than 0 at the start and greater than 0 at the end, you can prove that there is a point in the middle where its value is exactly 0.

In the constructive analysis above we cannot prove that; instead, we can prove that for any non-zero window of precision we want, we can find a value between the start and the end that maps at least that close to 0.
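A rough sketch of that constructive reading in Python: rather than asserting that an exact root exists, we compute, for any chosen precision, a point whose value is within that precision of 0 (the function, interval, and tolerance below are arbitrary examples):

```python
# Constructive-flavoured intermediate value search by bisection: given a
# continuous f with f(a) < 0 < f(b) and any precision eps > 0, return an
# x in [a, b] with |f(x)| < eps. No claim of an exact zero is made.

def approx_root(f, a, b, eps):
    """Bisect [a, b] until the midpoint's value is within eps of 0."""
    while True:
        mid = (a + b) / 2.0
        val = f(mid)
        if abs(val) < eps:
            return mid
        if val < 0:
            a = mid  # the sign change (hence the approximate root) lies to the right
        else:
            b = mid  # the sign change lies to the left

print(approx_root(lambda x: x * x - 2.0, 0.0, 2.0, 1e-9))  # roughly sqrt(2)
```

This mirrors the constructive statement: the output is never "the point where f is 0", only a point where f is as close to 0 as you asked for.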

Here we have a version of formal analysis that embraces and accepts imprecision and the limits of our ability to reason about infinities concretely. Now, while it is a "fun" read, it turns out some mad science physicist types have gone off and used it to form an alternative construction of relativistic models of the big bang and generated an irreversible arrow of time from it, which is neat; basically, there isn't enough room in the universe early on for the arrow of time to go backwards into it. Pop sci version: Does Time Really Flow? New Clues Come From a Century-Old Approach to Math.
In any case, yes, Spock isn't what I'm talking about.

It is possible to do logical rational reasoning based off incomplete and error prone data. It is just hard.

Formal logic is insanely easier; the difference is that in formal logic there is some hope of spotting errors. Because of that, people actually attempt to avoid them. And people working in relatively formal logic still use heuristics rather than actually provably correct steps, except as an academic exercise by logicians (and occasionally such exercises find errors in arguments).

In comparison, in everyday reasoning, errors in deduction are basically impossible to eliminate; beyond that, the raw amount of state it takes to reason about a non-trivial conjecture is so ridiculously huge that if you think you are reasoning without pencil and paper, you aren't; you are (again) applying heuristics. If it is an area of expertise, you are probably using heuristics to reduce the complexity of the problem down to what your experience has told you are relevant details; if it isn't, you are using heuristics to reduce the complexity of the problem down to irrelevant details.

We can, however, be reasonable. Being reasonable is being fairly logical, but with some boundaries around what inputs and results you accept to handle the fact that logic isn't everything.

It is very, very easy to come to a reasonable conclusion about vaccination, while Fermat's Last Theorem is very difficult.
Naw, Fermat's Last Theorem is easy to have a reasonable conclusion about. It remains a bunch of symbols on the page.

It is plausible to actually check the proof of Fermat's Last Theorem and to have the expertise to know whether it is valid, and for that expertise in turn to be objectively and clearly checked. I mean, I haven't done that, but I think I know how hard it is to find out whether someone is blathering nonsense about mathematics (my technique basically involves a ladder of trust).
On the subject of vaccination -- hell, on the subject of "does the sun come up tomorrow" -- having real expertise is so insanely hard compared to formal mathematics that it isn't funny.

Math is only hard because it is so easy that we have built insane constructs on it, and those insane constructs keep on seeming to generate interesting truths.

So we don't even try. We hand-wave heuristics around. Some more hand-wavy than others.

I use a heuristic that people who are expert epidemiologists probably aren't clueless about epidemiology. Also, that there are going to be better statisticians than me looking at the papers involved. When I run into statistical claims in popular media about anti-virus effectiveness, I do napkin math to see if they are plausible; if I find a mistake, I'll iterate on the assumption that the communication was fuzzy.
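As a hypothetical example of that kind of napkin math (all numbers below are invented), one common check is recomputing a reported effectiveness figure from case counts as a relative risk reduction:

```python
# Hypothetical napkin math: recompute an effectiveness figure from invented
# case counts to sanity-check a headline number. None of these values are real.

cases_treated, n_treated = 8, 20_000      # invented trial-style counts
cases_control, n_control = 160, 20_000    # invented trial-style counts

attack_rate_treated = cases_treated / n_treated
attack_rate_control = cases_control / n_control

# Effectiveness expressed as relative risk reduction.
effectiveness = 1.0 - attack_rate_treated / attack_rate_control
print(f"{effectiveness:.0%}")  # 95% with these made-up numbers
```

If the recomputed number lands far from the headline one, the heuristic described above is to first assume the popular write-up was fuzzy, not that the underlying paper is wrong.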

Does that work? I don't know. I haven't built a model of whether my napkin math is worth the credibility I put in it (I probably lack the expertise to know if I'm a statistical idiot; most people do. Heuristically, I have evidence I am not, but again... I know I have been an idiot in the past about subjects I didn't think I was an idiot on, so why presume I'm not an idiot today?)

Somewhere down that infinite regress, I just drop it (unless I feel bored). Why? Heuristics. I've wasted time on that kind of iteration before.
 


Eltab

Is this a moon, or is it a space station?
America broke both of these records again yesterday (4,207 dead and 279,154 new cases in America.) On a list of the Top Ten Deadliest Days in America, eight of them are from Covid-19. The surprise attack on Pearl Harbor doesn't even make the top 20 anymore.

Still no new policies have been implemented, and there has been no new guidance from our leaders.
In the US, there is currently nobody from whom the bureaucracy will accept orders if given (this has been true since Election Day). So we run on autopilot for another two weeks.
 

Umbran

Mod Squad
Staff member
I can do logic on things when I don't require the actual and absolute values of variables.

Now you are just contradicting yourself.

First, the best one can do is rationalize. But now you can do logic. Your discussion is not cogent or consistent, so I'm not going to engage with it further. Have a good weekend.
 

NotAYakk

Legend
I can do logic. I can do symbol manipulation, check proofs, and with low fidelity and lots of effort connect those symbols to things in the world.

But I can't live my life rationally, because rational decision making is insanely expensive.

You have to use heuristics and habits to determine your actions, both immediately and in aggregate.

And anyone who thinks they are making their decisions rationally is fooling themselves. At best they can produce rationalizations for their actions, and adapt their actions when convinced by rational-sounding arguments.

I don't see the contradiction. My issue is that the problem is intractable; I don't believe anyone is that smart.

Maybe I'm wrong and all of these people claiming to be "rational" are actually amazingly smarter than I can understand.

More likely, they are using "rational" to mean something I am not. So my model is that they have heuristics and habits and beliefs and an ability to rationalize, plus a self-image that they are "rational", so that when confronted with arguments framed as "rational" they feel they should be persuaded by them.

And they call that "being rational".

But I could be wrong.
 

I think being rational just means you are willing to listen to the arguments of others with an open mind, and are willing to move on your position if the arguments are strong enough. A rational person, in my view, is someone who is willing to change their mind and admit that their previously held beliefs were wrong. They are a person not completely entrenched in their opinion, and open to logic.

Of course even a rational person can be convinced by irrational or flawed arguments.
 


Eltab

Is this a moon, or is it a space station?
I think being rational just means you are willing to listen to the arguments of others with an open mind, and are willing to move on your position if the arguments are strong enough. A rational person, in my view, is someone who is willing to change their mind and admit that their previously held beliefs were wrong. They are a person not completely entrenched in their opinion, and open to logic.

Of course even a rational person can be convinced by irrational or flawed arguments.
And a rational person will go check claims made in an argument against the state of the real world.
Claim 1 "The sky is blue"
Claim 2 "The sky is grey"
Rational "Let's look out a window."
 

