Community
General Tabletop Discussion
*TTRPGs General
Consequences of playing "EVIL" races
<blockquote data-quote="Celebrim" data-source="post: 7927341" data-attributes="member: 4937"><p>This statement was worth treating separately.</p><p></p><p>At the risk of jumping to a conclusion, I think you are using the word "intelligent" as a synonym for human. That is to say, you are supposing that anything with this quality "intelligence" must be more or less human, because humans are more or less the only intelligent thing you can think of, and so you assume that every intelligent thing will think and behave in a human fashion.</p><p></p><p>This is a very common science fiction trope and a very natural conclusion, but I think if you spend a few more minutes thinking about it, you'll realize it is a bit ridiculous. To point you toward that conclusion, let's consider an example of this human intuition that is obviously ridiculous. In almost every science fiction show featuring AI, if the AI encounters a beautiful human female, it will fall in love with her and attempt to romance her. Now, leaving aside that we could probably think of a few intelligence failure modes in a created AI that would produce this behavior, the idea that it would always and inevitably arise, especially in 'naturally arising' AIs, is an obvious failure of imagination. Most people, when they first notice this failure of imagination, hit upon the idea that the AI wouldn't naturally be attracted to a human female because they don't look alike, and so it wouldn't necessarily find the human female beautiful and attractive.</p><p></p><p>But that is yet more failure of imagination. The ridiculous underlying assumption is that an AI would experience a sexual impulse, or even a desire for companionship, at all. Feelings of arousal, desires for intimacy, and even loneliness are all modes of behavior that humans have to fulfill specific purposes. They are part of our 'design', as it were; whether you believe it is behavior by design or evolved fitness-increasing behavior doesn't matter. The point is that there is no particular reason the AI would have those emotional needs or emotional contexts, much less that they would show through the emotive displays (frowning, tears, crossed arms, and so on) by which humans communicate these states to other humans (which is also 'designed' behavior).</p><p></p><p>So no, Wall-E, upon seeing a curvaceous robot, would not evidence feelings of attraction for 'her'. And even if we imagined Wall-E experiencing some bizarre intelligence failure mode arising from centuries of isolation and semi-random inputs, there is absolutely NO REASON Eva would ever respond to the now hopelessly dysfunctional Wall-E, nor is there any reason for Eva to learn, or want to learn, Wall-E's emotional context. That entire subplot depends on the natural but entirely wrong assumption that intelligence implies humanity.</p><p></p><p>Ultimately, so does your argument about the dragon.</p><p></p><p>To understand why, let's first discuss yet another completely stupid trope that results from the first leap of imagination humans make with respect to intelligent machines: that they have no emotions. This is ever so slightly more imaginative than assuming they have the full human emotional context, but not much. The problem here is a failure to understand what emotion is. Humans are typically taught to think of emotion as something different from, and separable from, reason. This is a very natural result of the experience of being human, and in particular of the way the human brain is wired up. In the human brain's wiring, it often feels like reason and emotion are competing with each other for the attention of the human consciousness. But all of that has to be remembered as yet another aspect of being human, not something general to all intelligence.</p><p></p><p>In fact, I put forward that it is impossible to be intelligent and not have emotions. It's just that those emotions do not in any fashion have to be like human emotions. Each intelligent thing is likely to have its own distinctive emotional context. To understand what I'm saying here, you have to look again at that human wiring and try to understand why humans experience emotion and what would happen if you took emotion out of the reasoning process; in other words, what the role of emotion is in all forms of reasoning. Humans have a massively parallel processing mode that is the result of attempting to compute with chemical signals in a highly energy-efficient process while still having high throughput. As such, humans separate the channels for 'logical' and 'emotional' processing and run them in parallel. The logical process addresses the question, "What am I experiencing?", and can tell you the difference between food, a lion, and your mom. The emotional process addresses the question, "What does this experience mean?" In other words, emotion is the part of reasoning that is goal-driven. Whatever goals an intelligence has will set its emotional contexts. The emotions tell the being what things mean, and how they should be valued.</p><p></p><p>People often mistake "emotions" for the emotional experience, or "feelings". This is a natural aspect of being human, since "feelings" are the reinforcing feedback loop of the emotional processing context; they are how the system reinforces the goal-driven behavior. You can, within some limits, take control of your emotions as a rational being, but there are limits to that, and what you are actually doing is probably just recalibrating after realizing that some feedback loop is getting in the way of your own goals.</p><p></p><p>The point is that Data and Spock are actually experiencing emotions all the time. What they are not doing is making the emotional displays, or acting under any compulsion to make emotional displays, in order to communicate emotional information to the other primates watching them. The emotions they have are not entirely human emotions, and they can't be communicated easily to anyone, any more than it's easy for you to communicate feelings to someone who doesn't experience your own. But they are certainly there, and we know they are there because these characters can assign meaning to things and make value judgments. No matter what Spock may tell you, those value judgments are not wholly rational. We don't even live in a universe where you can make a mathematical system that doesn't depend on unprovable axioms, much less one that can make value judgments wholly based on logic. There has to be something that makes you do your homework even though you know the heat death of the universe is inevitable in a scant few billion years.</p><p></p><p>As for the dragon: if he has a set of values congruent with destructiveness, if he cannot and has no desire to change those values, and if every built-in emotional feedback loop makes him wholly miserable when he tries and happy when he doesn't, then it doesn't matter how intelligent the dragon is; he's still going to behave according to his very dragon-ish nature.</p><p></p><p>And that gets us to intelligence. Intelligence isn't what most people think it is either. But this essay is long enough already, so let me just say there is no such thing as "hard intelligence" or "general intelligence". 
(Or if there is, we have no examples of it.)</p></blockquote><p></p>
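The two-channel model described in the quoted post (a 'logical' channel answering "what am I experiencing?" and an 'emotional' channel answering "what does it mean?") can be sketched as a toy program. Everything below is an illustrative invention for this thread, not a model from cognitive science: the `Agent` class, the goal weights, and the stimuli are all hypothetical.

```python
# Toy sketch of the two-channel idea: identification is shared
# across agents, but appraisal depends on each agent's goals.
from dataclasses import dataclass


@dataclass
class Agent:
    # Goal weights stand in for the agent's "emotional context":
    # the same stimulus means different things to agents with
    # different goals.
    goal_weights: dict

    def identify(self, stimulus: dict) -> str:
        """Logical channel: 'What am I experiencing?'"""
        return stimulus["kind"]

    def appraise(self, stimulus: dict) -> float:
        """Emotional channel: 'What does this experience mean to me?'"""
        return sum(
            self.goal_weights.get(feature, 0.0) * value
            for feature, value in stimulus["features"].items()
        )


# A human-ish agent and a dragon-ish agent, with made-up goals.
human = Agent(goal_weights={"nutrition": 1.0, "danger": -2.0})
dragon = Agent(goal_weights={"treasure": 2.0, "danger": -0.5})

lion = {"kind": "lion", "features": {"danger": 1.0}}

# Both channels run on the same input: identification agrees,
# appraisal diverges because the goals diverge.
what_it_is = human.identify(lion)        # "lion" for both agents
human_meaning = human.appraise(lion)     # strongly negative
dragon_meaning = dragon.appraise(lion)   # mildly negative
```

The point the sketch tries to capture is that "emotion" here is nothing mystical: it is the valuation step, and an agent whose weights are fixed (the dragon) will keep appraising the world the same way no matter how good its identification channel gets.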