Has anyone seen this Wired article about using D&D to teach AIs?
<blockquote data-quote="Aaron L" data-source="post: 7933485" data-attributes="member: 926"><p>From what I understand, the Mad Libs comparison is about right: the algorithms can generate stretches of comprehensible narrative for a while, but they always ultimately degenerate into gibberish.</p><p></p><p>The problem with current AI research is that computer scientists have a bad tendency to think of the brain as just a computer and the mind as merely software running on it, when the reality is much, much more complicated than that, as any neuroscientist will tell you. They assume that throwing more processing power at the problem will eventually allow purely software-based sapient hard AI. But the brain, body, and mind form an indivisible unit; sensory feedback from nerves in the gut and other parts of the body plays an extremely important part in cognition and emotion. The mind isn't just an operating system running on a brain computer; consciousness is an emergent phenomenon that bubbles up out of the friction of all the various cooperating and conflicting brain structures, each trying to do its job and struggling with the others for priority. The mind arises from what is essentially the brain talking to itself as a way to make decisions and work out which processes should take priority at any given moment, and without all those structures alternately working alongside and struggling against one another, a mind just isn't going to emerge. The classic thought experiment of "How do I know I'm not just a brain in a jar being fed fake sensory data?" wouldn't actually work out: without a full body and all of its nerve endings constantly providing the massive flood of proper sensory experience, there would be no way you <em>couldn't</em> know something was <em>drastically</em> wrong.</p><p></p><p><em>Call of Cthulhu</em> actually addresses this situation with Mi-Go Brain Cylinders, and it does a pretty decent job of it. Any character unfortunate enough to have had the Fun Guys From Yuggoth remove their brain and put it in a jar loses more and more Sanity each day until they hit 0 SAN and go completely bonkers, their mind escaping the situation into delusional catatonia. But each prosthetic sensory apparatus added to the brain cylinder (cameras to replicate sight, microphones to replicate hearing, even placing the cylinder atop a mannequin torso to provide the small psychological relief of having some kind of "body" again) slows the ongoing Sanity loss, until enough prosthetic additions achieve a certain balanced stability. A character stuck in a Mi-Go Brain Cylinder would never <em>really</em> be sane again, but they would no longer keep degrading into dissociative catatonic oblivion, either.</p><p></p><p>The only way to create an actual sapient AI with human-like intelligence would be to build a full artificial brain with all the structures analogous to a human brain's, and then put it in an artificial body with all the structures analogous to a human body's, in order to generate the proper sensory input and nervous feedback... and even then you would probably need organic components to provide the required flexibility, plasticity, and malleability of structure (etched silicon circuit pathways can't rewrite themselves on the fly to create new neural paths.) At that point you're just reproducing a human being anyway by creating a biomechanical android; there's simply no way all of that can be simulated through software alone.</p><p></p><p>In short, the Singularity just ain't gonna happen. Humans <em>may</em> be able to create <em>some sort</em> of near-sapient intelligence through software modeling and emulation of brain structure someday in the far future, but it won't be human-like intelligence; it will be something completely new and different. To have human-like intelligence you need a human brain and a human body, with all the quirks and inefficiencies that go along with them, because it's those quirks and inefficiencies that create the friction that generates consciousness.</p></blockquote><p></p>