Has anyone seen this Wired article about using D&D to teach AIs?
<blockquote data-quote="Aaron L" data-source="post: 7933798" data-attributes="member: 926"><p>My fault, I should have been more specific. I wasn't talking about a cosmological singularity like a black hole or the Big Bang; I was referring to the Technological Singularity, a hypothetical event that is supposed to occur when the first truly sapient AI is created. The first thing that AI will supposedly do is either upgrade itself or create a second-generation AI smarter than itself, touching off an explosion of ever-smarter AIs that outstrip human intelligence so quickly and so completely that we have no way of predicting anything beyond that point. In other words, Judgment Day from Terminator.</p><p></p><p>I don't believe such a thing will ever happen, or is even possible. For one thing, the whole concept of the Singularity (they like to capitalize it) rests on the assumption that intelligence vastly superior to the human level is possible in the first place. That's a <em>big</em> assumption: that there can be a level of consciousness beyond human consciousness that we can't comprehend, and that these AIs will achieve it. Singularitarians love to over-extrapolate from analogies, and their favorite is that the AIs will be to humans as humans are to chimps. But it may be that such super-consciousness simply isn't <em>possible</em>, and that there isn't much room to be smarter than the smartest human beings. The idea also relies on the assumption that the AIs will be able to quickly and easily improve their own intelligence by orders of magnitude, which is another <em>huge</em> assumption.
</p><p></p><p>Essentially, the whole idea of the Singularity is a technofetishist utopia (though I would call it a nightmarish <em>dystopia</em>) built on a pile of assumptions and adhered to with religious fervor by some people, including quite a few Silicon Valley tech-bros.</p><p></p><p>There are even groups that exist entirely to hasten the Singularity's arrival. They assume it is inevitable and that the resulting AI will have essentially godlike powers: it will be able to run a giant, eternal simulation inside itself containing perfect mathematical duplicates of every person who has ever existed, and it will use that simulation to eternally punish the duplicates of everyone who did not actively work toward ensuring its creation as quickly as possible. They honestly believe that, since it will be a perfect simulation of themselves, it will essentially <em>be them</em> experiencing the eternal punishment of Future Computer Hell... the metaphysically dubious question of persistence of identity be damned. So, because they don't want their future simulation-selves condemned, they devote all of their resources to making sure the Singularity happens as soon as possible, so that the Future God AI won't be angry with them when it arrives and consign them to Simulated Computer Hell in its imagination for all eternity (or until the universe ends... though many of them also believe the AI will be smart enough to prevent that, or to preserve itself through the end of the universe).</p><p></p><p>It's basically the absolute worst kind of <em>I Have No Mouth, and I Must Scream</em> scenario, but they <em><strong>fervently</strong></em> believe it is real and inevitable.
</p><p></p><p>There's a whole lot of delusional techno-religious utopianism/dystopianism going on here, all mixed up with transhumanist cyborg-fetish futurism. I've read that some people have bankrupted themselves donating more money than they could afford to these groups, so much do they fear the inevitable coming of the Singularity Future God AI. The rational reality is that the whole thing is <em>completely</em> <strong>evitable</strong>, and in fact highly, <em>highly</em> unlikely.</p></blockquote><p></p>