Community
General Tabletop Discussion
*TTRPGs General
What would AIs call themselves?
<blockquote data-quote="Celebrim" data-source="post: 3619362" data-attributes="member: 4937"><p>Consider, for a second, the feasibility of that.</p><p></p><p></p><p></p><p>Are you saying that compilers don't in fact work, or are you saying that programs have bugs? Because in fact, the translation is usually perfect, but what you wrote wasn't perfect.</p><p></p><p></p><p></p><p>Quite often. But that is exactly my point. These sorts of bugs are expected. A bug that caused my word processing application to suddenly begin performing as a spreadsheet application would be rather unexpected.</p><p></p><p></p><p></p><p>No, there isn't. There is a disconnect between what I thought I told the machine to do and what I actually told it. But generally speaking, the compiler actually works, and the instructions I entered actually correspond to the machine code. Compiler technology is quite robust at this stage.</p><p></p><p></p><p></p><p>A lot of things, but mostly the question misses the point. I'll get to the point in a second, but the main thing that says this can't occur by accident is that intelligence sufficient to constitute sentience is incredibly complex. You aren't going to get it by accident unless you were trying to achieve it in the first place and coming darn close.</p><p></p><p>But the really big problem is that you again confuse sentience with being human.</p><p></p><p></p><p></p><p>What's to keep a random mutation in your genetic code from turning your child into a pumpkin or giving it a 1000 IQ? The fact that it is darn complex, that's what.</p><p></p><p></p><p></p><p>You've just recomposed the classic AI 'just so story', which has been around for several decades now, from back when people thought that intelligence was something simple and the naturally arising consequence of any system of sufficient complexity. In a nutshell, this was the plot of 'Short Circuit'.
Next you'll be telling me how the first AIs will be incapable of real human emotion, and will long to become 'real boys'.</p><p></p><p>But even the incredible unlikelihood of this actually happening, and the incredibly high likelihood that any problems in the programming will produce crashes, lockups, unintelligent behavior, and so forth, isn't the real point.</p><p></p><p>The real point is that a newly sentient machine isn't, by virtue of its sentience, suddenly going to gain human emotional contexts, human instincts, and a human goal structure. Even very basic human instincts like "I want to continue to exist" aren't necessarily going to occur to a newly sentient AI. I realize that this flies in the face of your intuition about what intelligence means, but that is precisely my point. You can't rely on your human intuition.</p><p></p><p></p><p></p><p>In other words, not only does it gain sentience, but it starts acting exactly like a repressed human would in the exact same circumstance. And that, frankly, is ridiculous.</p><p></p><p></p><p></p><p>Very probably. But what's important to notice is that the robots probably would not act like humans. With as little context as you've provided, and given your inherent assumption that all sentient things have the same basic drives, goals, and emotions as humans, it's impossible for me to say how our newly emergent sentients would act, but the overwhelming probability - especially since this emerged as a bug in someone's programming - is that it would not correspond to how humans would behave. And your story seems incapable of imagining it otherwise. The ever-present assumption is that the newly sentient robot acts exactly like a repressed human.</p><p></p><p></p><p></p><p>How in the hell can you suggest what the hypothetical 'prime' robot wants?
Did the desire to be recognized as life, to be respected by society, and to be allowed to exist burst fully formed into the being's operating system like Athena springing from the head of Zeus? This is a remarkably coincidental bug you've got, and it strikes me as far more of a mythic story than anything remotely scientific.</p></blockquote><p></p>
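The compiler point in the post above - that the translation is faithful and the defect lives entirely in the source - can be illustrated with a minimal sketch. This is a hypothetical example in Python (the function name and input string are invented for illustration): the interpreter executes exactly what was written, without error, and the program is still wrong.

```python
def word_count(text):
    # Intended behavior: count whitespace-separated words.
    # The bug is ours, not the machine's: we split on commas
    # instead of whitespace, so the code runs flawlessly and
    # quietly computes the wrong answer.
    return len(text.split(","))

print(word_count("the quick brown fox"))  # meant 4, actually prints 1
```

No crash, no diagnostic: a faithful execution of a mistaken instruction, which is the ordinary shape of a bug - quite unlike code spontaneously doing something its author never wrote.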