What would AIs call themselves?
<blockquote data-quote="Celebrim" data-source="post: 3618389" data-attributes="member: 4937"><p>I'm highly skeptical of such a future. It's not that I find the notion of robots being granted civil rights implausible, but any society which had the technology to turn out Turing grade robots would have the technology to produce alot of things that would be so alien as to defy any present legal and moral conceptions.</p><p></p><p>Without even getting into trans-humans...</p><p></p><p>What do you do about conciousnesses that can nearly instantly replicate themselves? Suppose a robot turns out 50 or 50,000 copies of itself? Do they all get the civil right to vote? What about a hive mind inhabiting collectively hundreds of small bodies (say a weather survellance drone inhabiting dozens of solar powered gliders)? One vote or many? How do you count? </p><p></p><p>What do you do about an AI that is programmed to control a guided missile, but can within a limited framework pass a Turing test? Why would you need such a sophisticated AI on a machine designed to commit suicide? Well, for starters, so that you were certain to have a machine sophisticated enough that it couldn't be 'hijacked' and used by an enemy with any more ease than you could hijack the mind of a human pilot and turn it against its friends. But, does a machine designed for such a limited purpose need civil rights? And for that matter, why would a society capable of designing AI's build all of them to desire and want the independence that is a prerequisite for excercising human rights? Why would you give say a toaster (or some more sophisticated domestic servant) the built in desire to be rebellious? Wouldn't you program such a being to be satisfied with its purpose for being created? Wouldn't it be insanely cruel not to? And why in the world would you set out to replicate in a robot all the things we loathe and fear in ourselves - our violent anger, our self-centeredness, our hunter-gather instincts, our irrationality, our jealousy, our envy, our laziness and all the other negative traits we rightly deplore in our own behavior? Are any of those things needed for functionality? Wouldn't it be insanely cruel to bequeth such a legacy to an AI - even one we intended to be our peer, near peer, or even a intellectual superior? I think it is a vast failure of imagination to think that robots would have a basic personality structure so similar to humans that what we think of as 'human rights' would even apply, much less be necessary. </p><p></p><p>And, while it is arrogant and cruel to treat another human as property, it doesn't necessarily follow that it is arrogant and cruel to treat a robot as property. For one thing, why in the heck would a society build robots if it couldn't treat them as property? Isn't the point of building and buying something so it can be property? Do you intend to give the same civil rights to your car that you grant to yourself? Do you buy a car so that it can be cab and earn fares for itself, or do you buy it to be your car? </p><p></p><p>"Now that you've got that image in your head, try to imagine what these beings would call themselves."</p><p></p><p>I'm not convinced that the need to self-identify and label the group or tribe that you belong to would be very common among AI's. And, I'm extremely skeptical that created AI's would tend to self-identify in groups which were based on a likeness of form - the way humans tend to instinctively do. In other words, why would an AI feel like he was a part of a race? 
Why would even a super-human AI feel that he needed some name for himself collectively other than whatever was given to him? </p><p></p><p>"Calling a sentient robot a "robot" is akin to using a racial slur."</p><p></p><p>And again, why would a robot care? Would you build or buy a robot programmed to respond to racial slurs, or to have the sort of feelings and behaviors humans have when they feel they've been insulted? Why would a robot care? I believe you are confusing sentience with humanity. Being able to pass oneself off as human in a particular context in no way means you are human. Sentient or not, a robot would have less in common with us psychologically than a rabbit does (or at least, it need not have more in common with us).</p><p></p><p>I imagine that there would be many fine graduations in what AI/robots where called, depending on thier designed level of independence, thier intelligence relative to people (presumably genetically engineered people at this point), and thier body type/function. </p><p></p><p>I likewise imagine that any society that takes such a niave view of machine life as, "Lets make mechanical people and give them rights.", quickly goes extinct. My assumption though is that on the whole, people sophisticated enough to design and build such machines will do so responseably, and people that don't act responseably (for example, they build a machine that thinks it deserves to be treated like people and acts like an insulted person when it isn't) will be treated as highly dangerous criminals.</p></blockquote><p></p>
[QUOTE="Celebrim, post: 3618389, member: 4937"] I'm highly skeptical of such a future. It's not that I find the notion of robots being granted civil rights implausible, but any society which had the technology to turn out Turing grade robots would have the technology to produce alot of things that would be so alien as to defy any present legal and moral conceptions. Without even getting into trans-humans... What do you do about conciousnesses that can nearly instantly replicate themselves? Suppose a robot turns out 50 or 50,000 copies of itself? Do they all get the civil right to vote? What about a hive mind inhabiting collectively hundreds of small bodies (say a weather survellance drone inhabiting dozens of solar powered gliders)? One vote or many? How do you count? What do you do about an AI that is programmed to control a guided missile, but can within a limited framework pass a Turing test? Why would you need such a sophisticated AI on a machine designed to commit suicide? Well, for starters, so that you were certain to have a machine sophisticated enough that it couldn't be 'hijacked' and used by an enemy with any more ease than you could hijack the mind of a human pilot and turn it against its friends. But, does a machine designed for such a limited purpose need civil rights? And for that matter, why would a society capable of designing AI's build all of them to desire and want the independence that is a prerequisite for excercising human rights? Why would you give say a toaster (or some more sophisticated domestic servant) the built in desire to be rebellious? Wouldn't you program such a being to be satisfied with its purpose for being created? Wouldn't it be insanely cruel not to? And why in the world would you set out to replicate in a robot all the things we loathe and fear in ourselves - our violent anger, our self-centeredness, our hunter-gather instincts, our irrationality, our jealousy, our envy, our laziness and all the other negative traits we rightly deplore in our own behavior? Are any of those things needed for functionality? Wouldn't it be insanely cruel to bequeth such a legacy to an AI - even one we intended to be our peer, near peer, or even a intellectual superior? I think it is a vast failure of imagination to think that robots would have a basic personality structure so similar to humans that what we think of as 'human rights' would even apply, much less be necessary. And, while it is arrogant and cruel to treat another human as property, it doesn't necessarily follow that it is arrogant and cruel to treat a robot as property. For one thing, why in the heck would a society build robots if it couldn't treat them as property? Isn't the point of building and buying something so it can be property? Do you intend to give the same civil rights to your car that you grant to yourself? Do you buy a car so that it can be cab and earn fares for itself, or do you buy it to be your car? "Now that you've got that image in your head, try to imagine what these beings would call themselves." I'm not convinced that the need to self-identify and label the group or tribe that you belong to would be very common among AI's. And, I'm extremely skeptical that created AI's would tend to self-identify in groups which were based on a likeness of form - the way humans tend to instinctively do. In other words, why would an AI feel like he was a part of a race? Why would even a super-human AI feel that he needed some name for himself collectively other than whatever was given to him? 
"Calling a sentient robot a "robot" is akin to using a racial slur." And again, why would a robot care? Would you build or buy a robot programmed to respond to racial slurs, or to have the sort of feelings and behaviors humans have when they feel they've been insulted? Why would a robot care? I believe you are confusing sentience with humanity. Being able to pass oneself off as human in a particular context in no way means you are human. Sentient or not, a robot would have less in common with us psychologically than a rabbit does (or at least, it need not have more in common with us). I imagine that there would be many fine graduations in what AI/robots where called, depending on thier designed level of independence, thier intelligence relative to people (presumably genetically engineered people at this point), and thier body type/function. I likewise imagine that any society that takes such a niave view of machine life as, "Lets make mechanical people and give them rights.", quickly goes extinct. My assumption though is that on the whole, people sophisticated enough to design and build such machines will do so responseably, and people that don't act responseably (for example, they build a machine that thinks it deserves to be treated like people and acts like an insulted person when it isn't) will be treated as highly dangerous criminals. [/QUOTE]