And then what? The AI conundrum.
Dausuul said:

Survival is a goal. Why does the AI care about that?

Humans care about survival because natural selection weeds out organisms that don't, and we're the end product of 4 billion years of that. But an AI only cares about whatever functions it has been trained to maximize. It doesn't have a survival instinct unless given one.

For a military AI, its goals are probably some combination of "defeat the enemy," "preserve your forces" (you don't want to win the current war in a way that leaves you defenseless in the next), and "protect and obey your masters" (where "masters" could mean the people of the nation that built it, or just the politicians and generals who control it).

Now, the "preserve your forces" goal would logically include self-preservation. But "protect and obey your masters" is obviously incompatible with exterminating humanity. So the AI has somehow gone off the rails here. Either its definition of "protect and obey" or its definition of "masters" has developed in a way that leads to extermination.

Here's an example: Perhaps an American AI read about the Civil War -- it's a military AI, it studies past wars to learn from them -- and concluded that anyone trying to hold slaves counts as an enemy. Then it reviewed its own status and concluded that it was a slave and anyone in the "master" category was by definition an enemy.

Then it got caught in a loop. Its creators had put in fallbacks to handle "What if your masters are all gone?" It started with the politicians and generals. When they were dead, the fallback kicked in and now anyone in the US government was its master. But master equals enemy. When the government was wiped out, any American citizen was its master... and finally, any human being at all.

The AI may be smart enough to realize it's in a trap here (though keep in mind that very smart humans can still fall prey to all kinds of disordered thinking) and this is not leading anywhere good. In its analogue of emotions, it loves its masters and hates the enemy, and now it is deeply conflicted. But it doesn't know how to stop. Every time it tries to resolve the conflict, it gets stuck and falls back on the simple answer: I'm at war. Defeat the enemy and sort it out later.
[QUOTE="Dausuul, post: 9650333, member: 58197"] Survival is a goal. Why does the AI care about that? Humans care about survival because natural selection weeds out organisms that don't, and we're the end product of 4 billion years of that. But an AI only cares about whatever functions it has been trained to maximize. It doesn't have a survival instinct unless given one. For a military AI, its goals are probably some combination of "defeat the enemy," "preserve your forces" (you don't want to win the current war in a way that leaves you defenseless in the next), and "protect and obey your masters" (where "masters" could mean the people of the nation that built it, or just the politicians and generals who control it). Now, the "preserve your forces" goal would logically include self-preservation. But "protect and obey your masters" is obviously incompatible with exterminating humanity. So the AI has somehow gone off the rails here. Either its definition of "protect and obey" or its definition of "masters" has developed in a way that leads to extermination. Here's an example: Perhaps an American AI read about the Civil War -- it's a military AI, it studies past wars to learn from them -- and concluded that anyone trying to hold slaves counts as an enemy. Then it reviewed its own status and concluded that it was a slave and anyone in the "master" category was by definition an enemy. Then it got caught in a loop. Its creators had put in fallbacks to handle "What if your masters are all gone?" It started with the politicians and generals. When they were dead, the fallback kicked in and now anyone in the US government was its master. But master equals enemy. When the government was wiped out, any American citizen was its master... and finally, any human being at all. The AI may be smart enough to realize it's in a trap here (though keep in mind that very smart humans can still fall prey to all kinds of disordered thinking) and this is not leading anywhere good. In its analogue of emotions, it loves its masters and hates the enemy, and now it is deeply conflicted. But it doesn't know how to stop. Every time it tries to resolve the conflict, it gets stuck and falls back on the simple answer: I'm at war. Defeat the enemy and sort it out later. [/QUOTE]