The AI Red Scare is only harming artists and needs to stop.
Celebrim said:

I don't think it's hijacking at all. I think the philosophical implications here are central to understanding this, and to why analogies about meat and sausage are so much self-serving self-deception. It's important to address what this actually is, rather than living inside an unreflected-upon mental construct that has basically nothing to do with reality.

If we can't understand the thing - spoilers: we can't understand it - we'd at least better be trying to.

I mean, what is creativity? Back in the 1960s we thought intelligence was something that spontaneously arose out of complexity. People - smart people - thought that if they just made a machine that could play chess, it would spontaneously evolve into something intelligent. In retrospect it was the spontaneous-generation theory of life applied to intelligence. It was like those 19th-century scientists looking at cells for the first time and going, "Well, it has a membrane and there's a black dot in the middle. It can't be that complicated." Now we are at the point where I think we're getting close to being able to define intelligence and know what it is, and the answer would knock those 1960s researchers out of their chairs - and, if we're honest, should be knocking us out of our chairs right now.

Turns out "intelligence" is a word like "magic" that we use for something we don't understand. And it's probably not - actually, certainly not - "a thing", but likely a bunch of things, only some of which we "thinking apes" actually have.

It's a good question. I don't know. I had always considered creativity an error-handling routine. My theory was that it was based on faulty memory, and was what we did to fill in the gaps in those memories. But I don't know, and I don't know how accessible that routine is to most people - even smart people. There is clearly a divergence here, as we are discovering that things like the ability to actually see images in one's mind's eye - which I had just assumed was ubiquitous - are also algorithms running on processors of widely varying power across the human population.

I think that's the inevitable conclusion. The more you study parrots and baboons, the more obvious it is that we have processors and processes they don't have, but it could be that human language production is more like a parrot and autocomplete than we really want to admit. As an autistic person missing certain apparently common processors of my own, this explains a massive amount about what I thought was a bizarrely missing channel in human speech production, one where I could never figure out where the "normals" were getting the clues to fill it in. Namely, it always bothered me that sentences weren't prefixed by their purpose; I couldn't understand how people fully understood language without that context, yet speakers acted as if they were understood and listeners received the speech as if it had been understood. I always thought people filled the missing channel with intonation and body language, which is part of the processing complex I'm bad at - I'm on the low end of that spectrum. But the more human speech I processed online, the more I realized that was wrong. Humans weren't even aware they were missing that channel, and that was the cause of so much human miscommunication. Worse, much human speech was obviously being produced without the speaker understanding why they spoke as they did or what they were trying to accomplish. I think the answer is that we're all closer to parrots doing glorified autocomplete than we'd like to think: people string together words based on prior patterns they've heard, and they keep doing that until they achieve some desired response. So many conversations just boil down to repeated queries of "Are you on my side?", where the words don't even matter. Listen for them sometimes.

Now that's not entirely fair. There is clearly something we have that parrots and gorillas don't. But I wonder what the functional range on that extra thing is. It's not always clear to me that people understand what they actually said, much less why they said it.

Do you notice how many words only survive in speech as part of certain phrases? I do it too; even though my working vocabulary is in the 38,000-word range, I'm not sure now that I don't just have a bigger word salad.

I have so many interactions online these days where I could google every single sentence the person says and find it out there, and not just one example. And even where there is some word variation, it's no bigger than the range of variation a chatbot uses to say "yes" creatively.

Or, to what extent is ChatGPT - being trained on what we say - just us made out of wires? It's not us yet; I think we can agree on that. But how much of us is it really? Would 9 more processing complexes be enough? 99?
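The "glorified autocomplete" idea above - stringing words together from prior patterns until some response is achieved - is, in its simplest form, a Markov chain over word sequences. As a minimal illustrative sketch (the function names here are my own, not from any real chatbot):

```python
import random
from collections import defaultdict

def train_bigrams(text):
    """Build a table mapping each word to the list of words seen after it."""
    words = text.split()
    table = defaultdict(list)
    for a, b in zip(words, words[1:]):
        table[a].append(b)
    return table

def autocomplete(table, start, length=10):
    """Generate text by repeatedly picking a word that has followed the current one."""
    out = [start]
    for _ in range(length):
        followers = table.get(out[-1])
        if not followers:
            break  # no known continuation; stop "speaking"
        out.append(random.choice(followers))
    return " ".join(out)
```

A bigram model like this knows nothing about meaning or purpose; it only echoes which words have historically followed which. The argument in the post is that more of everyday human speech resembles this than we care to admit, just with vastly larger pattern tables.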