Community
General Tabletop Discussion
*Dungeons & Dragons
Would you buy an AI-generated Castle Greyhawk "by" Gary Gygax? Should you?
<blockquote data-quote="Retros_x" data-source="post: 9234889" data-attributes="member: 7033171"><p>They aren't good, at least not for me, because they aren't innovative and they lack smart ideas. They are normalized, generic. Human creativity doesn't run on mathematical functions and can't be mapped to any mathematical function. Which brings me to the second point.</p><p></p><p>How do I know that human creativity is not essentially a statistical model on steroids? Because humans invented those models. They all have an underlying target function. They have predictable outcomes, are designed for very specific goals, and cannot be used outside of them. They require tons of training data. We did not design human creativity. We cannot find underlying mathematical functions in the human brain, because we are nowhere near the complexity any approximation would require. We have moments of sudden inspiration in the shower. We can change the rules and think outside the box. We don't need even a tiny fraction of the training data AIs need.</p><p></p><p>Here is a famous blog article by Andrej Karpathy (a computer scientist specialising in deep learning) in which he explains what information the human brain can extract from looking at a simple picture for a few seconds: <a href="https://karpathy.github.io/2012/10/22/state-of-computer-vision/" target="_blank">https://karpathy.github.io/2012/10/22/state-of-computer-vision/</a></p><p>The article is from 2012, but it still holds up: image recognition AIs still can't do most of this, and we are still really, really far away.</p><p></p><p>That lack of creative thinking in AIs is true even for mathematics itself! 
Yes, computers are much better at calculating than humans, but if you gave a modern AI all the base rules and axioms of finite maths and basic arithmetic, there is not a single chance it would invent modern mathematics on its own, because it cannot abstract from those rules to invent new ones the way humans did. The same is true for any creative thinking, because it is not calculation and not based on calculation.</p><p></p><p>We know for a fact that computers can calculate much faster than any human being. If human creativity and thinking were a statistical model, modern computers would already have "calculated" it. But they haven't and never will, unless we find much, much more complex new models that we can't even imagine yet. The models modern AIs use today were invented in the 1950s, so I wouldn't hope for fast iteration there.</p><p></p><p>It's different, because we understand the principles of generative AI far more precisely than we understand the principles of human creativity. We understand them well enough to actually design and build these systems. We know the mathematical functions and statistical methodologies used in them. We know the precise formulas and equations, because we wrote them ourselves (again, in the 1950s). Human thinking, in comparison? We aren't even close. Yes, we know a lot about the surrounding "framework" of biochemical reactions needed for it, but we don't know anything about the logic behind them; we can't write any equation or function that maps human thinking.</p><p></p><p>That's like saying a computer is a physical process in an electrical machine. It's really, really basic, and it's also not true. Humans tend to compare themselves to the current dominant technology. It's the body-as-a-machine metaphor: before computers, telegraph networks or steam engines were used as metaphors for the human body. But they are nothing more than metaphors. 
That doesn't mean such metaphors have no uses; simplifications like that can help us understand certain processes. But that doesn't mean the lung actually is a smith's bellows, or the brain actually is an electrical machine.</p><p> <a href="https://www.theguardian.com/science/2020/feb/27/why-your-brain-is-not-a-computer-neuroscience-neural-networks-consciousness" target="_blank">https://www.theguardian.com/science/2020/feb/27/why-your-brain-is-not-a-computer-neuroscience-neural-networks-consciousness</a></p><p></p><p><a href="https://www.infoq.com/articles/brain-not-computer/#:~:text=Key%20Takeaways,not%20to%20a%20malfunctioning%20brain" target="_blank">https://www.infoq.com/articles/brain-not-computer/#:~:text=Key Takeaways,not to a malfunctioning brain</a>.</p><p></p><p></p><p>I never doubted that. But it's really bad at all the other tasks outside that certainty, and writing creative texts is one of them. Writers who produce soulless ad copy for search engine optimization, or artists who make stock art, can easily be replaced by generative AI. But I think we can all agree those are not quite the pinnacle of human creativity.</p><p></p><p>Believe me, the extent of all the books you've read and the effort your editors put into giving you a hard time? It's nothing compared to the amount of data an AI processes. And yet, unless you are completely untalented, I bet your work will be more evocative than any text ChatGPT could generate.</p><p></p><p></p><p>I wouldn't say never, but I would say "the chance it will happen in our lifetime is so close to 0 that you could almost say never". And I'm not saying that out of denialism; I say it because I am a computer scientist who actually learned a bit about what AI can do, how it works, and, more importantly, its limitations. I would never claim to be an expert, but I learned the foundational mathematical theory behind most modern ML algorithms and wrote some basic implementations. 
I've also read the opinions of actual experts on this. A short and sweet example from Google's AI team:</p><p></p><p>"While AI systems are nearing or outperforming human beings at increasingly complex tasks [...], they remain narrow and brittle, and <strong>lack true </strong>agency or <strong>creativity</strong>. [...], machines with human intelligence remain a long way off." (Bold mine, to tie back to the topic at hand.)</p><p></p><p><a href="https://ai.google/static/documents/exploring-6-myths.pdf" target="_blank">https://ai.google/static/documents/exploring-6-myths.pdf</a></p><p></p><p>If anyone wants an excellent layman's introduction to AI that covers the principles, possibilities, limitations, dangers, etc. without dabbling too much in technical detail and mathematics, I can highly recommend the book "Artificial Intelligence" by Melanie Mitchell.</p></blockquote><p></p>
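The "target function" point in the quoted post can be made concrete with a toy sketch (not from the post; the loss here is made up purely for illustration): a statistical learner optimizes an explicit, human-written objective via gradient descent, which is why its behavior is predictable and confined to that goal.

```python
# Minimal illustration: gradient descent on a hand-written target
# (loss) function f(w) = (w - 3)^2, whose minimum sits at w = 3.
# The learner can only ever move toward the objective we wrote down.
def grad_descent(lr=0.1, steps=100):
    w = 0.0
    for _ in range(steps):
        grad = 2 * (w - 3)  # derivative of (w - 3)^2 with respect to w
        w -= lr * grad      # step downhill along the gradient
    return w

print(grad_descent())  # converges toward 3.0
```

Every update is fully determined by the objective and the learning rate; nothing in the loop can redefine the objective itself, which is the post's contrast with human creativity.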