Community
General Tabletop Discussion
*Geek Talk & Media
Sarah Silverman leads class-action lawsuit against ChatGPT creator
[QUOTE="DaedalusX51, post: 9090646, member: 96233"]
You are obviously very knowledgeable on this subject, and I am only speaking as a software developer. This might be my lack of expertise in the field showing, but some of these things seem intuitive despite my lack of experience. (Dunning-Kruger, maybe?)

Shouldn't its responses remain the same until additional training data is added? I would assume we don't want these models incorporating everything they experience into their memory, or else they will become garbage really fast.

After adding each new small set of training data, we should really be re-verifying functionality. I understand this would take massive teams and years to accomplish, but what we would gain in knowledge and understanding of evolving systems and emergent behavior would be tremendous.

(y)

Sorry if my analogy was confusing. I meant that we would learn how intelligence emerges from increasing complexity. I was going off your explanation of it being similar to the neurons in a human brain. Trying to understand how these models work when they're already so complex is like trying to understand our brain and consciousness now. It is much easier to understand a system as we build it.

Yeah, cloud computing is just using other people's hardware. This area of research is actually an amazing opportunity for the human race, and I think it's pretty foolish that all these companies are trying to create so many separate products. Most will likely spend more than they will ever make in return.

Yes, that would potentially speed things up quite a bit.

I'm aware that it's not a one-to-one translation. It is different yet similar, and we have a lot to learn from it. I'm also familiar with Penrose and Hameroff's Orch OR hypothesis. While interesting, there doesn't seem to be any evidence of quantum phenomena in the microtubules of our brains. I think consciousness arises from sufficient complexity. In addition, the development of language allowed us to store and manipulate abstract concepts.

I am definitely not in that group. While their intelligence is unlike ours, it is a kind of intelligence. I just wish we took the opportunity to better understand why they do what they do, and to ensure that we are training them properly.

The human race does not truly want to know how consciousness is formed. We are too attached to our religious beliefs, and so hamstrung by our fear of death, that we will come up with all sorts of reasons why it has to be something special. If we realized that the only difference between us and other life forms was that we created language to let us handle abstract concepts, we would look back at ourselves as monsters for all we have done.

I'm not against AI, but if we can't understand why it does what it does, it is basically the blind leading the blind. I don't want us all to fall into a pit.
[/QUOTE]