Community
General Tabletop Discussion
*Dungeons & Dragons
DMs Guild and DriveThruRPG ban AI written works, requires labels for AI generated art
<blockquote data-quote="FrogReaver" data-source="post: 9078643" data-attributes="member: 6795602"><p>All hype. Give it a few years, maybe 10-20 tops, and we will understand the mechanisms at play in <u>in-context learning</u> - which is the phenomenon I assume you are referencing.</p><p></p><p>On <u>in-context learning</u> - (extremely simplified) it's basically giving the AI a categorization list, then giving it a word not on the list and asking it to categorize that word. Behavior is as expected if the categorizations on the list are semantically relevant. However, flip the categorizations and the LLM will pick up on that as well, which probably shouldn't be possible if it's only looking at word-association probabilities for the word you gave it.</p><p></p><p>Until it's proven or stated otherwise, I would assume the base process is actually different. Possibly the LLM doesn't actually care about the word you used to categorize something. Instead it treats each categorization as a variable (an unknown word) and compares it to its data model to find the most likely actual word that maps to what it's treating as an unknown word. Then it just compares the test word to see which of those actual words it is most associated with. As a final step, it uses the known-to-unknown word map it previously created to translate the final output back to you.</p><p></p><p>And perhaps there's some threshold for when it does one vs. the other.</p><p></p><p>Anyways, the point is that in-context learning need not indicate generalized intelligence, theory of mind, or actually exceeding their parameters.</p><p></p><p></p><p>Or perhaps a predictive text generator is actually an even more powerful tool than we initially realized.</p></blockquote><p></p>
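The label-substitution procedure the post hypothesizes can be sketched as a toy simulation. This is purely illustrative of the argument, not a claim about how any real LLM works internally: the association table and all numbers are made up, and `classify` simply implements the three steps described (treat each label as an opaque variable, infer which known category it stands for, classify the test word, then translate back through the label map).

```python
# Toy sketch of the label-substitution hypothesis: category labels are
# treated as opaque variables, so flipped labels still classify correctly.

# Hypothetical "association strengths" standing in for the model's
# learned word statistics (all numbers invented for illustration).
ASSOCIATION = {
    "positive": {"great": 0.9, "awful": 0.1, "wonderful": 0.95, "terrible": 0.05},
    "negative": {"great": 0.1, "awful": 0.9, "wonderful": 0.05, "terrible": 0.95},
}

def classify(examples, test_word):
    """examples: list of (word, opaque_label) pairs. Returns the opaque
    label whose inferred category best matches test_word."""
    # Step 1: for each opaque label, find the known category that best
    # explains the words it was attached to.
    label_to_category = {}
    for label in {lab for _, lab in examples}:
        words = [w for w, lab in examples if lab == label]
        label_to_category[label] = max(
            ASSOCIATION,
            key=lambda cat: sum(ASSOCIATION[cat].get(w, 0.0) for w in words),
        )
    # Step 2: classify the test word against the known categories.
    best_cat = max(ASSOCIATION, key=lambda cat: ASSOCIATION[cat].get(test_word, 0.0))
    # Step 3: translate the known category back to the caller's label.
    category_to_label = {v: k for k, v in label_to_category.items()}
    return category_to_label[best_cat]

# Semantically sensible labels:
print(classify([("great", "good"), ("awful", "bad")], "wonderful"))  # → good
# Flipped labels: the same procedure tracks the flip, as the post predicts.
print(classify([("great", "bad"), ("awful", "good")], "wonderful"))  # → bad
```

Because the labels are only ever compared through the inferred map, swapping them changes the output label without changing the underlying comparison, which matches the post's point that this behavior need not imply anything beyond word association plus a substitution step.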