How Generative AIs work
giant.robot said:

A small clarification; you made this point, but I didn't find it clear in the description: when the GenAI chooses the next word, it calculates that word's probability in relation to the entire current sequence of words, which includes the prompt. As the sequence changes, the probabilities of candidate next tokens rise and fall. The possible tokens after the word "The" at the beginning of a sentence are myriad; there are far fewer good candidates after the sequence "The dog", and the field keeps narrowing as the sequence grows.

The training data is a mass of plain-language sentences. So if you ask an AI about hobbits, the training data near the word "hobbit" likely contained a bunch of statements about hobbits, and when the AI encounters the "hobbit" token, tokens that appeared near "hobbit" in the training data become likely to show up in the output.

The AI doesn't actually "know" anything about hobbits. It surfaces tokens that were near the word "hobbit" in the training data. If you poisoned the training data and used the word "hobbit" to describe refrigerators, an AI trained on that poisoned data would give you stories of Maytag the Hobbit from the Home Depot Shire.
[QUOTE="giant.robot, post: 9291056, member: 93119"] A small clarification, you made this point but I didn't find it clear in the description: when the GenAI is choosing the probability of the next word, it's calculating the probability of the next word in relation to the current sequence of words which includes the prompt. As the sequence changes the probability of certain next token increases or decreases. The probability of tokens after the word "The" at the beginning of a sentence are myriad. There's fewer good probabilities for the sequence "The dog" and so on as the sequence grows. The training data is a bunch of plain language sentences. So if you ask an AI about hobbits, the training data near the word hobbit likely had a bunch of statements about hobbits. So when the AI encounters the "hobbit" token the probability of other tokens from the training data near the word "hobbit" end up showing up in the output. The AI doesn't actually "know" anything about hobbits. It pulls up tokens that were near the word "hobbit" in the training data. If you poisoned the training data and used the word hobbit to describe refrigerators, when you asked an AI trained on this poisoned data you'd end up with stories of Maytag the Hobbit from the Home Depot Shire. [/QUOTE]