ChatGPT lies then gaslights reporter with fake transcript
[QUOTE="Kromanjon, post: 9772357, member: 20680"]
But it isn't click-bait. It's about a real thing that happened as part of the journalist's workday. He found it relevant to report on, and he did so because that is his job.

I agree with you here, I really do, and people acting on wrong information is the very problem being reported on. The journalist brought to light that LLMs can give false information so that viewers would know to be cautious. Calling it click-bait attributes an intention to the report that there is no evidence for.

Also relevant: many educational programmes nowadays are offered online and use online information as part of their curriculum, which shows that correct information is out there to be found as well, just as libraries hold the books used in education.

And this gets to the root of the reported problem. The AI companies sell their programs as this great thing while knowing there are flaws that might, or will, cause problems for the end user. Adding a little note that the answers might not be accurate doesn't fix that as long as they keep touting AI as the great problem solver. I would go as far as to claim that the reporter in the clip used the AI the way it was meant to be used, and it still produced a false result, showing that the tool was in fact at fault.

As for the term 'slop' being an issue, I'll leave that to [USER=1]@Morrus[/USER] to respond to, but I think he has already shown several examples of AI producing what could be considered 'slop' and explained why this kind of "hallucination" leads to further 'slop' being created. In the video, for instance, it had written up an entire podcast that never took place. The journalist managed to make content out of it by reporting the problem; otherwise it would just have been slop.
[/QUOTE]