GenCon 2023 - D&D Rules Revision panel
Clint_L said (post 9088492):

You identify exactly how they are getting useful data about what people like and don't like: they get general data from the rankings, which can be done very quickly, and then specific feedback from the written comments. That's how you "TELL THEM what [you] like and what [you] don't like."

Your point about the written comments taking more time than you could spare is a feature, not a flaw, of this type of methodology. They don't really want written feedback from folks who aren't deeply invested in a particular issue; they want it from folks who feel strongly enough to find the time. For example, on the last survey I whipped through most of the responses, giving just a ranking and no comment. On a few specific points (the monk's basic design, the Moon Druid subclass, etc.), however, I gave significant, detailed written feedback.

This is a very standard design for a survey intended to 1) gauge overall reactions at a broad scale, 2) identify specific pressure points, and 3) generate more specific feedback and suggestions on those pressure points. My workplace, for instance, runs a very similarly constructed employee survey every year and uses it to identify management priorities; this is a widespread methodology. WotC didn't just throw something together at the last minute. This is a meticulously constructed survey that is very much up to current industry standards, they have clearly invested substantial resources in the process, and it is obviously being conducted by industry professionals.

Also, WotC has masses of data that we lack, which allows them (or, more accurately, the professionals conducting the survey) to analyze the responses in aggregate and identify what the rankings mean *in context*. This is how they have established that a proposal falling below the 70% satisfaction level is not currently worth pursuing for this project. That doesn't mean the idea is thrown in the trash: Ardlings, for example, fell well below that threshold, yet WotC has stated that they intend to keep working with the basic idea.

I think most of us lack the frame of reference or information to make an informed criticism of how this survey is being conducted or analyzed. We tend to reach for what we know and understand, which leads to a lot of false assumptions (e.g., that the satisfaction numbers are equivalent to letter grades in school, or that the onerous nature of written feedback is a flaw).
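For anyone curious what that kind of aggregation might look like in practice, here's a minimal Python sketch. Everything in it is an assumption on my part: the sample responses, the five-point scale, and the counting rule that "satisfied" plus "very satisfied" responses make up the satisfaction score. WotC hasn't published its actual methodology; the only figure taken from the post above is the 70% cutoff.

```python
# Illustrative sketch only -- not WotC's actual pipeline.
# Assumes a five-point scale and a 70% satisfaction cutoff,
# counting "satisfied" and "very satisfied" as positive responses.
from collections import Counter

THRESHOLD = 0.70
SATISFIED = {"satisfied", "very satisfied"}

# Hypothetical, made-up responses per playtest item.
responses = {
    "Monk base class": ["dissatisfied", "satisfied", "neutral", "very satisfied"],
    "Moon Druid subclass": ["very satisfied", "satisfied", "satisfied"],
    "Ardling": ["dissatisfied", "neutral", "satisfied", "dissatisfied"],
}

for item, ratings in responses.items():
    counts = Counter(ratings)
    # Fraction of respondents at "satisfied" or better.
    score = sum(counts[r] for r in SATISFIED) / len(ratings)
    verdict = "worth pursuing" if score >= THRESHOLD else "rework or shelve"
    print(f"{item}: {score:.0%} satisfied -> {verdict}")
```

The real analysis would, of course, weight and segment responses across a far larger sample, which is exactly the context the post says the rest of us lack.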
[QUOTE="Clint_L, post: 9088492, member: 7035894"] You identify how they are getting useful data about what people like and don't like - they are getting general data from the ranking, which can be done very quickly, and then specific feedback from written comments. That's how you "TELL THEM what [you] like and what [you] don't like." Your point about the written comments taking time that you couldn't spare is a feature, not a flaw, of this type of methodology. They don't really want written feedback from folks who aren't deeply invested in that particular issue. They want written feedback from folks who feel strongly enough to find the time. For example, on the last survey I whipped through most of the responses, just giving a ranking and no comment. However, on a few specific points (monk basic design, Moon druid subclass, etc.) I gave significant, detailed written feedback. This is a very standard design for a survey intended to 1) gauge overall reactions at a broad scale, 2) identify specific pressure points, 3) generate more specific feedback and suggestions on those pressure points. My work, for instance, does a very similarly constructed employee survey every year and uses it to identify management priorities - this is widespread methodology. WotC didn't just throw something together at the last minute; this is meticulously constructed survey that is very much up to current industry standards, and they have clearly invested substantial resources into this process. It is obviously being conducted by industry professionals. Also, WotC has masses of data that we lack, which allows them (or more accurately, the professionals conducting the survey) to analyze the responses in aggregate and identify what the rankings, etc. mean[B] in context[/B]. This is how they have established that a proposal that has fallen below the 70% satisfaction level is not currently worth pursuing for this project. That doesn't mean the idea is thrown in the trash - Ardlings, for example, fell well below that threshold, yet WotC has stated that they intend to keep working with the basic idea. I think most in us lack the frame of reference or information to really make an informed criticism of how this survey is being conducted or analyzed. We tend to reach for what we know and understand, which is leading to a lot of false assumptions (e.g. that the satisfaction numbers are equatable to letter grades in school, or that the onerous nature of written feedback is a flaw). [/QUOTE]
Insert quotes…
Verification
Post reply
Community
General Tabletop Discussion
*Dungeons & Dragons
GenCon 2023 - D&D Rules Revision panel
Top