PCGen 5.11.12 beta released
<blockquote data-quote="thpr" data-source="post: 3439752" data-attributes="member: 48911"><p>The BoD recognizes that it needs to address the roadmap, and that will happen in a future meeting. It was deferred until we went beta in the 5.11/5.12 cycle in order to avoid distraction; however, I do not believe a date has been set for that discussion.</p><p></p><p>While there is no official roadmap, let me offer my **personal opinion** on next steps, and specifically on the editors.</p><p></p><p>I believe in heavily tested code. While the unit tests are not publicly available, my RPG-MapGen code has approximately 95% unit test code coverage (though why I keep plugging a half-dead, half-completed project is beyond me). Thus, with respect to PCGen, I believe that a robust editor is a demonstration of a robust I/O system. This is because the non-UI code that supports the editor can be used as part of a test strategy for the application (specifically the I/O subsystem).</p><p></p><p>For those who are familiar with how PCGen imports files, there are text (LST) files that contain Tokens that define the different objects (e.g. "SA:Monkey Hold" is an "SA" token that will add a Special Ability called "Monkey Hold" to the item defined on the line in which the token appears).</p><p></p><p>To me, part of ensuring a token is correctly parsed is the ability to do a "round-robin" test to ensure that one can perform:</p><p>LST file -> internal structure -> LST file copy -> internal structure copy</p><p></p><p>...in order to ensure that the import from LST and output to LST are internally consistent (by testing equality of the copies against the originals). "Proving" correct behavior is much easier if one can assume the behavior is internally consistent.</p><p></p><p>Toward this end, for anyone who has looked at the PCGen CDOM code branch, you'll note that most of the tokens have an unparse() method that allows them to be converted from the internal data structure back out to an LST Token. 
Those that do not yet have an unparse() method are incomplete - those are future development efforts. Many of those that are complete already possess near-complete unit test code coverage for what can be achieved with parse() and unparse() alone.</p><p></p><p>This isn't a complete solution; a quick counter-example to relying solely on unparse() to recreate LST files is the following pair of lines, which could appear in two separate Feat.lst files:</p><p>FeatFoo <tab> SA:Monkey Hold</p><p>FeatFoo.MOD <tab> SA:.CLEAR</p><p></p><p>This should make one realize that another layer on top of unparse() needs to track the source of each token and the deltas between LST files. This has not yet been built. I believe the design of this is in the CDOM architecture proposal document; however, I need to double-check that and may have to release a new version in order to capture that design.</p><p></p><p>So to spell out what I consider to be the relevant steps of the token changes to build a new I/O system and editor:</p><p>(0) Prerequisite work: Complete conversion of CHOOSE to the new token format as part of the 5.13 Alpha cycle, as a prerequisite to properly parsing CHOOSE tokens. Also find any other token conversions and drive inclusion in the 5.13 cycle.</p><p></p><p>(1) Demonstrate round-robin testing of ALL tokens (this guarantees some level of integrity, and since it also creates a baseline of token parsing, allows early testing of CDOM compatibility for anyone who wants it). This is approximately 60% done today (since much of this can be done independently of (0) above).</p><p></p><p>(2) Demonstrate round-robin testing of entire PCC/LST file sets (this guarantees integrity across .CLEARs, .MODs, etc., and actually completes most of the non-UI portion of an editor)</p><p></p><p>(3a) Build a GUI for the editor. I haven't gone too far down this path of thought, but it at least can be done independently of the core changes (which would be steps 3b and on... 
not defined here).</p><p></p><p>Someday. </p><p></p><p>I'm not even in a position to define a schedule for the editor (above), which covers steps 1-3 of the 6 or 7 steps required to complete conversion to a new core... which isn't even officially on the roadmap. While there is code in a branch, it is officially a proposal. Given the tremendous amount of uncertainty involved, I don't believe a reliable statement can be made on this issue.</p></blockquote><p></p>
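The round-robin test described in the post can be sketched with a toy token implementation. The class names here (`GameObject`, `SAToken`, `RoundRobinTest`) are illustrative stand-ins, not PCGen's actual CDOM classes; only the parse()/unparse() pairing and the equality check reflect the approach described above:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Objects;

// Toy stand-in for an internal CDOM-style object holding special abilities.
class GameObject {
    final List<String> specialAbilities = new ArrayList<>();
    @Override public boolean equals(Object o) {
        return o instanceof GameObject
            && specialAbilities.equals(((GameObject) o).specialAbilities);
    }
    @Override public int hashCode() { return Objects.hash(specialAbilities); }
}

// Toy "SA" token: parse() folds "SA:<name>" into the object,
// unparse() writes the internal state back out as LST token strings.
class SAToken {
    boolean parse(GameObject obj, String value) {
        if (value.isEmpty()) return false;
        obj.specialAbilities.add(value);
        return true;
    }
    List<String> unparse(GameObject obj) {
        List<String> out = new ArrayList<>();
        for (String sa : obj.specialAbilities) out.add("SA:" + sa);
        return out;
    }
}

public class RoundRobinTest {
    public static void main(String[] args) {
        SAToken token = new SAToken();

        // LST file -> internal structure
        GameObject original = new GameObject();
        token.parse(original, "Monkey Hold");

        // internal structure -> LST file copy
        List<String> emitted = token.unparse(original);

        // LST file copy -> internal structure copy
        GameObject copy = new GameObject();
        for (String line : emitted) {
            token.parse(copy, line.substring("SA:".length()));
        }

        // Round-robin consistency: the copy must equal the original.
        if (!original.equals(copy)) throw new AssertionError("round-robin failed");
        System.out.println(emitted);
    }
}
```

The value of this shape is exactly what the post claims: because the same parse()/unparse() pair is exercised in both directions, a passing test demonstrates that import and export are internally consistent without needing a hand-written expected output for every token.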
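The FeatFoo/.MOD counter-example can also be made concrete: once the base line and the .MOD/.CLEAR line have been folded into a single internal object, unparse() alone cannot recover either original file. This sketch uses a bare list as a stand-in for the merged internal object, not PCGen's real loader:

```java
import java.util.ArrayList;
import java.util.List;

public class ModClearDemo {
    public static void main(String[] args) {
        // Merged internal state after loading both files.
        List<String> specialAbilities = new ArrayList<>();

        // File 1: FeatFoo <tab> SA:Monkey Hold
        specialAbilities.add("Monkey Hold");

        // File 2: FeatFoo.MOD <tab> SA:.CLEAR
        specialAbilities.clear();

        // unparse() sees only the merged result: no SA tokens at all.
        // Neither source line is recoverable without an extra layer
        // tracking which file contributed which token.
        List<String> unparsed = new ArrayList<>();
        for (String sa : specialAbilities) unparsed.add("SA:" + sa);

        System.out.println(unparsed.isEmpty()
            ? "unparse() emits nothing; neither source line survives"
            : unparsed.toString());
    }
}
```

This is why step (2) above (round-robin testing of entire PCC/LST file sets) requires the additional source-and-delta tracking layer, rather than per-token unparse() alone.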