RPG Evolution: The Right to Write

We previously discussed WOTC’s tangles with writers’ rights, but how does that apply to AI?


Picture courtesy of Pixabay.

This Again​

Judging from CEO Chris Cocks’ latest statements about generative artificial intelligence at a recent Goldman Sachs event, it’s clear Hasbro is not going to give up on its plans to use AI to create content for its properties. The question is how writer and artist rights will be protected in the process.

Cocks threw out two ideas for how AI might be used: “new player introduction” and “emergent storytelling.” The two proposals are different but related, and both begin with authors’ and artists’ works being used to generate a new outcome every time.

New Player Introduction​

“New Player Introduction” is likely a generative AI that can answer questions, walk players and game masters through an introductory game, or otherwise act as a “co-DM” alongside the dungeon master.

New Player Introduction surely involves art, maps, and text, so artists and writers would need to be considered there too. That said, it is more narrowly focused than Emergent Storytelling, where the (fantasy) sky is the limit.

Emergent Storytelling​

“Emergent Storytelling” is an outgrowth of “emergent gameplay,” in which complex situations in role-playing games emerge from the interaction of simple game mechanics. That is, while Dungeons & Dragons has a set of core rules, the outcomes in campaigns are massively varied and usually stray far from what’s included in those rules. Emergent gameplay in turn spawns “emergent narrative,” in which the game does not use a pre-planned structure at all. Every game is unique, every situation is potentially drawn from a list of near-infinite possibilities, and an AI would be constantly choosing what happens next in a “Choose Your Own Adventure”-style system running seamlessly behind the scenes.
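To make the idea concrete, here is a minimal sketch of such a behind-the-scenes “Choose Your Own Adventure”-style picker. All event names, weights, and function names are invented for illustration; a real system would be vastly larger and AI-driven.

```python
import random

# Toy event graph: each story beat maps to weighted possibilities for
# the next beat. Every run can produce a different story.
EVENTS = {
    "discovery": {"ambush": 0.4, "parley": 0.3, "discovery": 0.3},
    "ambush":    {"parley": 0.5, "discovery": 0.4, "ambush": 0.1},
    "parley":    {"discovery": 0.5, "ambush": 0.3, "parley": 0.2},
}

def emergent_story(start, beats, seed=None):
    """Walk the event graph, sampling each next beat from its weights."""
    rng = random.Random(seed)
    story = [start]
    for _ in range(beats):
        options = EVENTS[story[-1]]
        nxt = rng.choices(list(options), weights=list(options.values()))[0]
        story.append(nxt)
    return story
```

The point of the sketch is that no plot is pre-planned: the “story” is whatever falls out of repeated weighted choices, which is what makes every campaign unique.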

The question is: what text and art is the AI using to generate this emergent storytelling?

The Fine Print​

Generative AI is a random numbers machine: it uses probabilities, learned from a vast training corpus and refined by user feedback, to guess the best and most appropriate next word in every response. It is only as good as the data it’s trained on, and Cocks has made it clear that the company has an advantage because it “owns” a lot of content:
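As a toy illustration (the probability table below is hand-invented, not any real model’s), the “guess every word with percentages” loop looks like this:

```python
import random

# Toy next-word model: each word maps to weighted candidates for the
# word that follows. A real LLM replaces this table with billions of
# learned parameters, but runs the same sample-append-repeat loop.
NEXT_WORD = {
    "the":     [("dragon", 0.6), ("dungeon", 0.3), ("end", 0.1)],
    "dragon":  [("sleeps", 0.5), ("attacks", 0.5)],
    "dungeon": [("floods", 1.0)],
}

def generate(prompt, max_words=8, seed=None):
    """Extend the prompt one sampled word at a time until stuck or full."""
    rng = random.Random(seed)
    words = prompt.lower().split()
    while len(words) < max_words and words[-1] in NEXT_WORD:
        options = NEXT_WORD[words[-1]]
        words.append(rng.choices([w for w, _ in options],
                                 weights=[p for _, p in options])[0])
    return " ".join(words)
```

Which is exactly why the training data matters so much: the table *is* the model, and it comes entirely from whatever text the model was fed.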

But when you talk about the richness of the lore and the depth of the brands–D&D has 50 years of content that we can mine. Literally thousands of adventures that we’ve created, probably tens of millions of words we own and can leverage.

Who is the “we” in this conversation? Artists and writers, of course. How their content is used to feed an AI is a matter of speculation. Perhaps more relevant is whether every creator during those 50 years signed contracts granting Hasbro the right to let an AI farm their content.

Odds are high they did. If WOTC learned anything from the kerfuffle over the Dragon Magazine CD-ROM Archive, it’s that it should include perpetual digital rights in any work-for-hire contracts with its creative contributors. For a guidepost of what this might look like in a contract, WOTC’s legal page of General Terms includes the following clause four different times (for user rights, unsolicited idea submissions, streaming, and user content):

…you grant us an irrevocable, nonexclusive, perpetual, worldwide, royalty-free, fully sublicensable license to use, reproduce, distribute, adapt, modify, translate, create derivative works of, publicly perform, publicly display, digitally perform, make, have made, sell, offer for sale, and import your Submissions, including any and all copyrights, trademarks, trade secrets, patents, industrial rights, and all other intellectual and proprietary rights related thereto, in any media now known or hereafter developed, for any purpose whatsoever, commercial or otherwise, including giving the Submissions to others, without any compensation to you. To the extent necessary, you agree that you undertake to execute and deliver any and all documents and perform any and all actions necessary or desirable to ensure that the rights to use the Submissions granted to us as specified above are valid, effective, and enforceable. You also give up any claim that any use by us or our licensees of your Submissions violates any of your rights, including moral rights, privacy rights, rights to publicity, proprietary or other rights, and rights to credit for the material or ideas in your Submissions.

The words "reproduce, adapt, modify" and "create derivative works of" seem to apply to AI, as well as "rights to credit for the material or ideas." (I'm not a lawyer, and this is not legal advice). In short, WOTC probably has something like this in their contracts, and if so, has the right to reuse the content they paid a contractor or employee for, including to train AI.

What to Do About It​

It’s undeniable that generative AI is built on the labor of humans, and that labor may no longer be used or paid for. As with the lawsuits over the Dragon Magazine CD-ROM Archive, unless you’re an IP lawyer like Dave Kenzer of Kenzer & Company, your best bet as a contractor is likely to be part of an organization that can advocate collectively on your behalf. This certainly worked for the Science Fiction & Fantasy Writers Association (SFWA), which procured a settlement from WOTC on behalf of its fiction writers. Notably, the one group not represented in settlements from WOTC over the CD-ROM Archive rights was game designers, who either had permissive contracts or lacked the organizational will of Kenzer & Company and the SFWA.

Both the Writers Guild of America (WGA) and the Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA) recently engaged in collective bargaining over AI on behalf of their members. SAG-AFTRA is currently on strike against video game companies after 18 months of negotiations over the use of AI with game performers.

Tabletop game writers have a variety of options too, including the Authors Guild (covering freelance writers who have published 3+ pieces or earned $5,000 in the past 18 months), the National Writers Union (representing web content, work-for-hire, and contract writers), the Writers Guild of America East (representing the online media industry), the Communications Workers of America (including video game developers at World of Warcraft and Bethesda Game Studios), and Game Workers Unite (which counts tabletop role-playing games as part of its membership).

Whether or not game writers join a union, one thing’s for sure: they should read the fine print on their contracts.
 


Michael Tresca


Generative AI is a random numbers machine: it uses probabilities, learned from a vast training corpus and refined by user feedback, to guess the best and most appropriate next word in every response. It is only as good as the data it’s trained on, and Cocks has made it clear that the company has an advantage because it “owns” a lot of content:
There's nothing yet science has reliably done that proves human and animal brains aren't likewise just prediction and recognition networks.

Stating it as a difference like you have is firmly into speculation.
 


So the question ultimately is: do you have copyright over work generated by an AI exclusively trained on work you already hold copyright to? Because if the corpos have their way, they'll absolutely push for an affirmative answer, which won't be the win independent artists think it is, but precedent does support that position.


Even OGC can be used under a different license if its use is transformative. And no, you cannot designate a copyright license to something which cannot be copyrighted.
In the US, at least, until the courts get involved, the Librarian of Congress (who determines copyright policy) has said no work done by an AI can be copyrighted within the US. Human manipulation of AI-generated work after creation can result in a copyright, but the current requirement is to show the human input. The wording implies that anything that isn't human-modified won't be protected.

 

Very interesting article! WOTC's direction continues to be concerning, and it's good to see it in historical context.

A quick note—game designers are indeed eligible to join SFWA (though yes, to my knowledge Game Writing wasn't included in the organization at the time of the Dragon archives CD-ROM case). I'm a full member based on my indie game design work for RuneQuest, and also a member of the Game Writing committee. Game designers seeking to join do need a "speculative" element to their work, but that net casts very wide.
Excellent point and I didn't mean to imply they wouldn't accept game designers so much as it seemed the main thrust was fiction writers -- but we actually don't know if game designers were included in the negotiations. They certainly weren't called out in the settlement, but it's possible they were included.
 

There's nothing yet science has reliably done that proves human and animal brains aren't likewise just prediction and recognition networks.

Stating it as a difference like you have is firmly into speculation.
This is the "stochastic parrot" theory: AI Unplugged: AI Want a Cracker?

To your point, it's hard to "prove" sentience, and as I frequently point out, if it can fool one side into treating the other like it's sentient, does it matter? Philosophers argue it does.

That said, this isn't really the place to discuss this (feel free to contribute to my post elsewhere if you want to argue with philosophers about it, some of them read my newsletter).
 

Who can tell what is human vs. AI, or even human-modified vs. AI-modified? Maybe now, but in 10 years, when we're ready for 6.0 to come out, someone (or something) could create a new book based on everything posted online over those 10 years. It might even be able to fix the "Fighters suck" threads we'll be having.
 

One thing probably being missed is that AI can expand its data. It has to start with some basic data, but from that data it can keep extrapolating forever. And as people tweak their images using the AI, it can learn more from the choices the tweakers make. So long term it isn't going to need artists.

Also, outside of actual story-like images, dungeon maps and caverns won't be any problem. AI could be trained on those quickly. Same for world maps.

AI is coming and like the internet it cannot be stopped. We need to learn how to adapt.
 

One thing probably being missed is that AI can expand its data. It has to start with some basic data, but from that data it can keep extrapolating forever. And as people tweak their images using the AI, it can learn more from the choices the tweakers make. So long term it isn't going to need artists.
Bit of an aside, but the current evidence is that when AIs learn from data generated by AIs, they do not get better -- in fact, they get worse. It's a major concern for the big AI companies -- as more AI content leaks into the world, it is becoming harder and harder to learn from new data without screwing over your model. And there is a limit to the non-AI content available. I believe (I cannot remember the source; it might have been an internal one) that GPT-4 was trained on about 10% of the digital English text in existence, and with the law of diminishing returns, expanding to 100% would not make a huge improvement.
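That degradation ("model collapse") can be sketched with a toy experiment: fit a "model" (here just a Gaussian's mean and spread) to samples drawn from the previous generation's model, and repeat. The generation count and sample size below are illustrative only, not from any real training run.

```python
import random
import statistics

# Each generation fits mean/stdev to a finite sample from the previous
# generation's fit. The fitted spread shrinks on average each round, so
# diversity drains away over many generations.
def collapse(generations=500, samples_per_gen=5, seed=42):
    rng = random.Random(seed)
    mu, sigma = 0.0, 1.0
    for _ in range(generations):
        data = [rng.gauss(mu, sigma) for _ in range(samples_per_gen)]
        mu, sigma = statistics.mean(data), statistics.stdev(data)
    return sigma  # typically a tiny fraction of the original 1.0
```

Real models are far more complex, but the mechanism is the same: each generation can only reproduce what the previous one sampled, so rare variation is progressively lost.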

It's not impossible that there is a finite limit to how good an AI can get using the current transformer-based neural net approach. My personal feeling is that there is a lot of tweaking that can still be done, but overall, if you compare today's best LLMs to the ones of last year, they are way better in terms of cost and token limits, but only somewhat higher quality.

So I really don't think AIs are going to replace writers, but they are going to make them more productive, which may mean companies will need fewer writers. The big question here is, would more content sell? If a company uses AI to (say) double writers' productivity, will they make more money keeping all their writers and doubling output, or by halving their writing staff?
 

So I really don't think AIs are going to replace writers, but they are going to make them more productive, which may mean companies will need fewer writers. The big question here is, would more content sell? If a company uses AI to (say) double writers' productivity, will they make more money keeping all their writers and doubling output, or by halving their writing staff?
This is a super good point. That question hasn't been answered and it will vary by field.

One example is computer programming. A lot of people think this is the end of programmers. The reality is the demand for software development is probably 100x what can currently be done, maybe even higher. The issue is that really good programmers are scarce and even competent ones are uncommon, so companies can't get what they'd really like. With AI and ever-greater productivity, they might shoot for higher goals. Or they might just save money and cut staff. It will probably be a little of both, but which dominates is yet to be seen.

In the field of artistry, I suspect the really great writers will still prosper. As will the really great artists. The formulaic writers though may become AI admins instead.
 
