RPG Evolution: Hasbro's AI Plans

We can make some educated guesses about Hasbro's AI plans thanks to a recent interview with CEO Chris Cocks.

Picture courtesy of Pixabay.

Not surprisingly, Large Language Model (LLM) Artificial Intelligence (AI) is part of every business's plans, and Hasbro is no different. The question is how the company plans to use it ethically in light of several missteps in which Wizards of the Coast, the Hasbro division overseeing Dungeons & Dragons, failed to disclose that AI was involved in certain pieces of art. The ongoing controversies were enough to make WOTC update its AI policy.

An AI Product Every Two to Three Months

That hasn't stopped former CEO of WOTC and current CEO of Hasbro Chris Cocks from expounding on his plans for AI:
...we’re trying to do a new AI product experiment once every two to three months. That’s tending to be more game-focused for us, a little more gamified. We’re trying to keep it toward older audiences, to make sure all the content is appropriate...You’ll see more of how we’re thinking about how we can integrate AI, how we can integrate digital with physical gaming over time...I think most major entertainment and IP holders are at least thinking about it.
What Cocks is talking about is how LLM AIs are sourced. The LLM controversies revolve around, among other things, the fact that these AIs are trained on content without the owners' permission. In other words, although LLMs are often trained on publicly available content, the users sharing that content never imagined a robot would be hoovering up their dialogue to make money for someone else. The throughline to art is a bit easier to detect (though, as the above controversies show, harder to prove); but when it comes to text, user-generated content like Reddit's is invaluable. These AIs are only as valuable as the content they have at their disposal to train on. This is why Poe.com and other customizable AIs, trained on your own content, can be so useful to Dungeon Masters who want a true assistant that can sort through decades of homebrew content in seconds. I'll discuss using Poe.com in a future article.
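A customizable assistant of this sort is, at bottom, retrieval over your own documents: index the homebrew archive, then surface the most relevant notes on demand. Here is a minimal sketch of the retrieval half, using plain keyword scoring rather than a real embedding model (all adventure titles and text are hypothetical):

```python
import re
from collections import Counter

def index_notes(notes):
    """Build a simple word-frequency index per document."""
    return {title: Counter(re.findall(r"[a-z']+", text.lower()))
            for title, text in notes.items()}

def search(index, query, top_n=3):
    """Rank documents by how often they contain the query's words."""
    words = re.findall(r"[a-z']+", query.lower())
    scores = {title: sum(counts[w] for w in words)
              for title, counts in index.items()}
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [title for title, score in ranked[:top_n] if score > 0]

# Hypothetical homebrew archive
notes = {
    "Tomb of the Serpent King": "A trap-filled tomb where a serpent cult gathers...",
    "Harvest Festival Mystery": "A village festival, a missing pie, a goblin suspect...",
    "Abandoned Serpent Shrine": "A ruined shrine whose serpent idol hides a door...",
}
index = index_notes(notes)
print(search(index, "serpent cult"))
```

A production version would swap the keyword scoring for vector embeddings and feed the top hits into an LLM prompt, but the shape of the tool -- decades of notes in, relevant pages out in seconds -- is the same.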

Respecting Creators, Works of Art, and Ownership

Cocks is keenly aware of AI's controversies, having weathered both the Open Game License debacle and the issues with AI-generated art:
We certainly weren’t at our best during some points on the Open Game License. But I think we learned pretty fast. We got back to first principles pretty quickly ... The key there is the responsible use of it. We have an even higher bar we need to hit because we serve audiences of all ages. We go from preschoolers on up to adulthood. I don’t think we can be very cavalier in how we think about AI...That said, it’s exciting. There’s a lot of potential for delighting audiences. We need to make sure that we do it in a way that respects the creators we work with, respects their works of art, respects their ownership of those works, and also creates a fun and safe environment for kids who might use it.
And now we come to it. So how would WOTC and Hasbro use AI in a way that respects creators, their work, and their ownership -- and is still fun to use?

How Might WOTC Use AI for D&D?

Cocks gives us some hints in his answers:
The 20-plus years that the Open Game License has been in existence for something like D&D, I think that gives us a lot of experience to navigate what will emerge with AI, and just generally the development of user-based content platforms, whether it’s Roblox or Minecraft or what Epic has up their sleeves.
The Open Game License (OGL), by its very nature, is meant to be used in much the same way LLMs try to use the entirety of the Internet. What was likely a thorn in the side of lawyers may well seem like an opportunity now. Unlike the Internet, though, the OGL has a framework for sharing -- even if its creators never envisioned sharing with a machine. More to the point, everyone using the Open Game License is potentially adding to LLM content; databases of OGL content in wiki format are just more fodder for LLMs to learn from. WOTC could certainly leverage that content to train an AI on Dungeons & Dragons just as much as anyone else if they so chose; however, a large company using OGL content to fuel its AI doesn't seem like it's respecting those creators and their ownership.

So it's possible WOTC may not use OGL content at all to train its AI. They don't need it -- there's plenty of content the company can leverage from its own vaults:
The advantage we have ... This is cutting-edge technology, and Hasbro is a 100-year-old company, which you don’t usually think is ... a threat ... But when you talk about the richness of the lore and the depth of the brands–D&D has 50 years of content that we can mine. Literally thousands of adventures that we’ve created, probably tens of millions of words we own and can leverage. Magic: The Gathering has been around for 35 years, more than 15,000 cards we can use in something like that. Peppa Pig has been around for 20 years and has hundreds of thousands of hours of published content we can leverage. Transformers, I’ve been watching Transformers TV shows since I was a kid in Cincinnati in the early ‘80s. We can leverage all of that to be able to build very interesting and compelling use cases for AI that can bring our characters to life. We can build tools that aid in content creation for users or create really interesting gamified scenarios around them.
The specific reference to 35 years of Magic: The Gathering content "that we can leverage" has a precedent: WOTC's predecessor, TSR, churned out the Spellfire card game in response to Magic: The Gathering (before WOTC took over D&D), relying heavily on TSR's then-20 years of art archives. One can easily imagine AI generating this type of game with art WOTC owns in a very short period of time.

But Cocks is thinking bigger than that for Dungeons & Dragons. He explains how he uses AI with D&D specifically:
I use AI in building out my D&D campaigns. I play D&D three or four times a month with my friends. I’m horrible at art. I don’t commercialize anything I do. It doesn’t have anything to do with work. But what I’m able to accomplish with the Bing image creator, or talking to ChatGPT, it really delights my middle-aged friends when I do a Roll20 campaign or a D&D Beyond campaign and I put some PowerPoints together on a TV and call it an interactive map.
In the future, WOTC could easily change their contracts to explicitly state that any art they commission may be used to train a future AI (if they don't already). For content they already own -- and WOTC owns decades of art created for Magic: The Gathering -- they may already be within their rights to do this.

Add all this up, and companies like Hasbro are looking at their archives of information -- be it text, graphics, or examples of play -- as a competitive advantage: material to train their AIs on in a way their rivals can't.

The Inevitable

In short, it's not a question of if WOTC and Hasbro are going to use AI, just when. And by all indications, that future will involve databases of content that are either clearly open source or owned by Hasbro, with LLMs doing the heavy lifting on the creative side of gaming that was once filled by other gamers. For Dungeons & Dragons in particular, the challenge in getting a game started has always been finding a Dungeon Master -- a tough role for any gamer to fill, and the linchpin of every successful D&D campaign. With D&D Beyond now firmly in WOTC's grasp, the company could easily provide an AI platform on that service, using the data it learns from thousands of players there to refine its algorithms and teach it to be a better DM. Give it enough time, and it may well be a resource for players who want a DM but can't find one.

We can't know for sure what WOTC or Hasbro has planned. But Cocks makes it clear AI is part of Hasbro’s future:
While there are definitely areas of concern that we have to be watchful for, and there are definitely elements to the chess game that we have to think about before we move, it’s a really cool technology that has a lot of playfulness associated with it. If we can figure out how to harness it the right way, it’ll end up being a boon for users.
In three to five years, we might have officially sanctioned AI Dungeon Masters. Doesn't seem realistic? Unofficial versions are already here.
 


Michael Tresca


Jer

Legend
Supporter
I agree about the limitations of Dall-E, ChatGPT, etcetera.

At the same time, it is absurd to talk about AI while only referring to what exists today in the market place.
No - I'm talking about what the current research is. People are focusing on making systems like ChatGPT, Dall-E, etc. better at what they do. There is almost no research at all - and almost no money being thrown at - trying to build systems of the kind you're talking about. The money and research is and has been for a decade in building these kinds of neural network big data systems that learn functions of their data. And those kinds of systems are not going to turn into humanity-killing monsters because at a mathematical level that's not what they do.
 


talien

Community Supporter
We know AIs don't have inner lives (and most will tell you that up front). But I look at AI now as being at the "MP3 stage" of audio. Lots of experts scoffed at the idea that anyone would actually want audio to lose so much quality just so it could be digital. But it turned out MP3s were "good enough" -- good enough because most people don't listen to audio in ideal situations. They listen in the car, on the way to work, while exercising, with lots of background noise. Do they need the same level of audio you'd get from higher quality files or a record? No. Most people don't care that much. It's good enough.

AI doesn't need to have intent. But if it can fake it? Good enough. Fact is, Internet communication has degraded so much now thanks to social media "platform decay" that AI can seem more eloquent than humans. Once they start remembering things, hallucinate less (and to be fair, there's a lot of humans hallucinating on the Internet too), and can be persistent enough so their memory doesn't reset? Good enough.

What do most D&D groups want? Would they rather have a DM that can run an adventure "as is" vs. one who is creative enough to deal with what happens when the PCs go off the rails? An AI DM can read aloud text, make monster rolls, work procedurally through each encounter. Is that better than not playing at all?

Good enough.
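That "run an adventure as is" DM is essentially procedural: read the boxed text, roll for the monsters, move on to the next encounter. A minimal sketch of that loop (all adventure data and stat numbers are hypothetical):

```python
import random

def roll(dice, sides, modifier=0):
    """Roll `dice` d`sides` and add a modifier, e.g. roll(2, 6, 3) for 2d6+3."""
    return sum(random.randint(1, sides) for _ in range(dice)) + modifier

def run_encounter(encounter):
    """Work through one scripted encounter: read-aloud text, then monster attacks."""
    print(encounter["read_aloud"])
    for monster in encounter["monsters"]:
        attack = roll(1, 20, monster["attack_bonus"])
        print(f'{monster["name"]} attacks: {attack} to hit')

# Hypothetical encounter data, as it might appear in a published adventure
encounter = {
    "read_aloud": "The door creaks open onto a bone-littered chamber...",
    "monsters": [
        {"name": "Skeleton", "attack_bonus": 4},
        {"name": "Skeleton Archer", "attack_bonus": 4},
    ],
}
run_encounter(encounter)
```

An LLM layered on top would paraphrase the boxed text and improvise when players wander off script, but the mechanical skeleton -- text out, dice rolled, next encounter -- is this simple.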
 

Raiztt

Adventurer
We know AIs don't have inner lives (and most will tell you that up front). But I look at AI now as being at the "MP3 stage" of audio. Lots of experts scoffed at the idea that anyone would actually want audio to lose so much quality just so it could be digital. But it turned out MP3s were "good enough" -- good enough because most people don't listen to audio in ideal situations. They listen in the car, on the way to work, while exercising, with lots of background noise. Do they need the same level of audio you'd get from higher quality files or a record? No. Most people don't care that much. It's good enough.

AI doesn't need to have intent. But if it can fake it? Good enough. Fact is, Internet communication has degraded so much now thanks to social media "platform decay" that AI can seem more eloquent than humans. Once they start remembering things, hallucinate less (and to be fair, there's a lot of humans hallucinating on the Internet too), and can be persistent enough so their memory doesn't reset? Good enough.

What do most D&D groups want? Would they rather have a DM that can run an adventure "as is" vs. one who is creative enough to deal with what happens when the PCs go off the rails? An AI DM can read aloud text, make monster rolls, work procedurally through each encounter. Is that better than not playing at all?

Good enough.
I weep.
 

Yaarel

He Mage
No - I'm talking about what the current research is. People are focusing on making systems like ChatGPT, Dall-E, etc. better at what they do. There is almost no research at all - and almost no money being thrown at - trying to build systems of the kind you're talking about. The money and research is and has been for a decade in building these kinds of neural network big data systems that learn functions of their data. And those kinds of systems are not going to turn into humanity-killing monsters because at a mathematical level that's not what they do.
The US government, the Chinese government, and many other governments are in an AI arms race that very much pursues the AI ability to understand context and intent.

Think about the AI weapon systems that are already operational now. These weapons must be able to accurately distinguish friend or foe among the targets.
 


talien

Community Supporter
Rightfully so, and the folks who believed audio quality mattered wept too. Then something weird happened: records made a comeback.

D&D was in this state once too. People really thought D&D was destined to be some niche game forever, the butt of jokes. "Those weirdos who play in their mom's basement." Then the pandemic blew up, suddenly being online all the time seemed terrible, we were so happy to see each other's faces again -- and now D&D is all about in-person gaming.

What I hope WOTC does is take this as an opportunity to train future DMs vs. just replacing them. But if it lets more people play D&D who wouldn't otherwise? That's still a net win in my book.
 

Blue

Ravenous Bugblatter Beast of Traal
On the art side sure - on the LLM side you need a lot more text to make it work. I was thinking more on the LLM side than on the art side. And even on the art side what the hobbyists are doing generally starts from open sourced models NOT from scratch, which is what the question was about - if you're starting completely from scratch with just your own art you need a lot more art than if you're adjusting an existing model.
Sure. Actually, I think that was one of the things that came about after it got released "into the wild" for AI art. LoRAs, LyCORIS, and other adjustments require a minimal amount of training resources -- those are within easy reach of hobbyists. There are also checkpoint merges, where they take trained models that have various things they want and bring them together in various ways to hopefully enhance those aspects.

On the text side, being able to train against your own data/documents is also something being done every day. But I'm pretty sure you are right that it's not starting from scratch on those.
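The checkpoint merges described above come down to simple parameter arithmetic: a weighted average of two models' weights, layer by layer. A toy sketch of that idea, using plain dicts of floats in place of real weight tensors (all names hypothetical):

```python
def merge_checkpoints(model_a, model_b, alpha=0.5):
    """Blend two parameter sets: alpha * A + (1 - alpha) * B, per parameter."""
    assert model_a.keys() == model_b.keys(), "architectures must match"
    return {name: alpha * model_a[name] + (1 - alpha) * model_b[name]
            for name in model_a}

# Toy "checkpoints": one scalar standing in for each layer's weight tensor
style_model = {"layer1.weight": 0.8, "layer2.weight": -0.2}
detail_model = {"layer1.weight": 0.4, "layer2.weight": 0.6}

# 25% of the style model, 75% of the detail model
merged = merge_checkpoints(style_model, detail_model, alpha=0.25)
print(merged)
```

Real tools operate on tensors rather than scalars and offer fancier schemes (per-layer weights, difference merges), but this weighted average is the core operation -- which is why it needs almost no training resources at all.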
 


Mike Myler

Have you been to LevelUp5E.com yet?
Wotc owns all User Generated Content anyone has ever published onto DMsguild, and the license of DMsguild allows them to use it for any purpose indefinitely.
Thus the hairs standing on end. Others have commented that those lists don't have everything on them though, so maybe there are thousands of adventures in-house. 🤷‍♂️
 

Hussar

Legend
Thus the hairs standing on end. Others have commented that those lists don't have everything on them though, so maybe there are thousands of adventures in-house. 🤷‍♂️
I mean, I didn't look too hard at your link, but, considering it only started at 3e, there's a bajillion TSR modules from before 3e that WotC can mine.

I gotta admit, I'd be a bit leery of WotC using DM's Guild to train an AI in adventure design - let's be honest here, no one considered anything like that when they signed up for selling on DM's Guild. Yes, it's legal, but, it looks sketchy as all get out.

One would hope that WotC would simply reach out to creators on DM's Guild with a simple opt in/out option for their material. I wouldn't think that would be too difficult to do, would it? Creators that don't mind can opt in and allow their material to be used, and those that don't, can simply say so.
 
