Hasbro CEO Reiterates That AI Isn't Used to Make D&D Because of the Game's Audience and Creators

Cocks has spoken about AI extensively in recent months.


While Hasbro CEO Chris Cocks is a big fan of AI, he reiterated in a recent interview that the technology is not used to make Dungeons & Dragons and Magic: The Gathering. Cocks recently sat down with The Verge to discuss Hasbro's business and, in particular, how the company uses AI. While Cocks gave several examples of how AI is integrated within the company (a Peppa Pig AI provides feedback on Peppa Pig toys, for instance), he stated that not every facet of the company currently uses it. "From a creative context, I think you have to think about it very carefully," Cocks said. "There are some brands that the audience, the creators, just don’t want it, so we don’t even have it in our pipelines for our video games or for Magic: The Gathering, or D&D. For things like toys where we’re basing it on existing IP, or like a long legacy of ideas, we are able to use it and use it pretty effectively."

The Dungeons & Dragons brand has come out strongly against AI, specifically when it comes to creative work. The brand currently bans the use of AI-generated artwork in its games and has repeatedly emphasized that the game is made for people, by people. However, Cocks has talked about his personal use of AI in his home D&D games and has strongly hinted at integrating the technology into Dungeons & Dragons in some form.

Cocks has previously bragged about how AI has been integrated into Hasbro's workflow, and The Verge interview discusses how AI has supplemented the business, mentioning that AI has been used to generate toy ideas and to simulate focus groups and playtest labs. While Cocks sees AI as a way to "level up" the work of creatives as opposed to replacing them, he also admits that he's been wrong about technology disrupting the toy business before, specifically citing NFTs as an area he got wrong in the past.

The interview also briefly mentioned the upcoming video game Dungeons & Dragons: Warlock, with Cocks noting that the game will be released in the "later part" of 2027.
 


Christian Hoffer

But in your example the Elders are feeding them lots of cake for breakfast, so much that cake is becoming the normal, everyday breakfast food.

And the bakers? They are tricking the elders into thinking cake actually improves nutrition.

Please do not tell me who is who in my own analogy.

WE are the elders, meaning older people in general. The people you are pointing at are mountebanks, hucksters, and con artists, whom, again, we are supposed to provide critical-thinking tools to defend against.
 



In some disciplines, I have seen the old-school answer of grades being based on work done in class, by hand, without devices.
We do plenty of handwritten assessments because their IB exams are handwritten (the essays referred to above are multi-draft research papers). My 12s are writing another handwritten practice essay tomorrow. I’ve still had students cheat with AI, such as one I busted with a phone on her lap.

But that’s only one issue. The deeper one is that we don’t know what we are preparing them for. My argument to them is that their unique voice is the most important thing they own, and if they don’t develop it, then they are in trouble.

I am not anti-AI; I use it myself on various tasks. But it is only effective for me on tasks where I know what I am doing and can assess the AI accordingly. But if it prevents you from getting to that level of expertise, then it is a disaster for learning. IMO.
 


So they use it in toys because no one there is vocal about it. Sculptors used to be the ones who developed the toy ideas. Beware: soon the community is going to relax on AI (there are already signs of it), and AI will be incorporated fully into D&D, especially with the younger crowd growing up with AI in their schooling. I haven't been to an education conference yet where they don't show you how to use AI in the classroom. These kids are not going to have the abhorrence for AI that Gen Z does. I'm not that worried about AI in D&D myself; I can take it or leave it. But I speculate the bulwark against AI is going to lose.

It's already being accepted on a large scale.
Vehemently disagree. It sounds like you have personal experience if you're going to educator conferences; my personal experience is having children plus a bevy of nieces and nephews, up through university age (well, my eldest nephew just got his Masters), and their wide swaths of online friends across geographical boundaries.

The people who sell things to educators want to integrate AI into everything. And please recognize that what the sellers are pushing and what kids who are learning want have no inherent overlap. They can overlap, or they can be diametrically opposed. There's no correlation between them.

Students seem split between incredibly strongly against AI, and a smaller segment who use it to avoid doing any work. The anti-AI push makes the most vocal here at ENWorld seem milquetoast.

One of my kids follows a YouTube series (Animator vs. Animation) that they've been watching for years. A few months back it had a sponsor segment for technical classes, and one of the classes offered was on AI. My kid was ready to swear off this creator they loved; I had to talk them down by explaining this wasn't the creator's opinion, just a class offered by a sponsor that supported the series financially.
 

So, I can't speak to your personal experience, but I have a lot of professional experience and just did a conference with a lot of other educators, and I can assure you that AI uptake amongst high school and college-aged students in university prep classes is near 100% after rapid acceleration this year. Rapid as in, during Term 1 I had to deal with about half a dozen potential infractions over the term. Last week I had five on one day (these are a huge time sink, BTW, and soon we won't be able to enforce the issue in the same way).

What we are hearing from our alumni who are currently in college, and from our friends and colleagues who teach at college, is that it is the Wild West right now. Some profs are vehemently anti-AI. Others endorse it. Many are 🤷‍♀️.

I ran a class on the ethics of AI art just last week with my Theory of Knowledge students, and their reactions were basically incoherent. They claim to hate AI but also all use it, and also think it can be used to create cool stuff, and also think it's stealing but also think it isn't... (Of course, they also get very passionate about artists' rights, but then tell me that downloading music without paying for it is perfectly ethical, so...)
 

And when they are 18, they can buy and eat all the cake they want. And while it may not be "cake" it is garbage and LOTS of people live that way.
That sounds like parenting where they were just denied cake "because I said so" as opposed to being taught why having cake for breakfast was a bad idea. Which makes it a "blame the parents" issue, not a "blame the person" one.

And they will soon enough learn why cake all the time is bad. They're still capable of learning; it's just now through consequences, rather than through someone sharing their experience with them and being able to avoid it.

Lots of mistakes you need to make for yourself. That doesn't mean that "eating cake" (or whatever other garbage) becomes a lifelong thing for the majority of them.

I know people hate the idea of GenAI tripe replacing human made tripe, but it is coming. Look at the romance novel industry -- AI is running rampant and people are still buying those books.
Oh man, I'm in a Kindle Unlimited group, and the number of AI books that go unread, with people warned away from them, is quite high. And with KU the "author" is only paid for pages read, so when readers get a few pages in and give up, the book only earns cents. And Amazon, for all that it really is an evil corporation in other ways, has been delisting AI-generated works from the algorithm, which means virtually no sales except through paid advertising. A lot of "low content" works, like AI-generated coloring books, they are just disallowing.

Sorry, this particular example doesn't go the way you suggested it does.
 

I think one fundamental issue is that AI has forced educators to confront a fundamental flaw in how we teach, which is by assessing product rather than process. (e.g. to assess your learning, you write an essay and I mark it, on the principle that the essay is a good reflection of your personal mastery and how your learning compares to that of your peers).

This has always been problematic for a lot of reasons. It has always incentivized cheating. It has always standardized learners, when we know they are diverse. It has always caused students, teachers and parents to focus on the wrong things, valuing the measuring stick (products/grades) over the actual learning. It has always made smart people feel stupid because they weren't good at the specific measuring stick (like an essay) that was valued by the educational system.

But we got away with it because cheating was relatively difficult or expensive, and because marking product is SO MUCH easier and therefore economical than marking process at an individual level. And because of inertia, basically.

Now AI has made cheating basically free. And we are confronted by a system we've built that rewards it. So on top of not knowing anything about what we are now preparing them for, we are confronting the fact that our methods are busted, or that they've always been busted but now we can't hide. We are kind of f***ed.
 

We do plenty of handwritten assessments because their IB exams are handwritten (the essays referred to above are multi-draft research papers). My 12s are writing another handwritten practice essay tomorrow. I’ve still had students cheat with AI, such as one I busted with a phone on her lap.

But that’s only one issue. The deeper one is that we don’t know what we are preparing them for. My argument to them is that their unique voice is the most important thing they own, and if they don’t develop it, then they are in trouble.

I am not anti-AI; I use it myself on various tasks. But it is only effective for me on tasks where I know what I am doing and can assess the AI accordingly. But if it prevents you from getting to that level of expertise, then it is a disaster for learning. IMO.
Have you read the MIT study about the neural and cognitive effects of LLM-assisted essay writing? I’d be very interested to hear your thoughts on it if so.
 

