Ryan Dancey & AEG Part Ways Following AI Comments

COO says that AI could make any of the company's games.


Ryan Dancey, the Chief Operating Officer of boardgame publisher Alderac Entertainment Group, no longer works for the company, following statements on social media in which he claimed that AI could make most of the company's board games, and that D&D and Magic: The Gathering were the only new forms of gameplay in his lifetime. After another poster on LinkedIn claimed that "AI wouldn't come up with Tiny Towns or Flip Seven or Cubitos because it doesn't understand the human element of fun", Dancey responded that he had zero reason to believe that AI could not do such a thing.

"I have zero reason to believe that an AI couldn't come up with Tiny Towns or Flip Seven or Cubitos. I can prompt any of several AIs RIGHT NOW and get ideas for games as good as those. The gaming industry doesn't exist because humans create otherwise unobtainable ideas. It exists because many many previous games exist, feed into the minds of designers, who produce new variants on those themes. People then apply risk capital against those ideas to see if there's a product market fit. Sometimes there is, and sometimes there is not. (In fact, much more often than not).

Extremely occasionally (twice in my lifetime: D&D and Magic: the Gathering) a human has produced an all new form of gaming entertainment. Those moments are so rare and incandescent that they echo across decades.

Game publishing isn't an industry of unique special ideas. It's an industry about execution, marketing, and attention to detail. All things AIs are great at."
- Ryan Dancey

The Cardboard Herald, a boardgame reviews channel, responded yesterday on BlueSky that "As you may have seen, [AEG] CEO Ryan Dancey stated that AI can make games “just as good as Tiny Towns or Flip 7 or Cubitos”, completely missing the inexorable humanity involved. We’ve spent 10 years celebrating creatives in the industry. Until he’s gone we will not work with AEG."

Today, AEG's CEO John Zinser stated "Today I want to share that Ryan Dancey and AEG have parted ways. This is not an easy post to write. Ryan has been a significant part of AEG’s story, and I am personally grateful for the years of work, passion, and intensity he brought to the company. We have built a lot together. As AEG moves into its next chapter, leadership alignment and clarity matter more than ever. This transition reflects that reality. Our commitment to our designers, partners, retailers, and players remains unchanged. We will continue building great games through collaboration, creativity, and trust."

Dancey himself posted "This morning [John Zinser] and I talked about the aftermath of my post yesterday about the ability of AI to create ideas for games. He's decided that it's time for me to move on to new adventures. Sorry to have things end like this. I've enjoyed my 10 years at AEG. I wish the team there the best in their future endeavors.

I believe we're at a civilizational turning point. That who we are and how we are is going to change on the order of what happened during the Agricultural and Industrial Revolutions; and it's past time we started talking about it and not being afraid to discuss the topic. Talking about AI, being honest about what it can and cannot do, and thinking about the implications is something we have to begin to do in a widespread way. Humans have a unique creative spark that differentiates us and makes us special and we should celebrate that specialness as we experience this epic change.

For the record: I do not believe that AI will replace the work talented game designer/developers do, nor do I think it is appropriate to use AI to replace the role of designer/developers in the publication of tabletop games. During my time at AEG I developed and implemented policies and contracts that reflect those views. It's important to me that you know what I believe and what I don't believe on this particular topic, despite what you may have read elsewhere."

Whatever your position on generative LLMs and the like, when the COO of your company announces publicly that all of the company’s games could have been made by AI, it’s a problem. UK readers may recall Gerald Ratner, CEO of major jewelry chain Ratners, famously describing the products sold in his stores as “total crap”, instantly wiping half a billion pounds from the company’s value back in the early 1990s. The company was forced to close stores and rebrand as Signet Group. At the time, Ratners Group was the world's biggest jewelry retailer; Ratner himself was forced to resign in 1992. The act of making a damaging statement about the quality of your own company’s products became known as “doing a Ratner”.

Dancey was VP of Wizards of the Coast when the company acquired TSR, the then-owner of Dungeons & Dragons. He is also known for being the architect of the Open Game License. Dancey has worked as Chief Operating Officer for AEG for 10 years, and was responsible for the day-to-day operations of the company, second-in-command after the CEO, John Zinser.
 


The "problem" with AI is that it can't truly create; it can only mix and match what already exists. If you replace creative endeavors with AI, you'll eventually hit a point where there's no new creative input for the AI to learn from.

That's my big issue with how big business is looking at AI - they want to take human creativity, plug it into AI once, and then forget about it. That will work in the short term, but not the long term. An AI can't make a value judgement; it has no idea whether the product it's created is good or not. Even if AI can create 10,000 RPGs a month, who's going to go through all of that to find out what is and isn't crap?

Ryan's definition of transformative and innovative is too narrow, and there's no surefire way to know what will and won't resonate with consumers. D&D took two things that have existed for centuries (improv storytelling and die rolling) and put them together in a new way. But even D&D took years to really get rolling. How many CEOs do you think are likely to wait years for their AI-generated game to catch on?

Magic is even worse. It's just a card game with randomized distribution. Apart from Magic, how many successful CCGs have there been? With RPGs, gamers can have dozens of titles, but no one has dozens of CCGs. We don't need AI generating more content - that adds to the problem rather than solving it.

The real problem is that tech CEOs and others are viewing AI as an easy way to make money with no overhead and as a way to dodge legal requirements (like copyright). Adam Eisgrau has stated that companies shouldn't have to follow copyright laws for no other reason than that the AI business model doesn't work if they do.
 


The short term is the only thing these people care about, though. How that might affect things, or other people, down the road isn't relevant to their interests.
 

The impressive rate of improvement seen in the first few years of LLMs is already plateauing without massive amounts of manual work; there's simply not enough new data. At this point, they are only going to get marginally better for the foreseeable future, and extrapolating future gains from that initial wave is unfounded. That's also why so many tech companies are growing more desperate about requiring AI in all of their services rather than offering it on its own merits, and why in the past year or two they've shifted from talking about how incredible AI is to how people NEED to use AI.
For a while this seemed true to me, as a user. But in the past few months I've changed my opinion. The Claude Code, Codex, and OpenClaw projects have revolutionized the value of LLMs. That's different from the LLMs themselves improving, but it suggests there is a ton of value in building the right system around LLMs.
 

"We'd go out of business if we had to obey the law" is a great line. It tells you everything you need to know about “AI” in one go - and about a whole lot of other businesses, too.
 
One thing that bugs me about AI is that I feel the C-suite is using it in the wrong areas.

The author Joanna Maciejewska once said, "I want AI to do my laundry and dishes so that I can do art and writing, not for AI to do my art and writing so that I can do my laundry and dishes."

Of course, this would require robots, but I think it would be a much better use of AI, and would help enable the idea of less work and more leisure time.
 
