Ryan Dancey & AEG Part Ways Following AI Comments

COO says that AI could make any of the company's games.


Ryan Dancey, Chief Operating Officer of board game publisher Alderac Entertainment Group (AEG), no longer works for the company, following statements on social media in which he claimed that AI could make most of the company's board games, and that D&D and Magic: the Gathering were the only new forms of gameplay in his lifetime. After another LinkedIn poster claimed that "AI wouldn't come up with Tiny Towns or Flip Seven or Cubitos because it doesn't understand the human element of fun", Dancey responded that he had zero reason to believe AI could not do such a thing.

"I have zero reason to believe that an AI couldn't come up with Tiny Towns or Flip Seven or Cubitos. I can prompt any of several AIs RIGHT NOW and get ideas for games as good as those. The gaming industry doesn't exist because humans create otherwise unobtainable ideas. It exists because many many previous games exist, feed into the minds of designers, who produce new variants on those themes. People then apply risk capital against those ideas to see if there's a product market fit. Sometimes there is, and sometimes there is not. (In fact, much more often than not).

Extremely occasionally (twice in my lifetime: D&D and Magic: the Gathering) a human has produced an all new form of gaming entertainment. Those moments are so rare and incandescent that they echo across decades.

Game publishing isn't an industry of unique special ideas. It's an industry about execution, marketing, and attention to detail. All things AIs are great at."
- Ryan Dancey​

The Cardboard Herald, a boardgame reviews channel, responded yesterday on BlueSky that "As you may have seen, [AEG] CEO Ryan Dancey stated that AI can make games “just as good as Tiny Towns or Flip 7 or Cubitos”, completely missing the inexorable humanity involved. We’ve spent 10 years celebrating creatives in the industry. Until he’s gone we will not work with AEG."

Today, AEG's CEO John Zinser stated "Today I want to share that Ryan Dancey and AEG have parted ways. This is not an easy post to write. Ryan has been a significant part of AEG’s story, and I am personally grateful for the years of work, passion, and intensity he brought to the company. We have built a lot together. As AEG moves into its next chapter, leadership alignment and clarity matter more than ever. This transition reflects that reality. Our commitment to our designers, partners, retailers, and players remains unchanged. We will continue building great games through collaboration, creativity, and trust."

Dancey himself posted "This morning [John Zinser] and I talked about the aftermath of my post yesterday about the ability of AI to create ideas for games. He's decided that it's time for me to move on to new adventures. Sorry to have things end like this. I've enjoyed my 10 years at AEG. I wish the team there the best in their future endeavors.

I believe we're at a civilizational turning point. That who we are and how we are is going to change on the order of what happened during the Agricultural and Industrial Revolutions; and it's past time we started talking about it and not being afraid to discuss the topic. Talking about AI, being honest about what it can and cannot do, and thinking about the implications is something we have to begin to do in a widespread way. Humans have a unique creative spark that differentiates us and makes us special and we should celebrate that specialness as we experience this epic change.

For the record: I do not believe that AI will replace the work talented game designer/developers do, nor do I think it is appropriate to use AI to replace the role of designer/developers in the publication of tabletop games. During my time at AEG I developed and implemented policies and contracts that reflect those views. It's important to me that you know what I believe and what I don't believe on this particular topic, despite what you may have read elsewhere."

Whatever your position on generative LLMs and the like, when the COO of your company announces publicly that all of the company’s games could have been made by AI, it’s a problem. UK readers may recall when major jewelry chain Ratners’ CEO Gerald Ratner famously announced that the products sold in his stores were “trash”, instantly wiping half a billion pounds from the company’s value back in the early 1990s. The company was forced to close stores and rebrand to Signet Group. At the time the Ratners Group was the world's biggest jewelry retailer. Ratner himself was forced to resign in 1992. The act of making a damaging statement about the quality of your own company’s products became known as “doing a Ratner”.

Dancey was VP of Wizards of the Coast when the company acquired TSR, the then-owner of Dungeons & Dragons. He is also known as the architect of the Open Game License. Dancey served as Chief Operating Officer at AEG for 10 years, responsible for the day-to-day operations of the company as second-in-command to CEO John Zinser.
 


AI trying to generate creative content learns next to nothing from being fed AI-generated creative content.
Actually, it unlearns when you do that: over generations (the output of one AI feeding into the training of the next), the models become markedly more mediocre than the starting generation that was trained on human-generated input.
 


It lets them reduce staffing levels, which reduces overhead costs, which looks great on the quarterly and yearly reports to the SEC.
Anyone using AI to reduce staffing levels to this extent deserves the pain that is coming. The correct usage of AI, IMO, is to increase the quality-of-life and productivity of your workers, to allow them to more effectively drive your company.

Just one example of what I meant above about taking out drudgery: writing unit tests is essential. It's also, to a large extent, drudgery. Creating test data. Scaffolding tests around your functions. Injecting mock dependencies. Some frameworks make this less onerous, but even those eat into your productivity. With AI, I start from a generated scaffold and add the actual business rules I need for the unit tests. What used to take 9-12 hours now takes about 3-4.
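To illustrate the kind of scaffolding the post calls drudgery, here is a minimal sketch in Python using `unittest.mock`: the test data, the mock dependency injection, and the assertions are exactly the boilerplate an AI can generate so a developer only has to fill in the business rules. The function and class names (`apply_discount`, `PricingService`) are hypothetical, not from any real codebase.

```python
# Minimal sketch: the mock-injection scaffolding around a business rule.
# apply_discount and the pricing_service dependency are hypothetical.
from unittest.mock import Mock

def apply_discount(order_total, pricing_service):
    """Business rule: orders over 100 get the loyalty discount rate."""
    if order_total > 100:
        return order_total * (1 - pricing_service.loyalty_rate())
    return order_total

def test_apply_discount_uses_loyalty_rate():
    # Scaffolding: mock out the external pricing dependency.
    pricing = Mock()
    pricing.loyalty_rate.return_value = 0.10
    # Business rule under test: 150 -> 135 after a 10% discount.
    assert apply_discount(150, pricing) == 135.0
    pricing.loyalty_rate.assert_called_once()

def test_apply_discount_skips_small_orders():
    # Small orders should never touch the pricing service.
    pricing = Mock()
    assert apply_discount(50, pricing) == 50
    pricing.loyalty_rate.assert_not_called()
```

The mock setup and call-count assertions are the mechanical part; the two business-rule assertions are the part that still needs a human who understands the requirements.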

I also work on a service that is used by most of our developers across the entire company. Because of attrition that has nothing to do with AI (we have to keep the lights on, so management hasn't restaffed lost positions), we support this critical service with three developers plus me at half-time. Developers somewhere in the world are using it almost 24 hours a day, five days a week, not to mention the production usage. We had built up a large documentation store that almost no one used, even for simple questions. Putting that store behind an AI agent as a RAG, and requiring people to use it as T1 triage, brought our support cases down from impossible to merely hard to manage (about 35% over the 8 months it's been in use). And when someone does come to us, we have them share the exact question they posed to the agent, so we have a learning loop to improve it. It's reduced frustration on our side and brought our small team back from the brink of burnout. And we still have no more help, in case you were wondering.
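The docs-as-RAG triage flow described above can be sketched very simply: retrieve the most relevant documentation entry for a question, answer from it if there is any match, and escalate to a human otherwise. This is a toy sketch only; a real system would use embeddings and an LLM agent rather than word-overlap scoring, and the doc store contents here are invented for illustration.

```python
# Toy sketch of RAG-backed T1 triage: word-overlap retrieval stands in
# for embedding search, and the doc store contents are hypothetical.
DOC_STORE = {
    "auth-setup": "How to configure service authentication tokens and rotate API keys",
    "rate-limits": "Explanation of request rate limits and backoff behavior for clients",
    "onboarding": "Steps for a new team to register their application with the service",
}

def retrieve(question, store, top_k=1):
    """Rank docs by word overlap with the question; return the best doc ids."""
    q_words = set(question.lower().split())
    scored = sorted(
        store.items(),
        key=lambda kv: len(q_words & set(kv[1].lower().split())),
        reverse=True,
    )
    return [doc_id for doc_id, _ in scored[:top_k]]

def triage(question, store):
    """T1 triage: answer from docs if anything matches, else escalate."""
    best = retrieve(question, store)[0]
    q_words = set(question.lower().split())
    if q_words & set(store[best].lower().split()):
        return ("answered-from-docs", best)
    return ("escalate-to-team", None)
```

The key property is the escalation path: only questions the doc store cannot touch reach the small human team, which is what produced the reduction in support load the post describes.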
 

Anyone using AI to reduce staffing levels to this extent deserves the pain that is coming. The correct usage of AI, IMO, is to increase the quality-of-life and productivity of your workers, to allow them to more effectively drive your company.

Yeah...

(Bloomberg) -- Jack Dorsey’s Block is cutting 4,000 employees, reducing its workforce by nearly half, in a move the financial technology firm is describing as a bet on artificial intelligence changing the future of labor productivity.


“I don’t think we’re early to this realization,” he said. “I think most companies are late. Within the next year, I believe the majority of companies will reach the same conclusion and make similar structural changes. I’d rather get there honestly and on our own terms than be forced into it reactively.”

Block’s stock jumped as much as 21% after trading opened on Friday.

 

I didn't say that some won't do it; they're just going to experience pain on the other end. They could have used that trained staff to accelerate their bottom line. What's better: to reduce headcount, eliminating trained workers for savings, and keep your growth flat, or to empower your already-trained workforce and keep increasing earnings because you have more power by making use of both? My company has bet on the latter, and the stock price experienced a similar growth pattern due to changes that had nothing to do with reducing headcount. There's also the matter of morale among the workers who are left. You either tell them "you're important, and we're investing in you in addition to AI", or "you're not important: we're going to invest in AI and cut half. And we'll cut you too if it seems that AI can do even more."
 

And we'll cut you too if it seems that AI can do even more.

I'm sure you've been around long enough; what have you experienced?

I've seen close to 10 waves of layoffs to make the books look better. I've seen dozens of jobs outsourced.

You think the majority of tech isn't going to cut headcount if it can, thanks to AI?

There are already studies showing entry-level positions being reduced dramatically.

I don't know. I read the tea leaves, and am hoping as I have every year for nearly 20 years:

"Hope I'm not next."
 

I'm sure you've been around long enough; what have you experienced?

I've seen close to 10 waves of layoffs to make the books look better. I've seen dozens of jobs outsourced.

You think the majority of tech isn't going to cut headcount if it can, thanks to AI?

There are already studies showing entry-level positions being reduced dramatically.

I don't know. I read the tea leaves, and am hoping as I have every year for nearly 20 years:

"Hope I'm not next."
One reason I settled where I am and stopped contracting was because of the culture and environment.

We've been through a few mergers/acquisitions and have had RIFs based on those. We've also had RIFs based on political maneuvering, which sucked. We've had underperforming product lines cut, with RIFs, and that sucks. We've also had RIFs to move jobs to less costly cost centers, which sucks even more. We haven't had any RIFs based on AI; we just end up getting more work and different work, because they use it as an accelerator. A different kind of experience, but one that doesn't result in lost jobs. The only thing tied to jobs is performance, both personal and financial, for better or worse.
 

One reason I settled where I am and stopped contracting was because of the culture and environment.

We've been through a few mergers/acquisitions and have had RIFs based on those. We've also had RIFs based on political maneuvering, which sucked. We've had underperforming product lines cut, with RIFs, and that sucks. We've also had RIFs to move jobs to less costly cost centers, which sucks even more. We haven't had any RIFs based on AI; we just end up getting more work and different work, because they use it as an accelerator. A different kind of experience, but one that doesn't result in lost jobs. The only thing tied to jobs is performance, both personal and financial, for better or worse.

To be fair, I've not seen a reduction due to AI yet. Our usage seems to lag behind what I read, but given my experience otherwise? If people can be cut, why wouldn't they be?

Time will tell, but no matter what, I don't see a good end result here.
 
