Multiple "AI Art" Updates and Controversies in Tabletop Gaming

BackerKit bans it, Wizards of the Coast replaces it, and Essen Spiel is caught using algorithmically generated artwork.

Three news stories came out this week about algorithmic generation, aka “AI art”, in the tabletop gaming industry.

[Image: BackerKit AI policy announcement]
BackerKit announced that, effective October 4, no project will be allowed to include any writing or art assets created entirely by algorithmic generation, aka “AI”. From the blog post:

At BackerKit, our team is passionate about people’s passions. For ten years, we’ve supported creators in their journey to launch projects and build thriving creative practices and businesses. We’ve developed deep relationships and respect for the people who breathe life into crowdfunding projects, and we are committed to defending their well-being on our platform.

That’s why we are announcing a new policy that aims to address growing concerns regarding ownership of content, ethical sourcing of data, and compensation for the process of creating content. […]

As part of this consideration, BackerKit has committed to a policy that restricts the use of AI-generated content in projects on our crowdfunding platform.

This policy goes into effect on October 4, 2023.

[…] This policy emphasizes that projects on BackerKit cannot include content solely generated by AI tools. All content and assets must first be created by humans.

This doesn’t impact content refined with AI-assisted tools like “generative content fill” or “object replacement” (image editing software functions that help blend or replace selected portions of an image), other standard image adjustment tools (saturation, color, resolution), or AI language tools that refine human-created text with modifications to spelling, grammar, and syntax.

Software assisted by AI, such as transcribers or video tracking technology, is permitted under these guidelines. However, software with the purpose of generating content using AI would not be permitted.

The post includes image examples of what content is and is not allowed. Additionally, BackerKit will add a back-end option for creators that will allow them to “exclude all content uploaded by our creators for their projects from AI training”. This exclusion is the default, meaning creators who want their work used to train generative algorithms must go in and specifically allow it.

[Image: Altisaur artwork]

This move comes alongside a pair of recent controversies in tabletop gaming. Last month, Wizards of the Coast came under fire when it was revealed that a freelance artist had used algorithmic generation for artwork included in Bigby Presents: Glory of the Giants. Wizards of the Coast quickly updated its stance on algorithmic generation with a statement that the artwork would be removed from the D&D Beyond digital copies of the book and that new contract language would ban the use of algorithmic generation.

This week, Gizmodo’s Linda Codega reported that the artwork in the D&D Beyond version of Bigby Presents has now been replaced with new art. No announcement was made about the new artwork, and Gizmodo’s attempts to contact Wizards of the Coast for comment were directed to the statement made in August. The artist who used algorithmic generation, Ilya Shkipin, has been removed from the book’s art credits, and the artwork has been replaced by works by Claudio Prozas, Quintin Gleim, Linda Lithen, Daneen Wilkerson, Daarken, and Suzanne Helmigh.


Meanwhile, the largest tabletop gaming convention in Europe, Essen Spiel, recently ran into the same controversy when algorithmically generated artwork appeared in promotional material for the convention, including the convention’s official app, promotional posters, and tickets for the event.

Merz Verlag, the parent company for the convention, responded to a request for comment from Dicebreaker:

"We are aware of this topic and will evaluate it in detail after the show. Right now please understand that we cannot answer your questions at this moment, as we have a lot to do to get the show started today," said a representative for Merz Verlag.

"Regarding the questions about Meeps and timing, I can tell you quickly that the marketing campaign [containing AI artwork] has been created way before we had the idea to create a mascot. The idea of Meeps had nothing to do with the marketing campaign and vice versa."

Meeps, a board game-playing kitten entirely innocent of the controversy (because who could blame a cute kitty?), is the convention’s new mascot, announced this past July after a fan vote and designed by illustrator Michael Menzel.
 

Darryl Mott


Morrus

Well, that was fun
Staff member
Question: now that AI can do some/much of what programmers do, how many of you expect your video game companies to use "ethically sourced" code?

Or how about your lawyers? If a lawyer uses AI to generate a draft of a brief, are you ok with that as long as the final product is up to professional standards, or do you worry about "ethically sourced" contracts?
Your use of quotation marks is interesting. Does the phrase "ethically sourced" bother you?

I would like all companies to act ethically all of the time. Does that answer your question?

Do you not want companies to act ethically? I mean, "I want companies to act unethically" is a hard stance to defend, but I'm happy to hear it!
 


Morrus

Well, that was fun
Staff member
Oh, I want companies to act ethically as well, and I will (try to) not buy from those who do not.

I just don't find the use of technology to increase efficiency, even at the cost of jobs, to be inherently unethical.

It can be a bad business decision if the quality is less, sure.

It can be unethical in the execution (for example, it's pretty sleazy to require an employee to train their cheaper replacement, and then fire them when it's done.)

And I see nothing wrong with buyers organizing boycotts in order to protect jobs they want to protect.

But I'm just not sure why it's unethical to save money by using AI to generate illustrations. If the product is inferior, and consumers don't buy it because of that, the publishers will regret it.

Because the AI is not making new art, it's plagiarising existing art. That's the core of the issue.

And before anybody says it--it's not 'influenced by' previous art. It's not 'inspired by giants'. It's not painting or writing 'in the style of'. It mechanically copies it--heck, the original signatures sometimes show up! AI is not capable of creativity yet. It can't make anything new, like a human can. Maybe one day it will, but right now that's a long, long way away. These are all Large Language Models; they're not really AI; they're not thinking, they're copying.

And sure, that's the artist's problem, not yours. I get it. But when making art is not a viable option any more, all you will have is regurgitated art. Nothing new, because nobody's making anything new.

So it's not just the artist's problem, it's society's problem, because we as humans value art. We don't want a dystopia with nothing new ever being created, just regurgitated images and movies and books. It's bad enough now, let alone when humans no longer have a part in it. So it is in our interests to protect the artists, the writers, the actors, etc, so we get to continue to live in a society where new stuff is made.

But if the consumers don't seem to ultimately notice/care, then why is it unethical?

Ethics aren't based on whether or not I notice them. I don't even know how to respond to that, other than yes, it's still unethical even if I didn't notice or care.
 

Abstruse

Legend
Question: now that AI can do some/much of what programmers do, how many of you expect your video game companies to use "ethically sourced" code?

Or how about your lawyers? If a lawyer uses AI to generate a draft of a brief, are you ok with that as long as the final product is up to professional standards, or do you worry about "ethically sourced" contracts?
The difference is consequences.

If you let an algorithm write code, you can end up with code that doesn't work, creates more bugs elsewhere in the engine, or, worst of all, is actively malicious to the system. All of these cases end up costing the company money.

And the one lawyer who decided to use a Large Language Model to draft a court filing is currently undergoing a review of his license after the judge fined both him and the firm he was working with. That judge was being lenient by only issuing fines rather than pursuing contempt of court charges and further sanctions, since the filing cited entirely fictional cases.

It's why people talk so much about the use of algorithmic generation for art and creative writing - the only way for there to be negative consequences is if fans create those consequences themselves. Tell companies you want real human artists paid fair wages for their work, or you won't buy their products. Do you think the ex-Amazon, ex-Microsoft executives at Wizards of the Coast who approved spending money commissioning all that new art actually cared that the original artist used "AI enhancements"? Or was it that the last time they ignored the creative employees working on D&D warning that something would cause a huge fan backlash, it did indeed cause a huge fan backlash, so this time they listened and spent the money on new art?
 

Scribe

Legend
It's why people talk so much about the use of algorithmic generation for art and creative writing - the only way for there to be negative consequences is if fans make there be negative consequences.

There are negative consequences just as with generated code.

Generated code has bugs, leads to more bugs, and yeah, a human has to fix that.
Generated art has AI artifacts and issues, and yeah, a human has to fix that.

In either scenario, a human is being replaced by the generation of code/content.

In both cases, there is a cost to the human and a consequence (for now) to the company.
 



