WotC: 'Artists Must Refrain From Using AI Art Generation'

WotC to update artist guidelines moving forward.

After it was revealed this week that one of the artists for Bigby Presents: Glory of the Giants used artificial intelligence as part of their process when creating some of the book's images, Wizards of the Coast has made a short statement via the D&D Beyond Twitter (X?) account.

The statement is in image format, so I've transcribed it below.

Today we became aware that an artist used AI to create artwork for the upcoming book, Bigby Presents: Glory of the Giants. We have worked with this artist since 2014 and he's put years of work into books we all love. While we weren't aware of the artist's choice to use AI in the creation process for these commissioned pieces, we have discussed with him, and he will not use AI for Wizards' work moving forward. We are revising our process and updating our artist guidelines to make clear that artists must refrain from using AI art generation as part of their art creation process for developing D&D art.


-Wizards of the Coast​




Ilya Shkipin, the artist in question, talked about AI's part in his process during the week, but has since deleted those posts.

There is recent controversy on whether these illustrations I made were AI generated. AI was used in the process to generate certain details or polish and editing. To shine some light on the process, I'm attaching earlier versions of the illustrations before AI had been applied to enhance details. As you can see, a lot of painted elements were enhanced with AI rather than generated from the ground up.

-Ilya Shkipin​

 


Umbran

Mod Squad
Staff member
Supporter
When ChatGPT inevitably morphs into GMGPT and spits out high-quality D&D modules at the push of a button, I assume we'll be making the same ruckus?

Presumably. It is easy not to care about what happens to visual artists, or authors, if you are not a visual artist or author.
 

Umbran

Mod Squad
Staff member
Supporter
Do you know why cover bands can do that without any issues? Because the venues pay for license rights for songs to be played live by any cover bands.

Yep. Doing cover songs was a standard of musical performance before there was ever an industry around it.

IIRC, artists don't have to pay a fee to cover a song live - that cost is paid by the venue, and is usually handled by blanket agreements with BMI, ASCAP, and SESAC, so that venues don't have to hunt down exactly who has the rights to which song.

To record and distribute a cover of a song in the US, though, you need what is called a "mechanical license", and the cost is, again IIRC, about 12 cents per copy.
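As a rough back-of-the-envelope sketch (assuming the roughly 12-cents-per-copy figure recalled above and a purely hypothetical run size; the current statutory rate should be checked before relying on it):

```python
# Back-of-the-envelope mechanical royalty estimate. The per-copy rate is the
# roughly 12-cent figure recalled above, and the run size is a purely
# hypothetical example - not a statement of the actual statutory rate.
PER_COPY_RATE_USD = 0.12      # assumed per-copy mechanical rate
copies_distributed = 1_000    # hypothetical pressing/download run

total_royalty = copies_distributed * PER_COPY_RATE_USD
print(f"Estimated mechanical royalties: ${total_royalty:,.2f}")  # $120.00
```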
 

The broader issue for me with respect to automation is, if something is automated, who gets to benefit from the supposed increase in productivity and efficiency. That's not a question about the technology, but about how our laws, regulations, and politics shape the use of that technology. In unregulated environments, new technologies can be leveraged to increase the wealth and power of a select few at the cost of the many, whose labor and lives become more precarious.

Hopefully what's starting out now as consumer activism and corporate responses (e.g. people complaining about AI art and companies being forced to take that into consideration) eventually results in a more robust legal framework, not just around AI but around this larger question of who gets to benefit. Getting in the way of that would be the large corporations themselves that stand to profit most, and I guess individuals who see (particular kinds of) technological change as inevitable and operating separately from any social/legal/political process (which has never been the case historically).
 

robus

Lowcountry Low Roller
Supporter
Yep. Honestly, they shouldn't be able to call these things AI. Nothing created by tech has been even remotely intelligent. Which, when we see what the dumb AIs can do, should scare everyone.
Artificial Imitation. It is aping human creativity, it doesn't have a single original idea. Its entire process requires the seed of a human idea. It's even showing that when it consumes its own content it devolves into chaos very quickly.

(Which is why I think this one moment of time is not going to last - the separation between human work and AI work has to be clearly delineated or else the AI ceases to function!)
 

Umbran

Mod Squad
Staff member
Supporter
Does the AI actually copy images or does it just copy style?

How does a machine define "style", except by reference to examples?

Even if the operation were like you say, the training of the machine requires making digital copies of the works, which is generally a copyright violation right there.
 

FrogReaver

As long as i get to be the frog
Artificial Imitation. It is aping human creativity, it doesn't have a single original idea. Its entire process requires the seed of a human idea. It's even showing that when it consumes its own content it devolves into chaos very quickly.

(Which is why I think this one moment of time is not going to last - the separation between human work and AI work has to be clearly delineated or else the AI ceases to function!)
That's the problem - one simple new technological development could cause this view to come crashing down. Suppose tomorrow they solved the problem of feeding AI, as input, what AI had previously generated as output.
 


robus

Lowcountry Low Roller
Supporter
That's the problem - one simple new technological development could cause this view to come crashing down. Suppose tomorrow they solved the problem of feeding AI, as input, what AI had previously generated as output.
Well, then we'd truly have artificial intelligence, as it would be able to understand what it was experiencing. I think that's still quite a ways off. Believe me, I'm quite worried about generalized AI, but I think we're not much closer to that despite appearances.

What we currently have is a very clever shortcut that relies on imitating human creativity by consuming patterns and using those patterns to produce new things, but it has absolutely no understanding of those patterns and thus cannot regulate its inputs.
 

Morrus

Well, that was fun
Staff member
That's the problem - one simple new technological development could cause this view to come crashing down. Suppose tomorrow they solved the problem of feeding AI, as input, what AI had previously generated as output.
Not only do they already do this, it is already causing a problem — outputs are degenerating as the AI regurgitates itself. Even its maths ability has degraded. Without the original input of people, it’s nothing.
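As a minimal toy sketch of that feedback loop (purely illustrative, assuming a one-dimensional Gaussian stands in for "the model"; this is not a simulation of any real image or text system), repeatedly refitting a model to its own synthetic output tends to make the fitted distribution drift and narrow over generations:

```python
import random
import statistics

# Toy illustration of a model trained on its own output: each generation's
# "model" (a Gaussian) is refit only to synthetic samples drawn from the
# previous generation. With finite samples, the fit drifts and its spread
# tends to shrink - a loose analogy for degradation, not a real AI system.

random.seed(1)

mean, stdev = 0.0, 1.0        # generation 0 stands in for the original human-made data
samples_per_generation = 25   # a small sample size makes the drift visible quickly

for generation in range(1, 21):
    synthetic = [random.gauss(mean, stdev) for _ in range(samples_per_generation)]
    mean = statistics.fmean(synthetic)    # refit on purely synthetic data
    stdev = statistics.stdev(synthetic)
    print(f"gen {generation:2d}: mean={mean:+.3f}  stdev={stdev:.3f}")
```

Without fresh human-made data mixed back in at each generation, nothing in the loop pulls the fit back toward the original distribution.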
 
