
WotC: 'Artists Must Refrain From Using AI Art Generation'

WotC to update artist guidelines moving forward.

After it was revealed this week that one of the artists for Bigby Presents: Glory of the Giants used artificial intelligence as part of their process when creating some of the book's images, Wizards of the Coast has made a short statement via the D&D Beyond Twitter (X?) account.

The statement is in image format, so I've transcribed it below.

Today we became aware that an artist used AI to create artwork for the upcoming book, Bigby Presents: Glory of the Giants. We have worked with this artist since 2014 and he's put years of work into books we all love. While we weren't aware of the artist's choice to use AI in the creation process for these commissioned pieces, we have discussed with him, and he will not use AI for Wizards' work moving forward. We are revising our process and updating our artist guidelines to make clear that artists must refrain from using AI art generation as part of their art creation process for developing D&D art.


-Wizards of the Coast​




Ilya Shkipin, the artist in question, talked about AI's part in his process during the week, but has since deleted those posts.

There is recent controversy on whether these illustrations I made were ai generated. AI was used in the process to generate certain details or polish and editing. To shine some light on the process I'm attaching earlier versions of the illustrations before ai had been applied to enhance details. As you can see a lot of painted elements were enhanced with ai rather than generated from ground up.

-Ilya Shkipin​

 


MGibster

Legend
How does a machine define "style", except by reference to examples?
I don't know. How does a person? If I produced art that looked just like something Frazetta would have done, am I violating copyright? Unless I'm making a copy of a specific work of his, I'm guessing the answer is no.

But since AI doesn't really produce something in the style of someone, at least not yet, my original question was irrelevant. I was under the impression AI was creating original work in the style of other artists but I have been disabused of that notion.
 


Snarf Zagyg

Notorious Liquefactionist
That is what teachers are really wrestling with. We have been teaching based on a theory of the mind that is rooted in a whole lot of cultural and metaphysical assumptions. We've known for some time that there are problems with this model, but education is a vast industry with a whole lot of inertia. Trillions of dollars and centuries worth of inertia. And this new technology is revealing that a lot of things we had assumed were uniquely special about human minds might not work at all like we thought.

We aren't close to wrapping our heads around the implications yet, or understanding what it means for education going forward. For us, this is an unparalleled existential crisis.

Yeah, while we are all talking about the legal ramifications (poorly, but I don't want to get into that) and the ethical implications related to art, there are going to be much broader and deeper implications in a large number of fields.

Moreover, this may have large and unknown (and at this time, unknowable) effects on a broad swath of knowledge-related fields. In the very near term, it's hard to say. But in the medium term (10 years) ... I expect that we will see differences.

Just think of the differences we see just between the introduction of the iPhone and today. And that's ... not the same. IMO, YMMV, etc.
 

FrogReaver

As long as i get to be the frog
So, I am not an expert on generative AI, but I've read up a bit because this is culturally relevant, and some of my early doctoral research was on training neural networks1. Actual software engineers, feel free to correct me.

How do Generative AIs work? Here's, broadly and generically, how:

1) Assemble a training set of data - generally, a bunch of actual examples that you want the AI to simulate. Usually a very large bunch, if you want good results. You also include metadata - with the actual Frazetta works you include in the training data, you tag them as being by Frazetta, or in Frazetta's style. You might also tag them as being fantasy, containing dragons, Conan, barbarians, axes, etc.

2) Train the AI - there are a bunch of ways to do this, but we can use one simple method to demonstrate some of the activity - you present the system with the entire set of tags, and one example from the training set, and ask the machine to guess what tags apply to the example.

If the system guesses right, parts of the algorithm that are responsible for that guess are strengthened. If it guesses wrong, the parts of the algorithm are weakened. In either case, it updates a "reference dataset" with the example, associated with the right tags, for later.

Lather, rinse, repeat. Each repeat alters its algorithm, and the reference set, to maximize its ability to answer correctly.

3) Then, to use the generative AI, you reverse the process. You hand it a collection of tags (the description of what you want it to produce), and it spits out a collection of stuff from its reference set that the algorithm says matches those tags.

Since Frazetta signed all his work, that signature appears in every example of his work presented to the system, so his signature is strongly associated with the tag of his name. The AI will often spit the signature out as an element when a query asks for his style - for the machine, his "style" includes his signature, you see.

Thus - assembling the training data likely violates copyright for prose or visual art generative AIs, because building that set means making digital copies that aren't for personal use. And the reference set will still retain snippets of the original data, like Frazetta's signature, which can also infringe copyright in much the same way song-sampling can infringe on a musician's.
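The three numbered steps, and the signature effect, can be sketched in miniature. To be clear, this is a toy model of the simplified scheme described above, not how real image generators (diffusion models, transformers) actually work - they don't keep a literal reference set - and every name and token in it is invented for illustration. But it shows why a tag that always co-occurs with a feature will hand that feature back:

```python
# Toy model of the tag-association scheme described above. Purely
# illustrative: real image generators do not store a literal reference
# set of examples, and all names/tokens here are invented.
from collections import defaultdict

def train(examples):
    """examples: list of (feature tokens, tags). Returns tag -> token weights."""
    weights = defaultdict(lambda: defaultdict(int))
    for tokens, tags in examples:
        for tag in tags:
            for tok in tokens:
                weights[tag][tok] += 1  # "strengthen" the association
    return weights

def generate(weights, tags, top_n=3):
    """Reverse the process: given tags, emit the most strongly associated tokens."""
    score = defaultdict(int)
    for tag in tags:
        for tok, w in weights[tag].items():
            score[tok] += w
    return [tok for tok, _ in sorted(score.items(), key=lambda kv: -kv[1])[:top_n]]

# Every "frazetta" training example carries his signature, so the signature
# token ends up bound to the tag of his name and comes back out on request.
training = [
    ({"barbarian", "axe", "signature"}, {"frazetta", "fantasy"}),
    ({"dragon", "mountain", "signature"}, {"frazetta", "fantasy"}),
    ({"spaceship", "stars"}, {"scifi"}),
]
model = train(training)
print(generate(model, {"frazetta"}))  # "signature" ranks first
```

Because the signature token appears in 2 of 2 "frazetta" examples while every other token appears in only one, it gets the top weight for that tag - which is the mechanism behind the signature artifacts described above.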



1 I was working on training a neural network to simulate high energy particle physics events. My datasets were publicly available data from high energy particle accelerators/colliders. No sketchy source data for me!
I’m reminded of some old research. The military was trying to have a computer pick out points of interest from satellite images.

They trained it on a bunch of satellite images. Except the images they wanted identified positively just so happened to have been taken on cloudy days. The computer made that connection and flagged every cloudy image as a positive.

Much like the signature piece, the computer only understands by example. Common elements in the example are then associated with the request whether they should be relevant to it or not.
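That cloudy-day failure is easy to reproduce in miniature with almost any learning method. A hedged sketch follows, with invented numbers and a simple nearest-centroid rule standing in for whatever the original system used: feature 0 is overall image brightness (the confound), feature 1 is a hypothetical "target shape" score (the real signal, here only weakly separated between classes):

```python
# Sketch of a spurious-correlation failure. All feature vectors are made
# up for illustration: [brightness, target_shape_score].
def centroid(rows):
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def classify(x, pos_c, neg_c):
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return "target" if dist(x, pos_c) < dist(x, neg_c) else "no target"

# Every positive training image happened to be taken on a dark, cloudy day.
positives = [[0.1, 0.65], [0.2, 0.55], [0.1, 0.6]]   # dark, target present
negatives = [[0.9, 0.35], [0.8, 0.45], [0.9, 0.4]]   # bright, no target
pos_c, neg_c = centroid(positives), centroid(negatives)

# The brightness gap dominates the distance, so the model has effectively
# learned "cloudy", not "target":
print(classify([0.1, 0.4], pos_c, neg_c))  # dark, no target -> "target"
print(classify([0.9, 0.6], pos_c, neg_c))  # bright, target  -> "no target"
```

Nothing here is broken in the algorithm itself; the training data simply never forces it to distinguish the confound from the signal, which is the same trap the signature example falls into.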
 

Lanefan

Victoria Rules
Why would a corporation buy it when they can wait for someone to post it online and scrape it?
Because scraping it isn't going to reproduce the original piece in its entirety, or even close.
Part of regulation would need to be an audit trail on the AI training sets.
Given how poorly any other internet regulations seem to be working, I'm not going to hold my breath on this one.
And how can you tell? Well, right now, how does the music industry know when someone has sampled a song they didn't have rights to?
They usually can't until they hear it on the radio, which means it's become successful enough that it's worth suing over.
 

Lanefan

Victoria Rules
An interesting note as to how deep this specific-to-D&D issue has penetrated the mainstream: today's Victoria Times-Colonist (the local paper) has an article on the front page of its business section headlined "D&D tells illustrators to stop using AI to generate art related to game".
 

FrogReaver

As long as i get to be the frog
Yeah, while we are all talking about the legal ramifications (poorly, but I don't want to get into that) and the ethical implications related to art, there are going to be much broader and deeper implications in a large number of fields.

Moreover, this may have large and unknown (and at this time, unknowable) effects on a broad swath of knowledge-related fields. In the very near term, it's hard to say. But in the medium term (10 years) ... I expect that we will see differences.

Just think of the differences we see just between the introduction of the iPhone and today. And that's ... not the same. IMO, YMMV, etc.
The more I learn about current generative AI’s limitations and how it actually works the less afraid I am of it taking tons of knowledge related jobs. I can see it more as a glorified assistant.
 

The more I learn about current generative AI’s limitations and how it actually works the less afraid I am of it taking tons of knowledge related jobs. I can see it more as a glorified assistant.
In a lot of fields, "assistant" is the entry level position. It might get hard to start your career if AI is doing most of the junior work.
 



Umbran

Mod Squad
Staff member
Supporter
Drawings seem completely outside its realm. There are other AI models trained specifically to create images, but this one, again, only knew words. It's just playing the game of "what is the next word I should spit out?" To test this, he needed a way for it to even be able to try to draw. So he does something clever.

He asks it to write a piece of computer code to draw something. And the coding language he asks it to use, he picks something intentionally obscure, not really meant for drawing pictures at all. It's called TikZ. OK, so he has this idea, gets out of bed, opens up his laptop, and types in "draw me a unicorn in TikZ." He has two little kids asleep in the next room who are always talking about unicorns.

Sébastien Bubeck: And it started to output lines of code. I take those lines of code, put it into a TikZ compiler, and then I press enter. And then, boom, you know, the unicorn comes onto the screen.

So, by my description, this becomes less mysterious. That episode of This American Life was from this June. But software engineers have been using ChatGPT as a coding tool since its release. The full data set for its training has not been revealed, but if it includes software man pages and GitHub, well, then its "guess the next word" will include code in its possible contexts.

And putting it into an obscure language is unremarkable, and may not be part of the AI proper - generative AIs have systems separate from the AI to help keep the output cogent and reasonable, and there are already systems that will translate code from one language to another. If a translation element is built into the formatting systems, that's not weird.
 
