WotC: 'We made a mistake when we said an image was not AI'


It seems like AI art is going to be a recurring news theme this year. While this is Magic: the Gathering news rather than D&D or TTRPG news, WotC and AI art has been a hot topic a few times recently.

When MtG community members observed that a promotional image looked like it was made with AI, WotC denied that was the case, saying in a now-deleted tweet "We understand confusion by fans given the style being different than card art, but we stand by our previous statement. This art was created by humans and not AI."

However, they have just reversed their position and admitted that the art was, indeed, made with the help of AI tools.

Well, we made a mistake earlier when we said that a marketing image we posted was not created using AI. Read on for more.

As you, our diligent community pointed out, it looks like some AI components that are now popping up in industry standard tools like Photoshop crept into our marketing creative, even if a human did the work to create the overall image.

While the art came from a vendor, it’s on us to make sure that we are living up to our promise to support the amazing human ingenuity that makes Magic great.

We already made clear that we require artists, writers, and creatives contributing to the Magic TCG to refrain from using AI generative tools to create final Magic products.

Now we’re evaluating how we work with vendors on creative beyond our products – like these marketing images – to make sure that we are living up to those values.


This comes shortly after a different controversy in which a YouTuber accused them (falsely, in that case) of using AI on a D&D promotional image, after which WotC reiterated that "We require artists, writers, and creatives contributing to the D&D TTRPG to refrain from using AI generative tools to create final D&D products."

The AI art tool Midjourney is currently being sued in California by three Magic: The Gathering artists who discovered that their work, along with that of nearly 6,000 other artists, had been scraped without permission. That case is ongoing.

Various tools and online platforms are now incorporating AI into their processes. AI options are appearing on stock art sites like Shutterstock, and creative design platforms like Canva are now offering AI. Moreover, tools within applications like Photoshop are starting to draw on AI, with the software intelligently filling spaces where objects are removed and so on. As time goes on, AI is going to creep into more and more of the creative processes used by artists, writers, and video-makers.



Parmandur

Book-Friend, he/him
We used to think that AI couldn't produce such complicated images at all. Declaring such feats impossible is... maybe not a good bet to make.

Better to say that we can't do that... yet. But, again, short of cryptographic signing, in principle there's no digital asset that generative AI systems cannot create. There are only ones they haven't yet been trained to create.



If you make such layers part of the training data, it is entirely possible to do.



"A long way" doesn't mean what it used to. For the next couple of years, maybe this will suffice, but if there's sufficient money at stake, the technology will catch up sooner than we'd want it to.



No, again, they'd need to start with data on how an artist works. Current generative AI is trained on only the end products, because those are easy to get off the internet. Feed it files with those histories, though, and it becomes a different ball game.

Watch for image creation tools to start having terms that allow access to your data for "diagnostic purposes"...



That is not fundamental to the technology. That is merely what we have trained it to produce. Note that producing art and producing text is not even fundamental to the technology. For example, back in the day, I did research in training neural networks to simulate high energy particle collisions for tuning data analysis tools at accelerators.

Since the AI doesn't understand what it is doing, it also doesn't actually care what it is doing - what is fundamental to the technology is intake of digital data and output of things that are similar to that data. And that's about it, fundamentally speaking.
I could see companies like WotC, who I genuinely believe don't want to use AI (for mundane business reasons of IP control more than ethics, but still), having to cut out digital art as a medium: require all art to be on a physical medium, with proof of work for the physical art objects.
 


dragoner

KosmicRPG.com
I could see companies like WotC, who I genuinely believe don't want to use AI (for mundane business reasons of IP control more than ethics, but still), having to cut out digital art as a medium: require all art to be on a physical medium, with proof of work for the physical art objects.
I think it will be AI checking art for AI, all made by the same company, really brilliant, create a problem only you can solve. lol
 

No.

I understand that neither of you are artists (at least in the sense of using digital tools), but you're just making stuff up. There are two issues here:

1) You can require, as @Parmandur correctly points out, proof of work. If you actually did the art, in any normal way, you have this. Period.
So, here's the thing: I can show you the image I generated along with the prompts, then show you the image after I did image-to-image or inpainting (or both, depending on what I felt was needed), then show you the image after some upscaling tweaks as the final image.

In fact, as an example, here's an article in which the author shares his (yes, I checked the user's profile) workflow for upscaling images to over 12K: https://civitai.com/articles/3582/u...-my-workflow-for-upscaling-images-to-over-12k
 

Umbran

Mod Squad
Staff member
Supporter
I could see companies like WotC, who I genuinely believe don't want to use AI (for mundane business reasons of IP control more than ethics, but still), having to cut out digital art as a medium: require all art to be on a physical medium, with proof of work for the physical art objects.

It shouldn't be necessary.

For now, providing digital artifacts as evidence of provenance should work.

It also occurs to me that a proper contract can handle the rest - if the publisher asks for art made by a person, and you actively provide false evidence, that's a violation of contract and fraud. If you are caught, that's a different kettle of fish. Making a cheap buck is one thing. Getting sued for fraud is another.

And, perhaps more importantly, if you create tools that have no other purpose than fraud, that's also a different kettle of fish. I am not sure that big providers of AI tools would want to be named in such fraud suits.
 

Parmandur

Book-Friend, he/him
It shouldn't be necessary.

For now, providing digital artifacts as evidence of provenance should work.

It also occurs to me that a proper contract can handle the rest - if the publisher asks for art made by a person, and you actively provide false evidence, that's a violation of contract and fraud. If you are caught, that's a different kettle of fish. Making a cheap buck is one thing. Getting sued for fraud is another.

And, perhaps more importantly, if you create tools that have no other purpose than fraud, that's also a different kettle of fish. I am not sure that big providers of AI tools would want to be named in such fraud suits.
A couple years ago, nobody would have even thought any of this was an issue. Really getting out of control fast.
 

As soon as art editors start checking work like that across industries as general policy, we will see AI created to also produce that history.

Generative AI is not currently designed to create it, but there's no digital asset it technically cannot put together.

Ultimately, we may need something like a company that makes digital art tools that creates trust that none of the tools use generative AI, and uses something like PGP to sign the art so produced to certify it AI-free.
None of the AI companies have any incentive to create a model that creates a fake and believable edit history. The only use for such a model is to commit fraud, for "artists" contracted to deliver non-AI art but wanting to fudge it. That's not a very large potential customer base, and training new models is really expensive.
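The PGP-style certification idea mentioned above (a trusted tool signs the art it exports so a publisher can verify it later) could, as a minimal stdlib-only sketch, look like the following. Real schemes such as OpenPGP use public-key signatures; the shared-secret HMAC here is only a stand-in because it ships with Python, and the key, file bytes, and function names are all hypothetical.

```python
import hashlib
import hmac

# Hypothetical sketch: a trusted art tool signs every exported image with a
# key; the publisher verifies the signature to confirm the file is byte-for-
# byte identical to what the trusted tool exported. Any post-export edit
# (e.g. by an uncertified tool) invalidates the signature.
TOOL_KEY = b"tool-vendor-secret-key"  # stand-in for a real private key

def sign_artwork(image_bytes: bytes, key: bytes = TOOL_KEY) -> str:
    """Return a hex signature over the exact bytes of the exported image."""
    return hmac.new(key, image_bytes, hashlib.sha256).hexdigest()

def verify_artwork(image_bytes: bytes, signature: str,
                   key: bytes = TOOL_KEY) -> bool:
    """Check the signature; constant-time compare avoids timing leaks."""
    expected = hmac.new(key, image_bytes, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

# Simulated export and later verification
art = b"\x89PNG...raw image bytes..."
sig = sign_artwork(art)
print(verify_artwork(art, sig))            # untouched file passes
print(verify_artwork(art + b"edit", sig))  # any edit after signing fails
```

Note this only certifies which tool produced the file, not how the pixels were made; the trust still rests on the tool vendor keeping generative features out of the signing path.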
 

None of the AI companies have any incentive to create a model that creates a fake and believable edit history. The only use for such a model is to commit fraud, for "artists" contracted to deliver non-AI art but wanting to fudge it. That's not a very large potential customer base, and training new models is really expensive.

Yeah, creating what are essentially fraud tools would only serve to delegitimize the programs they are making, rather than pushing them as a norm.
 


Vincent55

Adventurer
That's not necessarily true, and even if it were that doesn't give you permission to use their work without permission or recompense. Just because Taylor Swift makes money off her albums and concerts doesn't mean I can start using her songs in commercials without her permission.



But you can copyright works, and if they're using those works as part of their process, it seems to me that it should be done with permission, given that the output could not exist without the input. I think that's the thing here: this is not a machine making something up on its own, but basically doing a complicated form of tracing, where it is hacking together different things.

The simple, ethical way of doing this would be to simply actually pay for permission or to pay artists to create pieces to a collection that the AI can work off of.
I don't think you are getting my point: the AI is using an artist's style, not reproducing a piece of art they have done, which would be illegal. You can't own a style of art: Impressionism, Dadaism, Cubism, etc. Anyway, you all seem to be just not getting it, so bye
 

yes and no

We're speaking of an artist training a model to create intermediate layers so he can pass off AI-assisted work as non-AI-assisted work in order to conceal a breach of contract. WotC suppliers would need to make a lot of money to make it viable, even if, as you rightly pointed out, the price of training data is dropping. Honestly, I don't see anyone doing that in the near future.
 
