D&D 5E Replacement art is up for Bigby's AI art on D&D Beyond!

Right, but some people at the time claimed that the original art came from a different artist, and he then used the AI tools to enhance it directly and turn it into the published version, which would be bad. HOWEVER...
You are absolutely right. At the time people made all sorts of unfounded claims. Hence my rant about misinformation and people being lazy.

I do appreciate you looking up the facts (not being lazy G) and posting back here. It just frustrates me that in ten years more people will probably remember the misinformation about this event than will remember the truth :(
 


Marandahir

Crown-Forester (he/him)
Ack, still seeing people make unfounded claims about how AI filters work.

The so-called AI was trained on all the art it could find on the internet. It's not just another tool in the toolbox; it's an amalgam of other people's works with the original work.

It falls into similar territory as music sampling, except that sampling draws on a specifically chosen piece of music, and that legal grey area is a fight worth fighting over.

So-called AI-enhanced art is dangerous to the art community as it could render new art "obsolete" by big business interests who utilize AI to regurgitate other people's works. Luckily, the legal space here seems to be very congruent with the idea that you do not own works you create with "AI"-enhancements, and thus there's less of an incentive to publish such knowingly.
 

could render new art "obsolete" by big business interests who utilize AI to regurgitate other people's works.

Although a business like Disney would benefit from a landscape that said you can only train AI on art that you already own. Disney owns an extraordinary amount of content and is able to train very good AI just on that. Sony used AI (trained on their own content, apparently) to animate parts of "Across the Spider-Verse" (source).

These enormous companies, with massive content libraries, would benefit from a ruling that prevented the creation of models trained on publicly available content.

In the same way, Google's updated terms of service allow it to use any data it collects for search to train its AI models (source). If you don't want Google to use your website data, it's very simple - exclude your website from Google search (and disappear off the internet). I assume Bing has similar terms of service. A legal ruling that restricted everyone else from using that data to train their text AI models (LLMs) would greatly benefit Google and Bing.
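For reference, here is a minimal robots.txt sketch of what "exclude your website from Google search" looks like in practice, assuming you are willing to disappear from Google (and Bing) results entirely, which is exactly the trade-off described above. Googlebot and Bingbot are the real crawler user-agent tokens; whether blocking them also keeps your content out of AI training pipelines is the open question.

    # robots.txt, placed at the root of your site
    # Block Google's search crawler entirely (and drop out of Google search)
    User-agent: Googlebot
    Disallow: /

    # Block Bing's crawler as well, if you want to opt out there too
    User-agent: Bingbot
    Disallow: /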
 

Marandahir

Crown-Forester (he/him)
could render new art "obsolete" by big business interests who utilize AI to regurgitate other people's works.

Although a business like Disney would benefit from a landscape that said you can only train AI on art that you already own. Disney owns an extraordinary amount of content and is able to train very good AI just on that. Sony used AI (trained on their own content, apparently) to animate parts of "Across the Spider-Verse" (source).

These enormous companies, with massive content libraries, would benefit from a ruling that prevented the creation of models trained on publicly available content.

In the same way, Google's updated terms of service allow it to use any data it collects for search to train its AI models (source). If you don't want Google to use your website data, it's very simple - exclude your website from Google search (and disappear off the internet). I assume Bing has similar terms of service. A legal ruling that restricted everyone else from using that data to train their text AI models (LLMs) would greatly benefit Google and Bing.
And here's the catch I hadn't heard. Yup, so this was all good PR while Hasbro is probably working on making it legal but less visible for future projects. UGH. Thanks for the sources and info.
 

Clint_L

Hero
Luckily, the legal space here seems to be very congruent with the idea that you do not own works you create with "AI"-enhancements, and thus there's less of an incentive to publish such knowingly.
That is not what current US law says anywhere. There are huge grey areas, but what people seem to be misconstruing is a 2021 ruling that work generated by AI without any significant human input cannot be copyrighted under U.S. law. That leaves a vast amount of grey area, and the USCO has further specified, this year, that work created with AI absolutely may be copyrighted if there is "sufficient" human creativity.

In other words, if you just type in the prompt "do an image of a magic-infused kaiju-T-rex with multiple limbs," the result almost certainly is not copyrightable. In the US. If you actually create such a painting and then ask an AI to enhance it, it is almost certainly copyrightable. In the US (the US: not the centre of the universe. Other countries have their own laws). Again, creatives have been using various types of AI-assisted enhancement on all kinds of art for decades now, without issue.

So you are badly misunderstanding the current state of US law on AI enhancement. These laws are still being created and there is plenty of litigation ongoing; we have probably barely scratched the surface. But there absolutely is not some giant legal prohibition against using any form of AI in your creative work in the US, or any rule mandating that you lose copyright protection the instant you use AI. WotC likely didn't drop this artist because of legal worries; they dropped him because of PR.

Note: I am not a lawyer, but I checked with my bestie, who is a lawyer with decades of experience, though he hasn't practiced in the US (New York State) since the early 2000s; he now practices in Canada. I wish someone like Snarff would weigh in. My buddy also pointed out that his law firm has been using AI-generated text on many legal documents for some years now, saving quite a bit of money by doing so.
 

Getty Images has created their own AI generator trained exclusively on their own vast library of images. They guarantee anyone using the generated images is protected from copyright claims - presumably because when you upload content to Getty, you sign over vast amounts of your rights.

They are promising to share revenue from AI sales with all creators, but given the huge number of creators it will be fractions of a penny.


As an aside, it's worth noting that Getty has sued Stability AI for using their database of images without consent!
 

Stormonu

Legend
Bah. To me the whole thing is a tempest in a teapot.

I did find this when looking through the Dragonlance adventure. Clearly, some of the objects in this are CGI, and that has me wondering if AI may have had a hand in sharpening or generating some of the detail. If AI was used in this, then nobody cared at the time to point it out.

[Attached image: illustration from the Dragonlance adventure]
 



Parmandur

Book-Friend
Bah. To me the whole thing is a tempest in a teapot.

I did find this when looking through the Dragonlance adventure. Clearly, some of the objects in this are CGI, and that has me wondering if AI may have had a hand in sharpening or generating some of the detail. If AI was used in this, then nobody cared at the time to point it out.

[Attached image: illustration from the Dragonlance adventure]
Digital painting ≠ "CGI" or "AI"
 

