RPG Evolution - The AI DM: The Trouble with Art

AI's recent surge in popularity has generated art that sometimes looks a lot like someone else's work. How can gamers use it ethically?


Picture courtesy of Pixabay.

The Problem​

Because what we term "AI" consists of Large Language Models (LLMs), the "intelligence" part of "Artificial Intelligence" is actually us. LLMs generate their content from data sets, much of which is publicly sourced from what's freely accessible on the Internet. And that's where AI art gets into trouble.

AI-generated art blends elements of its data set into something recognizably similar to the user-entered parameters but (according to AI developers) uniquely different. The problem is that the output is often TOO similar: so similar that it looks just like a specific artist's work, right down to faked signatures.

Which raises a legitimate concern: if AI art can effectively mimic an artist's style for free, will anyone still pay the artist?

How Did We Get Here?​

Part of the problem is that artists advertise their work by sharing it for free on the Internet. In the physical world, an artist might hang art at a booth; potential customers walk away with only a memory of it, not a copy.

But on the Internet, everything is copied for future reference. Google's image searches can dig deep to find pictures independent of their creators' sites. That said, Google doesn't store copies (a fact that was critical in a court decision). Pinterest, however, does.

Pinterest doesn't just store a thumbnail graphic; it stores a full-sized copy. By merely pinning a graphic, users unwittingly give Pinterest advertising revenue and potentially violate copyrights. Examples abound, but the most common is the "phantom pin," in which the pin no longer links back to the original site, essentially keeping a photo on the Internet long after the artist has revoked permission.

Unfortunately, court cases have not swung in favor of artists, ruling that it's the people pinning the content, not the site, who are the problem. This is all coming to a head because some art LLMs use Pinterest as a dataset, thereby creating content inspired by artists who never consented to their art being used in the first place.

What to Do About It​

The biggest problem with AI art is the kind generated from scratch, the type that draws on sources like Pinterest for its images. Fantasy art in particular is dominated by Magic: The Gathering, and it's not uncommon to ask an AI for a monster only to be served up what looks like card art.

Similarly, it's nearly impossible to give a creature spider-like characteristics without getting Spider-Man's red-and-black web pattern and large white eyes. Spider-Man is so popular as an art subject that he has effectively replaced real spiders on the Internet, warping AI's perception of what "spider-like" means.

The obvious answer for game developers is to not use AI-generated art. Paizo won't. Wizards of the Coast won't. Most other major RPG publishers won't. This is important, because these statements aren't just a commitment to artistic ethics: they mean these companies will continue paying artists for their art.

But there are other ways that art can be ethically sourced. One is to use AI to modify art so it looks like a different style. I'm particularly fond of taking art I've created (and own) and asking an AI to make it look more realistic. You can also apply these kinds of AI filters to art that was intentionally released into the public domain or under clear licenses. Used this way, AI can turn clipart into three-dimensional monsters and characters, or turn a standard creature into something more exotic (a bull can become a metal gorgon, a bird can become a phoenix, a human bard can become an undead bard).
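For the curious, here's a minimal sketch of what that kind of image-to-image restyling looks like in practice, using the open-source diffusers library. The model ID, file names, prompt, and strength value are placeholders, not a recommendation of any particular tool, and you'd want to confirm the license of whatever model you actually run.

```python
# Minimal img2img restyling sketch using Hugging Face diffusers (assumed installed).
# "my_clipart.png" and the prompt are placeholders; use only art you own or that
# carries a clear license.
import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example model id; check its license first
    torch_dtype=torch.float16,
).to("cuda")

init_image = Image.open("my_clipart.png").convert("RGB").resize((512, 512))

result = pipe(
    prompt="a realistic oil painting of a metal gorgon bull, fantasy art",
    image=init_image,
    strength=0.6,        # how far to depart from the original (0 = none, 1 = fully)
    guidance_scale=7.5,  # how strongly to follow the prompt
)
result.images[0].save("restyled_gorgon.png")
```

The key point is that the starting image is something you already own or that is clearly licensed; the AI is acting as a filter on your own work rather than generating from scratch.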

For game masters who are using art for their home games, AI art can act as a tool to illustrate what's happening in a game: character portraits, maps, landscapes, monsters, and magic items.

For artists, offering free content to potential customers now comes with significant risk. It's always been possible for users to simply steal art, but thanks to AI it can now be stolen at scale without being traced back to the original owners. AI isn't currently required to show its homework, and until it is, there's a legitimate argument that posting anything for free is no longer worth the risk. A login or paywall may be increasingly necessary for artists to balance advertising their services with protecting their work.

Unfortunately for many artists, it may already be too late. Even if you take your art down today, Pinterest is saving it without your consent, and LLMs are using that data to build their art without proving where it came from. For publishers, declaring when and where AI art is used (or not used) is an important first step.

But the group most influential in the future of AI art is us. Perhaps the best we can do is ask for AI art to be labeled and then make our own decisions about whether or not to purchase it.
 

Michael Tresca


Blue

Ravenous Bugblatter Beast of Traal
Just going to put out some contrary ideas here.

First, picture a blind human who is trying to make realistic renderings. We wouldn't expect anything photorealistic. Why? Because they haven't learned what things look like. People spend their entire lives observing what things look like. When my youngest was young and watched Steven Universe, they picked up a lot of that style in their art. It has mostly faded over the years, but even today, in their animation work, you can still see traces of that simplification-for-animation coming through - learned from observing what other people have done.

AI art tools are doing the same thing - they are taking in lots of different art out there and learning how things go together. Just like humans do. And yes, they can be used for "bad" things, like copying a specific artist's style, just as human artists can. We call that forgery, or at a lesser level derivative work - which can be protected against legally.

But the majority isn't derivative work, any more than seeing a picture of Times Square in Manhattan means a human artist is trying to duplicate it when they create a cityscape. People seem to focus on the fact that the models were trained on images without realizing that humans do the exact same thing.

My eldest child's career is art-focused, and my youngest is heading that way. AI art generators are tools. Where they are now, there's a large amount of human effort needed - prompting, selecting, inpainting, combining (since AI art is notoriously bad at keeping triggers separate from each other when you have multiple focuses). Pandora's Box has been opened; artists need to learn to use these tools. Spellcheck reduced one need for editors but definitely didn't get rid of them. Computer-aided "tweening" that fills in the frames between keyframes in animation didn't get rid of animators. My own IT field has seen immense amounts of automation over my decades of work - and mostly what that has done is freed me to work on the more interesting parts of what I do. The future will have these and other tools, and like other fields, art needs to move along with them.
 

Zaukrie

New Publisher
My theory is that I can use it in a home game... but that I need to not use it for my products. I do have two products with about five total pieces of AI art from before I thought about the issue.

Unless Europe makes it illegal, though, AI art will be unstoppable (and even then, it might be). "Progress" is very hard to reverse.
 




Hand of Evil

Hero
Epic
This is why image tagging (it may go by different names) is and will become more important than ever. It not only will help stop fakes but will provide a means of protection. Adobe Photoshop and others are putting an "information" tagging system in place. You should be seeing more of it in the near future.
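To make that concrete, here's a bare-bones sketch of the simplest form of the idea: embedding authorship and provenance metadata directly in an image file with Pillow. Adobe's actual system (Content Credentials, as I understand it) layers cryptographic signing on top of this; the field names and file names below are just illustrative placeholders.

```python
# A minimal illustration of provenance tagging: writing authorship metadata
# into a PNG's text chunks with Pillow. Real systems add signed, tamper-evident
# credentials; this only shows the basic idea.
from PIL import Image
from PIL.PngImagePlugin import PngInfo

img = Image.open("my_art.png")  # placeholder file name

meta = PngInfo()
meta.add_text("Author", "Jane Artist")                          # placeholder values
meta.add_text("Copyright", "(c) Jane Artist, all rights reserved")
meta.add_text("Provenance", "Human-created; not AI-generated")

img.save("my_art_tagged.png", pnginfo=meta)

# Reading the tags back:
tagged = Image.open("my_art_tagged.png")
print(tagged.text)  # {'Author': 'Jane Artist', ...}
```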
 

OTHG

Explorer
Hand of Evil said:
This is why image tagging (it may go by different names) is and will become more important than ever. It not only will help stop fakes but will provide a means of protection. Adobe Photoshop and others are putting an "information" tagging system in place. You should be seeing more of it in the near future.
I hear you but I wonder if the horse isn't already out of the barn and currently galloping down the interstate.
 

Today AI can create awesome portraits, but if we are talking about two or more characters interacting, then you need the right "prompts" and pregenerated models.

With AI you can create photorealistic NSFW content, even the type of content that would be illegal with real people. With AI you could take the picture of a famous Hollywood child star and create an image of them wearing a bikini on a sunny beach.

With AI you could create mash-up portraits mixing two different franchises, for example Star Wars characters done in a Warhammer 40,000 style.

Another point is that an AI drinking from these sources could accidentally break some copyright, for example by creating the image of a character with the face of a real model or actress.

You can use AI art to create pictures for your games, because that is allowed when it is nonprofit. But you can't create pictures of certain fictional characters if nobody has built a model for them first. You can't create pictures of Lord Soth and other Dragonlance characters if nobody has trained the models beforehand.

And AI art can't be protected by copyright. This could be really important.
 
