WotC Would you buy WotC products produced or enhanced with AI?

Would you buy a WotC product with content made by AI?

  • Yes

    Votes: 45 13.8%
  • Yes, but only using ethically gathered data (like their own archives of art and writing)

    Votes: 12 3.7%
  • Yes, but only with AI generated art

    Votes: 1 0.3%
  • Yes, but only with AI generated writing

    Votes: 0 0.0%
  • Yes, but only if- (please share your personal clause)

    Votes: 14 4.3%
  • Yes, but only if it were significantly cheaper

    Votes: 6 1.8%
  • No, never

    Votes: 150 46.2%
  • Probably not

    Votes: 54 16.6%
  • I do not buy WotC products regardless

    Votes: 43 13.2%

Status
Not open for further replies.
Generative AI doesn't copy. If I ask for an orc wielding an axe, it doesn't find an image with those tags and give it to me. Instead it takes tiny bits from every image it knows of that matches those tags and makes something similar to all of them. Just like if I were to draw an axe, I would find a bunch that interest me and draw something like one of those.

I'm not opposed to AI tools paying for training material, but then again, I would think that if such were required of AI, it would also be required of art teachers who use other artists' work.
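As an aside, the "learns from many examples, then generates something new" picture being argued over here can be shown with a deliberately tiny sketch. This is my own toy illustration, not how any real diffusion model or LLM is implemented: the "model" below stores only summary statistics of its training set and then samples fresh values from them, so its output is generated from learned statistics rather than retrieved.

```python
import random
import statistics

# Pretend each number is an "image" in the training set.
training_data = [4.0, 5.0, 6.0, 5.5, 4.5]

# "Training": learn parameters of the data's distribution.
mu = statistics.mean(training_data)      # 5.0
sigma = statistics.stdev(training_data)  # ~0.79

# "Generation": sample something new from the learned distribution.
random.seed(42)
sample = random.gauss(mu, sigma)

# The sample resembles the training data but is not any stored item.
print(sample in training_data)  # False
```

Whether sampling from statistics learned without permission counts as "copying" is exactly the disagreement in this thread; the sketch only shows the mechanism, not the ethics.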
If you're suggesting "AI" can create something, I disagree, as I think anyone with a cursory understanding of what an LLM is would agree.

Given they can only regurgitate what they have been shown, and often verbatim, it is clearly a different circumstance than a human taking inspiration from another creator.

"AI" just traces art from "enough" people to obscure that it is just a collage, not a unique creation inspired by others with its own creativity applied.

As far as textual responses from "AI" go, there are plenty of blatant examples of word-for-word plagiarism.

As I have said, LLMs have uses; I just can't fathom how they can be legally or ethically used to create commercial products, nor can I understand why anyone would pay WotC for something they could just have an LLM make for them directly 🤔
 



I'm not a physicist, but Newton's Laws are not lies. They have been accepted consensus science for so long because his theories fit the data as we understand it. If physicists have improved our understandings in recent decades, disproving Newton's Laws
Not even in recent decades. Einstein overturned Newton in 1915.
 

What humans have considered theft has changed over time. At one point in time, taking something by force of arms was acceptable, and in some rare cases still is. And there is no one consistent view. Look at parts of the open source movement.
Quoth wikipedia:

Programmers who support the open-source-movement philosophy contribute to the open-source community by voluntarily writing and exchanging programming code for software development.
"Voluntarily writing," as in, consenting to have their work freely distributed.

It is literally all about the consent.

And if you look at the core of your argument (as I understand it), it is that you cannot ethically use another person's creations to train or learn from without it being stealing. That would make almost every creator (artist, writer, code developer, engineer, etc.) a thief, because they all learn from those who came before them and build upon it.
:rolleyes: This argument again. Sigh.

There is a big difference between "drawing inspiration from" something and "actively taking something without permission."

But you are not going to like that. Let's see if we can delve into that more deeply below.

So, if I could solve world hunger, but I would have to steal the solution from someone who couldn't use it to solve world hunger it wouldn't matter. Interesting.
There is also a big difference between "solving world hunger" and "producing art or writing for a game." I mean, they're so different that it's ridiculous to even compare the two.

But we have been creating things to replace human labour for... thousands of years (at least). Why is this one technology different from all the ones that have come before it?
For one thing, we create things to replace human labor, in large part, so that people would have more time to engage in creative endeavors.

Also, I'm pretty sure that most farmers are OK with tractors replacing their labor, and most families are OK with laundry machines replacing their labor, and so on. How many actual artists and writers are OK with AI replacing them?

And third, machines that are made to replace labor are not trained how to do it by stealing from people. Just like calculators weren't programmed by stealing other people's math homework.

Nor did they consent to let other artists or writers learn from their works. Yet we don't say those other humans are unethical for learning from those who have come before them.
Few artists have actually said that others can't learn from them or copy them. I'm sure some have, but most haven't. In fact, I'd wager that most artists want people to learn from them. And those that don't are typically very careful about putting copyrights on their work, suing those who clearly took their influence, or otherwise making sure people know what happened was Not OK.

And again, drawing inspiration from someone is not the same thing as stealing. If someone actually copies another person's work and tries to claim it as their own, we call that plagiarism and copyright infringement, which are Not OK.

But when you use AI to create a picture or a written document, you are, in fact, either taking credit for using material stolen from other people's work, or--if you say something like "I asked ChatGPT to make me a thing"--saying that you used a program that steals from other people's work.

It can and does both. Sure, someone can use it to create their final work. Or they can use it as a tool and part of the process. An artist can use it to create a draft which they then refine, or use it to refine a draft they created by adding textures, fills, etc. Writers can use it to generate draft compositions of their ideas which they then manually refine. They can use it to check the spelling, grammar, tense, and tone of something they have drafted.
It's funny how you use "we have been creating things to replace human labour for... thousands of years" as an argument, but don't also realize that people have been proofreading and designing their own work for thousands of years.

And in this hypothetical, how much of the final product is going to be original and how much will be AI? We already know that normal spellchecks aren't great--why would you think an AI spellcheck would be better? And if you use an AI to generate a draft composition, how does this help you improve as a writer?

Also, you need to learn the difference between AI and digital art. If you're an artist who does digital art, you still need to learn how to properly use textures and fills to make your art actually look good.

That's another bad thing about AI: you're using it as an alternative to actually learning and improving. AI doesn't teach you any skills. It doesn't make you a better or more creative artist or writer. Which means that if you start out mediocre, you're going to continue to be mediocre, because you will never learn to be good by using it. If you start out crappy, you'll continue to be crappy. You'll never even learn to be as good as mediocre.

Is that what you want the future of gaming to be?

It can create, and it can assist.
Unless it's actually sapient, it can't create.

Hmm, I don't agree that everyone has the latent talent your assumption requires. And even if they do, do they have the other resources necessary to obtain such skill? And then, even if they can and do, is it equivalent to say: "You can make your own image in ten hours, after you have spent (30 min × 365 days × 2 years ÷ 60 =) 365 hours learning how to do so, and that is in all ways better than you spending ten minutes to create an image with procedural generation."
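For what it's worth, the practice-time figure in that hypothetical checks out, assuming 30 minutes of practice per day, every day, for two years:

```python
# 30 minutes a day for two years, converted to hours.
minutes_per_day = 30
days = 365 * 2          # two years, ignoring leap days
total_minutes = minutes_per_day * days
total_hours = total_minutes / 60
print(total_hours)  # 365.0
```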
You want to learn how to draw? You need: something to draw with and something to draw on.

Again, the problem y'all are having is you think there's only one type of art and only one way to achieve it.

This particular piece of art was made in MS Paint.
[attached image]


This one was done with Crayola crayons
[attached image]


There's tons of art made with nothing more than ballpoint pens.
[attached image]


So you don't need fancy materials.

All you need is time. There are probably thousands, even tens of thousands, of tutorials for beginners on YouTube. There are scores of books on how to draw at your local library.

The best way to learn how to write is to read a lot.

And again, it doesn't have to be perfect because there's no such thing as perfect and there's no one art style. These are all dragons.

[18 attached dragon images]


Let me repeat: There are many different ways to draw. There is no one perfect way to draw.

(One of those pieces is mine. Hint: It's the crappy one.)

Sure, maybe the one you created yourself is worth thousands more than the AI one, but that's your value system, not everyone's.

So though you probably won't agree, I've addressed the first part of this statement so let's look at the second. Is it ok for a company to use AI in their publications?

Well, to me, it depends. Things that would influence that answer are: What AI did they use? How was it trained? Did they acknowledge the use of AI tools? What are the alternatives?
Most of the time, the alternative is paying someone to create art.

To me, a company using AI would be about as tacky as if they used nothing but a box of cheap clip art, like the type they used to sell in boxes of CD-ROMs at Staples. Cheap junk that isn't worth whatever the company is trying to charge for it.
 

Generative AI doesn't copy. If I ask for an orc wielding an axe, it doesn't find an image with those tags and give it to me. Instead it takes tiny bits from every image it knows of that matches those tags and makes something similar to all of them. Just like if I were to draw an axe, I would find a bunch that interest me and draw something like one of those.

I'm not opposed to AI tools paying for training material, but then again, I would think that if such were required of AI, it would also be required of art teachers who use other artists' work.
So... it copies.
 

"Here's a thing that is useful. But to use it, you have to steal other people's artwork and writing. Also, it tends to produce stuff that is sub-par and often has big mistakes in it."

I feel like the usefulness of the product is far outweighed by the other two parts of it. And, sure, one of these days AI may not produce stuff that is sub-par and has big mistakes, so you're still left with that question: is it OK to use a device trained on other people's creative endeavors in order to avoid paying people for their time and effort, especially when it's for a company to use in their publications?

To me, it doesn't matter if it's useful.
To me, it matters. If something is useless and harmful, we should just discard it, while if something is useful but harmful we should try and see if it can also be made safe.

And again, those are two incredibly different things because calculators are not trained on stealing.
And again, that matters when discussing whether AI is beneficial, not whether it is useful, meaning it provides a functionality that was not previously available, or available in a different way.

But I feel that at this point we have made each other positions clear and are just going in circles.
 

I already know my art talent ranks right up there with that of a half-cut brick. Add to that that I'm fussy when it comes to art (if I ever had someone do art on commission they'd be quite justified in strangling me after I sent it back for the 17th time for more fine-tuning and revision), and AI becomes the answer. It doesn't care if I try 1000 times to fine-tune it or revise it or whatever, the robot just keeps on keeping on and the "scattergun effect" means sooner or later it'll get it right.
That’s actually not necessarily true. GenAI cannot produce results that don’t exist in its training data. For example, I invite you to try to get an AI image generator to give you a picture of a full wine glass. You can literally try a billion times with a billion different prompts, and you might get some interesting results, but I guarantee you none of them will be a full wine glass.
 

No, even without talent you will improve, further than someone with talent and no training, just not as far as someone with talent and equal training.

Do something for 10000 hours and you will be good at it, talent or not
Talent is a myth. What looks like talent is mostly the result of hard work and practice. Some people take to some endeavors faster or slower than others, but nobody is inherently good at anything.
 

To me, it matters. If something is useless and harmful, we should just discard it, while if something is useful but harmful we should try and see if it can also be made safe.


And again, that matters when discussing whether AI is beneficial, not whether it is useful, meaning it provides a functionality that was not previously available, or available in a different way.

But I feel that at this point we have made each other positions clear and are just going in circles.
For me, the problem is that at its inception the programmers (creators) chose to steal content rather than negotiate licenses and credit the other creators, without whom there would be no content to train the LLMs on, only to then license and sell their own work built on that theft. That makes these both hypocritical and ill-gotten gains, which should be subject to confiscation, released as open source, and placed in the creative commons.
 

Well, 30-60 minutes per day is a major commitment for a hobby. This is coming at the expense of gaming or reading or music or cooking or running or dance or what have you. And most people with busy lives are only going to be able to pick one or two or maybe three, if they're lucky, to commit to in that way.
Yep, this is why skilled labor usually costs money. Art is a skill, and the production of art objects is labor.
 
