Would you buy WotC products produced or enhanced with AI?

Would you buy WotC products with content made by AI?

  • Yes

    Votes: 45 13.8%
  • Yes, but only using ethically gathered data (like their own archives of art and writing)

    Votes: 12 3.7%
  • Yes, but only with AI generated art

    Votes: 1 0.3%
  • Yes, but only with AI generated writing

    Votes: 0 0.0%
  • Yes, but only if- (please share your personal clause)

    Votes: 14 4.3%
  • Yes, but only if it were significantly cheaper

    Votes: 6 1.8%
  • No, never

    Votes: 150 46.2%
  • Probably not

    Votes: 54 16.6%
  • I do not buy WotC products regardless

    Votes: 43 13.2%

Status
Not open for further replies.
Well, all those other topics could be considered tu quoque - aka "whataboutism".

The idea that, if you handle one ethical consideration in a particular way, you must treat all similar considerations in the same way is actually a logical fallacy. Each ethical consideration can, and should, be made on its own merits, not on the merits of other ethical issues.
But the idea that awareness of a wrong makes one culpable is a singular philosophical idea. You can talk about degrees of harm and perhaps the extent of culpability based on whether the commodity or service is a need versus a desire, but those are questions of scale, not kind.
 


"They Lost" is nowhere to be seen in my previous posts. Not sure where you are getting that from...

On the contrary, they already won in many ways. OpenAI just got a government contract for 500 billion, if I remember correctly. They obviously have the upper hand compared to all the artists, writers, voice actors, animators, designers, and other creatives who typically can't afford to hire good legal counsel. OpenAI can weather any legal storm just by sitting back and waiting, while those harmed by its deliberate actions are suffering right now.

Not to mention that nobody seems to care about the environmental impact.

However, their position of privilege does not justify any of their actions. This "move fast and break things" motto is leading us down a path of no return.
Not from you. But other posts in the thread that said LLMs can't be creative, can't be imaginative, and therefore are not going to be comparable to a human in output. I know some of the people taking that line said it would be close enough that people would be happy buying from the LLMs.
 

The majority of writers and artists are already not living off their works. "Don't quit your day job" might be used as an insult unfortunately often, but for most creatives that's just being realistic. That's not a knock against their skill; a lot of my favorite works were made by people who did them as a side project. It's what I'm going to have to do too: if my stuff ever gets published, I know I have barely any chance of it getting well known enough to live off it.

Want to help the mass of artists? Protect the day jobs. The people who could sell anything on name recognition alone will be just fine.

Putting my cards on the table here: until those changes in the job market are made, I’ll just have to stick with the industry that asked “can you do the job?” rather than “where do you see yourself in 5 years?” or other inanity.

Edit: One of my aside comments was made more out of bitterness than anything, took it out.
 
Last edited:

Not from you. But other posts in the thread that said LLMs can't be creative, can't be imaginative, and therefore are not going to be comparable to a human in output. I know some of the people taking that line said it would be close enough that people would be happy buying from the LLMs.
Thank you for pointing that out, but do you care to respond to my last post points?
 

Thank you for pointing that out, but do you care to respond to my last post points?
I don't think you will like my answer. I understand that they are resource intensive technologies. I think that is acceptable given the benefits of them. Compare something like crypto, which has far fewer positive impacts.

Regarding the ethics of web scraping for training, I don't find it ethically suspect. Perhaps this is a consequence of my background in the sciences, where greater accessibility is a good thing. I would love for an LLM to scrape everything I ever produced and use it as training data. That includes my (limited) creative output as well: anything I write for RPGs, any resources I prepare, any character builds, mechanics. Feed it all in, as far as I'm concerned.
 

Not from you. But other posts in the thread that said LLMs can't be creative, can't be imaginative, and therefore are not going to be comparable to a human in output. I know some of the people taking that line said it would be close enough that people would be happy buying from the LLMs.
An LLM cannot be creative, but other programs that are at least closer to AI could, and that might utilize LLMs as a component. LLMs are a very specific thing.

That some people can't tell the difference is a whole different issue.
 

But the idea that awareness of a wrong makes one culpable is a singular philosophical idea.

More like it is a broad philosophical generalization. Like generalizations in general ( :p ), it should be examined in each particular case separately.

You can talk about degrees of harm and perhaps the extent of culpability based on whether the commodity or service is a need versus a desire, but those are questions of scale, not kind.

Well, you see, that assumes the conclusion. You have to demonstrate that the only questions are ones of scale, and also demonstrate that "questions of scale" are not themselves reasons to make different judgements.

Misapplication of an ethical standard is apt to cause harm. Thus, you cannot ethically assume that an ethical approach formulated for one particular case applies to all similar cases, for where the similarity fails, harm may lie!

Ergo, as much as is practical, one should examine particular cases, not jump to an overall standard. The standard is, at best, a place to start a consideration.
 

Now why should I buy this person's AI product when I can use AI to do the exact same thing?
You shouldn't. I think the example provided was of someone gaining exposure by using AI to tidy up and "professionalize" some otherwise-very-creative scratch notes with the intent of simply putting it out there for people to use, no money involved.

In other words, it's in effect being released directly into the public domain.

I've got numerous adventure modules of my own with which I'd like to do exactly this: tidy them up, then release 'em straight to the public domain. I just need the OGL to be made expressly applicable to pre-3e editions, which hasn't happened yet despite the OSR workarounds. (I think I also need a few laws to change, as I'm not sure one can legally release a work directly to the public domain, at least as far as the USA is concerned.)
The art and actual text would all be fake AI slop.
Just because it's AI-made doesn't make it slop. Some AI art is IMO very good.
 

Sure, but user error isn't a compelling argument against a tool. At best it is a requirement for skilled users.
If I already know what the result is, why do I ask an AI? And if I have to double-check everything it says, does it really save me any time?

I get it: for D&D, some hallucinations rarely matter. In some other fields, however, they could endanger lives.
 

Not from you. But other posts in the thread that said LLMs can't be creative, can't be imaginative, and therefore are not going to be comparable to a human in output. I know some of the people taking that line said it would be close enough that people would be happy buying from the LLMs.
It can be both: LLMs are absolutely not creative and certainly not imaginative. But lots of people won't care, because their appreciation of both creativity and imagination is shallow, casual, and easily sated.
 
