WotC Would you buy WotC products produced or enhanced with AI?

Would you buy a WotC product with content made by AI?

  • Yes

    Votes: 45 13.8%
  • Yes, but only using ethically gathered data (like their own archives of art and writing)

    Votes: 12 3.7%
  • Yes, but only with AI generated art

    Votes: 1 0.3%
  • Yes, but only with AI generated writing

    Votes: 0 0.0%
  • Yes, but only if- (please share your personal clause)

    Votes: 14 4.3%
  • Yes, but only if it were significantly cheaper

    Votes: 6 1.8%
  • No, never

    Votes: 150 46.2%
  • Probably not

    Votes: 54 16.6%
  • I do not buy WotC products regardless

    Votes: 43 13.2%

Trying to get back on topic, I don't think that comparing (as Umbran stated) scientific fields (which are peer-reviewed and typically open-source) to the publishing industry is going to do anyone any favors; it's probably best to keep them as separate discussions.

I agree. The point of my raising the other forms of use is that people... tend to get myopic when not made aware of the broader scope.

I think it is still fair to not want to buy game products that made heavy use of generative AI, for example. I just think the impact on visual art and entertainment writing is not a basis to condemn the technology entirely. It is a tool. There are good and bad uses for it, like with all tools.
 


Using AI to add details to those notes, and then making them pretty enough to sell, is stealing, though. So nope.
So here is the problem I see with this perspective: the same training data was used to do both of those things (collate and collect, and fill in details and lay out). Therefore, if one is stealing, so is the other. If the question is whether using AI is unethical because of the way the models were trained, then ANY use must be unethical for one to be ethically consistent.

The problem with using AI generated content to make a game you (or WotC) sells is not the ethics of the training -- after all, neither you nor I nor WotC trained the AI -- it is the ethics of not paying artists and writers to make art. It is injecting slop into your product and then making people pay for it.
 

The problem with using AI generated content to make a game you (or WotC) sells is not the ethics of the training -- after all, neither you nor I nor WotC trained the AI --

If one were ignorant of how that training was done, perhaps then one could be given a pass. But, once one knows, willing use of the technology becomes complicity in that process.
 

I get what you are trying to say, I do.

I veered off of topic by mistake (and was reminded by Umbran), so I will try to get back on topic.

Trying to get back on topic, I don't think that comparing (as Umbran stated) scientific fields (which are peer-reviewed and typically open-source) to the publishing industry is going to do anyone any favors; it's probably best to keep them as separate discussions.

As for "this kind of attitude," well techbros have done this to themselves by trying to cram the word "ai" into everything they can for profit. You can't really have a discussion about generative ai without somebody straying off course (it was me this time!) into other fields, because they have used the term so broadly, that pinpointing what it is actually being applied to is becoming harder by the day.
For my part, I agree with your statement that there have not been that many practical applications when looking solely at text or image generation. That said, I think we are still in the exploratory phase of those applications, and the improvements are rapid. I'm not sure if they'll ever reach the point of genuine creativity. But I think it is premature to declare that they've lost.

(Or, apparent creativity. I think most people would agree humans can think creatively in Chess or Go. Machines are now better at these and make moves that appear creative, and would be interpreted as creative if a human made them. Are they actually? Probably not, and certainly not in the same way).

Is 'this RPG book would be very creative if a human wrote it' a plausible expectation?
 

My point is that "generative AI", as a technology, is not limited to English text and fantasy art with too many fingers! It can be applied to any form of data!

I, personally, as a physics graduate student, did research on using early forms of the technology (before the term "generative AI" even existed) on high energy physics data, to help configure particle detectors and their software.

Our website is dedicated to only one tiny corner of the possible uses of the technology. Condemning the tech in general based on our corner is... logically flawed.
100% agree. I am not condemning any technologies; what I am opposed to is the unethical use of that technology in ways that can cause harm to people (whether intentionally by bad actors, or indirectly by "hallucinating" false facts, such as the gen-AI mushroom book that could potentially be hazardous to your health).

I have also previously stated that I am in favor of improved medical diagnostic tech, but I think we have already covered that this is not related to the conversation about generative AI within the context of publishing TTRPGs.

At the moment, no regulations exist whatsoever. You can have gen-AI generate a mushroom book for you, complete with AI-generated images of mushrooms, publish it on a major platform, and face zero consequences when someone ends up getting poisoned as a result of trusting the information in that AI-generated book.

You can generate a book based on the work of another author, use their name in the publication, then publish it on a major platform, and it looks just like the original author's work (at the surface level). This happens all the time now.
 

Within the RPG context, there is a huge amount of purely human work already, so it's not like there's a gap to fill anyway. WotC already produces a lot fewer rulebooks and adventures than other major companies, so it's not as if they're running low on human talent to keep up the pace.
 

If one were ignorant of how that training was done, perhaps then one could be given a pass. But, once one knows, willing use of the technology becomes complicity in that process.
That is an interesting debate, but probably not one fit for EN World unless we want to start talking about child labor, human trafficking and slavery, and impending environmental catastrophe.
 

Is 'this RPG book would be very creative if a human wrote it' a plausible expectation?
Like I said upthread, there will likely be a point in the relatively near future where most people will not be able to tell the difference, or at least not be willing to put in the effort required to find out. I think the question is moot at that point.
 

That is an interesting debate, but probably not one fit for EN World unless we want to start talking about child labor, human trafficking and slavery, and impending environmental catastrophe.

Well, all those other topics could be considered tu quoque, aka "whataboutism".

The idea that, if you handle one ethical consideration in a particular way, you must treat all similar considerations in the same way is actually a logical fallacy. Each ethical consideration can, and should, be made on its own merits, not on the merits of other ethical issues.
 

For my part, I agree with your statement that there have not been that many practical applications when looking solely at text or image generation. That said, I think we are still in the exploratory phase of those applications, and the improvements are rapid. I'm not sure if they'll ever reach the point of genuine creativity. But I think it is premature to declare that they've lost.
"They Lost" is nowhere to be seen in my previous posts. Not sure where you are getting that from...

On the contrary, they have already won in many ways. OpenAI just got a government contract for 500 billion, if I remember correctly. They obviously have the upper hand compared to all the artists, writers, voice actors, animators, designers, and other creatives who typically can't afford to hire good legal counsel. OpenAI can weather any legal storm just by sitting back and waiting, while those harmed by its deliberate actions are suffering right now, in the present.

Not to mention that nobody seems to care about the environmental impact.

However, their position of privilege does not justify any of their actions. This "move fast and break things" motto is leading us down a path of no return.
 
