WotC: Would you buy WotC products produced or enhanced with AI?

Would you buy a WotC product with content made by AI?

  • Yes: 45 votes (13.8%)
  • Yes, but only using ethically gathered data (like their own archives of art and writing): 12 votes (3.7%)
  • Yes, but only with AI generated art: 1 vote (0.3%)
  • Yes, but only with AI generated writing: 0 votes (0.0%)
  • Yes, but only if- (please share your personal clause): 14 votes (4.3%)
  • Yes, but only if it were significantly cheaper: 6 votes (1.8%)
  • No, never: 150 votes (46.2%)
  • Probably not: 54 votes (16.6%)
  • I do not buy WotC products regardless: 43 votes (13.2%)

Again: search engine. You can always complain to whoever hosts wikidot that they are hosting copyrighted material without permission. It may even get the site removed.
I don't see why you're linking me a "how Google works" article. I understand how search engines work. Perhaps you can state the point you're trying to make?

Is it that there is no problem with search engines facilitating piracy? Or no point in trying to stop them? Or are they unable to meaningfully address it?
We have given them this latitude because the only (existing) legal routes for denying them that latitude functionally make it so search engines cannot meaningfully exist.
I'm not asking for us to come up with a legal system here. I'm asking about the ethics. Here, let's make the argument precise:

1) Google directs people to pirated content very effectively

2) Google knows this and is able to modify search to do so less effectively

3) They choose not to, because people like pirated content and like that they can find it using Google. The legalities are grey enough that Google can get away with it. As a result, they can sell more ads.

If (1)-(3) are true, is Google acting ethically? Or do you take issue with any of the propositions?

I suspect we'll end up in a similar place with LLMs. Libgen is out, web crawls are ok, and the legalities will be grey enough they can get away with it. As you state for search, I'm not sure LLMs can exist in the same way without web based data.

If 5 years from now the legal system decides that is ok, do we then conclude the LLMs are ethical because they are legal? I don't think that tracks.
 

Working with the hypothetical of the poll: what if WotC's next product were made with AI? By this I mean there is AI art in it, the writing was done with AI, and the company is open about using it.

Would you stop buying their products if you haven’t done so already?
Even without the ethical angle, I wouldn't want that product. The content would have no value to me.
 

1) Google directs people to pirated content very effectively

2) Google knows this and is able to modify search to do so less effectively

3) They choose not to, because people like pirated content and like that they can find it using Google. The legalities are grey enough that Google can get away with it. As a result, they can sell more ads.

If (1)-(3) are true, is Google acting ethically? Or do you take issue with any of the propositions?

I'd say the answer depends on what one's own ethical principles are, and therefore I am not sure it can lead to a consensus.

Person A will say "in my ethical system, private property is theft, therefore there is no reason to call this content pirated. It's just content, and I don't see any ethical problem with it, regardless of whether it's legal or not."

Person B will say "in my ethical system, everything that increases the total global happiness is ethical, so Google allowing many people to access the pirated content creates a lot of small happiness. Sure, it causes a little less happiness for the author, but overall it's a net positive, so it's ethical".

Person C will say the exact same thing but conclude that the net result is negative, saying Google is unethical.

Person D will say "in my ethical system, one shouldn't profit from the labour of other people, so Google is unethical, even when it links to non-pirated content".

Person E will say "Google's owners do not follow the right God, and my ethical system makes my God the source of everything right, so Google is unethical by design".

I am not sure there will be a way to reconcile all those positions, contrary to the legal question "is Google legal where I live?", which can be answered by reading the law (if it's clear where you live) or paying a lot of money to a lawyer (if it's not). And the same will happen irrespective of the topic: I doubt a consensus can happen on the ethicality of democracy, monarchy, killing people, smoking, cars, the Jacquard loom, yelling fire in a theater, marriage and, apparently, ear-piercing, any more than we could on search engines.


If 5 years from now the legal system decides that is ok, do we then conclude the LLMs are ethical because they are legal? I don't think that tracks.

Probably not, but law isn't a matter of individual preference. Laws might help discern things that are categorically unethical, since those would be banned absolutely everywhere as soon as a society was powerful enough to enforce banning things it thought wrong, but I don't think there are a lot of examples of this.
 


I am not sure there will be a way to reconcile all those positions, contrary to the legal question "is Google legal where I live?", which can be answered by reading the law (if it's clear where you live) or paying a lot of money to a lawyer (if it's not).
Agree in general. I don't think the case for "LLMs are bad but Google is not" is all that strong, so I'm thinking it through.
 

Agree in general. I don't think the case for "LLMs are bad but Google is not" is all that strong, so I'm thinking it through.

I can find a way to rationalize it.

Let's assume you pick something that is enabled by both Google and LLMs, like "learning how to play D&D".
If someone lists that as an unethical activity, one could say that LLMs are doing the evil directly, by teaching D&D, while Google is only an accomplice to the evil done by the writers of D&D books.

So, if one has both "D&D is evil" and "Being an accomplice is OK, because all the responsibility rests on the original author" as basic truths of his ethical system, he could be OK with Google and not LLMs. Actually, he would also need a third clause: "Any single use of a tool makes the tool itself unethical".

Or one could just decide that "Nobody should use LLMs" is among the categorical imperatives. It passes the test of universality and humanity.
 

I don't see why you're linking me a "how Google works" article. I understand how search engines work. Perhaps you can state the point you're trying to make?
Because you didn't seem to quite get why the 5e wikidot was the first result. If you know how search engines work, then it shouldn't be much of a surprise.

Is it that there is no problem with search engines facilitating piracy? Or no point in trying to stop them? Or are they unable to meaningfully address it?
That's a bit like saying brick-and-mortar stores facilitate shoplifting.

I'm not asking for us to come up with a legal system here. I'm asking about the ethics. Here, let's make the argument precise:

1) Google directs people to pirated content very effectively

2) Google knows this and is able to modify search to do so less effectively

3) They choose not to, because people like pirated content and like that they can find it using Google. The legalities are grey enough that Google can get away with it. As a result, they can sell more ads.

If (1)-(3) are true, is Google acting ethically? Or do you take issue with any of the propositions?
1) Stores have plenty of blind spots where cameras and employees can't see effectively, thus enabling shoplifting.

2) Stores know this and are able to modify their layout to make shoplifting less effective.

3) They choose not to, because people like to shoplift and like that they can do so in the store.

OK, you have to realize that what you're saying makes no sense. In reality, it's this:

1) Google (also Bing; I checked) leads people to pirate sites because those sites know how to use the search algorithms to their advantage. Also, word of mouth led to those sites becoming popular, which causes them to be shown higher in search results.

2) It's not cost-effective for Google to hire people to go to every single website in existence to determine whether or not the site contains pirated material.

2a) Nor is it Google's job to do so, any more than it's the job of whoever put out the Yellow Pages to make sure that every company that put out an ad in them was completely on the up-and-up.

2b) Instead, it's the job of WotC's legal team to go around, find sites, and take them down.

2bi) WotC has apparently chosen to pick their battles in this case, probably because the 5e wikidot and similar sites aren't going out of their way to broadcast their ill-gotten gain.

This also has nothing to do with using AI instead of paying actual human writers and artists. Please stop with the whataboutisms.

I suspect we'll end up in a similar place with LLMs. Libgen is out, web crawls are ok, and the legalities will be grey enough they can get away with it. As you state for search, I'm not sure LLMs can exist in the same way without web based data.
You can entirely train them on material you own and/or that is in the public domain.

However, then you're still using AI instead of paying actual writers and artists.
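For illustration only, here is a minimal sketch of what restricting a training corpus to owned or public-domain material could look like. The document shape and the license values are hypothetical; this is not anyone's actual pipeline.

[CODE]
# Hypothetical sketch: keep only documents we are allowed to train on.
# The "license" values and document structure are made up for illustration.
ALLOWED_LICENSES = {"owned", "public_domain", "cc0"}

def filter_training_corpus(documents):
    """Yield only documents whose license is in the allowed set."""
    for doc in documents:
        if doc.get("license") in ALLOWED_LICENSES:
            yield doc

corpus = [
    {"id": 1, "text": "Archived in-house art description", "license": "owned"},
    {"id": 2, "text": "Scraped fan wiki page", "license": "unknown"},
    {"id": 3, "text": "Novel first published in 1910", "license": "public_domain"},
]

training_set = list(filter_training_corpus(corpus))  # keeps ids 1 and 3
[/CODE]

The point being: the filter is technically trivial; the hard part is deciding (and proving) what counts as owned or public-domain in the first place.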

If 5 years from now the legal system decides that is ok, do we then conclude the LLMs are ethical because they are legal? I don't think that tracks.
Lots of things that are or were legal are unethical, and lots of things that are ethical, or at least not unethical, are or were illegal. There's no point conflating the two.
 

So, if one has both "D&D is evil" and "Being an accomplice is OK, because all the responsibility rests on the original author" as basic truths of his ethical system, he could be OK with Google and not LLMs. Actually, he would also need a third clause: "Any single use of a tool makes the tool itself unethical".

Or one could just decide that "Nobody should use LLMs" is among the categorical imperatives.
I agree that's the basic structure of the argument. The weak point, to me, seems to be "being an accomplice is ok", especially in the case that the accomplice is 1) aware of the crime, 2) aware their service helps people commit the crime, 3) could make their service less helpful for those purposes but chooses not to, and 4) instead actively profits off the crime.
 

Because you didn't seem to quite get why the 5e wikidot was the first result. If you know how search engines work, then it shouldn't be much of a surprise.
My point wasn't surprise.

My point was that the fact that these results are so highly ranked suggests Google could do something about them. We all know the sites that pirate; we see them all the time. Would it be that difficult for Google to implement a rule like: "if we find credible evidence of piracy on a site, we deindex that domain for 6 months"?

Google already has deindexing procedures in place. Why not apply them?
That's a bit like saying brick-and-mortar stores facilitate shoplifting.
This is a poor analogy because the store here owns the material that would be shoplifted. A better one would be a storefront that sold schematics of other stores' security protocols.

2) It's not cost-effective for Google to hire people to go to every single website in existence to determine whether or not the site contains pirated material.
Agree. I don't think that's the right way to implement this policy.

I do think it is easy for users to identify pirated material on sites that are widely visited, and straightforward for Google to verify or reject such reports. Again--they already have procedures to deindex sites. They don't apply them to sites with pirated material that are top results and widely used for years. A rough sketch of the kind of rule I mean is below.
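To make the proposed rule concrete, here is a toy sketch of a "verified report leads to a six-month deindex" policy. All names are hypothetical; this is an illustration of the rule as stated, not a claim about how any real search engine works.

[CODE]
# Hypothetical sketch of the proposed policy: a verified piracy report
# deindexes the whole domain for roughly six months. Illustrative only.
from datetime import datetime, timedelta

DEINDEX_PERIOD = timedelta(days=182)  # roughly six months
deindexed_until = {}  # domain -> datetime when it may be indexed again

def handle_piracy_report(domain, report_is_credible, now=None):
    """If a report is verified as credible, deindex the domain for six months."""
    now = now or datetime.utcnow()
    if report_is_credible:
        deindexed_until[domain] = now + DEINDEX_PERIOD

def is_indexable(domain, now=None):
    """A domain is indexable unless it is still inside its deindex window."""
    now = now or datetime.utcnow()
    return deindexed_until.get(domain, now) <= now

handle_piracy_report("example-pirate-site.example", report_is_credible=True)
print(is_indexable("example-pirate-site.example"))  # False for ~6 months
[/CODE]

The verification step (deciding a report is credible) is where the real cost lies, but for widely visited, long-standing pirate sites that check seems tractable.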
This also has nothing to do with using AI instead of paying actual human writers and artists. Please stop with the whataboutisms.
If you're not interested in this discussion feel free to leave. I think it's been interesting.
Lots of things that are or were legal are unethical, and lots of things that are ethical, or at least not unethical, are or were illegal. There's no point conflating the two.
Agree.
 
