The AI Red Scare is only harming artists and needs to stop.

Moral panics, unfortunately, happen. They are a common fallout of human cognition.

Saying that such panic shouldn't happen, or has to stop, is equivalent to saying, "We should be much more rational than we are." Which is a great notion, but it's hard to turn into a practical solution.
I think Friend Computer from Paranoia may actually have had the right idea. We may need to give mood stabilizers and tranquilizers to everybody.
 


To be blunt: without political intervention, your job will become as obsolete as shorthand stenographers and human calculators (people paid to work out calculations). The only way it won't is if the great filter is "high-tech civilizations blow themselves to smithereens before becoming multi-planetary." The question isn't "if" but "when."

Several direct parallels for high skill work replaced by tech...
  • Gregg Stenographer: replaced by the Dictaphone and stenotype stenographers.
  • Stenotype Stenographer: being replaced by taped and digital audio.
  • Dictaphone transcriptionist: slowly being replaced by automated transcription.
  • Human Computer (person paid to do mathematics by hand): started being replaced in the 1880s by tabulators; the role finally died out in the 1970s.
  • Punchcard/punchtape typist: displaced out of use by the 1990s as computer mass storage devices and file systems on disks (floppy and hard) improved the UX.
  • Telegrapher (be it Morse or other such encodings): replaced by phones, and later computers.
  • Film Projectionist: went from one per screen per shift, to one per shift, then to one or two per complex. Why? Changes in multi-reel movie projection: pushbutton switching, then platter systems. Currently, projectionists mostly set up the room's projection hardware and splice the film into a big loop for the platter system. And the role is going away entirely due to digital projection tech, most of which can be set up once and then loaded with video remotely by the national office... or by sending a suitably large thumb drive...
The main difference with all of those is that the replacing technology wasn't reliant on the work of the people using the old technology. AI cannot create anything new; at best it rehashes what's put into it in different ways.
Which means AI as a technology needs artists to exist and continue creating, just without the getting-paid part.
 

To be blunt: without political intervention, your job will become as obsolete as shorthand stenographers and human calculators (people paid to work out calculations). The only way it won't is if the great filter is "high-tech civilizations blow themselves to smithereens before becoming multi-planetary." The question isn't "if" but "when."

Several direct parallels for high skill work replaced by tech...
  • Gregg Stenographer: replaced by the Dictaphone and stenotype stenographers.
  • Stenotype Stenographer: being replaced by taped and digital audio.
  • Dictaphone transcriptionist: slowly being replaced by automated transcription.
  • Human Computer (person paid to do mathematics by hand): started being replaced in the 1880s by tabulators; the role finally died out in the 1970s.
  • Punchcard/punchtape typist: displaced out of use by the 1990s as computer mass storage devices and file systems on disks (floppy and hard) improved the UX.
  • Telegrapher (be it Morse or other such encodings): replaced by phones, and later computers.
  • Film Projectionist: went from one per screen per shift, to one per shift, then to one or two per complex. Why? Changes in multi-reel movie projection: pushbutton switching, then platter systems. Currently, projectionists mostly set up the room's projection hardware and splice the film into a big loop for the platter system. And the role is going away entirely due to digital projection tech, most of which can be set up once and then loaded with video remotely by the national office... or by sending a suitably large thumb drive...
None of those technologies involved the inventors stealing the work of the people doing the jobs they replaced and then continuing to steal their labor to keep it going.

The only 'political' intervention necessary is to enforce copyright laws and other laws protecting intellectual property, and AI creators will ram into a stone wall unless they start paying for the labor they've stolen. Which is something they've already admitted.

AI isn't 'the next step' after artists, it can't exist without them and can't keep going without them. That's not progress, that's plagiarism.
 

@Anon Adderlan , here we have one of our own whose art has been used for training. He may be suffering professional harm, as his original artwork is getting flagged as AI-made instead of his own work.

Legally he can sell his artwork; no one has the right to demand he give it to them for free. Yet someone has taken it and trained on it without purchasing it. And judging by the names involved, several took it for commercial use, which can be more expensive to license.

Please tell him to his face that he has no right to prevent his artwork from being used without notice, license or compensation.

Please tell him to his face that even if he could be fairly compensated, you hope he isn't, because having to compensate artists would have a chilling effect on indie AI.

Please tell him that he must suffer any professional problems from his original artwork getting flagged as AI. Which is what you started your post with: AI art getting banned from a convention. That artists must bear those professional risks through no choice or action of their own, so that AI art can move forward.
This /\

I liked this post because it cuts right through all the BS.

On the subject of detectors, I also concur that AI detectors get it wrong all the time. I have tested hundreds of samples of my own art, some going back twenty years, and some come up as 50% AI-generated (or more) when they were literally drawn by hand with a Sharpie twenty years ago.

One possible solution is to require that AI-generated content be labelled accordingly, wherever it is posted, to avoid any confusion. Even on a forum, I think it would help people make more informed decisions if they knew whether a project or a ttrpg contains gen-AI content.

The challenge, of course, is transparency, as there are currently a lot of folks using AI-generated content without stating so right up front. Just as DrivethruRPG and other platforms where you sell your game require you to disclose which AI tools you are using, I think you should declare that you are using those tools anywhere you post about your project.

At least this would allow those of us who don't want to interact with gen-AI content to find projects closer to what we are looking for, while also cutting out the drama by stating it right up front ("we used AI" is a much better look than "we are sorry we lied about using AI").

--> One final thought: Everything has a price...

My science teacher once told me in school "either you pay now, or you pay later."

Generative AI might be "free" (though in reality providers will eventually shift to subscription models, so no, it won't be free forever), as in you type some words and you get an output, but what you might be missing in that equation is the cost in social goodwill.

While it might have been free to produce the output and put it in your game, the price you actually paid was the potential goodwill of your customers, many of whom are artists, writers, and creatives who work in the ttrpg industry. Yes, a lot of creators support each other, and a lot of artists and writers buy each other's books, because that's how the same $20 bill has been passed around for decades (that's an inside joke for creatives).

By not understanding that part of your audience, you are only doing harm to your own project. Creatives and artists respect the work you put into a project regardless of how fancy the art is. We don't care how much money you put into it or how much it made on Kickstarter; we care about using the skills we have honed over the years to produce something within our skillset. Plenty of amazing games use modified stock or public-domain art and have persisted for years on the goodwill of their customers, because those customers recognize the hard work that was put into the project.

Look at the original Mothership rulebook: the art is kind of terrible, but it has a very distinct look, and despite the art not being top-notch, they have had dozens of successful Kickstarters. They started really, really small, made their own art to start, and now they can afford to hire some really great professional artists with the money they raise.
 

Generative AI might be "free" (though in reality providers will eventually shift to subscription models, so no, it won't be free forever), as in you type some words and you get an output, but what you might be missing in that equation is the cost in social goodwill.

While it might have been free to produce the output and put it in your game, the price you actually paid was the potential goodwill of your customers, many of whom are artists, writers, and creatives who work in the ttrpg industry. Yes, a lot of creators support each other, and a lot of artists and writers buy each other's books, because that's how the same $20 bill has been passed around for decades (that's an inside joke for creatives).
That is a very good point.

I love Paranormal Power; it is hands-down my favorite way to use psionics in my 5E D&D game, and I would be very sad if I didn't have it. But I couldn't support the Kickstarter in good conscience until @Steampunkette offered a version that had been illustrated with commissioned artwork. (Thanks again for doing that, SP. The art is beautiful!)

So, yeah--Steampunkette ran a successful project (4275% funded) but also banked a lot of social goodwill in the process.
 

I suspect the truth lies somewhere in the middle. AI models are trained on material from the Internet, much of which is under copyright. However, it does not COPY these materials but rather draws statistical inferences from them. The models do not contain the original data. But... at the same time, they can be used to create works that echo the original source material. That's the way the modelling works. Are these derivative works in the copyright sense? That's a question for the legal system to sort out.

There are some interesting precedents for this kind of argument. When online phone directories appeared, the companies that make Yellow Pages sued various online services for copyright violation. The courts (pretty much worldwide) held that phone numbers were numbers, and that numbers and other mathematical facts could not be copyrighted.

A more pertinent example might be the various legal actions against Google, arguing that the process of scanning the Internet to build its search index violated copyright. The most important case involved the scanning of books so Google could index the contents to help searchers discover relevant works. All the major publishers sued Google over this practice and lost. The argument was that this was a transformative use of the source material permitted under copyright law. Plus, Google was not exposing the source material to users of its search engine; it was merely using it to build its search index.

Are these precedents relevant to AI? Who knows. I suspect there will be several rounds of court cases before we land on a clear position one way or another. In the meantime, most companies will continue to avoid the use of AI due to the current legal risks.

Having said all of this, the concerns of authors and artists about this technology have some validity too. Unfortunately, the new technology is being pushed by tech giants who see it primarily as a means to reduce wages and cut jobs. Most are wedded to an obnoxious libertarian philosophy that is rapacious and hostile to government regulation. So authors and artists have good reason to fear, and the law does need to offer them some protection against abuse.

What form should that intervention take? I don't think we'll know for several years. Personally, I would argue in favour of a system similar to the one introduced for older recording media: a few cents from every sale of the recording media is paid into a central fund used to reimburse artists for potential copyright violations involving the media. Such a scheme could work for AI, especially if the reimbursement was based upon how many of the artist's works were used to train the model.

But there is still a problem: with older technologies the money rarely reached the artists themselves, because the music labels inserted themselves as intermediaries and snapped up most of the money earmarked for artists. A new scheme would need careful design to ensure the same doesn't happen here too.
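The levy scheme described above is easy to sketch in code. This is purely a hypothetical illustration of the pro-rata idea, not any existing system; the function name, levy rate, sales figure, and artist work counts are all invented for the example.

```python
# Hypothetical sketch of a media-levy style fund for AI training data.
# A fixed levy per sale goes into a central pool, which is then split
# among artists in proportion to how many of their works were used
# to train the model.

def distribute_levy(sales, levy_per_sale, works_used):
    """Split the levy pool pro rata by number of works used in training.

    sales         -- number of units (e.g. subscriptions) sold
    levy_per_sale -- levy collected per unit, in dollars (e.g. a few cents)
    works_used    -- dict mapping artist -> count of their works in the training set
    """
    pool = sales * levy_per_sale
    total_works = sum(works_used.values())
    return {artist: pool * count / total_works
            for artist, count in works_used.items()}

# Invented example figures: a $0.05 levy on 1,000,000 sales yields a
# $50,000 pool, split 6/20 and 14/20 between the two artists.
payouts = distribute_levy(
    sales=1_000_000,
    levy_per_sale=0.05,
    works_used={"artist_a": 6, "artist_b": 14},
)
```

The interesting design question, as noted above, is not the arithmetic but the plumbing: who maintains the `works_used` ledger, and how to stop intermediaries from capturing the pool before it reaches the artists.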
 

AI models are trained on material from the Internet, much of which is under copyright. However, it does not COPY these materials but rather draws statistical inferences from them. The models do not contain the original data.
I don't think that's correct.

My artwork was found in the LAION-5B data set, which is used by AI image generators like Stable Diffusion to "draw inferences from," as you say. Those are then used by companies like NightCafe, Midjourney, DreamStudio, and several others. Six original images of mine were copied there without my permission (or even my knowledge), presumably from my Facebook page.
 

I don't think that's correct.

My artwork was found in the LAION-5B data set, which is used by AI image generators like Stable Diffusion to "draw inferences from," as you say. Those are then used by companies like NightCafe, Midjourney, DreamStudio, and several others. Six original images of mine were copied there without my permission (or even my knowledge), presumably from my Facebook page.

Regardless of whether copying your data into a data set was right or wrong, you didn't actually contradict him. It is reasonable to distinguish between the training data and the model.
 

Six original images of mine were copied there without my permission (or even my knowledge), presumably from my Facebook page.
You may want to research Meta's privacy policy and terms of service.

I'm not a lawyer, so I can't say for sure what the legalese means, but the last time I glanced at the ToS, I saw something about Meta claiming the right to use any user's Content "to conduct research" or something like that. If I were an evil tech giant and I got (or claimed) permission to use all of my users' Content for research purposes, I'd make sure my legal department was ready to argue in court that training an AI qualifies as "conducting research."
 

