The AI Red Scare is only harming artists and needs to stop.



NFTs? Hydrochlorofluorocarbons? Whale oil and leaded gas?
Most of those were superseded by alternatives that did the same thing better without the downsides, though. (Except for NFTs; those had no real use.)

The downsides of AI art are kind of the point. You can instantly, as above, get a very nice picture of a dank starship corridor...which means you don't have to pay an artist, which means they can't make a living.
 

Soon enough, though, both hardware and software advances will allow some of those AI programs to become streamlined enough that you can run them on your own computer/smartphone/tablet.

In theory, yes, and it is likely to be a Moore's Law thing. It is less about increasing the efficiency of the calculation steps and more about getting sufficient memory on the device to hold the model.

The question is whether it makes financial sense, in practice, to give a pre-trained AI directly to users, or just give them an interface. The more ubiquitous the connectivity, the less sense it makes to have it run on a local device.
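To give a sense of why memory is the bottleneck, here's a back-of-the-envelope sketch. The parameter count and quantization levels are illustrative assumptions, not figures for any specific product:

```python
# Rough memory footprint for holding a pre-trained model's weights locally.
# The 7-billion-parameter figure and precision levels below are assumed
# for illustration only.

def model_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate RAM needed just to hold the weights, in gigabytes."""
    return n_params * bytes_per_param / 1e9

# A hypothetical 7-billion-parameter model at different precisions:
for label, nbytes in [("fp32", 4), ("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    print(f"{label}: ~{model_memory_gb(7e9, nbytes):.1f} GB")
```

Even aggressively quantized, that hypothetical model needs a few gigabytes of memory just for its weights, which is why on-device inference lags behind what a data center can serve over a connection.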
 

The downsides of AI art are kind of the point. You can instantly, as above, get a very nice picture of a dank starship corridor...which means you don't have to pay an artist, which means they can't make a living.

And that is a bit of a limiting factor. Remember that the AI needs to be trained. Once you drive artists out of making a living, you cease having new art to train the AIs on, and the AIs stagnate, which isn't so good for business.
 

First, let me thank @CleverNickName for his meaningful and thoughtful responses. Very thought provoking.
It's a complicated issue, and I don't think anyone really knows what to do about it right now.

The best advice I've got:
  • support local artists and local businesses whenever you can.
  • don't use AI-generated images and text, and don't buy products that do.
  • don't post your art on social media for scrapers to find.
  • don't use AI-scraping apps (like photo filters) on your phone.
  • don't "like," download, or share AI-generated material. "Don't feed the algorithm."
  • sign petitions to strengthen the legal protections of artists and writers who are fighting for their livelihoods.
As soon as they can hold AI liable for malpractice or incompetence, I know my job as an engineer will be threatened too.
 

Well, no wonder my own original artwork is being flagged as "created by AI." According to Have I Been Trained, my artwork has been found in the LAION-5B data set, which is being used to train Midjourney, Stable Diffusion, and like a million other AI generators.

Obviously I have never given anyone permission to use my artwork in this (or any other) manner, but it's not like the people working on AI trainers ever cared about permission. The whole point of their product is to avoid messy little details like "artists" and "contracts." Having to pay artists for commercial use of their own work would defeat AI's purpose.

...my mood has taken a turn.
 

For their main assessment, my IB (International Baccalaureate) Theory of Knowledge students have to write an essay exploring one of six knowledge questions. They write it as a series of drafts, each of which gets teacher feedback, before uploading the final version for external moderation.
I spent two years in the '90s trying to get someone to turn in the annual paper on truth with blank pages.
 

Please explain how a human brain does it.

Edit: because here's the thing: in my profession (teaching) we are really struggling with what to do about AI, since in many ways, it writes better than most humans. But also since it suggests that a lot of the things we thought were exceptional about humans...maybe not so much.
Having spent a few years teaching elementary music and a grade 5-6 classroom...

Yeah, the process we use in the classroom looks a lot like the one used to describe AI training. Repeatedly show correlated materials, and sooner or later those correlations sink in. Reward successful output of the expected material given the input.

It's also interesting to note that the same training mode used with small multilayer AI nets works with ex-vivo neural-cell organoids for playing video games...

Which makes the LLM look like it may be even closer to how things work in the brain than many think.
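That show-correlated-materials, reward-correct-output loop can be sketched in a few lines. This is a bare single-neuron perceptron learning the AND correlation, a toy illustration of the principle, not any particular system's training method:

```python
import random

# Toy illustration of the training loop described above: repeatedly show
# correlated input/output pairs and nudge the connection weights whenever
# the output misses the expected answer.

random.seed(0)
weights = [random.uniform(-1, 1) for _ in range(2)]
bias = random.uniform(-1, 1)
lr = 0.1  # how strongly each correction nudges the weights

# The "correlated materials": inputs paired with the expected output (AND).
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

for _ in range(100):                      # repeated exposure
    for (x1, x2), target in examples:
        output = 1 if weights[0] * x1 + weights[1] * x2 + bias > 0 else 0
        error = target - output           # reward/correction signal
        weights[0] += lr * error * x1
        weights[1] += lr * error * x2
        bias += lr * error

# After enough repetitions the correlation has "sunk in":
predictions = [1 if weights[0] * x1 + weights[1] * x2 + bias > 0 else 0
               for (x1, x2), _ in examples]
print(predictions)  # matches the targets [0, 0, 0, 1]
```

The parallel to classroom drill is loose but real: repeated exposure plus a correction signal, with no explicit rule ever stated.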

The fundamental issue: Many humans believe humans are a unique clade with a unique place in the universe. This is known by several names... but the one I prefer is "Humanocentrism"... and "Exceptionalism of Humanity" has been used by a few researchers recently...

The evidence coming out of animal research increasingly shows that humans differ in ability from other apes only as a matter of degree, not of fundamental structural differences, and that many mammals are much closer than most people are comfortable thinking about. Many birds in the 1.5-5 kg range are a lot smarter than people want to think, and the humanocentrists in the review process block a lot of papers from publication simply by being overly dismissive, citing "clever Hans" and "pareidolia."

It is much easier for many people's worldview to see humans as exceptions than to accept that we're just really smart animals, different only in degree. And for a great many, it's religious in origin.
 


These are just the latest examples I found, and enough harm has been done at this point that I'm no longer humoring the gaslighting. No, AI training is not theft. No, AI training is not a violation of copyright. No, you don't have a point if your response to AI taking jobs is to quit taking jobs yourself. Its use has become just another thing someone can be falsely accused of, with little recourse. And if you really want to continue pushing for 'ethical' training, just remember that indies are unlikely to ever afford the rights to enough content to train on, while Big Tech already has rights to all the content they'll ever need. And even if indies did, there's no way for them to prove it. I'll let you decide who benefits more from that state of affairs.
Quite the screed.

Do you hold to it when it's not abstract?

Well, no wonder my own original artwork is being flagged as "created by AI." According to Have I Been Trained, my artwork has been found in the LAION-5B data set, which is being used to train DALL-E, Midjourney, Stable Diffusion, and like a million other AI generators.

Obviously I have never given anyone permission to use my artwork in this (or any other) manner, but it's not like the people working on AI trainers ever cared about permission. The whole point of their product is to avoid messy little details like "artists" and "contracts." Having to pay artists for commercial use of their own work would defeat AI's purpose.

@Anon Adderlan , here we have one of our own whose art has been used for training. He may be suffering professional harm, as his original artwork is getting flagged as AI-made rather than his own.

Legally, he can sell his artwork; no one has the right to demand he give it to them for free. Yet someone has taken it and trained on it without purchasing it. And judging by the names involved, several did so for commercial use, which can be more expensive to license.

Please tell him to his face that he has no right to prevent his artwork from being used without notice, license or compensation.

Please tell him to his face that even if he could be fairly compensated, you hope he is not, because having to compensate artists would have a chilling effect on indie AI.

Please tell him that he must simply accept any professional problems from his original artwork getting flagged as AI. Which is what you started your post with: AI art getting banned from a convention. That artists must bear those professional risks, through no choice or action of their own, so that AI art can move forward.
 

