The AI Red Scare is only harming artists and needs to stop.


AI is here to stay. That's all you need to know.

Best to ignore this panic as best you can. All their efforts are doomed to fail anyway. Plus, it's a very small minority.

Focus on the positives, since they are huge: AI is a net gain to humanity. The era in which only a small subset of creatively gifted people were able to create art is over, which is incredibly empowering to the rest of us.

Same with every other technological breakthrough. Once, only trained tailors could make clothes; AI is just the latest power loom. It's one more step in that incredible thing called the industrial revolution, which is slowly dragging humanity toward a future where we finally won't need to work just to survive. :)

But I'm sure people were accusing textile workers of theft and taking people's jobs back then too.
 

But I'm sure people were accusing textile workers of theft and taking people's jobs back then too.
AI creators ARE stealing. They're knowingly using intellectual property that doesn't belong to them to train the AI; that's theft. OpenAI outright admitted they can't create AI without violating copyright laws.


Textile workers didn't go around stealing materials from other people, having a robot do all the real work, and then claiming the end result belonged to them; that's the difference.
 

You do know the RIDICULOUS number of false positives the "AI testers" give, since a large number of digital tools will set them off? Plenty of works known to be from years before AI have come up positive.

Anyone actually educated ignores these, and ignores those who use them, because they are showing themselves to be ignorant of the actual situation.

These are just the latest examples I found, and enough harm has been done at this point that I'm no longer humoring the gaslighting. No, AI training is not theft. No, AI training is not a violation of copyright. No, you don't have a point if your response to AI taking jobs is to quit taking jobs yourself. Its use has become just another thing someone can be falsely accused of with little recourse. And if you really want to continue pushing for 'ethical' training, just remember that indies are unlikely to ever afford the rights to enough content to train on, while Big Tech already has rights to all the content they'll ever need. And even if indies did, there's no way for them to prove it. I'll let you decide who benefits more from that state of affairs.
This argument comes down to "we could do the right thing, but only the big companies can afford to compensate the artists, so screw the artists so the indie AI people can play as well as the big companies do"?

Sorry, "feel sorry for the indy AI, and show it by letting everyone rake over artists" doesn't fly.

(I'm ignoring the obvious falsehood about copyright; in at least some countries there have already been successful lawsuits.)
 



I don’t care what system of government prevails: if you take someone else’s creation and labor and make use of it without compensation, it’s wrong.

Most of us with a sense of fairness can agree on that.

I don't agree with that description at all. The legal standard of fair use in the USA is whether or not the work is transformative, and the work is clearly transformative. As a practical matter, everything that gets uploaded to the internet gets copied millions of times as an inherent aspect of the technology. The fact that a computer skimmed over billions of images and learned something from examining them doesn't violate the rights of the owners of those images. The fact that the computer skimmed over trillions of words of copyrighted material in order to learn something about language does not violate the rights of the owners of those words. What the computer is doing is comparable to what anyone does who reads or sees copyrighted material and has their behavior influenced by it. The people who wrote those skimming programs were doing nothing dissimilar to a search engine reading text in order to create search indexes. The index is a transformative work, and they had every right to use things in this manner.

When these algorithms produce art, it is clearly original art. It's clearly transformative. You can't mimic the work of the AI by creating a mosaic or a collage. I see so many bad attempts to prove that the AI did something unfair that involve representing the AI's work as a human-created mosaic or collage. The irony.

When a composer like John Williams makes a piece of music, it's often inspired by music he has heard that was created by others. But, despite the phrases that might be familiar to musical aficionados, it is clearly transformative work. Being inspired by something else does not violate copyright.

Fundamentally, the argument you make has nothing to do with fairness. It's the same argument skilled scribes were making against typewriters, or human computers were making against mechanical computers: my skills and my job had a certain amount of value that I worked hard to achieve, and now your machine is threatening that value. And yes, if that were true, it could potentially suck, but fairness isn't the problem.
 

There is a lot more to what you said here than the part I quoted. But I quoted this part because it got me thinking.

Isn't there a strong argument to be made that every fantasy adventure book is just derivative of the Odyssey? Isn't the thing we are accusing AI of, and calling immoral, based upon the same concept every writer or storyteller has relied on for generations (or thousands of years), i.e. taking what they have heard and read from others and piecing it into their own representations? (Yes, I know in some ways what generative AI is doing is somewhat different. But I don't think conceptually it is completely different.)


These statements are not legally accurate, nor are they fair interpretations of the statements you are quoting.

If you post a story on the internet and I read it along with thousands of other stories, then take all those stories into my mind and write my own story, the story I write will be bits and pieces of what you wrote. But I will not be stealing what you wrote.
Fair point.
 

The real culprits are the companies who stole works of art to train their models and then released them to the public. If they hadn't done that, there would be no panic.

With respect, everyone should be responsible for their own behavior. That some Big Corporation did something wrong should not be an excuse for anyone to abdicate responsibility for their own choices.
 
