The AI Red Scare is only harming artists and needs to stop.

Artists didn't consent to their art being used to train AI, weren't compensated, and weren't credited.

So what? There is no obvious reason why they should be. Again, if the text and images had been produced by a human without a tool, they would obviously be unique and creative works. So how does the fact that a tool was used change the fact that the results are unique and creative? If it was legal for someone to read or view the work, and that reading or viewing required the temporary creation of a copy of that work (as everything on the internet does), then how was scanning it with a tool and keeping only a temporary copy a violation of copyright?

I get it. It's new. It's scary. This is a novel situation in the history of humankind. But this is less of a copyright violation than if Data checked a book out of the library and read it at superspeed, because Data would be storing a copy of the book in his digital mind if he did that, and the machines that read these books or scanned this art didn't.

If AI-creators actually believed they weren't violating copyright, they'd be ripping off Disney's copyrighted work.

I thought I'd already dispensed with this through practical experiment. It's clear that publicly available AI is well aware of Disney's trademarked and copyrighted content and has almost certainly included it in its training in some fashion. Moreover, as far as I know the major AI players have not disclosed their training sets and treat them as proprietary, so I'm not sure where this idea that the AI image creators can't rip Disney's works comes from. Certainly, a lot of images inspired by Star Wars - a Disney IP - have been created using publicly available image generators without custom data sets, because I've created some myself for my own games. So, as far as I can tell from practical experience, googling this topic, and observing what is out there in AI art, you are misinformed and spreading misinformation.

UPDATE: By way of reference to what I mean, here is a zany, stupid image, not worth publishing, of a camel flying an X-Wing fighter, which nonetheless illustrates that leading AI generators know a lot about Disney's copyrighted works:
 

Attachments

  • xwing.jpg


Given that we haven't a clue how the human brain works, it amazes me that you would confidently declare that. How the heck do you know what method the human brain uses? Go ahead and win a Nobel Prize and a lot of other acclaim by revealing such secrets of the mind.
I'm going to rock your world: We know many ways the brain doesn't work. I can also confirm that the mind doesn't work via gears and pendulums. Or miniature giant space hamsters. Or your third grade science project.

Get a hold of yourself. We can eliminate possibilities along the route in a quest for understanding. The structure of the brain is not the same type of mechanism as building a model and then statistically evaluating a prompt against that model.
 

I'm going to rock your world: We know many ways the brain doesn't work. I can also confirm that the mind doesn't work via gears and pendulums. Or miniature giant space hamsters. Or your third grade science project.

That the brain doesn't use gears or pendulums can be observed. That it doesn't use, say, NOR and NAND gates isn't something I'd be willing to state concretely. That it doesn't use some approximation or variation of a statistical model is not something we can yet observe, because the structure of the brain is far too complicated and too poorly understood.

I'm not sure why you think mentioning that it doesn't use giant space hamsters helps your point. Yes, we have observed the workings of the brain enough to know that it doesn't have a giant space hamster in it. But we haven't observed the workings of the brain well enough to know that some Markov chain is not being implemented at an organic level. We can't even really understand how the neural nets we've created come up with answers, so how can we be sure how the human neural net comes up with answers?

How in the world did you leap from the idea that we can look in and not see a giant space hamster to the idea that we can look in and not see a particular algorithm at work? Perhaps well-known algorithms evolve organically and are used at various points in the brain's logic chain as efficient solution solvers. How would we see, or not see, them within the complex web of neural interfaces? Perhaps there is a little Markov chain predictive model selecting the right adjective it thinks should be used for the size of something. So far as I know, no one knows, and I have both a degree in computer science and a background in biology. I'll gladly bow to your superior understanding if you happen to have a PhD thesis on this topic (and I'd love to hear your learned discourse, because I find this endlessly fascinating), but I've talked to a lot of PhDs and I'm guessing, using some algorithm I couldn't define, that the answer is "no".
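To make concrete what I mean by a "little Markov chain predictive model", here is a minimal toy sketch in Python. It is purely illustrative and not a claim about how any real AI system, let alone the brain, actually works: the transition table is the "model", and sampling from it is the "statistical evaluation".

```python
# Illustrative sketch only: a toy Markov-chain "next word" predictor.
# Not a description of the brain or of any particular AI system.
import random
from collections import defaultdict, Counter

corpus = "the big dog chased the small cat and the big cat chased the small dog".split()

# Build the model: count which words follow each word in the training text.
transitions = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    transitions[current_word][next_word] += 1

def predict_next(word):
    """Sample the next word in proportion to how often it followed `word`."""
    counts = transitions[word]
    if not counts:
        return random.choice(corpus)  # unseen word: fall back to any word
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights, k=1)[0]

# Example: after "big", the model picks "dog" or "cat" with equal probability.
print(predict_next("big"))
```

The point of the toy is only that "build a statistical model, then sample from it" is a simple enough recipe that we can't rule out something functionally similar being embedded somewhere in the brain's wiring.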
 

I know a lot of people (not personally, just in general) are against AI art, but personally I love it. I have a lot of fun creating stuff for my games: it might be an image of an NPC, or maybe a location, that I can show my players. Friends and I have also used it to create images of our PCs, and it works well (sometimes ruined by those extra digits or a crazy number of scabbards).
 

AI creators are blatantly committing theft of intellectual property, AI is spewing misinformation, AI is being used to cheat, and AI is consuming AI-created content to create more AI-created content in an incestuous loop that fills the internet with word salad.

The same people defending AI also claimed cryptocurrency and NFTs were the next big thing, and that anyone pointing out the obvious crime and dishonesty involved just hated progress.

'THE BLOCKCHAIN' was supposed to upend the world of finance, and instead it just recreated currency and stock markets without any of the regulations that keep those things from being giant scams.
I get it. It's new. It's scary. This is a novel situation in the history of humankind.
So was Mr. Burns's sun blocker, which you're defending the digital equivalent of.

 



"how art has always worked" - incorrect.

The models use statistical analysis to produce output, which is a very different method than the one the human brain uses.
Which is irrelevant to the fact that a human using Photoshop and a PC using Midjourney are both just looking at a bunch of artists' work and making something new. They're both "stealing" to the exact same degree. In Renaissance academies, students trained by copying the masters to develop their skill. Fan artists were looking at comics to develop their style before generative art was a thing. Making something new after analyzing existing art is a fundamental part of how art has always worked.
 

The world would be much better off if the verb "to steal" were reserved only for cases where the original owner is deprived of the thing stolen.

If I take your car so now you have nothing to drive around in, I have stolen your car.

If I build an exact replica of your car, but you still have your car, I may have broken some laws, but it would be so useful if we'd agree that whatever I did, I did not steal your car.
 

Which is irrelevant to the fact that a human using Photoshop and a PC using Midjourney are both just looking at a bunch of artists' work and making something new.
By that reasoning the sausage-maker throwing stolen meat into his sausage-grinder isn't violating any laws since he's "making something new."

And it's not 'looking,' it's stealing. You do get there's a difference between a human looking at other art for inspiration and an AI consuming the entire thing to use, right?

The world would be much better off if the verb "to steal" were reserved only for cases where the original owner is deprived of the thing stolen.

If I take your car so now you have nothing to drive around in, I have stolen your car.

If I build an exact replica of your car, but you still have your car, I may have broken some laws, but it would be so useful if we'd agree that whatever I did, I did not steal your car.
So you're saying digital piracy isn't theft?

How'd that work out for Napster? Because the lawsuit Metallica filed put the lie to that claim really quickly.
 
