The AI Red Scare is only harming artists and needs to stop.

I've followed this discussion (which is actually pretty informative, thank you everyone!) because I'm really not that knowledgeable about AI art, but very interested in getting an idea of how it works and what it implies ethically. My takeaway at this point, from a purely moral standpoint, is:

What seems morally (and maybe also legally) wrong is using art without permission for commercial purposes (I'm not even sure the question of where and for how long it is copied/stored is that essential). That seems to be beyond the realm of fair use to me. Someone is taking people's art and training an AI with it to later sell the services of said AI, without the creators of the art seeing any compensation. I don't think the question of whether the resulting art could be considered plagiarism is that important.

It still turns on the idea that an AI that learns to do art is held to a different standard than a human who, using the same material to learn to do art, isn't. That's self-evident to some people, and anything but to others.
 


One benefit, if I'm recalling correctly, is that if someone creates a commercial product made wholecloth by AI, whether images or other content, they can't copyright it;* it's free for anyone to use. I think this specifically will prevent many larger entities that like to own the images they use from fully adopting it.

*This is from a half-remembered article about a ruling in the USA; it might be different elsewhere, and I've no idea how it works here in NZ.
 

The different standard applied to an AI that learns from art versus a human who does the same is probably the sticking point. As long as you consider AI a tool (as I do), I'd say the acting parties are the people using the AI as a tool (or selling its use to others) - so they are using other artists' creations for commercial ends without permission or compensation, which sounds morally wrong to me.
 

Actually, in many cases, musicians are. Garth Brooks. Uncle Kracker. Gwen Stefani.
Any band doing a cover, many of which depart just far enough from the original to sustain a new copyright.
One major difference between performing (or recording) a cover and ingesting art to train an AI… the cover song is licensed and generates revenue for the rights-holders. There’s a whole system of licensing and royalty payment set up to handle this and any artist or venue who fails to uphold their end can be held accountable.
 

By that reasoning, the sausage-maker throwing stolen meat into his sausage-grinder isn't violating any laws since he's "making something new."

And it's not 'looking,' it's stealing. You do get there's a difference between a human looking at other art for inspiration and an AI consuming the entire thing to use, right?


So you're saying digital piracy isn't theft?

How'd that work out for Napster? Because the lawsuit Metallica filed put the lie to that claim really quickly.
Digital piracy is certainly theft from a legal point of view.
 



Hence my question about using public-domain material to train on.

Yep. Project Gutenberg is sitting right there. It'll leave you with an anachronistic-sounding generative AI, but what do you want for free, right?
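For anyone wondering what that would actually look like, here's a minimal sketch (mine, not anything anyone upthread posted) of pulling a few public-domain Gutenberg texts to use as a training corpus. The example book IDs, the URL pattern, and the START/END marker wording are assumptions about how gutenberg.org currently serves its files.

import urllib.request

BOOK_IDS = [1342, 84, 2701]  # assumed examples: Pride and Prejudice, Frankenstein, Moby-Dick

def fetch_gutenberg_text(book_id: int) -> str:
    # Download one plain-text ebook from Project Gutenberg's cache
    # (URL pattern is an assumption about how gutenberg.org serves .txt files).
    url = f"https://www.gutenberg.org/cache/epub/{book_id}/pg{book_id}.txt"
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")

def strip_gutenberg_boilerplate(text: str) -> str:
    # Keep only the body between the "*** START ..." and "*** END ..." markers
    # that Gutenberg wraps around each ebook (marker wording is an assumption).
    start = text.find("*** START")
    end = text.find("*** END")
    if start != -1 and end != -1:
        text = text[text.find("\n", start) + 1 : end]
    return text

if __name__ == "__main__":
    corpus = [strip_gutenberg_boilerplate(fetch_gutenberg_text(i)) for i in BOOK_IDS]
    print(len(corpus), "books,", sum(len(t) for t in corpus), "characters of public-domain text")

Whether the resulting model sounds like anything but a 19th-century novelist is another matter, but at least the inputs are unambiguously out of copyright.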

Is it, though, if it's doing the same thing you or I could (and would) do in writing a college essay - taking and synthesizing those source materials and producing what it's being asked for?

What does it mean to be "doing the same thing"? The process the AI uses isn't the same as the process that occurs in the human brain - the hardware and software are extremely different. The results are in many ways similar. In one way they are doing the same thing; in another, not at all.

(never mind that photos IMO should not be copyrightable in the first place; that's another can o' worms)

I'll mind it.
Taking a really good photo is a skill. And two photographers can each take a photo of the same subject, and get very different results. That difference is what makes it copyrightable.
 

My problem is that I'm remarkably unconvinced anyone making claims on that subject, for any purpose, is really as knowledgeable as they think they are here; there are way too many incentives for people to believe what they want to believe about this topic, including people who study one or the other half of it professionally.

Sure.
So, maybe this thread isn't for you. That's okay.
 

