AI/LLMs: AI art bans are going to ruin small third-party creators

By your own description, the process you use is merely placing many, many orders, kind of like an event planner does. That doesn't make the event planner a cook, either.
It does make the creation mine, though. The AI is just a tool, not a creator. Creation comes from the thought and idea behind the end result, not from simple assembly.

If I just throw out a prompt and take whatever picture comes back to me, the creation and the vision are those of the artist whose artwork was stolen by the AI.

If I direct the AI tool to make precisely what I am envisioning, it is not taking the artwork from anyone else, but rather is assembling my creation. My vision.
 


It does make the creation mine, though.
While the law here is not yet fully settled, this is mostly an untrue statement. You have not added enough human authorship for a process that relies only on LLMs to be considered your own work.
If I direct the AI tool to make precisely what I am envisioning, it is not taking the artwork from anyone else, but rather is assembling my creation. My vision.
As long as you ignore how the foundation models were trained, you are correct. Of course, they were in fact trained.

*You have yet to say you're using one of those boutique university models built entirely off of public domain works. If you are, I'd be rather impressed.
 

As long as you ignore how the foundation models were trained, you are correct. Of course, they were in fact trained.
It's not clear why the AI's training matters with regard to using AI to make exactly what one envisions, through a process of gradual refinement until the image perfectly matches the vision. AI can be used in more ways than "give me a single image; I'm going to use whatever it spits out."
*You have yet to say you're using one of those boutique university models built entirely off of public domain works. If you are, I'd be rather impressed.
Is your claim that such a model works fundamentally differently from one that wasn't trained only on publicly available works?
 

There's only one exception that applies to the initial scraping, and it requires ascribing personhood to the AI: educational use, meaning copying solely for the purpose of educating a person in techniques.
Why wouldn’t some variation of transformative fair use potentially apply?
 

While the law here is not yet fully settled, this is mostly an untrue statement. You have not added enough human authorship for a process that relies only on LLMs to be considered your own work.
Law and reality don't always align. I am the creator, even if the law doesn't recognize it.
As long as you ignore how the foundation models were trained, you are correct. Of course, they were in fact trained.

*You have yet to say you're using one of those boutique university models built entirely off of public domain works. If you are, I'd be rather impressed.
Training is irrelevant to creating with AI as a tool in the way I'm describing. The only difference is whether I'm using an AI whose training was moral or immoral, not whether the end result is my creation.

If I use a picture generated by an immoral AI and then completely change that picture to match my vision, the end result is every bit as much my creation as if the AI were a moral one. Why? Because the base picture isn't what it was before.
 

Why wouldn’t some variation of transformative fair use potentially apply?
Intent is an element of transformative use: specifically, the intent of the artist making the derivative work to use it transformatively, to make a statement.
Outcome is the basis of the educational exception: the education of the person for whom the copies are made.

Scraping the net for content is not transformative at all; there's no transformative use by the person doing the copying (i.e., setting up the scrape parameters) to make a statement.

Educating the AI is the intent of the scrape, not a transformative social statement. But the courts rejected that as a defense, since an AI lacks the personhood to be educated. You can't get an AI to prove personhood without training data sufficient to do so, and getting even halfway personlike appears to require data sets so large that only rampant scraping has been adequate.

Unless, of course, one goes to the absurd idea that the intent of the scrape is to show just how mundane humans are that a machine can replicate their behavior patterns in language use. But that doesn't rise to the level that most juries would accept. Never insult your jury. ;)
 

It's not clear why the AI's training matters with regard to using AI to make exactly what one envisions, through a process of gradual refinement until the image perfectly matches the vision. AI can be used in more ways than "give me a single image; I'm going to use whatever it spits out."
I have no problem with iNaturalist. They come by their foundational data honestly.

IMHO

The manner in which the AI is trained matters because it impacts the marketability of the art created with it.

I think it diminishes and insults both the artist and the art, because the tool provided is flawed (tainted foundational data).

It's something like the electrical wiring where you live: mostly hidden from view and very useful, but if the installer cut corners to save money, it could burn the building down.


Mural or graffiti? Murals that are vandalized? Graffiti that is vandalized? Graffiti that is removed/painted over?

The foundational questions are:
Who owns the wall?
Who owns/is responsible for the art?
What does the wall owner think about this?
What are their rights?

 

Law and reality don't always align. I am the creator, even if the law doesn't recognize it.

No, you want to be considered the creator.

But the credit of creatorship is a social construct, and it is useless unless others agree with you. What really matters is whether others accept you as the creator, for that is where the rights, honors, and profit lie.

In effect, your piece was ghost-written, or ghost-painted. Your name is on it. You may have been involved. But that involvement was not sufficient for many here to accept that you did anything worth crediting.
 

If I use a picture generated by an immoral AI and then completely change that picture to match my vision, the end result is every bit as much my creation as if the AI was a moral one. Why? Because the base picture isn't what it was before.
Did you change the picture, or did you ask the AI to do it? You've glossed over the critical part.
 

Training is irrelevant to creating with AI as a tool in the way I'm describing. The only difference is whether I'm using an AI whose training was moral or immoral, not whether the end result is my creation.

Nope.

"...whether the end result is my creation or not" is a legal question of ownership and rights. The provenance of the tools, and the nature of the creation with respect to other works, bear very much on who gets those rights, and they have done so since before generative AI.
 
