AI is stealing writers’ words and jobs…

TheSword

Legend
I beg to differ. Terms like theft carry a moral weight other infractions do not. Tax evasion can be described as theft under a broad definition, and yet many people wouldn't balk at deducting non-eligible expenses. I think using such terms inappropriately could be a way to make things look worse than they really are. It can be an approximation (and therefore it would be nitpicking to point it out) or a hyperbole (and thus worth pointing out, to defuse the rhetorical effect).

Also, precision is especially important because the issue crosses international boundaries: several countries can find several solutions to the problem, each adapted to their own situation (no copyright (San Marino), no art [aren't some Muslim countries banning the representation of man altogether? or is that a thing of the past?], opt-out, opt-in, no AI [Dune?])... There is no reason that what is good for the people of X is good for the people of Y. There are countries that make mandatory things that are forbidden in others, and many more examples of things deemed totally harmless in some that are offenses elsewhere. Both countries can be right at the same time.





By using the word "fair", you're positioning the debate in the field of morality. Everyone is using people's work without fair compensation. We don't pay anything to the inventors of the language we use, or to the people who designed the numbers we use... because the law has set boundaries on how long copyright protection applies. Why is X years fair, and X-1 unfair? It's a tough question, but it's the job of lawmakers to take every view into account and produce texts that are optimal. Also, we are all dwarfs sitting on the shoulders of giants, and nobody would imagine that a scientist who makes a breakthrough is a thief because he worked on the ideas of others without compensation. That ideas and concepts can't be protected is also the law, but it could be seen as unfair by a definition as wide as "using people's work".




And would it change anything? What is your stance on open-source weights, which by definition earn their authors nothing? This is a concrete example, not an abstract theoretical one, as many very good LLMs are open-source.



No, but it compensates creators in general, who had access to free education in art, free museums with extensive displays to draw their inspiration from, public infrastructure like the Internet, and grew up in countries peaceful enough that they could become artists instead of being enrolled in a militia. Collective compensation is as fair as individual compensation, from a moral perspective, if it is devoted to enriching the social background that made learning artistic skills possible. Cars pollute, fumes diminish people's quality of life, and yet those people are not compensated individually; but fuel taxes are devoted (in part) to funding environmental policies, so the damage I sustain by living in a city is compensated by having a natural reserve on the other side of the world. Do you think that's unfair? I don't. Fair compensation doesn't necessarily mean individual compensation.



I am not what-if'ing. I am proposing solutions, based on existing processes not unlike the one you describe. That you felt robbed isn't something I'll dispute, and as a user of CDs to store archives of data, I felt robbed by a similar policy: having to pay a tax despite not using them to infringe copyright (in France's case, it was implemented as a tax on physical media). Interested parties have a hard time determining what's best, because of course their own interests blur the picture. At the time I'd have said "it's not my problem, the artists should just sue whoever is using CDs to copy films and music instead of robbing me for storing my own data", which was materially unfeasible. It took a lot of restraint to see that it was a good middle ground, especially when funding public policies instead of the money being handed over to a few "big name" production houses.
What a beautifully written post.
 


Art Waring

halozix.com
Adobe last updated its Terms and Conditions in Aug 2023. Like any other app or program that makes such changes, a pop-up appears that requires you to acknowledge that you have read and understand the new Terms and Conditions. If you do, you are agreeing to have your stuff used for machine training.

It seems likely that the terms were updated to allow machine learning at that time, and that anyone who continued using Adobe after that gave away their privacy, along with the processes for the content they created. It also seems likely that, like me, almost no one actually read the Terms and Conditions before clicking that they agreed, and just kept using Adobe.

It doesn't have to, but the company decided that they wanted it in there for use in training their AI, so they put that clause in and had people agree to it in order to continue using their product.

I'm curious about whether those who did not agree to it have had their processes used for the machine learning.

If you agree to give it away, you don't get paid. The man who invented the polio vaccine gave it to the world instead of keeping it for himself and making money off of it. There are music artists who give their art away for free.
I can't really disagree with any of this; I just think that a bit of transparency would have done them some good here.

Changing terms and conditions is one thing, and I don't know how far down the AI clause was hidden because I refuse to use Adobe CC at the moment. However, I think that if you are advertising your product as "fairly licensed", you should have a transparent process for proving that point. If artists are saying this isn't the case, then that surely warrants some investigation.

This also glosses over the fact that the machine learning clause goes beyond just using your data: they can train on anything you do using their app, which IMO is an extreme breach of privacy.

I will go back to my previous statement on the matter. Defaulting to opt-out only, or making scummy changes to terms in usage agreements, places the blame solely on artists. It's not Adobe's fault you got robbed of all your IP (intellectual property) and decades of hard work; it's your own fault for not being aware of their scummy changes to their terms. This is incredibly toxic for artists: in a world that already treats artists like dirt, it only entrenches that attitude.

Artists have no protection under the law like corporations do, and yet people argue that corporations deserve to get away with outright theft under a nicer phrase ("it's all fair use" -OpenAI) while artists get the short end of every stick?
 


If that end result was given by any person or algorithm, it would be deemed copyright infringement.

We're discussing AI models, not end results. If I use an AI model to create an image of Elsa from Frozen and redistribute it, I'd certainly infringe copyright. But if I create an image of Elsa from Frozen using Photoshop, or using a brush, and redistribute it, I'd also infringe copyright without any need for AI. The demonstrated ability of a tool to be used to generate copyright-infringing material has no bearing on whether the tool itself infringes copyright. That question can't be settled, imho, by examining the output of the tool. It's a complex question that's unlikely to be solved unanimously by courts. On the other hand, establishing in law clear ways to create legal models is probably the safest bet to get them quickly and hassle-free (until we get models trained on CC art and on art from the period before copyright). And sure, even a 100% compliant EU opt-out model could create Elsa from Frozen, and redistributing those images would still be a copyright infringement. Much like it could recreate Suprematist Composition: White on White from a very basic prompt.
 

FrogReaver

As long as i get to be the frog
We're discussing AI models, not end results. If I use an AI model to create an image of Elsa from Frozen and redistribute it, I'd certainly infringe copyright. But if I create an image of Elsa from Frozen using Photoshop, or using a brush, and redistribute it, I'd also infringe copyright without any need for AI. The demonstrated ability of a tool to be used to generate copyright-infringing material has no bearing on whether the tool itself infringes copyright. That question can't be settled, imho, by examining the output of the tool. It's a complex question that's unlikely to be solved unanimously by courts. On the other hand, establishing in law clear ways to create legal models is probably the safest bet to get them quickly and hassle-free (until we get models trained on CC art and on art from the period before copyright). And sure, even a 100% compliant EU opt-out model could create Elsa from Frozen, and redistributing those images would still be a copyright infringement. Much like it could recreate Suprematist Composition: White on White from a very basic prompt.
It’s more than that. I have to intentionally use Photoshop to create copyrighted material. I don’t have to have any intention of doing so with AI.

I’m with you that it’s a tool. Not all use cases are bad etc etc. But right now it’s impossible for the user of the tool to know if the AI gave them copyrighted material or not.
 

It’s more than that. I have to intentionally use Photoshop to create copyrighted material. I don’t have to have any intention of doing so with AI.

You asked for something by explicitly mentioning D&D 5e in the prompt; I don't think you can say that your intent wasn't to generate something copyrighted. If I ask an AI to generate a picture of a Big Mac, I shouldn't be surprised to find a McDonald's logo somewhere, while it would be more jarring on a generic burger prompt. Also, as a human, when you notice that something generated is protected by copyright (like if you had asked for a yellow square with a red border, stealing the masterpiece I previously posted in this thread), the onus is on you not to redistribute the work and actually infringe copyright. Private use isn't infringing anything.
 

Scribe

Legend
Terminator.JPG


Andy Samberg GIF
 

FrogReaver

As long as i get to be the frog
I mean, how else would one prove that these models are (and for those keeping score, they are) built on outright theft?

Ask your model to generate "Terminator Style Robots"; if they are not immediately recognizable as The Terminator (TM), your model sucks.
I find it better to focus on what they create rather than how they create it.

The ‘how’ brings comparisons to how humans ‘use’ copyrighted works by virtue of them being culturally relevant to drive their own art. The ‘how’ also brings highly technical bits of how computers work into the convo.

Neither are great starting points for this convo.

Focusing instead on what is produced makes the copying much harder to justify. However it works on the inside, it’s clearly able to copy, based on what we see it produce.
 

FrogReaver

As long as i get to be the frog
You asked for something by explicitly mentioning D&D 5e in the prompt; I don't think you can say that your intent wasn't to generate something copyrighted. If I ask an AI to generate a picture of a Big Mac, I shouldn't be surprised to find a McDonald's logo somewhere, while it would be more jarring on a generic burger prompt. Also, as a human, when you notice that something generated is protected by copyright (like if you had asked for a yellow square with a red border, stealing the masterpiece I previously posted in this thread), the onus is on you not to redistribute the work and actually infringe copyright. Private use isn't infringing anything.
I can say my intent wasn’t. I can even go further and say wanting something in the style of 5e doesn’t require copyright violations to create.
 

Focusing instead on what is produced makes the copying much harder to justify. However it works on the inside, it’s clearly able to copy, based on what we see it produce.

Much like a printer. It can copy a copyrighted work perfectly, and far more faithfully than an AI model. It is also not blamed for it (it can't know whether the reproduction will be legal or not), and the person using it to redistribute the copyrighted material it printed will be sued, as they should be. Not HP or Xerox.

If we were to follow that line of reasoning, then AI model trainers should be facing lifetime jail sentences, because the tool can be used to replicate low-quality money bills, which is considered counterfeiting in some (many? I'd even say most...) jurisdictions and usually carries a harsh penalty. Also, to look at more ghastly consequences, there are jurisdictions where, outside of any copyright problem, blasphemy isn't seen as a human right and can even carry the death penalty. It would certainly stand to reason, if you blame the toolmaker for the tool's output, that the ability to generate a blasphemous image should have them put on trial. While that's certainly a possible position, I think it's best to limit responsibility to the person using the tools, like we do with any other tool.

Another problem I see with focusing on the output is that, using this method, you can't distinguish between a legal model and an illegal one. If I create a 100%-legal model because, say, I am a public library and therefore I can create models by scraping even if the authors have opted out, a user like you will very well be able to ask the model to create something that comes from a book (for example, by asking for the 10 most significant passages from the D&D 5e rules), and yet the model (and my model-making activity) won't infringe copyright, by definition. You would, if you were to redistribute the output, of course.

I can say my intent wasn’t. I can even go further and say wanting something in the style of 5e doesn’t require copyright violations to create.
Many AIs have a hard time distinguishing a prompt written as "a girl with red eyes and a blue dress" from one written as "a girl with blue eyes and a red dress". Expecting them to understand by themselves the subtle difference between "in the style of" and "like" is probably out of their league for a long time to come. Until then, it's best left to the human user of the tool, much like we do with brushes and printers, to determine whether or not to redistribute the output and actually break the law.
 
