
AI is stealing writers’ words and jobs…



Maxperson

Morkus from Orkus
Oh, and about Adobe: they claim they have obtained consent for 100% of their training data, but many artists are saying that this is a blatant lie.

Article: Artists accuse Adobe of tracking their design process to power its AI.
Maybe I'm just not understanding this, but why is this bad? If I use C++ to create a fantastic program, the program is my creation. Learning and using C++ is the process. Similarly, if I want to create a great painting, I need to learn the processes involved in proper brush strokes and much more. It seems to me that the process isn't what is owned, but rather the final content.
 

Maybe I'm just not understanding this, but why is this bad? If I use C++ to create a fantastic program, the program is my creation. Learning and using C++ is the process. Similarly, if I want to create a great painting, I need to learn the processes involved in proper brush strokes and much more. It seems to me that the process isn't what is owned, but rather the final content.

I suppose any way of making automatic art, whether by statistically predicting the color of the next pixel or by creating a magical portrait that displays an image to your specifications through astral power, will be met with the same negativity. Which makes me think the debate is not really about AI and the legality of its data sourcing, but about "artists are screwed, what will they do?". And coders as well (the job impact of AI is not limited to making pretty pictures).
 

Opt-out options are a cop-out, allowing companies to deflect blame onto artists. If they were actually interested in treating artists fairly, it would be an opt-in system. Furthermore, tons of companies offer opt-out options that are never enforced. There is currently no law forcing companies to honor opt-out lists, and they are free to ignore them at their convenience.

No, the point of the TDM exception is that indeed, if you don't respect opt-out, you can't scrape and you can be sued under regular copyright law. That's why it's called the TDM exception: it conditions the legality of scraping on respecting opt-out. I don't see how you can say "there is no law..." when we're discussing a specific piece of legislation.

As a traditional artist, opt-out only is not a valid solution. Period.

That I can understand. However, the job of lawmakers is to find the best outcome for society, not just for artists. The flow of EU legislation (the TDM exception in the DSM directive, last week's AI Act...) seems to be shaping the path toward more AI, with more controls for regulators (and exceptions to exempt the state itself from control), rather than the contrary.


Not to mention that the opt-out option completely ignores the issue of scraping and using personal data without consent. For the one company that does honor opt-outs, there are thousands of companies that ignore the issue completely.
Sure. Big companies are regularly fined for not respecting the law (as recently as yesterday, Google was fined 250 million USD by the French regulator for failing to respect copyright in its use of newspaper excerpts), and maybe the fines aren't enough to make them comply. But what dissuasive means would be needed to push GAFAM-like companies into complying with the law seems to be a debate distinct from AI.


There is still a very, very long way to go toward making ethically sourced training models for generative AI. Your attempts to steer the narrative away from this are also an attempt to sweep the damage dealt to artists under the rug while you champion AI as the best thing since sliced bread.
Since everyone has his own views on what's ethical or not (do you think Proudhon would approve of intellectual property?), I prefer to look at what's legal or not.



This goes beyond just scraping data, Adobe is also training directly on your creative process without telling you. This is beyond vile in every way I can imagine. The only way to protect yourself as an artist is simply to NOT USE THE TOOLS.

If they don't tell you, it's probably illegal. If there is, buried in the contractual clauses, a "you agree we can do what we want with the data we gather on the way you use our tools", then indeed, the only way to refuse that contractual agreement is not to enter it and renounce its benefits, such as using the tools they made. So basically I agree with you on this. Using other tools that do not collect such information would be the sensible thing to do.
 

Art Waring

halozix.com
Maybe I'm just not understanding this, but why is this bad? If I use C++ to create a fantastic program, the program is my creation. Learning and using C++ is the process. Similarly, if I want to create a great painting, I need to learn the processes involved in proper brush strokes and much more. It seems to me that the process isn't what is owned, but rather the final content.
Well, think of it like someone watching everything you are doing while working in Adobe, violating your privacy without your knowledge or consent. Maybe you don't care about your own privacy, but I sure as hell do.

As an artist, my own artistic process is mine alone. No one has the right to monitor how I work or how I choose to create art without my consent. I use advanced techniques that I developed myself, including complex layering and other methods that I have spent two decades perfecting.

No one has the right to steal another artist's process if they don't want to share it. (I have no problem with other artists sharing their techniques, as this is a personal choice about how much you are willing to share about yourself with the public, but it is still their personal choice, not a choice forced upon them by Adobe.) The fact that you only place value on the final output is proof alone that you don't understand what it takes to make art at a professional level.

Artists must go through several stages of creation & feedback to reach a finished result. This takes time, and sometimes you will need to make changes to accommodate a client if they feel something needs adjusting. This can go on until you are happy with the result. Then the process of inking begins, and the piece really starts to come to life. Coloring a piece then takes further time. This entire process can take weeks or months when working at a professional level.
 

Art Waring

halozix.com
No, the point of the TDM exception is that indeed, if you don't respect opt-out, you can't scrape and you can be sued under regular copyright law. That's why it's called the TDM exception: it conditions the legality of scraping on respecting opt-out. I don't see how you can say "there is no law..." when we're discussing a specific piece of legislation.
I am saying exactly the same thing you are. The laws are being discussed, but there is no law set in stone to enforce opt-out lists. Until that happens, they can ignore the (nonexistent) laws at their convenience.

That I can understand. However, the job of lawmakers is to find the best outcome for society, not just for artists. The flow of EU legislation (the TDM exception in the DSM directive, last week's AI Act...) seems to be shaping the path toward more AI, with more controls for regulators (and exceptions to exempt the state itself from control), rather than the contrary.
Again, you are deflecting away from the real issue. They are using opt-out as a way to shift blame onto artists. Opt-out lets corporations avoid blame for a system they created, one that is unethical & potentially illegal, should the laws ever address this issue.

If they don't tell you, it's probably illegal. If there is, buried in the contractual clauses, a "you agree we can do what we want with the data we gather on the way you use our tools", then indeed, the only way to refuse that contractual agreement is not to enter it and renounce its benefits, such as using the tools they made. So basically I agree with you on this. Using other tools that do not collect such information would be the sensible thing to do.
I do agree with this though. Transparency and trust are important issues right now. Artists will likely go with the tools that they can trust. I personally don't feel comfortable with being monitored while I work. Making art for me is very personal, and nobody has the right to pry open my personal life for the sake of corporate profits. Period.
 

I am saying exactly the same thing you are. The laws are being discussed, but there is no law set in stone to enforce opt-out lists. Until that happens, they can ignore the (nonexistent) laws at their convenience.
The DSM was adopted in 2019 and the transposition period ended in 2021. It entered into effect last August for the part that matters to this debate. It's no longer something being discussed.

Again, you are deflecting away from the real issue. They are using opt-out as a way to shift blame onto artists. You are ignoring this completely to serve your own agenda. Opt-out lets corporations avoid blame for a system they created, one that is unethical & potentially illegal, should the laws ever address this issue.
I am pointing out that in this case the law has already happened, and in the balance between protecting artists' interests, consumers' interests, big corporations' interests and so on, the EU has selected "opt-out". The "legality" part of the argument can be closed, provided model makers do actually respect opt-out, of course. A single one will suffice, my only agenda being that I'd like to be able to use a legal tool to illustrate my character sheet.

The ethicality of EU laws and lawmakers remains an open debate, but it is probably best left for another board, as is the morality of US laws.
 


Art Waring

halozix.com
The DSM was adopted in 2019 and the transposition period ended in 2021. It entered into effect last August for the part that matters to this debate. It's no longer something being discussed.


I am pointing out that in this case the law has already happened, and in the balance between protecting artists' interests, consumers' interests, big corporations' interests and so on, the EU has selected "opt-out". The "legality" part of the argument can be closed, provided model makers do respect opt-out, of course.

The morality of EU laws remains an open debate, but it is probably best left for another board, as is the morality of US laws.
The fact that they have elected for opt-out only is more of an indication that pro-AI lobbyists working for big tech have succeeded in doing their job. Artists don't have the money or the influence of big tech; that the laws favor corporations should come as no surprise when corporations have all the power.

Artists are a particularly vulnerable group, and the first laws put into action are very likely going to favor corporations, or they will be completely ineffective at addressing the problems we are facing.

Corporations are very good at bending the law to serve their greed. Little has changed in this regard. And it likely won't change to protect artists either. That doesn't make it right, or ethical, or in any way justified. The laws will likely take a decade or more to actually address the damage dealt to artists, writers, actors, & other affected professionals. By then the damage will have been done, and it will be too late for anyone to do anything about it.
 

As I said, I won't engage you on this, as I feel that discussing the effective democratic workings of a 450-million-person union is political in nature.
 
