Glory of the Giants' AI-Enhanced Art
<blockquote data-quote="Jfdlsjfd" data-source="post: 9091075" data-attributes="member: 42856"><p>Or even the good, old photocopy. Also, it should only matter if it's used for redistribution, because you can't rip anyone off just by consuming for yourself. I know it might be controversial for some, but a person who, say, downloads an image to his computer to print on his character sheet is doing no harm, since he's using the artwork privately for his own purposes. The result of this "mixing of points" could be an objection to a technology in general, or an objection to a particular source for models, foreclosing the possibility of using the technology in a non-infringing way (the "non copy-protected outcome" chosen in the US) or of purely private use, since you can't infringe copyright through private use.</p><p></p><p>Indeed, and I'd defend the word processor analogy. We don't regulate word processors, even though they can be used to retype the whole of Shadow of Abracadabra. We make it illegal to distribute your (home-typed) Shadow of Abracadabra books. It should be the same with AI. If it were somehow able to actively infringe, as in "write me a derivative work of an existing work, like the true ending of the Game of Thrones series", then it's the outcome that should be policed, not the tool that can type out the book.</p><p></p><p>It's a contentious point, for some at least. For example, some have claimed that Adobe "strongarmed" them into relinquishing their rights because they didn't know or understand what they were relinquishing, and others say they couldn't have understood AI back when they entered into the contract. So you'll find people appealing to ethics even after giving consent contractually. At the other end of the spectrum, you have the data-mining exception validated in the EU, where a non-profit (in many cases, public) research institution can basically ignore copyright as long as it doesn't benefit financially from the outcome. 
Some will find that acceptable (the EU, obviously, which weighed the overall benefit of having AI companies flourish and pay taxes for the greater good); others might find it unfair to rights holders who put their work on the Internet.</p><p></p><p>Indeed. I don't think, however, that anybody ever suggested (apart maybe from a few who equate training with theft) that creators lose any rights over their own work by allowing it to be used in training. It would be extreme if the works in the training data had to be transferred rather than merely licensed for this use. It would also prevent using public domain works (because you can't appropriate them). I could see some cases where the trainer would own exclusive rights over the material used in training: a private company that won't disclose anything about its database, or a country-wide effort (if China or India tasked its art teachers with each drawing a few specific pieces and captioning them well, much better than the LAION-5B crap, it wouldn't need as many works to create an effective training model) that might want to host the next "digital Hollywood without pesky actors and writers"... Those are fringe cases, though. And TBH, if a sovereign entity wanted a model like that, it would most probably create an exception for public training databases, much like the exception created for public libraries to lend books.</p><p></p><p>TBH I am rather pessimistic about the livelihoods of many, many people, including many in highly paid jobs. 
But how wealth is shared in a society is a political matter, so I won't say much more on this.</p><p></p><p>On the precise point of reproducing art in a way that doesn't infringe copyright, it is easily within reach (in manners satisfactory to the artists or not), especially since it's a transient problem: at some point there will be enough well-captioned, varied pictures in the public domain or under free licenses that developing a model with them will be hassle-free and not the sole purview of Adobe.</p><p></p><p>I am pretty sure that worrying regulations, or the fear thereof, will only spur companies to develop such databases sooner rather than later. It is also a problem that can be lessened by explaining better how the technology works. There are people who honestly think that models are hundreds of GB large because they must contain all the training images in compressed form (despite functional models existing that are smaller than a GB). I guess many people think that "in the style of greg rutkowski" is a prompt that produces art containing Greg Rutkowski's style, when that's not the case. It just happened that the auto-captioning of fantasy images displayed on ArtStation labelled them "art by greg rutkowski", irrespective of who drew the picture. So it is a keyword that steers the generation toward generic fantasy art, with no actual link to Greg Rutkowski. If the ArtStation art had been labelled as art by George Washington, you could ask the AI to draw dragons and pretty elves in the style of the late president's famous paintings... errr, or not. Dispelling these misconceptions would help allay the fear of plagiarism (anyone can check by interrogating CLIP with their own artworks; it will generally propose the styles of several artists, none of whom are you). I CLIP-ed a photograph of a colleague that I had just taken, and she was said to be "an artwork by X, Y and Z", where those three were 19th-century painters. 
I suppose my Android phone has some dead painters' souls spliced into it ;-)</p></blockquote><p></p>
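The post's claim that models are far too small to contain their training images can be sanity-checked with back-of-the-envelope arithmetic. A minimal sketch in Python; the ~4 GB checkpoint size and ~5 billion LAION-5B image count are rough figures assumed for illustration:

```python
# Back-of-the-envelope check: how much model capacity is available per
# training image? (Both figures below are rough, illustrative assumptions.)
model_bytes = 4 * 1024**3          # ~4 GB diffusion-model checkpoint
training_images = 5_000_000_000    # ~5 billion images in LAION-5B

bytes_per_image = model_bytes / training_images
print(f"{bytes_per_image:.3f} bytes per training image")  # → 0.859 bytes per training image
```

Well under one byte per image, versus tens of kilobytes for even an aggressively compressed JPEG, which is why a checkpoint cannot be a compressed archive of its training set.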