D&D General Plagiarised D&D art

Umbran

Mod Squad
Staff member
Supporter
They are not used more effectively, they are replaced. What better, higher paying jobs will they do?

The jobs that are lost to automation generally are replaced by jobs building and supporting technology.

Yes, the people who made buggy whips lost their jobs. Those jobs went away. But, the automotive industry, even with its automation, employs far more people than making buggies and buggy-related equipment ever did. The unfortunate truth is that there's a generational change in this - the new jobs are for new folks who get educated in the skills for the new jobs.

The solutions for the people who used to make buggy-whips exist, but lie in the realm of politics, so we can't get into them here.
 



Lanefan

Victoria Rules
It is a vast improvement. Mostly, the professional media has been sidelined in favour of amateur media, both good and bad.

I don't watch the likes of Fox News or CNN or any other main outlet as I think they are worthless when it comes to reporting news.
If major media outlets are worthless at reporting news, then who in your view is worthwhile at it?
 

Lanefan

Victoria Rules
It very much is soulless. The AI generated art is imitation, it is not even art.
Here I disagree: AI art is still art, just as much as if a person did it.

It's just the manner of its creation that's different and - arguably - dubious.
I've been playing with DALL-E 3 now for weeks. I have thousands of images generated to see what it could do. For my dollar (if I were to spend money on it), it's already better than content which WotC sold at premium prices.
For my purposes, that's all I need.
So if some untrained casual (myself) can take a free product and generate images that, to his casual eye, are ALREADY better than those from an artist paid by the largest employer in the space (WotC), how is that spawning more jobs? How is that helping actual artists?
It isn't.

But if my options are to not pay an artist and end up with no art, or to not pay a computer and yet still end up with some art, the latter seems by far the preferable choice. :)
It could, if it wasn't going to be under the control of the major tech companies who have already demonstrated that they will control the narrative, build the algorithms to benefit themselves, and wield influence that dictators around the world could only dream about 10 years ago.

Do you honestly think 'the people' are going to control AI?
With this, you're on to a much more valid objection IMO.
 


Lanefan

Victoria Rules
Yeah, no. Art is expression. These non-sapient AIs are not capable of expression, thought, or emotion.
A photograph is art, and for some reason copyrightable even though anyone else could in theory take the same photo.

No expression involved - I just point my camera and click.

Expression can be an element of art, certainly, and very often is; but it's an optional element.
 

So I don't want to sound like a Luddite, but I probably will.

I am somewhat sceptical that the AI boom will, in the long run, lead to new jobs for humans, at least at the pace it replaces them. Comparisons to technological advances of the past have been made. In most cases those were about lessening physical labour, moving human jobs more into the creative/planning/intellectual stage. AI seems different. It does not think, but it will nevertheless replace jobs that require thought. And I am afraid the direction it will push the form of human labour will be the opposite of what technological advance has historically produced. AI will program, create art, invest, and develop scientific theories, whilst the majority of humanity will work in the intellectually unfulfilling drudgery of menial jobs that are not cost-effective to robotise.
 

AI will program, create art, invest, and develop scientific theories, whilst the majority of humanity will work in the intellectually unfulfilling drudgery of menial jobs that are not cost-effective to robotise.
So here's the problem with your doomsaying: current AI is absolutely terrible at most of this because of the way it works, and that's not going to change. You are acting as though AI is intelligent, but it's not.

It's just automation. That's all we have right now - attempts to replicate human processes by essentially brute-force-based automation. This isn't SkyNet, it's not HAL 9000, it's not even Johnny Five. It's predictive text writ large.

Current AI cannot do any of this stuff except by copying humans, because it's just brute-force automation. There's no real intelligence here. What's changed isn't some sort of amazing principle, just that it's easier than ever to operate in this brute-force paradigm. Let's look at your examples:

Programming - Current AI cannot innovate in programming. It can only copy what people have already done, and it has an unfortunate tendency to screw it up severely, especially if it's asked to do anything novel. It can save you a lot of time, and if what you're doing is trivial or repetitive programming based on a very standard approach, it could make a big difference, and potentially replace some jobs. But the idea that current AI is going to replace programmers generally? No.

Create art - No. Current AI is, again, just copying humans. An awful lot of what it's doing is essentially laundering plagiarism. This can be, and has been, discussed in a lot more detail. Further, the art it creates tends to be boring, unoriginal, kind of messed-up, and so on. A lot of people will overlook this for stuff like cheap/free character portraits or [politician] doing [funny thing], but current AI isn't "creating art" in any meaningful way. Further, there's strong societal resistance to this, and no sign of it eroding - rather, it's strengthening - partly because so much AI art is just bad and ugly and stupid, especially if you do more than glance at it.

Invest - AI is causing absolute chaos trying to do this, sure, but is it replacing stockbrokers etc. effectively? Not really at this point, in part because it's so chaotic, and the more AI interacts with the market, the more chaos and idiocy it causes. I think we'll see very serious harm done by AI being used to attempt "investment" (not really investment, just screwing around with the stock market), and it'll probably cause a crash at some point, but current AI replacing these jobs? No. I also expect regulation to cause some issues for AI here, especially if the NYSE or something gets crashed by it.

Develop scientific theories - Flatly no. Current AI is entirely incapable of this. I am not sure where you are getting this from. AI is good for going through stuff that requires insane numbers of iterations and is extremely boring to humans, like helping to figure out protein folding. That's not developing theories. This is already heavily automated, and stuff like quantum computing is going to make a big impact too, but no, current AI cannot "develop scientific theories".

So yeah, in our current era, this is all not worth worrying about. For AI to really change things here, we'd need true AI - actual intelligence - the ability to understand things, to contemplate things, not just respond to input. Not brute-force stuff, which is essentially what all the current flashy AI stuff is: it's all predictive text of one kind or another. And my strong suspicion is that when we do get true AI, anywhere from 10 to 1000 years from now, the technology needed to simulate something roughly as smart as a smart human will require a huge amount of extremely expensive manufacturing and a ton of energy to run (at least initially). Certain problems will absolutely benefit from it, if we can get it to be cooperative, but mass replacement of jobs? Not soon.

I absolutely hate to say this, but this is really a "Who moved my cheese?" situation still. LLMs and similar predictive-text-type AI (which includes the art stuff, to be clear) will cause job losses, but they require so much fiddling to use that a whole bunch of other jobs will appear to manage them, check their outputs, and, frankly, probably to generate content to feed into them. Because here's the thing - there's a limited amount of non-AI-generated material out there, especially material that's genuinely legally available (an issue that will become more and more painful for AI firms, especially as government regulation increasingly comes in), and we've already seen AI starting to eat its own tail, as it were. Musk's Grok AI developed a bunch of behaviours because it was just copying other AI-generated material. This will really limit what these kinds of AIs can do unless you can give them more, new material. And where could that come from? From people. Because if you try to use AI for it, you just create an increasingly bad GIGO cycle.

True AI might change this. True AI is not even on the horizon. None of the technologies which make up LLM-type AI are really the same tech that would make a true AI. This is kind of a dead-end tech, even. It'll get refined - slowly - with better training data and a ton of manual interventions to change specific behaviours - indeed, that may become a whole industry unto itself, basically altering how LLM AIs work to make their outputs more useful. But you're not going to see true AI just evolve out of this.

(The GIGO/Ouroboros factor is likely to cause big problems for AI art too in the future, because guess what isn't protected by copyright currently (nor should it be), and is spreading like wildfire over scrape-able sources? AI art. And don't think AI art generators have carefully chosen training data - they do not - they're hugely indiscriminate. I'm sure they're already eating their own tail here, and the more they eat it, the worse it's going to get: the more AI art appears on scrape-able sources, the more repetitive the output will become.)
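To make that feedback-loop worry concrete, here's a toy simulation (Python; the "styles" and sample sizes are made up, and nothing here models any real image generator). Each generation, a crude frequency model is re-fit on samples of its own output; sampling noise means rarer styles tend to drop out and never come back, so variety only shrinks.

```python
# Crude caricature of a generative model repeatedly retrained on its own
# output. Ten equally common "styles" stand in for artistic variety.
# Each generation we re-fit simple frequencies, then "generate" the next
# dataset by sampling from them; any style that misses a generation is
# gone for good, so diversity can only decrease.
import random
from collections import Counter

random.seed(0)

styles = [f"style_{i}" for i in range(10)]
data = styles * 100  # generation 0: plenty of "human-made" variety

for generation in range(10):
    counts = Counter(data)
    print(f"generation {generation}: {len(counts)} styles still represented")
    population = list(counts)
    weights = [counts[s] for s in population]
    # Retrain-on-own-output step: the next dataset is just samples drawn
    # from the current model's learned frequencies.
    data = random.choices(population, weights=weights, k=20)
```

Obviously real systems filter and curate, but the direction of the pressure is the same: whatever the model over-produces is what it sees most of next time around.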
 

Umbran

Mod Squad
Staff member
Supporter
Do the people who make the 'horse and carriage' comparisons not realize how many dead-end technologies showed up and failed before they were supplanted? You might be backing zeppelins, not cars.

Sure, there have been, and will be, dead-end technologies. Zeppelins are probably a poor example, in that we can view airplanes as supplanting them, but I get the idea of the dead end.

But, to be clear, I am not "backing" any particular technology. I have no personal emotional investment in generative AI. If it curls up and dies, that's fine by me.

I am not backing the tech - I am backing acceptance of the fact that the only constant in the world is change, and that the root issues lie in how we deal with change, not in any particular technology.
 

Umbran

Mod Squad
Staff member
Supporter
It's just automation. That's all we have right now - attempts to replicate human processes by essentially brute-force-based automation. This isn't SkyNet, it's not HAL 9000, it's not even Johnny Five. It's predictive text writ large.

You are entirely correct, that generative AI is not SkyNet, or HAL 9000.

But neither is it brute-force automation. That's colorful, but it's a thoroughly inaccurate description of the technology. Generative AI is, in fact, an attempt to avoid brute force by using empirical training to make what amounts to an educated guess.
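If it helps, here is a minimal sketch of what "empirical training to make an educated guess" means in practice - a toy next-word predictor in Python. The corpus, the counting scheme, and the educated_guess function are all invented for illustration; real generative models are vastly more sophisticated, but the basic move is the same: learn frequencies from examples, then guess the likeliest continuation rather than enumerating possibilities by brute force.

```python
# Toy illustration of "empirical training -> educated guess".
# This is nothing like a real LLM; it just counts, in a tiny made-up
# corpus, which word tends to follow which, then predicts the most
# likely next word instead of searching every possibility.
from collections import Counter, defaultdict

corpus = (
    "the dragon hoards gold . "
    "the dragon breathes fire . "
    "the wizard studies fire magic ."
).split()

# "Training": tally how often each word follows each other word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def educated_guess(word: str) -> str:
    """Return the most likely next word seen during training."""
    candidates = follows.get(word)
    if not candidates:
        return "<unknown>"
    return candidates.most_common(1)[0][0]

print(educated_guess("the"))     # 'dragon' (seen twice vs. 'wizard' once)
print(educated_guess("dragon"))  # 'hoards' or 'breathes' (a learned coin flip)
```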
 
