
AI is stealing writers’ words and jobs…

The Sigil

Mr. 3000 (Words per post)
I will just throw out here that, in my experience, for many people in creative endeavors it is the performing of the "boring jobs that can be automated" that provides them with the foundational understanding needed to perform the "advanced new jobs" - it is the hours of tedious coding that give the great coder the ability to truly grok more efficient ways to code things. It is the hours of tedious paralegal work that give the great lawyer the insight to truly revolutionize contract law. It is the tedious painting of simple forms by which the artist comes to understand angles and light and shading and anatomy, and so can create truly transcendent new artistic techniques.

If we automate the rote tasks used to gain competency in a field, we limit future participation in the field not to journeymen willing to put in the time and work to learn the basics, but instead to savants who intuitively grasp the medium and can create brilliance without ever needing to put in the time and effort to master the drudgery and basics. This will tend to push people to work in fields for which they are uniquely gifted (whether you feel this is good or bad depends on whether or not you feel people should be able to exercise agency to apply themselves to a field and substitute "experience and effort" for "natural aptitude," and whether or not you worry the word "only" is going to be inserted between "to" and "work in fields"). There is also the question of "what happens to people with a work ethic and willingness to learn who do not happen to have a savant aptitude in a field that pays well?"

This speaks directly to a lot of people, as the modern mythology (which I see bubbling up throughout this thread) is generally summed up as a general belief that "those who are willing to learn and do the (tedious) work will be rewarded." This tends to be followed by the assertion, "if you aren't being rewarded, you must either be unwilling to learn or unwilling to do the work." (Note, this conclusion IS true if the premise is true; however, I would suggest that the AI revolution, at the very least, should be quickly disabusing us of the notion that "those who are willing to learn and do the work will be rewarded" is true, because in general, AI is replacing humans who are willing to learn and do the work, thus leaving the humans it replaces looking for another path to reward.)

The problem as I see it is that we tend to attach a moral judgment to humans whose work is being replaced ("if you were virtuous, you would learn a new set of skills and do that work and be rewarded") but refuse to apply a balancing moral judgment to the technology doing the replacing. That is, instead of saying, "it is not virtuous to use technology to replace the virtue of hard work and dedication to learning a craft," we say, "the tech is not immoral, it is amoral - it doesn't matter whether or not you think the tech is virtuous, it WILL replace you, so stop complaining."

We instead ought to offset our realistic views (the tech IS coming, whether we like it or not) by jettisoning moral judgments of those being replaced, or perhaps by entirely rethinking whether or not we should be ascribing virtue to being "willing to learn and do the (tedious) work." And if we are going to decouple the idea of being willing to learn and do tedious work from being virtuous, what virtue do we intend to replace it with? And how do we align our societal reward structure (mostly in the form of money) to this new virtue system?

If the new virtue/reward system is "rewards should flow to the one who comes up with the best way to replace humans with technology under his control - thus gaining the rewards that used to flow to those humans," it seems we have constructed a tautology ("he is rewarded because he is virtuous and he is virtuous because he is rewarded"). If the system is a post-Roddenberry system ("being human is virtuous, and once we have produced sufficient technology that humans need not work, all the rewards of that technology ought to be distributed more or less equally - however you'd like to define "equally" - amongst humanity"), that is self-consistent, too, I suppose.

From this lens, the uproar is not about this particular round of technological replacement of human jobs as such (looks at buggy whip makers and car maker analogies) but instead that we are now being forced to confront the question, "do we still believe that the virtuous are being rewarded?" Our society is changing such that it is undeniably clear that being society's classical definition of a virtuous citizen ("someone who is willing to learn and do (tedious) work") is no longer sufficient to garner rewards. This is forcing us to come to grips with the uncomfortable reality that we must either re-define virtue (which makes us uncomfortable) or admit that our system of assigning rewards does not match our values (which also makes us uncomfortable). We are also struggling to create a mental framework by which we can express what one must do in order to garner a share of rewards sufficient to live comfortably as we confront the realities of the current system.

Now, I don't have an answer for what the system SHOULD be, or even what the prerequisites are NOW to make sure you receive rewards that allow you to live comfortably. The best I can say now is "hope you are lucky enough to have talents that are in demand," but to those who say, "I have no talents that are in sufficient demand," I have no answer - no method to say, "do this and you can develop such talents." I think that's why a lot of people are terrified at what this means - because we don't know what to do for ourselves to ensure we continue to be "virtuous" enough in the eyes of the new society to be rewarded (and even if we think we do now, deep down we know we can't be assured we still will in the future).

All I can do is observe and opine that over the next 20 or so years, there will be a great struggle for society to redefine the parameters of "what is virtuous" and "how much reward does technology deserve," and I also think there will be a lot of collateral damage as that happens, with a lot of human misery. This doesn't mean I'm a "prepper" or "apocalyptic," but I am doing what I can with my resources now to minimize my risk against changing world circumstances (for example, one of my priorities - which I admittedly may not be able to achieve as quickly as I'd like - is to reduce my debt load by doing things like paying off mortgages, so that at least if my circumstances change I am more or less assured of having a roof over my head).

So, yeah, no great solutions offered here, merely observations that those who are saying "technology is coming for your job" aren't really comforting those whose jobs technology is coming for, and the oft-repeated platitude of "prepare for a new/better job" may not be realistic for everyone... and perhaps more energy should be spent trying to help define a new paradigm for those people to exist in than simply dismissing them, since the old "learn stuff and work hard" may no longer be available.

I should also note that I do think there are upsides to freeing ourselves from doing tedious tasks (I have experienced such in my own employment); however, it does make one uncomfortable to let go of tedious tasks that keep one employed without a clear vision of what the new tasks that will keep one employed are to become.
 


overgeeked

B/X Known World
To me, it is worse. I see nothing good in endlessly recycling what has gone before, in preference for what could be. There's an awful lot of SF based on the concept of humankind becoming complacent and letting machines do literally everything for us, leading to our downfall. Now I guess such fiction will be produced ironically?
By the machines for us because we can't be arsed to learn to write.
 



Andvari

Hero
I don't understand this. Teachers/professors are constantly using copyrighted materials to teach students. Why would it suddenly become illegal if the student is AI?
When AI learns the material, I'm guessing it essentially has a 1:1 copy of the material which is then stored digitally. Similar to how it is illegal to store the data of a movie you don't own. But you can still watch a movie and "store it" in your brain. Although perhaps not if you are Johnny Mnemonic.
 

TheSword

Legend
I don't understand this. Teachers/professors are constantly using copyrighted materials to teach students. Why would it suddenly become illegal if the student is AI?
I’m sure that will be one of the arguments the AI companies use. I also think that’s why we will end up resorting to new legislation, because existing copyright law will prove unsatisfactory.
 

I don't understand this. Teachers/professors are constantly using copyrighted materials to teach students. Why would it suddenly become illegal if the student is AI?

Yes, and in theory the teachers could get called out over this, but they don't because they are small fries: a teacher doing a public showing of a movie isn't something studios care about. Some people might care about people handing out photocopies of articles, but generally speaking this is done at such a personal level that it's not a threat to a scholar's way of life.

AI is rather different: it's using someone's copyrighted work for someone else's profit, and when that work is directly derived from your own, that's a bit different. It's not some teacher copying an article or taking a movie clip; it's a corporation using your stuff to imitate you in a way you didn't consent to. The scale of things is different, as are the usages. Me using someone's art in class is not a threat to the artist; a corporation creating an algorithm that scans all of an artist's art without their permission and uses it to assemble a pseudo-version is.

It's just a really flawed comparison.
 


Schools pay for the copyrighted material.

Well, we do for a lot of things. But sometimes I might copy an article or something similar for a paper or for a class. I can cite it however I want, but I'm still using copyrighted material for my class that I don't have permission to use (well, sometimes; some materials give specific permission for educational purposes). But even then, that doesn't really change the argument or the result: yes, they could stop me if they wanted to. It's within their rights! They don't, because my usage is so small that it barely matters. It's just not the same scale.

It's like comparing a massive torrent site to a guy photocopying sections of a rulebook so that everyone in his group can have access to that section without owning the book. One is close to an actual problem; the other is not.
 

Ryujin

Legend
Why? Because there are specific educational copyright exemptions, with very narrow use cases, that generally do not extend to actually copying those works in the mechanical or data sense.

When I was in the second year of my Electronics Technology course in college, an instructor started handing out entire photocopied sections of a textbook that he preferred to the one the college had instructed us to buy. We raised holy hell over it. He got smacked down.

 
