The Sigil
I will just throw out here that in my experience, for many people in creative endeavors it is the performing of the "boring jobs that can be automated" that provides them with the foundational understanding needed to perform the "advanced new jobs" - it is the hours of tedious coding that give the great coder the ability to truly grok more efficient ways to code things. It is the hours of tedious paralegal work that give the great lawyer the insight to truly revolutionize contract law. It is the tedious painting of simple forms by which the artist comes to understand angle and light and shading and anatomy and can then create truly transcendent new artistic techniques.
If we automate the rote tasks used to gain competency in a field, we limit future participation in the field not to journeymen willing to put in the time and work to learn the basics, but instead to savants who intuitively grasp the medium and can create brilliance without ever needing to put in the time and effort to master the drudgery and basics. This will tend to push people to work in fields for which they are uniquely gifted (whether you feel this is good or bad depends on whether or not you feel people should be able to exercise agency to apply themselves to a field and substitute "experience and effort" for "natural aptitude," and on whether or not you worry the word "only" is going to be inserted between "to" and "work in fields"). There is also the question of "what happens to people with a work ethic and a willingness to learn who do not happen to have a savant aptitude in a field that pays well?"
This speaks directly to a lot of people, as the modern mythology (which I see bubbling up throughout this thread) is generally summed up as the belief that "those that are willing to learn and do the (tedious) work will be rewarded." This tends to be followed up with, "if you aren't being rewarded, you must either be unwilling to learn or unwilling to do the work." (Note, this conclusion IS true if the premise is true; however, I would suggest that the AI revolution should, at the very least, be quickly disabusing us of the notion that "those that are willing to learn and do the work will be rewarded," because in general, AI is replacing humans who are willing to learn and do the work, thus leaving the humans it replaces looking for another path to reward.)
The problem as I see it is that we tend to attach a moral judgment to humans whose work is being replaced ("if you were virtuous, you would learn a new set of skills, do that work, and be rewarded") but refuse to apply a balancing moral judgment to the technology doing the replacing (i.e., instead of saying, "it is not virtuous to use technology to replace the virtue of hard work and dedication to learning a craft," we say, "the tech is not immoral, it is amoral - it doesn't matter whether or not you think the tech is virtuous, it WILL replace you, so stop complaining").
I think we instead ought to offset our realistic views (the tech IS coming, whether we like it or not) by jettisoning moral judgments of those being replaced, or perhaps by entirely rethinking whether or not we should be ascribing virtue to being "willing to learn and do the (tedious) work." And if we are going to decouple the idea of being willing to learn and do tedious work from being virtuous, what virtue do we intend to replace it with? And how do we align our societal reward structure (mostly in the form of money) to this new virtue system?
If the new virtue/reward system is "rewards should flow to the one who comes up with the best way to replace humans with technology under his control - thus gaining the rewards that used to flow to those humans," it seems we have constructed a tautology ("he is rewarded because he is virtuous and he is virtuous because he is rewarded"). If the system is a post-Roddenberry system ("being human is virtuous, and once we have produced sufficient technology that humans need not work, all the rewards of that technology ought to be distributed more or less equally - however you'd like to define "equally" - amongst humanity"), that is self-consistent, too, I suppose.
From this lens, the uproar is not about this particular round of technological replacement of human jobs as such (looks at buggy whip maker and car maker analogies) but instead that we are now being forced to confront the question, "do we still believe that the virtuous are being rewarded?" - because our society is changing such that it is undeniably clear that meeting society's classical definition of a virtuous citizen ("someone who is willing to learn and do (tedious) work") is no longer sufficient to garner rewards. This forces us to come to grips with the uncomfortable reality that we must either redefine virtue (which makes us uncomfortable) or admit that our system of assigning rewards does not match our values (which also makes us uncomfortable). We are also struggling to create a mental framework by which we can express what one must do in order to garner a share of rewards sufficient to live comfortably as we confront the realities of the current system.
Now, I don't have an answer for what the system SHOULD be, or even what the prerequisites are NOW to make sure you receive rewards that allow you to live comfortably - the best I can say is "hope you are lucky enough to have talents that are in demand." But to those who say, "I have no talents that are in sufficient demand," I have no answer - no method to say, "do this and you can develop such talents" - and I think that's why a lot of people are terrified at what this means: because we don't know what to do for ourselves to ensure we continue to be "virtuous" enough in the eyes of the new society to be rewarded (and even if we think we do now, deep down we know we can't be assured we still will in the future).
All I can do is observe and opine that over the next 20 or so years, there will be a great struggle for society to redefine the parameters of "what is virtuous" and "how much reward does technology deserve," and I also think there will be a lot of collateral damage as that happens, with a lot of human misery. This doesn't mean I'm a "prepper" or "apocalyptic," but I am doing what I can with my resources now to minimize my risk against changing world circumstances (for example, one of my priorities - which I admittedly may not be able to achieve as quickly as I'd like - is to reduce my debt load by doing things like paying off mortgages, so at least if my circumstances change I am more or less assured of having a roof over my head).
So, yeah, no great solutions offered here, merely the observation that those saying "technology is coming for your job" aren't really comforting those whose jobs technology is coming for, and the oft-repeated platitude of "prepare for a new/better job" may not be realistic for everyone... and perhaps more energy should be spent trying to help define a new paradigm for those people to exist in than simply dismissing them, since the old "learn stuff and work hard" path may no longer be available.
I should also note that I do think there are upsides to freeing ourselves from tedious tasks (I have experienced such in my own employment); however, it does make one uncomfortable to let go of the tedious tasks that keep one employed without a clear vision of what the new tasks that will keep one employed will be.