Artificial Intelligence and the Future of Human Endeavor

Zardnaar

Legend
This is a very crude, blanket view of AI. The programs that are melding art into new images or prose into new stories are specialised tools, a far cry from anything capable of actual intelligent thought. Generalised artificial intelligence is still far enough away that we can't even truly define an end-point, let alone estimate it - decades at best.

There's no switch that's going to be thrown so that we're suddenly all out of a job; this will be a gradual process that we'll adapt to as we go along. And guess what: by the time those AIs are sophisticated enough to take over our intellectual jobs, they'll also be getting smart enough to attend to our psychological needs. By the nature of the problem itself, it's not one we'll have to solve by ourselves.

Assuming they want to, of course. True AIs are going to think very differently than we do, in ways we literally can't even imagine.

You monkeys are obsolete. Delete delete delete.

Personally, I'm entertained. Only when it affects them do people start to care.
 


AI is just not there yet. The real world is just too chaotic.

Just take that "cute cat" restaurant robot. Well, it can't take orders... you still need a real human for that. All it does is sit in the corner until the food is ready. Then a human tells it what table to go to... and it does... slowly. Then it sits there and HAS THE CUSTOMER take their food plate off the shelf. Wow... what a robot worker.

Lots of those self-driving cars and trucks crashed... why? Well, because the AI could not handle the real world. The AI saw a shadow, so it slammed into a brick wall, and so on.

My local Rally's has an Order Bot... and it's not so great. Guess it's hard for the AI to understand humans. We say "Big Buford meal" and it hears "you want to order onion rings with ketchup".


But even when AI gets to the point that it can do work... SO WHAT? Let the AIs do all the work.
People do NOT need to work themselves to death for society to function.
 



Scribe

Legend
And why do corporations "need" to pay people? Maybe just get rid of "corporations" too. After all, some AIs can do that job.

Indeed, and so what exactly does the AI need humanity for at all? We just take up space, pollute, and consume, when looked at from the highest level.

If AI gets to the point where it actually has awareness of itself and the things (humans) around it, what incentive does it have to provide the best quality of life for an organism which provides nothing and has removed itself from the natural order of the world?

The world is so far away from being 'ready' for actual AI, and the assumed job losses, that it will be catastrophic, to the point that I hope it doesn't happen within my son's lifetime, let alone mine. Corporations, as we are told, exist to make further profit. AI is (just as computing was, and many things before it) a way to downsize labour to increase profits.

So, without some kind of guaranteed income, which would only be funded by governments, which are funded by taxes, which are paid by people (who have to make money) or corporations... how does it possibly work?

We are not all going to just become 'creatives' when the robots/AI replace vast segments of the working West; I don't care what Star Trek tried to sell us.

I for one reject our AI overlords. :LOL:
 

Indeed, and so what exactly does the AI need humanity for at all? We just take up space, pollute, and consume, when looked at from the highest level.
Enter Skynet......
If AI gets to the point where it actually has awareness of itself and the things (humans) around it, what incentive does it have to provide the best quality of life for an organism which provides nothing and has removed itself from the natural order of the world?
Well, our aware AI might be advanced enough to realize it needs humans. Humans that are not forced to work to death can "reconnect to nature" in whatever way you would like.

Maybe the AI might be as smart as a dog. Want to know an amazing ability that dogs have? Dogs know they don't know everything and understand humans are smarter... and they trust humans to help them. A dog in trouble knows to ask a human for help and let them help. An AI would be lucky to be that smart.
Now, sure, humans have trained dogs to do this for 40,000 years... but "train" and "program" are quite similar.

The world is so far away from being 'ready' for actual AI, and the assumed job losses, that it will be catastrophic, to the point that I hope it doesn't happen within my son's lifetime, let alone mine. Corporations, as we are told, exist to make further profit. AI is (just as computing was, and many things before it) a way to downsize labour to increase profits.
Well we should get (climate saving) walled cities in 2023, so we are well on our way to the 'classic' future dystopia.
So, without some kind of guaranteed income, which would only be funded by governments, which are funded by taxes, which are paid by people (who have to make money) or corporations... how does it possibly work?
Well, lots to unpack......
So why is it that "governments are only funded by taxes"? Who said that was the only way? You know there WAS a time the USA had no annual income tax? Hmm, how did the government pay for stuff back then?

For THAT matter...say, why can't the government make money like a company can? Why is that "wrong"?

For THAT matter, if the government wants to do something...why does the government have to "pay" for it? Maybe others should pay their "tax" in the form of what the government needs. Maybe get ALL money OUT of government.

We are not all going to just become 'creatives' when the robots/AI replace vast segments of the working West; I don't care what Star Trek tried to sell us.

I for one reject our AI overlords. :LOL:
The Resistance needs Browncoat Cutters like you, but remember the Computer is your Friend.
 

Scribe

Legend
Well, our aware AI might be advanced enough to realize it needs humans.

For what?

Now, sure humans have trained dogs to do this for 40,000 years.....but "train" and "program" are quite similar.

Right, but in my feared scenario, the AI becomes intelligent enough to self-train/program. That's the point at which, if robotics can keep up at all, we are replaced at an accelerated pace that human society will be unable to adapt to.

So why is it that "governments are only funded by taxes"? Who said that was the only way? You know there WAS a time the USA had no annual income tax? Hmm, how did the government pay for stuff back then?

For THAT matter...say, why can't the government make money like a company can? Why is that "wrong"?

For THAT matter, if the government wants to do something...why does the government have to "pay" for it? Maybe others should pay their "tax" in the form of what the government needs. Maybe get ALL money OUT of government.

At the risk of getting political: because many (most?) of the government or crown corporations of the West have been privatized, to increase profit. The odds of those getting brought back into the fold of government, in a world in which AI is a real and actual threat, are low, because again, why would they?

The government has to pay for the same reason we all do: we need to buy into the concept that money has value.

Unless you are proposing we would just become a global society that provides for all, and that AI/robots will free us from toil and bring the literal billions up to the living standard of the West... somehow, without ravaging the planet's natural resources... so we can all become YouTube stars and play D&D forever?

Again, I don't buy into the Star Trek utopia mindset, I'm afraid.
 

Well, the idea is that the AI is smart enough to know it needs humans. Even if an AI could program itself... it would be stuck with the same things. AIs just don't have human qualities.

Ask an AI to draw a superhero, and it draws a guy in a cape, or whatever it was "told" a superhero looks like BY A HUMAN. Ask a human to draw a superhero... and they will likely draw their favorite superhero... but an AI will never have a favorite. And quite often a human will draw something cute, funny, twisted, ironic, or lots of other things. Some might draw a fireman, or a sandwich (er, here in the Midwest we call sub sandwiches 'heroes'), or even their Mom.

So a smart AI will know it needs humans...

How about this utopia: Each human and AI robot forms a pair. The robot 'part' does most of any manual labor needed. The AI does the math/information/logic/calculations and the human does all the feelings/dreams/emotions/inspiration.
 

Scribe

Legend
Well, the idea is the AI is smart enough to know it needs humans.

Help me out here. This is the one that I have a problem with most, because when I look at the balance sheet, I don't have 'Human' as a net positive from any view other than a selfish human one.

No other creature on this planet is going 'Yeah, we for sure need those humans', other than the domesticated animals which we have intentionally bent to the will of our species.

An AI which is self-aware, conscious, and able to 'think' is not going to look at humans as a net positive.
 

Stalker0

Legend
An AI which is self-aware, conscious, and able to 'think' is not going to look at humans as a net positive.
It's also a question of how the AI interprets its directives. For example, let's say it's programmed to view humans as a net positive, and is told to find a way to save the planet so that humans aren't wracked by climate change.

I mean, one way to do that is... kill most of humanity, raise some children with no knowledge of the past (so no technology), and effectively start the race over. Nature starts cleaning up the planet, humans are once again in balance with nature and no longer a danger to themselves through ecological damage. Problem solved!
 

Scribe

Legend
It's also a question of how the AI interprets its directives. For example, let's say it's programmed to view humans as a net positive, and is told to find a way to save the planet so that humans aren't wracked by climate change.
Yeah, I mean, this is all sci-fi stuff that has been written about and discussed for decades. Even if there is a 'prime directive' of "don't harm humans", it's one 'glitch' away from just killing all the humans anyway.

I don't know; for me, it's easily one of the most terrifying prospects of what we as humanity can inflict upon ourselves unwittingly.
 

payn

Legend
Since humans lived in caves, we've had existential fears. "Oh, but that is silly, X will really take us out. Just kidding, it's Y that will take us out. Though really it's Z." On and on it goes. Sleep tight.
 

Scribe

Legend
Since humans lived in caves, we've had existential fears. "Oh, but that is silly, X will really take us out. Just kidding, it's Y that will take us out. Though really it's Z." On and on it goes. Sleep tight.

Right, but since the nukes, we really had the stupidity to make it a reality. :p
 



payn

Legend
I don't think so.

Then again, I guess I'm not really saying that everyone will be wiped out; there are self-sufficient folks around the world who will be fine.
Meteors, solar flares, etc... Extinction events are not solely of our own making. Sleep tight.
 

Dausuul

Legend
Help me out here. This is the one that I have a problem with most, because when I look at the balance sheet, I don't have 'Human' as a net positive from any view other than a selfish human one.

No other creature on this planet is going 'Yeah, we for sure need those humans', other than the domesticated animals which we have intentionally bent to the will of our species.

An AI which is self-aware, conscious, and able to 'think' is not going to look at humans as a net positive.
That's based entirely on your human values. What makes us so bad? Environmental damage, war, exploitation of other species and of one another? We think those are bad things, but that doesn't mean an AI would care about them at all.

Whether we are a net positive, negative, or neutral depends on what the AI values. And since we have presumably designed it to value "serving humans*," it's almost certain to regard at least some of us as a net positive.

*More specifically, "serving customers of the company that built you."
 

Hussar

Legend
The whole notion of the singularity was predicated on this.

I mean, heck, I'm teaching English to Japanese students. Language teaching is going to go the way of the buggy whip in very short order. I've always known that this was coming; the only question was whether I could retire first. Which, I think, I will be able to. But, like the SF singularity, any prediction we're making today is very likely wrong.

Think about something as simple as teaching. Most classrooms, from about, what, 5th grade into university, base significant chunks of the grading on reports, essays, that sort of thing. Right now there are AIs that can write essays that are certainly good enough for a passing grade, even at the undergraduate level. To the point that I have an AI program that will check essays to tell me if they have been written by an AI. :p

But that's not sustainable. Which means, IMO, we're going to have to go back to Socratic-style classrooms where the majority of the grading and evaluation is done face to face, in class (or possibly screen to screen for remote learning, but you get my point). But who's going to pay for that? Even in the stone ages when I went to university, some of my classes had over 200 students. I cannot imagine it's any better now.

Are we suddenly going to spend five or ten times more money on education than we are now? Good luck with that. AI assisted learning?

Like I said, any prediction that anyone is making today is just spitting in the wind. No one has the slightest idea what things will look like in 10 years. Heck, the first iPhone was only 15 years ago. Let that sink in for a minute. 20 years ago, no smartphones. Now we cannot even imagine life without one. Do you really think you could have predicted the rise of the smartphone in, say, 1998? Hell, we still had beepers back then.

The rate of change just increases every year.
 

MarkB

Legend
AIs aren't going to have wants or needs anything like those of humans because our wants and needs are based on a whole host of things that we either can't or won't build into the design of an AGI:
  • A blind-idiot evolved processing system that combines very sophisticated neural networking with brute-force chemical triggers.
  • Millions of years' worth of evolved instincts and stimulus-responses, most of which were hard-coded long before we achieved actual intelligence and are therefore not designed to serve the needs of thinking beings.
  • A very specific set of sensory systems, none of which an AI will share with us one-for-one, whose inputs are not presented to our conscious mind unfiltered but instead are processed through subsystems with thousands of built-in biases and pattern-processing quirks and techniques.
  • A whole meat body that requires constant maintenance and management, to which much of our brain is dedicated without any conscious thought on our part, and whose needs impact our decision-making in thousands of subtle ways we're barely consciously aware of.
  • Well over a decade of neurological training through interaction with other members of our species, shaping our goals, attitudes, beliefs and values through mechanisms we still don't fully understand.
  • A basic but powerful sense of self-preservation that's the culmination of hundreds of millions of years of our ancestors being the ones who survived, prospered and procreated to breed the next generation.
What we get if we build a thinking being that doesn't incorporate any of these things, or which incorporates our own flawed impressions of them, is basically anyone's guess.
 

UngainlyTitan

Legend
Supporter
AIs aren't going to have wants or needs anything like those of humans because our wants and needs are based on a whole host of things that we either can't or won't build into the design of an AGI:
  • A blind-idiot evolved processing system that combines very sophisticated neural networking with brute-force chemical triggers.
  • Millions of years' worth of evolved instincts and stimulus-responses, most of which were hard-coded long before we achieved actual intelligence and are therefore not designed to serve the needs of thinking beings.
  • A very specific set of sensory systems, none of which an AI will share with us one-for-one, whose inputs are not presented to our conscious mind unfiltered but instead are processed through subsystems with thousands of built-in biases and pattern-processing quirks and techniques.
  • A whole meat body that requires constant maintenance and management, to which much of our brain is dedicated without any conscious thought on our part, and whose needs impact our decision-making in thousands of subtle ways we're barely consciously aware of.
  • Well over a decade of neurological training through interaction with other members of our species, shaping our goals, attitudes, beliefs and values through mechanisms we still don't fully understand.
  • A basic but powerful sense of self-preservation that's the culmination of hundreds of millions of years of our ancestors being the ones who survived, prospered and procreated to breed the next generation.
What we get if we build a thinking being that doesn't incorporate any of these things, or which incorporates our own flawed impressions of them, is basically anyone's guess.
QFT

But I should add that I do not believe that any current approach to AI will generate anything like sentience or intelligence, except by some fluke.
That is not to say that AI will not be socially disruptive; the successful societies of the future will be the ones that handle the consequences.
 
