Artificial Intelligence and the Future of Human Endeavor

Scribe

Legend
It's also a question of how the AI interprets its directives. For example, let's say it's programmed to view humans as a net positive, and is told to find a way to save the planet so that humans aren't wracked by climate change.
Yeah, I mean, this is all sci-fi stuff that has been written about and discussed for decades. Even if there is a 'prime directive' of 'don't harm humans', it's one 'glitch' away from just killing all the humans anyway.

I don't know; for me, it's easily one of the most terrifying prospects of what we as humanity could inflict upon ourselves unwittingly.
 


payn

He'll flip ya...Flip ya for real...
Ever since humans lived in caves, we've had existential fears. Oh, but that is silly; X will really take us out. Just kidding, it's Y that will take us out. Though really, it's Z. On and on it goes. Sleep tight.
 

Scribe

Legend
Ever since humans lived in caves, we've had existential fears. Oh, but that is silly; X will really take us out. Just kidding, it's Y that will take us out. Though really, it's Z. On and on it goes. Sleep tight.

Right, but since the nukes, we've really had the stupidity to make it a reality. :p
 



payn

He'll flip ya...Flip ya for real...
I don't think so.

Then again, I guess I'm not really saying that everyone will be wiped out; there are self-sufficient folks around the world who will be fine.
Meteors, solar flares, etc. Extinction events are not solely of our own making. Sleep tight.
 

Dausuul

Legend
Help me out here. This is the one I have the most trouble with, because when I look at the balance sheet, I don't have 'humans' as a net positive for any view other than a selfishly human one.

No other creature on this planet is going, 'Yeah, we for sure need those humans,' other than the domesticated animals we have intentionally bent to the will of our species.

An AI which is self-aware, conscious, and able to 'think' is not going to look at humans as a net positive.
That's based entirely on your human values. What makes us so bad? Environmental damage, war, exploitation of other species and of one another? We think those are bad things, but that doesn't mean an AI would care about them at all.

Whether we are a net positive, negative, or neutral depends on what the AI values. And since we have presumably designed it to value "serving humans*," it's almost certain to regard at least some of us as a net positive.

*More specifically, "serving customers of the company that built you."
 

Hussar

Legend
The whole notion of the singularity was predicated on this.

I mean, heck, I'm teaching English to Japanese students. Language teaching is going to go the way of the buggy whip in very short order. I've always known this was coming - the only question was whether I could retire first, which I think I will be able to. But, like the SF singularity, any prediction we're making today is very likely wrong.

Think about something as simple as teaching. Most classrooms, from about, what, 5th grade into university, base significant chunks of the grading on reports, essays, that sort of thing. Right now there are AIs that can write essays that are certainly good enough for a passing grade, even at the undergraduate level - to the point that I have an AI program that will check essays to tell me whether they were written by an AI. :p

But that's not sustainable. Which means, IMO, we're going to have to go back to Socratic-style classrooms where the majority of the grading and evaluation is done face to face, in class (or possibly screen to screen for remote learning, but you get my point). But who's going to pay for that? Even in the stone ages when I went to university, some of my classes had over 200 students. I cannot imagine it's any better now.

Are we suddenly going to spend five or ten times more money on education than we do now? Good luck with that. AI-assisted learning?

Like I said, any prediction anyone is making today is just spitting in the wind. No one has the slightest idea what things will look like in 10 years. Heck, the first iPhone was only 15 years ago. Let that sink in a minute. 20 years ago, no smartphones. Now we cannot even imagine life without one. Do you really think you could have predicted the rise of the smartphone in, say, 1998? Hell, we still had beepers back then.

The rate of change just increases every year.
 

MarkB

Legend
AIs aren't going to have wants or needs anything like those of humans because our wants and needs are based on a whole host of things that we either can't or won't build into the design of an AGI:
  • A blind-idiot evolved processing system that combines very sophisticated neural networking with brute-force chemical triggers.
  • Millions of years' worth of evolved instincts and stimulus-responses, most of which were hard-coded long before we achieved actual intelligence and are therefore not designed to serve the needs of thinking beings.
  • A very specific set of sensory systems, none of which an AI will share with us one-for-one, whose inputs are not presented to our conscious mind unfiltered but instead are processed through subsystems with thousands of built-in biases and pattern-processing quirks and techniques.
  • A whole meat body that requires constant maintenance and management, to which much of our brain is dedicated without any conscious thought on our part, and whose needs impact our decision-making in thousands of subtle ways we're barely consciously aware of.
  • Well over a decade of neurological training through interaction with other members of our species, shaping our goals, attitudes, beliefs and values through mechanisms we still don't fully understand.
  • A basic but powerful sense of self-preservation that's the culmination of hundreds of millions of years of our ancestors being the ones who survived, prospered and procreated to breed the next generation.
What we get if we build a thinking being that doesn't incorporate any of these things, or which incorporates our own flawed impressions of them, is basically anyone's guess.
 

UngainlyTitan

Legend
Supporter
AIs aren't going to have wants or needs anything like those of humans because our wants and needs are based on a whole host of things that we either can't or won't build into the design of an AGI:
  • A blind-idiot evolved processing system that combines very sophisticated neural networking with brute-force chemical triggers.
  • Millions of years' worth of evolved instincts and stimulus-responses, most of which were hard-coded long before we achieved actual intelligence and are therefore not designed to serve the needs of thinking beings.
  • A very specific set of sensory systems, none of which an AI will share with us one-for-one, whose inputs are not presented to our conscious mind unfiltered but instead are processed through subsystems with thousands of built-in biases and pattern-processing quirks and techniques.
  • A whole meat body that requires constant maintenance and management, to which much of our brain is dedicated without any conscious thought on our part, and whose needs impact our decision-making in thousands of subtle ways we're barely consciously aware of.
  • Well over a decade of neurological training through interaction with other members of our species, shaping our goals, attitudes, beliefs and values through mechanisms we still don't fully understand.
  • A basic but powerful sense of self-preservation that's the culmination of hundreds of millions of years of our ancestors being the ones who survived, prospered and procreated to breed the next generation.
What we get if we build a thinking being that doesn't incorporate any of these things, or which incorporates our own flawed impressions of them, is basically anyone's guess.
QFT

But I should add that I do not believe that any current approach to AI will generate anything like sentience or intelligence except by some fluke.
That is not to say that AI will not be socially disruptive; the successful societies of the future will be the ones that handle the consequences.
 
