[OT] Machines become sentient?

Some nice quotes I found for the sceptics among us:

"Woman may be said to be an inferior man"
Aristotle

"[Louis Pasteur's] theory of germs is a ridiculous fiction"
Pachet, P., Professor of Physiology, 1872

"Everything that can be invented has been invented"
Duell, C., U.S. Patent Office, 1899

"The theory of relativity [is] worthless and misleading"
See, T., U.S. Government Observatory, 1924

"Space travel is utter bilge"
Woolley, R., Astronomer Royal, 1956

"The cloning of mammals... is biologically impossible"
McGrath, J. & Solter, D., Genetic Researchers, 1984


The list goes on... this is my fav:

"There is no reason for any individual to have a computer in their home"
Olsen, K., Digital Equipment Corp., 1977
 


MaxKaladin said:


The problem with this is that, for many people, this would not be a perfect society and they would never voluntarily adopt it.

Then, the machine would reject that society, and find one people would adopt.

Let's try to logic this out, shall we?

Do all human beings share a basic nature, a genuine 'humanness' which transcends culture and upbringing?

If not...let's just launch the nukes now, get it over with.

If so... then there is a basis for an ideal society, one which would meet, first and foremost, the essential needs of that central humanness. Since each individual human can see only a tiny fraction of the whole puzzle that is humanity, no human will ever be able to devise this society. Every human attempt to do so has been a bloody, genocidal failure. But a machine...

A machine could read every book ever written, not to mention every movie, play, song, TV show, etc. A machine could take every product of human culture, sift it, dissect it, analyze it, and find that one common element. Then, it could build on that, running trillions of scenarios, altering this or that parameter and playing out the results (see the toy sketch after this post). Which social models lead to war? Which will not be accepted by any large fraction (more than, say, 0.05%) of the population? Which would not be stable following drastic changes in population or shifts in key demographics? And on, and on, and on, until it finds one (or several) which would work.

Since it would reject out of hand all those which were in conflict with the basic mental and emotional needs of man (meeting PHYSICAL needs is trivial, almost any society can manage that), the usual knee-jerk reactions don't apply. It wouldn't be a 1984-style tyranny, since such a society does not meet human needs and would be violently rejected by most of the world. It wouldn't require any sort of brainwashing, drugs, or artificial aids, since such a social order would collapse if the technology failed. And so on.

Could it be done? I don't know. Is it *impossible*, in the same way FTL travel is impossible? I see no reason to think so.
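A minimal sketch of the generate-and-filter loop the post above describes, assuming we reduce 'social models' to tuples of design parameters and the machine's judgments to toy scoring functions. Every name, metric, and threshold below is hypothetical, invented only to make the shape of the search concrete:

[code]
# Toy sketch (Python) of the generate-and-filter search described above.
# All parameters, metrics, and thresholds are invented for illustration.
import itertools
import random

REJECTION_THRESHOLD = 0.0005  # the post's "more than, say, 0.05%"

def simulate(model):
    # Stand-in for "running trillions of scenarios": produce invented
    # outcome metrics for one candidate social model.
    rng = random.Random("-".join(model))  # deterministic per candidate
    return {
        "leads_to_war": rng.random() < 0.3,
        "rejection_rate": rng.random(),  # fraction who would never accept it
        "survives_demographic_shift": rng.random() > 0.5,
        "needs_force_or_artificial_props": rng.random() < 0.4,
    }

def acceptable(outcome):
    # The post's knock-out criteria, applied as hard filters.
    return (not outcome["leads_to_war"]
            and outcome["rejection_rate"] <= REJECTION_THRESHOLD
            and outcome["survives_demographic_shift"]
            and not outcome["needs_force_or_artificial_props"])

# Candidate models as combinations of made-up design parameters.
governance = ("direct", "representative", "consensus")
economy = ("market", "mixed", "planned")
candidates = itertools.product(governance, economy)

survivors = [m for m in candidates if acceptable(simulate(m))]
print(survivors)  # "until it finds one (or several) which would work"
[/code]

The hard parts, of course, are everything the sketch waves away: where the candidate space comes from, and whether any simulator could actually score 'leads to war' or 'rejection rate' at all.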
 

The argument here is over "strong AI".

Roger Penrose (a very good mathematician and relativist) has put forward the idea that a computer can never achieve "true thought", only what it is programmed to do ("The Emperor's New Mind" and others). However, since we really don't have a good definition of what makes humans "sentient", I think this argument is just circular.
 

Xar said:
"There is no reason for any individual to have a computer in their home"
Olsen, K., Digital Equipment Corp., 1977
Heh, I recall one attributed to Thomas Watson, the man who built IBM: "I think there is a world market for maybe five computers."
 

Lizard said:


It's really fascinating seeing just how knee-jerk people's reactions are.

"Everyone has to think the same way in order to get along"
"A perfect society would have to be a tyranny"
Etc...

Given the reasonable premise that human beings need to be free in order to be happy, a 'perfect society' would, BY DEFINITION, be a free society. And, furthermore, it would have to be voluntarily adopted, not imposed by force.

What it seems people are doing is taking badly failed attempts at 'perfect societies' (Communism, fascism, etc) and assuming such concoctions are the only possible social structures. Obviously, such societies are about as *im*perfect as you can get, precisely because they run counter to human nature and needs. Therefore, a machine programmed to find the 'perfect society' would begin by *rejecting* any social order which would need to be imposed by force and which was run in a tyrannical fashion.

Uh... :rolleyes:
[sarcasm]
It's just sooo fascinating to see how narrow-minded some people can be... A perfect society is a free society? Tyranny or communism are 'bad'? Where did you get that from? Perhaps you could enlighten us why these concepts are inherently 'good' and 'evil'? [/sarcasm]
I hope you can see how much you are already influenced by your cultural biases and stereotypes. Of course we all think like this in our individualistic, Western culture. But you would be surprised to see that in a collectivistic, group-oriented culture your ideas and values would be seen as egoistical, chaotic, disrespectful and dangerous. Your 'perfect' society would be an oppressive (pardon the pun) hell to most. Good and evil are just useless human definitions, as they tend to change over time and from person to person.

If a perfect society could exist, then it would be invented by a machine. Only machines are able to make the rational, unbiased judgements necessary to build something like that. Of course, there can't be a 'perfect' society, as people are imperfect themselves. The whole thing is just a silly idea invented by overly optimistic philosophers.
 


What I find "really fascinating" is that your "knee jerk" reaction to my "knee jerk" reaction just furthers my point.

First of all, I do not WANT a perfectly free society. That's anarchy. Anarchy doesn't serve anyone in the long run. A certain level of social moderation is essential to keep us from degrading into little more than technology-using apes. If we are free to do as we please, the vast majority of people will do just that. Do what they want. Not what they should. As a result, I would NOT freely adopt the society you are proposing, thus the computer would try to find a new one. But by the time it found one _I_ agreed to, other people would not agree to it. And I'm not a freak of nature... at least not that much. I'm hardly the only person who doesn't want anarchy as a government... but I also know people who, given the choice, would settle for nothing less. Thus, they would never agree to my theories of a perfect government.
 

Darkness said:
I wonder what an AI would be like if Microsoft programmed it...? :D

"Error: personality64.dll has result in an illegal operation and will be shut down. If this problem continues, please contact your vendor."

The AI would then be nothing more than a high-powered computer :)
 

Well, if anybody really welcomes the coming of the metal gods with open arms, I have some questions about their fundamental sanity.

Perfection of form and perfection of efficiency are fundamentally inhuman concepts. I think most serious thinkers would eventually come to agreement on that one.

If I follow the argument that a robotic God-Mind would arrive at the perfect governmental system to its logical end, it leads here: the God-Mind could quite possibly come to the conclusion that the most perfect system it could create is the removal of itself from human affairs.

Curious.

Contemplation on the creation of perfect human form and perfect human society has come, time and again, to the conclusion that it wouldn't work. We are, by nature, limited beings. Perhaps that limitation is, in itself, a gift. What if we did find a way to create an ever-growing machine intellect ... and found that it invariably went insane trying to comprehend the vastness of infinity? What if it didn't, and that was worse?

What if we create humanistic intellects and find out that something built to think like us ... turns out like us? And that, after a certain point, it has to stop learning.

Time and again, we find that things we don't understand are more complex than we would have ever dreamed. Things, cells, atoms, electrons, neutrons, subatomic particles ... what is the nature of thought? What will we discover about the subtlety behind the reasons we think the way we do?

The human mind is a strange thing. When we're young, we learn with an impossible speed and grace that, just as oddly, drops away as we age. The subtleties of language are astounding, more complex than high-level mathematics ... but every average child learns at least one language by the time they're six. It becomes hard-wired into our neural pathways in a way that doesn't happen after about 13. Even people with an aptitude for languages won't learn a language as easily or as completely as a child of six knows his own language.

Why do we stop learning like this? Why does the brain change chemistry and structure? Is there, perhaps, a reason we don't comprehend yet? What if we create something that learns as fast as a child for its entire lifespan? What if we create a chemical that allows the brain to continue functioning at that speed? Childhood doesn't have "side effects", but I imagine something like that would. Do I know what? No. Gut instinct, maybe. Plant compounds we synthesize and simplify tend to have side effects that the plant compound did not. Plant compounds we think should not have any side effects have side effects we can't readily explain.

I don't think caution is a knee jerk reaction.

--HT
 

