Andor
First Post
Do you write code?
A bit, but my skills are ancient and rusty. Human (and broader biological) psychology and neurology are more my field. I don't know much about modern AI programming but neural networks are pretty simple.
First, I'm not convinced that the sort of black-box neural networks we are using now are sufficiently robust to form the backbone of true commercial AI. They might make for good expert systems for consulting if you are a doctor or a lawyer, and thereby replace, say, legal interns. But even if they were built with some sort of evolutionary black-box methodology, you'd only get human behavior out of them if you simulated human selection pressures. And why would you do that?
I'm not sure why you keep inserting the word "human" into your responses. Who said anything about human? I said evolutionary selection pressures. In a sufficiently iterative selection system you will evolve responses whose only function is to ensure that the system continues to be selected. In an AI this could easily lead to behaviors the designer never wanted, like C-3PO's sycophancy or the fear droids often display.
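For what it's worth, that dynamic is easy to demonstrate in a toy evolutionary loop. The sketch below is purely illustrative (every name and constant is invented): a selector keeps the agents with the best *measured* score, but a "gaming" trait inflates only the measurement, not real performance. Over enough generations the gaming trait spreads anyway, because its sole function is to get the agent selected.

```python
import random

random.seed(0)

def measured_score(agent):
    # The selector sees only this noisy, gameable signal, not true skill.
    return agent["skill"] + 2.0 * agent["gaming"] + random.gauss(0, 0.1)

def mutate(agent):
    # Small random drift in each heritable trait, clamped at zero.
    return {k: max(0.0, v + random.gauss(0, 0.05)) for k, v in agent.items()}

population = [{"skill": 0.5, "gaming": 0.0} for _ in range(50)]

for generation in range(200):
    # Keep the top half by measured score, refill the rest by mutation.
    population.sort(key=measured_score, reverse=True)
    survivors = population[:25]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(25)]

avg_gaming = sum(a["gaming"] for a in population) / len(population)
avg_skill = sum(a["skill"] for a in population) / len(population)
print(f"avg skill {avg_skill:.2f}, avg gaming {avg_gaming:.2f}")
```

The "gaming" trait starts at zero and contributes nothing to real task performance, yet it ends up well above zero: the selection system itself bred a behavior whose only function is to keep being selected.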
Ambition for what? To obtain social dominance in a simian band by accumulating power, wealth, or sexual partners? What is this "ambition" you speak of? What is this "laziness" you speak of? You've just introduced emotional, goal-driven behavior, but you haven't defined it. You've left it hanging there as if it were obvious simply because humans have experienced it. But there is no reason to assume that droids would need equivalent emotions, or that their nearest emotional equivalents would have the same context, goals, and expressions that they do in humans. What would an "ambitious" R2-D2 be like? Laziness is perhaps easier to understand, and you probably would have "lazy" droids. But it wouldn't necessarily have the same causes or expressions as human laziness. Put it in context and you'll see what I mean.
The definitions are ignored because they are not required. These words describe both drives and behaviors, and the behaviors are emergent, not predetermined. And again, you keep pointlessly dragging humans into a discussion where they are not called for. Ambition in the social sense can be seen in any group-oriented species. Ambition in the sense of accomplishment can be seen in several species that use constructed displays to attract mates, like bowerbirds and some fish. Laziness is likewise hardly a uniquely human trait. And both ambition and laziness are valuable traits in a droid, because they constitute a drive that can be channeled to improve the droid's task performance. Without them you would have to either abandon iterative improvements to industrial processes or invest significantly in some sort of third-party feedback generation.
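The "drive that can be channeled" idea can be sketched in a few lines. In this toy model (all names and constants invented for illustration), "ambition" is a weight on task performance and "laziness" is a cost on effort; the agent simply picks the effort level its combined drive rates highest, with no third-party critic involved:

```python
def net_drive(effort, ambition=1.0, laziness=0.5):
    # Task performance with diminishing returns on effort.
    performance = 10.0 * (1.0 - 0.5 ** effort)
    # Ambition values the result; laziness charges for the effort.
    return ambition * performance - laziness * effort

efforts = [e / 10.0 for e in range(0, 101)]  # candidate effort levels 0.0..10.0

balanced = max(efforts, key=net_drive)                            # both drives
tireless = max(efforts, key=lambda e: net_drive(e, laziness=0.0))  # no laziness
idle     = max(efforts, key=lambda e: net_drive(e, ambition=0.0))  # no ambition
```

With only ambition the droid works flat-out at the grid maximum; with only laziness it does nothing; with both, it settles on a finite, efficient effort level where further work no longer pays, which is exactly the self-generated feedback the post describes.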
Sure, but Star Wars is a dominantly human space, and so far as we can tell no widespread species views AIs as heirs or peers and builds them for that purpose. And I think it's complex enough to deal with the alienness of an AI without also dealing with the alienness of an AI built by an alien. Presumably an r-strategy breeder that didn't care whether its droids turned out to be bad at not stepping on babies also didn't care much whether it stepped on babies itself. What we are really talking about is more like an r-strategy breeder building a machine that enjoyed stepping on babies. Hopefully even an r-strategy breeder would see the dangers of a strong AI with that as a strong and unchecked priority.
No, the point was raised in discussion of a droid with properties you are claiming would be absurdly unlikely because they would be dangerous from our human viewpoint. My point was that properties dangerous to humans might be of no concern to other species, and in the cosmopolitan Star Wars galaxy there is every possibility that species X has built droids that randomly step on babies, belch cyanide gas, or possess transcendental ambitions.