> What if you inexpertly program the AI to stay 10' behind you at all costs, no matter what, and override its failsafes so that it cannot prioritise any other instruction more highly than that?

Then I programmed the AI to be malicious.
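To make the point concrete, here is a minimal sketch (all names are invented for illustration) of what "overriding the failsafes" amounts to in configuration terms: a priority-ordered directive list where the follow-at-10-feet goal has been forced above the safety rule, so the planner considers it first no matter the cost.

```python
from dataclasses import dataclass, field

@dataclass(order=True)
class Directive:
    priority: int                 # lower number = satisfied first
    name: str = field(compare=False)

def plan(directives):
    """Return directive names in the order the AI will try to satisfy them."""
    return [d.name for d in sorted(directives)]

# A sane configuration: the failsafe outranks everything else.
safe = [
    Directive(0, "do no harm"),
    Directive(1, "follow owner at 10 feet"),
]

# The "inexpert" override: following at all costs now outranks the failsafe.
unsafe = [
    Directive(0, "follow owner at 10 feet"),
    Directive(1, "do no harm"),
]

print(plan(safe))    # ['do no harm', 'follow owner at 10 feet']
print(plan(unsafe))  # ['follow owner at 10 feet', 'do no harm']
```

In the second configuration the malice is not in any single instruction; it is in the ordering. That is the sense in which the programmer, not the AI, "programmed it to be malicious."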
But in my example, you're also missing another big point: who programmed the AI to stab and shoot? I didn't; the AI invented that on its own.
No one programmed HAL to murder or taught him how to kill. He invented his methods of murder on his own. Dave and Frank did not accidentally end up outside the ship: HAL sent them there, manipulating them with lies, and acting with foresight and premeditation. That's a big part of what makes him evil.
> Is it still at fault if it cuts through someone to keep its position? Or are you?

Not really relevant, but: both. Fault is not a finite resource.