Whether it was programmed to be sapient, developed sapience over time, or only became sapient because, I dunno, a bolt of lightning struck the computer, Short Circuit style, that human programming would still give it a degree of anthropomorphization, if only because humans would have built the computer in ways humans can understand. Probably the only way an AI wouldn't have at least a smidge of human influence in it is if it evolved from something like that self-replicating, evolution-capable mini-program Richard Dawkins wrote about in, I believe, The Blind Watchmaker. And even then, since organic life forms all have at least some degree of self-preservation instinct, I have a very hard time believing a naturally-evolved true AI wouldn't also evolve self-preservation.