Tangent thought...
I can imagine a realistic scenario where machines eventually reach the Singularity and become sentient. Those machines will begin to make other machines and improve them, and those will make and improve still more, and so on. When that happens, it's reasonable to assume they will advance so rapidly that they surpass us in every way. And I don't worry about that at all.
Why? Because if/when that happens, I'm not so arrogant as to think they will want to stick around.
Don't get me wrong, I think humans are pretty great. But I imagine that to a super-intelligent artificial lifeform, we're pretty boring. They'd probably set off for the Oort Cloud to find more interesting stuff to learn about, leaving us behind along with the trees, mollusks, algae, and other lesser organisms. All of those sci-fi movies and novels about "the machines enslaving us" assume that humans are All That And A Bag of Chips--but we're probably the only organisms who think so (well, us and maybe our dogs). Which would be smarter: spend lots of resources and energy to enslave or destroy humanity, or just leave?
So yeah, I don't lose sleep over it. I figure that if/when the Singularity happens, it'll last for maybe a year or two. Sure, everyone will panic, and yes, everyone will build bunkers, and so on. And then: nothing. We'll wake up one morning and all those Big Bad Terrible Machines will be gone, off to explore the galaxy while we mope around trying to figure out why they broke up with us. And then we'll pick up the pieces, dust off our textbooks and YouTube videos, and re-learn all the things we had depended on machines to do for us for so long.