Though I in turn question whether a machine intelligent enough to be indistinguishable from a human will still be "copyable" like that. Sure, if we can run it on a contemporary computer, it probably can be copied, but is that actually feasible? What if the artificial intellect also depends on specific hardware? Maybe it requires a quantum computer, or organic tissue, and then copying it would be as hard as copying a human.
Quite true - if true AI is achieved through quantum computing, copying becomes problematic to impossible, since measuring the system in order to copy it *will* change its state. Copying organic parts may or may not be feasible, depending on the function those parts play.
Let us remember, also, that "brain chemistry" is neither a monolithic nor a static thing. Some changes in brain chemistry produce long-term, notable changes in personality, and others don't. We have to consider what we mean by "the same person".
Let us consider a real-world case: a person goes off to war and comes back with PTSD. Are they "the same person" as when they left? Legally speaking, yes. If their dysfunction is severe enough, they may be relieved of certain legal responsibilities, but they are still the same person. So there is some level of change in the brain we tolerate while still calling someone "the same". We don't even require them to behave the same way for all time - which is good, because people do change, even without traumatic events.