It's difficult to compare, but here is one paper:
(Edit: I had linked a totally different article that I read earlier; interesting topic, but not for this discussion.)
So that's better than 1 bit error per 2 GB, I think, but of course, can you maintain this at the scale of a cell? And once you go beyond that, there are additional "error correction" mechanisms for a multicellular organism, like cells being killed off for building the wrong proteins. Estimates are that a human body consists of around 37.2 trillion cells (37 * 10^12), each with about 3 billion nucleotide pairs. And our body keeps working for decades...
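Just for scale, here's a quick back-of-envelope sketch in Python using the rough figures above (the 1-bit-per-2-GB comparison is loose, since RAM error rates are usually quoted per unit of time, but it gives a feel for the numbers):

```python
# Back-of-envelope: total DNA "storage" in a human body.
# Rough figures from the post, not precise biology:
#   ~37.2 trillion cells, ~3 billion nucleotide pairs per cell,
#   2 bits per base pair (4 possible bases: A/C/G/T).

CELLS = 37.2e12        # cells in a human body
BP_PER_CELL = 3e9      # nucleotide pairs per genome copy
BITS_PER_BP = 2        # one of 4 bases = 2 bits

total_bp = CELLS * BP_PER_CELL
total_bytes = total_bp * BITS_PER_BP / 8

print(f"Total base pairs:  {total_bp:.2e}")     # ~1.1e23
print(f"Equivalent bytes:  {total_bytes:.2e}")  # ~2.8e22 (~28 zettabytes)

# If that much data suffered RAM-like rates of 1 bit error per 2 GB,
# you'd expect roughly this many bit errors body-wide:
GB = 1e9
errors = total_bytes / (2 * GB)
print(f"Errors at 1 per 2 GB: {errors:.2e}")    # ~1.4e13
```

So even at error rates far better than commodity hardware, the sheer volume means trillions of raw errors; the higher-level mechanisms (repair enzymes, apoptosis, immune surveillance) are doing a lot of the work.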
And of course, lots of hardware is not exactly "space-proof": once exposed to cosmic or solar radiation, unprotected by our atmosphere, it tends to have worse error rates.
I really think the only realistic "Von Neumann" machine is organic life. You might think it has to be some mini-robot made from metal and running on electric currents, or maybe some fancy carbon nanotubes, but metals need to be found and processed before you can build tiny machines from them, and as for carbon nanotubes: if it were easy to build them in a self-replicating manner, organic life probably would have evolved it already...