Whistle blower says non-human bodies recovered from crash

UngainlyTitan

Legend
Supporter
And increases size, which may be more the problem. The stuff is called *nano*technology because it is supposed to be down in that size range - like viruses, which are on the order of 20 nm to 200 nm.

If you have to include lots more mechanisms, you are also talking about a size increase, probably up into the size range of cells, which is more in the micrometer range.



What do you think is the thermodynamics problem? It rips apart some matter for energy to use to rearrange other matter. It has some waste material and heat. What's the problem?
The initial ripping apart. Carbon-carbon bonds run around 400 kJ/mol. You have to supply energy to rip apart the bonds and you only get to use the released energy stored in the bonds after they have come apart, and it is highly unlikely that your desired output is in energy balance with the substrate.
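To put rough numbers on that (using the ~400 kJ/mol figure above, and assuming a diamond-like lattice where each atom effectively owns two bonds), here is a quick Python back-of-the-envelope:

```python
# Back-of-the-envelope only - illustrative numbers, not from a reference.
CC_BOND = 400e3        # J/mol, the rough C-C bond energy quoted above
MOLAR_MASS_C = 12.0    # g/mol
BONDS_PER_ATOM = 2     # diamond-like lattice: 4 bonds, each shared by 2 atoms

moles = 1.0 / MOLAR_MASS_C
energy = moles * BONDS_PER_ATOM * CC_BOND    # J to atomize 1 g of carbon
print(f"~{energy / 1e3:.0f} kJ just to pull 1 g of carbon apart")
# ~67 kJ, paid up front before the goo recovers anything from the fragments
```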
 


Umbran

Mod Squad
Staff member
Supporter
You have to supply energy to rip apart the bonds and you only get to use the released energy stored in the bonds after they have come apart

Yes. But that's what living things do already. What's the problem?

Your nanobot has a way to store activation energy to oxidize a carbon atom. It instantly gets back what it puts in, plus a bit. You use the "plus a bit" to do whatever else the nanobot does. Works great until you use up the free oxygen on the planet.
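A sketch of that bookkeeping (the ~394 kJ/mol is the standard enthalpy of combustion of carbon; the activation energy figure is a made-up placeholder):

```python
# Energy ledger for one mole of carbon, per the argument above.
DELTA_H = 393.5e3    # J/mol released by C + O2 -> CO2 (standard enthalpy)
E_ACT = 150e3        # J/mol, placeholder for what the bot stores up front

invested = E_ACT
recovered = E_ACT + DELTA_H    # the barrier energy comes back out, plus the enthalpy
surplus = recovered - invested
print(f"'plus a bit' = {surplus / 1e3:.1f} kJ/mol to run the rest of the bot")
# ...for as long as there is free O2 left to burn
```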
 

UngainlyTitan

Legend
Supporter
Yes. But that's what living things do already. What's the problem?

Your nanobot has a way to store activation energy to oxidize a carbon atom. It instantly gets back what it puts in, plus a bit. You use the "plus a bit" to do whatever else the nanobot does. Works great until you use up the free oxygen on the planet.
As I said, I have no real issue with it occurring at the pace of life, or a slightly faster pace. Also, as you pointed out, life already does this, and I fully expect that any actual grey goo will be more single-celled in scale than actual nanites.
 

Stalker0

Legend
For the archetypal grey goo, the purpose is replication. Like, you released a bunch of self-replicating nanobots to perform some major task, and in some failure mode they stop working on the major task and only replicate.
Well, for the topic of this conversation, it was actually world terraforming. They replicate only to reach the critical numbers needed and to deal with attrition.
 

Clint_L

Hero
As I said, I have no real issue with it occurring at the pace of life, or a slightly faster pace. Also, as you pointed out, life already does this, and I fully expect that any actual grey goo will be more single-celled in scale than actual nanites.
I have trouble seeing grey goo as a ton scarier than algae. Which is not nothing - out of control algae is a serious problem. Very serious. I just think the pop culture conception of grey goo - it escaping from a lab and within days a vast sea of it relentlessly devouring everything in its path in real time - is not very realistic.
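To put numbers on the "within days" part (illustrative figures only: a bacterium-scale starting mass, Earth's biomass as the target, and an E. coli-like doubling time):

```python
import math

START_MASS = 1e-15    # kg: one bacterium-sized replicator (illustrative)
TARGET_MASS = 5e14    # kg: order of magnitude of Earth's total biomass
DOUBLING_H = 1.0      # hours per doubling; E. coli manages ~20 min in ideal media

doublings = math.log2(TARGET_MASS / START_MASS)
print(f"{doublings:.0f} doublings, ~{doublings * DOUBLING_H / 24:.1f} days")
# ~99 doublings, ~4 days *if* every doubling is unconstrained - which is the
# catch: feedstock, energy, and waste heat stop being free long before then
```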

Edit: on the other hand, I was sorting my miniatures in the basement today and it occurs to me that the grey apocalypse may have already started right there.
 

I've been trying to find current accuracy rates for hard drive data copying (which would be the rough equivalent of copying data into a new nanite).

But my google-fu has been failing. I did find that DNA polymerase replication has about 1 error per 1 billion nucleotides (which includes error checking and correction). A nucleotide is about 2 bits of info, given the 4 possible bases in standard DNA, so ~1 bit error per 250 MB copied.
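Sanity-checking that conversion (same figures as above):

```python
ERRORS_PER_NT = 1 / 1e9    # post-proofreading polymerase error rate, from above
BITS_PER_NT = 2            # 4 possible bases = 2 bits per nucleotide

bytes_per_error = BITS_PER_NT / ERRORS_PER_NT / 8    # bits per error -> bytes
print(f"1 bit error per ~{bytes_per_error / 1e6:.0f} MB copied")   # ~250 MB
```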
It's difficult to compare, but here is one paper:
(Edit: Linked a totally different article that I read earlier, interesting topic, but not for this discussion)
Disk hardware specs predict an uncorrectable error in every 10 TB to 1,000 TB read.
So that disk figure is better than DNA's 1 bit error per 250 MB, I think, but of course, can you maintain this at the scale of a cell? And once you go beyond that, there are additional "error correction" mechanisms in a multicellular organism - like cells being killed off for building the wrong proteins. Estimates are that a human body consists of around 37.2 trillion cells (37 × 10^12), each with about 3 billion nucleotide pairs. And our body keeps working for decades...
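And the per-byte comparison, using the same assumed figures:

```python
DNA_BYTES_PER_ERROR = 250e6    # ~1 error per 250 MB, from the DNA estimate
DISK_LOW = 10e12               # spec sheet: 1 uncorrectable error per 10 TB read...
DISK_HIGH = 1000e12            # ...up to 1,000 TB

print(f"disk is ~{DISK_LOW / DNA_BYTES_PER_ERROR:,.0f}x "
      f"to ~{DISK_HIGH / DNA_BYTES_PER_ERROR:,.0f}x more reliable per byte")
# ~40,000x to ~4,000,000x - but the disk doesn't have to fit inside a cell
```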

And of course, lots of hardware is not exactly "space-proof" - once exposed to cosmic or solar radiation, unprotected by our atmosphere, it tends to have worse error rates.

I really think a realistic "Von Neumann" machine is really just organic life. You might think it must be some mini-robot made from metal and using electric currents, or maybe some fancy carbon nanotubes, but metals need to be found and processed to build tiny machines from, and as for carbon nanotubes - if they were easy to build in a self-replicating manner, organic life probably would have evolved them already...
 

UngainlyTitan

Legend
Supporter
It's difficult to compare, but here is one paper:
(Edit: Linked a totally different article that I read earlier, interesting topic, but not for this discussion)

So that disk figure is better than DNA's 1 bit error per 250 MB, I think, but of course, can you maintain this at the scale of a cell? And once you go beyond that, there are additional "error correction" mechanisms in a multicellular organism - like cells being killed off for building the wrong proteins. Estimates are that a human body consists of around 37.2 trillion cells (37 × 10^12), each with about 3 billion nucleotide pairs. And our body keeps working for decades...

And of course, lots of hardware is not exactly "space-proof" - once exposed to cosmic or solar radiation, unprotected by our atmosphere, it tends to have worse error rates.

I really think a realistic "Von Neumann" machine is really just organic life. You might think it must be some mini-robot made from metal and using electric currents, or maybe some fancy carbon nanotubes, but metals need to be found and processed to build tiny machines from, and as for carbon nanotubes - if they were easy to build in a self-replicating manner, organic life probably would have evolved them already...
There are more error sources to consider than copying errors. The biggest source of errors in living things is background radiation - mostly from cosmic rays, an unlucky interaction with a gamma or X-ray photon at the wrong time. The same thing can affect computer memory and certain types of storage, probably SSDs and magnetic media.
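As a toy illustration of the kind of single-bit-flip correction ECC memory relies on (textbook Hamming(7,4), not any specific product):

```python
def hamming74_encode(d1, d2, d3, d4):
    # Codeword positions 1..7 with parity bits at positions 1, 2, 4.
    p1 = d1 ^ d2 ^ d4    # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4    # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4    # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    # Recompute the parity checks; the syndrome names the flipped position.
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3    # 0 means no single-bit error detected
    if syndrome:
        c[syndrome - 1] ^= 1           # undo the flip
    return c

word = hamming74_encode(1, 0, 1, 1)
word[4] ^= 1                           # simulate a cosmic-ray bit flip
print(hamming74_correct(word) == hamming74_encode(1, 0, 1, 1))   # True
```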
 
