Wicht said:
Are you a statistician? Because I have read the articles of a few of them that would disagree with you.
If this is what you got out of that, then you've misread the article, or the papers were written by some serious quack statisticians.
I think the key thing is - In the real universe it is an impossibility to randomly produce anything that long with that much structure. The problem is that it is far more probable for an almost infinite amount of other things to be produced and many of them are far more likely.
Nothing is impossible. Nothing. Extremely improbable, inconsequential, not worth considering? Yes. Less probable than something else? Yes. Impossible? No.
Try it - initiate a computer program that randomly produces letters and numbers (coupled with punctuation) in a completely equal way and see how many full sentences you form in say, a month of running the program.
If they produce them in a completely equal way, then you don't have a random system.
A few problems, IMO, should become apparent. In the structure of language some letters are far more likely to appear than others, "e" for example. Punctuation is less likely and some letters, like "z" or "x", only come up every now and then. But in a completely random environment, one is as likely as the other. If you tailor your program so that only one "z" appears for every hundred "e"s you are more likely to begin to approach the structure of a language, but you have also taken out the randomness of it.
You don't need to tailor the program. If it's truly random, then patterns will occur. They are just as likely to occur as non-patterns. If your 'random' generator never repeats itself, and never produces a recognisable pattern, then it is NOT RANDOM.
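To illustrate the point, here's a quick Python sketch (the stream length and the word "cat" are arbitrary choices of mine): a uniform random letter stream does contain recognisable patterns, and at roughly the rate probability predicts.

```python
import random
import string

random.seed(0)  # fixed seed so the run is reproducible; any seed behaves similarly

N = 2_000_000  # two million uniformly random lowercase letters
stream = ''.join(random.choices(string.ascii_lowercase, k=N))

# Any specific 3-letter word is expected to appear about N / 26**3 ~ 114 times
hits = stream.count('cat')
print(hits)  # some count in the rough vicinity of 114; in practice never zero
```

Run it as often as you like with different seeds: the "pattern" always shows up, because a generator that never produced it would not be random.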
The flaw in the monkey theory is that probability breaks down when structure is involved.
No. It doesn't. In a truly random system, any one specific outcome is just as likely as any other specific outcome of the same length. In this case, compare a specific string that happens to be a fragment of language with a specific string of the same length that is gibberish: the two are equally likely. There just happen to be far more strings in the gibberish category than in the language category.
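That claim is easy to check empirically. A Python sketch (the two trigrams and the stream length are my arbitrary picks): a "language" fragment and a gibberish fragment of the same length show up at essentially the same rate in a uniform random stream.

```python
import random
import string

random.seed(1)  # reproducible; any seed gives similar counts

N = 5_000_000  # five million uniformly random lowercase letters
stream = ''.join(random.choices(string.ascii_lowercase, k=N))

expected = N / 26**3             # ~284 for any specific 3-letter string
language = stream.count('the')   # a fragment of English
gibberish = stream.count('xqz')  # not a fragment of anything
print(language, gibberish, round(expected))
```

Both counts land near the same expected value; the only reason language fragments feel special is that there are vastly fewer of them.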
The chance of a random generator producing a specific 10,000-letter passage, assuming a 256-character alphabet (i.e. the entire extended ASCII set, code points 0-255), is 1/(256^10,000). It's a really small number, but it's not zero.
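To put a size on that number without overflowing anything, you can work in log space. A Python sketch (I use 256 symbols, since extended ASCII spans code points 0 through 255):

```python
import math

alphabet = 256   # extended ASCII: code points 0-255
length = 10_000  # characters in the target passage

# probability of one specific passage = 1 / alphabet**length,
# so its base-10 logarithm is -length * log10(alphabet)
log10_p = -length * math.log10(alphabet)
print(f"probability = 10^{log10_p:.0f}")  # about 10^-24082: unimaginably small, but not zero
```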
Alzrius said:
We've already made such an insight: as I said, entropy is what makes it impossible for the theoretically infinite monkeys with typewriters to ever produce a work of Shakespeare or whoever. Unordered processes break down towards chaos.
No. An unordered process IS chaos. That's what chaos means. I believe the actual theory of entropy (the second law of thermodynamics) is that closed systems tend towards a static equilibrium. However, it only applies to certain things (like the distribution of heat and matter in the universe). It certainly doesn't apply to a hypothetical never-ending stream of random numbers.
For that I believe you have to turn to chaos theory, which states that a sufficiently complex system appears to be chaotic, and that as a corollary, chaos can appear to be a sufficiently complex system.