Finding the mean total number of mutation events was vital because it is this number that is directly proportional to the improbability, or entropy, of the state and hence to the Entropy Cost of achieving that state under the second law. Let me illustrate what I mean by Entropy Cost with two dice. The probability of [6][6] is 1/36, but this does not mean we cannot throw [6][6] on the very first throw, nor does it mean we are guaranteed to get [6][6] within 36 throws. What it really means is that [6][6] will occur, on average, once every 36 throws of two dice if we just keep throwing them. The longer we keep doing it, the closer the ratio of total throws to occurrences of [6][6] approaches 36, which is what the Law of Large Numbers predicts.
What that means is that 36 throws of two dice is the minimum work required, i.e. the entropy cost, of the state of order [6][6] in that system under the second law. Heavier dice simply increase the energy required, but that in no way affects the entropy cost measured as the number of random die events (36 throws of two dice, or 72 individual rolls), demonstrating that entropy has nothing to do with energy. There is a paper yet to be published on that subject, but let it suffice to know that the second law imposes a minimum average number of random events to create a state of order by those events, and that minimum is equal to the improbability of the state. Any theory that requires a state of order to be reached with fewer random events than its entropy cost is falsified by the second law.
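As a quick illustration, here is a minimal Python sketch (my own, not from the original post) that keeps throwing two dice and reports the ratio of total throws to occurrences of [6][6]. As the number of throws grows, the ratio closes in on 36, as the Law of Large Numbers predicts.

```python
import random

def throws_per_double_six(total_throws: int) -> float:
    """Throw two dice `total_throws` times and return the ratio of
    throws to occurrences of [6][6]; it should approach 36."""
    double_sixes = 0
    for _ in range(total_throws):
        if random.randint(1, 6) == 6 and random.randint(1, 6) == 6:
            double_sixes += 1
    return total_throws / double_sixes if double_sixes else float("inf")

for n in (36, 1_000, 100_000, 10_000_000):
    print(f"{n:>10} throws -> ratio {throws_per_double_six(n):.2f}")
```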
The Dawkins coin toss model can be modified to be generally applicable to any code base, with any desired probabilities for selection and culling, and it can even introduce code redundancies that effectively moderate the mutation probabilities for deselection, and so on. Noting that real evolution must in the end grow genes made up of an alphabet of bases A, T, C and G, each with certain probabilities of mutation, it occurred to me that the model could be configured to do exactly what evolution in the wild must do to grow a gene.
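To make that concrete, the following minimal Python sketch grows a gene over the A, T, C, G alphabet by random per-site mutation, with a simple selection rule that keeps a matching base once it appears, and counts the total number of mutation events required. The target sequence, mutation probability and selection rule here are placeholder assumptions for illustration, not the actual settings of the modified model.

```python
import random

BASES = "ATCG"

def grow_gene(target: str, p_mut: float = 0.01, seed: int = 0) -> int:
    """Grow a gene toward `target` by random per-site mutation with
    selection (a matched base is locked in) and return the total
    number of mutation events it took.  Target, mutation probability
    and selection rule are placeholder assumptions."""
    rng = random.Random(seed)
    gene = [rng.choice(BASES) for _ in target]
    locked = [g == t for g, t in zip(gene, target)]
    mutation_events = 0
    while not all(locked):
        for i, t in enumerate(target):
            if not locked[i] and rng.random() < p_mut:
                gene[i] = rng.choice(BASES)   # one random mutation event
                mutation_events += 1
                if gene[i] == t:
                    locked[i] = True          # selection keeps the match
    return mutation_events

print(grow_gene("ATGCGTACGTTAGC"))
```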