Tuesday, February 16, 2016

Appendix 7 Modelling Evolution

First, a note about probability and entropy.

The probability of any 'event' is determined by the 'prior' expectation of it under a given process (e.g. if coins are intelligently placed in order (HTHT... etc.) then the probability of that arrangement would appear to be 1, because the outcome is certain). By this reasoning, the probability of achieving a state of matter depends upon how it is produced (the process). But the absolute entropy of any state of matter is independent of the process that produced it! So it would appear the entropy cost of producing an end state is independent of the absolute entropy of the state itself! Yet we know the two must be linked, because any low entropy state must be accounted for by an increase in entropy in the surroundings, and the only available source of that increase is the process that produced it.
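To make that concrete, here is a minimal sketch (in Python, taking the coin analogy and Boltzmann's S = k ln W as assumptions) showing that the probability of one specific 100-coin sequence depends entirely on the process, while the entropy assigned to that fully specified arrangement does not:

```python
import math

N = 100  # number of coins

# Probability of one specific sequence (e.g. HTHT...) under two processes:
p_random = 0.5 ** N   # fair, independent tosses: every sequence equally unlikely
p_placed = 1.0        # deliberate placement: the outcome is certain

print(f"Pr(sequence | random tossing) = {p_random:.3e}")  # ~7.9e-31
print(f"Pr(sequence | placement)      = {p_placed}")

# The state's entropy, by contrast, depends only on the state itself.
# A fully specified sequence corresponds to a single microstate (W = 1),
# so S = k*ln(W) is the same (here zero) however it was produced.
k = 1.380649e-23  # Boltzmann constant, J/K
S = k * math.log(1)
print(f"S(sequence) = {S} J/K, regardless of process")
```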

How do we reconcile this apparent contradiction?

If the improbability of a state is reduced by a natural bias (like selection) or even by direct intelligence, the entropy of that state is not changed, and the cost of that state must still be accounted for. The answer must lie in the fact that I have only been considering the final steps (the placing or tossing of the coins) of a process which is in fact the result of a much larger 'system' that makes the outcome possible. A small simulation below illustrates the point about bias.
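In this sketch (hypothetical, using a 'keep the heads, re-toss the tails' rule as a stand-in for selection), a biased process reaches the all-heads state in a handful of rounds, whereas blind re-tossing of all the coins would be expected to take about 2^n rounds; yet the final state, and hence its entropy, is identical in both cases:

```python
import random

random.seed(1)  # fixed seed so the sketch is reproducible

def rounds_until_all_heads(n, select=False):
    """Re-toss until all n coins show heads.
    select=True keeps heads ('selection') and re-tosses only tails;
    select=False re-tosses the whole set every round."""
    coins = [random.random() < 0.5 for _ in range(n)]
    rounds = 0
    while not all(coins):
        rounds += 1
        if select:
            coins = [c or (random.random() < 0.5) for c in coins]
        else:
            coins = [random.random() < 0.5 for _ in range(n)]
    return rounds

print(rounds_until_all_heads(20, select=True))   # typically ~5-7 rounds
# Without selection, ~2**20 (about a million) rounds would be expected:
# print(rounds_until_all_heads(20, select=False))
```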

Then there are the considerations of where the coins (or dice) came from, and what about the table, the room, and even the person doing the placing? So is it valid to calculate the improbability simply by looking at the end result? The answer lies in the term conditional probability. The probability of A given B is written Pr(A|B) (noting that improbability is just 1/probability). So when I am calculating the probability of a sequence of 100 coins (leave the DNA for now), it is actually...

Pr(100 HT pattern | coins, table, room, person, land, earth, etc.), with all those 'givens'.
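The chain rule makes the role of the 'givens' explicit: Pr(pattern AND givens) = Pr(pattern | givens) × Pr(givens). The numbers below are purely hypothetical placeholders, but they show how a conditional probability of 1 can sit on top of a wildly improbable set of givens:

```python
# Hypothetical figures -- only the structure of the calculation matters here.
p_pattern_given_setup = 1.0   # person deliberately places the 100 coins: certain
p_setup = 1e-40               # assumed probability of coins/table/person arising at all

p_joint = p_pattern_given_setup * p_setup   # chain rule: Pr(A,B) = Pr(A|B)*Pr(B)
print(f"Pr(pattern, givens) = {p_joint:.1e}")
print(f"Improbability       = {1 / p_joint:.1e}")   # improbability = 1/probability
```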

It is therefore possible to simply state that the absolute entropy of the state is not changed by the process; it is just that I am calculating the conditional probability (or improbability) of the last part of the process (system) that produced it. Most important here is the fact that each conditional part of the process must result in an entropy increase which exceeds the drop in entropy that it creates. So the drop in entropy resulting from a person placing the coins in order is different from the drop in entropy resulting from tossing the coins randomly to get the same order. But the absolute entropy (~probability) of the final state is the same in both cases. The two routes simply have a different set of 'givens'.
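As a rough order-of-magnitude check (a sketch only, using the coin analogy and illustrative figures for the person's metabolism), the entropy 'drop' of ordering 100 coins is utterly dwarfed by the entropy the placing process generates:

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

# Entropy 'drop' from specifying one sequence out of 2**100 equally likely ones
# (treating each coin arrangement as a microstate -- an assumption of the analogy):
dS_order = k * math.log(2 ** 100)          # ~9.6e-22 J/K
print(f"Entropy drop of ordering: {dS_order:.2e} J/K")

# Entropy produced by the person doing the placing (illustrative figures):
power, T = 100.0, 310.0                    # ~100 W metabolic heat at ~310 K body temp
dS_person = power / T                      # ~0.32 J/K produced every second
print(f"Entropy produced per second: {dS_person:.2f} J/K")

print(f"Ratio: {dS_person / dS_order:.1e}")  # the increase dwarfs the drop
```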

I hope that is clear enough.
