Sunday, March 27, 2011

7.4 Evolution and Entropy


Before we look at the entropy cost of a string of DNA or its protein, we need to make an important observation about our accounting of logical improbability. When we examine the 3 dice in the box, the macrostate {123}, if order is ignored, has 6 microstates once we note the identity of the individual dice. The Boltzmann-Gibbs thermodynamic probability 'W' likewise uses the identity of each particle in a system to count all possible microstates. That is why W is such a huge number for any real system (i.e. for each macrostate, swapping every particle with every other particle creates a massive logical phase space). Effectively the value of 'W' for the {123} macrostate is 6, whereas for {666} it is just 1. But if order is significant, the macrostate {123}, like {666}, has only one microstate and hence the least possible probability and the smallest 'W': just one of a total of 216 possibilities in this very simple case.
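This bookkeeping can be checked by brute force. A minimal sketch in Python (the dice values and macrostate labels are just the ones used above):

```python
from itertools import product

# Enumerate every ordered outcome of 3 labelled six-sided dice.
outcomes = list(product(range(1, 7), repeat=3))
print(len(outcomes))  # 216: the total logical phase space

def microstates(macrostate):
    """Count ordered outcomes matching a macrostate given as a multiset of faces."""
    target = sorted(macrostate)
    return sum(1 for o in outcomes if sorted(o) == target)

print(microstates((1, 2, 3)))  # 6 microstates when order is ignored
print(microstates((6, 6, 6)))  # 1 microstate
```

When order is significant, each of the 216 ordered outcomes is its own macrostate with exactly one microstate, which is the W = 1 case discussed above.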

Now every string of codes carrying semantic information has three essential properties. It is:
  1. Non-repeating
  2. Non-random
  3. Ordered
Notwithstanding that property 2 would appear to preclude any random process from assembling semantic information, what we are really interested in is property 3. DNA, being semantic information, is ordered. This means any meaningful string of DNA (such as the code for a specific protein) has the maximum improbability possible for any string of the same length (W = 1); it represents one microstate out of the total logical phase space available when we identify each particle, or in this case each nucleic acid base. Allowing for redundancy in the human genome we may conservatively speak of a billion (10^9) uniquely identified bases!
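Taking this counting at face value (an alphabet of 4 bases and a billion ordered positions), the size of the resulting logical phase space can only be expressed through logarithms, since 4^(10^9) overflows any direct representation. A sketch:

```python
import math

ALPHABET = 4      # the DNA bases {A, T, C, G}
N_BASES = 10**9   # the conservative base count used above

# log10 of the number of possible ordered strings, 4^N.
# The number itself is far too large to compute directly.
log10_phase_space = N_BASES * math.log10(ALPHABET)
print(f"4^(10^9) is about 10^{log10_phase_space:.4g}")
```

The exponent itself is about 6 x 10^8, i.e. the phase space is roughly 10^600,000,000 strings, of which an ordered string is a single member.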

Remembering that an increase in improbability is also a decrease in entropy, we may now do some entropy accounting. Recall there are just 4 nucleic acid bases making up the alphabet of DNA: {A, T, C, G}. These molecules are also handed, but in life only the D form (right-handed) is used, even though, roughly speaking, both forms are equally likely and chemically equivalent. So, just like our 3 dice, a string of 3 DNA bases has a total number of possibilities or microstates of 4^3 x 2^3 = 8^3 = 512. Three DNA bases form a codon, which codes for one amino acid in a protein; being ordered, it is a macrostate with only one microstate and an improbability of 512.
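The 512 figure can again be verified by enumeration, treating each position as a (base, handedness) pair as the argument above does:

```python
from itertools import product

BASES = "ATCG"
HANDS = "DL"  # two mirror forms per base, per the counting above

# Every ordered codon: 3 positions, each a (base, handedness) pair.
codons = list(product(product(BASES, HANDS), repeat=3))
print(len(codons))  # 4^3 * 2^3 = 8^3 = 512
```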

The calculation of the improbability (negative entropy) of any protein is now straightforward, but some would ask: is it really improbable?
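On the scheme above, a gene specifying n amino acids has W = 512 possibilities per codon, so the entropy decrease is S = -k ln(512^n). A sketch of that calculation, using an illustrative protein length of 300 residues (an assumption for the example, not a figure from the text):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def coding_entropy_deficit(n_amino_acids):
    """Entropy decrease (J/K) for a gene specifying n_amino_acids,
    on the scheme above: 512 possibilities per codon, one realised,
    so S = -k * ln(512^n) = -k * n * ln(512)."""
    return -K_B * n_amino_acids * math.log(512)

print(coding_entropy_deficit(300))  # about -2.6e-20 J/K
```

The thermodynamic magnitude is tiny because k is tiny; the point of the argument is the improbability 512^300, not the joules per kelvin.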

Have a cracking day..

Tuesday, March 15, 2011

7.3 Evolution and Entropy

Now the question is simply this: can evolutionary processes on earth afford to pay their entropy debts?

I have shown (refer ch 6) that for a logically improbable state, the negative entropy given by the Boltzmann-Gibbs equation must be accounted for by the performance of a sufficient number of trials to make that outcome possible. This is the real implication of that equation for the second law, where the total number of possibilities (the logical phase space) is so large as to preclude any average increase in improbability, complexity or order (for their equivalence refer ch 5). So for a box of just 3 dice (the system), of the 216 possible outcomes only 6 are triples, 90 contain exactly one pair, and 120 are composed of three different numbers. The average behaviour is a mathematical certainty: most outcomes will be three different numbers. However we may also see that for the entropy cost of about 36 shakes we may get an unlikely triple. That is, over many trials (random shakes) even 3 dice will obey the second law. Now consider 100 dice in the box: there are only 6 all-same outcomes out of 6^100 possibilities. With just 10^80 atoms in the known universe you would be hard pressed to stray far from the average of 100/6 (about 17) of each number, let alone get 100 of any one number. The implication is clear: if any system with a very large number of logical possibilities (phase space) cannot perform the trials necessary to make an improbable outcome possible (pay the entropy debt), we have an instance of a significant drop in entropy without the requisite increase in the surroundings, and that is a violation of the second law! Meaning if you found the box with 100 sixes face up, you would be correct in concluding it did not occur by any random process.
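Both claims, the exact census of the 3-dice phase space and the clustering of 100 dice near the average, can be checked directly. A small sketch:

```python
import random
from itertools import product

# Exact census of the 216 ordered outcomes of 3 dice.
triples = pairs = distinct = 0
for o in product(range(1, 7), repeat=3):
    kinds = len(set(o))
    if kinds == 1:
        triples += 1    # all three the same
    elif kinds == 2:
        pairs += 1      # exactly one pair
    else:
        distinct += 1   # three different numbers
print(triples, pairs, distinct)  # 6 90 120

# One random shake of 100 dice: the face counts hug the average of ~17.
random.seed(1)  # fixed seed so the example is repeatable
shake = [random.randint(1, 6) for _ in range(100)]
counts = [shake.count(face) for face in range(1, 7)]
print(counts)  # each count near 100/6, nowhere near 100 of any face
```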

DNA is the complete specification for the growth, reproduction, defence and repair of each life form. It includes specific codes for all its proteins, which are the main functional molecules of the cell. Its instructions are read and acted on by a variety of complex machines which are themselves made of protein. The driver of evolutionary change is random mutation of the DNA at the moment of reproduction, facilitating inheritance of new traits. It is well known that natural processes may produce most of the amino acids, the building blocks of proteins; what we need to know now is the cost in Boltzmann entropy of these logical states. The DNA contains semantic information (refer ch 4) by virtue of the ordering of its bases, and as such it has a measurable Boltzmann absolute entropy (logical improbability).

Have a specially nice day..

Thursday, March 10, 2011

7.2 Evolution and Entropy


Some authoritative quotes..
  1. The Second Law is inviolable and inescapable..
    "The law that entropy always increases holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell's equations — then so much the worse for Maxwell's equations. If it is found to be contradicted by observation — well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation." — Sir Arthur Stanley Eddington, The Nature of the Physical World (1927)

  2. Life must obey the Second Law..
    Living organisms are biological machines which are also vast assemblies (or states) of matter, and as such the second law applies to them. The processes by which they function, reproduce and grow create order, an increase in complexity. That means a local reduction in entropy, just like the cooling of a freezer box in the middle of a hot room. The second law requires the existence of an imaginary boundary drawn around the process and enough of its surroundings to include everything affected by it. Within that boundary there will be found an overall increase in entropy. Experimental measurements show that the boundary need only approximate an isolated system for the calculation to work as predicted.
  3. Evolution must obey the second law.. 
    Life processes like growth generate more disorder in their surroundings than the order they create, and the evolution of life itself does the same thing (paraphrased from Peter Atkins, The Second Law). Evolution is the explanation for the general increase in complexity of the biosphere, but an increase in complexity is also a local drop in entropy. By the second law the same process must, in a dependent manner, account for that drop by creating a greater increase in entropy in its surroundings. If a chosen boundary for strict application of the law does not meet the entropy-increase requirement, it may simply be enlarged until it does, even to include the whole universe if necessary.

If life does not come from somewhere else, then the processes which created it here must also pay for it here. Even if the system boundary is extended to include the whole universe (so the second law certainly applies), it is invalid to balance negative entropy on earth with positive entropy from an unrelated process somewhere else. The second law must therefore apply to the earth-sun system, so arguing that the second law does not apply because the earth is not an isolated system is invalid.

Have a nice day..