Monday, February 21, 2011

7.1 EVOLUTION and ENTROPY


The fundamental postulate of evolution by natural selection is design without a designer, which, from my previous discussion of design, means a natural origin for semantic (or signal) information, since every 'design' is specified by semantic information. Let's set aside for a moment the intuitive improbability of such an occurrence and flesh out the 'algorithm' by which this process is supposed to work.

Beginning with a hypothetical replicating molecule (about 600 atoms, according to Prof. Shapiro), we expect variations to be captured at the moment of duplication, somewhat equivalent to copying errors. In molecular terms, some of those variations may affect the quality or speed of replication, and the natural outcome is a general increase in the more successful replicators. In succeeding generations this process steadily accumulates whatever changes happen to improve reproductive success. As populations of these replicators grow in number, they tend to buffer against the purely chance failure of the occasional 'top dog', the best replicator, such as being eaten by some competing molecule. It is also logical that any defensive or offensive improvement which further reduces the chance of being 'eaten' contributes to success without necessarily speeding up or otherwise improving reproduction itself, and so becomes a 'selectable' trait. It is then simply proposed that there is no barrier to the continuation of this process through any degree of complexity imaginable. And why not? It is simple, perfectly understandable and even elegant, like the very best discoveries. That last point is the central issue about which this entire blog is written: are there really no barriers to ascending complexity using this algorithm?
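
To make the 'algorithm' concrete, here is a minimal sketch in Python of the selection loop just described. Everything in it (the mutation rate, the copying-speed numbers, the population cap) is an illustrative assumption, not a model of any real chemistry; it only shows the bare logic of copy-with-errors plus differential reproductive success.

```python
import random

# A 'replicator' is reduced to a single number: its copying speed.
# Mutation occasionally nudges that speed, mimicking copying errors
# captured at the moment of duplication.
MUTATION_RATE = 0.1      # assumed chance of a copying error
POPULATION_CAP = 1000    # assumed resource limit of the 'soup'

def replicate(speed):
    """Copy a replicator, sometimes with a small random variation."""
    if random.random() < MUTATION_RATE:
        return max(0.1, speed + random.gauss(0, 0.05))
    return speed

population = [1.0] * 10  # start with ten identical slow replicators

for generation in range(100):
    offspring = []
    for speed in population:
        # Faster replicators leave more copies on average.
        copies = int(speed) + (1 if random.random() < speed % 1 else 0)
        offspring.extend(replicate(speed) for _ in range(copies))
    # Random culling back to the resource limit: pure chance can
    # still eliminate the occasional 'top dog'.
    random.shuffle(offspring)
    population = offspring[:POPULATION_CAP] or population

print(f"mean copying speed after 100 generations: "
      f"{sum(population) / len(population):.3f}")
```

Run repeatedly, the mean copying speed drifts upward: variation plus selection accumulates whatever changes happen to improve reproduction, exactly as the paragraph above proposes.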

The earth is where this incredibly improbable event is supposed to have taken place, over 4.5 billion years according to evolutionary theory. The earth is not an isolated system as required by the second law. However, notwithstanding the hypothetical raining down of proteins from outer space (panspermia), by that law the processes that produced them here must still pay the entropy cost for them.

Panspermia was advanced by Fred Hoyle and Chandra Wickramasinghe as a consequence of their clear understanding that evolution alone could not account for all the diversity of life. See [http://www.panspermia.org/chandra.htm]

Have a nice day..

6.3 Entropy and Probability

Now consider our three dice in a box in a child's playroom. The room is the child's universe, but definitely not an isolated system as required by the second law; nevertheless, the probability outcomes for the box serve to illustrate the thermodynamic probability term 'W' in the Boltzmann-Gibbs equation for absolute entropy. Note that we are only interested in the entropy change associated with the process of shaking the dice. That change is the difference between the final and the initial absolute entropies of the two logical states. Note also that the Boltzmann-Gibbs entropy depends only on the state of the system, not on how it got that way. For imperfect universes like our playroom, so long as the 'leaks' are random and small they will not affect the randomness of the shaking process. Non-random energy would be like the child opening the box and deliberately ordering the dice just to trick you, which is not allowed even in school, let alone in science.
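
As a toy illustration of that state-function idea, the sketch below computes the Boltzmann-Gibbs entropy s = k log(W) for two dice states and takes the difference. The choice of {666} and {135} as the two states is just our running example; the point is that only the W of each end state matters, not the shaking in between.

```python
import math

k = 1.380649e-23  # Boltzmann constant in J/K

def entropy(W):
    """Boltzmann-Gibbs absolute entropy of a state with W microstates."""
    return k * math.log(W)

# Initial state {666}: only one arrangement, so W = 1.
# Final state {135}: three distinct faces, so W = 6 orderings.
delta_s = entropy(6) - entropy(1)
print(f"entropy change of the shake: {delta_s:.3e} J/K")  # k*ln(6) > 0
```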

At an atomic level we observe that heat always travels from hot to cold by any mechanism available: conduction, convection and radiation. Atoms are very small particles indeed, and this is why their random jumbling behaviour looks smooth and steady to anything as large as a thermometer. We observe that atoms in close proximity share their vibrational (heat) energy with neighbours that have less. The energy spreads over more atoms and vastly increases the number of ways (microstates) available for it to occupy. So W increases and entropy increases, but the real driver of the second law is probability. The effect is to dissipate energy. The heat equation for entropy discovered by Rudolf Clausius simply extends the range of possibility for calculating entropy change using temperature, rather than having to count and identify atoms!
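
A standard way to count those microstates (a hedged sketch, not anything from the original post) is the two-block toy model: q indistinguishable packets of energy shared among N atoms can be arranged in C(q + N - 1, q) ways. The snippet below, with arbitrarily chosen block sizes, shows that spreading energy from a small hot block into a large cold block multiplies the combined W enormously, which is exactly the probabilistic 'push' described above.

```python
from math import comb

def W(q, n):
    """Number of ways q energy packets can be arranged among n atoms."""
    return comb(q + n - 1, q)

HOT_ATOMS, COLD_ATOMS = 10, 100   # assumed toy block sizes
TOTAL_Q = 50                      # assumed total energy packets

# Find the split of energy that maximises the combined number of ways.
best_split = max(
    range(TOTAL_Q + 1),
    key=lambda q_hot: W(q_hot, HOT_ATOMS) * W(TOTAL_Q - q_hot, COLD_ATOMS))

all_in_hot = W(TOTAL_Q, HOT_ATOMS) * W(0, COLD_ATOMS)
spread_out = W(best_split, HOT_ATOMS) * W(TOTAL_Q - best_split, COLD_ATOMS)
print(f"most probable split leaves the hot block only {best_split} packets")
print(f"W(spread out) / W(all in hot block) = {spread_out / all_in_hot:.3g}")
```

The ratio comes out astronomically large even for this tiny system; for real collections of atoms the most probable (spread-out) state is so overwhelmingly favoured that heat flowing from cold to hot is never observed.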

Notice that the tendency of heat to move from hot to cold does not require an 'isolated' system in order to proceed. If we put a source of heat at one end of a metal bar, that is a non-random input which will maintain an out-of-equilibrium condition, but at no point in the system will we observe heat moving of itself from cold to hot. Putting the heat source and bar in a large insulated box, we can measure the increase in entropy of the air surrounding the bar, accounting for the entropy debt represented by the non-uniform distribution of heat in the bar, i.e. the entropy cost of maintaining that improbable state.
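
The bar itself can be sketched as a simple one-dimensional conduction model (an illustration under assumed parameters, not a claim about any particular metal). Holding one end hot, the temperatures settle into a steady non-uniform gradient, and at every interior point the flow is strictly down the gradient, never from cold to hot.

```python
# Explicit finite-difference sketch of heat conduction along a bar.
N = 20            # number of segments along the bar (assumed)
ALPHA = 0.1       # relaxation coefficient per step; must be < 0.5 for stability
HOT, COLD = 100.0, 20.0

temps = [COLD] * N
temps[0] = HOT    # the non-random input: one end held hot

for step in range(20000):
    new = temps[:]
    for i in range(1, N - 1):
        # Each interior segment relaxes toward the mean of its neighbours.
        new[i] = temps[i] + ALPHA * (temps[i - 1] - 2 * temps[i] + temps[i + 1])
    new[0] = HOT  # the heat source maintains the out-of-equilibrium condition
    temps = new

# Every interior flow runs down the temperature gradient: hot to cold only.
flows = [temps[i] - temps[i + 1] for i in range(N - 1)]
print("all flows hot-to-cold:", all(f >= 0 for f in flows))
print("steady-state profile:", [round(t, 1) for t in temps])
```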

Have a specially good day..

6.2 Entropy and Probability


With just three dice in a box you will readily appreciate that at any time a less probable pattern may occur. It is not difficult to get {222} (W = 1 microstate). It is 1 out of 216 possibilities, so we may say that if we continue to shake the box randomly we will expect to see {222} once every 216 shakes on average. That is the price in entropy required to pay for the improbable {222}. Indeed, if we were to count the shakes between successive occurrences of {222} and divide the total number of shakes by the number of occurrences, we would observe the result approach its asymptote of 216 to any degree of accuracy we like. So with just 3 particles we can see that the Boltzmann-Gibbs equation means an increase in improbability is a decrease in entropy that must be accounted for by the second law. Also, the second law applies only to a large number of particle events, not just a large number of particles, and that is the basis of the probability which drives it.
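
That 216-shake average is easy to check by simulation. Below is a small Monte Carlo sketch (the shake count is an arbitrary choice) that shakes three dice repeatedly and measures the average gap between occurrences of {222}; the estimate converges on 216 as the number of shakes grows.

```python
import random

SHAKES = 1_000_000  # arbitrary; more shakes tighten the estimate

occurrences = 0
for _ in range(SHAKES):
    dice = (random.randint(1, 6), random.randint(1, 6), random.randint(1, 6))
    if dice == (2, 2, 2):
        occurrences += 1

print(f"saw {{222}} {occurrences} times in {SHAKES} shakes")
print(f"average shakes per occurrence: {SHAKES / occurrences:.1f}  (expected 216)")
```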

The second law has to be formally constrained to an isolated system of particles and a sufficiently large number of particle events to follow a predicted average behaviour. Under such circumstances the law states that the entropy (disorder) of the system must increase or at best remain constant (equilibrium). This predictability is about as near to an 'absolute' as you can get in science, and yet it is driven by nothing more than the normal probability distribution, with its central most probable outcome and its tail of ever less probable outcomes reaching all the way to the impossible.

So what is an 'isolated system'? It sounds simple enough: a system so well insulated and sealed that the inside is not affected by the outside. The only problem with this idea is that it is purely hypothetical. We may be able to approximate it, but in truth the only one we know of is the entire universe! The ultimate meaning of the second law is that the entropy of the universe must increase with time.

Have a particularly nice day..

6.1 ENTROPY and PROBABILITY


This section could also be given the mysterious title of “The Meaning of W”, i.e. the W from the Boltzmann-Gibbs equation [s = k log(W)], where log is the natural logarithm. While we're at it, let's also point out that k is a constant called the Boltzmann constant. Its value simply fixes the units: by giving entropy (s) the units of energy per degree (joules per kelvin, J/K), it allows the derivation of the Clausius equation for entropy change due to heat transfer (a special case). Note, however, that W (the thermodynamic probability) is just a number; it has no units.

Now let's play dice. Take 3 dice: any pattern you throw will be one of a total of 6^3 = 216, and these are the microstates of the system. The probability of any specific pattern is 1/216; call it P3. It is essential that all microstates have the same probability. Now, a macrostate is just a group of microstates with a common characteristic. We could say the microstates are 'natural', but we choose the macrostate, which is therefore an intelligent choice. There are more ways to get some macrostates than others because they contain more microstates. For example, there is only one way of getting {666} (it does not matter which die is which), but look at {665}: you can also get {656} or {566}, three ways to do it, and we distinguish those three ways by noting which die has the 5. Now take {315}, essentially three different numbers: there are also {351}, {135}, {153}, {531} and {513}, i.e. 6 ways of getting any such group. Our choice of macrostate meant we ignored the order in which the numbers come but required us to identify the dice. We could instead have chosen a simple sum of dots as the macrostate.
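
These counts are small enough to verify by brute force. The sketch below (a minimal illustration, taking a macrostate to be the sorted multiset of faces, which is one reasonable reading of the text) enumerates all 216 microstates and reports W for the examples above.

```python
from itertools import product
from collections import Counter

# Every ordered throw of three dice is one microstate: 6^3 = 216 in all.
microstates = list(product(range(1, 7), repeat=3))
assert len(microstates) == 216

# Macrostate = the multiset of faces, ignoring which die shows which.
macrostates = Counter(tuple(sorted(throw)) for throw in microstates)

for macro in [(6, 6, 6), (5, 6, 6), (1, 3, 5)]:
    print(f"W{macro} = {macrostates[macro]}")
# -> W(6, 6, 6) = 1, W(5, 6, 6) = 3, W(1, 3, 5) = 6
```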

The probability of {351}, written P(351), is 6 x 1/216 = 6P3, and P(665) = 3 x 1/216 = 3P3. So {666} has only 1 microstate, {665} has 3 and {135} has 6. The meaning of 'W' may now be stated as the number of microstates in any given macrostate. If we have a box containing the three dice starting at {666} and shake it, the next pattern is 3 times more likely to be {665}, 6 times more likely to be {135}, and, taking n to be any number, 16 times more likely to be {22n} (a family of six macrostates: five with W = 3 plus {222} with W = 1), etc. There is a powerful tendency for the arrangement to move towards a macrostate with a larger number of microstates, because that is more probable. Hence W will tend to increase, and so will entropy (s), up to the point where W is at its maximum (in this case 6), and it will tend to stay there, which is called equilibrium.
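
Continuing the brute-force check from above (again just an illustrative sketch), this snippet confirms the relative likelihoods quoted here, including the total for the {22n} family.

```python
from itertools import product
from collections import Counter

macrostates = Counter(tuple(sorted(t)) for t in product(range(1, 7), repeat=3))

# Relative likelihood of each target on the next shake, versus {666}.
base = macrostates[(6, 6, 6)]                                       # W = 1
print("W{665} / W{666} =", macrostates[(5, 6, 6)] / base)           # 3.0
print("W{135} / W{666} =", macrostates[(1, 3, 5)] / base)           # 6.0

# The {22n} family: all macrostates containing at least two 2s.
family = sum(w for macro, w in macrostates.items() if macro.count(2) >= 2)
print("W{22n family} / W{666} =", family / base)                    # 16.0

# Maximum W belongs to macrostates with three distinct faces.
print("max W =", max(macrostates.values()))                         # 6
```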

Have a really nice day..

Thursday, February 17, 2011

5.4 Complexity

Boltzmann explained the link between entropy and the very powerful axioms of probability: an increase in entropy is an increase in probability (called thermodynamic probability), i.e. the chance of being in that state as opposed to all other possibilities. Hence, if (~ means 'directly related to'):
Entropy ~ Probability
Now the opposite of probability is its inverse, 1/Probability = Improbability, so we may say that when a state of matter moves to a more improbable state, this represents a drop in entropy.
But entropy is also a measure of 'disorder', or a loss of complexity, so we may write:
Complexity ~ - Entropy (negative entropy)
And so by inference..
Complexity ~ Improbability
But the above seems to require a function whose negation equals the function of the inverse, and (- x) ~ (1/x) is not true mathematically! There is, however, a mathematical function with exactly that property: the logarithm, since -log(x) = log(1/x). Hence the equation for entropy [s = k log W], and for a change in entropy from s1 to s2 we have
s2 - s1 = k log(W2) - k log(W1) = k log(W2/W1)
So 's' is a measure of complexity and is related to probability by the logarithm of the thermodynamic probability W. Hence the second law is expressed in some engineering texts as:
“The second law of thermodynamics means that any system of particles, left to itself, will tend to move to its most probable state.”
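
As a quick numerical sanity check of that logarithm property (with purely illustrative numbers), the snippet below verifies that -log(x) = log(1/x) and that the entropy-change formula above follows directly from s = k log(W).

```python
import math

# The property that lets 'improbability' appear as negative entropy:
# -log(P) = log(1/P) for any probability P.
P = 1 / 216                  # probability of one specific dice pattern
assert math.isclose(-math.log(P), math.log(1 / P))

# The change formula telescopes exactly as quoted above:
k = 1.380649e-23             # Boltzmann constant, J/K
W1, W2 = 6, 1                # moving to the less probable macrostate
s_change = k * math.log(W2) - k * math.log(W1)
assert math.isclose(s_change, k * math.log(W2 / W1))
print(f"s2 - s1 = k*log(W2/W1) = {s_change:.3e} J/K (negative, as expected)")
```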

'Tendency' means it is the average outcome of a large number of events, usually referred to as a 'statistically significant' number. This is true for everything made of atoms! It includes chemical reactions as well as physical pressures and temperatures, and even nuclear reactions. It encompasses all imaginable states of matter regardless of size, from electrons to elephants and from giardia to galaxies.

So any proposition or theory that claims to result in an average increase in complexity, like evolution by natural selection, can be tested by the above for compliance with the second law. If it fails that test, it is falsified regardless of the amount of data interpreted in its favour; that interpretation must be incorrect.

Have a nice day..