Tuesday, April 12, 2011

9.2 CAN EVOLUTION BE FALSIFIED

6.   Semantic information is a non-repeating, non-random, ordered set, and as such for any given length it is a macrostate with only one microstate, i.e. the least likely of any arrangement. DNA is semantic information, and in relation to the specification of protein it represents a decrease in Boltzmann entropy by virtue of its improbability. The size of the entropy drop is not the point; the point is the second law's requirement that the process producing it balance the entropy books.
7.   Myoglobin was the first protein to be sequenced and has 153 (or 154) amino acids, each coded by a codon of 3 DNA bases. The effective improbability of a codon is 162, counting chirality (handedness) and codon redundancy. By the second law the evolutionary system that reached that state must also pay for it, which means the capacity to execute 162^153 = 10^338 random changes or mutations if selection and reproduction do not affect the improbability(1).
8.   Since the total event capacity of the universe is only 10^100 atom-milliseconds, it is clearly insufficient to cover the entropy debt for the myoglobin protein molecule. In truth the opportunities for mutation events capable of evolving a state of matter like myoglobin are limited to the surfaces of planets like Earth. To falsify the theory of evolution it is necessary to show that the evolutionary process cannot pay the entropy debt for proteins like myoglobin.
9.   The entropy argument is saying that the number of possibilities (the improbability) is by far too big for random mutations to string together the required series of complementary beneficial changes. Let's look at another quote from Wikipedia on what it calls the fitness function, concerning an attempt to 'evolve' an optimum delivery-truck route.

    “Evolutionary optimization techniques are particularly useful in situations in which it is easy to determine the quality of a single solution, but hard to go through all possible solutions one by one (it is easy to determine the driving time for a particular route of the delivery truck, but it is almost impossible to check all possible routes once the number of destinations grows to more than a handful).”

    (1) After correction (Feb 2016) the simple model I proposed in Dec 2015 has demonstrated that selection and replication reduce the improbability. [see Dr J verifies]
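
    To make the footnote concrete, here is a minimal sketch (my own illustrative model, not necessarily the Dec 2015 one referred to above) of how replication plus selection changes the trial count. The 20-letter alphabet, the 10-letter target and the match-count fitness function are all illustrative assumptions: a weasel-style cumulative-selection search typically reaches the target in a few thousand mutant copies, whereas a purely random search needs on the order of 20^10 trials.

```python
import random

ALPHABET = "ACDEFGHIKLMNPQRSTVWY"   # 20 amino-acid letters (illustrative)
TARGET = "MGLSDGEWQL"               # hypothetical 10-residue target (illustrative only)
POP_SIZE = 100
MUTATION_RATE = 0.05

def score(seq):
    """Number of positions matching the target (the assumed 'fitness function')."""
    return sum(a == b for a, b in zip(seq, TARGET))

def mutate(seq):
    """Copy a sequence with occasional random copying errors."""
    return "".join(random.choice(ALPHABET) if random.random() < MUTATION_RATE else c
                   for c in seq)

random.seed(1)
# Start from one random replicator and let replication + selection act.
population = ["".join(random.choice(ALPHABET) for _ in range(len(TARGET)))] * POP_SIZE
generations = 0
while max(score(s) for s in population) < len(TARGET):
    best = max(population, key=score)                       # selection keeps the best replicator
    population = [mutate(best) for _ in range(POP_SIZE)]    # replication with copying errors
    generations += 1

trials = generations * POP_SIZE
print(f"Cumulative selection hit the target after about {trials:,} mutant copies.")
print(f"A purely random search would need on the order of 20**{len(TARGET)} "
      f"= {20**len(TARGET):,} trials on average.")
```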

    I do hope you have a nice day..
    Mike Bellamy BE (Aero) UNSW 1972

    9.1 CAN EVOLUTION BE FALSIFIED


    1. The second law of thermodynamics is based on the observable fact that everything made of atoms tends towards its most probable state. It is the basis of the relentless decay we see all around us. What it actually says is that in any large group of atoms (a system) isolated from outside interference, the quantitative measure of that decay, called entropy, will increase.
    2. Boltzmann and Gibbs discovered an expression for entropy which reveals that probability is the real driving force behind the second law: [ s = k log W, where W is called the thermodynamic probability, the number of microstates in any given macrostate, and k is the Boltzmann constant ]. There is another equation for 's' based on temperature and heat flow, but it is a special case and, where both are calculable, the two are equivalent. There is only one entropy.
    3. The second law does not say that entropy cannot decrease; it does in many cases, like the growth of living things, the cooling inside a refrigerator or the chance deal of a royal flush. However, all such decreases have to be paid for by a bigger increase in the surroundings. The last example is the most significant in this context because it is an increase in logical improbability: as an ordered set the royal flush is a macrostate with only one microstate, the least likely of any arrangement of 7 playing cards.
    4. We know the minimum (entropy) price of getting a specific 7-card royal flush from a well-shuffled deck is the work of shuffling and dealing 52C7 (about 133.8 million) hands, the long-term average number of deals needed to get that result. Note it is irrelevant that you might get the royal flush on the very first deal; it is the act of dealing repeatedly that sets the required entropy cost. The second law has nothing to say about single events, nor do they matter.
    5. We do not need to know the actual measure of the entropy drop of the 7-card royal flush itself. However, shuffling and dealing 133.8 million hands generates quite a lot of heat, and that heat is the measure of the entropy cost required of the system to get that outcome. The smoky saloon is not a strictly isolated system, but, untoward card tricks aside, it is isolated enough from card-changing events for the argument to hold.
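
    The figures in points 4 and 5 are easy to check; the sketch below computes 52C7 with Python's math.comb and runs a small Monte Carlo (using a 2-card hand so it finishes quickly; the 7-card case is identical in form) to show the long-run average number of deals per occurrence approaching C(n, k).

```python
import math
import random

# Number of distinct 7-card hands from a 52-card deck.
print(f"C(52, 7) = {math.comb(52, 7):,}")   # 133,784,560

# Monte Carlo check of the 'average deals per occurrence' idea,
# using a 2-card hand so the run finishes quickly.
deck = list(range(52))
target = {0, 1}                    # one specific 2-card hand
expected = math.comb(52, 2)        # 1,326

random.seed(0)
hits, deals = 0, 0
while hits < 500:                  # keep dealing until the target hand has appeared 500 times
    deals += 1
    if set(random.sample(deck, 2)) == target:
        hits += 1

print(f"Observed average deals per occurrence: {deals / hits:.0f}  (theory: {expected})")
```
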
    Richard Feynman said “If you cannot explain your theory to a person waiting at a bus stop.. you don't understand your theory.” I am trying.. see next page.
    Have a very nice day..

    8.2 A Truth You Might Not Want to Know


    Proteins in all their myriad forms are the semantic equivalent of tools, machine parts and machines themselves. In engineering terms this is just the bio form of technology. Functional proteins range from just 11 amino acids (casein) to several hundred. Take myoglobin, an oxygen-storage molecule residing in the muscle tissue of mammals. The logical improbability of its DNA (codon set) is 1e338. From 7.2 we know the second law applies to the evolutionary process on earth, but let's go right over the top and consider the whole universe: can it pay the entropy debt of myoglobin? The universe is estimated to contain 1e80 atoms and to be 14 billion years old, which is about 1e20 milliseconds. Now let's calculate the total event capacity of the universe: 1e80 x 1e20 = 1e100 (atom-milliseconds). That figure is the maximum event capacity of the entire universe, its ability to do random trials, or, from the foregoing, its entropy bank balance. It means that any state of matter with a logical improbability exceeding that figure cannot be the result of a series of random events without implying a net decrease in entropy, contravening the second law. Selection and reproduction can bias the outcome to reduce the total improbability of the outcome, and this must be accounted for in any model.
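
    The arithmetic above can be checked directly; here is a minimal sketch using the figures as quoted (10^80 atoms, 10^20 milliseconds, and the per-codon figure of 162 derived in 8.1):

```python
import math

atoms_in_universe = 1e80            # estimate quoted above
age_in_ms = 1e20                    # ~14 billion years in milliseconds (order of magnitude)
event_capacity_exp = math.log10(atoms_in_universe) + math.log10(age_in_ms)
print(f"Event capacity of the universe: ~10^{event_capacity_exp:.0f} atom-milliseconds")

per_codon = 162                     # improbability per codon (figure from 8.1)
codons = 153                        # myoglobin amino-acid count
improbability_exp = codons * math.log10(per_codon)
print(f"Myoglobin codon-set improbability: ~10^{improbability_exp:.0f}")

print(f"Trials needed exceed capacity by a factor of ~10^{improbability_exp - event_capacity_exp:.0f}")
```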

    Myoglobin exists in a number of slightly variant forms in different creatures, some differing by just a single amino acid. This has been described as a measure of the evolutionary distance between these types. That is misleading, because it ignores functional differences between variant species which may require different myoglobins, and if so this would not change the logical improbability of any one molecule. They are all highly improbable molecules, and the question is whether they can be the result of random mutation with selection. The only alternative is of course a non-random process: a creation event!

    The result is partially acknowledged in the Wikipedia topic 'Evolutionary Algorithm', i.e. in the actual testing of the algorithm, viz.:

    “Techniques from evolutionary algorithms applied to the modeling of biological evolution are generally limited to explorations of microevolutionary processes, however some computer simulations, such as Tierra and Avida, attempt to model macroevolutionary dynamics.”

    You may note that the goal of creating semantic information by random means (macroevolution), modelled as mutation plus natural selection, has not been achieved. It will never be achieved if it can be shown to be a violation of the second law.

    Have a really nice day..

    8.1 A Truth You Might Not Want to Know


    Recall from 7.4 above that a codon (triplet) of DNA has a logical improbability of 512 (i.e. there are 512 possible arrangements using the DNA alphabet {A, T, C, G} and the {L or R} handedness, or chirality, of each base). Since DNA is semantic information, each codon is also an ordered string, meaning it is a macrostate with only one microstate (W = 1). From my previous pages you should appreciate that the entropy cost of a single codon of DNA is the requirement for enough random events to give that specific sequence repeatedly when averaged over a very large number of events. The answer of course is 512; that is not the entropy change itself but the expenditure required to pay the entropy debt for that improbable state by pure random change.

    Protein coding only accounts for about 5% of all the DNA and the rest is 'junk' according to evolutionists, except we now know the only junk was that assumption! As far as coding for the amino acids that build protein goes, there is a lot of redundancy built into the system, i.e. more than one codon will give the same amino acid, which is a smart way of reducing the genome's exposure to some random mutations. Since I am going to be concentrating on protein I must account for that redundancy, so the improbability of a correct codon is less than 512. There are just 20 'L' form (left-handed) amino acids; all the remaining combinations are redundant. Some amino acids have 6 codons which all give the same result and some only one. So the actual improbability (number of possibilities) of a codon producing a specific amino acid is on average (4^3)/(3.15 average redundancy) x 2^3 ≈ 162 (it is not exactly 160 because of the 3 non-coding stop codons).

    DNA is the vehicle of heritable traits, so to properly address the evolutionary algorithm the analysis must be done at that level rather than at the level of the protein itself. The chirality or handedness of the DNA bases (all {R form}) cannot be ignored, even though random mutations do not affect chirality. The protein machinery of the cell, which exclusively produces {R form} DNA bases, is by that theory also a product of the evolutionary algorithm, so the entropy cost of that aspect of the end state must be included in the total calculation.

    So for myoglobin, with 153 amino acids, the improbability at the DNA codon level is 162^153 ≈ 10^338 (written 1e338). The question is: can the evolutionary process account for this?
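
    A quick sketch of the arithmetic in this post, taking the figures exactly as stated above (4 bases, 2 chiralities, average redundancy 3.15, 153 amino acids):

```python
import math

bases = 4                      # DNA alphabet {A, T, C, G}
chirality = 2                  # {L, R} handedness per base
print((bases * chirality) ** 3)                 # 512 raw arrangements per codon

avg_redundancy = 3.15          # average codons per amino acid, figure quoted above
effective = (bases ** 3 / avg_redundancy) * chirality ** 3
print(f"effective possibilities per codon: {effective:.1f}")   # ~162, the figure used above

amino_acids = 153              # myoglobin
print(f"162**153 is about 10^{amino_acids * math.log10(162):.0f}")  # ~10^338
```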

    I do hope you have a very nice day..

    Sunday, April 3, 2011

    7.5 Evolution and Entropy


    I had a long discussion with a mathematician who turned out to be a NASA rocket scientist! He said the outcome of any random event is only improbable if you are trying to specify it beforehand. If you shake 100 dice in a box you are certain to get something, and whatever it is would have been just as impossible to predict. True, but what are the implications of this for evolution?
    1. You don't care what outcome you get
    2. Natural selection avoids the rules of cumulative probability over a long series of events by biasing less probable outcomes to produce a result not attainable without it
    3. It does not matter how many possibilities there are.
    Which leads to the idea that any change toward preservation, however small, will through population demographics tend to get preserved, and that is all you need for evolution. Well, not quite..

    The first implies that during the course of design (or evolution) there is no point where prior specification will become necessary, demonstrating ignorance of technology. The second implies you can make the impossible possible, demonstrating ignorance of probability. The third implies the second law has nothing to say about the process, demonstrating ignorance of physics, which is all rather disappointing for a rocket scientist I would think!

    To take the first: all technology (machines) is based on cooperating parts with complex interfaces, implying constraints on the interfaces of all parts of the machine. The constraints get tighter as the design progresses. This is just as true of biotechnology as of the man-made kind, even mouse traps.

    As for the second, it is a basic axiom of probability that for two or more independent events the probability of a particular combined outcome is the product of their individual probabilities. It is not affected by the time taken, the order of the events, or anything you do after the event.

    On the third, if it is made of atoms the second law applies, no exceptions (refer 7.2). It is universally acknowledged that any process resulting in an entropy drop is like a debt which must be paid for with interest. It is scientifically fraudulent to admit the staggering improbability of biotechnology and then fail to account for the entropy debt, or to brush it off with ignorant statements about floods of energy.

    So my task here is really about whether or not evolution passes this test or, heaven forbid, is actually falsified by it!

    Have a cuppa.. or something stronger.. and a very nice day..

    Sunday, March 27, 2011

    7.4 Evolution and Entropy


    Before we look at the entropy cost of a string of DNA or its protein we need to make an important observation about our accounting of logical improbability. When we examine the 3 dice in the box, the macrostate {123} taken without order has 6 microstates, found by noting the identity of the individual dice. The Boltzmann-Gibbs thermodynamic probability 'W' likewise uses the identification of each particle in a system to account for all possible microstates. That is why W is such a huge number for any real system (i.e. for each macrostate every particle is swapped with every other particle, creating a massive logical phase space). Effectively the value of 'W' for the {123} macrostate is 6 whereas for {666} it is just 1. But if order is significant, the macrostate {123}, like {666}, has only one microstate and hence the least possible probability and the smallest 'W': just one of the 216 possibilities in this very simple case.

    Now every string of codes that is semantic information has three essential properties. They are:
    1. Non repeating
    2. Non random
    3. Ordered
    Notwithstanding the fact that number 2 would appear to preclude any random process from assembling semantic information, what we are really interested in is number 3. DNA, being semantic information, is ordered. That means any meaningful string of DNA (such as the code for a specific protein) has the maximum improbability possible for any string of the same length (W = 1). It also means a string of DNA represents one state out of the total logical phase space available when we identify each particle, in this case each nucleic acid base. Allowing for redundancy, we may conservatively speak of a billion, or 10^9, uniquely identified bases in the human genome!

    Remembering that an increase in improbability is also a decrease in entropy, we may now do some entropy accounting. Recall there are just 4 nucleic acid bases making up the alphabet of DNA: {A, T, C, G}. These molecules are also handed, but in life only the R form (right-handed) is used, even though roughly speaking both forms are equally likely and chemically equivalent. So, just like our 3 dice, a string of 3 DNA bases has a total number of possibilities or microstates of 4^3 x 2^3 = 8^3 = 512. Three DNA bases are called a codon, which codes for one amino acid in a protein; being ordered, it is a macrostate with only one microstate and an improbability of 512.
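
    The 512 figure can also be confirmed by brute-force enumeration; a minimal sketch treating each position as one of 8 base-plus-handedness symbols (the two-letter symbols below are just illustrative labels):

```python
from itertools import product

bases = "ATCG"
hands = "LR"                      # chirality of each base
symbols = [b + h for b in bases for h in hands]   # 8 distinguishable symbols per position

codons = list(product(symbols, repeat=3))
print(len(codons))                # 512 possible arrangements of a 3-base string

# A specific ordered codon, e.g. (AR, TR, GR), is exactly one of those 512,
# so as an ordered macrostate it has W = 1.
print(codons.count(("AR", "TR", "GR")))   # 1
```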

    The calculation of the improbability (negative entropy) of any protein is now straightforward, but some would ask: is it really improbable?

    Have a cracking day..

    Tuesday, March 15, 2011

    7.3 Evolution and Entropy

    Now the question is simply this: can evolutionary processes on earth afford to pay their entropy debts?

    I have shown (refer ch 6) that for a logically improbable state the negative entropy given by the Boltzmann-Gibbs equation must be accounted for by the performance of a sufficient number of trials to make that outcome possible. This is the real implication of that equation for the second law, where the total number of possibilities (logical phase space) is so large as to preclude any average increase in improbability, complexity or order (for the equivalence refer ch 5). So for a box of just 3 dice (a system), of the 216 possible outcomes only 6 are triples, 30 are pairs, but 180 are composed of three different numbers. The average behaviour is a mathematical certainty: most outcomes will be 3 different numbers. However we may also see that for the entropy cost of 36 shakes we may get an unlikely triple! That is, over many trials (random shakes) even 3 dice will obey the second law! Now consider 100 dice in the box: there are only 6 all-the-same-number outcomes out of 6^100 possibilities. With just 10^80 atoms in the known universe you would be hard pressed to stray far from the average of 100/6 of each number, let alone get 100 of any one number. The implication is clear: if any system with a very large number of logical possibilities (phase space) cannot do the necessary trials to make an improbable outcome possible (pay the entropy debt), we have an instance of a significant drop in entropy without the requisite increase in the surroundings, and that is a violation of the second law! Meaning, if you found the box with 100 sixes face up you would be correct in concluding it did not occur by any random process.
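
    The dice counts quoted here are easy to verify by brute force, and the 100-dice figure follows the same way; a short sketch:

```python
import math
from itertools import product

triples = pairs = all_different = 0
for roll in product(range(1, 7), repeat=3):      # all 6**3 = 216 outcomes for 3 dice
    distinct = len(set(roll))
    if distinct == 1:
        triples += 1
    elif distinct == 2:
        pairs += 1
    else:
        all_different += 1

print(triples, pairs, all_different)             # 6, 30, 180

# For 100 dice, only 6 of the 6**100 outcomes have the same number on every die.
print(f"Chance of 100 matching dice in one shake: about 1 in 10^{99 * math.log10(6):.0f}")
```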

    DNA is the complete specification for the growth, reproduction, defence and repair of each life form. It includes specific codes for all its proteins, which are the main functional molecules of the cell. Its instructions are read and acted on by a variety of complex machines which are themselves made of protein. The driver of evolutionary change is random mutation of the DNA at the moment of reproduction, facilitating inheritance of new traits. It is well known that natural processes may produce most of the amino acids, the building blocks of proteins; now we need to know the cost in Boltzmann entropy of these logical states. The DNA contains semantic information (refer ch 4) by virtue of the ordering of the amino acids it specifies, and as such it has a measurable Boltzmann absolute entropy (logical improbability).

    Have a specially nice day..

    Thursday, March 10, 2011

    7.2 Evolution & Entropy


    Some authoritative quotes..
    1. The Second Law is inviolable and inescapable..
      "The law that entropy always increases holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell's equations — then so much the worse for Maxwell's equations. If it is found to be contradicted by observation — well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation." — Sir Arthur Stanley Eddington, The Nature of the Physical World (1927)

    2. Life must obey the Second Law..
      Living organisms are biological machines which are also vast assemblies (or states) of matter, and as such the second law applies to them. The processes by which they function, reproduce and grow create order, or an increase in complexity. That means a local reduction in entropy, just like the cooling of a freezer box in the middle of a hot room. The second law requires the existence of an imaginary boundary around the process and enough of its surroundings to include everything affected by those processes. Within that boundary it will be found that there is an overall increase in entropy. Experimental measurements show that the boundary need only approximate an isolated system for the calculation to work as predicted.
    3. Evolution must obey the second law.. 
      Life processes like growth generate more disorder in their surroundings than the order they create, and the evolution of life itself does the same thing (paraphrased from Peter Atkins' The Second Law). Evolution is the explanation for the general increase in complexity of the biosphere, but an increase in complexity is also a local drop in entropy. By the second law the same process must, in a dependent manner, account for that drop by creating a greater increase in entropy in its surroundings. If a chosen boundary for strict application of the law does not meet the entropy-increase requirement it may simply be enlarged until it does, even up to the whole universe if necessary.

    If life does not come from somewhere else then the processes which created it here must also pay for it here. Even if the system boundary is extended to include the whole universe (i.e. the second law applies), it is invalid to balance negative entropy on earth with positive entropy from an unrelated process somewhere else. That means the second law must apply to the earth–sun system, so to argue that the second law does not apply because the earth is not an isolated system is invalid.

    Have a nice day..

    Monday, February 21, 2011

    7.1 EVOLUTION and ENTROPY


    The fundamental postulate of evolution by natural selection is design without a designer, which from my previous discussion about design means a natural origin for semantic (or signal) information, since every 'design' is specified by semantic information. Let's just for a moment ignore the intuitive improbability of such an occurrence and flesh out the 'algorithm' by which this process is supposed to work.

    Beginning with a hypothetical replicating molecule (about 600 atoms according to Prof Shapiro), it is expected there will be variations captured at the moment of duplication, somewhat equivalent to copying errors. In molecular terms some of those may affect the quality and/or speed of replication, and the natural outcome will be a general increase in the more successful replicators. In succeeding generations this process will steadily accumulate all those changes that happen to improve reproductive success. As populations of these increase in number they will tend to reduce the pure-chance failure of the occasional top dog or best replicator, like being eaten by some competing molecule. It is also logical that any defensive or offensive improvements which further reduce the chances of being 'eaten' also contribute to success, without necessarily speeding up or otherwise improving reproduction itself, and so become 'selectable' traits. It is then simply proposed that there is no barrier to the continuation of this process through any degree of complexity imaginable, and why not? It's simple, perfectly understandable and even elegant, like the very best discoveries. It is that last point which is the central issue about which this entire blog is written. Are there really no barriers to ascending complexity using this algorithm?

    The earth is where this incredibly improbable event took place over 4.5 billion years, according to evolution theory. The earth is not an isolated system as required by the second law. However, notwithstanding the hypothetical raining down of proteins from outer space (panspermia), by that law the processes that produced proteins here must also pay the entropy cost for them here.

    Panspermia was postulated by Fred Hoyle and Chandra Wickramasinghe as a consequence of their clear understanding that evolution alone could not account for all the diversity of life. See [http://www.panspermia.org/chandra.htm].

    Have a nice day..

    6.3 Entropy and Probability

    Now consider our three dice in a box in a child's playroom. The room is the child's universe but definitely not an isolated system as required by the second law; however, the probability outcomes for the box still serve to illustrate the thermodynamic probability term 'W' in the Boltzmann-Gibbs equation for absolute entropy. Note we are only interested in the entropy change associated with the process of shaking the dice. That change is the difference between the final and initial absolute entropies of the two logical states. Note also that the Boltzmann-Gibbs entropy depends only on the state of the system, not on how it got that way. For imperfect universes like our playroom, so long as the 'leaks' are random and small they will not affect the randomness of the process of shaking the dice. Non-random energy would be like the child opening the box and deliberately ordering the dice just to trick you, which is not allowed even in school, let alone in science.

    At an atomic level we observe that heat always travels from hot to cold by any mechanism available: conduction, convection and radiation. Atoms are very small particles indeed, and this is why their random jumbling behaviour appears smooth and steady to anything as large as a thermometer. We observe that atoms in close proximity share their vibrational (heat) energy with those having less. The energy spreads over more atoms and vastly increases the number of ways (microstates) available for it to occupy. So W increases and entropy increases, but the real driver of the second law is probability; the effect is to dissipate energy. The heat equation for entropy discovered by Rudolf Clausius simply extends the range of possibility for calculating entropy change using temperature, rather than having to count and identify atoms!
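
    One standard way to see this numerically is the textbook Einstein-solid counting exercise (a general physics sketch, not a calculation taken from this blog): for two small blocks of atoms sharing energy quanta, the split with by far the most microstates W is the evenly shared one, which is why heat appears to 'flow' from hot to cold.

```python
from math import comb

def multiplicity(n_atoms, quanta):
    """Number of ways (microstates) to distribute `quanta` among `n_atoms` oscillators."""
    return comb(quanta + n_atoms - 1, quanta)

N_A = N_B = 50          # two identical small blocks of atoms
TOTAL_Q = 100           # total energy quanta shared between them

best_split, best_W = None, 0
for q_a in range(TOTAL_Q + 1):
    W = multiplicity(N_A, q_a) * multiplicity(N_B, TOTAL_Q - q_a)
    if W > best_W:
        best_split, best_W = q_a, W

print(f"Most probable split: {best_split} / {TOTAL_Q - best_split} quanta")   # 50 / 50
all_in_A = multiplicity(N_A, TOTAL_Q) * multiplicity(N_B, 0)
print(f"Even split has ~{best_W / all_in_A:.1e} times more microstates than "
      "'all the heat in one block'")
```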

    Notice the tendency of heat to move from hot to cold does not require an 'isolated' system in order to proceed. If we put a source of heat on one end of a metal bar, that is a non-random input which will maintain an out-of-equilibrium condition, but at no point in the system will we observe heat moving of itself from cold to hot. Putting the heat source and bar in a large insulated box, we can measure the increase in entropy of the air surrounding the bar, accounting for the entropy debt represented by the non-uniform distribution of heat in the bar, i.e. the entropy cost of maintaining that improbable state.

    Have a specially good day..

    6.2 Entropy and Probability


    With just three dice in a box you will readily appreciate that at any time a less probable pattern may occur. It's not difficult to get {222} (W = 1 microstate). It's 1 out of 216 possibilities, so we may say that if we continue to shake the box randomly we will expect to see {222} every 216 shakes on average. That's the price in entropy required to pay for the improbable {222}. Indeed, if we were to count the shakes between successive occurrences of {222} and divide the total number of shakes by the number of occurrences, we would observe the result approach 216, its asymptote, to any degree of accuracy we like. So with just 3 particles we can see that the Boltzmann-Gibbs equation means an increase in improbability is a decrease in entropy that must be accounted for under the second law. Also, the second law only applies to a large number of particle events, not just particles, which is the basis of the probability that drives it.
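
    That long-run average is easy to see in a quick simulation; a minimal sketch:

```python
import random

random.seed(0)
TRIALS = 1_000_000                 # total random shakes of the box
occurrences = 0

for _ in range(TRIALS):
    roll = (random.randint(1, 6), random.randint(1, 6), random.randint(1, 6))
    if roll == (2, 2, 2):
        occurrences += 1

print(f"{occurrences} occurrences of (2,2,2) in {TRIALS:,} shakes")
print(f"Average shakes per occurrence: {TRIALS / occurrences:.0f}   (expected: 216)")
```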

    The second law has to be formally constrained to an isolated system of particles and to a sufficiently large number of particle events, so that it follows the predicted average behaviour. Under those circumstances the law states that the entropy (disorder) of the system must increase or at best remain constant (equilibrium). This predictability is about as near to an 'absolute' as you can get in science, and yet it is driven by nothing more than the normal probability distribution, with its central most probable outcome and its vast number of less probable outcomes reaching all the way to the practically impossible.

    So what is an 'isolated system'? It sounds simple enough: a system so well insulated and sealed that the inside is not affected by the outside. The only problem with this idea is that it is purely hypothetical; we may be able to approximate one, but in truth the only isolated system we know of is the entire universe! The ultimate meaning of the second law is that the entropy of the universe must increase with time.

    Have a particularly nice day..

    6.1 ENTROPY and PROBABILITY


    This section could also be given the mysterious title of “The Meaning of W”, i.e. the W from the Boltzmann-Gibbs equation [s = k log(W)]. While we're at it, let's also point out that k is a constant called the Boltzmann constant. It is, in a sense, arbitrary; however, by giving entropy (s) units of energy per temperature (joules per kelvin, J/K) it allows the derivation of the Clausius equation for entropy change due to heat transfer (a special case). Note however that W (the thermodynamic probability) is just a number; it has no units.

    Now let's play dice. Take 3 dice: any pattern you throw will be one of a total of 6^3 = 216; these are the microstates of the system. The probability of any specific pattern is 1/216; call it P3. It is essential that all microstates have the same probability. Now a macrostate is just a group of microstates with a common characteristic. We could say the microstates are 'natural', but we choose the macrostate, which is therefore an intelligent choice. There are more ways to get some macrostates than others because they contain more microstates. For example there is only one way of getting {666} (it does not matter which die is which), but look at {665}: you can also get {656} or {566}, three ways to do it, and we get those three ways by noting which die has the 5. Now take {315}, essentially three different numbers: there are also {351}, {135}, {153}, {531} and {513}, i.e. 6 ways of getting any such group. Our choice of macrostate meant we ignored the order in which the numbers come, but it required us to identify the dice. We could instead have chosen just the sum of the dots as a macrostate.

    The probability of {351}, written P(351), is 6 x 1/216 = 6P3, and P(665) = 3 x 1/216 = 3P3. So {666} has only 1 microstate, {665} has 3 and {135} has 6. The meaning of 'W' may now be stated as the number of microstates in any given macrostate. If we have a box containing the three dice starting with {666} and shake it, the next pattern is 3 times more likely to be {665} and 6 times more likely to be {135}, and it is 16 times more likely to show at least two 2s ({22n} for any n), etc. There is a powerful tendency for the arrangement to move towards a macrostate with a larger number of microstates because that is more probable. Hence W will tend to increase and so will entropy (s), up to the point where W is at its maximum (in this case 6) and tends to stay there, which is called equilibrium.
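
    All of these counts can be checked by listing the 216 microstates and grouping them into macrostates; a short sketch:

```python
from collections import Counter
from itertools import product

# Group the 216 microstates by macrostate (the multiset of faces, ignoring order).
W = Counter(tuple(sorted(roll)) for roll in product(range(1, 7), repeat=3))

print(W[(6, 6, 6)])     # 1 microstate
print(W[(5, 6, 6)])     # 3 microstates
print(W[(1, 3, 5)])     # 6 microstates

# Starting from {666}, how much more likely is the next shake to show at least two 2s?
two_twos = sum(1 for roll in product(range(1, 7), repeat=3) if roll.count(2) >= 2)
print(two_twos)         # 16
```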

    Have a really nice day..

    Thursday, February 17, 2011

    5.4 Complexity

    Boltzmann explained the link between entropy and the very powerful axioms of probability: an increase in entropy is an increase in probability (called thermodynamic probability), i.e. the chance of being in that state as opposed to all other possibilities. Hence, if '~' means 'directly related to':
    Entropy ~ Probability
    Now the opposite of probability is its inverse, 1/Probability = Improbability, so we may say that if a state of matter moves to a more improbable state this represents a drop in entropy.
    But entropy is also a measure of 'disorder' or loss in complexity so we may write..
    Complexity ~ - Entropy (negative entropy)
    And so by inference..
    Complexity ~ Improbability
    But the above implies that (-entropy) ~ (1/probability): negation on one side must correspond to taking the inverse on the other, which no simple proportionality can satisfy! There is, however, a mathematical function for which the negative of the function equals the function of the inverse: the logarithm, since -log(x) = log(1/x). Hence the equation for entropy [s = k log W], and for a change in entropy from s1 to s2 we have
    s2 - s1 = k log(W2) - k log(W1) = k log(W2/W1)
    So 's' is a measure of complexity, related to probability through the logarithm of the thermodynamic probability W. Hence the second law is expressed in some engineering texts as:
    “The second law of thermodynamics means that any system of particles, left to itself, will tend to move to its most probable state”.

    Tendency means it is the average outcome of a large number of events, usually referred to as a 'statistically significant' number. This is true for everything made of atoms! It includes chemical reactions as well as physical pressures, temperatures and even nuclear reactions. It encompasses all imaginable states of matter regardless of size, from electrons to elephants and giardia to galaxies.
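
    To put a number on such an entropy drop, here is a tiny sketch of the change formula s2 - s1 = k log(W2/W1) above, assuming natural logarithms and the SI value of the Boltzmann constant:

```python
import math

k = 1.380649e-23          # Boltzmann constant, J/K

def entropy(W):
    """Boltzmann entropy s = k log(W), using the natural logarithm."""
    return k * math.log(W)

# Dice-box illustration: going from the macrostate {1,3,5} (W = 6)
# to the ordered triple {6,6,6} (W = 1) is an entropy *drop*.
W1, W2 = 6, 1
delta_s = entropy(W2) - entropy(W1)       # = k * log(W2 / W1)
print(f"delta s = {delta_s:.3e} J/K  (negative: a decrease in entropy)")
```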

    So any proposition or theory that claims to result in an average increase in complexity, as evolution by natural selection does, can be tested by the above for compliance with the second law. If it fails that test it is falsified, regardless of the amount of data interpreted in its favour; that interpretation must be incorrect.

    Have a nice day..

    Friday, January 28, 2011

    5.3 Complexity

    So what is the connection between entropy and complexity? However you conceive of the term complexity, it intuitively includes concepts like 'special', 'rare' and 'not uniform', in comparison with the less complex being 'uninteresting', 'common' and 'uniform'. In the programme “Life”, narrated by David Attenborough, he frequently uses the words “remarkable”, “amazing” and “incredible” to describe unusually special features or characteristics of living creatures, suggesting their complexity. There is one area of mathematics which is all about such terms: probability.

    The probability of a rare event is measured by the ratio of the number of ways it can occur to the total number of possibilities, and the answer is always between 0 and 1. It is the chance of such an event occurring, based on the assumption that all possible events have the same chance of happening: 0 = impossible, 1 = certain. We talk of 'events', not objects, as in the occurrence of 3 aces in a poker hand. In probability-speak that occurrence implies both the act of shuffling (randomizing) and of fairly dealing the hand! If the deck were not shuffled or the deal were unfair there would be a bias toward certain combinations, which would significantly reduce the accuracy of the probability calculation. As the number of possibilities increases, so does the 'improbability' of a specific event occurring, i.e. the number of possibilities is a measure of improbability. Which logically suggests improbability as a measure of complexity!

    That statement is significant because from the work of Ludwig Boltzmann and Gibbs came an equation for entropy (s) which relates it directly to the logarithm of the number of possible states (called microstates, W) that a particular class or type of arrangement (called a macrostate) can have, i.e. to the inverse of probability, or improbability (by convention it is called the thermodynamic probability)! Max Planck had the equation [ s = k log W ] inscribed on Boltzmann's gravestone. Its true significance is sadly underestimated and not understood as it should be. This equation puts a numerical scale on the axis of complexity! It reveals the vital link between entropy, complexity and probability: the real answer to Mark Ridley's problem.

    Have a nice day..

    Thursday, January 20, 2011

    5.2 Complexity

    A couple of years ago I read Mark Ridley's Mendel's Demon, for which he received great accolades for explaining the rise in complexity of the biosphere. In the introduction he says the following:
    “Complexity is an ill-defined term, and I have been tempted to avoid it completely. I do not think it is meaningless to say that some forms of life are more complex than others, but I am as puzzled as anyone by what exactly I mean when I say it. There is no biologically agreed definition of complexity, but I suspect most people would agree on what should contribute to it. Structural complexity is one factor.”
    Bit of a problem, I would have thought, not knowing what it is you are trying to explain? He ends up choosing the number of genes in an organism as his measure of complexity, 'the number of the beast'! Conveniently avoiding the origin of the gene itself, and of life, in the bargain. Well, he is only a biologist.. if we want a definition which covers the whole spectrum of nature, both non-living and living, then our definition must work for both. It must be a measurable quantity which in theory may be calculated for any state of matter. Sounds like a real toughie.. but is it?

    Recall 'entropy' from 4.5 above.. we said it was a measure of 'disorder', of how close a signal is to a random jumble, from Shannon's 'inverse' information theory. Well, according to something Ludwig Boltzmann discovered (in the 1870s), this same measure also applies to any state of matter! Just what we are looking for. The term entropy was coined by Rudolf Clausius around 1865, based on the observation that heat energy always moves through matter from hot to cold. He did not 'prove' this but used it to establish a law of thermodynamics called the second law. Now laws are laws because they are inviolable.. the second law is known so well we can say with confidence it will never be violated.

    It says that in a completely sealed system containing both mass and energy, if they are not evenly distributed then after a period of time they will be. As the system becomes more 'disorderly' its entropy increases, and it cannot decrease by itself. This law is about as close as we will come to having an 'absolute' truth in science..

    Have a nice day..

    Saturday, January 15, 2011

    5.1 COMPLEXITY

    The thoughts preceding are pivotal to what comes next.. they became evident as I searched for the answer to a question posed by finding a small tincture bottle while walking. I guessed the bottle to be about 100 years old from its imperfections and the air bubble locked into the unevenly molded sides, but it was instantly recognizable as a man-made artifact, a design. I asked the question: how much information is contained in that bottle compared with an irregular blob of glass of the same size? I thought of what it would take to make a copy to within a certain tolerance (i.e. the minimum acceptable dimensional error), but the bubble was an error, a random, unspecified part of it. It soon became apparent it would take just as much information to copy the blob of glass as to copy the bottle! So that does not answer the question concerning the difference between the two.

    The information required to copy the bottle is the same as the information needed to create it in the first place (including the tolerance), but the blob of glass was not designed or specified beforehand; it just happened. The key lay in another type of descriptor.. complexity.

    We could relate complexity to the size or amount of material in an object.. call it 'numerical complexity', but given the same minimum tolerance (say 0.01 mm) the two are the same. We may talk about the shape and call that 'geometric complexity', but with its lack of symmetries the blob of glass would probably have a higher geometric complexity than the bottle; that is, it would take more information to describe its shape correctly to the specified tolerance than it would for the bottle. Then I realized the bottle was designed with a purpose in mind.. it had what may be called 'functional complexity', of which the blob of glass had absolutely none! So the bottle is in what may be called a 'functionally complex' state of matter. The only problem now was how to measure it.

    Have a nice day..

    Tuesday, January 11, 2011

    4.5 Information

    I know it's all a bit impersonal.. but it is important groundwork for what comes later. One last type of information is the kind that forms the major part of the teaching on the subject, based on the work of Claude Shannon, and it has nothing whatsoever to do with semantics or meaning. It is about the quantity of information able to be received, not the quality or meaning of the signal. In fact, when the content of a signal is already known prior to sending it, the Shannon definition implies the signal carries no information at all! There is however a connection between Shannon information and the term entropy, which is referred to as information entropy for obvious reasons.

    Let's just say that the entropy of a system is a measure of its disorder: the more disordered it is, the higher the entropy. One fairly intuitive but correct inference we may draw from this is that natural or accidental occurrences tend always to increase disorder. Whether it is the stuff on your desk, the compressed air in your bike tyres or just the water in a dam, it takes some effort to keep it where you put it. So a low-entropy state is an ordered state and a high-entropy state is a disordered state. Shannon's information entropy is a maximum for a signal whose content is completely random and unpredictable (disordered) and lowest for a completely known message (highly ordered).
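
    As a small sketch of that last point, the empirical Shannon entropy (in bits per symbol) of a fully predictable string versus a random jumble can be computed directly:

```python
import math
import random
import string
from collections import Counter

def shannon_entropy(text):
    """Shannon entropy in bits per symbol, from the symbol frequencies in `text`."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
known = "A" * 800                                                            # fully predictable signal
noise = "".join(random.choice(string.ascii_uppercase) for _ in range(800))  # random jumble

print(f"Predictable signal: {shannon_entropy(known):.2f} bits/symbol")   # 0.00
print(f"Random signal:      {shannon_entropy(noise):.2f} bits/symbol")   # close to log2(26) = 4.70
```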

    We will come back to entropy later, but for now I just want to make a clear distinction between these very different meanings of the term information. One fairly clear conclusion from the above, however, is that over time accumulated transmission, encoding or copying errors will increase the entropy (disorder) of any semantic signal. Which means that as we move back in time the accuracy of any semantic signal must improve.

    Have a nice day..

    4.4 Information

    My earlier conclusion that the origin of all natural semantic information, i.e. DNA, must lie outside the universe is not of itself a 'proof', since it does not deal with the probabilities, however small, of the accidental assembly of such a system subject to some preferential selection. We'll come to that later. But it does present a very considerable hurdle which such an idea must answer. There are other types of information, or meanings of the word in use, which now need an airing.

    Physicists talk of 'cause and effect' as the mechanism for the preservation of information in the universe. It is the same meaning as in the answer to the question: what actually travelled via the Olympic torch relay from Athens to Beijing? Assuming the flame did not go out, it was the ignition temperature of the gas used. The whole reason for the assumed hyper-inflation following the big bang is the need to communicate temperature evenly throughout the universe and end up with what we see today. Let's call this 'event history' information, or Ieh, as opposed to semantic or signal information, Is. It is what makes predictability possible concerning the interaction of matter and energy according to the known laws of physics, and hence the verification and validity of those laws.

    Notwithstanding the odd black hole, we may observe that the state of the entire universe at any future point in time records its entire history. And if you have heard of the 'butterfly effect' you may understand that the record is complete down to the motion of a single sub-atomic particle at its beginning! This, I think, was at least partly the reason for Albert Einstein's belief that the whole thing was pre-determined. On a personal level the universe records every thought and motion you have ever made.. which might be a problem if there exists a mind out there big enough to read it!

    Have a nice day..

    Monday, January 10, 2011

    4.3 Information

    I have covered the five prerequisites for the existence of semantic information and the basis of its intimate relationship with 'design'. The position gets much more interwoven and complex as we look deeper. Firstly, all such information is stored in matter; even just a thought in your mind implies storage in the neural circuitry of your brain. So any man-made object is a design specified by semantic information, even if that information is never stored anywhere but in the designer's brain. For complex items we create drawings, sketches, computer data, etc., but note how the information is always separate from the object it specifies; it is always one step removed. These observations lead us to the following conclusions:
    1. Since information is stored in matter it is subject to the laws affecting matter
    2. To originate or create a design one must first create the semantic information
    3. Since semantic information always has a purpose, it follows that every design also has a purpose, which is true by observation.

    Now comes the real fun.. the thing is recursively related to its own prerequisites! That is, at least two of the prerequisites for semantic information require a 'machine', i.e. to write it and to read it. But a machine is a design, which must be specified by semantic information!

    We conclude that semantic information cannot exist without prior semantic information! Which taken to its logical conclusion means if we observe any semantic information in the universe which is not the creation of man (notwithstanding the capabilities of other creatures) then its origin must be outside the physical universe.

    A new principle now emerges.. I say without proof, that any entity capable of creating semantic information is limited by the capacity of its own information. In other words we may never create a computing machine with greater creativity than our own mind, which by the way we did not create.

    Have a nice day..

    4.2 Information


    You may be surprised to know that the subject of 'semantic information' receives only shallow and vague treatment in present academia. True to the reductionist approach, all the bits are there, widely spread among many sub-topics, but where is the 'whole'? Have a look on Wikipedia: it's just a sub-heading under semiotics! But this is the 'information' age.. semantic information is implied because communication is not possible without it, but what is it really? This is actually indicative of a much deeper problem which begins with this subject: the tip of an iceberg!

    For example, the Wiki makes much of vague statements like "information" as "a difference that makes a difference", or "So a generalized definition of the concept should be: 'Information' = An answer to a specific question", and "Complex definitions of both 'information' and 'knowledge' make such semantic and logical analysis difficult, but the condition of 'transformation' is an important point in the study of information as it relates to knowledge." What on earth are they on about? This is a smokescreen, to hide a deep truth someone doesn't want anyone to know! It should not come as a surprise that we are not all reductionists! I am entitled to ask what is observationally true here.

    Semantic information has five essential prerequisites to its existence:
    1. A pure alphabet (or encoding)
    2. Grammatical structure (syntax or language)
    3. A method of writing it (sending machinery)
    4. A method of reading it (receiving machinery)
    5. Purpose

    All design is based on it; it is always specified, and it always precedes the object of its creation, of which it is the symbolic representation.. did you get all that? It's really important.. We are entitled to ask why this is missing from the topics of information and design in Wikipedia.

    Have a nice day..

    Saturday, January 1, 2011

    4.1 INFORMATION

    It's the new year and I must get moving on to deeper things! We are all aware of the passage of time.. the perception is entirely personal and different for each of us, even in the same place. I mention this here because one sobering truth is that we are all running out of time. Our life is literally ebbing away, and so we become somewhat morbidly, if not glibly, aware of our 'purpose'. What am I supposed to achieve in the time given to me personally, if anything? Well, if I might be so bold as to suggest it: whatever you feel that 'something' may be, you will want it to be based on something which is true, not something which is false. Only the deceived and deluded end up settling for less by default.

    According to the modern 'reductionists' like Richard Dawkins, who claim they can explain all there is to know about big things from analysis of their small parts, energy and matter are the only observable fundamental entities in the universe. They are ignorant of some basic truths. Ignorant because they choose to ignore the published work of the mathematician Kurt Gödel (1931), who essentially proved that "the whole is more than the sum of the parts". Only mathematics, you may say.. sorry, but the maths always underpins the physics; it is just the schematic form of it.

    Signal information is referred to as 'semantic' information, meaning it has meaning. This has many deep and very powerful implications.. but let's just put it like this: if the reductionists are right then..

    Matter + Energy = Information (semantic)

    And let me say this.. every attempt to demonstrate, simulate, model or approximate the above proposition has failed. Such attempts will always fail because of a most basic and well-known truth, one they choose to ignore and even hide but which I must reveal.. this is my purpose.

    Have a really nice day.