Friday, January 28, 2011

5.3 Complexity

So what's the connection between entropy and complexity? Well, however you conceive of the term complexity, it intuitively includes concepts like 'special', 'rare' and 'not uniform', while the less complex is 'uninteresting', 'common' and 'uniform'. In the program “Life”, narrated by David Attenborough, he frequently uses the words “remarkable”, “amazing” and “incredible” to describe unusually special features or characteristics of living creatures, suggesting their complexity. There is one area of mathematics which is all about such terms.. probability.

The probability of a rare event is measured by the number of ways it can occur divided by the total number of possibilities, and the answer is always between 0 and 1. It's the chance of such an event occurring, based on the assumption that all possible outcomes have the same chance of happening: 0 = impossible, 1 = certain. We talk of 'events' not objects, such as the occurrence of 3 aces in a poker hand. In probability speak the occurrence implies both the acts of shuffling (randomizing) and fairly dealing the hand! If the deck were not shuffled or the deal unfair there would be a bias toward certain combinations, which would significantly reduce the accuracy of the probability calculation. As the number of possibilities increases, so does the 'improbability' of the event occurring, i.e. the number of possibilities is a measure of improbability. Which logically suggests improbability as a measure of complexity!
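As a small worked illustration (my own sketch, assuming a standard 52-card deck and a 5-card hand), the chance of being dealt exactly three aces can be counted directly, ways-it-can-happen divided by all possible hands:

```python
from math import comb

# Exactly 3 aces in a 5-card hand from a fair, shuffled 52-card deck:
# choose 3 of the 4 aces, and 2 of the remaining 48 non-aces,
# out of all possible 5-card hands.
favourable = comb(4, 3) * comb(48, 2)
total = comb(52, 5)
probability = favourable / total

print(f"favourable hands : {favourable}")       # 4512
print(f"total hands      : {total}")            # 2598960
print(f"probability      : {probability:.6f}")  # about 0.001736, roughly 1 in 576
```

Note how the improbability grows simply because the count of total possibilities is so large compared with the favourable ones.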

That statement is significant because from the work of Ludwig Boltzmann and Gibbs came an equation for entropy (S) which relates it directly to the logarithm of the number of possible states (called microstates, W) that a particular class or type of arrangement (called a macrostate) can have, i.e. the inverse of probability, or improbability (by convention it is called the thermodynamic probability)! Max Planck had the equation [ S = k log W ] inscribed on Boltzmann's gravestone. Its true significance is sadly underestimated and not understood as it should be. This equation puts a numerical scale on the axis of complexity! It reveals the vital link between entropy, complexity and probability. The real answer to Mark Ridley's problem.
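To make the S = k log W idea concrete, here is a minimal sketch of my own using a toy system of 100 coins: the 'ordered' macrostate (all heads) has only one microstate, while the 'mixed' macrostate (half heads) can be realised in an enormous number of ways, so its entropy is far higher.

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    """S = k log W : entropy of a macrostate that can be realised in W microstates."""
    return k_B * log(W)

# Toy system: 100 coins. "All heads" has exactly one arrangement (W = 1);
# "50 heads, 50 tails" can be realised in C(100, 50) different ways.
W_ordered = 1
W_mixed = comb(100, 50)

print(f"W (all heads)  = {W_ordered}")
print(f"W (half heads) = {W_mixed:.3e}")
print(f"S (all heads)  = {boltzmann_entropy(W_ordered):.3e} J/K")  # zero, since log 1 = 0
print(f"S (half heads) = {boltzmann_entropy(W_mixed):.3e} J/K")
```

The rare, special macrostate sits at the low end of the scale; the common, uniform one sits at the high end.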

Have a nice day..

Thursday, January 20, 2011

5.2 Complexity

A couple of years ago I read Mark Ridley's Mendel's Demon, for which he received great accolades for explaining the rise of complexity in the biosphere. In the introduction he says the following..
“Complexity is an ill-defined term, and I have been tempted to avoid it completely. I do not think it is meaningless to say that some forms of life are more complex than others, but I am as puzzled as anyone by what exactly I mean when I say it. There is no biologically agreed definition of complexity, but I suspect most people would agree on what should contribute to it. Structural complexity is one factor.”
Bit of a problem I would have thought, not knowing what it is you are trying to explain? He ends up choosing the number of genes in an organism as his measure of complexity, 'the number of the beast'! Conveniently avoiding the origin of the gene itself, and of life in the bargain. Well, he is only a biologist.. if we want a definition which covers the whole spectrum of nature, both non-living and living, then our definition must work in both. It must be a measurable quantity which in theory may be calculated for any state of matter. Sounds like a real toughie.. but is it?

Recall 'entropy' from 4.5 above.. we said it was a measure of 'disorder', or how close a signal is to a random jumble, from Shannon's 'inverse' information theory. Well, according to something Ludwig Boltzmann discovered (in the 1870s), this same measure also applies to any state of matter! Just what we are looking for. The term entropy was first coined by Rudolf Clausius around 1865, based on the observation that heat energy always moves by conduction through matter from hot to cold. He does not 'prove' this but uses it to establish a law of thermodynamics called the Second Law. Now laws are laws because they are inviolable.. it's known so well we can say with confidence it will never be violated.

It says that in a completely sealed (isolated) system containing both mass and energy, if they are not evenly distributed then after a period of time they will be. As the system becomes more 'disorderly' its entropy increases, and it cannot by itself decrease. This law is about as close as we will come to having an 'absolute' truth in science..
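A minimal toy model of my own devising shows the idea: put all the molecules on one side of a sealed two-compartment box and let them hop about at random. The entropy (log of the number of microstates for the current left/right split) climbs toward its maximum and then stays there; it never drifts back to the ordered start on its own.

```python
import random
from math import comb, log

# Toy sketch: N labelled gas molecules in a sealed two-compartment box.
# Each step, one randomly chosen molecule hops to the other side.
# Entropy of the macrostate "n molecules on the left" is log W,
# with W = C(N, n) the number of microstates realising that split.
random.seed(1)
N = 100
n_left = N            # start fully ordered: everything on the left

def entropy(n):
    return log(comb(N, n))

for step in range(2001):
    if step % 500 == 0:
        print(f"step {step:5d}  left = {n_left:3d}  entropy = {entropy(n_left):6.2f}")
    molecule_is_on_left = random.random() < n_left / N
    n_left += -1 if molecule_is_on_left else 1
```

Run it and the 'left' count settles around 50 while the entropy rises from 0 toward its maximum, which is the Second Law in miniature.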

Have a nice day..

Saturday, January 15, 2011

5.1 COMPLEXITY

The thoughts preceding are pivotal to what comes next.. they became evident as I searched for the answer to a question posed by finding a small tincture bottle while walking. I guessed the bottle would be about 100 years old from its imperfections and the air bubble locked into the unevenly molded sides, but it was instantly recognizable as a man-made artifact, a design. I asked the question: how much information is contained in that bottle compared with an irregular blob of glass the same size? I thought of what it would take to make a copy to within a certain tolerance (i.e. the minimum dimensional error acceptable), but the bubble was an error, a random, unspecified part of it. It soon became apparent it would take just as much information to copy the blob of glass as it would to copy the bottle! So that does not answer the question concerning the difference between the two.

The information required to copy the bottle is the same as the information to create it in the first place (including the tolerance), but the blob of glass was not designed or specified beforehand; it just happened. The key lay in another type of descriptor.. complexity.

We could relate complexity to the size or amount of material in an object.. call it 'numerical complexity', but given the same minimum tolerance (say 0.01 mm) the two are the same. We may talk about the shape and call that 'geometric complexity', but with the lack of symmetries on the blob of glass it would probably have a higher geometric complexity than the bottle. That is, it would take more information to correctly describe its shape to the specified tolerance than it would for the bottle. Then I realized the bottle was designed with a purpose in mind.. it had what may be called 'functional complexity', of which the blob of glass had absolutely none! So the bottle is in what may be called a 'functionally complex' state of matter. The only problem now was how to measure it?
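A rough sketch of the 'geometric complexity' point, using made-up data of my own: stand in a byte string for each description. A regular, symmetric 'bottle wall' compresses down to almost nothing, while a random 'blob' barely compresses at all, i.e. reproducing it takes nearly as much information as the thing itself.

```python
import random
import zlib

# Compressed size stands in for the amount of information needed to
# reproduce each "shape description" to a fixed tolerance.
random.seed(0)

regular = bytes([40]) * 10_000                                    # perfectly uniform description
irregular = bytes(random.randrange(256) for _ in range(10_000))   # random, asymmetric description

for name, data in [("regular", regular), ("irregular", irregular)]:
    compressed = len(zlib.compress(data, 9))
    print(f"{name:9s}: raw {len(data)} bytes -> compressed {compressed} bytes")
```

So on the geometric measure the blob wins; it is only the functional measure that singles the bottle out, which is the whole point.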

Have a nice day..

Tuesday, January 11, 2011

4.5 Information

I know it's all a bit impersonal.. but it is important groundwork for what comes later. One last type of information is what forms the major part of the teaching on the subject, based on the work of Claude Shannon, and it has nothing whatsoever to do with semantics or meaning. It's about the quantity of information able to be received, not the quality or meaning of the signal. In fact, when the meaning of a signal is clearly known prior to sending it, the Shannon definition implies the signal contains no information at all! There is however a connection between Shannon information and the term entropy, which is referred to as information entropy for obvious reasons.

Let's just say that the entropy of a system is a measure of its disorder. The more disordered it is, the higher the entropy. One fairly intuitive but correct inference we may draw from this is that natural or accidental occurrences tend always to increase disorder. Whether it is the stuff on your desk, the compressed air in your bike tyres or just the water in a dam, it takes some effort to keep it where you put it. So a low entropy state is an ordered state and a high entropy state is a disordered state. Shannon's information entropy is a maximum for a signal when the content is completely random and unpredictable (disordered) and lowest for a clearly known message (highly ordered).
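Here is a minimal sketch of Shannon's formula, H = -Σ p log₂ p, showing the two extremes just described: a message known in advance carries zero bits per symbol, while a completely random signal over 26 equally likely letters carries the maximum.

```python
from math import log2

def shannon_entropy(probabilities):
    """H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A signal whose content is known in advance: one symbol with probability 1.
known_message = [1.0]

# A completely random, unpredictable signal over 26 equally likely letters.
random_signal = [1 / 26] * 26

print(f"known message : {shannon_entropy(known_message):.3f} bits/symbol")  # 0.000
print(f"random signal : {shannon_entropy(random_signal):.3f} bits/symbol")  # about 4.700
```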

We will come back to entropy later, but for now I just want to make a clear distinction between these very different meanings of the term information. One fairly clear conclusion from the above, however, is that over time accumulated transmission, encoding or copying errors will increase the entropy (disorder) of any semantic signal. Which means that as we move back in time the accuracy of any semantic signal must improve.
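A toy illustration of my own of that copying-error point: recopy a message generation after generation, with a small chance of error per character on each copy, and watch the letter-frequency entropy drift upward as the original order is eroded.

```python
import random
import string
from collections import Counter
from math import log2

random.seed(2)
ALPHABET = string.ascii_uppercase
ERROR_RATE = 0.02   # assumed per-character error chance on each copy

def entropy_of(text):
    """Shannon entropy of the letter frequencies, in bits per character."""
    counts = Counter(text)
    total = len(text)
    return -sum(c / total * log2(c / total) for c in counts.values())

message = "TOBEORNOTTOBETHATISTHEQUESTION" * 20
for generation in range(0, 201, 50):
    print(f"generation {generation:3d}: entropy = {entropy_of(message):.3f} bits/char")
    for _ in range(50):
        message = "".join(
            random.choice(ALPHABET) if random.random() < ERROR_RATE else ch
            for ch in message
        )
```

The entropy climbs toward the random-letter maximum (about 4.7 bits) and never climbs back down by itself, which is the one-way street described above.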

Have a nice day..

4.4 Information

My former conclusion that the origin of all natural semantic information, or DNA, must come from outside the universe is not of itself a 'proof', since it does not deal with the probabilities, however small, of accidental assembly of such a system subject to some preferential selection. We'll come to that later. But it does present the basis of a very considerable hurdle to such an idea, one which must be answered by that idea. There are other types of information, or meanings of the word in use, which now need an airing.

Physicists talk of 'cause and effect' as the mechanism for the preservation of information in the universe. It is the same meaning as the answer to the question: what actually travelled via the Olympic Torch relay from Athens to Beijing? Assuming the flame did not go out, it was the ignition temperature of the gas used. The whole reason for the assumed hyper-inflation following the big bang is the need to communicate temperature evenly throughout and end up with what we see today. Let's call this 'event history' information, or Ieh, as opposed to semantic or signal information, Is. It's what makes predictability possible concerning the interaction of matter and energy according to the known laws of physics, and the verification and hence validity of those laws.

Notwithstanding the odd black hole, we may observe that the state of the entire universe at any future point in time records the entire history of it. And if you have heard of the 'butterfly effect' you may understand that the record is complete down to the motion of a single sub-atomic particle at its beginning! This I think was at least partly the reason for Albert Einstein's belief that the whole thing was pre-determined. On a personal level, the universe records every thought and motion you have ever made.. which might be a problem if there exists a mind out there big enough to read it!

Have a nice day..

Monday, January 10, 2011

4.3 Information

I covered the five prerequisites to the existence of semantic information and the basis of its intimate relationship with 'design'. The position gets much more interwoven and complex as we look deeper. Firstly, all such information is stored in matter; even just a thought in your mind implies storage in the neural circuitry of your brain. So any man-made object is a design specified by semantic information, even if that information is never stored anywhere else but in the designer's brain. For complex items we create drawings, sketches, computer data etc., however note how the information is always separate from the object it specifies; it is always one step removed. These observations lead us to the following conclusions:
1. Since information is stored in matter it is subject to the laws affecting matter.
2. To originate or create a design one must first create the semantic information.
3. Since semantic information always has a purpose, it follows that every design also has a purpose, which is true by observation.

Now comes the real fun.. the thing is recursively related to its own prerequisites! That is, at least two of the prerequisites for semantic information require a 'machine', i.e. to write and read it. But a machine is a design which must itself be specified by semantic information!

We conclude that semantic information cannot exist without prior semantic information! Which, taken to its logical conclusion, means that if we observe any semantic information in the universe which is not the creation of man (notwithstanding the capabilities of other creatures) then its origin must be outside the physical universe.

A new principle now emerges.. I say without proof, that any entity capable of creating semantic information is limited by the capacity of its own information. In other words we may never create a computing machine with greater creativity than our own mind, which by the way we did not create.

Have a nice day..

4.2 Information


You may be surprised to know that the subject of 'semantic information' is shallow and vague in its treatment by present academia. True to the reductionist approach all the bits are there, widely spread among many sub-topics, but where is the 'whole'? Have a look on Wikipedia: it's just a sub-heading under semiotics! But this is the 'information' age.. the semantic kind is implied, since communication is not possible without it, but what is it really? This is actually indicative of a much deeper problem which begins with this subject, the tip of an iceberg!

For example the Wiki makes much of vague statements like "information" as "a difference that makes a difference".. or "So a generalized definition of the concept should be: 'Information' = An answer to a specific question".. and "Complex definitions of both 'information' and 'knowledge' make such semantic and logical analysis difficult, but the condition of 'transformation' is an important point in the study of information as it relates to knowledge." What on earth are they on about? This is a smoke screen, to hide a deep truth someone doesn't want anyone to know! It should not come as a surprise that we are not all reductionists! I am entitled to ask: what is observationally true here?

Semantic information has five essential prerequisites to its existence:
  1. A pure alphabet (or encoding)
  2. Grammatical structure (syntax or language)
  3. A method of writing it (sending machinery)
  4. A method of reading it (receiving machinery)
  5. Purpose
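To make the list concrete, here is a toy sketch of my own (a hypothetical lamp-switching scheme, not drawn from any real protocol) showing all five prerequisites in miniature:

```python
# Toy illustration of the five prerequisites applied to a trivial messaging scheme.

ALPHABET = {"0", "1"}                                # 1. a pure alphabet
CODEBOOK = {"LIGHT ON": "01", "LIGHT OFF": "10"}     # 2. agreed grammar / encoding

def write(meaning):                                  # 3. sending machinery
    return CODEBOOK[meaning]

def read(signal):                                    # 4. receiving machinery
    assert set(signal) <= ALPHABET, "symbol outside the alphabet"
    reverse = {code: meaning for meaning, code in CODEBOOK.items()}
    return reverse[signal]

def act(meaning):                                    # 5. purpose: the message does something
    print(f"switching lamp: {meaning}")

act(read(write("LIGHT ON")))                         # -> switching lamp: LIGHT ON
```

Remove any one of the five and the signal either cannot exist, cannot be understood, or does nothing at all.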

All design is based on it; it is always specified and always precedes the object of its creation, of which it is the symbolic representation.. did you get all that? It's really important.. We are entitled to question why this is missing from the topics of information and design in the Wikipedia.

Have a nice day..

Saturday, January 1, 2011

4.1 INFORMATION

It's the new year and I must get moving on to deeper things! We are all aware of the passage of time.. the perception is entirely personal and different for each of us, even in the same place. I mention this here because one sobering truth is that we are all running out of time. Our life is literally ebbing away, and so we become somewhat morbidly, if not glibly, aware of our 'purpose'. What am I supposed to achieve in the time given to me personally, if anything? Well, if I might be so bold as to suggest, whatever you feel that 'something' may be, you will want it to be based on something which is true, not something which is false. Only the deceived and deluded by default end up settling for less.

According to the modern 'reductionists' like Richard Dawkins, who claim they can explain all there is to know about big things from analysis of their small parts, energy and matter are the only observable fundamental entities in the universe. They are ignorant of some basic truths. Ignorant because they choose to ignore the published work of the mathematician Kurt Gödel (1931), who essentially proved that "the whole is more than the sum of the parts". Only mathematics, you may say.. sorry, but the maths always underpins the physics; it is just the schematic form of it.

Signal information is referred to as 'semantic' information, meaning it carries meaning. This has many deep and very powerful implications.. but let's just put it like this.. if the reductionists are right then..

Matter + Energy = Information (semantic)

And let me say this.. Every attempt to demonstrate, simulate, model or approximate the above proposition has failed. They will always fail because of a most basic and well known truth. One they choose to ignore and even hide but which I must reveal.. this is my purpose.

Have a really nice day.