Tuesday, January 11, 2011

4.5 Information

I know it's all a bit impersonal.. but it is important groundwork for what comes later. One last type of information forms the major part of the teaching on the subject, based on the work of Claude Shannon, and it has nothing whatsoever to do with semantics or meaning. It's about the quantity of information able to be received, not the quality or meaning of the signal. In fact, when the meaning of a signal is clearly known prior to sending it, the Shannon definition implies the signal contains no information at all! There is, however, a connection between Shannon information and the term entropy, which is referred to as information entropy for obvious reasons.
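To make that concrete, here is a minimal Python sketch of my own (an illustration, not anything taken from Shannon's papers). Shannon's measure works out to H = -Σ p·log2(p) bits per symbol, so an outcome known with certainty (probability 1) really does carry zero information, while a fair coin flip carries exactly one bit:

```python
import math

def entropy_bits(probs):
    # Shannon entropy H = -sum(p * log2(p)), in bits per symbol
    return sum(-p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([1.0]))       # message known in advance: 0.0 bits
print(entropy_bits([0.5, 0.5]))  # fair coin flip: 1.0 bit
```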

Let's just say that the entropy of a system is a measure of its disorder. The more disordered it is, the higher the entropy. One fairly intuitive and correct inference we may draw from this is that natural or accidental occurrences tend always to increase disorder. Whether it is the stuff on your desk, the compressed air in your bike tyres or just the water in a dam, it takes some effort to keep it where you put it. So a low entropy state is an ordered state and a high entropy state is a disordered state. Shannon's information entropy is a maximum for a signal when the content is completely random and unpredictable (disordered) and lowest for a clearly known message (highly ordered), as the sketch below shows.
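You can see this directly by measuring the entropy of actual strings from their observed character frequencies. Again this is just my own illustrative sketch; a random string of lowercase letters lands near the maximum of log2(26) ≈ 4.70 bits per character, while a perfectly predictable one scores zero:

```python
from collections import Counter
import math
import random
import string

def string_entropy(s):
    # empirical entropy in bits per character, from observed frequencies
    counts = Counter(s)
    n = len(s)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

random_msg = ''.join(random.choice(string.ascii_lowercase) for _ in range(100000))
ordered_msg = 'a' * 100000

print(string_entropy(random_msg))   # near log2(26) ~ 4.70 bits: maximal disorder
print(string_entropy(ordered_msg))  # 0.0 bits: completely predictable
```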

We will come back to entropy later, but for now I just want to make a clear distinction between these very different meanings of the term information. One fairly clear conclusion from the above, however, is that over time accumulated transmission, encoding or copying errors will increase the entropy (disorder) of any semantic signal. This means that as we move back in time the accuracy of any semantic signal must improve.
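A quick simulation makes the error-accumulation point vivid. The 1% per-character error rate here is an arbitrary assumption of mine, purely for illustration, but the direction of the result does not depend on it: repeated copying only ever piles corruption on top of corruption:

```python
import random
import string

def noisy_copy(msg, error_rate=0.01):
    # copy a message, corrupting each character with probability error_rate
    return ''.join(random.choice(string.ascii_lowercase)
                   if random.random() < error_rate else ch
                   for ch in msg)

original = 'the quick brown fox jumps over the lazy dog ' * 100
copy = original
for generation in range(100):
    copy = noisy_copy(copy)

errors = sum(a != b for a, b in zip(original, copy))
print(f"{errors / len(original):.0%} of characters differ after 100 copies")
```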

Have a nice day..
