Tuesday, 22 July 2014

1.4 The Problem

Well, after all that effort on the Wiki.. I have now written 18,458 words in contributions to the 'Talk: Entropy' page, all about changing ONE WORD in the entropy article.. simply because it is being used in a manner contrary to its dictionary definition! That ought to tell you something.

In that context I have raised another word which is likewise kept in a seriously darkened state as to its real meaning..    COMPLEXITY

You see, it is argued that 'evolution by natural selection' is the 'mechanism' by which complexity increases in nature (ref. the Bill Nye debate with Ken Ham). But what does that mean if you don't define complexity?

The truth is they don't want to define it in any rigorous way, because doing so would present such a huge problem for any proposal that it can evolve by a natural process. Go ahead, look up "define complexity".. you will see what I mean.

There is another word before I leave this introduction and it is..

                            INFORMATION

If you look this up you will be directed to a theory by Claude Shannon analyzing data transfer over a noisy channel, and all I need to say about that is that you will be misled, because Shannon was not interested in the significance or content of the signal, only in how the limits of error tolerance affect transmission. So in his terminology a message with the highest level of 'information' would be a random stream of characters, i.e. one with no expectation of the next character, the way we expect the next letter in a word. Here's the quote from 'The Mathematical Theory of Communication' by Claude Shannon: "meaning.. is irrelevant to the engineering problem". The best reference I can give you is the following..

[http://schneider.ncifcrf.gov/information.is.not.uncertainty.html]

This is the exact inverse of what the thermodynamic term entropy means for semantic information. Shannon's work is important, but it is unfortunate that he chose to use the word 'entropy' for both information and uncertainty.. As a result the literature abounds with "information = entropy and entropy = uncertainty".. The resolution is simple: Shannon information is NOT the semantic information (information with meaning) that I am referring to.
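The point about Shannon's measure can be made concrete with a minimal sketch in Python. It computes the empirical per-character entropy of two strings: a repetitive English phrase (where the next character is fairly predictable) and a random stream over the same alphabet (where it is not). The phrases, alphabet, and function name here are my own illustrative choices, not anything from Shannon's paper.

```python
import math
import random
import string
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Empirical Shannon entropy of a string, in bits per character."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Repetitive English: the next character is often predictable.
english = "to be or not to be that is the question " * 10

# Random stream over the same alphabet: no expectation of the next character.
random.seed(0)
alphabet = string.ascii_lowercase + " "
noise = "".join(random.choice(alphabet) for _ in range(len(english)))

print(f"English text: {shannon_entropy(english):.2f} bits/char")
print(f"Random noise: {shannon_entropy(noise):.2f} bits/char")
```

Run this and the random stream scores higher than the English text, which is exactly the author's point: Shannon's number measures unpredictability of symbols, and says nothing about whether the message means anything.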


So these three words are at the root of the problem..

           ENTROPY    -    COMPLEXITY    -    INFORMATION

(Which causes problems with another word..     DESIGN )

This is the 'no man's land' between two apparently very unequally matched sides..


We all have a list.. 'truths we don't want to know'. My advice: keep it as small as possible.

Have a very good day