There is too much to cover for the time I can give to this! I am aware a few of you really want to understand.. truth should be like that.. available to everyone.. hence my Wikipedia involvement. I truly and deeply despise ignorance, particularly when it's deliberate. That's a justice issue, and I honestly believe I have a responsibility to tell what I know.. and yes, to invite anyone to tell me I'm wrong, or away with the fairies, etc.
There is a much 'bigger picture' to all this.. WHY am I discovering material that should already be there in our education system, open for discussion and question without cover-up or fear?
Well, we don't actually have "truth".. the best we can do is what I call "verified knowledge". There is an underlying truth to knowledge; it's just that we can never say we actually have it. However, truth can and would be known to an infinite mind.. and if that mind chose to reveal it.. please just hold that thought.
I am now reading James Gleick's "The Information". He is a very good writer and does his homework very thoroughly. BUT.. how does he end up concluding (I haven't finished the book yet, but this was given in a radio interview with the author) that
INFORMATION (now) = UNCERTAINTY
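For what it's worth, the equation Gleick is reporting is Shannon's: the information produced by a source is measured by its entropy, i.e. the receiver's uncertainty before the message arrives. A minimal sketch of that identification (my own illustration, not from the book):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: each toss carries a full bit.
print(shannon_entropy([0.5, 0.5]))    # 1.0

# A heavily biased coin is nearly predictable: little uncertainty,
# so, in Shannon's sense, little information per outcome.
print(shannon_entropy([0.99, 0.01]))  # ~0.08
```

A certain outcome (probability 1) carries zero information on this measure, which is exactly the sense in which information equals uncertainty.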
Whatever he means by 'information', it's not 'semantic information'.. but surely he knows that. A couple of quotes:
p. 44 (in regard to the deciphering of cuneiform mathematical tablets):
"They were like maps of a mysterious city. This was the key to deciphering them, finally; the ordered chaos that seems to guarantee the presence of meaning."
'Ordered chaos' is an oxymoron.. 'ordered' means, by definition, not chaotic..?
Here's a hint of the problem for the naturalist..
p. 32: "The paleographer has a unique bootstrap problem. It is only writing that makes its own history possible."
Pointing to my much earlier conclusion: semantic information is recursive.
next time.
Sunday, 31 August 2014
Saturday, 30 August 2014
1.5 The Problem
Get this.. Here is how the Wikipedia editor in contention with me defines the second law..
"One may consider an initial set {i} of several thermodynamic systems, each in its own state of internal thermodynamic equilibrium with entropy {Si}. There may then occur a thermodynamic operation by which the walls between those systems are changed in permeability or otherwise altered, so that there results a new and final set {f} of physical systems, at first not in thermodynamic equilibrium. Eventually they will settle into their own states of internal thermodynamic equilibrium having entropies {Sf}. The second law asserts that the sum of the final entropies is never less than the sum of the initial ones: Σ{Sf} ≥ Σ{Si}."
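That quoted statement can at least be checked on the simplest concrete case. A sketch (my own, not from the Wikipedia page), assuming two identical bodies with equal, temperature-independent heat capacity C that are brought into contact and settle at Tf = (T1 + T2)/2:

```python
import math

def entropy_change(T1, T2, C=1.0):
    """Total entropy change when two identical bodies at temperatures
    T1 and T2 equilibrate at Tf = (T1 + T2) / 2:
    dS = C*ln(Tf/T1) + C*ln(Tf/T2)."""
    Tf = (T1 + T2) / 2
    return C * math.log(Tf / T1) + C * math.log(Tf / T2)

# Hot (400 K) and cold (200 K): total entropy rises, as the
# second law requires.
print(entropy_change(400.0, 200.0))  # ~0.118 (positive)

# Already equilibrated: no change.
print(entropy_change(300.0, 300.0))  # 0.0
```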
The above is absolute BS because it takes no account of 'logically' ordered states like 'information', only physically ordered states, like where one part is hot and another cold. I can hardly believe such ignorance.. this guy is supported by over 460 observers.. we are in deep trouble.
The best and most meaningful statement of the Second Law is..
Any system left to itself will tend to move to its most probable state..!
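A toy illustration of that statement (my own sketch, nothing more): start all N particles on one side of a box and let them hop at random. The count drifts to, and then hovers near, the 50/50 split.. not because anything pushes it there, but because that macrostate has overwhelmingly more arrangements.

```python
import random

random.seed(42)

N = 1000      # particles
left = N      # a maximally improbable start: every particle on the left

# Each step, pick one particle uniformly at random and move it to the
# other side (the Ehrenfest urn model).
for _ in range(20000):
    if random.random() < left / N:
        left -= 1   # the chosen particle was on the left; it hops right
    else:
        left += 1   # it was on the right; it hops left

# After many steps the system sits near its most probable state,
# roughly N/2 on each side, with small fluctuations.
print(left)
```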
I did finally manage to put a {further explanation needed} tag on the misuse of the word "evolve" (dictionary synonym: 'develop') where it should read "decay", hopefully directing people to the 'talk' page to see why.
I also got some clarification into the page 'Introduction to Entropy'.. but there is a glaring confusion still to be dealt with: in particular, the use of the term "information entropy", suggesting there are different types of entropy. There are not..
There is only ONE entropy..!
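One can at least verify that the 'two' entropies share a single functional form. Gibbs's statistical-mechanical entropy is S = -k_B Σ p ln p and Shannon's is H = -Σ p log2 p; they differ only by the constant factor k_B ln 2. A quick check (my own illustration):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value since 2019)

def shannon_bits(probs):
    """Shannon entropy, H = -sum p*log2(p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gibbs(probs):
    """Gibbs entropy, S = -k_B * sum p*ln(p), in J/K."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

p = [0.5, 0.25, 0.125, 0.125]

# Same functional form: Gibbs = Shannon * (k_B * ln 2).
print(shannon_bits(p))                  # 1.75
print(gibbs(p) / (K_B * math.log(2)))   # 1.75 again
```

The only difference between the two is the unit: bits versus joules per kelvin.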
What I am telling you here is only the tip of the iceberg..