Tuesday, August 27, 2013

Appendix 4.3 Other Laws

5. The Rational Mind
Any theory that claims the human brain (the most orderly state of matter in the known universe) was randomly assembled, yet is 'rational' and 'logical' and therefore capable of discerning truth and doing 'science'.. must itself be unreliable.. not capable of doing science.

Gödel was a convinced theist.[22] He rejected the notion, held by others like his friend Albert Einstein, that God was impersonal.
He believed firmly in an afterlife, stating: "Of course this supposes that there are many relationships which today's science and received wisdom haven't any inkling of. But I am convinced of this [the afterlife], independently of any theology." It is "possible today to perceive, by pure reasoning" that it "is entirely consistent with known facts." "If the world is rationally constructed and has meaning, then there must be such a thing [as an afterlife]."[23]
...
[Wikipedia - Kurt Gödel]

I think God is an uncomfortable truth for everyone actually.. but in the end truth is what we need and what good science is all about. It cannot be left to a bunch of theophobics to make it up to suit their beliefs. They have brought science into disrepute and most certainly misled society into thinking you can forget about God (and by implication truth)..

Absolutes do exist.. Temperature has one, Entropy has one, Velocity has one.. so it should not surprise you to know that Law and Morality have them (since they are based on the existence of truth). Truth is by definition an absolute which cannot be avoided and which only fools try to fight.

Please read 9.1-2 'The Falsification of Evolution'.. if you agree I suggest you also look at "Who is God" at [vh-who.blogspot.com.au]. If you do not understand or accept the falsification case.. leave a comment.

I am sincerely trying to help you..

Thank you..

Appendix 4.2 Other Laws

These points are not so much other laws as places where evolutionary thinking fails in its application..

3. The First Star

After the mega blast of energy and mass, many times the size of the universe, the extremely high initial temperature produces a rapidly expanding ball of protons.. (hydrogen nuclei).. After a little more cooling each proton gains an electron and voilà - a massive ball of hydrogen.. (cosmic sand).

By definition that process is an adiabatic expansion of gas. It's adiabatic because there is no 'space' outside it for any of the heat to transfer or radiate into.. it is expanding as fast as the radiation itself. Which all means it begins in a low entropy state.. highly concentrated and very hot.. and moves to a high entropy state.. cool and fairly evenly distributed.

Now density fluctuations are theorised to cause the collapse of one or more regions under their collective gravity to eventually form the first massive stars. But compared to a cool ball of hydrogen in space a massive star is a much lower entropy state!! Oh no.. not that Second Law again.

So gravity MUST overcome the Second Law.. I don't think so.. I should be able to prove that.. (to come)

4. The Simple can explain the Complex

From a mathematical-logic point of view.. this was proved to be impossible by Kurt Gödel in his 1931 Incompleteness Theorems. There is always a 'remainder'. It's equivalent to saying the whole can be explained by examining the parts.. basic reductionism.. It's false.

An aeroplane may comprise 15000 parts.. none of which can fly..

In technological terms.. the whole is more than the sum of the parts.

Gödel's theorems give meaning to the word 'design' as a Functional State of Matter.. meaning a design always has an overall purpose.. which can only be the product of a mind.

Think about it..

Friday, August 23, 2013

Appendix 4.1 Other Laws

If the theory of evolution by natural selection violates the Second Law of Thermodynamics it should not be too surprising to find that ideas and theories founded on the same thinking may also be shown to violate other laws of physics. The scope of influence of evolutionary thinking cannot be overestimated as to its effects on almost every aspect of science and education from medicine to economics.. so what if it's wrong!!

1. The Law of Biogenesis (Louis Pasteur, circa 1860)
Inorganic chemistry.. molecules are supposed to have 'competed' for fitness, resulting in greater size and complexity in the primordial soup on an ancient earth cooling down from its assumed hot beginning. All leading to a self-replicating RNA or DNA molecule or a proto-protein capable of replication. That's the essence of 'abiogenesis'. However..

Last night on the BBC's Story of Science: Power, Proof and Passion program (23rd Aug 2013), Michael Mosley made a watershed admission.. DNA can only exist within a cell, complete with all its complex systems that maintain it and make it work. The observation was simply put: "living cells come from living cells".. and he concluded life started complex, not simple.

Biogenesis is back on the podium.. (Omne vivum ex vivo).

2. The First Law - the baryon number problem
After Einstein we knew mass was a concentrated form of energy and could be converted directly into energy, as confirmed by the atomic bomb. Conversely energy can be 'concentrated' to form mass. However.. the conservation law dictates that whenever this happens you always get an equal number of matter particles (baryons) and anti-matter particles.

So the postulate of a big bang creating matter from energy means you not only get a universe.. but also an anti-universe!! Which means they should both annihilate one another and return to energy! So it is postulated this universe is just the 'ashes' left over from a very much bigger bang than anyone can imagine.. meaning the universe is just a tiny 'error'.

Laws would break down at a big bang singularity.. however the energy -> mass conversion occurs after the singularity. The assumption of a baryon number error the size of this universe has to qualify as a violation of the First Law.

There's more.

Sunday, August 18, 2013

Appendix 3.3 Evolutionary Predictions

Last but not least is..

(3) Computer Simulation

From Darwin in 1857 to about 1990 it was not possible to do really big model simulations.. the only method was complex algebra and later calculus to find analytic solutions (formulae with general applicability). But many problems have no analytic solution.

In the 1990s massively parallel computers became available, allowing for simulation of such things as the weather.. and the solution of very large structural problems.. (economics is apparently still out of reach :*()

Surely evolution.. a bunch of replicating things.. subject to occasional random change.. in an environment with a few enemies, a few different foods, a few ecological niches etc.. let it run and see what survives.. a lot easier than the weather I would imagine (a toy sketch of such a loop follows below). Checking the Wiki.. (Evolutionary Algorithm)

"Techniques from evolutionary algorithms applied to the modeling of biological evolution are generally limited to explorations of microevolutionary processes. The computer simulations Tierra and Avida attempt to model macroevolutionary dynamics." (complete with spelling error.. mutation.. hmm)

Hint.. "macroevolution" means create new information.. "attempt" means NO, it's not been done..

In 'Climbing Mount Improbable' Richard Dawkins promised this prediction would be fulfilled within ten years.. that was 1996. From my falsification I can confidently say it will never be done.. because macro-evolution implies a violation of the Second Law for any DNA-coding protein type life on any Earth-like planet.
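For the curious, here is what the minimal version of such a simulation looks like.. a toy sketch of my own (nothing to do with Tierra or Avida, and every name and parameter in it is an illustrative assumption).. a population of bit strings, random copying errors, and selection of whatever already scored best:

```python
import random

random.seed(1)
TARGET = [1] * 32          # an arbitrary 'fit' pattern to score against
MUTATION_RATE = 0.01       # chance each bit flips when copied
POP_SIZE = 50

def fitness(genome):
    # score = matches to the target.. selection acts on this AFTER
    # mutation has already happened, it never directs the mutations
    return sum(g == t for g, t in zip(genome, TARGET))

def replicate(genome):
    # copy with occasional random change
    return [b ^ 1 if random.random() < MUTATION_RATE else b for b in genome]

population = [[random.randint(0, 1) for _ in range(32)] for _ in range(POP_SIZE)]

for generation in range(200):
    population.sort(key=fitness, reverse=True)   # rank by fitness
    survivors = population[: POP_SIZE // 2]      # the fitter half survive
    population = survivors + [replicate(g) for g in survivors]

print("best fitness:", fitness(max(population, key=fitness)))
```

Note the fitness target is fixed and supplied in advance.. the loop only ever SELECTS among variants that random flips have already produced.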

The humble honey bee routinely solves problems we need a supercomputer for..

A little humility is probably in order for us..

Appendix 3.2 Evolutionary Predictions

(2) Systematic Taxonomy

The evolutionary algorithm is a theory of limitless and constant change.. long periods of 'stasis' are cited for such creatures as crocodiles simply because their fossils cover a theoretical 200 million years.. However other fossil deposits demand very 'rapid' evolutionary change. The problem is the science is vague and accords more with convenient storytelling.

Human beings are described as having opted out of evolution.. But it was not so long ago that indigenous peoples were considered less than human.. primitive savages in evolutionary terms. Certainly Charles Darwin expressed this opinion and thought they would eventually die out.


The question is how does a system of continuous change produce a living kingdom which conforms to an enduring classification system of distinct types with identifiable groupings. The original idea actually came from the bible.. via a creationist (like all theists of his day) and it is still with us! Carl Linnaeus was the first to create the binomial naming system of classification.. it's changed, but his classification of large animals remains essentially unchanged.

'Species' is an inadequate term to talk about immutability, but at the higher family level it accords rather well with the bible's use of the word 'kind'. Creatures reproduce after their kind and the flesh of one kind is different to the flesh of another kind.. which accords well with observation and the Law of Biogenesis..

"Pasteur demonstrated that fermentation is caused by the growth of micro-organisms, and the emergent growth of bacteria in nutrient broths is due not to spontaneous generation, but rather to biogenesis (Omne vivum ex vivo "all life from life")." [from Wikipedia article Louis Pasteur]

must move on..

Appendix 3.1 Evolutionary Predictions

All good scientific theories make predictions which are crucial to the verification of the truth of the theory, and evolution is no exception. Again there are at least three I am aware of..

(1) The Occurrence of Beneficial Mutations

Since it is just too improbable for a mutation or series of mutations to occur precisely when needed.. by a change in, say, the environment, food supply or predators.. it is postulated that these accumulate in waiting as neutral or non-fatal mutations which become available when the system is driven out of its previously stable pattern.

Unfortunately studies of mutations to date have not turned up anything remotely like what would be required to actually make evolution a possibility. The vast majority of mutations are neutral or of very small effect, thanks to the redundancy built into DNA and the double helix. The elaborate proof-reading and error-correcting mechanism plays a vital role in verification of the copy, and last but not least.. the biblical command not to marry your close relatives.. also practised in nature by larger animals.

Dr Lee Spetner, a graduate in both mathematics and genetics, does a calculation in "Not By Chance" of the number of 'favourable' mutations required to meet the evolutionary requirement for a creature like the horse. It's of the order of 2 million!

Confirmation comes from research like that of Motoo Kimura.. whose data shows beneficial mutations to be so rare as to be unmeasurable.. and from research cited in the Wikipedia: "Out of all mutations, 39.6% were lethal, 31.2% were non-lethal deleterious, and 27.1% were neutral." One often-cited beneficial mutation, "Sickle Cell", gives resistance to Malaria but itself ultimately shortens life expectancy.

The data clearly does not support this prediction..

Appendix 2.4 Evolutionary Assumptions

(3) Physical Assumption

The physical assumption inherent in the evolutionary algorithm is that you can naturally increase complexity because natural selection gets around the Second Law..

The tactic has been to avoid a clear definition of the word 'complexity'.. ignorance at work. (go on.. have a look for the definition)..

Ludwig Boltzmann, Willard Gibbs and Max Planck put a scale on the axis of complexity for any state of matter..

     COMPLEXITY   =   IMPROBABILITY

The Second Law requirement for any large system.. that absolute entropy cannot naturally decrease.. means improbability cannot increase, which means complexity cannot increase over time. Inconsistent with evolution but consistent with all observations, simulations and honest calculations.
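Written out (my rendering of the standard relations this rests on):

```latex
S = k_B \ln W
\qquad \text{(Boltzmann: fewer microstates } W \text{, more improbable, lower entropy)}

\Delta S_{\text{system}} + \Delta S_{\text{surroundings}} \;\ge\; 0
\qquad \text{(Second Law, for any natural process)}
```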

Natural selection is just that, SELECTION of what has already been produced by random mutation, and therefore cannot affect or provide any direction to those mutations.

The essence of my falsification is the application of the second law to the evolution of semantic information as encoded in DNA.. but it equally applies to the physical state.

Physical 'Design' is another word left without proper definition. We even have R Dawkins inventing 'designoid' (appearance of design with purpose but not purposeful).. without first defining 'design' upon which it is based!

    DESIGN  =  A FUNCTIONAL STATE OF MATTER

Only a mind can conceive of a need for a function.. purpose is therefore implicit. If it has a purpose then by definition it is a design.. Like eyes, ears, muscles, feathers, bicycle chains, teapots and elephants..

Occam's Razor required..

Appendix 2.3 Evolutionary Assumptions

It's a pity Michael Behe of 'Intelligent Design' fame is only a biologist.. because, like Richard Dawkins, he is ignorant of technology.. You need to do a serious course in Engineering for that. Technology means functional design, and though not all designs are machines they do all have a purpose..


So Michael Behe calls a 'machine' or 'design' "Irreducibly Complex"..

That is not a good description because almost any machine can be shown to be reducible to some minor degree without destroying its capacity to do the function.. I think he qualifies 'reducible' to mean the complete loss of an essential component.. but it's not rigorous.

The reason you can reduce any machine by a small amount is simply that we cannot build machines to anything approaching optimum. An optimum machine or structure has all failure modes occur at the same time. That has nothing to do with over-design due to safety factors. It's over-design due to our imperfect knowledge, imperfect material quality and imperfect manufacturing processes.

A better description for any piece of technology (design) is..

        A FUNCTIONALLY COMPLEX STATE OF MATTER.

All technology (design) satisfies this definition..
and has the following characteristics..

(a)  It is SPECIFIED by semantic information
      (this is the design part)

(b) It is ISOLATED from all other machines..
     (where the design is not controlled by standards)

(c) It is internally INTEGRATED..
     (the parts work together)

(d) It performs a FUNCTION (which none of the parts can do)
     (the whole is more than the sum of the parts)


(e) It has a PURPOSE
     (meets a need)

All technology is therefore the product of a rational MIND.. because only a rational mind can express a need.

Moving on..

Saturday, August 17, 2013

Appendix 2.2 Evolutionary Assumptions

(2) Technological Assumption

The evolutionary algorithm must assume that for all evolved forms there exists an unbroken chain of changes going all the way back to its origin. Each one having a small but statistically significant selective advantage for survival over its predecessor.

Not only does the evolutionary algorithm assume such a chain exists but the requirement for a selective survival advantage at each stage also precludes the change from being 'infinitesimally small' or even just small.

The simplest and most obvious impasse to this assumption is in the direct comparison of biotechnology with human technology where they are functionally the same. So we may compare the design of the bacterial flagellar motor to an ordinary electric motor. Both have the following component parts..

Case housing two sets of bearings, a lubricant and lubrication system, a stator providing a variable 'poled' magnetic field, an armature with complementary magnetic poles in a precise phase relationship to the stator, electrical connections conforming to a workable circuit producing the required poles, an adequate power source for the required task, a fully reversible, variable speed control circuit and lastly a functional output device to perform the task for which the motor was designed.

We KNOW and can say with confidence.. such a machine CANNOT be designed and constructed incrementally in some hypothetical series of small steps where each step..

(a) Has a useful function.. which part do you make first and what does it do? and..

(b) Gets progressively better after each step at performing the same function.

If the function of one stage is different from that of its predecessor then you have effectively multiplied the problem.. What replaces it in its former function and what was performing the new function before the change?  Or are we to suppose the requirement for both functions changed simultaneously at the moment a random mutation miraculously occurred to provide a solution??

a bit more on this one..

Friday, August 16, 2013

Appendix 2.1 Evolutionary Assumptions

If the evolutionary algorithm, as simple and elegant as it sounds, actually violates the Second Law and is therefore impossible.. there should be some significant contrary observations linked to this seemingly incredible assertion.

All scientific theories embody foundational assumptions which must be independently verifiable. For evolution there are at least three.

(1) Mathematical Assumption

Evolutionary changes which are mathematically impossible in a single mutation event are claimed to be made possible by a large number of only slightly improbable mutation events. (R Dawkins "crane" in Climbing Mount Improbable)

This assumption relies on the true observation that Natural Selection will preserve or 'quarantine' a population from the steady occurrence of damaging mutations. The group need only wait until a beneficial one comes along.. which, when it does appear, spreads through the population simply because the progeny are more successful. Then all you have to do is wait for the next one.. and voilà, evolution. It could be true.. except for..


(a) Entropy being a state variable.. like any improbability (low entropy state), it is irrelevant how you got there.. the improbability is the same. For 100 heads it makes no difference if you toss 100 coins at once or one coin 100 times. Mount Improbable is just as forbidding.

(b) The codes of DNA put a finite limit on 'small'.. they are not infinitesimal.. (it's a real process, not quasi-static as R Dawkins erroneously claims.. "take any change as small as you like" - Climbing Mount Improbable).

Note natural selection is a SELECTION process acting on what random mutations have ALREADY PRODUCED.. so it logically cannot influence those mutations.

So if the evolutionary process is granted (as I granted in my calculation) every atom of the universe, for every millisecond in 14 billion years, applied to assembling a single protein molecule.. and it still falls massively short.. ie you cannot get enough beneficial changes (correct DNA to make the protein).. the protein remains impossible. A quick version of that arithmetic is sketched below.
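A back-of-envelope sketch of that arithmetic (the figures here.. ~10^80 atoms, a 150-residue protein, 20 amino acids.. are my own illustrative assumptions, not a reproduction of the original calculation):

```python
# Rough probabilistic-resources arithmetic: trials available if every atom in
# the observable universe ran one trial per millisecond for 14 billion years,
# versus the number of possible amino-acid sequences for one small protein.
# All figures are assumed round numbers for illustration.
from math import log10

atoms = 1e80                            # commonly quoted rough estimate
ms_available = 14e9 * 3.156e7 * 1e3     # years x seconds/year x ms/second
trials = atoms * ms_available           # one trial per atom per millisecond

protein_length = 150                    # a small protein
sequences = 20 ** protein_length        # possible strings of 20 amino acids

print(f"trials available  : 10^{log10(trials):.0f}")
print(f"possible sequences: 10^{log10(sequences):.0f}")
print(f"shortfall factor  : 10^{log10(sequences) - log10(trials):.0f}")
```

On these assumed figures the trials fall short by a factor of about 10^95.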

moving on..

Wednesday, August 14, 2013

Appendix 1.5 Entropy

I could not find the Boltzmann Gibbs equation for absolute entropy applied to DNA like this anywhere. It challenges the naturalistic assumption that semantic information can evolve from a chance combination of mass and energy.. by definition DNA is a low entropy state.. which must be paid for..

The Second Law condition for the random assembly of a string of semantic information of length p from an alphabet of m possible codes is..

                 n  =  m^p    random trials.

(eg to throw a double [6] with 2 dice n = 6^2 = 36 throws or
for a string of 10 bases of DNA n = 4^10 = 1048576 random mutations)

It is the average occurrence of a specific sequence in an infinite number of random trials that determines the minimum number of trials (the entropy cost) needed to meet the Second Law requirement that entropy must increase.

The probability of getting at least one occurrence of a specific string of length p codes from an alphabet of m possibilities in n = m^p random trials is..

     Pr(at least one)   =   1  -   Pr(not getting any)  =   1   -    [(n-1) /n]^n

So for at least one [2 heads] string (m = 2, p = 2) in n = 2^2 = 4 double-tosses of a coin..
Pr(at least one 2xhead)  =  1  -  [(4 - 1) / 4]^4    =   0.6836

For at least one [double 6] from 2 dice in n = 6^2 = 36 throws..
Pr(at least one)  =  1  -  (35/36)^36  =  0.6373

For at least one 10 base DNA string from 1048576 random mutations..
Pr(at least one)   =   1  -   [1048575/1048576]^1048576   =   0.6321

Note the probability of at least one occurrence as n gets large asymptotes toward a certain LIMITING value.. So what is it?

      the limit as n -> infinity  of  [1  -  [(n-1)/n]^n]    =    1  -  1/e    ≈   0.6321

It's my number so.. the 'Bellamy limit' = 1 - 1/e is the lowest probability demanded by the Second Law for a randomly assembled string of semantic information of length p from m codes in n = m^p tries.. to make it PROBABLE ENOUGH NOT to violate that law (for large n, say > 50).
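A quick numerical check of the three examples and the limit (a few lines of Python; the function name is my own):

```python
# Verify Pr(at least one) = 1 - ((n-1)/n)^n for the worked examples above,
# and compare with the limiting value 1 - 1/e as n grows large.
from math import e

def p_at_least_one(n):
    return 1 - ((n - 1) / n) ** n

print(f"coin, n = 2^2  = 4       : {p_at_least_one(4):.4f}")       # 0.6836
print(f"dice, n = 6^2  = 36      : {p_at_least_one(36):.4f}")      # 0.6373
print(f"DNA,  n = 4^10 = 1048576 : {p_at_least_one(4 ** 10):.4f}") # 0.6321
print(f"limit 1 - 1/e            : {1 - 1/e:.4f}")                 # 0.6321
```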

(Jan 2015: I now believe it applies to all logical states ie.. microstates)

Sunday, August 11, 2013

Appendix 1.4 Entropy

Now..

                 MASS    +    ENERGY    =     INFORMATION

is the EQUATION of LIFE..
It's the basis of the assumption that life will evolve on Earth-like planets.. so

             ROCK  +  LIQUID WATER    +    HEAT   =    DNA

MUST be true for life to evolve.. To fail to question this assumption is to fail to do science.

2. Entropy is an EXTRINSIC property.. meaning its value depends on factors which are not inherent to the mass of the system.

Weight is an extrinsic property of mass because its value depends upon the strength of the gravitational acceleration where it is being measured.

The absolute entropy of any system depends upon only ONE thing.. 'W' the thermodynamic probability of the system. It is the number of micro-states in that macro-state.. Given all micro-states are equally likely a system (of particles) will tend to move to a macro-state with more micro-states and so entropy increases.

The absolute entropy of the dice is a combination of both the PHYSICAL state and the LOGICAL state (ie double 3). The Clausius equation for entropy change can account for the physical state during the process of manufacture but only the Boltzmann equation can account for the logical state which is extrinsic because it is not dependent on the mass of the dice.


The important thing to understand about the Second Law is systems left to themselves will tend to move to a more probable state. All we need is a probability calculation to determine the STRENGTH of that tendency. For a large system it is not sufficient for it just to be POSSIBLE, it must be PROBABLE..

For DNA (a large book, even for bacteria) the total number of possible arrangements is what determines its absolute entropy and the possibility of that state arising from random mutations (mass + energy).

So does the Second Law define a boundary between possible and probable?

one more on this..

Appendix 1.3 Entropy

The fact that entropy is a STATE variable (does not care about the process that got the system to where it is) does not mean the process can violate the Second Law..

Consider two pairs of dice.. first a pair of plastic dice on the kitchen table, second a pair of 1 metre steel cubes (dice) at the bottom of a 10 metre high sand dune..

Now both exist.. so their physical form (a low absolute entropy state of matter) has been accounted for by a larger increase in the entropy of the surroundings during their manufacturing processes, and obeys the Second Law..

Now throw both pairs 9972 times.. There is a crane and a dump truck at the bottom of the sand dune..

Counting the number of [3].[3].. we know it will be about 277, on average once every 36 throws for both (a quick simulation below checks this). But the absolute entropy of the LOGICAL state [3].[3] is the SAME for both (actually zero, because there is only one way to get it). However the heat energy, and the consequent entropy increase in the surroundings, is massively bigger for one than the other..
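Here is that check.. a minimal simulation (the seed and formatting are my own choices):

```python
# Throw a pair of dice 9972 times and count double threes.
# Expected count = 9972 / 36 = 277; any single run scatters around that.
import random

random.seed(42)
count = sum(
    1 for _ in range(9972)
    if random.randint(1, 6) == 3 and random.randint(1, 6) == 3
)
print("double [3].[3] count:", count, "| expected ~", 9972 // 36)
```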

This not only shows why the absolute entropy of a system must be independent of the process that got it there BUT also that the absolute entropy of any LOGICAL state, such as semantic information, is independent of the mass of the system.. demonstrating semantic information is a separate and distinctly different entity to mass or energy.. and cannot be a product of them..

               MASS   +     ENERGY   ≠    INFORMATION  (live DNA)

more to come..

Appendix 1.2 Entropy

Now we need to get some fundamentals sorted out.. particularly concerning the two equations used for calculating entropy, which have been all mixed up thanks to the very distorted article in the Wikipedia. I tried to fix it.. I really did.. but the self-proclaimed 'moderator' used his knowledge of physics to put up a smoke screen of techno-babble to fend me off... all I can say is..   I'LL BE BACK!

1. Entropy is a STATE variable meaning its absolute value is independent of the path by which it got there.. we know this from the Boltzmann Gibbs equation for absolute entropy which reveals it is directly (not linearly) and exclusively based on the probability of that particular state existing.

The Rudolf Clausius 'heat' equation is the integral sum (meaning infinitesimally small amounts added up) of the heat crossing a system boundary divided by the temperature at that point and moment on the boundary. It calculates the CHANGE in entropy of the system inside the boundary for a given quantity of heat transferred. Heat IN is positive, increasing entropy; heat OUT is negative, decreasing entropy. Not only does this say nothing about the absolute entropy of the system, it totally misses logical states like semantic information such as a book or a molecule of DNA.

The problem with absolute entropy is the calculation of the thermodynamic probability term 'W'. It requires the IDENTIFICATION of every particle or logical place-holder in the system and, for each macro-state {set of energy states or logical sequences}, the entire count of all possible combinations when every particle is swapped with every other particle in the system.. which is mind-bogglingly big for any more than a few particles. The two equations are written out below.
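Side by side (standard forms.. W here is the usual multiplicity count the paragraph above describes):

```latex
\Delta S \;=\; \int \frac{\delta Q}{T}
\qquad \text{(Clausius: the \emph{change} in entropy from heat crossing the boundary)}

S \;=\; k_B \ln W, \qquad W \;=\; \frac{N!}{\prod_i n_i!}
\qquad \text{(Boltzmann: \emph{absolute} entropy from the microstate count)}
```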

Engineers are mostly interested in the change of entropy during a process anyway, so Clausius is the big winner here. His equation came in about 1865, just in time for the industrial revolution and steam power where it was needed most.

more to come

Sunday, August 4, 2013

Appendix 1.1 Entropy

I visited the London Science Museum a few years ago and saw they had an entire floor devoted to 'ENERGY', so I asked "Where's the floor devoted to 'ENTROPY'?".. a sort of stunned silence followed by "What's entropy?".

Guess what.. There is a reason why nobody seems to know or care what 'entropy' is or means.. it is an embarrassment to the naturalists.. who control the agenda of modern science.

I need a new blog.. "What's Missing from Science".. later

ENTROPY.. according to the Cambridge Encyclopedia

"A quantitative measure of disorder" not bad.. but it would help if they defined 'disorder'.. Is it what you would naturally think of? Like 'a mess on the desk'.. well almost.

But there is a proviso.. Disorder in the thermodynamic context means not just jumbled up but uniformly jumbled up.. So the papers in neat piles by subject and in date order would correspond to the minimum entropy state for that system. Still in 'piles' but not in date order.. higher entropy.. all piled up on one side of the desk.. still higher entropy.. and evenly spread over the desk in random fashion would be the highest entropy state. Note both PHYSICAL and LOGICAL order affect the value of the absolute entropy of a system.

Because semantic information is always stored in physical matter (even an idea in your head is a circuit in physical neurons) it is also subject to the Law of Increasing Entropy.. or increasing disorder.

When energy finally ends up as heat (or radiation), it spreads out evenly, moving from higher concentration (hot) to lower concentration (cool).. driven by the most powerful axiom of probability.. a disordered state is a more probable state because there are far more of them than ordered states.

Think of a working motorcycle of 5000 parts.. How many ways could those parts be connected together.. and for how many would it still work?
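Putting a number on that rhetorical question (a two-line calculation.. 5000 distinct parts laid in a line can be ordered 5000! ways, and lgamma gives ln(n!) without overflow):

```python
# Number of orderings of 5000 distinct parts: 5000!
# lgamma(n + 1) = ln(n!), so divide by ln(10) to get the power of ten.
from math import lgamma, log

log10_ways = lgamma(5000 + 1) / log(10)
print(f"5000 parts can be ordered in roughly 10^{log10_ways:.0f} ways")
# ~10^16326 orderings.. of which vanishingly few are a working motorcycle
```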

next time