Sunday, October 18, 2015

Appendix 5.2 Richard Dawkins Verifies

These derivations go like this.. first you set the problem up in words.. then you translate the words into math symbols, work the math to get an answer, and then translate it back into words. So here goes..

    E  =  N · Σ_{m=1}^{n} 1/2^(m-1)  =  N · (2^n - 1)/2^(n-1)  =  2N · (2^n - 1)/2^n

Now for n large enough (n ~ 8 correct guesses will do), (2^n - 1) ~= 2^n, leaving E ~= 2N.
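As a quick numerical check, here is a minimal Python sketch (the function names are illustrative only, not part of the derivation) that evaluates the sum directly, compares it with the closed form, and shows how quickly E approaches 2N as n grows:

    # Evaluate E = N * sum_{m=1..n} 1/2^(m-1) directly and via the closed form,
    # then compare with the large-n approximation E ~= 2N.
    def guess_events_sum(N, n):
        return N * sum(1 / 2 ** (m - 1) for m in range(1, n + 1))

    def guess_events_closed(N, n):
        return N * (2 ** n - 1) / 2 ** (n - 1)

    N = 100
    for n in (2, 4, 8, 16):
        print(n, guess_events_sum(N, n), guess_events_closed(N, n), 2 * N)

For n = 8 and N = 100 both forms give 199.2, already very close to 2N = 200.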

Now for n = 8 (RD's example where he noted the audience was about 100)

E (from the actual calculation above) = N · (255/128)

From simple probability.. n correct guesses in a row will occur, on average, once in every 2^n tosses when averaged over a very large number of tosses.

Hence the expected value of E is 2^8 = 256, so putting this value into the equation above we get..

N · (255/128) = 256, therefore N = 128.502, or 129 (whole people)
This corresponds well with Richard Dawkins' estimate at the time. Now look at this table..


n         N           2^n = E
8         129         256
10        513         1024
15        16385       32768
20        524289      1048576
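The table can be reproduced with a short Python sketch, assuming the relation N · (2^n - 1)/2^(n-1) = 2^n derived above and rounding N up to whole people:

    import math

    # Set E = 2^n and solve N * (2^n - 1) / 2^(n-1) = 2^n for N,
    # i.e. N = 2^(2n-1) / (2^n - 1), then round up to a whole person.
    for n in (8, 10, 15, 20):
        E = 2 ** n
        N = math.ceil(2 ** (2 * n - 1) / (2 ** n - 1))
        print(f"n={n:2d}  N={N:7d}  E={E:7d}")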

So N (rounded up) ~= E/2.. The audience number N determines E (the number of events), which (by my proof) equals the number required to pay the entropy debt for the low entropy result (a run of heads guessed correctly in a row)..
The simulation verifies how the entropy debt is paid for by the process.. QED
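As an illustration, here is a minimal Monte Carlo sketch of the audience game (the function names are illustrative and this is not the original simulation code): everyone still standing guesses each toss, the wrong guessers sit down, and the total number of guess events per run is counted. The average should land near N · (2^n - 1)/2^(n-1), which for N = 129 is about 257, close to 2^n = 256.

    import random

    def run_experiment(N, n):
        # One run: N people stand, everyone standing guesses each coin toss,
        # roughly half guess wrong and sit down, for up to n rounds.
        standing, events = N, 0
        for _ in range(n):
            if standing == 0:
                break
            events += standing  # every person still standing makes a guess
            standing = sum(1 for _ in range(standing) if random.random() < 0.5)
        return events

    def average_events(N, n, trials=20_000):
        return sum(run_experiment(N, n) for _ in range(trials)) / trials

    N, n = 129, 8
    print("simulated average E ~", round(average_events(N, n), 1))
    print("predicted E         =", round(N * (2 ** n - 1) / 2 ** (n - 1), 1))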

Saturday, October 17, 2015

Appendix 5.1 Richard Dawkins Verifies

This is all about an experiment Richard Dawkins did with an audience of young people to demonstrate that improbable events happen all the time and that the inclination to see the supernatural in them is in fact illogical.. Ref: [https://www.youtube.com/watch?v=H1TxH0zf07w]..

So let's see what this same experiment reveals when analysed for conformance with the second law.. the basis of my falsification.. First I must say I agree the outcome is in no way supernatural.. and you do end up with an apparently remarkable result.. in the case given, correctly guessing the outcome of a coin toss 8 times in a row. But what does this actually demonstrate..?

Let's take the number of people participating = N
and the number of correct guesses in a row = n

The total number of guess 'events' = N + N/2 + N/4 + ... + N/2^(n-1) = (let's call this) E

    E  =  N · Σ_{m=1}^{n} 1/2^(m-1)    {i.e. N times the sum of the series 1/2^(m-1) from m = 1 to n}

Assuming half sit down after each toss of the coin.. because they got it wrong.

Now my falsification is based on the observation that the number of random changes (events, or coin tosses) required on average to get a certain improbable outcome is how the process balances the entropy decrease of that outcome (state) with an entropy increase in the surroundings. It's the entropy cost of the low entropy state, which by the second law must be paid by that process.

So now we have the end result: the low entropy state of 8 correct guesses in a row, which is equivalent to tossing 8 heads in a row, or tossing 8 coins and ending up with 8 heads.

The probability of this outcome = 1/2^8  (or 1/2^n for 'n' correct guesses in a row)
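A quick way to check that figure is to toss 8 coins repeatedly and count how many tries it takes before all of them come up heads; averaged over many runs the count should come out near 2^8 = 256. A minimal sketch (function name is illustrative only):

    import random

    def tries_until_all_heads(n):
        # Count how many n-coin tosses it takes to see all n heads at once.
        tries = 0
        while True:
            tries += 1
            if all(random.random() < 0.5 for _ in range(n)):
                return tries

    n, runs = 8, 2000
    avg = sum(tries_until_all_heads(n) for _ in range(runs)) / runs
    print("average tries ~", round(avg), " vs 2^n =", 2 ** n)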

So the question is, what audience size N gives enough events to pay the cost..?