Sunday, October 18, 2015

Appendix 5.2 Richard Dawkins Verifies

These derivations go like this: first you set the problem up in words, then you translate the words into math symbols, work the math to get an answer, and finally translate it back into words. So here goes..

E  =  N · Σ(m=1 to n) 1/2^(m-1)  =  N · (2^n - 1)/2^(n-1)  =  2N · (2^n - 1)/2^n

Now for n large enough (a run of about n = 8 correct guesses will do),
(2^n - 1) ≈ 2^n, leaving E ≈ 2N.
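As a quick sanity check (my own Python sketch, not part of the original derivation), the closed form can be compared with the direct geometric sum, and the factor (2^n - 1)/2^(n-1) can be watched approaching 2:

```python
# Compare the direct geometric sum with the closed form for E/N,
# and watch the factor approach 2 as n grows (so E ~= 2N).
for n in (4, 8, 12, 20):
    direct = sum(1 / 2 ** (m - 1) for m in range(1, n + 1))
    closed = (2 ** n - 1) / 2 ** (n - 1)
    assert abs(direct - closed) < 1e-12
    print(n, closed)  # tends toward 2 as n grows
```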

Now take n = 8 (RD's example, where he noted the audience was about 100):

E (from the exact expression above) = N · (255/128)

From simple probability, a run of n correct guesses at random will occur once every 2^n tosses, averaged over a very large number of tosses.
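This one-in-2^n figure is just the chance of n independent correct calls in a row (a trivial check in Python, my own illustration):

```python
# A fair coin matches a fixed call with probability 1/2 on each toss,
# so n correct calls in a row have probability 1/2^n.
n = 8
p = (1 / 2) ** n
print(p, 1 / p)  # 0.00390625, i.e. once per 256 runs on average
```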

Hence the expected value of E is 2^8 = 256, and putting this value in above we get..

N · (255/128) = 256, therefore N = 128.502, or 129 (whole people).
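The arithmetic can be checked in a couple of lines (a hypothetical snippet of my own, just solving the equation above for N):

```python
import math

# Solve N * (255/128) = 256 for N
N = 256 * 128 / 255
print(round(N, 3))   # 128.502
print(math.ceil(N))  # 129 whole people
```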
This corresponds well with Richard Dawkins' estimate at the time. Now look at this table..


n         N          E = 2^n
8         129            256
10        513           1024
15        16385         32768
20        524289      1048576
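The table can be regenerated in a few lines (my own Python sketch, inverting the formula for E above and rounding N up to a whole person):

```python
import math

rows = []
for n in (8, 10, 15, 20):
    E = 2 ** n
    # invert E = N * (2^n - 1) / 2^(n-1) and round up to a whole person
    N = math.ceil(E * 2 ** (n - 1) / (2 ** n - 1))
    rows.append((n, N, E))
    print(f"{n:>3} {N:>8} {E:>9}")
```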

So N (rounded up) ≈ E/2. The audience size N determines E (the number of toss events), which (by my proof) equals the number required to pay the entropy debt for the low-entropy result (a run of heads guessed in a row).
The simulation verifies how the entropy debt is paid for by the process.. QED
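A minimal Monte Carlo sketch of such a knockout demo (my own Python, assuming the usual setup: each remaining player tosses once per round, survives with probability 1/2, and every toss is counted) gives averages close to the formula:

```python
import random

def run_demo(N, n, trials=2000, seed=1):
    """Average total number of tosses when N people play n knockout
    rounds: each round every remaining player tosses once and stays
    in with probability 1/2."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        survivors, tosses = N, 0
        for _ in range(n):
            tosses += survivors
            survivors = sum(rng.random() < 0.5 for _ in range(survivors))
        total += tosses
    return total / trials

expected = lambda N, n: N * (2 ** n - 1) / 2 ** (n - 1)

print(run_demo(129, 8))  # close to expected(129, 8) ~= 257
```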
