To: Wayners who wrote (775221) 3/16/2014 10:58:47 PM
From: koan
Respond to of 1573430

These monster computers keep exploring more and more math and physics. I am expecting them to tell us stuff pretty soon that will blow us away.

Two science fiction books recently blew me away: WWW: Wake and Hominids by Robert J. Sawyer. So well written, so smart, great mysteries, and you will feel smarter after reading them. I bought about 50 copies and gave them to my family and friends. One is about the internet gaining consciousness, and the other is about an alternate Earth where the Neanderthals survived and we didn't. He feeds you constant information wrapped in great mystery. They read fast.

From en.wiktionary.org, "Shannon entropy":

Etymology: Named after Claude Shannon, the "father of information theory".

Noun: Shannon entropy (countable and uncountable, plural Shannon entropies): information entropy. Shannon entropy H is given by the formula

    H = -\sum_i p_i \log_2 p_i

where p_i is the probability of character number i showing up in a stream of characters of the given "script".

Consider a simple digital circuit which has a two-bit input (X, Y) and a two-bit output (X AND Y, X OR Y). Assuming that the two input bits X and Y have mutually independent 50% chances of being HIGH, the input combinations (0,0), (0,1), (1,0), and (1,1) each have a 1/4 chance of occurring, so the circuit's Shannon entropy on the input side is

    H_in = -4 \cdot \tfrac{1}{4} \log_2 \tfrac{1}{4} = 2 bits.

The possible output combinations are (0,0), (0,1), and (1,1), with respective chances of 1/4, 1/2, and 1/4 of occurring, so the circuit's Shannon entropy on the output side is

    H_out = -\tfrac{1}{4} \log_2 \tfrac{1}{4} - \tfrac{1}{2} \log_2 \tfrac{1}{2} - \tfrac{1}{4} \log_2 \tfrac{1}{4} = 1.5 bits.

The circuit therefore reduces (or "orders") the information going through it by half a bit of Shannon entropy, due to its logical irreversibility.
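A quick way to check those two numbers is to compute the entropy directly. Here is a minimal Python sketch (the shannon_entropy helper name is mine, not something from the Wiktionary entry):

    from math import log2

    def shannon_entropy(probs):
        # H = -sum(p * log2(p)) in bits; zero-probability outcomes contribute nothing
        return -sum(p * log2(p) for p in probs if p > 0)

    # Input side: (0,0), (0,1), (1,0), (1,1), each with probability 1/4
    h_in = shannon_entropy([0.25, 0.25, 0.25, 0.25])

    # Output side of (X AND Y, X OR Y): (0,0), (0,1), (1,1) with probabilities 1/4, 1/2, 1/4
    h_out = shannon_entropy([0.25, 0.5, 0.25])

    print(h_in, h_out, h_in - h_out)  # 2.0 1.5 0.5

Running it prints 2.0, 1.5, and 0.5, matching the half-bit reduction the entry describes.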