Politics : Formerly About Advanced Micro Devices


To: Wayners who wrote (775221), 3/16/2014 10:58:47 PM
From: koan
 
These monster computers keep exploring more and more math and physics.

I am expecting them to tell us stuff pretty soon that will blow us away.

Two science fiction books recently blew me away. They're so well written and so smart, with great mysteries, and you will feel smarter after reading them. I bought about 50 copies and gave them to my family and friends.

WWW: Wake and Hominids by Robert Sawyer.

One is about the internet gaining consciousness, and one is about an alternate Earth where Neanderthals survived and we didn't.

He feeds you constant information wrapped in great mystery. They read fast.

en.wiktionary.org






Shannon entropy

Definition from Wiktionary, the free dictionary



English

Etymology

Named after Claude Shannon, the "father of information theory".

Noun

Shannon entropy (countable and uncountable, plural Shannon entropies)

  1. information entropy. The Shannon entropy H is given by the formula

     H = −Σᵢ pᵢ log₂ pᵢ

     where pᵢ is the probability of character number i showing up in a stream of characters of the given "script". Consider a simple digital circuit which has a two-bit input (X, Y) and a two-bit output (X AND Y, X OR Y). Assuming that the two input bits X and Y have mutually independent chances of 50% of being HIGH, the input combinations (0,0), (0,1), (1,0), and (1,1) each have a 1/4 chance of occurring, so the circuit's Shannon entropy on the input side is H = 4 × (1/4 × log₂ 4) = 2 bits. The possible output combinations are (0,0), (0,1), and (1,1), with respective chances of 1/4, 1/2, and 1/4 of occurring, so the circuit's Shannon entropy on the output side is H = 1/4 × log₂ 4 + 1/2 × log₂ 2 + 1/4 × log₂ 4 = 1.5 bits. The circuit thus reduces (or "orders") the information passing through it by half a bit of Shannon entropy, a consequence of its logical irreversibility.
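The worked example above is easy to check in a few lines of Python (a minimal sketch; the function name `shannon_entropy` is my own, not from the Wiktionary entry):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) over a probability distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Input side: (X, Y) takes four equally likely values -> 2 bits
h_in = shannon_entropy([0.25, 0.25, 0.25, 0.25])

# Output side: (X AND Y, X OR Y) yields (0,0), (0,1), (1,1)
# with probabilities 1/4, 1/2, 1/4 -> 1.5 bits
h_out = shannon_entropy([0.25, 0.5, 0.25])

print(h_in, h_out, h_in - h_out)  # 2.0 1.5 0.5
```

The half-bit difference is exactly the information the irreversible gates discard: two distinct inputs, (0,1) and (1,0), collapse onto the same output.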