
Shannon Entropy

This online calculator computes Shannon entropy for a given event probability table and for a given message.


This content is licensed under Creative Commons Attribution/Share-Alike License 3.0 (Unported). That means you may freely redistribute or modify this content under the same license conditions and must attribute the original author by placing a hyperlink from your site to this work https://planetcalc.com/2476/. Also, please do not modify any references to the original work (if any) contained in this content.

In information theory, entropy is a measure of the uncertainty in a random variable. In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message.
The formula for entropy was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication".

H(X) = -\sum_{i=1}^{n} p(x_i)\log_b p(x_i)

The minus sign is used because the logarithm of a value less than 1 is negative. However, since

-\log a = \log \frac{1}{a},

the formula can also be expressed as

H(X) = \sum_{i=1}^{n} p(x_i)\log_b \frac{1}{p(x_i)}

The expression

u_i = \log_b \frac{1}{p(x_i)}

is also called the uncertainty or surprisal of the outcome x_i: the lower the probability p(x_i), i.e. p(x_i) → 0, the higher the uncertainty or surprise, i.e. u_i → ∞.
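
For example, with b = 2, an outcome with probability p(x_i) = 1/8 has a surprisal of

\log_2 \frac{1}{1/8} = \log_2 8 = 3

bits, while a certain outcome, p(x_i) = 1, is not surprising at all: \log_2 1 = 0.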

The formula, in this case, expresses the mathematical expectation of uncertainty, which is why information entropy and information uncertainty can be used interchangeably.
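
As a minimal sketch of this computation in Python (the function name and example probabilities are illustrative, not part of the calculator), entropy can be evaluated directly as the expected surprisal:

import math

def shannon_entropy(probs, base=2):
    """Entropy H(X) as the expected surprisal: sum of p * log_b(1/p)."""
    if abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    # Zero-probability events contribute nothing: p * log(1/p) -> 0 as p -> 0.
    return sum(p * math.log(1.0 / p, base) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))   # biased coin: ~0.47 bits, less uncertain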

There are two calculators below: the first computes Shannon entropy for given probabilities of events, and the second computes Shannon entropy from the symbol frequencies of a given message.
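
A similar sketch for the second variant, assuming each symbol's probability is estimated as its relative frequency in the message (again, the function name is illustrative):

from collections import Counter
import math

def message_entropy(message, base=2):
    """Entropy of a message, with p(x_i) estimated as count(x_i) / len(message)."""
    counts = Counter(message)
    n = len(message)
    return sum((c / n) * math.log(n / c, base) for c in counts.values())

print(round(message_entropy("hello world"), 2))   # 2.85 bits per symbol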

[Calculator: PLANETCALC, Shannon Entropy. Input: event probability table (columns: Event, Probability). Output: Entropy, bits.]

[Calculator: PLANETCALC, Shannon Entropy. Input: message. Output: Entropy, bits.]

