
Shannon Entropy

This online calculator computes Shannon entropy for a given table of event probabilities and for a given message.

In information theory, entropy is a measure of the uncertainty in a random variable. In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message.
The formula for entropy was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication".

H(X) = -\sum_{i=1}^{n} p(x_i)\log_b p(x_i)

The minus sign is used because the logarithm of a value less than 1 is negative; however, since

-\log a = \log \frac{1}{a},

the formula can be expressed as

H(X) = \sum_{i=1}^{n} p(x_i)\log_b \frac{1}{p(x_i)}

The expression
\log_b \frac{1}{p(x_i)}
is also called the uncertainty or surprisal u_i of the outcome x_i: the lower the probability p(x_i), i.e. p(x_i) → 0, the higher the uncertainty or surprise, i.e. u_i → ∞.

The formula, in this case, expresses the mathematical expectation of the uncertainty, which is why information entropy and information uncertainty can be used interchangeably.
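For illustration, here is a minimal Python sketch of the same computation. The function name shannon_entropy, the choice of base b = 2 (entropy in bits), and the skipping of zero probabilities are my own assumptions, not taken from the calculator.

```python
from math import log2

def shannon_entropy(probabilities):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits (log base 2).
    Zero probabilities are skipped, since p*log(p) tends to 0 as p -> 0."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair coin carries exactly 1 bit of uncertainty:
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A heavily biased coin is far less uncertain:
print(shannon_entropy([0.9, 0.1]))   # ~0.47
```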

There are two calculators below: the first computes Shannon entropy for given event probabilities, and the second computes Shannon entropy from the symbol frequencies of a given message.
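For the second case, the probabilities p(x_i) are estimated from the message itself as relative symbol frequencies (symbol count divided by message length). A hedged Python sketch of that step, reusing the shannon_entropy function from the sketch above, might look like this:

```python
from collections import Counter

def message_entropy(message):
    """Per-symbol Shannon entropy of a message, in bits, estimated
    from the relative frequency of each symbol in the message."""
    counts = Counter(message)
    total = len(message)
    return shannon_entropy(count / total for count in counts.values())

print(message_entropy("hello world"))   # ~2.85 bits per symbol
```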

PLANETCALC, Shannon Entropy
Input: event probability table (columns: Event, Probability). Output: entropy, bits.

PLANETCALC, Shannon Entropy
Input: message text. Output: entropy, bits.
