
Shannon Entropy

This online calculator computes Shannon entropy for a given event probability table and for a given message.
Timur, 2013-06-04 15:04:43

In information theory, entropy is a measure of the uncertainty in a random variable. In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message.
The formula for entropy was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication".

H(X)= - \sum_{i=1}^np(x_i)\log_b p(x_i)

The minus sign is used because the logarithm of a value less than 1 is negative; however, since

-\log a = \log \frac{1}{a},

the formula can be expressed as

H(X)= \sum_{i=1}^np(x_i)\log_b \frac{1}{p(x_i)}

The expression
\log_b \frac{1}{p(x_i)}
is also called the uncertainty or surprisal: the lower the probability p(x_i), i.e. p(x_i) → 0, the higher the uncertainty or surprise, i.e. u_i → ∞, for the outcome x_i.
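For example, with b = 2, an outcome with probability 1/2 has a surprisal of \log_2 2 = 1 bit, while an outcome with probability 1/8 has a surprisal of \log_2 8 = 3 bits.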

In this case, the formula expresses the mathematical expectation of the uncertainty, which is why information entropy and information uncertainty can be used interchangeably.
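
For example, for a fair coin with two equally likely outcomes, p(x_1) = p(x_2) = 0.5 and b = 2,

H(X) = -\left(0.5\log_2 0.5 + 0.5\log_2 0.5\right) = 1

so a fair coin toss carries, on average, one bit of information.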

There are two calculators below: the first computes Shannon entropy for given event probabilities, and the second computes Shannon entropy from the symbol frequencies of a given message.
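
For readers who prefer code, here is a minimal Python sketch of both computations; it is an illustration only, not the calculator's actual implementation, and the function names shannon_entropy and message_entropy are hypothetical. It assumes base-2 logarithms, so entropy is measured in bits.

from collections import Counter
from math import log2

def shannon_entropy(probabilities):
    # H(X) = -sum p_i * log2(p_i); zero probabilities contribute nothing
    return -sum(p * log2(p) for p in probabilities if p > 0)

def message_entropy(message):
    # Estimate symbol probabilities from their frequencies in the message,
    # then reuse shannon_entropy on those probabilities.
    counts = Counter(message)
    return shannon_entropy(c / len(message) for c in counts.values())

print(shannon_entropy([0.5, 0.5]))    # fair coin: 1.0 bit
print(message_entropy("1223334444"))  # symbol frequencies 0.1, 0.2, 0.3, 0.4: about 1.8464 bits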

Shannon Entropy calculator (event probability table)

Shannon Entropy calculator (symbol frequencies of a given message)
