In information theory, entropy is a measure of the uncertainty in a random variable. In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message.
The formula for entropy was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication".
H(X) = -\sum_{i=1}^{n} p_i \log_b p_i

The minus sign is used because the logarithm of a value less than 1 is negative; however, since

-\log_b p_i = \log_b \frac{1}{p_i},

the formula can also be expressed as

H(X) = \sum_{i=1}^{n} p_i \log_b \frac{1}{p_i}

The quantity \log_b \frac{1}{p_i} is also called the uncertainty or surprisal: the lower the probability p_i, i.e. p_i \to 0, the higher the uncertainty or the surprise, i.e. \log_b \frac{1}{p_i} \to \infty, for the outcome x_i.
The formula, in this case, expresses the mathematical expectation of the uncertainty, which is why information entropy and information uncertainty can be used interchangeably.
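As a concrete illustration of the formula, here is a minimal Python sketch; the function name shannon_entropy and the sum-to-one tolerance check are assumptions made for this example, not part of the calculators described in the article.

```python
import math

def shannon_entropy(probabilities, base=2):
    """Compute H(X) = -sum(p_i * log_b(p_i)) for a probability distribution."""
    if abs(sum(probabilities) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    # Outcomes with p_i = 0 contribute nothing, since p * log(p) -> 0 as p -> 0.
    return -sum(p * math.log(p, base) for p in probabilities if p > 0)

# A fair coin is maximally uncertain for two outcomes: exactly 1 bit.
print(shannon_entropy([0.5, 0.5]))  # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))  # ~0.4690
```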
There are two calculators below: the first computes Shannon entropy for given probabilities of events; the second computes Shannon entropy from the symbol frequencies of a given message.
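For the second case, here is a sketch of the same computation under the assumption that each symbol's probability is estimated by its relative frequency in the message; the name message_entropy is illustrative.

```python
from collections import Counter
import math

def message_entropy(message, base=2):
    """Entropy per symbol, estimating p_i as the relative frequency of each symbol."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * math.log(n / total, base) for n in counts.values())

print(message_entropy("hello world"))  # ~2.8454 bits per symbol
```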