In information theory, entropy is a measure of the uncertainty in a random variable. In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message.
Claude E. Shannon introduced the formula for entropy in his 1948 paper "A Mathematical Theory of Communication."
For a discrete random variable $X$ with possible outcomes $x_1, \ldots, x_n$ occurring with probabilities $p(x_1), \ldots, p(x_n)$, the entropy is

$$H(X) = -\sum_{i=1}^{n} p(x_i) \log_b p(x_i),$$

where $b$ is the base of the logarithm (commonly 2, which gives entropy in bits). The minus sign is used because the logarithm of a value less than 1 is negative. However, since $-\log_b p(x_i) = \log_b \frac{1}{p(x_i)}$, the formula can be expressed as

$$H(X) = \sum_{i=1}^{n} p(x_i) \log_b \frac{1}{p(x_i)}.$$

$\log_b \frac{1}{p(x_i)}$ is also called the uncertainty or surprisal: the lower the probability $p(x_i)$, i.e. $p(x_i) \to 0$, the higher the uncertainty or the potential surprise, i.e. $\log_b \frac{1}{p(x_i)} \to \infty$, for the outcome $x_i$.
In this case, the formula expresses the mathematical expectation of uncertainty, which is why information entropy and information uncertainty can be used interchangeably.
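As a simple worked example (not part of the original text): a fair coin with two equally likely outcomes has

$$H(X) = -\left(\tfrac{1}{2}\log_2\tfrac{1}{2} + \tfrac{1}{2}\log_2\tfrac{1}{2}\right) = 1 \text{ bit},$$

which is the maximum possible entropy for two outcomes; a biased coin is more predictable and therefore has lower entropy.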
This calculator computes Shannon entropy for a given set of event probabilities.
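For illustration, here is a minimal Python sketch of such a computation. The function name `shannon_entropy` and the choice of base-2 logarithm (entropy in bits) are assumptions for the example, not details taken from the calculator itself.

```python
from math import log2

def shannon_entropy(probabilities):
    """Shannon entropy in bits for a list of event probabilities.

    Assumes the probabilities are non-negative and sum to 1;
    zero-probability events contribute nothing to the sum.
    """
    return -sum(p * log2(p) for p in probabilities if p > 0)

# Example: three events with probabilities 1/2, 1/4, 1/4
# give 0.5 + 0.5 + 0.5 = 1.5 bits of entropy.
print(shannon_entropy([0.5, 0.25, 0.25]))  # 1.5
```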
This calculator computes Shannon entropy for symbol frequencies of a given message.
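A sketch of the frequency-based variant, under the same base-2 assumption: symbol probabilities are estimated as relative character frequencies in the message, which is the usual empirical approach, and the result is the entropy of the message's symbol distribution in bits per symbol.

```python
from collections import Counter
from math import log2

def message_entropy(message):
    """Shannon entropy in bits per symbol, estimated from the
    relative frequency of each character in a non-empty message."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Example: "aabb" has two equally likely symbols -> 1 bit per symbol.
print(message_entropy("aabb"))  # 1.0
```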