Conditional entropy

This online calculator computes the entropy of the random variable Y conditioned on the random variable X, and of X conditioned on Y, given a joint distribution table (X, Y) ~ p.


The conditional entropy H(Y|X) is the amount of information needed to describe the outcome of a random variable Y given that the value of another random variable X is known.

In order to calculate the conditional entropy, we need to know the joint distribution of X and Y. Below, you should enter the matrix where the cell value in row i and column j represents the probability of the outcome {x_i, y_j}, p_{(x_i, y_j)}. Rows represent the values of the random variable X, {x_1, x_2, ..., x_n}, and columns the values of the random variable Y, {y_1, y_2, ..., y_m}.
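For example, for a hypothetical 2x2 table with p(x_1, y_1) = 0.4, p(x_1, y_2) = 0.1, p(x_2, y_1) = 0.2, and p(x_2, y_2) = 0.3, the row sums give p(x_1) = p(x_2) = 0.5, and the resulting conditional entropy is H(Y|X) ≈ 0.85 bits; this example table is reused in the sketches below.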

Note that you can click "Show details" to view the details of the calculation. The formula used in the calculation is explained below the calculator.

[PLANETCALC calculator: Conditional entropy — enter the joint distribution matrix to compute H(Y|X) and H(X|Y), rounded to 2 digits after the decimal point]

Conditional entropy formula

The conditional entropy of Y given X is defined as

\mathrm {H} (Y|X)\ =-\sum _{x\in {\mathcal {X}},y\in {\mathcal {Y}}}p(x,y)\log {\frac {p(x,y)}{p(x)}}

By convention, the expressions 0\log 0 and 0\log \frac{c}{0} are treated as equal to zero.

p(x) for each row is calculated by summing the row values (that is, by summing the cells for each value of the random variable X), and the values p(x,y) are given directly by the input matrix.
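As a minimal sketch of this computation in Python (an illustration, not the calculator's actual source; the function name conditional_entropy and the example matrix are assumptions for this page):

from math import log2

def conditional_entropy(joint):
    # joint: matrix of p(x, y); rows are values of X, columns are values of Y
    h = 0.0
    for row in joint:
        px = sum(row)  # p(x) is the row sum
        for pxy in row:
            if pxy > 0:  # skip zero cells: the 0*log(0) terms count as zero
                h -= pxy * log2(pxy / px)
    return h

joint = [[0.4, 0.1],
         [0.2, 0.3]]  # the hypothetical example table from above
print(round(conditional_entropy(joint), 2))  # prints 0.85

Passing the transposed matrix, conditional_entropy(list(map(list, zip(*joint)))), yields H(X|Y) in the same way (≈ 0.88 for this table).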

What is the meaning of this formula?

In fact, it is the weighted average of specific conditional entropies over all possible values of X.

The specific conditional entropy of Y for X taking the value v is the entropy of Y among only those outcomes in which X has the value v. That is,

\mathrm {H} (Y|X=v)=-\sum _{y\in {\mathcal {Y}}}{P(Y=y|X=v)\log _{2}{P(Y=y|X=v)}}
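A minimal Python sketch of this quantity, assuming it is given a single row of the joint matrix (the helper name specific_conditional_entropy is hypothetical):

from math import log2

def specific_conditional_entropy(row):
    # row: the p(v, y) cells for a fixed X = v; p(y|X=v) = p(v, y) / p(v)
    pv = sum(row)  # p(X = v), the row sum
    return -sum(p / pv * log2(p / pv) for p in row if p > 0)

print(round(specific_conditional_entropy([0.4, 0.1]), 3))  # ~0.722 for X = x_1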

So the conditional entropy, expressed as the weighted sum of the specific conditional entropies over each possible value of X with p(x) as the weights, is

\begin{aligned}
\mathrm {H} (Y|X)\ &\equiv \sum _{x\in {\mathcal {X}}}\,p(x)\,\mathrm {H} (Y|X=x)\\
&=-\sum _{x\in {\mathcal {X}}}p(x)\sum _{y\in {\mathcal {Y}}}\,p(y|x)\,\log \,p(y|x)\\
&=-\sum _{x\in {\mathcal {X}}}\sum _{y\in {\mathcal {Y}}}\,p(x,y)\,\log \,p(y|x)\\
&=-\sum _{x\in {\mathcal {X}},y\in {\mathcal {Y}}}p(x,y)\log {\frac {p(x,y)}{p(x)}}
\end{aligned}
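This identity can be checked numerically; a short sketch, reusing the illustrative helpers from above on the hypothetical example table:

from math import log2

def specific_conditional_entropy(row):
    pv = sum(row)
    return -sum(p / pv * log2(p / pv) for p in row if p > 0)

def conditional_entropy(joint):
    return -sum(pxy * log2(pxy / sum(row))
                for row in joint for pxy in row if pxy > 0)

joint = [[0.4, 0.1], [0.2, 0.3]]
# weighted sum of specific conditional entropies, with weights p(x) = row sums
weighted = sum(sum(row) * specific_conditional_entropy(row) for row in joint)
print(abs(weighted - conditional_entropy(joint)) < 1e-12)  # prints True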

