Conditional entropy
This online calculator computes the entropy of a random variable Y conditioned on a random variable X, and of X conditioned on Y, given a joint distribution table (X, Y) ~ p.
This content is licensed under Creative Commons Attribution/Share-Alike License 3.0 (Unported). That means you may freely redistribute or modify this content under the same license conditions and must attribute the original author by placing a hyperlink from your site to this work https://planetcalc.com/8414/. Also, please do not modify any references to the original work (if any) contained in this content.
The conditional entropy H(Y|X) is the amount of information needed to describe the outcome of a random variable Y given that the value of another random variable X is known.
In order to calculate the conditional entropy, we need to know the joint distribution of X and Y. Below you should enter the matrix where the cell value for row i and column j represents the joint probability of the outcome, $p(x_i, y_j)$. Rows represent the values of the random variable X, and columns the values of the random variable Y.
Note that you can click "Show details" to view the details of the calculation. The formula used in the calculation is explained below the calculator.
Conditional entropy formula
The conditional entropy of Y given X is defined as

$$H(Y|X) = -\sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)}$$

It is assumed that the expressions $0 \log 0$ and $0 \log \frac{0}{0}$ should be treated as being equal to zero.
$p(x)$ for each row is calculated by summing the row values (that is, summing the cells for each value of the random variable X), and the joint probabilities $p(x,y)$ are already given by the input matrix.
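The calculation described above can be sketched in a few lines of Python. This is not the calculator's actual implementation, just a minimal illustration of the formula: the row sum gives the marginal $p(x)$, and zero cells are skipped, which implements the convention that $0 \log \frac{0}{0} = 0$. The example matrix is hypothetical.

```python
import math

def conditional_entropy(joint):
    """H(Y|X) in bits, from a joint probability matrix.

    Rows correspond to values of X, columns to values of Y."""
    h = 0.0
    for row in joint:
        px = sum(row)  # marginal p(x): sum of the row's cells
        for pxy in row:
            if pxy > 0:  # skip zero cells: 0*log(0/0) is treated as 0
                h -= pxy * math.log2(pxy / px)
    return h

# Hypothetical joint distribution: when X = x1, Y is fully determined;
# when X = x2, Y is uniform over its two values.
joint = [[0.5, 0.0],
         [0.25, 0.25]]
print(conditional_entropy(joint))  # 0.5 bits
```

Using log base 2 gives the entropy in bits; switching `math.log2` to `math.log` would give nats.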
What is the meaning of this formula?
In fact, it is the weighted average of specific conditional entropies over all possible values of X.
The specific conditional entropy of Y for X taking the value v is the entropy of Y among only those outcomes in which X has the value v. That is,

$$H(Y|X=v) = -\sum_{y} p(y|X=v) \log p(y|X=v)$$
So, the conditional entropy, written as the weighted sum of specific conditional entropies for each possible value of X, using p(x) as the weights, is

$$H(Y|X) = \sum_{x} p(x)\, H(Y|X=x)$$
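The weighted-average view can also be sketched directly in Python: each row of the joint matrix is first normalized into the conditional distribution $p(y|X=x_i)$, its entropy is computed, and the results are averaged with weights $p(x)$. The matrix below is the same hypothetical example used for illustration; both forms of the formula give the same answer.

```python
import math

def specific_conditional_entropy(joint, i):
    """H(Y | X = x_i): entropy of Y restricted to row i of the joint matrix."""
    px = sum(joint[i])  # marginal p(x_i)
    h = 0.0
    for pxy in joint[i]:
        p = pxy / px  # conditional probability p(y | X = x_i)
        if p > 0:     # 0*log(0) is treated as 0
            h -= p * math.log2(p)
    return h

def conditional_entropy(joint):
    """H(Y|X) as the p(x)-weighted average of the specific entropies."""
    return sum(sum(row) * specific_conditional_entropy(joint, i)
               for i, row in enumerate(joint))

joint = [[0.5, 0.0],
         [0.25, 0.25]]
print(specific_conditional_entropy(joint, 0))  # 0.0 bits: Y determined when X = x1
print(specific_conditional_entropy(joint, 1))  # 1.0 bit: Y uniform when X = x2
print(conditional_entropy(joint))              # 0.5 bits
```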