Joint Entropy

This online calculator computes the joint entropy of two discrete random variables given their joint distribution table, (X, Y) ~ p.

This page exists due to the efforts of the following people:

Timur

Created: 2019-10-12 15:06:56, Last updated: 2021-02-24 12:07:34

This content is licensed under Creative Commons Attribution/Share-Alike License 3.0 (Unported). That means you may freely redistribute or modify this content under the same license conditions and must attribute the original author by placing a hyperlink from your site to this work https://planetcalc.com/8418/. Also, please do not modify any references to the original work (if any) contained in this content.

Joint entropy is a measure of the uncertainty associated with a set of random variables.

To calculate the joint entropy, enter the joint distribution matrix, where the cell value in row i and column j represents the probability of the outcome (x_i, y_j), i.e. p(x_i, y_j). You can find the joint entropy formula below the calculator.
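For example, for two independent fair coin flips the joint distribution table is a 2×2 matrix with every entry equal to 0.25, and the joint entropy comes out to 2 bits.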

[Calculator: Joint entropy. Input: joint distribution matrix; setting: digits after the decimal point (default 2); output: Joint Entropy H(X,Y).]

Joint Entropy Formula

The joint Shannon entropy (in bits) of two discrete random variables X and Y with images \mathcal{X} and \mathcal{Y} is defined as:

\mathrm{H}(X,Y) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} P(x,y) \log_2 [P(x,y)]

where x and y are particular values of X and Y, respectively, P(x,y) is the joint probability of these values occurring together, and P(x,y) \log_2 [P(x,y)] is defined to be 0 if P(x,y) = 0.
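For reference, here is a minimal Python sketch of the same computation; the function name joint_entropy and the use of NumPy are illustrative assumptions, not part of the calculator itself:

import numpy as np

def joint_entropy(p_xy):
    """Joint Shannon entropy H(X, Y) in bits from a joint probability matrix.

    p_xy[i][j] is assumed to hold P(x_i, y_j); entries must be non-negative
    and sum to 1. Zero cells are skipped, matching the convention 0*log2(0) = 0.
    """
    p = np.asarray(p_xy, dtype=float)
    nonzero = p[p > 0]  # drop zero-probability cells
    return float(-np.sum(nonzero * np.log2(nonzero)))

# Example: two independent fair coin flips -> H(X, Y) = 2 bits
table = [[0.25, 0.25],
         [0.25, 0.25]]
print(joint_entropy(table))  # 2.0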

