# Joint Entropy

This online calculator computes the joint entropy of two discrete random variables given a joint distribution table (X, Y) ~ p.

This content is licensed under Creative Commons Attribution/Share-Alike License 3.0 (Unported). That means you may freely redistribute or modify this content under the same license conditions and must attribute the original author by placing a hyperlink from your site to this work https://planetcalc.com/8418/. Also, please do not modify any references to the original work (if any) contained in this content.

Joint entropy is a measure of the uncertainty associated with a set of variables.

In order to calculate the joint entropy, you should enter the joint distribution matrix, where the cell value for the *i*-th row and *j*-th column represents the joint probability $p(x_i, y_j)$ of that outcome. The joint entropy formula can be found below the calculator.

### Joint Entropy Formula

The joint Shannon entropy (in bits) of two discrete random variables $X$ and $Y$ with images $\mathcal{X}$ and $\mathcal{Y}$ is defined as:

$$H(X, Y) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} P(x, y) \log_2 P(x, y)$$

where $x$ and $y$ are particular values of $X$ and $Y$, respectively, $P(x, y)$ is the joint probability of these values occurring together, and $P(x, y) \log_2 P(x, y)$ is defined to be 0 if $P(x, y) = 0$.^{1}
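The formula above can be sketched in a few lines of Python. This is a minimal illustration, not the calculator's actual implementation; the function name and the example matrix are chosen here for demonstration. Note how zero-probability cells are skipped, matching the convention that $0 \log_2 0 = 0$:

```python
import math

def joint_entropy(p):
    """Joint Shannon entropy (in bits) of a joint distribution matrix p,
    where p[i][j] is the joint probability P(X = x_i, Y = y_j)."""
    return -sum(
        pij * math.log2(pij)
        for row in p
        for pij in row
        if pij > 0  # cells with P(x, y) = 0 contribute 0 by convention
    )

# Example (hypothetical): a uniform joint distribution over a 2x2 table.
# Four equally likely outcomes give H(X, Y) = log2(4) = 2 bits.
p = [[0.25, 0.25],
     [0.25, 0.25]]
print(joint_entropy(p))  # → 2.0
```

For a uniform distribution over $n$ outcomes the joint entropy is simply $\log_2 n$, which is a quick sanity check for any implementation.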
