# Joint Entropy

This online calculator computes the joint entropy of two discrete random variables X and Y, given their joint distribution table (X, Y) ~ p.

### This page exists due to the efforts of the following people:

#### Timur

Created: 2019-10-12 15:06:56, Last updated: 2021-02-24 12:07:34

Joint entropy is a measure of the uncertainty associated with a set of variables.

To calculate the joint entropy, enter the joint distribution matrix, where the cell value in row $i$ and column $j$ is the joint probability $p_{ij} = P(X = x_i, Y = y_j)$. You can find the joint entropy formula below the calculator.

#### Joint entropy

- Digits after the decimal point: 2
- Output: Joint Entropy H(X, Y)

### Joint Entropy Formula

The joint Shannon entropy (in bits) of two discrete random variables $X$ and $Y$ with images $\mathcal{X}$ and $\mathcal{Y}$ is defined as:

$$H(X,Y) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} P(x,y) \log_2\left[P(x,y)\right]$$

where $x$ and $y$ are particular values of $X$ and $Y$, respectively, $P(x,y)$ is the joint probability of these values occurring together, and $P(x,y) \log_2\left[P(x,y)\right]$ is defined to be 0 if $P(x,y) = 0$.
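The formula above translates directly into a few lines of code. The following is a minimal Python sketch (the function name `joint_entropy` is my own, not part of the calculator): it sums $-P(x,y)\log_2 P(x,y)$ over every cell of the joint distribution matrix, skipping zero-probability cells per the convention above.

```python
import math

def joint_entropy(p):
    """Joint Shannon entropy H(X, Y) in bits.

    p[i][j] holds the joint probability P(X = x_i, Y = y_j).
    Cells with zero probability contribute 0 by convention.
    """
    return -sum(pij * math.log2(pij) for row in p for pij in row if pij > 0)

# Example: a uniform 2x2 joint distribution (4 equally likely outcomes)
table = [[0.25, 0.25],
         [0.25, 0.25]]
print(joint_entropy(table))  # -> 2.0 bits
```

For a distribution concentrated on a single outcome, the entropy is 0 bits, the minimum possible; the uniform distribution over all cells gives the maximum.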

