Word info

joint entropy

Noun

Meaning

joint entropy (countable and uncountable, plural joint entropies)

(information theory) The Shannon entropy of a "script" whose "characters" are elements of the Cartesian product of the sets of characters of the component scripts.
If random variables X and Y are mutually independent, then their joint entropy H(X, Y) is just the sum H(X) + H(Y) of its component entropies. If they are not mutually independent, then their joint entropy will be H(X) + H(Y) − I(X; Y), where I(X; Y) is the mutual information of X and Y.
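
As a rough numerical sketch of both statements, the code below computes a joint entropy directly from a joint distribution over pairs (the Cartesian product of outcomes) and checks the identity H(X, Y) = H(X) + H(Y) − I(X; Y). The distribution, variable names, and helper functions are invented for illustration and are not part of the entry.

import math
from collections import defaultdict

def entropy(pmf):
    # Shannon entropy, in bits, of a distribution given as {outcome: probability}.
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def marginals(joint_pmf):
    # Marginal distributions of X and Y from a {(x, y): probability} joint pmf.
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), p in joint_pmf.items():
        px[x] += p
        py[y] += p
    return px, py

def mutual_information(joint_pmf):
    # I(X; Y) = H(X) + H(Y) - H(X, Y), in bits.
    px, py = marginals(joint_pmf)
    return entropy(px) + entropy(py) - entropy(joint_pmf)

# Invented joint distribution of two dependent binary variables X and Y.
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
px, py = marginals(pxy)
print(entropy(pxy))                                          # H(X, Y), about 1.72 bits
print(entropy(px) + entropy(py) - mutual_information(pxy))   # same value: H(X) + H(Y) - I(X; Y)

# Independent pair: I(X; Y) is 0, so the joint entropy is just H(X) + H(Y).
indep = {(x, y): 0.5 * 0.5 for x in (0, 1) for y in (0, 1)}
print(mutual_information(indep))                             # 0.0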

Source: en.wiktionary.org
