conditional entropy

Noun

Meaning

conditional entropy (plural conditional entropies)

(information theory) The portion of a random variable's Shannon entropy that is independent of another, given, random variable.
The conditional entropy of random variable Y given X (i.e., conditioned by X), denoted as H(Y|X), is equal to H(Y) − I(Y;X), where I(Y;X) is the mutual information between Y and X.
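The identity above can be checked numerically. The sketch below, using a small made-up joint distribution of two binary variables (the table p_xy is an illustrative assumption, not from the source), computes H(Y|X) directly from conditional probabilities and confirms it equals H(Y) − I(Y;X):

```python
import math

# Hypothetical joint distribution p(x, y) for two binary variables,
# chosen only for illustration.
p_xy = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.2, (1, 1): 0.3,
}

def marginal(p, axis):
    """Marginal distribution over one coordinate of the joint."""
    m = {}
    for key, prob in p.items():
        m[key[axis]] = m.get(key[axis], 0.0) + prob
    return m

def entropy(dist):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

p_x = marginal(p_xy, 0)
p_y = marginal(p_xy, 1)

# Conditional entropy directly: H(Y|X) = -sum_{x,y} p(x,y) log2 p(y|x),
# where p(y|x) = p(x,y) / p(x).
h_y_given_x = -sum(p * math.log2(p / p_x[x])
                   for (x, y), p in p_xy.items() if p > 0)

# Mutual information via I(Y;X) = H(Y) + H(X) - H(X,Y).
i_yx = entropy(p_y) + entropy(p_x) - entropy(p_xy)

# The identity from the definition: H(Y|X) = H(Y) - I(Y;X).
assert abs(h_y_given_x - (entropy(p_y) - i_yx)) < 1e-12
```

The equality holds for any joint distribution, since I(Y;X) = H(Y) − H(Y|X) by definition of mutual information.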

Source: en.wiktionary.org
