mutual information

Noun

Meaning

mutual information (usually uncountable, plural mutual informations)

(information theory) A measure of the entropic (informational) correlation between two random variables.
Mutual information I(X;Y) between two random variables X and Y is what is left over when their mutual conditional entropies H(Y|X) and H(X|Y) are subtracted from their joint entropy H(X,Y). It can be given by the formula

I(X;Y) = -\sum_{x}\sum_{y} p_{X,Y}(x,y) \log_{b} \frac{p_{X,Y}(x,y)}{p_{X|Y}(x|y)\, p_{Y|X}(y|x)}.

Source: en.wiktionary.org
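
As a quick sanity check (a minimal sketch, not part of the dictionary entry; the joint distribution below is a made-up example), the following Python computes I(X;Y) for a small discrete joint distribution using the formula above, and compares it with the more familiar form I(X;Y) = \sum_{x}\sum_{y} p(x,y) \log_b [p(x,y) / (p_X(x) p_Y(y))]. The two agree because p_{X|Y}(x|y) p_{Y|X}(y|x) = p(x,y)^2 / (p_X(x) p_Y(y)), so the leading minus sign cancels the inverted ratio.

import math

# Hypothetical joint distribution p_{X,Y}(x, y) over two binary variables.
p_xy = {
    (0, 0): 0.4,
    (0, 1): 0.1,
    (1, 0): 0.1,
    (1, 1): 0.4,
}

# Marginals p_X(x) and p_Y(y), obtained by summing out the other variable.
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

b = 2  # log base; base 2 gives the result in bits

# Formula from the entry, using p(x|y) = p(x,y)/p_Y(y) and p(y|x) = p(x,y)/p_X(x):
# I(X;Y) = -sum_{x,y} p(x,y) log_b[ p(x,y) / (p(x|y) p(y|x)) ]
i_entry = 0.0
for (x, y), p in p_xy.items():
    if p > 0:
        p_x_given_y = p / p_y[y]
        p_y_given_x = p / p_x[x]
        i_entry -= p * math.log(p / (p_x_given_y * p_y_given_x), b)

# Standard form: I(X;Y) = sum_{x,y} p(x,y) log_b[ p(x,y) / (p_X(x) p_Y(y)) ]
i_std = sum(p * math.log(p / (p_x[x] * p_y[y]), b)
            for (x, y), p in p_xy.items() if p > 0)

print(f"entry formula:    {i_entry:.6f} bits")
print(f"standard formula: {i_std:.6f} bits")

Both lines print the same value, about 0.278 bits for this distribution.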

Examples

Mutual information (transinformation) measures the amount of information that can be obtained about one random variable by observing another. Source: Internet
