Mutual information

English

Noun

mutual information

  1. (information theory) A measure of the entropic (informational) correlation between two random variables.
    The mutual information I(X;Y) between two random variables X and Y is what is left over when the conditional entropies H(Y|X) and H(X|Y) are subtracted from their joint entropy H(X,Y). It can be given by the formula I(X;Y) = \sum_{x} \sum_{y} p_{X,Y}(x,y) \, \log_b \frac{p_{X|Y}(x|y) \, p_{Y|X}(y|x)}{p_{X,Y}(x,y)}, which is equivalent to the more familiar form \sum_{x} \sum_{y} p_{X,Y}(x,y) \, \log_b \frac{p_{X,Y}(x,y)}{p_X(x) \, p_Y(y)} (a numerical sketch follows the definition).
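
Because the formula is just a weighted sum over a discrete joint distribution, it is easy to evaluate directly. The following Python sketch does so for a small made-up 2x2 joint distribution; the distribution values and the function name mutual_information are illustrative assumptions, not part of the entry:

    import math

    # Hypothetical joint distribution p_{X,Y}(x, y); rows index x, columns index y.
    p_xy = [[0.4, 0.1],
            [0.1, 0.4]]

    # Marginals p_X(x) and p_Y(y), obtained by summing the joint.
    p_x = [sum(row) for row in p_xy]
    p_y = [sum(col) for col in zip(*p_xy)]

    def mutual_information(p_xy, p_x, p_y, b=2):
        """Evaluate the entry's formula: p_{X,Y}(x,y) times the base-b log of
        p_{X|Y}(x|y) * p_{Y|X}(y|x) / p_{X,Y}(x,y), summed over all (x, y)."""
        total = 0.0
        for i, row in enumerate(p_xy):
            for j, p in enumerate(row):
                if p > 0:  # terms with zero probability contribute nothing
                    p_x_given_y = p / p_y[j]   # p_{X|Y}(x|y)
                    p_y_given_x = p / p_x[i]   # p_{Y|X}(y|x)
                    total += p * math.log(p_x_given_y * p_y_given_x / p, b)
        return total

    print(mutual_information(p_xy, p_x, p_y))  # about 0.278 bits for this joint

With b = 2 the result is in bits; an independent joint (where p_{X,Y}(x,y) = p_X(x) p_Y(y) everywhere) would give exactly 0.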

See also