Mutual information


English

Template:Wikipedia

Noun

Template:En-noun

  1. Template:Lb A measure of the entropic (informational) correlation between two random variables.
    Mutual information I(X;Y) between two random variables X and Y is what is left over when their conditional entropies H(Y|X) and H(X|Y) are subtracted from their joint entropy H(X,Y). It can be given by the formula <math>I(X;Y)=\sum_x \sum_y p_{X,Y}(x,y)\,\log_b \frac{p_{X|Y}(x|y)\,p_{Y|X}(y|x)}{p_{X,Y}(x,y)}</math>.
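The definition above can be illustrated with a short sketch. Since p(x|y) = p(x,y)/p(y) and p(y|x) = p(x,y)/p(x), the formula is equivalent to the more common form I(X;Y) = Σ p(x,y) log_b [p(x,y)/(p(x) p(y))], which the function below (a hypothetical helper, not part of any standard library) computes from a joint probability mass function:

```python
import math

def mutual_information(p_xy, base=2):
    """Mutual information I(X;Y) of a joint pmf given as {(x, y): probability}.

    Uses the equivalent form I(X;Y) = sum p(x,y) log_b [p(x,y) / (p(x) p(y))].
    """
    # Accumulate the marginal distributions p_X and p_Y from the joint pmf.
    p_x, p_y = {}, {}
    for (x, y), p in p_xy.items():
        p_x[x] = p_x.get(x, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p
    # Sum p(x,y) * log_b of the ratio, skipping zero-probability outcomes.
    return sum(p * math.log(p / (p_x[x] * p_y[y]), base)
               for (x, y), p in p_xy.items() if p > 0)

# Two perfectly correlated fair bits share 1 bit of information.
correlated = {(0, 0): 0.5, (1, 1): 0.5}
# Two independent fair bits share none.
independent = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
```

With base b = 2 the result is measured in bits: the correlated pair above yields I(X;Y) = 1, the independent pair yields 0.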

Translations

Template:Trans-top

Template:Trans-bottom

See also