Conditional entropy

From testwiki
Revision as of 20:24, 13 December 2021 by imported>Ffffrr (English: Wikipedia)

English


Noun

conditional entropy (plural conditional entropies)

  1. (information theory) The portion of a random variable's own Shannon entropy that is independent of another, given, random variable.
    The conditional entropy of a random variable Y given X (i.e., conditioned on X), denoted H(Y|X), is equal to H(Y) − I(Y;X), where I(Y;X) is the mutual information between Y and X.
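The identity above can be checked numerically. The sketch below uses a small made-up joint distribution (the numbers are illustrative, not from this entry) to compute H(Y|X) directly via H(Y|X) = H(X,Y) − H(X), compute I(Y;X) from its definition, and confirm that H(Y|X) = H(Y) − I(Y;X):

```python
import math

def entropy(dist):
    """Shannon entropy in bits of a probability distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Hypothetical joint distribution p(x, y) over X in {0, 1}, Y in {0, 1}.
joint = {
    (0, 0): 0.25, (0, 1): 0.25,
    (1, 0): 0.40, (1, 1): 0.10,
}

# Marginal distributions p(x) and p(y).
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

h_x = entropy(px)        # H(X)
h_y = entropy(py)        # H(Y)
h_xy = entropy(joint)    # joint entropy H(X,Y)

# Conditional entropy via the chain rule: H(Y|X) = H(X,Y) - H(X).
h_y_given_x = h_xy - h_x

# Mutual information from its definition:
# I(Y;X) = sum over (x, y) of p(x,y) * log2( p(x,y) / (p(x) * p(y)) ).
i_yx = sum(p * math.log2(p / (px[x] * py[y]))
           for (x, y), p in joint.items() if p > 0)

# The identity stated in the entry: H(Y|X) = H(Y) - I(Y;X).
assert abs(h_y_given_x - (h_y - i_yx)) < 1e-12
```

Conditioning can only reduce entropy, so H(Y|X) ≤ H(Y) always holds, with equality exactly when Y and X are independent (I(Y;X) = 0).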