conditional entropy
English
Noun
conditional entropy (plural conditional entropies)
- (information theory) The portion of a random variable's own Shannon entropy that is independent of another, given, random variable.
- The conditional entropy of random variable Y given X (i.e., conditioned on X), denoted H(Y|X), is equal to H(Y) − I(Y;X), where I(Y;X) is the mutual information between Y and X.
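The identity above can be checked numerically. The sketch below uses a small, hypothetical joint distribution over two binary variables (the probabilities are illustrative, not from the original entry), computes H(Y|X) directly from its definition, and confirms it equals H(Y) − I(Y;X):

```python
import math

# Hypothetical joint distribution p(x, y), chosen only to
# illustrate the identity H(Y|X) = H(Y) - I(Y;X).
joint = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.2, (1, 1): 0.3,
}

def entropy(dist):
    # Shannon entropy in bits of a distribution given as {outcome: probability}.
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginal distributions of X and Y.
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

# Conditional entropy computed directly: H(Y|X) = sum_x p(x) * H(Y | X=x).
h_y_given_x = 0.0
for x, p_x in px.items():
    cond = {y: joint[(x, y)] / p_x for y in py if (x, y) in joint}
    h_y_given_x += p_x * entropy(cond)

# Mutual information via I(Y;X) = H(Y) + H(X) - H(X,Y).
i_yx = entropy(py) + entropy(px) - entropy(joint)

# The identity from the definition: H(Y|X) = H(Y) - I(Y;X).
assert abs(h_y_given_x - (entropy(py) - i_yx)) < 1e-12
```

Equivalently, the chain rule H(Y|X) = H(X,Y) − H(X) gives the same value, which is why the mutual-information form of the mutual information cancels out in the check above.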
This article is issued from Wiktionary. The text is licensed under Creative Commons Attribution-ShareAlike. Additional terms may apply for the media files.