mutual information

English

Noun

mutual information (usually uncountable, plural mutual informations)

  1. (information theory) A measure of the entropic (informational) correlation between two random variables.
    Mutual information between two random variables $X$ and $Y$ is what is left over when their mutual conditional entropies $H(X \mid Y)$ and $H(Y \mid X)$ are subtracted from their joint entropy $H(X, Y)$. It can be given by the formula $I(X; Y) = H(X, Y) - H(X \mid Y) - H(Y \mid X)$.
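
    A minimal sketch of this identity in Python, using a small hypothetical joint probability table (the values of p_xy are illustrative, not from the entry):

        import numpy as np

        # Hypothetical joint probability table for X (rows) and Y (columns).
        p_xy = np.array([[0.25, 0.25],
                         [0.10, 0.40]])

        p_x = p_xy.sum(axis=1)   # marginal distribution of X
        p_y = p_xy.sum(axis=0)   # marginal distribution of Y

        def entropy(p):
            """Shannon entropy in bits, ignoring zero-probability cells."""
            p = p[p > 0]
            return -np.sum(p * np.log2(p))

        h_xy = entropy(p_xy.ravel())       # joint entropy H(X,Y)
        h_x_given_y = h_xy - entropy(p_y)  # conditional entropy H(X|Y) = H(X,Y) - H(Y)
        h_y_given_x = h_xy - entropy(p_x)  # conditional entropy H(Y|X) = H(X,Y) - H(X)

        # Mutual information: what remains of the joint entropy after
        # both conditional entropies are subtracted.
        mi = h_xy - h_x_given_y - h_y_given_x
        print(round(mi, 4))

    For this table the result agrees with the equivalent form $I(X; Y) = H(X) + H(Y) - H(X, Y)$, which can serve as a quick consistency check.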
