Conditional mutual information

Venn diagram of information theoretic measures for three variables X, Y, and Z, represented by the lower left, lower right, and upper circles, respectively. The conditional mutual informations I(X;Z|Y), I(Y;Z|X), and I(X;Y|Z) are represented by the yellow, cyan, and magenta regions, respectively.

In probability theory, particularly information theory, the conditional mutual information[1][2] is, in its most basic form, the expected value of the mutual information of two random variables given the value of a third.
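For jointly discrete random variables X, Y, and Z, this expectation can be written out explicitly (a sketch of the standard discrete form, with p denoting the relevant probability mass functions):

    I(X;Y \mid Z)
      = \sum_{z} p_Z(z) \sum_{y} \sum_{x}
        p_{X,Y\mid Z}(x,y \mid z)\,
        \log \frac{p_{X,Y\mid Z}(x,y \mid z)}
                  {p_{X\mid Z}(x \mid z)\, p_{Y\mid Z}(y \mid z)}.

Equivalently, in terms of joint entropies, I(X;Y|Z) = H(X,Z) + H(Y,Z) − H(X,Y,Z) − H(Z), which corresponds to the overlap regions pictured in the Venn diagram above; the quantity is zero exactly when X and Y are conditionally independent given Z.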

  1. ^ Wyner, A. D. (1978). "A definition of conditional mutual information for arbitrary ensembles". Information and Control. 38 (1): 51–59. doi:10.1016/s0019-9958(78)90026-8.
  2. ^ Dobrushin, R. L. (1959). "General formulation of Shannon's main theorem in information theory". Uspekhi Mat. Nauk. 14: 3–104.
