Directed information

Directed information is an information theory measure that quantifies the information flow from the random string $X^n = (X_1, X_2, \dots, X_n)$ to the random string $Y^n = (Y_1, Y_2, \dots, Y_n)$. The term directed information was coined by James Massey and is defined as[1]

$$I(X^n \to Y^n) \triangleq \sum_{i=1}^{n} I(X^i; Y_i \mid Y^{i-1}),$$

where $I(X^i; Y_i \mid Y^{i-1})$ is the conditional mutual information $I(X_1, X_2, \dots, X_i; Y_i \mid Y_1, Y_2, \dots, Y_{i-1})$.
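To make the definition concrete, the following is a minimal Python sketch that evaluates the sum of conditional mutual informations directly by enumerating a small joint distribution. The function name `directed_information` and the toy joint law (in which $Y$ is a unit-delay noiseless copy of $X$) are illustrative assumptions, not taken from the source.

```python
import math
from collections import defaultdict
from itertools import product


def directed_information(joint):
    """Return I(X^n -> Y^n) in bits, computed from the definition
    I(X^n -> Y^n) = sum_{i=1}^n I(X^i; Y_i | Y^{i-1}).

    `joint` maps pairs of equal-length tuples (x_seq, y_seq) to probabilities.
    """
    n = len(next(iter(joint))[0])
    total = 0.0
    for i in range(1, n + 1):
        # Marginals needed for the i-th conditional mutual information.
        p_abc = defaultdict(float)  # over (X^i, Y_i, Y^{i-1})
        p_ac = defaultdict(float)   # over (X^i, Y^{i-1})
        p_bc = defaultdict(float)   # over (Y_i, Y^{i-1})
        p_c = defaultdict(float)    # over Y^{i-1}
        for (xs, ys), p in joint.items():
            a, b, c = xs[:i], ys[i - 1], ys[:i - 1]
            p_abc[(a, b, c)] += p
            p_ac[(a, c)] += p
            p_bc[(b, c)] += p
            p_c[c] += p
        # I(X^i; Y_i | Y^{i-1}) = sum p(a,b,c) log2[ p(a,b,c) p(c) / (p(a,c) p(b,c)) ]
        for (a, b, c), p in p_abc.items():
            if p > 0:
                total += p * math.log2(p * p_c[c] / (p_ac[(a, c)] * p_bc[(b, c)]))
    return total


# Toy joint law with n = 2: X_1, X_2, Y_1 are i.i.d. fair bits and Y_2 = X_1,
# i.e. Y is a unit-delay noiseless copy of X.
joint = defaultdict(float)
for x1, x2, y1 in product([0, 1], repeat=3):
    joint[((x1, x2), (y1, x1))] += 1 / 8

print(round(directed_information(joint), 3))  # 1.0 bit flows from X^2 to Y^2
reverse = {(ys, xs): p for (xs, ys), p in joint.items()}
print(round(directed_information(reverse), 3))  # 0.0 bits flow back
```

In this toy example the forward quantity $I(X^2 \to Y^2)$ equals 1 bit while the reverse quantity $I(Y^2 \to X^2)$ equals 0, illustrating that, unlike mutual information, directed information is asymmetric and captures the causal direction of the dependence.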

Directed information has applications to problems where causality plays an important role, such as the capacity of channels with feedback,[1][2][3][4] the capacity of discrete memoryless networks,[5] the capacity of networks with in-block memory,[6] gambling with causal side information,[7] compression with causal side information,[8] real-time control communication settings,[9][10] and statistical physics.[11]

  1. ^ a b Massey, James (1990). "Causality, Feedback And Directed Information". Proceedings 1990 International Symposium on Information Theory and its Applications, Waikiki, Hawaii, Nov. 27-30, 1990.
  2. ^ Kramer, Gerhard (1998). Directed information for channels with feedback (Doctoral). ETH Zurich. doi:10.3929/ethz-a-001988524. hdl:20.500.11850/143796.
  3. ^ Tatikonda, Sekhar Chandra (2000). Control under communication constraints (Doctoral). Massachusetts Institute of Technology. hdl:1721.1/16755.
  4. ^ Permuter, Haim Henry; Weissman, Tsachy; Goldsmith, Andrea J. (February 2009). "Finite State Channels With Time-Invariant Deterministic Feedback". IEEE Transactions on Information Theory. 55 (2): 644–662. arXiv:cs/0608070. doi:10.1109/TIT.2008.2009849. S2CID 13178.
  5. ^ Kramer, G. (January 2003). "Capacity results for the discrete memoryless network". IEEE Transactions on Information Theory. 49 (1): 4–21. doi:10.1109/TIT.2002.806135.
  6. ^ Kramer, Gerhard (April 2014). "Information Networks With In-Block Memory". IEEE Transactions on Information Theory. 60 (4): 2105–2120. arXiv:1206.5389. doi:10.1109/TIT.2014.2303120. S2CID 16382644.
  7. ^ Permuter, Haim H.; Kim, Young-Han; Weissman, Tsachy (June 2011). "Interpretations of Directed Information in Portfolio Theory, Data Compression, and Hypothesis Testing". IEEE Transactions on Information Theory. 57 (6): 3248–3259. arXiv:0912.4872. doi:10.1109/TIT.2011.2136270. S2CID 11722596.
  8. ^ Simeone, Osvaldo; Permuter, Haim Henri (June 2013). "Source Coding When the Side Information May Be Delayed". IEEE Transactions on Information Theory. 59 (6): 3607–3618. arXiv:1109.1293. doi:10.1109/TIT.2013.2248192. S2CID 3211485.
  9. ^ Charalambous, Charalambos D.; Stavrou, Photios A. (August 2016). "Directed Information on Abstract Spaces: Properties and Variational Equalities". IEEE Transactions on Information Theory. 62 (11): 6019–6052. arXiv:1302.3971. doi:10.1109/TIT.2016.2604846. S2CID 8107565.
  10. ^ Tanaka, Takashi; Esfahani, Peyman Mohajerin; Mitter, Sanjoy K. (January 2018). "LQG Control With Minimum Directed Information: Semidefinite Programming Approach". IEEE Transactions on Automatic Control. 63 (1): 37–52. arXiv:1510.04214. doi:10.1109/TAC.2017.2709618. S2CID 1401958.
  11. ^ Vinkler, Dror A; Permuter, Haim H; Merhav, Neri (20 April 2016). "Analogy between gambling and measurement-based work extraction". Journal of Statistical Mechanics: Theory and Experiment. 2016 (4): 043403. arXiv:1404.6788. Bibcode:2016JSMTE..04.3403V. doi:10.1088/1742-5468/2016/04/043403. S2CID 124719237.
