Directed information, denoted I(X^n \to Y^n), is an information-theoretic measure that quantifies the information flow from the random process X^n = (X_1, X_2, \dots, X_n) to the random process Y^n = (Y_1, Y_2, \dots, Y_n). The term directed information was coined by James Massey, who defined it as

I(X^n \to Y^n) = \sum_{i=1}^{n} I(X^i; Y_i \mid Y^{i-1}),

where I(X^i; Y_i \mid Y^{i-1}) is the conditional mutual information between X^i and Y_i given the past outputs Y^{i-1}.
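For small discrete processes, Massey's sum of conditional mutual informations can be evaluated directly by brute-force marginalization of the joint pmf. The following is a minimal Python sketch; the function name `directed_information` and the dict-based pmf representation are illustrative choices, not a standard API.

```python
import math
from collections import defaultdict

def directed_information(pmf, n):
    """Compute I(X^n -> Y^n) = sum_{i=1}^n I(X^i; Y_i | Y^{i-1}) in bits.

    pmf maps pairs (x_tuple, y_tuple), each of length n, to probabilities.
    Brute-force marginalization: only practical for small n and alphabets.
    """
    def marginal(keyfunc):
        m = defaultdict(float)
        for (x, y), p in pmf.items():
            m[keyfunc(x, y)] += p
        return m

    total = 0.0
    for i in range(1, n + 1):
        p_xy   = marginal(lambda x, y: (x[:i], y[:i]))      # p(x^i, y^i)
        p_xym1 = marginal(lambda x, y: (x[:i], y[:i - 1]))  # p(x^i, y^{i-1})
        p_y    = marginal(lambda x, y: y[:i])               # p(y^i)
        p_ym1  = marginal(lambda x, y: y[:i - 1])           # p(y^{i-1})
        for (xi, yi), p in p_xy.items():
            if p <= 0.0:
                continue
            # Ratio p(y_i | x^i, y^{i-1}) / p(y_i | y^{i-1})
            num = p / p_xym1[(xi, yi[:-1])]
            den = p_y[yi] / p_ym1[yi[:-1]]
            total += p * math.log2(num / den)
    return total

# Sanity check: a noiseless binary channel Y_i = X_i with X_1, X_2
# i.i.d. uniform delivers exactly one bit per use, so I(X^2 -> Y^2) = 2.
pmf = {((a, b), (a, b)): 0.25 for a in (0, 1) for b in (0, 1)}
print(directed_information(pmf, 2))  # → 2.0
```

Unlike mutual information, the sum conditions only on past outputs Y^{i-1} while allowing the full input history X^i, which is what makes the measure directional: feeding independent processes into the function yields 0, while reversing the roles of X and Y generally gives a different value.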
Directed information has many applications in problems where causality plays an important role, such as the capacity of channels with feedback, the capacity of discrete memoryless networks with feedback, gambling with causal side information, compression with causal side information, real-time control over communication channels, and statistical physics.