- Measuring Information: Bits
- Adding Up Bits
- Using Bits to Count and Label
- Physical Forms of Information
- Entropy
- Information and Probability
- Fundamental Formula of Information
- Computation and Logic: Information Processing
- Mutual Information
- Shannon's Coding Theorem
- The Manifold Things Information Measures
- Homework
- Homework Solutions
6.1 Mutual Information » A Note on Information Notation
It is common to find the “marginal information” I(X) and the “joint information” I(XY) referred to as the “marginal entropy” and “joint entropy,” respectively, and written H(X) and H(X,Y).
Mutual information between X and Y is usually indicated as I(X:Y) or I(X;Y), and sometimes MI(X:Y) or MI(X;Y).
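To make the relationship among these quantities concrete, here is a minimal Python sketch (not part of the original notes) that computes the marginal entropies H(X) and H(Y), the joint entropy H(X,Y), and the mutual information I(X;Y) for a small hypothetical joint distribution, using the standard identity I(X;Y) = H(X) + H(Y) - H(X,Y).

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array (zero entries ignored)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint distribution p(x, y) over two binary variables.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

H_X  = entropy(p_xy.sum(axis=1))   # marginal entropy H(X)
H_Y  = entropy(p_xy.sum(axis=0))   # marginal entropy H(Y)
H_XY = entropy(p_xy.ravel())       # joint entropy H(X,Y)

I_XY = H_X + H_Y - H_XY            # mutual information I(X;Y)
print(f"H(X) = {H_X:.3f} bits, H(X,Y) = {H_XY:.3f} bits, I(X;Y) = {I_XY:.3f} bits")
```

For this distribution the marginals are uniform, so H(X) = H(Y) = 1 bit, while the correlation between X and Y makes H(X,Y) less than 2 bits, leaving I(X;Y) ≈ 0.28 bits; whether one writes I(X;Y), I(X:Y), or MI(X;Y), the quantity computed is the same.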