6.1 Mutual Information » A Note on Information Notation
It is common to find the “marginal information” I(X) and the “joint information” I(XY) referred to as the “marginal entropy” and the “joint entropy”, respectively, and denoted by H(X) and H(X,Y).
The mutual information between X and Y is usually written I(X:Y) or I(X;Y), and sometimes MI(X:Y) or MI(X;Y).
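Whichever symbols are used, the quantities are tied together by the standard identity

    I(X;Y) = H(X) + H(Y) - H(X,Y),

which, in the information notation above, reads I(X:Y) = I(X) + I(Y) - I(XY). The marginal and joint terms therefore determine the mutual information regardless of the notation chosen.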