Mutual information I(X;Y) measures how much knowing one variable reduces uncertainty about the other: I(X;Y) = H(X) − H(X|Y) = H(Y) − H(Y|X).
When X and Y are independent, I(X;Y) = 0. In general I(X;Y) ≤ min(H(X), H(Y)), with equality when one variable is a deterministic function of the other; for example, if Y = X then I(X;Y) = H(X).
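As a quick check of both limiting cases, here is a minimal sketch (the function name `mutual_information` and the two example joint tables are illustrative, not from the source) that computes I(X;Y) in bits directly from a joint probability table, using the identity I(X;Y) = Σ p(x,y) log₂[p(x,y) / (p(x)p(y))]:

```python
import numpy as np

def mutual_information(joint):
    """I(X;Y) in bits, given a 2-D joint probability table p(x, y)."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal p(x), shape (nx, 1)
    py = joint.sum(axis=0, keepdims=True)   # marginal p(y), shape (1, ny)
    indep = px @ py                          # p(x)p(y) under independence
    mask = joint > 0                         # convention: 0 * log 0 = 0
    return float((joint[mask] * np.log2(joint[mask] / indep[mask])).sum())

# Two independent fair bits: p(x,y) = p(x)p(y), so I(X;Y) = 0.
independent = np.array([[0.25, 0.25],
                        [0.25, 0.25]])

# Y is a copy of X (deterministic): I(X;Y) = H(X) = 1 bit.
copied = np.array([[0.5, 0.0],
                   [0.0, 0.5]])

print(mutual_information(independent))  # -> 0.0
print(mutual_information(copied))       # -> 1.0
```

The mask skips zero-probability cells, which by convention contribute nothing to the sum; without it, `log2(0)` would produce warnings and NaNs.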