Shannon Entropy

About Entropy

Shannon entropy measures the average information content or uncertainty in a probability distribution.

H(X) = -Σ p(x) log₂ p(x)

Maximum entropy occurs when all outcomes are equally likely (the uniform distribution); for n outcomes this maximum is log₂ n bits. Minimum entropy (0 bits) occurs when one outcome is certain.
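The definition above can be sketched in a few lines of Python. This is a minimal illustration, not part of the original demo; the function name `shannon_entropy` is chosen here for clarity. Terms with p = 0 are skipped, since p log₂ p → 0 as p → 0.

```python
import math

def shannon_entropy(probs):
    # H(X) = -Σ p(x) log₂ p(x), in bits.
    # Zero-probability terms contribute nothing and are skipped
    # to avoid log2(0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over 4 outcomes: maximum entropy, log2(4) = 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# One outcome certain: minimum entropy, 0 bits.
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))
```

A fair coin ([0.5, 0.5]) gives exactly 1 bit, which is why the bit is the natural unit when the logarithm base is 2.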