Shannon Entropy

The maximum entropy for a system with 4 states is 2.00 bits, reached when all four states are equally likely (log₂ 4 = 2).

Information Theory

Shannon entropy measures the average information content (uncertainty) in a system.

H = -Σ p(x) log₂ p(x)

High entropy: random, unpredictable outcomes; each observation carries a lot of information.

Low entropy: ordered, predictable outcomes; each observation carries little information.
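The formula above can be sketched directly in code. This is a minimal illustration, not part of the original page; the function name `shannon_entropy` is chosen here for clarity.

```python
import math

def shannon_entropy(probs):
    """H = -Σ p(x) log₂ p(x), in bits.

    Zero-probability states are skipped, since the limit of
    p·log₂(p) as p → 0 is 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over 4 states: maximum entropy
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0

# A certain outcome is perfectly predictable: 0 bits
print(round(shannon_entropy([1.0, 0.0, 0.0, 0.0]), 10))

# A biased distribution falls between the extremes
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))
```

For 4 equally likely states each term is 0.25 · log₂(0.25) = −0.5, and the four terms sum (after negation) to 2 bits, matching the maximum stated above.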