Information Theory Simulations

Explore the mathematical foundations of data, communication, and compression through interactive visualizations of Shannon entropy, coding theory, and information measures.

01

Shannon Entropy

Calculate information entropy for different probability distributions. Adjust probabilities and see how uncertainty changes.
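The quantity this demo computes is Shannon's H = -Σ p·log₂(p). A minimal sketch of that calculation (the function name is illustrative, not the demo's actual code):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain (1 bit); a biased coin carries less information.
shannon_entropy([0.5, 0.5])   # 1.0 bit
shannon_entropy([0.9, 0.1])   # ~0.469 bits
shannon_entropy([1.0])        # 0.0 — a certain outcome carries no information
```

Skipping zero terms implements the convention 0·log(0) = 0, which is what makes deterministic distributions come out at exactly zero entropy.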

02

Huffman Coding

Build optimal prefix-free code trees. Watch the algorithm construct variable-length codes based on symbol frequencies.
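The construction the visualization animates can be sketched with a min-heap: repeatedly merge the two lowest-frequency nodes, prepending a bit to every code in each merged subtree. This is one common way to write it, not necessarily how the demo is implemented:

```python
import heapq
from collections import Counter

def huffman_codes(freqs):
    """Build prefix-free codes from {symbol: frequency} by greedy pairwise merging."""
    # Each heap entry: [weight, tiebreak, [symbol, code], [symbol, code], ...]
    heap = [[w, i, [s, ""]] for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                     # degenerate single-symbol alphabet
        return {heap[0][2][0]: "0"}
    tiebreak = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)           # two least frequent subtrees...
        hi = heapq.heappop(heap)
        for pair in lo[2:]:
            pair[1] = "0" + pair[1]        # ...get a 0 or 1 prepended to every code
        for pair in hi[2:]:
            pair[1] = "1" + pair[1]
        heapq.heappush(heap, [lo[0] + hi[0], tiebreak] + lo[2:] + hi[2:])
        tiebreak += 1
    return dict(heap[0][2:])

codes = huffman_codes(Counter("abracadabra"))
```

Because every merge pushes existing codes one level deeper, frequent symbols end up with short codes, and no code is a prefix of another.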

03

LZ77 Compression

Visualize sliding window compression with dictionary matching. See how repeated patterns reduce data size.
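The sliding-window matching being visualized can be sketched as a compressor that emits (offset, length, next_char) triples; this is a simplified, unoptimized illustration, not the demo's implementation:

```python
def lz77_compress(data, window=64):
    """Emit (offset, length, next_char) triples by matching against a sliding window."""
    i, out = 0, []
    while i < len(data):
        best_len, best_off = 0, 0
        for j in range(max(0, i - window), i):
            k = 0
            # Matches may run past position i, which encodes overlapping repeats.
            while i + k < len(data) - 1 and data[j + k] == data[i + k]:
                k += 1
            if k > best_len:
                best_len, best_off = k, i - j
        out.append((best_off, best_len, data[i + best_len]))
        i += best_len + 1
    return out

def lz77_decompress(triples):
    """Rebuild the string by copying `length` chars from `offset` back, then the literal."""
    buf = []
    for off, length, ch in triples:
        for _ in range(length):
            buf.append(buf[-off])
        buf.append(ch)
    return "".join(buf)
```

A string like "abracadabra abracadabra" compresses well because the second half is one long back-reference into the window.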

04

Hamming Codes

Detect and correct bit errors using parity checks. Flip bits and watch the decoder recover the original message.
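The classic instance of this idea is Hamming(7,4): 4 data bits protected by 3 parity bits, where the recomputed parity pattern (the syndrome) directly indexes any single flipped bit. A sketch of encode/decode, assuming the standard bit layout with parity at positions 1, 2, and 4:

```python
def hamming74_encode(d):
    """Encode 4 data bits as 7 bits; parity bits sit at positions 1, 2, 4."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # covers positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4          # covers positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4          # covers positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Recompute parities; their pattern is the 1-based position of a flipped bit."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c[syndrome - 1] ^= 1   # correct the single-bit error in place
    return [c[2], c[4], c[5], c[6]]
```

Flipping any one of the seven transmitted bits leaves the decoder able to recover the original four data bits, which is exactly the experiment the demo invites.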

05

Channel Capacity

Explore Shannon's theorem for noisy channels. Adjust noise levels and see the maximum reliable transmission rate.
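For the simplest noisy channel, the binary symmetric channel (an assumption here; the demo may model others), Shannon's result takes the closed form C = 1 - H(p), where p is the crossover probability:

```python
import math

def bsc_capacity(p):
    """Capacity (bits per channel use) of a binary symmetric channel: C = 1 - H(p)."""
    if p in (0.0, 1.0):
        return 1.0                     # a noiseless (or perfectly inverting) channel
    h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return 1 - h
```

At p = 0.5 the output is independent of the input and the capacity drops to zero, which is the behavior to look for when sliding the noise level to its midpoint.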

06

Mutual Information

Explore entropy relationships through Venn diagrams. Visualize how information is shared between random variables.
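The overlap region in those diagrams is I(X;Y) = H(X) + H(Y) - H(X,Y). A sketch of that identity computed from a joint probability table (illustrative names, not the demo's code):

```python
import math

def entropy(ps):
    """Shannon entropy in bits of a probability list."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), with `joint` as a table of p(x, y) rows."""
    px = [sum(row) for row in joint]            # marginal over x
    py = [sum(col) for col in zip(*joint)]      # marginal over y
    hxy = entropy([p for row in joint for p in row])
    return entropy(px) + entropy(py) - hxy

independent = [[0.25, 0.25], [0.25, 0.25]]      # circles barely overlap: I = 0
identical   = [[0.5, 0.0], [0.0, 0.5]]          # circles coincide: I = 1 bit
```

Independence makes the overlap vanish; a deterministic relationship makes one variable's circle sit entirely inside the shared region.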

07

Kolmogorov Complexity

Compare compressibility of patterns vs random strings. Explore the shortest description length concept.

08

Source Coding

Compare fixed-length vs variable-length coding efficiency. See when compression saves space.
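The comparison the demo makes can be sketched numerically: fixed-length coding costs ⌈log₂ n⌉ bits per symbol regardless of frequencies, while a variable-length code is judged by its expected length Σ pᵢ·lᵢ (function names are illustrative):

```python
import math

def fixed_length_bits(num_symbols):
    """Fixed-length coding needs ceil(log2 n) bits per symbol, ignoring frequencies."""
    return math.ceil(math.log2(num_symbols))

def expected_length(probs, code_lengths):
    """Average bits per symbol of a variable-length code: sum of p_i * l_i."""
    return sum(p * l for p, l in zip(probs, code_lengths))

# Skewed four-symbol source with the prefix code {0, 10, 110, 111}:
probs = [0.5, 0.25, 0.125, 0.125]
expected_length(probs, [1, 2, 3, 3])   # 1.75 bits/symbol
fixed_length_bits(4)                   # 2 bits/symbol
```

For this distribution the expected length 1.75 equals the source entropy exactly (each probability is a power of two), so the variable-length code saves space whenever symbols are not equally likely, and saves nothing when they are.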

09

Cryptographic Entropy

Analyze random number quality. Compare pseudo-random and true random sources for security applications.
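One simple quality check in this spirit is estimating bits of entropy per byte from observed byte frequencies, then comparing a cryptographic source against a patterned one. A sketch (the "weak" source below is a deliberately contrived stand-in, not a real PRNG):

```python
import math
import secrets
from collections import Counter

def empirical_entropy(data: bytes) -> float:
    """Estimate bits of entropy per byte from observed byte frequencies."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

strong = secrets.token_bytes(100_000)          # OS CSPRNG: ~8 bits/byte
weak = bytes(i % 16 for i in range(100_000))   # patterned: only 4 bits/byte
```

Note the caveat the demo's distinction implies: high empirical entropy is necessary but not sufficient — a pseudo-random stream from a leaked seed passes this test yet is worthless for keys, which is why security applications care about the source, not just the statistics.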

10

Information Bottleneck

Compress data while preserving relevant information. Balance compression ratio with prediction accuracy.
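The trade-off can be illustrated in miniature with deterministic clusterings: compress X into a smaller variable T and measure how much of I(X;Y) survives as I(T;Y). This is a toy illustration of the objective, not the iterative information-bottleneck algorithm the demo presumably runs:

```python
import math

def entropy(ps):
    return -sum(p * math.log2(p) for p in ps if p > 0)

def mi(joint):
    """I between row and column variables of a joint probability table."""
    px = [sum(r) for r in joint]
    py = [sum(c) for c in zip(*joint)]
    return entropy(px) + entropy(py) - entropy([p for r in joint for p in r])

def cluster_joint(joint_xy, assign):
    """Collapse X-rows into clusters T per `assign`, giving the table p(t, y)."""
    out = [[0.0] * len(joint_xy[0]) for _ in range(max(assign) + 1)]
    for x, t in enumerate(assign):
        for y, p in enumerate(joint_xy[x]):
            out[t][y] += p
    return out

# Toy source: X values 0,1 predict Y=0; X values 2,3 predict Y=1.
joint = [[0.2, 0.05], [0.2, 0.05], [0.05, 0.2], [0.05, 0.2]]
good = cluster_joint(joint, [0, 0, 1, 1])  # merge rows with the same predictive role
bad  = cluster_joint(joint, [0, 1, 0, 1])  # merge rows that disagree about Y
# mi(good) keeps all of mi(joint); mi(bad) compresses just as hard but keeps none.
```

Both clusterings halve the representation (2 clusters for 4 values), yet only the one that respects the X→Y structure preserves predictive information — the balance the bottleneck objective formalizes.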