📊 Information Theory Visualizer

Explore Shannon's entropy, channel capacity, and the mathematics that powers all digital communication

🎯 Choose Your Learning Mode

🤔 What Is Information Theory?

Information theory, founded by Claude Shannon in 1948, is the mathematical study of quantifying, storing, and communicating information. It defines entropy as the measure of uncertainty in a message — the more surprising an outcome, the more information it carries. Shannon proved that every communication channel has a maximum capacity, and that reliable transmission is possible at any rate below this limit using clever encoding.
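For a concrete sense of that "surprise": an outcome with probability p carries -log₂(p) bits of information, so a fair coin flip conveys 1 bit, while a 1-in-1,024 long shot conveys log₂(1024) = 10 bits.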

Why does this matter? Every text message, phone call, streaming video, and Wi-Fi connection relies on Shannon's theorems. Compression algorithms (ZIP, MP3, JPEG) exploit entropy to shrink data. Error-correcting codes (used in satellites, QR codes, 5G) add strategic redundancy so data survives noise. Information theory even connects to thermodynamics, machine learning, and the fundamental limits of computation.
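To make the "strategic redundancy" idea concrete, here is a minimal Hamming(7,4) sketch, written in TypeScript purely for illustration; the function names are invented here and are not taken from the visualizer's own code:

```typescript
// Hamming(7,4): 4 data bits + 3 parity bits, corrects any single-bit error.
// Codeword layout (1-indexed positions): [p1, p2, d1, p3, d2, d3, d4]
function hammingEncode([d1, d2, d3, d4]: number[]): number[] {
  const p1 = d1 ^ d2 ^ d4;
  const p2 = d1 ^ d3 ^ d4;
  const p3 = d2 ^ d3 ^ d4;
  return [p1, p2, d1, p3, d2, d3, d4];
}

// Recompute the three parity checks; a non-zero syndrome is the 1-indexed
// position of the flipped bit, so a single error can be corrected in place.
function hammingDecode(received: number[]): number[] {
  const r = [...received];
  const s1 = r[0] ^ r[2] ^ r[4] ^ r[6]; // covers positions 1, 3, 5, 7
  const s2 = r[1] ^ r[2] ^ r[5] ^ r[6]; // covers positions 2, 3, 6, 7
  const s3 = r[3] ^ r[4] ^ r[5] ^ r[6]; // covers positions 4, 5, 6, 7
  const syndrome = s1 + 2 * s2 + 4 * s3;
  if (syndrome !== 0) r[syndrome - 1] ^= 1;
  return [r[2], r[4], r[5], r[6]]; // recover d1..d4
}

// Channel noise flips one bit; the decoder repairs it.
const codeword = hammingEncode([1, 0, 1, 1]);
codeword[4] ^= 1;                      // noise hits position 5
console.log(hammingDecode(codeword));  // [1, 0, 1, 1]
```

Three parity bits guard four data bits, so any single flipped bit can be located by the syndrome and repaired; Reed-Solomon codes extend the same idea to bursts of symbol errors, which is why they appear in QR codes and satellite links.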

📏 Entropy: H = -Σ p·log₂(p) — measures uncertainty and information content per symbol (see the code sketch below)
📡 Channel Capacity: C = B·log₂(1 + SNR) — the maximum reliable data rate through a noisy channel
📦 Compression: remove redundancy to approach the entropy limit — lossless vs lossy encoding
🛡️ Error Correction: Hamming and Reed-Solomon codes add redundancy to survive channel noise
🔊 Noise & BER: channel noise corrupts bits — Bit Error Rate measures transmission quality
🔗 Mutual Information: I(X;Y) measures the shared information between sender and receiver
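The first two formulas above are simple enough to compute directly. A minimal sketch, assuming a TypeScript setting with illustrative function names (not the visualizer's actual API):

```typescript
// Shannon entropy of a discrete distribution, in bits per symbol:
// H = -Σ p(x) · log₂ p(x)
function entropyBits(probabilities: number[]): number {
  return -probabilities
    .filter((p) => p > 0) // log₂(0) is undefined; zero-probability symbols contribute nothing
    .reduce((sum, p) => sum + p * Math.log2(p), 0);
}

// Shannon-Hartley channel capacity in bits per second:
// C = B · log₂(1 + SNR), with bandwidth B in Hz and SNR as a linear power ratio
function channelCapacityBps(bandwidthHz: number, snrLinear: number): number {
  return bandwidthHz * Math.log2(1 + snrLinear);
}

// Four equally likely symbols carry 2 bits each; a skewed source carries less.
console.log(entropyBits([0.25, 0.25, 0.25, 0.25])); // 2.0
console.log(entropyBits([0.7, 0.1, 0.1, 0.1]));     // ≈ 1.36
// A 3 kHz line at 30 dB SNR (a 1000:1 power ratio) tops out near 30 kbps.
console.log(channelCapacityBps(3000, 1000));        // ≈ 29,902
```

Note that SNR enters the Shannon-Hartley formula as a linear power ratio; a value quoted in dB has to be converted first (SNR_linear = 10^(dB/10)).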

🚀 Quick Start

⚙️ Source & Channel

📋 Event Log

Channel idle. Press Start to begin the information pipeline...
Entropy: 0.00 bits
Channel Capacity: 0 bps
Bit Error Rate: 0.000
Compression Ratio: 1.00:1
Redundancy: 0.0%
Mutual Information: 0.00 bits
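The dashboard readouts above can be reproduced with a few one-liners. A rough sketch in the same illustrative TypeScript style, assuming a binary symmetric channel with uniform input for the mutual-information figure (an assumption for the example; the visualizer's actual channel model isn't shown here):

```typescript
// Binary entropy: H_b(p) = -p·log₂(p) - (1-p)·log₂(1-p)
function binaryEntropy(p: number): number {
  if (p <= 0 || p >= 1) return 0;
  return -p * Math.log2(p) - (1 - p) * Math.log2(1 - p);
}

// Redundancy: how far the source entropy falls short of log₂(alphabet size), as a percentage.
function redundancyPercent(sourceEntropyBits: number, alphabetSize: number): number {
  return (1 - sourceEntropyBits / Math.log2(alphabetSize)) * 100;
}

// Compression ratio reported as original:compressed (e.g. 2.5 means 2.5:1).
function compressionRatio(originalBits: number, compressedBits: number): number {
  return originalBits / compressedBits;
}

// Mutual information of a binary symmetric channel with uniform input,
// treating the bit error rate as the crossover probability:
// I(X;Y) = 1 - H_b(BER) bits per channel use.
function mutualInformationBSC(bitErrorRate: number): number {
  return 1 - binaryEntropy(bitErrorRate);
}

// With the idle values above (BER = 0, ratio 1.00:1) every metric is trivial;
// at a 10% bit error rate only about 0.53 bits of each transmitted bit survive.
console.log(mutualInformationBSC(0.1));    // ≈ 0.531
console.log(redundancyPercent(1, 4));      // 1 bit of entropy over 4 symbols → 50% redundant
console.log(compressionRatio(8000, 3200)); // 2.5 (i.e. 2.5:1)
```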