
Information Theory

The mathematics of information — how to measure, compress, transmit, and protect data against noise and errors.

information theory · Shannon · entropy · compression · error correction · channel capacity

Information theory was created by Claude Shannon in his landmark 1948 paper 'A Mathematical Theory of Communication.' In one stroke, Shannon defined information mathematically (as entropy), proved that data compression has a fundamental limit, and showed that reliable communication over noisy channels is possible — up to a maximum rate called the channel capacity.

Shannon's theory is the foundation of the entire digital age. Data compression (ZIP, MP3, JPEG), error-correcting codes (in every phone call, hard drive, and satellite link), cryptography, and even machine learning all rest on information-theoretic principles. The bit — Shannon's basic unit — became the atom of the digital world.

These simulations let you explore Shannon's key results: measure the entropy of messages, see how compression approaches the theoretical limit, watch error-correcting codes recover data from noise, and compute the capacity of noisy channels.

4 interactive simulations

simulation

Channel Capacity Explorer

Visualize Shannon's channel capacity theorem — the ultimate speed limit of communication. Explore how bandwidth, signal-to-noise ratio, and modulation scheme determine the maximum data rate, and see constellation diagrams blur as noise increases.
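The speed limit this simulation explores is the Shannon-Hartley theorem, C = B log₂(1 + SNR). A minimal sketch in Python (the telephone-channel figures are illustrative, not taken from the simulation):

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley theorem: maximum error-free data rate in bits/s."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 3.1 kHz telephone channel at 30 dB SNR (linear SNR = 10**(30/10) = 1000):
capacity = shannon_capacity(3100, 10 ** (30 / 10))
print(round(capacity))  # 30898 bits/s — roughly the old dial-up modem ceiling
```

Note that capacity grows only logarithmically with signal power but linearly with bandwidth, which is why widening the channel beats shouting louder.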

simulation

Data Compression Simulator

Visualize how Huffman coding compresses data by exploiting symbol frequency imbalance. Compare original size, compressed size, and the theoretical Shannon limit across different source types — from English text to DNA sequences.
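The idea behind the simulator — frequent symbols get short codewords, rare ones get long codewords — can be sketched with a compact Huffman coder (a simplified illustration, not the simulator's implementation):

```python
import heapq
from collections import Counter

def huffman_code(text: str) -> dict[str, str]:
    """Build a Huffman code: frequent symbols receive shorter codewords."""
    freq = Counter(text)
    # Heap entries: (frequency, tiebreak counter, {symbol: codeword-so-far}).
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        # Merge the two least-frequent subtrees, prefixing their codewords.
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (f1 + f2, i, merged))
        i += 1
    return heap[0][2]

code = huffman_code("abracadabra")
compressed_bits = sum(len(code[c]) for c in "abracadabra")
print(compressed_bits)  # 23 bits, versus 88 bits (11 chars × 8) uncompressed
```

The skewed frequencies of "abracadabra" (five a's, one c, one d) are exactly the imbalance Huffman coding exploits; a uniformly random string would compress far less.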

simulation

Error Correction Simulator

Watch error correction codes detect and fix transmission errors in real time. Compare uncoded transmission, triple repetition, and Hamming(7,4) codes to see how redundancy trades bandwidth for reliability across noisy channels.
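The Hamming(7,4) code the simulator demonstrates protects 4 data bits with 3 parity bits, and the parity-check "syndrome" directly names the position of any single flipped bit. A minimal sketch:

```python
def hamming74_encode(d: list[int]) -> list[int]:
    """Encode 4 data bits as 7 bits; any single-bit error is correctable."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # parity over codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # parity over codeword positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # parity over codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c: list[int]) -> list[int]:
    """Recompute parities; the syndrome is the 1-based error position (0 = clean)."""
    c = c[:]
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c[syndrome - 1] ^= 1   # flip the corrupted bit back
    return [c[2], c[4], c[5], c[6]]

codeword = hamming74_encode([1, 0, 1, 1])
codeword[4] ^= 1                   # simulate one bit flipped by channel noise
print(hamming74_decode(codeword))  # [1, 0, 1, 1] — the error is corrected
```

Compare the overhead: triple repetition spends 2 extra bits per data bit, while Hamming(7,4) corrects the same single-bit errors with only 3 extra bits per 4 data bits.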

simulation

Shannon Entropy Calculator

Explore the foundational measure of information: Shannon entropy. Adjust symbol probabilities to see how uncertainty, redundancy, and optimal code lengths change. Discover why English text, with its skewed letter frequencies, carries only about 4.1 bits per character, while uniformly random letters reach the log₂ 26 ≈ 4.7-bit theoretical maximum.
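What the calculator computes is H = −Σ pᵢ log₂ pᵢ, which peaks when all symbols are equally likely. A minimal sketch:

```python
import math

def entropy(probs: list[float]) -> float:
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform 26-letter alphabet: the maximum possible, log2(26) ≈ 4.70 bits.
print(entropy([1 / 26] * 26))
# A biased source is less uncertain than a fair one:
print(entropy([0.5, 0.5]))  # 1.0 bit — a fair coin flip
print(entropy([0.9, 0.1]))  # ≈ 0.469 bits — mostly predictable
```

The entropy is also the lower bound on average codeword length, which is why the compression simulator above can approach it but never beat it.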