What Is Entropy?
Entropy is one of the most profound concepts in physics — and one of the most misunderstood. It is not 'disorder' in the colloquial sense, but rather a precise measure of how many microscopic arrangements (microstates) are compatible with what we observe macroscopically. Ludwig Boltzmann's tombstone bears the formula S = k_B · ln(W), connecting entropy S to the number of microstates W. This single equation bridges the microscopic world of atoms to the macroscopic world of heat engines and irreversibility.
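Boltzmann's formula is straightforward to evaluate numerically. Here is a minimal sketch (the function name and the two-state-particle example are illustrative choices, not part of the simulation), assuming W is a microstate count you can compute:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def boltzmann_entropy(W: int) -> float:
    """Entropy in J/K from a microstate count W, via S = k_B * ln(W)."""
    return K_B * math.log(W)

# Example: 100 two-state particles (left half / right half of a box).
# W counts the arrangements with exactly 50 particles on each side.
W = math.comb(100, 50)
S = boltzmann_entropy(W)
```

Note that a single compatible microstate (W = 1) gives S = 0: entropy measures missing information about the microstate, and with only one possibility there is none. In practice entropy is often quoted in units of k_B, i.e. just ln(W).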
The Second Law
The Second Law of Thermodynamics — the entropy of an isolated system never decreases — is arguably the most universal law in physics. Arthur Eddington wrote: 'If your theory is found to be against the Second Law of Thermodynamics, I can give you no hope; there is nothing for it but to collapse in deepest humiliation.' What makes the law remarkable is that it emerges purely from statistics: there are overwhelmingly more ways for a system to be spread out than concentrated.
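That statistical claim can be made concrete with a quick count. A hypothetical example (50 distinguishable particles in a box split into two halves; the numbers are mine, not the simulation's):

```python
import math

N = 50  # number of distinguishable particles

# Microstates with every particle on the left half: exactly one.
all_left = math.comb(N, 0)

# Microstates with an even 25/25 split between the halves.
even_split = math.comb(N, N // 2)

# The even split outnumbers the all-left arrangement by ~14 orders
# of magnitude, which is why we never see the gas spontaneously
# gather on one side.
ratio = even_split / all_left
```

The disparity grows explosively with N: for macroscopic particle numbers (~10²³), the ratio is so vast that a fluctuation back to the concentrated state is, for all practical purposes, impossible rather than merely unlikely.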
This Simulation
Start all particles on the left half and watch them diffuse. The entropy graph below tracks S = -k_B · N · Σ(p_i · ln(p_i)), where p_i is the fraction of particles in spatial bin i. Notice how entropy rises rapidly at first, then plateaus near S_max = N · k_B · ln(2), the value for a uniform split across the two halves. This plateau is equilibrium — not because particles stop moving, but because the macroscopic distribution stops changing. Try increasing the particle count: the approach to equilibrium becomes smoother, and the relative size of the fluctuations around it shrinks (roughly as 1/√N), illustrating the law of large numbers at work.
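The entropy estimate the graph is based on can be sketched along these lines. This is a toy one-dimensional diffusion, working in units of k_B; the bin count, step size, and clamped walls are my own assumptions, not the simulation's actual parameters:

```python
import math
import random

def spatial_entropy(positions, n_bins=2):
    """Estimate -sum(p_i ln p_i) over spatial bins from particle positions,
    scaled by N so the maximum is N * ln(n_bins) (entropy in units of k_B)."""
    N = len(positions)
    counts = [0] * n_bins
    for x in positions:
        counts[min(int(x * n_bins), n_bins - 1)] += 1
    S = 0.0
    for c in counts:
        if c:
            p = c / N
            S -= p * math.log(p)
    return N * S

random.seed(0)
N = 1000
# Start every particle on the left half of a unit-length box.
positions = [random.uniform(0.0, 0.5) for _ in range(N)]
S_start = spatial_entropy(positions)  # zero: all particles in one bin

# Diffuse: random Gaussian steps, clamped at the walls.
for _ in range(500):
    positions = [min(max(x + random.gauss(0, 0.05), 0.0), 1.0)
                 for x in positions]
S_end = spatial_entropy(positions)  # approaches N * ln(2) once mixed
```

With more bins the estimate resolves finer spatial structure, and with more particles the per-bin fractions fluctuate less, which is why the plotted curve smooths out as N grows.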
Irreversibility and the Arrow of Time
The deep mystery is this: the microscopic laws of physics are time-reversible — every collision can run backwards. Yet macroscopically, we never see gas un-mix or eggs un-scramble. The resolution lies in initial conditions: the universe began in an extraordinarily low-entropy state (the Past Hypothesis), and entropy has been climbing toward its maximum ever since. The arrow of time is, fundamentally, the arrow of entropy.