Entropy Simulator: Watch the Second Law of Thermodynamics in Action


With 100 particles starting on the left side, entropy begins near zero and rises toward the equilibrium value of N·ln(2) ≈ 69.3 as particles diffuse across both halves of the container.

Formula

S = k_B · ln(W)
S = -k_B · Σ(p_i · ln(p_i))
S_max = N · k_B · ln(2)  (here k_B = 1, so S_max = N · ln(2) ≈ 69.3 for N = 100)
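As a quick numerical check (a sketch in Python, with k_B set to 1 as elsewhere on this page), all three expressions agree for the two-half container with N = 100:

```python
import math

N = 100  # particles; k_B = 1 (natural units)

# Boltzmann: at equilibrium each of the W = 2^N left/right assignments
# is equally accessible, so S = ln(W) = N * ln(2)
W = 2 ** N
S_boltzmann = math.log(W)

# Gibbs/Shannon: W equally likely microstates, each with p_i = 1/W;
# the sum -sum(p_i * ln(p_i)) collapses to -W * (1/W) * ln(1/W)
p_i = 1.0 / W
S_gibbs = -W * p_i * math.log(p_i)

S_max = N * math.log(2)
print(round(S_boltzmann, 1), round(S_gibbs, 1), round(S_max, 1))
# all three print 69.3
```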

What Is Entropy?

Entropy is one of the most profound concepts in physics — and one of the most misunderstood. It is not 'disorder' in the colloquial sense, but rather a precise measure of how many microscopic arrangements (microstates) are compatible with what we observe macroscopically. Ludwig Boltzmann's tombstone bears the formula S = k_B · ln(W), connecting entropy S to the number of microstates W. This single equation bridges the microscopic world of atoms to the macroscopic world of heat engines and irreversibility.

The Second Law

The Second Law of Thermodynamics — entropy of an isolated system never decreases — is arguably the most universal law in physics. Arthur Eddington wrote: 'If your theory is found to be against the Second Law of Thermodynamics, I can give you no hope; there is nothing for it but to collapse in deepest humiliation.' What makes the law remarkable is that it emerges purely from statistics: there are simply overwhelmingly more ways for a system to be spread out than concentrated.

This Simulation

Start all particles on the left half and watch them diffuse. The entropy graph below tracks S = -k_B · Σ(p_i · ln(p_i)) estimated from the spatial distribution. Notice how entropy rises rapidly at first, then plateaus near S_max = N · ln(2). This plateau is equilibrium — not because particles stop moving, but because the macroscopic distribution stops changing. Try increasing the particle count: the approach to equilibrium becomes smoother and the fluctuations around it become relatively smaller, illustrating the law of large numbers at work.
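The experiment can be sketched in a few lines of Python (the particle dynamics are replaced by a reflecting random walk on [0, 1]; the function names and parameters are illustrative, not the simulator's actual code):

```python
import math
import random

def entropy_two_halves(xs):
    """Spatial Shannon entropy over the two halves, scaled by N (k_B = 1)."""
    n = len(xs)
    p_left = sum(1 for x in xs if x < 0.5) / n
    s = 0.0
    for p in (p_left, 1.0 - p_left):
        if p > 0.0:
            s -= p * math.log(p)
    return n * s  # plateaus near N * ln(2) ≈ 69.3 for N = 100

def simulate(n_particles=100, steps=5000, step_size=0.02, seed=0):
    random.seed(seed)
    xs = [random.uniform(0.0, 0.5) for _ in range(n_particles)]  # all on the left
    history = []
    for _ in range(steps):
        # diffusion as a Gaussian random walk with reflecting walls at 0 and 1
        xs = [abs(x + random.gauss(0.0, step_size)) for x in xs]
        xs = [2.0 - x if x > 1.0 else x for x in xs]
        history.append(entropy_two_halves(xs))
    return history

S = simulate()
# entropy rises from near 0 and plateaus near N * ln(2) ≈ 69.3
```

Raising `n_particles` shrinks the relative fluctuations of the plateau like 1/√N, which is the law-of-large-numbers effect described above.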

Irreversibility and the Arrow of Time

The deep mystery is this: the microscopic laws of physics are time-reversible, so every collision can run backwards. Yet macroscopically we never see gas un-mix or eggs un-scramble. The resolution lies in the initial conditions: the universe began in an extraordinarily low-entropy state (the Past Hypothesis), and entropy has been climbing toward its maximum ever since. The arrow of time is, fundamentally, the arrow of entropy.

FAQ

What is entropy in thermodynamics?

Entropy (S) is a measure of the number of microscopic configurations (microstates) consistent with a system's macroscopic state. Boltzmann's formula S = k_B · ln(W) connects entropy to the count of microstates W. Higher entropy means more possible arrangements — and nature overwhelmingly favors states with more arrangements.
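Microstate counting can be done exactly for this page's two-half container (a sketch; k_B = 1, and the macrostate is "how many particles are in the left half"):

```python
import math

N = 100
# W[n]: number of microstates with exactly n particles in the left half
W = [math.comb(N, n) for n in range(N + 1)]
S = [math.log(w) for w in W]  # Boltzmann entropy of each macrostate

print(f"{S[0]:.1f}")             # all-left macrostate: ln(1) = 0.0
print(f"{S[N // 2]:.1f}")        # most probable macrostate: ln C(100, 50) ≈ 66.8
print(f"{N * math.log(2):.1f}")  # ln of the total count 2^100 ≈ 69.3
```

The half-and-half macrostate alone already accounts for almost all of the maximum entropy; the small gap between 66.8 and 69.3 comes from the remaining macrostates and becomes negligible per particle as N grows.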

Why does entropy always increase?

The Second Law of Thermodynamics states that the total entropy of an isolated system never decreases. This is not a fundamental force but a statistical near-certainty: there are astronomically more spread-out configurations than concentrated ones, so random motion overwhelmingly carries the system toward them.

Can entropy decrease spontaneously?

In principle, yes, but the probability is vanishingly small for macroscopic systems. At any given instant, the chance that all 100 particles sit in the left half is 2^-100 ≈ 8 × 10^-31, so you would expect to wait on the order of 2^100 ≈ 10^30 rearrangements of the system to see it happen once, a timescale that dwarfs the current age of the universe.
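The arithmetic behind that estimate, as a sketch:

```python
import math

N = 100
# probability that all N particles are in the left half at a given instant
p_all_left = 0.5 ** N
print(f"{p_all_left:.2e}")  # 7.89e-31

# expected number of independent looks before seeing it happen once
waits = 1.0 / p_all_left
print(f"10^{math.log10(waits):.1f}")  # 10^30.1
```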

What is the formula for entropy?

Boltzmann entropy: S = k_B · ln(W). Gibbs/Shannon entropy: S = -k_B · Σ(p_i · ln(p_i)), where p_i is the probability of each microstate. Both formulations agree for equilibrium systems.

Embed

<iframe src="https://homo-deus.com/lab/thermodynamics/entropy-simulator/embed" width="100%" height="400" frameborder="0"></iframe>
View source on GitHub