What Is Entropy?

Entropy measures the number of microscopic arrangements that produce the same macroscopic state. Systems drift toward high entropy not because of a law that pushes them there, but simply because high-entropy states are astronomically more numerous.


The phenomenon: spreading and mixing

Drop a single drop of ink into a glass of water. Watch it bloom outward. It never un-blooms.

This isn’t because spreading is forced. It’s because the number of ways the ink molecules can be spread evenly across the glass is unimaginably larger than the number of ways they can all huddle in one corner.
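The "astronomically more numerous" claim is easy to check with a toy model: treat each ink molecule as independently and equally likely to sit in either half of the glass (an illustrative simplification that ignores molecular interactions). The chance that all of them huddle in one chosen half is then (1/2)^N, which collapses toward zero even for modest N.

```python
import math

# Toy model: each of N molecules is independently equally likely to be
# in either half of the glass. The probability that ALL of them are in
# one chosen half is (1/2)^N.
def prob_all_in_one_half(n_molecules: int) -> float:
    return 0.5 ** n_molecules

for n in (10, 100, 1000):
    print(f"N = {n:>4}: P(all in one half) = {prob_all_in_one_half(n):.3e}")
```

Already at N = 100 the probability is below 10^-30; a real drop of ink contains on the order of 10^20 molecules, so "never un-blooms" is statistics, not prohibition.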

The mechanism: Boltzmann's counting formula

Ludwig Boltzmann crystallised the idea into a single equation, carved on his tombstone:

S = k_B \ln W

where W is the number of microstates consistent with the macrostate and k_B is Boltzmann's constant.

Why the logarithm? Because we want entropy to be additive. Two independent systems combined should have entropy S_1 + S_2; since their microstates multiply (W_{total} = W_1 \times W_2), the log converts multiplication into addition.
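The additivity argument can be verified numerically. A minimal sketch (the microstate counts W1 and W2 are arbitrary illustrative values):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(W: float) -> float:
    """Boltzmann entropy S = k_B ln W."""
    return k_B * math.log(W)

W1, W2 = 1e6, 1e9                       # microstate counts of two independent systems
S_combined = entropy(W1 * W2)           # microstates multiply...
S_sum = entropy(W1) + entropy(W2)       # ...entropies add
print(math.isclose(S_combined, S_sum))  # True: ln(W1*W2) = ln W1 + ln W2
```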

Imagine varying the number of particles in the left of two connected boxes. Both W and S peak at the equal split — that's why systems drift toward it.
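The two-box counting can be done exactly with binomial coefficients: W(n_left) = C(N, n_left) is the number of microstates with n_left of the N particles in the left box, and it peaks at the 50/50 split.

```python
import math

N = 100  # total particles shared between two boxes

# W(n_left) = C(N, n_left): microstates with n_left particles on the left.
counts = {n: math.comb(N, n) for n in range(N + 1)}

peak = max(counts, key=counts.get)
print(peak)  # 50: the equal split has the most microstates
print(f"W(50) / W(10) = {counts[50] / counts[10]:.2e}")  # equal split vastly outnumbers a skewed one
```

Because S = k_B ln W is monotonic in W, S peaks exactly where W does.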
The connection to thermodynamic entropy (Clausius)

Rudolf Clausius defined entropy before Boltzmann, from heat flow:

dS = \frac{\delta Q_{rev}}{T}

where \delta Q_{rev} is a reversible heat exchange at temperature T. Boltzmann’s statistical S = k_B \ln W is the microscopic explanation of Clausius’s macroscopic definition. When heat flows in, it increases the kinetic energy spread of molecules, enlarging the count of accessible microstates — so W rises, and so does S. The two definitions agree wherever both apply. [The Second Law of Thermodynamics — Atkins' Physical Chemistry]
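A worked Clausius calculation makes the definition concrete. For reversible heating at (assumed) constant heat capacity C, δQ_rev = C dT, so integrating dS = δQ_rev / T gives ΔS = C ln(T2/T1). A sketch for 1 kg of water (heat-capacity value and temperatures are illustrative assumptions):

```python
import math

# Reversible heating of 1 kg of water, assuming constant heat capacity:
# dS = δQ_rev / T with δQ_rev = C dT  =>  ΔS = C ln(T2 / T1)
C = 4184.0               # J/K, approximate heat capacity of 1 kg of water
T1, T2 = 293.15, 353.15  # heat from 20 °C to 80 °C (temperatures in kelvin)

delta_S = C * math.log(T2 / T1)
print(f"ΔS = {delta_S:.1f} J/K")  # positive: heating enlarges the microstate count
```

Note that T must be in kelvin: the Clausius formula divides by absolute temperature, so Celsius values would give nonsense.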

Where the model breaks

Boltzmann’s formula is extraordinarily powerful, but it has regime boundaries worth naming.

| Regime | What happens | Implication |
| --- | --- | --- |
| Near absolute zero | Quantum ground state → only 1 microstate | S → 0 (Third Law of Thermodynamics) |
| Black holes | Bekenstein–Hawking entropy scales with horizon area, not volume | Our 3-D intuition about 'mixing' fails |
| Living cells | Locally decrease entropy by coupling to a larger entropy increase (food oxidation) | Second Law holds for the *total* system |
| Everyday gases | Boltzmann counting works beautifully | The model you've just learned applies directly |

Entropy works differently at the extremes.

A surprising consequence: the arrow of time

Here is the threshold concept that trips most people: the laws of physics are time-symmetric. Newton’s laws, Maxwell’s equations, and even quantum mechanics run identically forwards and backwards. Yet you never see a broken egg reassemble.

Why? Because the number of microstates leading toward broken eggs is vastly larger than the number leading toward whole eggs. The laws permit both directions; statistics selects one overwhelmingly.

This is why you can remember the past but not the future. Memories are physical records — low-entropy correlations between brain states and past events. The brain forms records only in the direction of increasing entropy, which we call “forward in time.” [Statistical Mechanics: Entropy, Order Parameters, and Complexity]