What Is Entropy?
Entropy measures the number of microscopic arrangements that produce the same macroscopic state. Systems drift toward high entropy not because of a law that pushes them there, but simply because high-entropy states are astronomically more numerous.
The phenomenon: spreading and mixing
Drop a single drop of ink into a glass of water. Watch it bloom outward. It never un-blooms.
This isn’t because spreading is forced. It’s because the number of ways the ink molecules can be spread evenly across the glass is unimaginably larger than the number of ways they can all huddle in one corner.
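The counting argument can be made concrete with a toy model (an assumption for illustration, not the full hydrodynamics of ink): treat each molecule as being in either the left or right half of the glass, and count microstates with the binomial coefficient.

```python
from math import comb

# Toy model: N molecules, each independently in the left or right half of
# the glass. The macrostate "n molecules on the left" corresponds to
# C(N, n) distinct microstates.
N = 100

all_in_one_half = comb(N, N)   # every molecule huddled together: exactly 1 way
even_spread = comb(N, N // 2)  # evenly spread: C(100, 50), roughly 1e29 ways

print(all_in_one_half)
print(even_spread)
```

Even with only 100 molecules, the evenly spread macrostate outnumbers the huddled one by a factor of about 10^29; a real glass of water has ~10^24 molecules, and the imbalance becomes unimaginably larger.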
The mechanism: Boltzmann's counting formula
Ludwig Boltzmann crystallised the idea into a single equation, carved on his tombstone:

S = k_B ln W

where W is the number of microstates consistent with the macrostate and k_B is Boltzmann's constant.

Why the logarithm? Because we want entropy to be additive. Two independent systems combined should have entropy S₁₂ = S₁ + S₂; since their microstates multiply (W₁₂ = W₁ × W₂), the log converts multiplication into addition.
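The additivity argument can be checked numerically. The microstate counts below are made-up numbers chosen for illustration; only the constant k_B is a real physical value.

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K (exact in the 2019 SI)

def S(W):
    """Statistical entropy S = k_B * ln(W) for a macrostate with W microstates."""
    return k_B * math.log(W)

# Two independent systems with made-up microstate counts:
W1, W2 = 10**6, 10**9
W_combined = W1 * W2  # independence: microstates multiply

# The logarithm turns that product into a sum of entropies:
assert math.isclose(S(W_combined), S(W1) + S(W2))
print(S(W_combined))
```

Note also that a macrostate with a single microstate (W = 1) gives S = k_B ln 1 = 0, which foreshadows the Third Law row in the table below.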
Show the connection to thermodynamic entropy (Clausius)
Rudolf Clausius defined entropy before Boltzmann, from heat flow:

dS = δQ_rev / T

where δQ_rev is a reversible heat exchange at temperature T. Boltzmann’s statistical S = k_B ln W is the microscopic explanation of Clausius’s macroscopic definition. When heat flows in, it increases the kinetic-energy spread of the molecules, enlarging the count of accessible microstates, so W rises and so does S. The two definitions agree wherever both apply. [The Second Law of Thermodynamics — Atkins' Physical Chemistry]
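Clausius's definition makes entropy bookkeeping a matter of arithmetic. Here is a minimal sketch with made-up numbers: heat flowing from a hot reservoir to a cold one, with dS = Q/T tallied for each side.

```python
# Made-up numbers for illustration: 100 J flows from a 400 K reservoir
# to a 300 K reservoir.
Q = 100.0       # joules of heat transferred
T_hot = 400.0   # kelvin
T_cold = 300.0  # kelvin

dS_hot = -Q / T_hot    # the hot reservoir loses entropy: -0.25 J/K
dS_cold = +Q / T_cold  # the cold one gains more per joule: +0.333... J/K

dS_total = dS_hot + dS_cold
print(dS_total)  # positive: spontaneous heat flow raises total entropy
```

The cold reservoir gains more entropy per joule than the hot one loses, so the total always rises when heat flows downhill in temperature, which is the Second Law in Clausius's language.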
Where the model breaks
Boltzmann’s formula is extraordinarily powerful, but it has regime boundaries worth naming.
| Regime | What happens | Implication |
|---|---|---|
| Near absolute zero | Quantum ground state → only 1 microstate | S → 0 (Third Law of Thermodynamics) |
| Black holes | Bekenstein–Hawking entropy scales with horizon area, not volume | Our 3-D intuition about 'mixing' fails |
| Living cells | Locally decrease entropy by coupling to a larger entropy increase (food oxidation) | Second Law holds for the *total* system |
| Everyday gases | Boltzmann counting works beautifully | The model you've just learned applies directly |
A surprising consequence: the arrow of time
Here is the threshold concept that trips most people: the laws of physics are time-symmetric. Newton’s laws, Maxwell’s equations, and even quantum mechanics run identically forwards and backwards. Yet you never see a broken egg reassemble.
Why? Because the number of microstates leading toward broken eggs is vastly larger than the number leading toward whole eggs. The laws permit both directions; statistics selects one overwhelmingly.
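The statistical selection of a direction can be watched in a simulation. The Ehrenfest urn model is a standard toy for this (used here as an illustrative sketch): particles sit in one of two boxes, and each step moves a randomly chosen particle to the other box. The dynamics is reversible, yet a low-entropy start relaxes toward the evenly mixed state and stays there.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible
N, steps = 1000, 20000
n_A = N  # low-entropy initial condition: all N particles in box A

for _ in range(steps):
    # Pick a uniformly random particle. If it is in box A, it moves to B,
    # and vice versa. Each individual move is perfectly reversible.
    if random.randrange(N) < n_A:
        n_A -= 1
    else:
        n_A += 1

print(n_A)  # hovers near N/2 = 500: the statistically dominant macrostate
```

Nothing in the update rule prefers mixing; the drift toward n_A ≈ N/2 happens purely because far more microstates sit near the 50/50 split, which is the same statistical selection that breaks eggs but never reassembles them.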
This is why you can remember the past but not the future. Memories are physical records — low-entropy correlations between brain states and past events. The brain forms records only in the direction of increasing entropy, which we call “forward in time.” [Statistical Mechanics: Entropy, Order Parameters, and Complexity]