Statistical Interpretation of Entropy
Up to this point, entropy has been defined through heat transfer and temperature. But there's a deeper explanation: entropy is fundamentally about counting. Specifically, it's about how many microscopic arrangements of particles correspond to the same macroscopic state you observe. The more arrangements possible, the higher the entropy. This statistical view connects the behavior of individual atoms to the large-scale thermodynamic laws you've already studied.
Statistical Nature of Entropy
Entropy measures the disorder or randomness in a system. A gas with molecules spread randomly throughout a container has high entropy. A crystal with atoms locked into a regular lattice has low entropy. But "disorder" is really shorthand for something more precise: the number of possible microscopic arrangements, called microstates, that a system can have.
- A system with more possible microstates has higher entropy (a shuffled deck of cards can be in any of trillions of arrangements).
- A system with fewer possible microstates has lower entropy (a sorted deck has essentially one arrangement).
The second law of thermodynamics, viewed statistically, says that an isolated system naturally evolves toward states with more microstates, simply because those states are overwhelmingly more probable. A broken glass doesn't spontaneously reassemble because the number of microstates corresponding to "broken" vastly outnumbers those corresponding to "intact." Entropy increases until the system reaches equilibrium, the macrostate with the maximum number of microstates.
This tendency also defines the thermodynamic arrow of time: physical processes have a preferred direction (toward higher entropy), which is why time seems to "flow" in one direction.
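To put a number on the card-deck comparison, here is a minimal Python sketch (Python is my choice here; the section itself contains no code) counting the arrangements of a standard deck:

```python
import math

# A shuffled 52-card deck can be in any of 52! orderings (its "microstates").
num_arrangements = math.factorial(52)
print(f"52! = {num_arrangements:.3e}")  # ~8.066e+67 possible arrangements

# A sorted deck corresponds to essentially one arrangement, so shuffling
# takes the deck from ~1 microstate to roughly 10**68 of them.
```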

Probability of Macrostates
A macrostate describes the large-scale, measurable properties of a system: temperature, pressure, volume. Each macrostate can be realized by many different microstates, which are the specific arrangements of every individual particle.
The probability of observing a particular macrostate is proportional to how many microstates produce it. Macrostates with more microstates are simply more likely to occur.
For a simple system with $N$ particles, each of which can be in one of $m$ possible states, the total number of microstates is:
$$\Omega = m^N$$
The probability of a specific macrostate where $N_1$ particles are in state 1, $N_2$ are in state 2, and so on, is given by the multinomial distribution:
$$P(N_1, N_2, \ldots, N_m) = \frac{N!}{N_1!\,N_2!\cdots N_m!}\cdot\frac{1}{m^N}$$
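To make the formula concrete, here is a short Python sketch (the function name and structure are my own, for illustration) that computes a macrostate's probability by dividing its multinomial count by the total $m^N$:

```python
import math

def macrostate_probability(occupancies, num_states):
    """Probability of a macrostate with the given particle counts per state,
    assuming all m**N microstates are equally likely."""
    n = sum(occupancies)                 # total particle count N
    ways = math.factorial(n)             # start from N! ...
    for n_i in occupancies:
        ways //= math.factorial(n_i)     # ... divide by each N_i! (exact)
    return ways / num_states ** n        # multinomial count / m**N

print(macrostate_probability([2, 2], 2))  # 0.375, as worked out below
```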
Example: Consider 4 particles that can each be in one of 2 states. What's the probability that exactly 2 particles end up in each state?
- Calculate the number of ways to arrange 2 particles in state 1 and 2 in state 2: $\frac{4!}{2!\,2!} = 6$
- Calculate the total number of microstates: $2^4 = 16$
- The probability is $P = \frac{6}{16} = 0.375$
So the most "balanced" macrostate (2 and 2) occurs about 37.5% of the time. Compare that to the probability of all 4 particles being in state 1: $\frac{1}{16}$, or just 6.25%. The even split is six times more likely. Now imagine $10^{23}$ particles instead of 4. The most probable macrostate becomes so overwhelmingly dominant that you'll essentially never observe anything else.
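If you'd rather trust brute force than formulas, this sketch (again illustrative Python, not part of the original text) enumerates all 16 microstates directly and tallies them by macrostate:

```python
from collections import Counter
from itertools import product

# Enumerate every microstate of 4 particles with 2 states (0 or 1) each.
tally = Counter(sum(micro) for micro in product((0, 1), repeat=4))

for in_state_1, count in sorted(tally.items()):
    print(f"{in_state_1} particles in state 1: {count}/16 = {count / 16:.4f}")
# 0 -> 1/16 = 0.0625   (all in state 0)
# 1 -> 4/16 = 0.2500
# 2 -> 6/16 = 0.3750   (the "balanced" macrostate)
# 3 -> 4/16 = 0.2500
# 4 -> 1/16 = 0.0625   (all in state 1)
```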
The ergodic hypothesis extends this idea to time: over long enough periods, a system will spend time in each microstate in proportion to that microstate's probability.
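A toy simulation illustrates the idea: let the 4-particle system evolve by flipping one randomly chosen particle per step (a simple dynamics assumed here purely for illustration), and the fraction of time spent in each macrostate approaches the probabilities computed above:

```python
import random
from collections import Counter

random.seed(0)
state = [0, 0, 0, 0]                  # start with all particles in state 0
time_spent = Counter()

STEPS = 200_000
for _ in range(STEPS):
    state[random.randrange(4)] ^= 1   # flip one randomly chosen particle
    time_spent[sum(state)] += 1       # record the resulting macrostate

for k in sorted(time_spent):
    print(f"{k} in state 1: {time_spent[k] / STEPS:.3f}")
# Converges toward 0.0625, 0.25, 0.375, 0.25, 0.0625 (the 1/16, 4/16,
# 6/16, 4/16, 1/16 probabilities from the example above).
```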

Entropy and Microstate Quantity
The quantitative link between entropy and microstates is the Boltzmann equation:
$$S = k_B \ln \Omega$$
- $S$ is the entropy of the system
- $\Omega$ is the number of microstates corresponding to the system's macrostate
- $k_B$ is the Boltzmann constant: $k_B \approx 1.38 \times 10^{-23}$ J/K
- $\ln$ is the natural logarithm
The logarithm is important here. It means that entropy grows slowly even as the number of microstates grows astronomically. It also makes entropy additive: if you combine two independent systems, the total number of microstates multiplies ($\Omega_{\text{total}} = \Omega_1 \Omega_2$), but because $\ln(\Omega_1 \Omega_2) = \ln \Omega_1 + \ln \Omega_2$, the total entropy simply adds up ($S_{\text{total}} = S_1 + S_2$).
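A few lines of Python make both the formula and the additivity check concrete (the system sizes below are arbitrary illustrations):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(num_microstates):
    """S = k_B * ln(Omega)."""
    return K_B * math.log(num_microstates)

omega_1 = 2 ** 100                     # e.g., 100 two-state particles
omega_2 = 2 ** 50                      # e.g., 50 two-state particles

s_separate = boltzmann_entropy(omega_1) + boltzmann_entropy(omega_2)
s_combined = boltzmann_entropy(omega_1 * omega_2)  # microstates multiply

print(f"S1 + S2     = {s_separate:.6e} J/K")
print(f"S(combined) = {s_combined:.6e} J/K")  # same value: entropy adds
```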
As a system evolves toward equilibrium, it moves toward the macrostate with the largest , which corresponds to the highest entropy. A drop of ink in water diffuses until it reaches a uniform concentration because the uniform state has vastly more microstates than the concentrated drop.
The second law, in this statistical view, isn't a mysterious force. It's just probability: systems evolve toward the most probable macrostate, and the most probable macrostate is the one with the highest entropy.
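The sketch below shows how quickly that most probable macrostate takes over as $N$ grows, by computing the fraction of all $2^N$ microstates lying within 1% of an even split (the 1% window is an arbitrary choice for illustration):

```python
import math

for n in (100, 1_000, 10_000):
    window = n // 100  # counts k within 1% of n/2
    near_even = sum(math.comb(n, k)
                    for k in range(n // 2 - window, n // 2 + window + 1))
    print(f"N = {n:>6}: {near_even / 2 ** n:.4f} of all microstates")
# N =    100: ~0.24 of all microstates are within 1% of 50/50
# N =   1000: ~0.49
# N =  10000: ~0.96
```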
Entropy and Information
Information theory offers another way to think about entropy. You can interpret entropy as a measure of uncertainty or missing information about a system's exact microstate. If you know the macrostate but not the microstate, higher entropy means there are more possible microstates you'd have to guess from.
Irreversible processes increase entropy precisely because they destroy information about the system's initial conditions. Once a gas expands freely into a vacuum, you can no longer determine which side of the container each molecule started on. That lost information corresponds directly to the entropy increase. Reversible processes, by contrast, preserve all information and produce no net entropy change.
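That correspondence can be made quantitative. If a macrostate is compatible with $\Omega$ equally likely microstates, identifying the exact microstate takes $\log_2 \Omega$ bits, and free expansion into a doubled volume loses one "which side?" bit per molecule. A short sketch of the bookkeeping (the one-mole sample size is an assumption for illustration):

```python
import math

K_B = 1.380649e-23        # Boltzmann constant, J/K
N = 6.022e23              # one mole of molecules, for illustration

# After the volume doubles, each molecule has twice as many available
# positions, so Omega grows by a factor of 2**N: one lost bit per molecule.
lost_bits = N * math.log2(2)                 # = N bits

# The same loss in thermodynamic units: Delta S = k_B * ln(2**N) = N k_B ln 2.
delta_s = N * K_B * math.log(2)

print(f"Missing information: {lost_bits:.3e} bits")
print(f"Entropy increase:    {delta_s:.2f} J/K")  # ~5.76 J/K (= R ln 2)
```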