Microcanonical Ensemble
Microcanonical Ensemble Characteristics
The microcanonical ensemble describes a completely isolated system where three quantities are held fixed: total energy (E), volume (V), and number of particles (N). Because the system is isolated, there's no exchange of energy or matter with the surroundings. Think of it as a perfectly insulated, rigid box of gas molecules that can't leak.
The central postulate is the equal a priori probability assumption: every microstate consistent with the same total energy is equally likely to be occupied. The system wanders through all accessible microstates over time, spending equal time in each one on average. This is sometimes called the fundamental postulate of statistical mechanics.
The quantity that captures everything about the ensemble is the density of states Ω(E), which counts the total number of microstates accessible at energy E. Once you know Ω(E), you can derive all the thermodynamic properties of the system: entropy, temperature, pressure, and chemical potential.
Microstates and Entropy Relationship
The bridge between the microscopic world (microstates) and the macroscopic world (thermodynamic quantities) is the Boltzmann entropy formula:

S = k_B ln Ω

where:
- S is the thermodynamic entropy
- k_B ≈ 1.381 × 10⁻²³ J/K is the Boltzmann constant
- Ω is the number of accessible microstates
A system with more accessible microstates has higher entropy. This makes intuitive sense: more microstates means more ways the particles can arrange themselves while still having the same total energy, which corresponds to greater "spread" or multiplicity.
Within the microcanonical ensemble, the equilibrium macrostate is the one with the largest number of microstates. The system doesn't "choose" to maximize entropy on purpose. Rather, the macrostate with the most microstates is simply the one that's overwhelmingly most probable. For large N, the peak in Ω is so sharp that fluctuations away from it are negligible. This statistical fact is what gives the second law its force.
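The sharpness of that peak is easy to see numerically. A minimal sketch, assuming a toy system of N two-state spins whose multiplicity for n up-spins is the binomial coefficient C(N, n):

```python
# Toy model (assumed): N two-state spins, multiplicity Omega(n) = C(N, n)
# for n up-spins. For large N, the peak at n = N/2 dominates everything.
from math import comb

N = 1000
omega = [comb(N, n) for n in range(N + 1)]
total = sum(omega)  # total microstates = 2**N

# Fraction of ALL microstates lying within +/-5% of the peak macrostate
window = range(int(0.45 * N), int(0.55 * N) + 1)
frac = sum(omega[n] for n in window) / total
print(f"fraction within 5% of peak: {frac:.6f}")  # very close to 1
```

Even at a modest N = 1000, essentially all microstates sit in a narrow band around the peak; at N ~ 10²³ the band is far narrower still.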

Entropy Calculation with the Boltzmann Equation
Here's how to compute entropy for a microcanonical system:
- Count the microstates (Ω) for the system at the given energy. This means finding the number of distinct particle configurations that all produce the same total energy E. For large systems, you'll often need Stirling's approximation (ln N! ≈ N ln N − N) to handle factorials in combinatorial expressions.
- Apply the Boltzmann equation: S = k_B ln Ω. Take the natural logarithm (base e) of Ω and multiply by k_B.
- Check your result. Entropy should be non-negative and extensive (it scales with system size).
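These steps can be sketched in a few lines of Python. The two-state particle model (Ω = C(N, M) for M excited particles out of N) and the parameter values are illustrative assumptions:

```python
# Minimal sketch of the three steps, assuming a hypothetical system of N
# two-state particles with M in the excited state, so Omega = C(N, M).
from math import comb, log

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(N, M):
    omega = comb(N, M)        # step 1: count microstates
    return K_B * log(omega)   # step 2: S = k_B ln(Omega), natural log

S = entropy(100, 50)
print(f"S = {S:.3e} J/K")

# step 3: sanity checks -- non-negative, and roughly extensive
# (doubling the system should roughly double the entropy)
assert S >= 0
assert 1.9 < entropy(200, 100) / entropy(100, 50) < 2.1
```

The extensivity check is only approximate at these small sizes; the subextensive correction shrinks relative to S as N grows, which is exactly what Stirling's approximation captures.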
Worked example: suppose a system has Ω = 10²⁰ microstates. Then S = k_B ln Ω = (1.381 × 10⁻²³ J/K) × ln(10²⁰) ≈ (1.381 × 10⁻²³ J/K) × 46.05 ≈ 6.36 × 10⁻²² J/K. A common slip is computing log₁₀ Ω instead of ln Ω, which understates the entropy by a factor of ln 10 ≈ 2.303. The logarithm in Boltzmann's formula is always the natural logarithm.
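A quick numerical check of the logarithm-base pitfall, using an illustrative (assumed) microstate count:

```python
# Sanity check on the logarithm base: using log10 instead of ln makes the
# entropy come out low by a constant factor of ln(10) ≈ 2.303.
from math import log, log10

K_B = 1.380649e-23   # Boltzmann constant, J/K
omega = 1e20         # illustrative microstate count (assumed)

ratio = (K_B * log(omega)) / (K_B * log10(omega))
print(f"correct / mistaken = {ratio:.4f}")  # ≈ 2.3026 = ln(10)
```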
Entropy and the Second Law of Thermodynamics

Statistical Foundation of the Second Law
The second law of thermodynamics states that the entropy of an isolated system never decreases. In reversible processes entropy stays constant; in irreversible (spontaneous) processes it increases. This establishes a preferred direction for natural processes and an "arrow of time."
The microcanonical ensemble gives this law a statistical explanation. An isolated system explores all accessible microstates over time. Because the macrostate with the most microstates is overwhelmingly the most probable, the system will almost certainly be found in (or very near) that maximum-entropy configuration. Moving away from it isn't forbidden by any microscopic law, but for a system of ~10²³ particles, the probability of a significant entropy decrease is so vanishingly small that it never happens in practice.
Equilibrium, then, is simply the state where Ω is maximized subject to the constraints E, V, and N. The system doesn't need an external push to get there; the statistics do all the work.
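A back-of-the-envelope sketch of just how improbable such fluctuations are, assuming the classic toy example of N gas molecules all spontaneously gathering in one half of their box (probability (1/2)^N):

```python
# Toy estimate (assumed model): the probability that a random fluctuation
# puts all N molecules of a gas into the left half of the box is (1/2)**N.
# We work with log10 of the probability to avoid underflow.
from math import log10

for N in (10, 100, 6.022e23):
    log10_p = -N * log10(2)   # log10 of (1/2)**N
    print(f"N = {N:g}: P ~ 10^({log10_p:.4g})")
```

For ten molecules the fluctuation is merely rare; for a mole of gas the exponent is about −1.8 × 10²³, a probability so small it will not occur on any physical timescale.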
Microcanonical Ensemble Problem-Solving
- Identify the constraints. Write down the fixed quantities: total energy (E), volume (V), number of particles (N), and whether Ω is given or needs to be calculated.
- Choose the right equation.
- If Ω is known or can be counted: use S = k_B ln Ω.
- If you need to derive Ω: use the specific density-of-states formula for your system (e.g., the combinatorial expression for an Einstein solid, or the phase-space volume for an ideal gas).
- Substitute and solve. Plug in values and solve for the unknown (entropy, number of microstates, or a derived quantity like temperature via 1/T = ∂S/∂E at fixed V and N). Use logarithm properties to simplify, and keep units consistent throughout.
- Interpret physically.
- Higher entropy means greater multiplicity, not just "more disorder."
- Spontaneous processes in isolated systems move toward higher Ω (and thus higher S).
- At equilibrium, the entropy is at its constrained maximum, and macroscopic properties stop changing.
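The strategy above can be sketched end-to-end on a standard textbook system, the Einstein solid, whose multiplicity for N oscillators sharing q energy quanta is the well-known Ω(N, q) = C(q + N − 1, q). The parameter values and the size of the energy quantum below are illustrative assumptions:

```python
# Worked sketch of the problem-solving steps for an Einstein solid:
# Omega(N, q) = C(q + N - 1, q) for N oscillators sharing q quanta.
# N, q, and E_QUANTUM are illustrative assumptions.
from math import comb, log

K_B = 1.380649e-23    # Boltzmann constant, J/K
E_QUANTUM = 1.0e-21   # energy per quantum (assumed), J

def S(N, q):
    return K_B * log(comb(q + N - 1, q))   # steps 1-2: count, then k_B ln

N, q = 300, 200
entropy = S(N, q)

# step 3: temperature from 1/T = dS/dE, estimated by a centered finite
# difference in q (each quantum adds dE = E_QUANTUM of energy)
dS_dE = (S(N, q + 1) - S(N, q - 1)) / (2 * E_QUANTUM)
T = 1.0 / dS_dE

# step 4: interpret -- S is positive, and T comes out finite and positive
print(f"S = {entropy:.3e} J/K, T = {T:.1f} K")
```

The finite-difference step is a numerical stand-in for the analytic derivative ∂S/∂E; with Stirling's approximation the same calculation can be done in closed form.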