🎲Statistical Mechanics Unit 1 Review

1.1 Microscopic and macroscopic states

Written by the Fiveable Content Team • Last updated August 2025

Statistical mechanics connects the behavior of individual particles to the bulk properties you can measure in the lab, like temperature and pressure. By understanding how microscopic configurations map onto macroscopic observables, you gain the conceptual foundation for everything else in this course.

Microscopic vs macroscopic states

A physical system can be described at two very different levels. The microscopic description specifies the exact state of every single particle: where it is and how fast it's moving. The macroscopic description captures only the bulk, averaged quantities you'd measure with instruments: temperature, pressure, volume, internal energy.

Statistical mechanics is the framework that connects these two levels. You start from the microscopic rules governing particles and derive the macroscopic thermodynamic behavior. The central question is: how do the properties of $10^{23}$ individual particles combine to produce the smooth, predictable behavior we observe at human scales?

Microstates and macrostates

Definition of microstates

A microstate is one specific, fully detailed configuration of every particle in a system. In classical mechanics, that means specifying the position and momentum of each particle. For a gas of $N$ particles, a single microstate is a list of all $3N$ position coordinates and all $3N$ momentum components.

  • The number of microstates grows exponentially with system size. Even a tiny box of gas has an astronomically large number of possible microstates.
  • Each microstate is equally fundamental; statistical mechanics treats them as the basic "building blocks" of any calculation.

Definition of macrostates

A macrostate is defined by a small set of macroscopic variables: temperature $T$, pressure $P$, volume $V$, internal energy $E$, particle number $N$, and so on. These are the quantities you'd record in a lab.

  • A macrostate doesn't tell you which microstate the system is in. It only tells you the bulk properties.
  • Thermodynamic descriptions of a system are macrostate descriptions.

Relationship between micro and macro

The key idea: many different microstates correspond to the same macrostate. Think of it this way. If you have a box of gas at a certain temperature and pressure, the individual molecules could be arranged in a huge number of different ways while still producing those same bulk readings.

  • The statistical weight (or multiplicity) $\Omega$ counts how many microstates correspond to a given macrostate.
  • Macroscopic properties are computed by averaging over all the microstates compatible with a macrostate.
  • The macrostate with the largest multiplicity is overwhelmingly the most probable one. This is why equilibrium states are so stable: they correspond to an enormous number of microstates compared to non-equilibrium configurations.
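
For a toy system of $N$ two-state spins, the multiplicity of the macrostate "$n$ spins up" is just a binomial coefficient. A minimal sketch in Python (the spin system and the numbers below are illustrative, not from the text):

```python
from math import comb

def multiplicity(N, n_up):
    """Omega(N, n_up): number of microstates (distinct spin arrangements)
    with exactly n_up of the N two-state spins pointing up."""
    return comb(N, n_up)

N = 100

# Summing the multiplicity over all macrostates recovers the total
# number of microstates, 2^N.
total = sum(multiplicity(N, k) for k in range(N + 1))

# The balanced macrostate dwarfs a skewed one by many orders of magnitude.
ratio = multiplicity(N, 50) / multiplicity(N, 10)
```

Even at $N = 100$, the 50-up macrostate is realized by more than $10^{15}$ times as many microstates as the 10-up macrostate; at $N \sim 10^{23}$ this dominance becomes absolute.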

Phase space

Configuration space

Configuration space represents all possible positions of the particles. For $N$ particles moving in three dimensions, configuration space has $3N$ dimensions. Each single point in this space specifies where every particle is located.

Momentum space

Momentum space represents all possible momenta. It also has $3N$ dimensions for $N$ particles in 3D. A point in momentum space tells you the momentum (and therefore velocity) of every particle.

Phase space volume

Phase space combines configuration space and momentum space into one unified space of $6N$ dimensions. A single point in phase space specifies the complete microstate of the system: all positions and all momenta.

  • The phase space volume accessible to a system is directly related to the number of microstates. A larger accessible volume means more microstates and higher entropy.
  • Liouville's theorem states that the phase space density is conserved along trajectories of the system's evolution. In practical terms, the "cloud" of representative points in phase space can change shape over time, but its volume stays constant. This is a foundational result for justifying equilibrium statistical mechanics.

Statistical weight

Multiplicity of states

The multiplicity $\Omega$ counts the number of microstates that realize a given macrostate. For most physical systems, $\Omega$ grows exponentially with the number of particles. This exponential scaling is why the most probable macrostate dominates so completely: even a tiny shift away from equilibrium causes $\Omega$ to drop by a huge factor.

  • $\Omega$ determines the probability of observing a particular macrostate. More microstates means higher probability.
  • Calculating $\Omega$ is often the first step in deriving thermodynamic quantities.

Boltzmann's principle

Boltzmann's principle provides the bridge between the microscopic count of states and the macroscopic concept of entropy:

$$S = k_B \ln \Omega$$

Here $S$ is the entropy, $k_B \approx 1.38 \times 10^{-23}\,\text{J/K}$ is Boltzmann's constant, and $\Omega$ is the multiplicity. The logarithm is crucial: it converts the multiplicative scaling of $\Omega$ (when you combine independent systems, multiplicities multiply) into the additive scaling of entropy (entropies add). This is what makes entropy an extensive quantity.

Entropy and statistical weight

Entropy quantifies how "spread out" a system is over its available microstates. A system with more accessible microstates has higher entropy.

  • The second law of thermodynamics follows naturally from statistics: an isolated system evolves toward the macrostate with the largest Ω\Omega simply because that macrostate is overwhelmingly more probable. There's no need to impose the second law as a separate axiom; it emerges from counting states.
  • Describing entropy as "disorder" is common but can be misleading. It's more precise to think of entropy as a measure of the number of microscopic ways a macrostate can be realized.
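
Boltzmann's principle and the additivity it buys can be checked in a few lines. A quick sketch (the multiplicity values are made up purely for illustration):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant in J/K

def entropy(omega):
    """Boltzmann's principle: S = k_B ln(Omega)."""
    return k_B * math.log(omega)

# For two independent systems, multiplicities multiply...
omega_1, omega_2 = 1e20, 1e30
S_1, S_2 = entropy(omega_1), entropy(omega_2)

# ...so entropies add: S(omega_1 * omega_2) = S_1 + S_2.
S_combined = entropy(omega_1 * omega_2)
```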

Ensemble theory

An ensemble is a large (conceptually infinite) collection of copies of your system, each in a microstate consistent with certain macroscopic constraints. Different constraints define different ensembles.

Microcanonical ensemble

  • Describes an isolated system with fixed energy $E$, volume $V$, and particle number $N$.
  • Every microstate with the correct energy is equally probable. This is the equal a priori probability postulate, one of the foundational assumptions of statistical mechanics.
  • Entropy is defined directly as $S = k_B \ln \Omega(E, V, N)$.
  • Most useful for deriving fundamental results, though it can be harder to work with in practice because fixing energy exactly is mathematically restrictive.

Canonical ensemble

  • Describes a system in thermal contact with a heat bath at temperature $T$. Energy can fluctuate, but $T$, $V$, and $N$ are fixed.
  • The probability of finding the system in microstate $i$ with energy $E_i$ follows the Boltzmann distribution: $P_i = \frac{e^{-\beta E_i}}{Z}$, where $\beta = \frac{1}{k_B T}$.
  • The partition function $Z = \sum_i e^{-\beta E_i}$ encodes all thermodynamic information about the system. Once you have $Z$, you can extract energy, entropy, free energy, and more through derivatives.
  • This is the workhorse ensemble for most calculations in statistical mechanics.
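
The Boltzmann distribution is easy to evaluate directly. A minimal sketch for a hypothetical two-level system, with the energy gap chosen (as an illustrative assumption) so that $\beta \Delta E = 1$ at 300 K:

```python
import math

k_B = 1.380649e-23  # J/K

def boltzmann_probs(energies, T):
    """Canonical probabilities P_i = exp(-beta * E_i) / Z."""
    beta = 1.0 / (k_B * T)
    weights = [math.exp(-beta * E) for E in energies]
    Z = sum(weights)
    return [w / Z for w in weights]

# Two-level system with gap equal to k_B * 300 K, in contact with a 300 K bath.
gap = k_B * 300.0
p_ground, p_excited = boltzmann_probs([0.0, gap], T=300.0)
# The population ratio is the Boltzmann factor exp(-beta * gap) = e^{-1}.
```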

Grand canonical ensemble

  • Describes an open system that exchanges both energy and particles with a reservoir. Temperature $T$ and chemical potential $\mu$ are fixed, while both energy and particle number fluctuate.
  • The grand partition function $\mathcal{Z} = \sum_{N=0}^{\infty} \sum_i e^{-\beta(E_i - \mu N)}$ generalizes the canonical partition function by including a sum over particle numbers.
  • Particularly useful for systems where particle number isn't conserved or is hard to fix, such as quantum gases and adsorption problems.

Ergodic hypothesis

Time averages vs ensemble averages

There are two ways to compute the average value of a physical quantity:

  • Time average: follow a single system for a very long time and average the observable over that trajectory.
  • Ensemble average: take a snapshot of many copies of the system (the ensemble) at one instant and average across copies.

The ergodic hypothesis asserts that these two averages are equal for systems in equilibrium. This is what lets you use the mathematically convenient ensemble average in place of the experimentally relevant time average.

Ergodicity in statistical mechanics

Ergodicity assumes that, given enough time, a system will visit all accessible microstates with the correct statistical frequency. If this holds, the long-time behavior of a single system is representative of the full ensemble.

This assumption justifies the entire ensemble framework. Without it, there would be no guarantee that ensemble calculations correspond to what you'd actually measure.
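
The equality of time and ensemble averages can be seen in a toy ergodic system: a two-state Markov chain whose stationary (ensemble) distribution is known exactly. The switching probabilities below are arbitrary illustrative choices:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# Two-state chain: flip 0 -> 1 with probability p, 1 -> 0 with probability q.
# Its stationary distribution gives the ensemble average of the state:
# <state>_ensemble = p / (p + q).
p, q = 0.3, 0.1
ensemble_avg = p / (p + q)

# Time average: follow ONE long trajectory and average the state over time.
state, total, steps = 0, 0, 200_000
for _ in range(steps):
    if state == 0 and random.random() < p:
        state = 1
    elif state == 1 and random.random() < q:
        state = 0
    total += state
time_avg = total / steps
# For this ergodic chain, time_avg converges to ensemble_avg (0.75 here).
```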

Limitations of ergodicity

Not all systems are ergodic:

  • Glassy systems get trapped in local regions of phase space and fail to explore the full set of accessible states on any reasonable timescale.
  • Systems with very long relaxation times may appear non-ergodic on experimental timescales even if they're technically ergodic in the infinite-time limit.
  • Quantum systems can exhibit many-body localization, where interactions and disorder prevent the system from thermalizing.

Non-ergodic behavior is an active area of research and leads to breakdowns of standard statistical mechanics predictions.

Quantum mechanical considerations

Quantum microstates

In quantum mechanics, microstates are discrete energy eigenstates characterized by a set of quantum numbers. Unlike classical microstates (which are points in continuous phase space), quantum microstates are countable.

  • Particles are either bosons (integer spin, symmetric wave functions) or fermions (half-integer spin, antisymmetric wave functions). This distinction fundamentally changes how you count microstates.
  • Fermions obey the Pauli exclusion principle: no two fermions can occupy the same quantum state. Bosons have no such restriction.

Density of states

The density of states $g(E)$ counts the number of quantum states per unit energy interval near energy $E$. It depends on the system's geometry, boundary conditions, and dimensionality.

  • For a free particle in a 3D box of volume $V$, the density of states scales as $g(E) \propto V \cdot E^{1/2}$.
  • The density of states appears directly in partition function calculations: sums over discrete states can be replaced by integrals weighted by $g(E)$ when energy levels are closely spaced.
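
The $g(E) \propto E^{1/2}$ scaling can be checked by brute-force counting. For a particle in a cubic box the levels go as $E \propto n_x^2 + n_y^2 + n_z^2$, so counting states below a cutoff gives the cumulative number $N(E) \propto E^{3/2}$, whose derivative is $g(E) \propto E^{1/2}$. A sketch in dimensionless units (the cutoff values are arbitrary):

```python
def count_states(e_max):
    """Cumulative number of states N(E) for a particle in a cubic box,
    in units where E = nx^2 + ny^2 + nz^2 (nx, ny, nz = 1, 2, 3, ...)."""
    n_max = int(e_max ** 0.5) + 1
    return sum(
        1
        for nx in range(1, n_max + 1)
        for ny in range(1, n_max + 1)
        for nz in range(1, n_max + 1)
        if nx * nx + ny * ny + nz * nz <= e_max
    )

# If N(E) ~ E^{3/2}, quadrupling the energy cutoff should multiply the
# count by about 4^{3/2} = 8 (finite-size corrections shift it slightly).
ratio = count_states(4000) / count_states(1000)
```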

Quantum statistical mechanics

Classical statistical mechanics breaks down when thermal energy $k_B T$ becomes comparable to the spacing between quantum energy levels. In this regime, you need quantum statistics:

  • Fermi-Dirac statistics for fermions: the mean occupation of a state with energy $\epsilon$ is $\langle n \rangle = \frac{1}{e^{\beta(\epsilon - \mu)} + 1}$.
  • Bose-Einstein statistics for bosons: $\langle n \rangle = \frac{1}{e^{\beta(\epsilon - \mu)} - 1}$.
  • These reduce to the classical Maxwell-Boltzmann distribution at high temperatures (low density), where quantum effects become negligible.
  • Quantum statistics explain phenomena with no classical analog, such as Bose-Einstein condensation (macroscopic occupation of the ground state) and electron degeneracy pressure (which supports white dwarf stars against gravitational collapse).
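
The three occupation functions are one-liners, and the classical limit is easy to verify numerically. A sketch with arbitrary dimensionless parameters (chosen so that $\beta(\epsilon - \mu) \gg 1$):

```python
import math

def fermi_dirac(eps, mu, beta):
    """Mean occupation for fermions: 1 / (exp(beta (eps - mu)) + 1)."""
    return 1.0 / (math.exp(beta * (eps - mu)) + 1.0)

def bose_einstein(eps, mu, beta):
    """Mean occupation for bosons: 1 / (exp(beta (eps - mu)) - 1)."""
    return 1.0 / (math.exp(beta * (eps - mu)) - 1.0)

def maxwell_boltzmann(eps, mu, beta):
    """Classical limit: exp(-beta (eps - mu))."""
    return math.exp(-beta * (eps - mu))

# Classical regime: beta * (eps - mu) = 10 >> 1, so occupations are tiny
# and the quantum corrections (the +/- 1 in the denominator) barely matter.
eps, mu, beta = 5.0, 0.0, 2.0
n_fd = fermi_dirac(eps, mu, beta)
n_be = bose_einstein(eps, mu, beta)
n_mb = maxwell_boltzmann(eps, mu, beta)
```

Note the ordering $n_{\text{FD}} < n_{\text{MB}} < n_{\text{BE}}$: the Pauli exclusion principle suppresses fermion occupation, while bosons are (slightly) enhanced.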

Thermodynamic properties

Derivation from microscopic states

The power of statistical mechanics is that macroscopic thermodynamic quantities are derived, not assumed. You compute ensemble averages of microscopic quantities to obtain measurable properties:

  • Internal energy: $\langle E \rangle = -\frac{\partial \ln Z}{\partial \beta}$
  • Pressure: $\langle P \rangle = \frac{1}{\beta} \frac{\partial \ln Z}{\partial V}$

These formulas show why the partition function is so central: differentiate it in the right way, and you get any thermodynamic quantity you need.
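
The identity $\langle E \rangle = -\partial \ln Z / \partial \beta$ can be checked numerically against the direct ensemble average. A sketch for a made-up three-level system (the energies are arbitrary illustrative values):

```python
import math

def log_Z(beta, energies):
    """Log of the canonical partition function Z = sum_i exp(-beta E_i)."""
    return math.log(sum(math.exp(-beta * E) for E in energies))

def mean_energy(beta, energies):
    """Direct ensemble average <E> = sum_i E_i exp(-beta E_i) / Z."""
    weights = [math.exp(-beta * E) for E in energies]
    return sum(E * w for E, w in zip(energies, weights)) / sum(weights)

energies = [0.0, 1.0, 2.5]  # hypothetical level energies (arbitrary units)
beta, h = 1.3, 1e-6

# <E> = -d(ln Z)/d(beta), here approximated by a central difference.
E_from_derivative = -(log_Z(beta + h, energies) - log_Z(beta - h, energies)) / (2 * h)
E_direct = mean_energy(beta, energies)
```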

Partition function

The partition function $Z$ is the single most important quantity in statistical mechanics calculations. It's a sum over all microstates, weighted by the Boltzmann factor:

$$Z = \sum_i e^{-\beta E_i}$$

Every equilibrium thermodynamic property can be extracted from $Z$ or its logarithm. Different ensembles have their own versions of the partition function (canonical $Z$, grand canonical $\mathcal{Z}$), but the logic is the same: sum over states, then differentiate.

Free energy and entropy

  • Helmholtz free energy: $F = -k_B T \ln Z$. This is the natural thermodynamic potential for the canonical ensemble (fixed $T$, $V$, $N$). At equilibrium, $F$ is minimized.
  • Gibbs free energy: $G = F + PV$. This is the natural potential when pressure (rather than volume) is held fixed.
  • Entropy can be obtained from the partition function: $S = -\frac{\partial F}{\partial T}\big|_{V,N} = k_B \ln Z + \frac{\langle E \rangle}{T}$.

Free energy minimization is the statistical mechanics version of "the system goes to equilibrium."

Applications in statistical mechanics

Ideal gas model

The ideal gas is the simplest and most important model system: $N$ non-interacting particles in a box.

  • Starting from the partition function for a single particle and using the fact that particles are independent, you can derive the equation of state $PV = Nk_B T$ purely from microscopic considerations.
  • The Maxwell-Boltzmann speed distribution also follows: the probability of a particle having speed $v$ is proportional to $v^2 e^{-\beta m v^2/2}$, giving a characteristic peak at $v_{\text{most probable}} = \sqrt{2k_B T/m}$.
  • The ideal gas serves as the reference system against which more realistic models are compared.
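
The most probable speed can be checked by scanning the (unnormalized) speed distribution for its peak. The mass and temperature below are illustrative assumptions, roughly a nitrogen molecule at room temperature:

```python
import math

k_B = 1.380649e-23  # J/K

def mb_speed_weight(v, m, T):
    """Unnormalized Maxwell-Boltzmann speed distribution: v^2 exp(-m v^2 / 2 k_B T)."""
    return v * v * math.exp(-m * v * v / (2.0 * k_B * T))

m, T = 4.65e-26, 300.0                 # ~N2 molecule mass (kg), room temperature
v_pred = math.sqrt(2.0 * k_B * T / m)  # predicted peak speed, roughly 420 m/s

# Scan a grid of speeds around the prediction and locate the maximum.
grid = [v_pred * (0.5 + 0.001 * i) for i in range(1001)]
v_peak = max(grid, key=lambda v: mb_speed_weight(v, m, T))
```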

Paramagnetic systems

A paramagnet consists of magnetic moments (spins) that tend to align with an external field but are randomized by thermal fluctuations.

  • For $N$ independent spin-$\frac{1}{2}$ particles in a field $B$, each spin has two states: aligned (energy $-\mu B$) or anti-aligned (energy $+\mu B$). The partition function is straightforward to compute.
  • At high temperature, the magnetization follows Curie's law: $M \propto B/T$.
  • This system nicely illustrates the competition between energy minimization (spins want to align) and entropy maximization (spins want to be random).
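
For independent spin-$\frac{1}{2}$ moments, the two-state partition function gives $M = N\mu\tanh(\mu B/k_B T)$, which reduces to Curie's law $M \approx N\mu^2 B/(k_B T)$ when $\mu B \ll k_B T$. A sketch with illustrative numbers (Bohr-magneton-sized moments; $N$ is arbitrary):

```python
import math

k_B = 1.380649e-23  # J/K

def magnetization(N, mu, B, T):
    """Mean magnetization of N independent spin-1/2 moments:
    M = N * mu * tanh(mu * B / (k_B * T))."""
    return N * mu * math.tanh(mu * B / (k_B * T))

N, mu = 1.0e20, 9.274e-24  # moment count and size (Bohr magneton), illustrative

# Curie regime (mu*B << k_B*T): M depends on B and T only through B/T.
M_1 = magnetization(N, mu, B=0.1, T=300.0)
M_2 = magnetization(N, mu, B=0.2, T=600.0)   # same B/T -> same M
M_curie = N * mu * mu * 0.1 / (k_B * 300.0)  # linearized Curie-law prediction
```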

Lattice models

Lattice models place degrees of freedom on a regular grid and are powerful tools for studying phase transitions.

  • The Ising model assigns a spin variable $s_i = \pm 1$ to each lattice site, with nearest-neighbor interactions. It's the simplest model that exhibits a phase transition (in 2D and higher).
  • The 1D Ising model can be solved exactly and shows no phase transition at finite temperature. The 2D Ising model, solved by Onsager, does exhibit a sharp transition.
  • Lattice gas models map fluid problems onto spin models, connecting the physics of magnetism and liquid-gas transitions.
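
The exact 1D solution can be verified directly: for a periodic chain in zero field, the transfer-matrix eigenvalues $2\cosh\beta J$ and $2\sinh\beta J$ give $Z = (2\cosh\beta J)^N + (2\sinh\beta J)^N$, which a brute-force sum over all $2^N$ configurations must reproduce. A sketch for a small chain (the coupling and temperature values are arbitrary):

```python
import math

def log_Z_transfer(N, beta, J):
    """Exact 1D Ising partition function (periodic chain, zero field)
    from the transfer-matrix eigenvalues 2 cosh(beta J) and 2 sinh(beta J)."""
    lam_plus = 2.0 * math.cosh(beta * J)
    lam_minus = 2.0 * math.sinh(beta * J)
    return math.log(lam_plus ** N + lam_minus ** N)

def log_Z_bruteforce(N, beta, J):
    """Sum exp(-beta E) over all 2^N spin configurations of a periodic chain,
    with E = -J * sum_i s_i s_{i+1}."""
    Z = 0.0
    for bits in range(2 ** N):
        spins = [1 if (bits >> i) & 1 else -1 for i in range(N)]
        E = -J * sum(spins[i] * spins[(i + 1) % N] for i in range(N))
        Z += math.exp(-beta * E)
    return math.log(Z)

lnZ_exact = log_Z_transfer(8, beta=0.7, J=1.0)
lnZ_brute = log_Z_bruteforce(8, beta=0.7, J=1.0)
```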

Fluctuations and correlations

Fluctuations in macroscopic observables

Even in equilibrium, macroscopic quantities like energy and particle number fluctuate around their average values due to thermal motion.

  • The relative size of fluctuations scales as $1/\sqrt{N}$. For a system of $N \sim 10^{23}$ particles, relative fluctuations are of order $10^{-12}$, which is why macroscopic measurements appear perfectly sharp.
  • Fluctuations aren't just noise; they carry physical information. The variance of energy fluctuations in the canonical ensemble is directly related to the heat capacity: $\langle (\Delta E)^2 \rangle = k_B T^2 C_V$.
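
The fluctuation-heat-capacity relation can be verified numerically for a two-level system: compute the energy variance directly and $C_V$ by a finite difference of $\langle E \rangle(T)$. A sketch in units where $k_B = 1$ (the gap and temperature are arbitrary choices):

```python
import math

def energy_stats(T, energies):
    """Mean and variance of E in the canonical ensemble (units with k_B = 1)."""
    beta = 1.0 / T
    weights = [math.exp(-beta * E) for E in energies]
    Z = sum(weights)
    E_mean = sum(E * w for E, w in zip(energies, weights)) / Z
    E2_mean = sum(E * E * w for E, w in zip(energies, weights)) / Z
    return E_mean, E2_mean - E_mean ** 2

energies = [0.0, 1.0]  # two-level system with unit gap
T, h = 0.8, 1e-5

E_mean, var_E = energy_stats(T, energies)
# Heat capacity from a central difference of <E>(T).
C_V = (energy_stats(T + h, energies)[0] - energy_stats(T - h, energies)[0]) / (2 * h)
# The relation <(Delta E)^2> = k_B T^2 C_V should hold (k_B = 1 here).
```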

Correlation functions

Correlation functions measure how the value of a quantity at one point or time is related to its value at another.

  • The radial distribution function $g(r)$ describes how particle density varies as a function of distance from a reference particle. It reveals the structure of liquids and solids.
  • Temporal correlation functions describe how quickly a system "forgets" a perturbation, characterizing relaxation processes.
  • Near a phase transition, spatial correlations extend over increasingly large distances. The correlation length diverges at the critical point.

Fluctuation-dissipation theorem

The fluctuation-dissipation theorem (FDT) connects spontaneous equilibrium fluctuations to the system's response to an external perturbation.

  • If you know how a quantity fluctuates in equilibrium, you can predict how the system responds when you push it. The converse is also true.
  • Concrete examples: the Einstein relation connects diffusion coefficient to mobility ($D = k_B T/\gamma$), and Johnson-Nyquist noise relates voltage fluctuations in a resistor to its resistance and temperature.
  • The FDT is a cornerstone of linear response theory and extends into non-equilibrium statistical mechanics.

Symmetry and conservation laws

Symmetry in phase space

Symmetries of the underlying physical laws constrain the structure of statistical mechanics.

  • Translational symmetry implies momentum conservation. Rotational symmetry implies angular momentum conservation. Time-translation symmetry implies energy conservation.
  • These connections are formalized by Noether's theorem: every continuous symmetry of the system's dynamics corresponds to a conserved quantity.
  • Symmetries also constrain the form of partition functions and thermodynamic potentials, often simplifying calculations significantly.

Conservation of energy

Energy conservation is the most fundamental constraint in statistical mechanics.

  • In the microcanonical ensemble, the system is restricted to an energy shell in phase space: the thin surface where total energy equals $E$.
  • The equipartition theorem states that, in classical systems at temperature $T$, each quadratic degree of freedom contributes $\frac{1}{2}k_B T$ to the average energy. For example, a monatomic ideal gas has 3 translational degrees of freedom per particle, giving $\langle E \rangle = \frac{3}{2}Nk_B T$.
  • Equipartition breaks down at low temperatures where quantum effects freeze out certain degrees of freedom (this explains why the heat capacity of solids drops below the classical Dulong-Petit value at low TT).
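
Equipartition can be checked by sampling: in the canonical ensemble a momentum component is Gaussian-distributed with variance $m k_B T$, so the kinetic term $p^2/2m$ should average to $\frac{1}{2}k_B T$. A sketch in arbitrary units with a fixed seed (the parameter values are illustrative):

```python
import math
import random

random.seed(1)  # reproducible sampling

k_B, T, m = 1.0, 2.0, 3.0       # arbitrary units
sigma = math.sqrt(m * k_B * T)  # Boltzmann weight exp(-p^2 / 2mkT) is Gaussian

# Average the quadratic degree of freedom p^2 / 2m over many samples.
n = 200_000
avg_kinetic = sum(random.gauss(0.0, sigma) ** 2 for _ in range(n)) / (2.0 * m * n)
# Equipartition predicts avg_kinetic close to k_B * T / 2 = 1.0 in these units.
```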

Other conserved quantities

Beyond energy, other conserved quantities shape the statistical description:

  • Momentum conservation is associated with translational symmetry; microscopically, the momentum particles transfer to the container walls is what manifests macroscopically as pressure.
  • Particle number conservation (in systems without reactions or particle exchange) introduces the chemical potential $\mu$ as the conjugate variable.
  • Each conserved quantity adds a constraint that reduces the number of accessible microstates, and each has an associated thermodynamic variable that appears naturally in the appropriate ensemble.