🎲Statistical Mechanics Unit 2 Review

2.2 Second law of thermodynamics

Written by the Fiveable Content Team • Last updated August 2025

The second law of thermodynamics governs the direction of natural processes and sets fundamental limits on energy conversion. It introduces entropy as the central quantity connecting microscopic particle behavior to macroscopic thermodynamic properties, making it one of the most important bridges between thermodynamics and statistical mechanics.

Fundamental concepts

Entropy and disorder

Entropy quantifies the number of ways a system's microscopic constituents can be arranged while still producing the same macroscopic state. A common shorthand is "disorder," but more precisely, entropy measures the spread of energy and configurations available to a system.

  • Closed systems naturally evolve toward states of maximum entropy (thermodynamic equilibrium)
  • A process is spontaneous when it increases the total entropy of the system plus its surroundings
  • Entropy is a state function: its value depends only on the current state, not on how the system got there

Irreversibility of processes

Real thermodynamic processes are irreversible. Friction, unrestrained expansion, heat conduction across a finite temperature difference, and mixing all generate entropy that cannot be undone without external intervention.

  • A truly reversible process is an idealization where the system passes through a continuous sequence of equilibrium states
  • The "arrow of time" at the macroscopic level is a direct consequence of entropy increase: you never see a shattered glass spontaneously reassemble because the reassembled state has overwhelmingly fewer microstates

Heat flow direction

Heat spontaneously flows from hotter regions to colder regions. This is not just an empirical observation; it follows directly from entropy maximization. Transferring energy $dQ$ from a hot body at $T_H$ to a cold body at $T_C$ produces a net entropy change of $dQ/T_C - dQ/T_H > 0$ whenever $T_H > T_C$.

Reversing this flow (moving heat from cold to hot) requires work input, which is exactly what a refrigerator does.
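The entropy bookkeeping above is easy to verify numerically. A minimal sketch (Python; the 1 J and 400 K / 300 K figures are illustrative, not from the text):

```python
def net_entropy_change(dQ, T_hot, T_cold):
    """Entropy gained by the cold body minus entropy lost by the hot body,
    for a small heat transfer dQ (joules) between reservoirs at fixed T (kelvin)."""
    return dQ / T_cold - dQ / T_hot

# 1 J flowing from a 400 K body to a 300 K body
dS = net_entropy_change(1.0, 400.0, 300.0)
print(f"dS = {dS:.6f} J/K")  # positive, so the flow is spontaneous
```

Running the transfer in the forbidden direction just flips the sign: `net_entropy_change(1.0, 300.0, 400.0)` is negative, which is why that flow never happens on its own.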

Second law statements

Several equivalent formulations of the second law each highlight a different physical consequence.

Clausius statement

No process is possible whose sole result is the transfer of heat from a colder body to a hotter body.

This emphasizes the natural direction of heat transfer. A perfect refrigerator that moves heat from cold to hot with zero work input is impossible.

Kelvin-Planck statement

No process is possible whose sole result is the complete conversion of heat from a single reservoir into work.

This means every cyclic heat engine must reject some heat to a cold reservoir. You can never build an engine that turns 100% of absorbed heat into useful work.

Equivalence of statements

These two statements are logically equivalent. If you could violate one, you could construct a device that violates the other. The proof proceeds by contradiction: assume a perfect refrigerator exists, couple it to a normal engine, and you get a device that violates Kelvin-Planck (and vice versa). The choice of which statement to use typically depends on whether you're analyzing heat transfer problems (Clausius) or power cycles (Kelvin-Planck).

Mathematical formulations

Clausius inequality

For any cyclic process, the second law requires:

$$\oint \frac{dQ}{T} \leq 0$$

The equality holds only for a reversible cycle. For irreversible cycles the integral is strictly negative. This inequality is the mathematical backbone of the second law and can be used to derive entropy as a state function.

Entropy change calculations

For a reversible process between initial state $i$ and final state $f$:

$$\Delta S = \int_{i}^{f} \frac{dQ_{\text{rev}}}{T}$$

Because entropy is a state function, $\Delta S$ between two states is the same regardless of the actual path. You always calculate it along a reversible path connecting those states, even if the real process is irreversible. For an irreversible process in an isolated system, $\Delta S > 0$ always.
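As a concrete check that the reversible-path recipe works, here is a sketch (Python; the heat-capacity value is illustrative) that integrates $dQ_{\text{rev}}/T = C\,dT/T$ numerically for heating at constant heat capacity and compares it with the closed form $C \ln(T_f/T_i)$:

```python
import math

def delta_S_heating(C, T_i, T_f, steps=10_000):
    """Midpoint-rule integration of dQ_rev/T = C dT / T along a reversible path."""
    dT = (T_f - T_i) / steps
    return sum(C * dT / (T_i + (k + 0.5) * dT) for k in range(steps))

C = 10.0                           # heat capacity in J/K (illustrative)
T_i, T_f = 300.0, 400.0
numeric = delta_S_heating(C, T_i, T_f)
exact = C * math.log(T_f / T_i)    # closed form C ln(T_f / T_i)
print(numeric, exact)              # the two agree
```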

Carnot cycle efficiency

The Carnot engine operates between a hot reservoir at $T_H$ and a cold reservoir at $T_C$ through two isothermal and two adiabatic steps. Its efficiency is:

$$\eta_{\text{Carnot}} = 1 - \frac{T_C}{T_H}$$

where temperatures must be in kelvin. This is the theoretical maximum efficiency for any heat engine operating between these two temperatures. For example, an engine running between $T_H = 600\,\text{K}$ and $T_C = 300\,\text{K}$ has a maximum efficiency of 50%. Real engines always fall below this due to irreversibilities.
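The formula is a one-liner to evaluate; this sketch reproduces the 600 K / 300 K example above:

```python
def carnot_efficiency(T_hot, T_cold):
    """Maximum efficiency of any engine between two reservoirs; temperatures in kelvin."""
    return 1.0 - T_cold / T_hot

print(carnot_efficiency(600.0, 300.0))  # 0.5, i.e. the 50% example above
```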

Microscopic interpretation

Statistical mechanics gives entropy a concrete physical meaning by connecting it to the counting of microscopic configurations.

Boltzmann's entropy formula

$$S = k_B \ln \Omega$$

Here $S$ is entropy, $k_B \approx 1.38 \times 10^{-23}\,\text{J/K}$ is Boltzmann's constant, and $\Omega$ is the number of microstates consistent with the macrostate. This single equation is the bridge between the microscopic world of particles and macroscopic thermodynamics.
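A quick way to get a feel for the formula is to count microstates for a toy system of $N$ two-state particles, where $\Omega$ for a macrostate with $n$ particles "up" is the binomial coefficient $\binom{N}{n}$ (a standard illustration, not a system from the text):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(omega):
    """S = k_B ln(Omega)."""
    return k_B * math.log(omega)

N = 100
S_balanced = boltzmann_entropy(math.comb(N, N // 2))  # macrostate with the most microstates
S_all_up = boltzmann_entropy(math.comb(N, N))         # a single microstate, so S = 0

print(S_balanced, S_all_up)
```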


Microstates vs macrostates

A microstate specifies the exact position and momentum of every particle in the system. A macrostate is defined by macroscopic observables like temperature, pressure, and volume.

  • Many distinct microstates typically correspond to the same macrostate
  • The macrostate with the largest number of corresponding microstates is the equilibrium state
  • Systems evolve toward equilibrium because states with more microstates are overwhelmingly more probable

Statistical definition of entropy

The reason systems tend toward higher entropy is purely probabilistic. A macrostate with $\Omega = 10^{23}$ microstates is not just "a little more likely" than one with $\Omega = 10^{3}$; it is astronomically more probable. For macroscopic systems, the numbers are so lopsided that spontaneous decreases in entropy are effectively impossible, even though they are not strictly forbidden by the microscopic laws of motion.
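The lopsidedness is easy to demonstrate with exact integer arithmetic. Even for a modest toy system of $N = 1000$ two-state particles (an illustrative choice), the balanced macrostate outnumbers the all-up macrostate by hundreds of orders of magnitude:

```python
import math

N = 1000
omega_balanced = math.comb(N, N // 2)  # microstates with half up, half down
omega_all_up = math.comb(N, N)         # exactly one microstate

ratio = omega_balanced // omega_all_up
print(f"balanced state is ~10^{len(str(ratio)) - 1} times more probable")
```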

Thermodynamic potentials

Thermodynamic potentials are state functions constructed to predict equilibrium and spontaneity under specific constraints.

Helmholtz free energy

$$F = U - TS$$

where $U$ is internal energy, $T$ is temperature, and $S$ is entropy. At constant temperature and volume, a system reaches equilibrium when $F$ is minimized. The decrease in $F$ during an isothermal process equals the maximum total work extractable from the system.

Gibbs free energy

$$G = H - TS$$

where $H$ is enthalpy. At constant temperature and pressure, a system reaches equilibrium when $G$ is minimized. This is the most commonly used potential in chemistry and materials science because lab conditions usually involve fixed $T$ and $P$.

  • $\Delta G < 0$: process is spontaneous
  • $\Delta G = 0$: system is at equilibrium
  • $\Delta G > 0$: process is non-spontaneous (requires work input)
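These criteria are straightforward to apply numerically. A sketch using the melting of ice (approximate textbook values $\Delta H \approx +6010$ J/mol and $\Delta S \approx +22.0$ J/(mol·K)) shows the sign of $\Delta G$ flipping around the melting point:

```python
def delta_G(delta_H, T, delta_S):
    """Gibbs free-energy change at constant T and P: dG = dH - T dS."""
    return delta_H - T * delta_S

# Melting of ice: dH ~ +6010 J/mol, dS ~ +22.0 J/(mol K), approximate values
print(delta_G(6010.0, 263.15, 22.0))  # > 0 at -10 C: melting not spontaneous
print(delta_G(6010.0, 283.15, 22.0))  # < 0 at +10 C: melting spontaneous
```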

Entropy vs free energy

The choice between maximizing entropy and minimizing free energy depends on the system's constraints:

  • Isolated system (no heat or work exchange): equilibrium corresponds to maximum $S$
  • System at constant $T$ and $V$ (in contact with a heat bath): equilibrium corresponds to minimum $F$
  • System at constant $T$ and $P$: equilibrium corresponds to minimum $G$

Free energy minimization effectively accounts for entropy changes in both the system and its surroundings, packaged into a single system-only quantity.

Applications and consequences

Spontaneous processes

A spontaneous process is one that proceeds without external driving. Diffusion of a gas into a vacuum, heat conduction from hot to cold, and the mixing of two gases at the same temperature and pressure are all spontaneous. In every case, the total entropy of the universe increases.

Heat engines and refrigerators

A heat engine absorbs heat $Q_H$ from a hot reservoir, converts part of it to work $W$, and rejects the remainder $Q_C$ to a cold reservoir. A refrigerator does the reverse: it uses work to move heat from cold to hot.

  • Engine efficiency: $\eta = W/Q_H = 1 - Q_C/Q_H$
  • Refrigerator coefficient of performance: $\text{COP} = Q_C/W$

Maximum efficiency limits

The second law caps these performance metrics. No engine can beat $\eta_{\text{Carnot}}$, and no refrigerator can exceed $\text{COP}_{\text{Carnot}} = T_C/(T_H - T_C)$. Real devices fall short because of friction, finite-rate heat transfer, and other irreversibilities.
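A short sketch makes both bounds concrete (the device numbers below are illustrative):

```python
T_H, T_C = 500.0, 300.0          # reservoir temperatures, kelvin
eta_max = 1.0 - T_C / T_H        # Carnot efficiency bound: 0.40
cop_max = T_C / (T_H - T_C)      # Carnot COP bound: 1.5

# A real (irreversible) engine absorbing 1000 J and rejecting 700 J per cycle:
Q_H, Q_C = 1000.0, 700.0
eta_real = 1.0 - Q_C / Q_H       # 0.30, below the Carnot bound

# A real refrigerator moving 300 J out of the cold side per 300 J of work:
cop_real = 300.0 / 300.0         # 1.0, below the Carnot bound

print(eta_real, eta_max, cop_real, cop_max)
```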

Entropy and information theory

Shannon entropy

Shannon defined information entropy as:

$$H = -\sum_i p_i \log_2 p_i$$

where $p_i$ is the probability of outcome $i$. This is mathematically analogous to the Gibbs entropy formula in statistical mechanics (with $\log_2$ replaced by $\ln$ and a factor of $k_B$). Shannon entropy measures the average information gained per observation and is foundational in data compression and communication theory.
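A direct implementation, applied to a few standard example distributions:

```python
import math

def shannon_entropy(probs):
    """H = -sum p_i log2 p_i, in bits; terms with p = 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0) + 0.0  # +0.0 avoids -0.0

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1 bit
print(shannon_entropy([0.25] * 4))   # fair four-sided die: 2 bits
print(shannon_entropy([1.0]))        # certain outcome: 0 bits
```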

Information and thermodynamics

Landauer's principle states that erasing one bit of information in a computational device must dissipate at least $k_B T \ln 2$ of energy as heat. This sets a fundamental lower bound on the energy cost of computation and shows that information is not abstract; it has real thermodynamic consequences.
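Plugging in room temperature gives a sense of scale (a standard back-of-the-envelope evaluation):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def landauer_bound(T):
    """Minimum heat dissipated when erasing one bit at temperature T: k_B T ln 2."""
    return k_B * T * math.log(2)

print(landauer_bound(300.0))  # ~ 2.87e-21 J per erased bit at room temperature
```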


Maxwell's demon paradox

Maxwell imagined a tiny being (a "demon") that could observe individual molecules and sort fast ones from slow ones, apparently decreasing entropy without doing work. The resolution is that the demon must acquire, store, and eventually erase information about the molecules. When you account for the entropy cost of that information processing (via Landauer's principle), the total entropy of the universe still increases.

Fluctuations and the second law

Fluctuation theorems

The second law holds on average, but in very small systems, brief entropy-decreasing fluctuations can occur. Fluctuation theorems quantify this: they give the ratio of the probability of observing an entropy increase $\Delta S$ to the probability of an equal entropy decrease. The Crooks fluctuation theorem, for instance, relates the probability distributions of work in forward and time-reversed processes.

Jarzynski equality

$$\langle e^{-\beta W} \rangle = e^{-\beta \Delta F}$$

Here $\beta = 1/(k_B T)$, $W$ is the work performed on the system in a non-equilibrium process, and $\Delta F$ is the equilibrium free energy difference between the final and initial states. This remarkable result lets you extract equilibrium information from repeated non-equilibrium experiments, and it reduces to the second law inequality $\langle W \rangle \geq \Delta F$ via Jensen's inequality.
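The equality can be checked by simulation. For a Gaussian work distribution with mean $\mu$ and variance $\sigma^2$, it holds exactly with $\Delta F = \mu - \beta\sigma^2/2$, which makes it a convenient test case (all numbers below are illustrative):

```python
import math
import random

random.seed(0)
beta = 1.0
mu, sigma = 2.0, 1.0                      # illustrative Gaussian work distribution
dF_exact = mu - beta * sigma**2 / 2       # exact result for Gaussian work

n = 100_000
samples = [random.gauss(mu, sigma) for _ in range(n)]
avg_exp = sum(math.exp(-beta * w) for w in samples) / n
dF_estimated = -math.log(avg_exp) / beta  # Jarzynski estimator of dF
mean_W = sum(samples) / n

print(dF_estimated, dF_exact)   # the estimator converges to the exact value
print(mean_W >= dF_estimated)   # second-law inequality <W> >= dF
```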

Second law in small systems

As system size shrinks toward the nanoscale, thermal fluctuations become comparable to the average energy flows. Molecular motors, RNA folding, and colloidal particles all operate in this regime. Stochastic thermodynamics extends classical thermodynamics to handle these cases, defining entropy production along individual fluctuating trajectories rather than only for ensemble averages.

Entropy production

Irreversible processes

Every irreversible process generates entropy. The total entropy production $\Delta S_{\text{tot}} = \Delta S_{\text{sys}} + \Delta S_{\text{surr}} > 0$ for any real process. Sources of irreversibility include friction, heat transfer across finite temperature differences, chemical reactions proceeding away from equilibrium, and viscous fluid flow. The entropy production rate $\dot{S}$ is a central quantity in non-equilibrium thermodynamics.

Entropy generation minimization

Engineering design often aims to minimize entropy production, since entropy generation corresponds directly to lost work (wasted potential for doing something useful). This principle is applied in heat exchanger design, distillation column optimization, and power plant engineering. In practice, minimizing entropy production must be balanced against cost, size, and other constraints.

Steady-state systems

A steady-state system has constant macroscopic properties even though energy and matter continuously flow through it. Living organisms are a classic example: they maintain low internal entropy by exporting entropy to their surroundings at a constant rate. In steady state, the entropy production rate is constant, and analyzing it reveals how efficiently the system dissipates the driving gradients.

Second law in non-equilibrium systems

Local equilibrium assumption

Many non-equilibrium theories assume that each small volume element of a system is approximately in local thermodynamic equilibrium, so that temperature, pressure, and chemical potential remain well-defined locally. This assumption works well when gradients are not too steep, meaning the system is not driven too far from equilibrium.

Onsager reciprocal relations

Near equilibrium, thermodynamic fluxes $J_i$ (such as heat flux or particle flux) are linearly related to thermodynamic forces $X_j$ (such as temperature gradients or chemical potential gradients):

$$J_i = \sum_j L_{ij} X_j$$

Onsager showed that the matrix of transport coefficients satisfies $L_{ij} = L_{ji}$. This symmetry has measurable consequences. For example, in thermoelectric materials, the coefficient linking heat flux to an electric field equals the coefficient linking electric current to a temperature gradient (the Peltier and Seebeck effects).
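Near-equilibrium transport can be sketched in a few lines: build a symmetric coefficient matrix, apply it to a force vector, and check that the resulting entropy production $\sigma = \sum_i J_i X_i$ is non-negative (the coefficients and forces below are illustrative numbers, not material data):

```python
# Linear-response sketch: fluxes J = L X with a symmetric Onsager matrix.
L = [[2.0, 0.3],
     [0.3, 1.0]]              # reciprocity: L[0][1] == L[1][0]
X = [0.5, -0.2]               # e.g. a thermal force and a chemical force

J = [sum(L[i][j] * X[j] for j in range(2)) for i in range(2)]

# Entropy production rate in linear response: sigma = sum_i J_i X_i >= 0
sigma = sum(J[i] * X[i] for i in range(2))
print(J, sigma)
```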

Far-from-equilibrium phenomena

Systems driven far from equilibrium can exhibit behavior with no equilibrium analog: self-organized patterns (Bénard convection cells), oscillating chemical reactions (Belousov-Zhabotinsky reaction), and turbulent flows. These phenomena involve nonlinear dynamics and often require frameworks beyond linear irreversible thermodynamics, such as those developed by Prigogine and others.