Thermodynamic potentials are functions that encode all the thermodynamic information about a system in a compact form. By choosing the right potential for your constraints (constant temperature? constant pressure?), you can directly read off equilibrium conditions, predict spontaneous processes, and connect microscopic statistical mechanics to macroscopic measurements.
The four main potentials are internal energy ($U$), enthalpy ($H$), Helmholtz free energy ($F$), and Gibbs free energy ($G$). Each one is most naturally expressed in terms of specific variables, and the mathematical machinery of Legendre transforms and Maxwell relations lets you move between them systematically.
Fundamental thermodynamic potentials
Each potential is tailored to a particular set of experimental constraints. The "natural variables" of a potential are the independent variables in terms of which the potential's total differential takes its simplest form. When you hold those natural variables fixed, the corresponding potential is minimized at equilibrium.
Internal energy
The internal energy $U$ represents the total energy contained within a thermodynamic system, including kinetic and potential energies of all particles. It's the most fundamental potential because the First Law directly governs it:

$$dU = T\,dS - P\,dV + \mu\,dN$$

The natural variables are $S$, $V$, and $N$. This means $U$ is most useful when you control entropy and volume, which isn't always experimentally convenient. That's exactly why we introduce the other potentials.
Enthalpy
Enthalpy, defined as $H = U + PV$, is what you want for constant-pressure processes (which is most bench-top chemistry and many engineering applications). Its differential is:

$$dH = T\,dS + V\,dP$$

Natural variables: $S$ and $P$. Changes in enthalpy directly give the heat absorbed or released at constant pressure ($\Delta H = q_P$), which is why calorimetry data are usually reported as enthalpy changes.
Helmholtz free energy
The Helmholtz free energy, defined as $F = U - TS$, tells you the maximum useful work extractable from a closed system at constant temperature. Its differential is:

$$dF = -S\,dT - P\,dV$$

Natural variables: $T$ and $V$. At constant $T$ and $V$, equilibrium corresponds to the minimum of $F$. This is the potential that connects most directly to the canonical partition function (more on that below).
Gibbs free energy
The Gibbs free energy, defined as $G = H - TS = U + PV - TS$, is the workhorse potential for constant temperature and pressure, which covers most laboratory and biological conditions. Its differential is:

$$dG = -S\,dT + V\,dP$$

Natural variables: $T$ and $P$. At constant $T$ and $P$, equilibrium minimizes $G$. Phase coexistence, chemical equilibrium, and reaction spontaneity are all most naturally described through $G$.
Mathematical formulations
Legendre transformations
Legendre transforms are the systematic way to switch between potentials. The idea: you trade an extensive natural variable for its conjugate intensive variable while preserving all thermodynamic information.
The recipe is straightforward. To replace a natural variable $x$ with its conjugate $y = \partial \Phi / \partial x$, you form the new potential:

$$\Phi' = \Phi - xy$$

For example, going from $U$ to $F$: the conjugate of $S$ is $T = (\partial U / \partial S)_V$, so $F = U - TS$. Each of the four potentials is related to the others by one or two Legendre transforms, forming a square (sometimes called the "thermodynamic square") that's worth memorizing.
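The recipe can be sketched symbolically. The example below uses a toy convex potential $U(S) = cS^2$ (a hypothetical form chosen for simplicity, not a real equation of state) to show the mechanics: compute the conjugate variable, invert, and re-express the transformed potential in the new variable.

```python
import sympy as sp

# Toy potential to illustrate the Legendre-transform recipe: U(S) = c*S**2
# (a hypothetical convex energy, not a real equation of state).
S, T, c = sp.symbols('S T c', positive=True)
U = c * S**2

# Conjugate variable: T = dU/dS
T_of_S = sp.diff(U, S)                        # 2*c*S
S_of_T = sp.solve(sp.Eq(T, T_of_S), S)[0]     # invert: S = T/(2*c)

# Legendre transform: F = U - T*S, re-expressed in the new variable T
F = sp.simplify((U - T_of_S * S).subs(S, S_of_T))
print(F)  # -T**2/(4*c)
```

Note that the transform is involutive: applying the same recipe to $F(T)$ with conjugate $-S = \partial F/\partial T$ recovers $U(S)$, which is what "preserving all thermodynamic information" means in practice.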
Partial derivative relationships
From the differential $dU = T\,dS - P\,dV$ (holding $N$ fixed), you can read off:

$$T = \left(\frac{\partial U}{\partial S}\right)_V, \qquad P = -\left(\frac{\partial U}{\partial V}\right)_S$$
Analogous relations hold for every potential from its own differential. These are not just formal identities; they're how you extract measurable quantities from a known potential. If someone hands you $G(T, P)$ for a substance, you get entropy from $S = -(\partial G / \partial T)_P$ and volume from $V = (\partial G / \partial P)_T$.
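This extraction can be carried out symbolically. The sketch below assumes the ideal-gas form $G(T,P) = G_0(T) + nRT\ln(P/P_0)$, where $G_0$ and $P_0$ are hypothetical reference quantities; differentiating with respect to $P$ recovers the ideal gas law.

```python
import sympy as sp

# Sketch: recover S and V from a Gibbs free energy by differentiation.
# Assumed ideal-gas form G(T, P) = G0(T) + n*R*T*ln(P/P0), where G0 and
# P0 are hypothetical reference quantities.
T, P, n, R, P0 = sp.symbols('T P n R P0', positive=True)
G0 = sp.Function('G0')(T)
G = G0 + n * R * T * sp.log(P / P0)

V = sp.diff(G, P)       # V = (dG/dP)_T
S = -sp.diff(G, T)      # S = -(dG/dT)_P

print(sp.simplify(V))   # n*R*T/P  -- the ideal gas law, recovered
```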
Maxwell relations
Because mixed partial derivatives commute ($\partial^2 \Phi / \partial x \, \partial y = \partial^2 \Phi / \partial y \, \partial x$), each potential generates an equality between seemingly unrelated derivatives. The four fundamental Maxwell relations are:
- From $U$: $\left(\frac{\partial T}{\partial V}\right)_S = -\left(\frac{\partial P}{\partial S}\right)_V$
- From $H$: $\left(\frac{\partial T}{\partial P}\right)_S = \left(\frac{\partial V}{\partial S}\right)_P$
- From $F$: $\left(\frac{\partial S}{\partial V}\right)_T = \left(\frac{\partial P}{\partial T}\right)_V$
- From $G$: $\left(\frac{\partial S}{\partial P}\right)_T = -\left(\frac{\partial V}{\partial T}\right)_P$
The practical payoff: Maxwell relations let you replace hard-to-measure derivatives (like how entropy changes with volume) with easy-to-measure ones (like how pressure changes with temperature). The third relation above, for instance, is what you'd use to compute the entropy change of a gas during isothermal expansion from an equation of state alone.
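As a numeric sketch of that shortcut (illustrative values for $n$, $V_1$, $V_2$): for an ideal gas the Maxwell relation gives $(\partial P/\partial T)_V = nR/V$, so the isothermal entropy change is $\Delta S = \int_{V_1}^{V_2} (nR/V)\,dV = nR\ln(V_2/V_1)$, computable from the equation of state alone.

```python
import math

# Entropy change of an ideal gas during isothermal expansion, obtained
# from the equation of state via the Maxwell relation (dS/dV)_T = (dP/dT)_V.
R = 8.314               # J/(mol K)
n = 1.0                 # mol (illustrative)
V1, V2 = 0.010, 0.020   # m^3, doubling the volume (illustrative)

# Crude midpoint-rule integration of (dP/dT)_V = n*R/V over the expansion
steps = 100_000
dV = (V2 - V1) / steps
delta_S = sum(n * R / (V1 + (i + 0.5) * dV) * dV for i in range(steps))

print(delta_S)                    # ~ 5.76 J/K
print(n * R * math.log(V2 / V1))  # closed form, same value
```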
Properties and applications

Equilibrium conditions
Thermodynamic equilibrium means the system's macroscopic properties are stationary. The appropriate potential reaches an extremum (minimum for free energies, maximum for entropy in isolated systems) under the given constraints:
- Thermal equilibrium: uniform $T$ throughout the system
- Mechanical equilibrium: uniform $P$ throughout the system
- Chemical equilibrium: uniform $\mu_i$ for each species $i$; no net reactions or mass transfer
Which potential you minimize depends on what's held fixed. Constant $T$ and $V$? Minimize $F$. Constant $T$ and $P$? Minimize $G$. Picking the wrong potential for your constraints makes the analysis much harder than it needs to be.
Spontaneity criteria
The Second Law, expressed through the potentials, gives clean criteria for whether a process will proceed spontaneously:
- Isolated system (constant $U$, $V$, $N$): $dS \geq 0$
- Constant $T$ and $V$: $dF \leq 0$
- Constant $T$ and $P$: $dG \leq 0$
Equality holds for reversible processes. A system at equilibrium satisfies $dG = 0$ (at constant $T$, $P$), which is the starting point for deriving the equilibrium constant expression in chemical thermodynamics.
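The endpoint of that derivation is the standard relation $\Delta G^\circ = -RT \ln K$. A quick numeric illustration (the $\Delta G^\circ$ inputs below are made-up values, not data for any real reaction):

```python
import math

# Equilibrium constant from the standard Gibbs free energy change,
# via Delta_G_standard = -R*T*ln(K).
R = 8.314    # J/(mol K)
T = 298.15   # K

def equilibrium_constant(delta_G_standard):
    """K from Delta_G_standard in J/mol (illustrative inputs only)."""
    return math.exp(-delta_G_standard / (R * T))

print(equilibrium_constant(0.0))        # 1.0: Delta_G = 0 means K = 1
print(equilibrium_constant(-10_000.0))  # > 1: spontaneous as written
print(equilibrium_constant(+10_000.0))  # < 1: reverse direction favored
```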
Phase transitions
Phase transitions are classified by the behavior of the potential's derivatives:
- First-order: first derivatives of $G$ (like $S = -(\partial G/\partial T)_P$ and $V = (\partial G/\partial P)_T$) are discontinuous. You see latent heat and a volume change (e.g., boiling water).
- Second-order (continuous): first derivatives are continuous, but second derivatives (like heat capacity and compressibility) diverge or jump. Examples include the superconducting transition and the ferromagnetic Curie point.
The Gibbs phase rule determines how many intensive variables you can independently vary while maintaining phase coexistence:

$$f = C - P + 2$$

where $C$ is the number of components and $P$ is the number of coexisting phases. For a single-component system at a triple point ($C = 1$, $P = 3$), $f = 0$: the temperature and pressure are completely fixed.
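The rule is simple enough to encode directly; the cases below walk the single-component column of a phase diagram:

```python
# Gibbs phase rule f = C - P + 2: degrees of freedom as a function of
# the number of components and coexisting phases.
def degrees_of_freedom(components, phases):
    return components - phases + 2

print(degrees_of_freedom(1, 1))  # 2: one phase, T and P both free
print(degrees_of_freedom(1, 2))  # 1: coexistence line (e.g. boiling curve)
print(degrees_of_freedom(1, 3))  # 0: triple point, T and P fully fixed
```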
Thermodynamic potential diagrams
Energy surfaces
You can visualize any potential as a surface over its natural variables. For example, $F(T, V)$ is a surface in three dimensions. The slope in the $T$-direction gives $-S$, and the slope in the $V$-direction gives $-P$.
These surfaces are useful because their curvature encodes stability information, and phase transitions show up as regions where the surface develops flat portions or kinks.
Stability conditions
A stable equilibrium corresponds to a local minimum of the appropriate potential. This requires the second derivatives to be positive:
- Thermal stability: $\left(\frac{\partial^2 U}{\partial S^2}\right)_V = \frac{T}{C_V} > 0$, which is equivalent to $C_V > 0$ (heat capacity must be positive)
- Mechanical stability: $\left(\frac{\partial^2 F}{\partial V^2}\right)_T = -\left(\frac{\partial P}{\partial V}\right)_T = \frac{1}{V \kappa_T} > 0$, which means $\kappa_T > 0$ (compressibility must be positive)
When these conditions are violated, the system is unstable and will spontaneously evolve toward a new state, often through a phase transition. The boundary where stability is marginally lost defines the spinodal curve.
Critical points
At a critical point, the distinction between two phases vanishes (e.g., liquid and gas become indistinguishable above the critical temperature). Mathematically, both the first and second derivatives of the chemical potential with respect to density vanish:

$$\left(\frac{\partial \mu}{\partial \rho}\right)_T = 0, \qquad \left(\frac{\partial^2 \mu}{\partial \rho^2}\right)_T = 0$$
Near critical points, fluctuations grow to macroscopic scales, susceptibilities diverge, and the system exhibits universal behavior characterized by critical exponents that depend only on dimensionality and symmetry, not on microscopic details. The liquid-gas critical point of water ($T_c \approx 647$ K, $P_c \approx 22.1$ MPa) and the Curie point of iron ($T_c \approx 1043$ K) are classic examples.
Connections to statistical mechanics
Partition function relationships
This is where thermodynamic potentials meet microscopic physics. The canonical partition function encodes all equilibrium thermodynamics of a system at fixed $N$, $V$, $T$:

$$Z = \sum_i e^{-\beta E_i}, \qquad \beta = \frac{1}{k_B T}$$
- Helmholtz free energy: $F = -k_B T \ln Z$
- Internal energy: $U = -\frac{\partial \ln Z}{\partial \beta}$
- Entropy: $S = \frac{U - F}{T} = k_B \ln Z + \frac{U}{T}$
Once you have $Z$, every thermodynamic quantity follows by differentiation. This is why computing partition functions is the central task of statistical mechanics.
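As a minimal worked example, take a hypothetical two-level system with energies $0$ and $\varepsilon$, so $Z = 1 + e^{-\beta\varepsilon}$. The chain $Z \to F \to U \to S$ is a few lines:

```python
import math

# Full thermodynamics of a two-level system (energies 0 and eps) from
# its canonical partition function Z = 1 + exp(-beta*eps).
kB = 1.380649e-23   # J/K
eps = 1.0e-21       # J, illustrative level spacing
T = 300.0           # K
beta = 1.0 / (kB * T)

Z = 1.0 + math.exp(-beta * eps)
F = -kB * T * math.log(Z)             # Helmholtz free energy
U = eps * math.exp(-beta * eps) / Z   # U = -d(ln Z)/d(beta)
S = (U - F) / T                       # from F = U - T*S

print(F, U, S)
```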

Ensemble averages
Different statistical ensembles correspond to different sets of fixed variables, and each connects naturally to a specific potential:
- Microcanonical ensemble (fixed $E$, $V$, $N$) → entropy $S$
- Canonical ensemble (fixed $N$, $V$, $T$) → Helmholtz free energy $F$
- Grand canonical ensemble (fixed $\mu$, $V$, $T$) → grand potential $\Omega$
The ergodic hypothesis justifies equating time averages (what you measure in an experiment) with ensemble averages (what you calculate in stat mech). In the thermodynamic limit ($N \to \infty$), all ensembles give equivalent results for bulk properties.
Fluctuations and response functions
Statistical mechanics naturally predicts fluctuations around equilibrium, and these fluctuations are directly tied to thermodynamic response functions:
- Energy fluctuations in the canonical ensemble: $\langle (\Delta E)^2 \rangle = k_B T^2 C_V$
- Particle number fluctuations in the grand canonical ensemble: $\langle (\Delta N)^2 \rangle = k_B T \left(\frac{\partial \langle N \rangle}{\partial \mu}\right)_{T,V}$
Key response functions include:
- Heat capacity: $C_V = \left(\frac{\partial U}{\partial T}\right)_V$
- Isothermal compressibility: $\kappa_T = -\frac{1}{V}\left(\frac{\partial V}{\partial P}\right)_T$
The fluctuation-dissipation theorem formalizes this connection: equilibrium fluctuations determine how the system responds to small perturbations. Near critical points, fluctuations diverge, which is why response functions like $C_V$ and $\kappa_T$ blow up there.
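The energy-fluctuation identity can be checked numerically on a hypothetical two-level system (levels $0$ and $\varepsilon$), where both sides are computable: the variance comes straight from the Boltzmann distribution, and $C_V$ from differentiating $U(T)$.

```python
import math

# Numeric check of <(dE)^2> = kB*T^2*C_V for a two-level system.
kB = 1.380649e-23   # J/K
eps = 1.0e-21       # J, illustrative level spacing
T = 300.0           # K
beta = 1.0 / (kB * T)

Z = 1.0 + math.exp(-beta * eps)
p1 = math.exp(-beta * eps) / Z   # occupation of the upper level
U = eps * p1

# Energy variance directly from the Boltzmann distribution
var_E = (0.0 - U)**2 * (1.0 - p1) + (eps - U)**2 * p1

# Heat capacity by central-difference differentiation of U(T)
def energy(temp):
    b = 1.0 / (kB * temp)
    return eps * math.exp(-b * eps) / (1.0 + math.exp(-b * eps))

dT = 1e-3
C_V = (energy(T + dT) - energy(T - dT)) / (2 * dT)

print(var_E, kB * T**2 * C_V)   # the two sides agree
```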
Experimental relevance
Measurable quantities
Thermodynamic potentials themselves aren't directly measurable. What you measure in the lab are quantities like $T$, $P$, $V$, heat capacity $C_P$, and compressibility $\kappa_T$. From these, you reconstruct the potentials through integration.
In practice, experiments focus on changes in potentials ($\Delta H$, $\Delta G$) rather than absolute values, since the absolute value depends on an arbitrary reference. The Third Law provides a natural reference point by setting $S = 0$ at $T = 0$ for perfect crystals.
Calorimetry techniques
Calorimetry measures heat transfer and energy changes directly. The main types:
- Bomb calorimeter: measures heat of combustion at constant volume, giving $\Delta U$ directly
- Flow calorimeter: measures enthalpy changes in flowing systems at constant pressure
- Differential scanning calorimeter (DSC): tracks heat capacity as a function of temperature, revealing phase transitions as peaks or discontinuities in $C_P$
DSC is particularly powerful for mapping out phase diagrams and detecting subtle transitions that might not involve visible changes in the sample.
Equation of state derivations
An equation of state relates , , and for a substance. Combined with heat capacity data, it lets you compute all thermodynamic potentials.
- Ideal gas: $PV = nRT$ (no interactions, works well at low density)
- Van der Waals: $\left(P + \frac{an^2}{V^2}\right)(V - nb) = nRT$ (adds attractive interactions via $a$ and finite molecular volume via $b$)
The van der Waals equation, despite its simplicity, qualitatively captures liquid-gas phase transitions and predicts a critical point at $T_c = \frac{8a}{27Rb}$, $P_c = \frac{a}{27b^2}$. More sophisticated equations (Redlich-Kwong, Peng-Robinson) improve quantitative accuracy for real gases.
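A quick check of those critical-point formulas using commonly tabulated van der Waals parameters for CO2 (the $a$ and $b$ values below are approximate literature numbers):

```python
# Critical point predicted by the van der Waals equation:
# T_c = 8a/(27*R*b), P_c = a/(27*b^2). Approximate literature
# parameters for CO2.
R = 0.083145   # L bar / (mol K)
a = 3.640      # L^2 bar / mol^2 (CO2)
b = 0.04267    # L / mol (CO2)

T_c = 8 * a / (27 * R * b)
P_c = a / (27 * b**2)

print(T_c)   # ~304 K  (experimental value for CO2: ~304 K)
print(P_c)   # ~74 bar (experimental value for CO2: ~74 bar)
```

The close agreement for CO2 is somewhat fortunate; for strongly polar or associating fluids the van der Waals predictions are much rougher.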
Advanced concepts
Massieu functions
Massieu functions are entropy-based alternatives to the usual energy-based potentials. They're defined by Legendre-transforming the entropy rather than the energy:
- Massieu function: $J = -\frac{F}{T} = S - \frac{U}{T}$
- Planck potential: $Y = -\frac{G}{T} = S - \frac{U}{T} - \frac{PV}{T}$
These are particularly natural in statistical mechanics because the canonical partition function gives $J$ directly: $J = k_B \ln Z$. They also simplify expressions involving temperature derivatives, which is why the Gibbs-Helmholtz equation ($\left(\frac{\partial (G/T)}{\partial T}\right)_P = -\frac{H}{T^2}$) takes such a clean form.
Grand potential
For open systems where particles can enter and leave, the grand potential is the right choice:

$$\Omega = F - \mu N$$

Natural variables: $T$, $V$, and $\mu$. Its differential is:

$$d\Omega = -S\,dT - P\,dV - N\,d\mu$$

For a homogeneous system, Euler's theorem gives the simple result $\Omega = -PV$. The grand potential connects to the grand canonical partition function $\Xi$ via $\Omega = -k_B T \ln \Xi$, making it essential for quantum gases, adsorption problems, and any situation where particle number fluctuates.
Landau theory of phase transitions
Landau theory provides a phenomenological framework for continuous (second-order) phase transitions. The central idea: expand the free energy as a power series in an order parameter $\eta$ (which is zero in the disordered phase and nonzero in the ordered phase):

$$F(\eta, T) = F_0(T) + a(T)\,\eta^2 + b\,\eta^4 + \cdots$$

The coefficient $a(T)$ changes sign at the transition temperature $T_c$, typically as $a(T) = a_0 (T - T_c)$. Above $T_c$, the minimum is at $\eta = 0$. Below $T_c$, the minimum shifts to $\eta = \pm\sqrt{-a/2b}$, signaling the phase transition.
Landau theory predicts mean-field critical exponents (e.g., $\beta = 1/2$ for the order parameter) that are exact in high dimensions but only approximate in 2D and 3D, where fluctuations (ignored by Landau theory) become important. Despite this limitation, it provides the conceptual foundation for understanding symmetry breaking across physics, from ferromagnets to superconductors to the Higgs mechanism.
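The minimization behind these statements is a one-liner per phase. The sketch below uses hypothetical coefficients ($a_0$, $b$, $T_c$) and shows both the symmetry breaking at $T_c$ and the square-root growth of the order parameter (the mean-field exponent $\beta = 1/2$).

```python
import math

# Landau free energy f(eta) = a0*(T - Tc)*eta^2 + b*eta^4 with
# hypothetical coefficients. Minimizing over eta gives eta = 0 above Tc
# and eta = sqrt(a0*(Tc - T)/(2*b)) below.
a0, b, Tc = 1.0, 1.0, 100.0

def order_parameter(T):
    a = a0 * (T - Tc)
    if a >= 0:
        return 0.0                    # disordered phase: minimum at eta = 0
    return math.sqrt(-a / (2 * b))    # ordered phase: broken symmetry

print(order_parameter(110.0))  # 0.0 above Tc
print(order_parameter(96.0))   # sqrt(4/2) ~ 1.414
print(order_parameter(99.0))   # sqrt(1/2) ~ 0.707: eta ~ (Tc - T)^(1/2)
```

Quadrupling the distance below $T_c$ (from 1 K to 4 K) doubles the order parameter, which is exactly the $\eta \sim (T_c - T)^{1/2}$ scaling quoted above.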