Physical Chemistry I

Entropy Formulas


Why This Matters

Entropy answers the fundamental question in physical chemistry: why do things happen? These formulas connect microscopic particle behavior to macroscopic thermodynamic predictions, and they appear throughout thermodynamics, statistical mechanics, and chemical equilibrium. You need them for predicting reaction spontaneity, understanding phase transitions, and solving nearly every problem involving the Second Law.

Rather than memorizing blindly, focus on what each formula reveals. The Boltzmann formula connects particle arrangements to disorder. The Clausius definition links heat flow to entropy. The Gibbs relation determines whether reactions actually occur. When you hit an entropy problem on an exam, ask yourself: Am I dealing with microscopic statistics, macroscopic heat transfer, or a spontaneity prediction? That question points you to the right formula.


Fundamental Laws and Definitions

These formulas establish what entropy is and the universal constraints governing it. Everything else builds on them.

Second Law of Thermodynamics

  • $\Delta S_{\text{universe}} \geq 0$: the total entropy of the universe never decreases.
  • The equality holds only for reversible (ideal) processes. All real, spontaneous processes are irreversible and produce a strict increase.
  • This law encodes the directionality of nature: heat flows from hot to cold, gases expand into vacuums, and none of these processes spontaneously reverse.

Third Law of Thermodynamics

  • $S \rightarrow 0$ as $T \rightarrow 0\ \text{K}$: a perfect crystal at absolute zero has exactly zero entropy.
  • This gives you an absolute reference point for entropy, which is something you don't have for enthalpy or internal energy. You can look up absolute molar entropies ($S^\circ$) in tables precisely because of this law.
  • The unattainability principle follows: reaching absolute zero would require an infinite number of steps, so it's physically impossible.

Clausius Entropy Definition

  • $dS = \frac{\delta Q_{\text{rev}}}{T}$: entropy change equals reversible heat divided by temperature.
  • The subscript "rev" is critical. You must calculate along a reversible path, even if the actual process is irreversible. Since $S$ is a state function, the result still applies to the real process.
  • Notice the $T$ in the denominator: the same amount of heat transfer produces a smaller $\Delta S$ at higher temperature. Adding 1 kJ of heat at 100 K changes entropy far more than adding 1 kJ at 1000 K.
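These two points can be checked numerically. The following is a minimal Python sketch (the function name and the monatomic $C_V = \tfrac{3}{2}R$ in the example are my own choices for illustration), integrating the Clausius definition with $\delta Q_{\text{rev}} = nC\,dT$, which gives $\Delta S = nC\ln(T_2/T_1)$ for a constant heat capacity:

```python
import math

def delta_S_heating(n, C, T1, T2):
    """ΔS from integrating dS = δQ_rev/T with δQ_rev = n*C*dT:
    ΔS = n*C*ln(T2/T1), assuming C (J/(mol K)) is constant over [T1, T2]."""
    return n * C * math.log(T2 / T1)

# Doubling T of 1 mol of a monatomic ideal gas at constant volume (Cv = 3R/2):
print(delta_S_heating(1, 1.5 * 8.314, 300, 600))  # ≈ 8.64 J/K

# For isothermal transfer, ΔS = Q_rev/T: the same 1 kJ of heat
# changes entropy ten times more at 100 K than at 1000 K.
print(1000 / 100, 1000 / 1000)  # 10.0 J/K vs 1.0 J/K
```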

Compare: The Second Law governs processes (entropy of the universe increases), while the Third Law establishes a reference point (absolute zero baseline). Exam problems often require both: use the Third Law to get absolute entropy values, and the Second Law to assess spontaneity.


Statistical Mechanics Foundations

These formulas reveal entropy's microscopic meaning, connecting probability and particle arrangements to thermodynamic quantities.

Boltzmann's Entropy Formula

  • $S = k_B \ln W$: entropy equals the Boltzmann constant times the natural log of the number of microstates.
  • Microstates ($W$) count the distinguishable arrangements of particles that all produce the same macroscopic state (same $U$, $V$, $N$).
  • This is the bridge equation between statistical mechanics and classical thermodynamics: entropy measures how many microscopic configurations are compatible with what you observe macroscopically.
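A one-line implementation makes the scaling concrete (a Python sketch; the function name is my own, and $k_B$ is the exact SI value):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(W):
    """S = k_B ln W for W equally probable microstates."""
    return K_B * math.log(W)

# Doubling the number of microstates adds exactly k_B ln 2 to the entropy:
assert math.isclose(boltzmann_entropy(8) - boltzmann_entropy(4),
                    K_B * math.log(2))

# A single microstate (perfect crystal at 0 K, Third Law) gives S = 0:
assert boltzmann_entropy(1) == 0.0
```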

Gibbs Entropy Formula

  • $S = -k_B \sum_i p_i \ln p_i$: entropy from the full probability distribution over microstates.
  • $p_i$ is the probability of the system being in microstate $i$, and the sum runs over all accessible microstates.
  • Maximum entropy occurs when all microstates are equally probable (a uniform distribution). In that special case, $p_i = 1/W$ for all $i$, and the Gibbs formula reduces exactly to $S = k_B \ln W$.
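That reduction, and the fact that the uniform distribution maximizes entropy, can both be verified directly (a Python sketch; names and the example distribution are mine):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """S = -k_B Σ p_i ln p_i; terms with p_i = 0 contribute nothing."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

W = 4
uniform = [1 / W] * W
# Uniform distribution: the Gibbs formula reduces exactly to S = k_B ln W.
assert math.isclose(gibbs_entropy(uniform), K_B * math.log(W))

# Any non-uniform distribution over the same W microstates has lower entropy:
assert gibbs_entropy([0.7, 0.1, 0.1, 0.1]) < K_B * math.log(W)
```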

Compare: Boltzmann's formula assumes all microstates are equally probable, which is valid for an isolated system at equilibrium (the microcanonical ensemble). The Gibbs formula handles any probability distribution, making it more general. If a problem specifies non-uniform probabilities or a canonical ensemble, reach for the Gibbs version.


Process-Specific Entropy Changes

These are the workhorses of problem-solving. Each formula calculates $\Delta S$ for a specific type of physical or chemical process.

Entropy Change for an Ideal Gas

ฮ”S=nRlnโกV2V1+nCVlnโกT2T1\Delta S = nR \ln\frac{V_2}{V_1} + nC_V \ln\frac{T_2}{T_1}

This accounts for both volume and temperature changes simultaneously.

  • The volume term ($nR \ln \frac{V_2}{V_1}$) reflects the increase in spatial arrangements available to the molecules.
  • The temperature term ($nC_V \ln \frac{T_2}{T_1}$) reflects the increase in accessible energy levels at higher temperature.
  • Even though this is derived by integrating $dS$ along a reversible path, the result is path-independent because entropy is a state function. You can use it for irreversible expansions too.

An equivalent form using pressure is sometimes more convenient:

ฮ”S=nRlnโกV2V1+nCPlnโกT2T1โˆ’nRlnโกP2P1\Delta S = nR \ln\frac{V_2}{V_1} + nC_P \ln\frac{T_2}{T_1} - nR\ln\frac{P_2}{P_1}

though this isn't independent of the first form; it follows from the ideal gas law.

For special cases: in an isothermal process ($T_1 = T_2$), the temperature term drops out. In an isochoric process ($V_1 = V_2$), the volume term drops out.
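A small calculator covering both special cases (a Python sketch; the function name and the monatomic $C_V = \tfrac{3}{2}R$ are my own assumptions for the examples):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def delta_S_ideal_gas(n, Cv, T1, T2, V1, V2):
    """ΔS = nR ln(V2/V1) + n Cv ln(T2/T1) for an ideal gas,
    assuming the molar heat capacity Cv is constant over the range."""
    return n * R * math.log(V2 / V1) + n * Cv * math.log(T2 / T1)

# Isothermal doubling of volume, 1 mol: only the volume term survives,
# ΔS = R ln 2 ≈ 5.76 J/K (same for a reversible or a free expansion).
print(delta_S_ideal_gas(1, 1.5 * R, 300, 300, 1.0, 2.0))

# Isochoric doubling of temperature (monatomic, Cv = 3R/2): ≈ 8.64 J/K
print(delta_S_ideal_gas(1, 1.5 * R, 300, 600, 1.0, 1.0))
```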

Entropy of Mixing (Ideal Gases)

$$\Delta S_{\text{mix}} = -nR \sum_i x_i \ln x_i$$

  • $x_i$ is the mole fraction of component $i$, and $n$ is the total number of moles. Since $x_i < 1$ for every component in a mixture, $\ln x_i < 0$, and the negative sign makes $\Delta S_{\text{mix}}$ always positive.
  • For ideal gases, mixing has $\Delta H_{\text{mix}} = 0$, so the process is driven entirely by this entropy increase.
  • Physically, each gas expands into a larger effective volume, which is why this formula has the same $nR \ln$ structure as the ideal gas expansion formula.
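For a concrete number, here is a Python sketch of the mixing formula (function name mine; the equimolar result $R\ln 2$ follows directly from the equation):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def delta_S_mix(moles):
    """ΔS_mix = -nR Σ x_i ln x_i for ideal gases,
    given the moles of each component; n is the total."""
    n = sum(moles)
    return -n * R * sum((ni / n) * math.log(ni / n) for ni in moles)

# Equimolar binary mixture, 1 mol total: ΔS_mix = R ln 2 ≈ 5.76 J/K
print(delta_S_mix([0.5, 0.5]))

# ΔS_mix is positive for any genuine mixture, however lopsided:
assert delta_S_mix([0.9, 0.05, 0.05]) > 0
```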

Entropy Change for Phase Transitions

$$\Delta S = \frac{\Delta H_{\text{trs}}}{T_{\text{trs}}}$$

  • This applies at the equilibrium transition temperature and constant pressure, where the two phases coexist and the process is reversible.
  • It works for melting, vaporization, sublimation, or any first-order phase transition.
  • Vaporization produces a much larger $\Delta S$ than fusion because molecules gain enormously more freedom going from liquid to gas. For water: $\Delta S_{\text{vap}} \approx 109\ \text{J mol}^{-1}\text{K}^{-1}$ vs. $\Delta S_{\text{fus}} \approx 22\ \text{J mol}^{-1}\text{K}^{-1}$.
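Plugging in standard data for water reproduces those values (a Python sketch; the enthalpies $\Delta H_{\text{vap}} \approx 40.7$ kJ/mol at 373 K and $\Delta H_{\text{fus}} \approx 6.01$ kJ/mol at 273 K are approximate textbook figures):

```python
def delta_S_transition(delta_H_trs, T_trs):
    """ΔS = ΔH_trs / T_trs at the equilibrium transition temperature
    (delta_H_trs in J/mol, T_trs in K)."""
    return delta_H_trs / T_trs

# Water, approximate textbook values:
print(delta_S_transition(40_700, 373))  # vaporization: ≈ 109 J/(mol K)
print(delta_S_transition(6_010, 273))   # fusion: ≈ 22 J/(mol K)
```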

Compare: The ideal gas expansion formula and the mixing formula both contain $nR \ln$ terms because both processes increase the number of spatial configurations available to molecules. Expansion gives each molecule more volume; mixing gives each molecule access to the total volume. Same statistical origin, different physical situations.


Thermodynamic Relations and Spontaneity

These formulas connect entropy to other thermodynamic quantities and predict whether processes actually occur.

Gibbs Free Energy Relation

$$\Delta G = \Delta H - T\Delta S$$

This is the master equation for spontaneity at constant $T$ and $P$.

  • $\Delta G < 0$: spontaneous. $\Delta G = 0$: equilibrium. $\Delta G > 0$: non-spontaneous.
  • The $T\Delta S$ term means that temperature controls the competition between enthalpy and entropy. At high $T$, the entropy term dominates; at low $T$, the enthalpy term dominates.
  • A reaction with $\Delta H > 0$ and $\Delta S > 0$ (endothermic, entropy-increasing) becomes spontaneous above $T = \Delta H / \Delta S$. A reaction with $\Delta H < 0$ and $\Delta S < 0$ becomes non-spontaneous above that same crossover temperature.
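The crossover logic can be checked numerically (a Python sketch; the $\Delta H$ and $\Delta S$ values are hypothetical):

```python
def delta_G(dH, dS, T):
    """ΔG = ΔH - TΔS (dH in J/mol, dS in J/(mol K), T in K)."""
    return dH - T * dS

def crossover_T(dH, dS):
    """Temperature at which ΔG changes sign (valid when dH and dS share a sign)."""
    return dH / dS

# Hypothetical endothermic, entropy-increasing reaction:
dH, dS = 50_000.0, 120.0        # +50 kJ/mol, +120 J/(mol K)
Tc = crossover_T(dH, dS)        # ≈ 417 K
assert delta_G(dH, dS, Tc + 50) < 0   # spontaneous above the crossover
assert delta_G(dH, dS, Tc - 50) > 0   # non-spontaneous below it
```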

Maxwell Relation (Entropy-Volume)

$$\left(\frac{\partial S}{\partial V}\right)_T = \left(\frac{\partial P}{\partial T}\right)_V$$

  • This connects entropy changes (which you can't measure directly) to $P$-$V$-$T$ data (which you can measure).
  • It's derived from the equality of mixed second partial derivatives of the Helmholtz free energy $A$: since $dA = -S\,dT - P\,dV$, taking $\partial^2 A / \partial T\,\partial V$ in both orders gives this relation.
  • Practical use: if you have an equation of state for a real gas, you can compute $(\partial P / \partial T)_V$ directly and use it to find how entropy changes with volume.
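As an illustration of that practical use, here is a Python sketch applying the Maxwell relation to a van der Waals gas, $P = \frac{nRT}{V - nb} - \frac{an^2}{V^2}$ (the $a$, $b$ values are approximate constants for CO2, and the finite-difference approach is one simple way to get the derivative):

```python
R = 8.314  # gas constant, J/(mol K)

def P_vdw(n, T, V, a=0.364, b=4.27e-5):
    """van der Waals pressure; a in Pa m^6 mol^-2, b in m^3 mol^-1
    (approximate values for CO2)."""
    return n * R * T / (V - n * b) - a * n**2 / V**2

def dS_dV_at_T(n, T, V, dT=1e-3):
    """(∂S/∂V)_T = (∂P/∂T)_V, estimated by a central finite difference."""
    return (P_vdw(n, T + dT, V) - P_vdw(n, T - dT, V)) / (2 * dT)

# For a vdW gas the derivative is exactly nR/(V - nb), independent of a,
# because only the first term depends on T:
n, T, V, b = 1.0, 300.0, 1e-3, 4.27e-5
assert abs(dS_dV_at_T(n, T, V) - n * R / (V - n * b)) < 1e-2
print(dS_dV_at_T(n, T, V))  # ≈ 8686 J/(K m^3): entropy rises with volume
```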

Compare: The Clausius definition tells you how to calculate $dS$ from heat transfer. The Gibbs relation uses $\Delta S$ (along with $\Delta H$) to predict spontaneity. Clausius tells you how to get the number; Gibbs tells you what the number means for whether a process occurs.


Quick Reference Table

| Concept | Key Formulas |
| --- | --- |
| Fundamental Laws | $\Delta S_{\text{universe}} \geq 0$; $S \rightarrow 0$ as $T \rightarrow 0$ |
| Entropy Definition (Classical) | $dS = \delta Q_{\text{rev}}/T$ |
| Entropy Definition (Statistical) | $S = k_B \ln W$; $S = -k_B \sum p_i \ln p_i$ |
| Ideal Gas Processes | $\Delta S = nR \ln(V_2/V_1) + nC_V \ln(T_2/T_1)$ |
| Mixing | $\Delta S_{\text{mix}} = -nR \sum x_i \ln x_i$ |
| Phase Transitions | $\Delta S = \Delta H_{\text{trs}}/T_{\text{trs}}$ |
| Spontaneity | $\Delta G = \Delta H - T\Delta S$ |
| Maxwell Relations | $(\partial S/\partial V)_T = (\partial P/\partial T)_V$ |

Self-Check Questions

  1. Which two entropy formulas both contain the term $nR \ln$, and what physical principle connects them?

  2. If you need to calculate the absolute entropy of a substance at 298 K, which law provides your reference point, and which definition would you integrate from 0 K?

  3. Under what conditions does the Gibbs entropy formula $S = -k_B \sum p_i \ln p_i$ reduce to the Boltzmann formula $S = k_B \ln W$?

  4. A reaction has $\Delta H > 0$ and $\Delta S > 0$. Using the Gibbs relation, at what temperature does this reaction become spontaneous? Write the expression for the crossover temperature.

  5. An ideal gas expands isothermally into a vacuum (free expansion). Which formula gives you $\Delta S$, and why does the Clausius definition $dS = \delta Q_{\text{rev}}/T$ require you to construct a hypothetical reversible path even though $Q = 0$ for the actual process?