Why This Matters
Entropy answers the fundamental question in physical chemistry: why do things happen? These formulas connect microscopic particle behavior to macroscopic thermodynamic predictions, and they appear throughout thermodynamics, statistical mechanics, and chemical equilibrium. You need them for predicting reaction spontaneity, understanding phase transitions, and solving nearly every problem involving the Second Law.
Rather than memorizing blindly, focus on what each formula reveals. The Boltzmann formula connects particle arrangements to disorder. The Clausius definition links heat flow to entropy. The Gibbs relation determines whether reactions actually occur. When you hit an entropy problem on an exam, ask yourself: Am I dealing with microscopic statistics, macroscopic heat transfer, or a spontaneity prediction? That question points you to the right formula.
Fundamental Laws and Definitions
These formulas establish what entropy is and the universal constraints governing it. Everything else builds on them.
Second Law of Thermodynamics
- ΔS_universe ≥ 0: the total entropy of the universe never decreases
- The equality (=) holds only for reversible (ideal) processes. All real, spontaneous processes are irreversible and produce a strict increase.
- This law encodes the directionality of nature: heat flows from hot to cold, gases expand into vacuums, and none of these processes spontaneously reverse.
Third Law of Thermodynamics
- S → 0 as T → 0 K: a perfect crystal at absolute zero has exactly zero entropy
- This gives you an absolute reference point for entropy, which is something you don't have for enthalpy or internal energy. You can look up absolute molar entropies (S°) in tables precisely because of this law.
- The unattainability principle follows: reaching absolute zero would require an infinite number of steps, so it's physically impossible.
Clausius Entropy Definition
- dS = δQ_rev / T: entropy change equals reversible heat divided by temperature
- The subscript "rev" is critical. You must calculate along a reversible path, even if the actual process is irreversible. Since S is a state function, the result still applies to the real process.
- Notice the T in the denominator: the same amount of heat transfer produces a smaller ΔS at higher temperature. Adding 1 kJ of heat at 100 K changes entropy far more than adding 1 kJ at 1000 K.
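For an isothermal transfer the definition integrates to ΔS = Q_rev/T, which makes the temperature dependence easy to check numerically. A minimal Python sketch (the function name is illustrative, not from any library):

```python
def delta_S_isothermal(q_rev_joules, temperature_kelvin):
    """Entropy change (J/K) for reversible heat transfer at constant T: dS = Q_rev / T."""
    return q_rev_joules / temperature_kelvin

# The same 1 kJ of heat produces a 10x larger entropy change at 100 K than at 1000 K:
print(delta_S_isothermal(1000.0, 100.0))   # 10.0 J/K
print(delta_S_isothermal(1000.0, 1000.0))  # 1.0 J/K
```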
Compare: The Second Law governs processes (entropy of the universe increases), while the Third Law establishes a reference point (absolute zero baseline). Exam problems often require both: use the Third Law to get absolute entropy values, and the Second Law to assess spontaneity.
Statistical Mechanics Foundations
These formulas reveal entropy's microscopic meaning, connecting probability and particle arrangements to thermodynamic quantities.
- S = k_B ln W: entropy equals the Boltzmann constant times the natural log of the number of microstates
- Microstates (W) are the number of distinguishable arrangements of particles that all produce the same macroscopic state (same U, V, N).
- This is the bridge equation between statistical mechanics and classical thermodynamics. It tells you that entropy is a measure of how many microscopic configurations are compatible with what you observe macroscopically.
- S = −k_B Σ_i p_i ln p_i: entropy from the full probability distribution over microstates
- p_i is the probability of the system being in microstate i, and the sum runs over all accessible microstates.
- Maximum entropy occurs when all microstates are equally probable (a uniform distribution). In that special case, p_i = 1/W for all i, and the Gibbs formula reduces exactly to S = k_B ln W.
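The reduction of the Gibbs formula to the Boltzmann formula for a uniform distribution can be verified directly. A short Python sketch (function names are illustrative):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def gibbs_entropy(probs):
    """S = -k_B * sum(p_i ln p_i), skipping zero-probability microstates."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

def boltzmann_entropy(W):
    """S = k_B ln W for W equally probable microstates."""
    return k_B * math.log(W)

W = 4
uniform = [1 / W] * W
print(math.isclose(gibbs_entropy(uniform), boltzmann_entropy(W)))  # True

# Any non-uniform distribution over the same 4 states has lower entropy:
skewed = [0.7, 0.1, 0.1, 0.1]
print(gibbs_entropy(skewed) < boltzmann_entropy(W))  # True
```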
Compare: Boltzmann's formula assumes all microstates are equally probable, which is valid for an isolated system at equilibrium (the microcanonical ensemble). The Gibbs formula handles any probability distribution, making it more general. If a problem specifies non-uniform probabilities or a canonical ensemble, reach for the Gibbs version.
Process-Specific Entropy Changes
These are the workhorses of problem-solving. Each formula calculates ΔS for a specific type of physical or chemical process.
Entropy Change for an Ideal Gas
ΔS = nR ln(V2/V1) + nC_V ln(T2/T1)
This accounts for both volume and temperature changes simultaneously.
- The volume term (nR ln(V2/V1)) reflects the increase in spatial arrangements available to the molecules.
- The temperature term (nC_V ln(T2/T1)) reflects the increase in accessible energy levels at higher temperature.
- Even though this is derived by integrating dS along a reversible path, the result is path-independent because entropy is a state function. You can use it for irreversible expansions too.
An equivalent form using pressure is sometimes more convenient:
ΔS = nC_P ln(T2/T1) − nR ln(P2/P1)
though this isn't independent of the first form; it follows from the ideal gas law and C_P = C_V + R.
For special cases: in an isothermal process (T1 = T2), the temperature term drops out. In an isochoric process (V1 = V2), the volume term drops out.
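Because both the volume-temperature form and the pressure-temperature form ΔS = nC_P ln(T2/T1) − nR ln(P2/P1) follow from the same state function, they must agree on any process. A Python sketch assuming a monatomic ideal gas (C_V = 3R/2, C_P = 5R/2):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def dS_ideal_gas(n, V1, V2, T1, T2, Cv):
    """Volume-temperature form: dS = nR ln(V2/V1) + n Cv ln(T2/T1)."""
    return n * R * math.log(V2 / V1) + n * Cv * math.log(T2 / T1)

def dS_ideal_gas_pressure(n, P1, P2, T1, T2, Cp):
    """Pressure-temperature form: dS = n Cp ln(T2/T1) - nR ln(P2/P1)."""
    return n * Cp * math.log(T2 / T1) - n * R * math.log(P2 / P1)

# 1 mol of monatomic ideal gas doubling its volume at constant T = 300 K:
n, T = 1.0, 300.0
V1, V2 = 0.010, 0.020                     # m^3
P1, P2 = n * R * T / V1, n * R * T / V2   # matching pressures from the ideal gas law
s_VT = dS_ideal_gas(n, V1, V2, T, T, 1.5 * R)
s_PT = dS_ideal_gas_pressure(n, P1, P2, T, T, 2.5 * R)
print(math.isclose(s_VT, s_PT))  # True: both give nR ln 2 ≈ 5.76 J/K
```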
Entropy of Mixing (Ideal Gases)
ΔS_mix = −nR Σ_i x_i ln x_i
- x_i is the mole fraction of component i, and n is the total number of moles. Since x_i < 1 for every component in a mixture, ln x_i < 0, and the negative sign makes ΔS_mix always positive.
- For ideal gases, mixing has ΔH_mix = 0, so the process is driven entirely by this entropy increase.
- Physically, each gas expands into a larger effective volume, which is why this formula has the same nR ln structure as the ideal gas expansion formula.
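The mixing formula is straightforward to evaluate from a list of mole amounts. A minimal Python sketch (function name is illustrative):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def dS_mix(moles):
    """dS_mix = -nR * sum(x_i ln x_i); n = total moles, x_i = mole fractions."""
    n = sum(moles)
    return -n * R * sum((m / n) * math.log(m / n) for m in moles if m > 0)

# Mixing 1 mol each of two ideal gases gives 2R ln 2:
print(round(dS_mix([1.0, 1.0]), 2))  # 11.53 J/K
```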
Entropy Change for Phase Transitions
ΔS = ΔH_trs / T_trs
- This applies at the equilibrium transition temperature and constant pressure, where the two phases coexist and the process is reversible.
- It works for melting, vaporization, sublimation, or any first-order phase transition.
- Vaporization produces a much larger ΔS than fusion because molecules gain enormously more freedom going from liquid to gas. For water: ΔS_vap ≈ 109 J mol⁻¹ K⁻¹ vs. ΔS_fus ≈ 22 J mol⁻¹ K⁻¹.
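The water values quoted above can be reproduced from the transition enthalpies. A Python sketch, assuming the commonly tabulated values ΔH_fus ≈ 6.01 kJ/mol at 273.15 K and ΔH_vap ≈ 40.7 kJ/mol at 373.15 K:

```python
def dS_transition(dH_joules_per_mol, T_trs_kelvin):
    """dS = dH_trs / T_trs at the equilibrium transition temperature."""
    return dH_joules_per_mol / T_trs_kelvin

# Water, using commonly tabulated transition enthalpies (assumed values):
print(round(dS_transition(6010, 273.15), 1))   # 22.0 J/(mol K), fusion
print(round(dS_transition(40700, 373.15), 1))  # 109.1 J/(mol K), vaporization
```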
Compare: The ideal gas expansion formula and the mixing formula both contain nR ln terms because both processes increase the number of spatial configurations available to molecules. Expansion gives each molecule more volume; mixing gives each molecule access to the total volume. Same statistical origin, different physical situations.
Thermodynamic Relations and Spontaneity
These formulas connect entropy to other thermodynamic quantities and predict whether processes actually occur.
Gibbs Free Energy Relation
ΔG = ΔH − TΔS
This is the master equation for spontaneity at constant T and P.
- ΔG < 0: spontaneous. ΔG = 0: equilibrium. ΔG > 0: non-spontaneous.
- The TΔS term means that temperature controls the competition between enthalpy and entropy. At high T, the entropy term dominates; at low T, the enthalpy term dominates.
- A reaction with ΔH > 0 and ΔS > 0 (endothermic, entropy-increasing) becomes spontaneous above T = ΔH/ΔS. A reaction with ΔH < 0 and ΔS < 0 becomes non-spontaneous above that same crossover temperature.
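The crossover logic can be sketched in a few lines of Python (the reaction values are made up for illustration):

```python
def is_spontaneous(dH, dS, T):
    """Spontaneous at constant T, P when dG = dH - T*dS < 0. Units: J, J/K, K."""
    return dH - T * dS < 0

def crossover_temperature(dH, dS):
    """T where dG = 0, i.e. T = dH/dS (meaningful when dH and dS share a sign)."""
    return dH / dS

# Hypothetical endothermic, entropy-increasing reaction:
# dH = +40 kJ/mol, dS = +100 J/(mol K)  ->  crossover at 400 K
dH, dS = 40_000.0, 100.0
print(crossover_temperature(dH, dS))   # 400.0
print(is_spontaneous(dH, dS, 300.0))   # False: below the crossover
print(is_spontaneous(dH, dS, 500.0))   # True: above the crossover
```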
Maxwell Relation (Entropy-Volume)
(∂S/∂V)_T = (∂P/∂T)_V
- This connects entropy changes (which you can't measure directly) to P-V-T data (which you can measure).
- It's derived from the equality of mixed second partial derivatives of the Helmholtz free energy A: since dA = −S dT − P dV, taking ∂²A/∂T∂V in both orders gives this relation.
- Practical use: if you have an equation of state for a real gas, you can compute (∂P/∂T)_V directly and use it to find how entropy changes with volume.
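For an ideal gas this recipe gives (∂P/∂T)_V = nR/V, which is easy to confirm with a finite-difference derivative. A Python sketch (function names are illustrative; a real-gas equation of state would slot in the same way):

```python
R = 8.314  # gas constant, J/(mol K)

def P_ideal(T, V, n=1.0):
    """Ideal gas equation of state: P = nRT/V (SI units)."""
    return n * R * T / V

def dP_dT_at_constant_V(T, V, h=1e-4):
    """Central finite-difference estimate of (dP/dT)_V."""
    return (P_ideal(T + h, V) - P_ideal(T - h, V)) / (2 * h)

# Maxwell relation: (dS/dV)_T = (dP/dT)_V, which for an ideal gas is nR/V,
# so entropy increases with volume even at fixed temperature.
T, V = 300.0, 0.010
print(abs(dP_dT_at_constant_V(T, V) - R / V) < 1e-6)  # True
```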
Compare: The Clausius definition defines how to calculate dS from heat transfer. The Gibbs relation uses ฮS (along with ฮH) to predict spontaneity. Clausius tells you how to get the number; Gibbs tells you what the number means for whether a process occurs.
Quick Reference Table
| Topic | Key Formulas |
| --- | --- |
| Fundamental Laws | ΔS_universe ≥ 0; S → 0 as T → 0 |
| Entropy Definition (Classical) | dS = δQ_rev / T |
| Entropy Definition (Statistical) | S = k_B ln W; S = −k_B Σ p_i ln p_i |
| Ideal Gas Processes | ΔS = nR ln(V2/V1) + nC_V ln(T2/T1) |
| Mixing | ΔS_mix = −nR Σ x_i ln x_i |
| Phase Transitions | ΔS = ΔH_trs / T_trs |
| Spontaneity | ΔG = ΔH − TΔS |
| Maxwell Relations | (∂S/∂V)_T = (∂P/∂T)_V |
Self-Check Questions
- Which two entropy formulas both contain the term nR ln, and what physical principle connects them?
- If you need to calculate the absolute entropy of a substance at 298 K, which law provides your reference point, and which definition would you integrate from 0 K?
- Under what conditions does the Gibbs entropy formula S = −k_B Σ p_i ln p_i reduce to the Boltzmann formula S = k_B ln W?
- A reaction has ΔH > 0 and ΔS > 0. Using the Gibbs relation, at what temperature does this reaction become spontaneous? Write the expression for the crossover temperature.
- An ideal gas expands isothermally into a vacuum (free expansion). Which formula gives you ΔS, and why does the Clausius definition dS = δQ_rev / T require you to construct a hypothetical reversible path even though Q = 0 for the actual process?