Entropy and Thermodynamics
Entropy tells you which direction a process will naturally go. It quantifies the disorder or randomness in a system, and it's one of the main tools for predicting whether a reaction happens on its own (spontaneously) or needs outside help.
Entropy in Thermodynamic Systems
Entropy (S) is a thermodynamic state function that measures the degree of disorder or randomness in a system.
- Higher entropy = greater disorder. Gases have high entropy because their particles move freely and spread out.
- Lower entropy = greater order. Solids have low entropy because their particles are locked in fixed positions.
Entropy is directly tied to spontaneity, which describes whether a process occurs on its own without continuous outside input.
- Spontaneous processes increase the total entropy of the universe (system + surroundings). Ice melting at room temperature is a classic example: the solid becomes a less-ordered liquid, and the universe's total entropy goes up.
- Non-spontaneous processes would decrease the total entropy of the universe. Water freezing at room temperature doesn't happen on its own because it would lower total entropy.
The Second Law of Thermodynamics formalizes this: the total entropy of the universe always increases for any spontaneous process. Think of heat flowing from a hot object to a cold one. That transfer always goes in one direction on its own, and it's irreversible without adding energy from outside.

Entropy's Relation to Microstates
So why does entropy increase? The answer comes from microstates.
A microstate is one specific arrangement of all the particles in a system that still gives the same overall measurable properties (temperature, pressure, etc.). Think of it like a deck of cards: the "macrostate" might be shuffled, but there are millions of specific card orderings (microstates) that count as shuffled, versus only one ordering that counts as perfectly sorted.
The Boltzmann equation connects entropy to microstates:
S = k ln(W)
where k is the Boltzmann constant (1.38 × 10⁻²³ J/K) and W is the number of microstates.
More microstates means higher entropy. A gas has far more possible particle arrangements than a solid at the same temperature, so its entropy is much higher. Systems naturally tend toward arrangements with more microstates because those arrangements are statistically overwhelmingly more likely.
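The Boltzmann relation can be made concrete with a short sketch. The microstate counts below are made up for illustration; the point is only that entropy grows logarithmically with the number of arrangements:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(microstates: int) -> float:
    """Entropy S = k * ln(W) for a system with W microstates."""
    return K_B * math.log(microstates)

# Illustrative (made-up) microstate counts for a tiny model system:
# the more arrangements available, the higher the entropy.
for w in (1, 10, 10**6):
    print(f"W = {w:>7}: S = {boltzmann_entropy(w):.3e} J/K")
```

Note that a perfectly ordered system (W = 1) has zero entropy, which is the content of the Third Law of Thermodynamics for a perfect crystal at absolute zero.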

Entropy Changes in Reactions
You can predict whether entropy increases or decreases (the sign of ΔS) for many processes by thinking about how the disorder of particles changes.
Phase changes:
Entropy increases when matter moves from a more ordered phase to a less ordered one:
- Melting (solid → liquid): ΔS > 0
- Vaporization (liquid → gas): ΔS > 0
Entropy decreases when matter moves from a less ordered phase to a more ordered one:
- Condensation (gas → liquid): ΔS < 0
- Freezing (liquid → solid): ΔS < 0
Temperature changes:
- Raising the temperature of a substance increases entropy (ΔS > 0) because particles move faster and access more microstates.
- Lowering the temperature decreases entropy (ΔS < 0).
Chemical reactions:
The key rule of thumb: count the moles of gas on each side.
- If the number of moles of gas increases, entropy increases. For example, the decomposition of calcium carbonate: CaCO₃(s) → CaO(s) + CO₂(g). You go from zero moles of gas to one mole of gas, so ΔS > 0.
- If the number of moles of gas decreases, entropy decreases. For example, the synthesis of ammonia: N₂(g) + 3H₂(g) → 2NH₃(g). You go from 4 moles of gas to 2 moles of gas, so ΔS < 0.
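The gas-mole counting rule can be sketched as a small helper. The function and the dictionary encoding of reactions here are hypothetical conveniences, not a standard API; only gas-phase species are included in each dictionary:

```python
def delta_n_gas(gas_reactants: dict, gas_products: dict) -> int:
    """Change in moles of gas across a reaction.

    A positive result suggests entropy increases; negative suggests
    it decreases. Only gas-phase species should be passed in.
    """
    return sum(gas_products.values()) - sum(gas_reactants.values())

# CaCO3(s) -> CaO(s) + CO2(g): solids omitted, only CO2 is a gas
print(delta_n_gas({}, {"CO2": 1}))                   # +1, so ΔS > 0
# N2(g) + 3 H2(g) -> 2 NH3(g): all species are gases
print(delta_n_gas({"N2": 1, "H2": 3}, {"NH3": 2}))   # -2, so ΔS < 0
```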
To calculate ΔS° precisely, use standard molar entropies (S°), which are tabulated values:
ΔS°rxn = Σ n·S°(products) − Σ n·S°(reactants)
Remember to multiply each value by the coefficient from the balanced equation before summing.
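A worked sketch of this calculation for the ammonia synthesis, using approximate S° values of the kind found in standard textbook tables (exact figures vary slightly between sources):

```python
# Approximate standard molar entropies in J/(mol·K), typical
# textbook table values (sources differ slightly in the decimals).
S_STANDARD = {"N2": 191.6, "H2": 130.7, "NH3": 192.8}

def delta_s_reaction(reactants: dict, products: dict) -> float:
    """ΔS°rxn = Σ n·S°(products) − Σ n·S°(reactants).

    Each dict maps a species to its coefficient in the balanced
    equation, so coefficients are multiplied in before summing.
    """
    total = lambda side: sum(n * S_STANDARD[sp] for sp, n in side.items())
    return total(products) - total(reactants)

# N2(g) + 3 H2(g) -> 2 NH3(g)
ds = delta_s_reaction({"N2": 1, "H2": 3}, {"NH3": 2})
print(f"ΔS° ≈ {ds:.1f} J/K")  # negative, matching the gas-mole rule
```

The result (about −198 J/K per mole of reaction) is negative, as the drop from 4 to 2 moles of gas predicts.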
Entropy and Energy
Entropy doesn't work alone. Heat transfer directly affects entropy: when a system absorbs heat, its entropy increases; when it releases heat, its entropy decreases.
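For a reversible process at constant temperature, this heat–entropy link is quantitative: ΔS = q_rev / T. A sketch for melting one mole of ice at its melting point, using the standard enthalpy of fusion (about 6.01 kJ/mol):

```python
# ΔS = q_rev / T for a reversible, constant-temperature process.
# Melting one mole of ice at 0 °C; enthalpy of fusion ~6.01 kJ/mol.
q_rev = 6010.0   # heat absorbed by the system, J/mol
T = 273.15       # melting point of ice, K
delta_s = q_rev / T
print(f"ΔS ≈ {delta_s:.1f} J/(mol·K)")  # positive: melting absorbs heat
```

The positive sign confirms the qualitative rule above: absorbing heat raises the system's entropy.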
To predict whether a reaction is truly spontaneous, you need Gibbs free energy (G), which combines enthalpy (H) and entropy (S) into one value. That relationship is covered in the next section on free energy, but the core idea is that nature favors both lower energy and higher entropy. Systems move toward equilibrium, the state where entropy is maximized given the system's energy constraints.