Written by the Fiveable Content Team • Last updated September 2025
Definition
Entropy is a measure of the disorder or randomness in a system. It also quantifies the unavailability of a system's energy to do work.
5 Must Know Facts For Your Next Test
The total entropy of an isolated system never decreases, according to the Second Law of Thermodynamics; it increases in irreversible processes and stays constant in reversible ones.
The change in entropy ($\Delta S$) for heat transferred reversibly at a constant absolute temperature can be calculated as $\Delta S = \frac{Q_{rev}}{T}$, where $Q_{rev}$ is the reversible heat transfer and $T$ is the absolute temperature (see the worked example after this list).
In any real (irreversible) process, the total entropy of the system plus its surroundings increases.
Entropy can be seen as a measure of energy dispersal within a system.
At absolute zero temperature (0 K), a perfect crystal has zero entropy according to the Third Law of Thermodynamics.
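As a quick check on the $\Delta S = \frac{Q_{rev}}{T}$ relation, here is a worked example; the scenario (1 kg of ice melting at 273 K) and the latent heat value of 334 kJ/kg are illustrative numbers, not figures from this guide.

```latex
% Worked example (illustrative values): 1 kg of ice melts reversibly
% at its melting point, T = 273 K, absorbing Q_rev = m * L_f.
\begin{align*}
  Q_{rev} &= m L_f = (1\,\mathrm{kg})(334\,\mathrm{kJ/kg}) = 334\,\mathrm{kJ} \\
  \Delta S &= \frac{Q_{rev}}{T} = \frac{334\,000\,\mathrm{J}}{273\,\mathrm{K}} \approx 1.22 \times 10^{3}\,\mathrm{J/K}
\end{align*}
```

Because the heat flows in at a single constant temperature, no integral is needed; for a process whose temperature changes, the general form is $\Delta S = \int \frac{dQ_{rev}}{T}$.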
Related Terms
Second Law of Thermodynamics: States that the total entropy of an isolated system can never decrease over time and is constant if all processes are reversible.
$Q_{rev}$: The heat transferred reversibly, i.e., in a process that does not increase the total entropy of the system plus its surroundings.
$\Delta S$: The change in entropy, used to quantify how much the disorder or randomness of a system has increased or decreased (see the sketch below).
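A minimal computational sketch of the same relation, assuming SI units; the function name delta_S and the ice-melting numbers below are hypothetical illustrations, not part of the original entry.

```python
# Minimal sketch (assumed SI units): entropy change for heat transferred
# reversibly at a constant absolute temperature, Delta S = Q_rev / T.

def delta_S(q_rev_joules: float, temperature_kelvin: float) -> float:
    """Return the entropy change in J/K for heat q_rev_joules transferred
    reversibly at constant absolute temperature temperature_kelvin."""
    if temperature_kelvin <= 0:
        raise ValueError("Absolute temperature must be positive (in kelvin).")
    return q_rev_joules / temperature_kelvin

# Illustrative example: 1 kg of ice melting at 273.15 K absorbs about 334 kJ.
q_melt = 334_000.0                      # J (assumed value, m * L_f)
print(delta_S(q_melt, 273.15))          # ~1223 J/K gained by the melting ice
```

In the reversible limit the surroundings give up the same heat at essentially the same temperature, so their entropy change is the negative of this value and the total change is zero, consistent with the $Q_{rev}$ entry above.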