Entropy and the Second Law of Thermodynamics

Entropy and the Second Law
Entropy (S) is a measure of the disorder or randomness in a system. A gas spreading freely through a room has high entropy; a crystal with atoms locked in a regular lattice has low entropy. The concept gives us a way to quantify why certain processes happen on their own and others don't.
The Second Law of Thermodynamics states that the total entropy of an isolated system can never decrease over time. For any spontaneous process, the entropy of the universe (system + surroundings) increases. In a perfectly reversible process, total entropy stays constant, but truly reversible processes are idealizations. Every real process is at least slightly irreversible, so in practice, total entropy always goes up.
This has a few direct consequences:
- Processes that increase total entropy happen spontaneously (ice melting at room temperature, salt dissolving in water, heat flowing from a hot object to a cold one).
- Processes that would decrease total entropy don't happen on their own. You need to put energy in. A refrigerator, for example, moves heat from cold to hot, but only because it consumes electrical energy, and the total entropy of the universe still increases.
- The Second Law gives a fundamental explanation for the arrow of time: certain processes are irreversible precisely because reversing them would require a decrease in total entropy.
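To make the bookkeeping concrete, here is a small numerical sketch (the heat and reservoir temperatures are made-up illustrative values) showing that heat flowing from hot to cold raises total entropy:

```python
# Entropy bookkeeping for heat flowing between two large reservoirs
# whose temperatures stay effectively constant. Illustrative values.
Q = 100.0       # J of heat transferred from hot to cold
T_hot = 400.0   # K
T_cold = 300.0  # K

dS_hot = -Q / T_hot    # hot reservoir loses heat, its entropy drops
dS_cold = +Q / T_cold  # cold reservoir gains heat, its entropy rises more
dS_total = dS_hot + dS_cold

print(f"dS_hot   = {dS_hot:+.4f} J/K")
print(f"dS_cold  = {dS_cold:+.4f} J/K")
print(f"dS_total = {dS_total:+.4f} J/K")  # positive, so the flow is spontaneous
```

The cold reservoir gains more entropy than the hot one loses because the same Q is divided by a smaller T, which is exactly why heat never flows the other way on its own.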

Applications of the Second Law
Energy transformations always produce some increase in total entropy. Even when you convert heat into work, some energy disperses in ways that raise entropy. Here are common examples:
- Heat transfer: A hot cup of coffee cooling to room temperature. Thermal energy flows from the coffee (high T) to the air (lower T), and the total entropy of the coffee-plus-air system increases.
- Mixing: Cream poured into coffee disperses throughout the cup. The mixed state has far more possible molecular arrangements (microstates) than the separated state, so entropy increases.
- Diffusion: Perfume released in one corner of a room eventually spreads everywhere. The molecules move from a concentrated, low-entropy arrangement to a dispersed, high-entropy one.
Non-spontaneous processes go the other direction. Separating a mixture of gases, or converting waste heat entirely into useful work, would require a local decrease in entropy. These processes only happen when an external energy source drives them, and even then, the entropy of the surroundings increases enough to keep the total positive.
The Second Law also helps you predict feasibility: if a proposed process would decrease the total entropy of the universe, it simply won't happen without outside intervention.

Entropy Changes in Systems
The change in entropy for a reversible process at constant temperature is:
ΔS = Q/T
- Q is the heat transferred to or from the system (in joules). Heat added to the system is positive.
- T is the absolute temperature of the system (in Kelvin). You must use Kelvin here, not Celsius.
Example: If 500 J of heat flows into a system at 250 K:
ΔS = 500 J / 250 K = +2 J/K
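The same calculation as a small helper function (the function name is just for illustration):

```python
def entropy_change(Q, T):
    """ΔS = Q/T for reversible heat transfer at constant temperature.

    Q: heat added to the system in joules (negative if heat is removed).
    T: absolute temperature in Kelvin.
    """
    if T <= 0:
        raise ValueError("T must be an absolute temperature in Kelvin")
    return Q / T

print(entropy_change(500.0, 250.0))  # 2.0 (J/K)
```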
When temperature changes during the process, you need the integral form:
ΔS = ∫ dQ/T
The integral is evaluated along a reversible path between the initial and final states. For the reversible isothermal expansion of an ideal gas, T is constant and comes out of the integral, recovering ΔS = Q/T.
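A sketch of the integral form for a substance heated with constant specific heat, where dQ = m·c·dT and the integral works out to ΔS = m·c·ln(T2/T1). The mass, specific heat, and temperatures are illustrative values for water:

```python
import math

# Heating 1 kg of water from 20 °C to 80 °C (illustrative values).
m, c = 1.0, 4186.0        # kg, J/(kg·K)
T1, T2 = 293.15, 353.15   # K

# Analytic result of the integral ∫ m c dT / T:
dS_exact = m * c * math.log(T2 / T1)

# Numerical check: sum dQ/T over many small temperature steps (midpoint rule).
N = 100_000
dT = (T2 - T1) / N
dS_numeric = sum(m * c * dT / (T1 + (i + 0.5) * dT) for i in range(N))

print(f"analytic: {dS_exact:.3f} J/K")
print(f"numeric:  {dS_numeric:.3f} J/K")
```

The numerical sum converges to the analytic logarithm, which is the point of the integral form: when T varies, each increment of heat is weighted by the temperature at which it is transferred.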
For irreversible processes, the actual entropy change is always greater than Q/T. This is captured by the Clausius inequality:
ΔS ≥ Q/T
The equality holds only for reversible processes. Any real process (heat transfer across a finite temperature difference, friction, free expansion) produces more entropy than the reversible case.
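Free expansion makes the inequality concrete: a gas expanding into vacuum absorbs no heat (Q = 0), yet its entropy rises. A small sketch, using the standard result ΔS = n·R·ln(V2/V1) for an ideal gas between the same end states (the amount of gas is illustrative):

```python
import math

R = 8.314  # J/(mol·K), ideal gas constant
n = 1.0    # mol, illustrative amount of gas

# Free expansion: the gas doubles its volume into vacuum. No heat flows,
# so Q/T = 0, but the entropy change (computed along a reversible
# isothermal path between the same end states) is strictly positive:
dS = n * R * math.log(2)   # ≈ 5.76 J/K
Q_over_T = 0.0

assert dS > Q_over_T  # Clausius: ΔS ≥ Q/T, strict for an irreversible process
print(f"ΔS = {dS:.2f} J/K > Q/T = {Q_over_T} J/K")
```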
Statistical mechanics approach: Entropy can also be understood at the microscopic level through Boltzmann's equation:
S = k_B ln W
- W is the number of microstates (distinct microscopic arrangements) consistent with the system's macroscopic state.
- k_B ≈ 1.381 × 10⁻²³ J/K is the Boltzmann constant.
This equation connects the microscopic picture (how many ways can the particles be arranged?) to the macroscopic thermodynamic quantity S. A system with more accessible microstates has higher entropy, which is why gases have higher entropy than solids at the same temperature.
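Boltzmann's equation is easy to play with numerically. A brief sketch (the particle count is illustrative): a perfectly ordered state with a single microstate has zero entropy, while N two-state particles have W = 2^N microstates and entropy N·k_B·ln 2:

```python
import math

k_B = 1.380649e-23  # J/K, Boltzmann constant (exact by SI definition)

def boltzmann_entropy(W):
    """S = k_B ln W for W accessible microstates."""
    return k_B * math.log(W)

# One microstate: a perfectly ordered configuration has zero entropy.
print(boltzmann_entropy(1))  # 0.0

# N two-state particles (e.g. coin-like spins): W = 2^N, so S = N k_B ln 2.
N = 100
print(boltzmann_entropy(2 ** N))  # ≈ N * k_B * ln 2
```

Doubling the number of particles adds a fixed amount of entropy per particle, which is why entropy is extensive even though W grows exponentially.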

Thermodynamic Cycles and Efficiency
A heat engine converts thermal energy into mechanical work by operating between a hot reservoir (at temperature T_H) and a cold reservoir (at temperature T_C). The Second Law places a hard limit on how efficient any heat engine can be: you can never convert all the input heat into work because some heat must be rejected to the cold reservoir to satisfy the entropy requirement.
The Carnot cycle is an idealized cycle consisting of two isothermal and two adiabatic steps, all performed reversibly. It achieves the maximum possible efficiency for any engine operating between the same two temperatures:
η = 1 − T_C/T_H
Both temperatures must be in Kelvin. For example, an engine operating between T_H = 500 K and T_C = 300 K has a maximum efficiency of:
η = 1 − 300/500 = 0.40, or 40%
No real engine reaches Carnot efficiency because all real processes involve some irreversibility (friction, turbulence, heat loss). The Carnot efficiency sets the theoretical ceiling.
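The Carnot limit can be wrapped in a small helper (the function name and reservoir temperatures here are illustrative):

```python
def carnot_efficiency(T_hot, T_cold):
    """Maximum possible efficiency of a heat engine between two reservoirs.

    Both temperatures must be absolute (Kelvin), with T_hot > T_cold > 0.
    """
    if T_cold <= 0 or T_hot <= T_cold:
        raise ValueError("need T_hot > T_cold > 0, in Kelvin")
    return 1.0 - T_cold / T_hot

# Illustrative: a turbine loop between 800 K steam and a 300 K environment.
print(carnot_efficiency(800.0, 300.0))  # 0.625
```

Note that only the ratio T_C/T_H matters: raising the hot-side temperature or lowering the cold-side temperature both push the ceiling higher, which is why power plants chase hotter working fluids.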
Gibbs free energy (G) combines enthalpy and entropy into a single quantity that predicts spontaneity at constant temperature and pressure:
ΔG = ΔH − TΔS
A process is spontaneous when ΔG < 0, which accounts for both the energy change and the entropy change of the system. This is especially useful in chemistry and engineering for determining whether a reaction will proceed and how much useful work it can deliver.
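A sketch of the spontaneity test, using approximate textbook values for melting ice (ΔH ≈ +6.01 kJ/mol, ΔS ≈ +22 J/(mol·K)); the sign of ΔG flips at the melting point:

```python
def gibbs_delta(dH, dS, T):
    """ΔG = ΔH − T·ΔS at constant temperature and pressure.

    dH in J/mol, dS in J/(mol·K), T in Kelvin.
    """
    return dH - T * dS

# Melting of ice, approximate textbook values:
dH, dS = 6010.0, 22.0  # J/mol, J/(mol·K)

for T in (263.0, 298.0):  # below and above 0 °C
    dG = gibbs_delta(dH, dS, T)
    verdict = "spontaneous" if dG < 0 else "non-spontaneous"
    print(f"T = {T} K: ΔG = {dG:+.0f} J/mol -> {verdict}")

# Crossover where ΔG = 0 is at T = ΔH/ΔS ≈ 273 K: the melting point.
```

Below 273 K the unfavorable enthalpy term wins and ice is stable; above it, the TΔS term dominates and melting proceeds on its own, exactly the trade-off ΔG is built to capture.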