Try this tutorial with accompanying videos to help teach students on post-16 courses about the concept of entropy and the second law of thermodynamics

This tutorial is designed to develop and consolidate students’ understanding of entropy as a fundamental concept in thermodynamics. Students build on their knowledge about the chance behaviour of particles and energy to learn about:

- The second law of thermodynamics
- The role of energy in determining the entropy or randomness of a chemical system
- Entropy change and how to calculate it

The tutorial also features video demonstrations and animations to illustrate key ideas.

**Note**

The interactive ‘simulations’ for this tutorial are currently unavailable. However, you may find it helpful to read any accompanying instructions, observations and conclusions relating to the simulations below.

## Introducing entropy

Scientists have a mathematical way of measuring randomness – it is called ‘entropy’ and is related to the number of arrangements of particles (such as molecules) that lead to a particular state. (By a ‘state’, we mean an observable situation, such as a particular number of particles in each of two boxes.)

As the number of particles increases, the number of possible arrangements in any particular state increases astronomically, so we need a scaling factor to produce numbers that are easy to work with. Entropy, S, is defined by the equation:

*S* = *k* ln*W*

where *W* is the number of ways of arranging the particles that gives rise to a particular observed state of the system, and *k* is a constant called Boltzmann’s constant, which has the value 1.38 × 10^{-23} J K^{-1}. In the expression above, *k* has the effect of scaling the vast number *W* to a smaller, more manageable number.

The natural logarithm, ln, also has the effect of scaling a vast number to a small one – the natural log of 10^{23} is about 53, for example.

**Note**

Just as logarithms to the base 10 are derived from 10^{*x*}, natural logarithms are derived from the exponential function *e*^{*x*}, where *e* has the value 2.718. There are certain mathematical advantages to using this base number.

Don’t let students worry about ‘ln’; just get them to use the correct button on their calculators. Some examples of calculating the ln of large numbers might help students to see the scaling effect.
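To illustrate this scaling effect, here is a short sketch (in Python; an addition, not part of the original tutorial) showing how ln, and then multiplication by Boltzmann’s constant, shrink enormous numbers of arrangements:

```python
import math

k = 1.38e-23  # Boltzmann's constant, J K^-1

# ln already compresses enormous numbers of arrangements, W ...
for exponent in (23, 100, 300):
    W = 10 ** exponent
    print(f"ln(10^{exponent}) = {math.log(W):.2f}")

# ... and multiplying by k scales the result down further still
S = k * math.log(10 ** 23)  # entropy for W = 10^23 arrangements
print(f"S = {S:.2e} J K^-1")
```

Even for 10^300 arrangements, ln gives a number under 700, and multiplying by *k* makes it smaller still.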

Entropies are measured in joules per kelvin per mole (J K^{-1} mol^{-1}). Notice the difference between the units of entropy and those of ‘enthalpy’ (energy), kilojoules per mole (kJ mol^{-1}).

The key point to remember is that entropy is a figure that measures randomness. Gases, where the particles could be anywhere, tend to have greater entropies than liquids, which in turn tend to have greater entropies than solids, where the particles are very regularly arranged. You can see this general trend from the animations of the three states below.

Substance | Physical state at standard conditions | Entropy, S / J K^{-1} mol^{-1} |
---|---|---|
Carbon (diamond) | solid | 2.4 |
Copper | solid | 33 |
Calcium oxide | solid | 40 |
Calcium carbonate | solid | 93 |
Ammonium chloride | solid | 95 |
Water (ice) | solid | 48 |
Water | liquid | 70 |
Water (steam) | gas | 189 |
Hydrogen chloride | gas | 187 |
Ammonia | gas | 192 |
Carbon dioxide | gas | 214 |

**Note**

Observe that not all solids have smaller entropy values than all liquids, nor do all liquids have smaller values than all gases.

There is, however, a general trend that:

*S*_{solids} < *S*_{liquids} < *S*_{gases}
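To make the exception concrete, a quick check (a Python sketch using the values from the entropy table above) compares the entropy ranges by physical state:

```python
# Standard entropies, S / J K^-1 mol^-1, taken from the table above
entropies = {
    "solid": {"carbon (diamond)": 2.4, "copper": 33, "calcium oxide": 40,
              "calcium carbonate": 93, "ammonium chloride": 95,
              "water (ice)": 48},
    "liquid": {"water": 70},
    "gas": {"water (steam)": 189, "hydrogen chloride": 187,
            "ammonia": 192, "carbon dioxide": 214},
}

for state, values in entropies.items():
    print(f"{state}: {min(values.values())} to {max(values.values())}")

# Ammonium chloride (solid, 95) exceeds liquid water (70),
# so S_solids < S_liquids < S_gases is a trend, not a strict rule.
```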

## The second law of thermodynamics

Since processes take place by chance alone, and this leads to increasing randomness, we can say that in any spontaneous process (one that takes place of its own accord and is not driven by outside influences) entropy increases.

This is called the second law of thermodynamics, and is probably the most fundamental physical law. So processes that involve spreading out and increasing disorder are favoured over those where things become more ordered – students may have noticed this in their bedrooms!

The word ‘feasible’ is also used to mean the same as ‘spontaneous’. The terms have nothing to do with the rate of a process. So a reaction may be feasible (spontaneous) but occur so slowly that in practice it does not occur at all. For instance, the reaction of diamond (pure carbon) with oxygen to form carbon dioxide is feasible (spontaneous) but we do not have to worry about jewellery burning in air at room temperature – ‘diamonds are forever’!

### Illustrating the second law

Watch the two videos below. You would have no difficulty in deciding which one is being played in reverse – randomly arranged fragments do not spontaneously arrange themselves into an ordered configuration.

#### Video 1

#### Video 2

### Entropy and solutions

So the idea of entropy increasing *seems* to explain why a crystal of salt (sodium chloride) will dissolve in a beaker of water of its own accord (spontaneously). The resulting solution, in which sodium and chloride ions are thoroughly mixed with water molecules, is more random than a beaker of pure water and a crystal of solid sodium chloride in which the ions are in a highly ordered crystal lattice.

**Note**

There is some local increase in order on forming the solution as (disordered) water molecules cluster round the Na^{+} and Cl^{–} ions but this is outweighed by the large increase in disorder as the sodium chloride lattice breaks up.

It also seems to explain our initial query: why the reaction between magnesium and hydrochloric acid,

Mg_{(s)} + 2HCl_{(aq)} → H_{2(g)} + MgCl_{2(aq)}

occurs, but the reverse reaction,

H_{2(g)} + MgCl_{2(aq)} → Mg_{(s)} + 2HCl_{(aq)}

does not. In the first case, the production of a gas from a solid clearly involves an increase in entropy while the reverse has a decrease.

## Is the second law wrong?

It is not difficult to find examples of chemical reactions that appear to contradict the rule that entropy increases in spontaneous processes.

Take, for example, the demonstration illustrated in the video below, where the two gases, hydrogen chloride and ammonia, diffuse along a tube and produce a white ring of solid ammonium chloride:

HCl_{(g)} + NH_{3(g)} → NH_{4}Cl_{(s)}

The formation of a solid from two gases clearly involves a decrease in entropy, yet the reaction occurs spontaneously.

In fact, we can calculate the numerical value of the entropy change from the figures in the table above (see Introducing entropy):

- Total entropy of starting materials = 187 + 192 = 379 J K^{-1} mol^{-1}
- Entropy of product = 95 J K^{-1} mol^{-1}
- Entropy change = 95 – 379 = –284 J K^{-1} mol^{-1}

As expected, a significant decrease. (Remember, we expect spontaneous reactions to have an increase in entropy.) Does this mean the second law is wrong?
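The arithmetic can be checked in a couple of lines (a Python sketch using the values from the entropy table; an addition, not part of the original tutorial):

```python
# Standard entropies from the table, J K^-1 mol^-1
S_HCl, S_NH3, S_NH4Cl = 187, 192, 95

S_reactants = S_HCl + S_NH3        # 379
dS_system = S_NH4Cl - S_reactants  # products minus reactants
print(dS_system)  # -284 J K^-1 mol^-1
```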

## The role of energy

Energy also has a role to play in the entropy or randomness of a chemical system, by which we mean a quantity of substance or substances (such as a reaction mixture).

Energy exists in ‘packets’ called quanta. You can have any whole number of quanta of energy but not 1½ or 3.25 quanta.

Like the distribution of atoms in space, the distribution of quanta of energy between molecules is also random because, like molecules, energy quanta do not ‘know’ where they are supposed to be – energy ‘doesn’t care’. We can simulate this distribution.

We have seen that the number of ways of arranging particles contributes to the entropy of a physical system. The distribution of energy quanta also contributes to the system’s entropy.

### The distribution of energy simulation

#### Introduction and instructions

A simple example is to consider quanta of energy distributed between the vibrational energy levels of a set of diatomic molecules. Such energy levels are evenly spaced and can be represented like the rungs of ladders. How many ways are there of distributing *x* quanta of energy between *y* molecules?

- Here you can vary the number of molecules and the number of quanta available to be distributed between them and the simulation will allow the energy to be exchanged between the molecules in all possible ways at random.
- Start by setting the number of quanta and molecules to a low value. Use the step button to move through all the combinations (for example: there are four ways to distribute three quanta among two molecules).

#### Key observations

The more quanta of energy there are to be shared between a given number of molecules, the more ways there are of arranging them. Also, the more molecules there are, the more ways there are of sharing.

The tables below show the possible arrangements of four quanta between two molecules, five quanta between two molecules and three quanta between three molecules respectively. Try exploring these using the simulator.

Molecule 1 | Molecule 2 |
---|---|
0 | 4 |
1 | 3 |
2 | 2 |
3 | 1 |
4 | 0 |

Molecule 1 | Molecule 2 |
---|---|
0 | 5 |
1 | 4 |
2 | 3 |
3 | 2 |
4 | 1 |
5 | 0 |

Molecule 1 | Molecule 2 | Molecule 3 |
---|---|---|
3 | 0 | 0 |
0 | 0 | 3 |
0 | 3 | 0 |
1 | 1 | 1 |
1 | 2 | 0 |
2 | 1 | 0 |
1 | 0 | 2 |
2 | 0 | 1 |
0 | 1 | 2 |
0 | 2 | 1 |
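The row counts in these tables can be verified against the standard combinatorial result for sharing *q* identical quanta among *m* molecules, *W* = C(*q* + *m* − 1, *m* − 1) (the ‘stars and bars’ formula; this check is an addition, not part of the original tutorial):

```python
import math
from itertools import product

def ways(q, m):
    """Number of ways to share q identical quanta among m molecules."""
    return math.comb(q + m - 1, m - 1)

def enumerate_ways(q, m):
    """Brute-force check: all m-tuples of quanta summing to q."""
    return [t for t in product(range(q + 1), repeat=m) if sum(t) == q]

print(ways(4, 2), ways(5, 2), ways(3, 3))  # 5 6 10, matching the tables
assert ways(3, 3) == len(enumerate_ways(3, 3))
```

The formula also confirms the earlier example: `ways(3, 2)` gives the four arrangements of three quanta among two molecules.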

#### Conclusion

The distribution of energy quanta also contributes to entropy because of the relationship *S* = *k* ln*W*. The more heat energy we put into anything, the more its entropy increases because there are more quanta and thus more ways, *W*, to distribute them.

Most chemical reactions involve a change of heat energy (enthalpy), either given out from the reactants to the surroundings or taken in from the surroundings into the products. So we must also take this into account when we are considering the entropy change of a chemical reaction. It is not just the chemical reaction that matters but the surroundings as well.

## The system and its surroundings

Here is the solution to the puzzle about the ammonia–hydrogen chloride reaction. This reaction is strongly exothermic (gives out a lot of heat to the surroundings – in fact ΔH is –176 kJ mol^{-1}). The key is the idea of the surroundings.

For each mole of ammonium chloride that is formed, 176 kJ of heat energy is transferred to the surroundings. As we have seen, this increases the entropy of the surroundings because of the increased number of ways of arranging the quanta of energy. So, within the reaction itself (ie starting materials and products), entropy decreases but, because of the heat energy passed to the surroundings, the entropy of the surroundings increases, and more than compensates for the decrease in entropy in the reaction.

In other words, this increase in the entropy of the surroundings is more than the decrease in entropy of the reaction and thus there is an overall increase in entropy.

We call the reaction itself ‘the system’ and everything else ‘the surroundings’. In principle, ‘the surroundings’ is literally the rest of the Universe, but in practice it is the area immediately around the reaction vessel.

So, the second law of thermodynamics is not broken by the ammonia–hydrogen chloride reaction; the problem was that we had forgotten the surroundings. This is because, as chemists, we are used to concentrating only on what happens inside our reaction vessel.

### How does this affect our understanding of the second law?

We can build on these insights to make a better statement of the second law of thermodynamics: in a spontaneous change, the entropy of the Universe increases, ie the sum of the entropy of the system and the entropy of the surroundings increases.

## Total entropy change

As we have seen above, the entropy change of the ammonia–hydrogen chloride reaction (‘the system’) is –284 J K^{-1} mol^{-1}. It is negative as we have calculated (and predicted from the reaction being two gases going to a solid).

But how can we evaluate the entropy change caused by ‘dumping’ 176 kJ mol^{-1} of heat energy into the surroundings? (Notice that this is 176 000 J mol^{-1}). It must be positive as more quanta of energy lead to more possible arrangements and it must be more than 284 J K^{-1} mol^{-1}.

The formal derivation is complex, but it leads to an expression for Δ*S* of the surroundings:

ΔS_{surroundings} = –Δ*H*/ *T*

This makes sense because:

- The negative sign means that an exothermic reaction (Δ*H* is negative, heat given out) produces an increase in the entropy of the surroundings.
- The more negative the value of Δ*H*, the more positive the entropy increase of the surroundings.
- The same amount of energy dumped into the surroundings will make more difference at lower temperature – this rationalises the ‘division by *T*’.

For the ammonia–hydrogen chloride reaction at 298 K:

Δ*S*_{surroundings} = –Δ*H*/ *T* = –(–176 000) / 298

Δ*S*_{surroundings} = 591 J K^{-1} mol^{-1}, more than enough to outweigh the value of Δ*S*_{system} of –284 J K^{-1} mol^{-1}

So the total entropy change (of the Universe, ie system + surroundings) brought about by the reaction is +307 J K^{-1} mol^{-1}.
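The whole calculation can be reproduced in a few lines (a Python sketch using the figures above; an addition, not part of the original tutorial):

```python
dS_system = -284      # J K^-1 mol^-1, from the entropy table
dH = -176_000         # J mol^-1 (exothermic, so negative)
T = 298               # K

dS_surroundings = -dH / T             # about +591 J K^-1 mol^-1
dS_total = dS_system + dS_surroundings
print(round(dS_surroundings), round(dS_total))  # 591 307
```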

If we want to predict the direction of a chemical reaction, we must take account of the total entropy change of the system and the surroundings, and that includes the effect on entropy of any heat change from the system to the surroundings (or, in the other direction, heat taken in from surroundings to system).

### The equation for total entropy change

Total entropy change is the sum of the entropy changes of the system and its surroundings:

Δ*S*_{total} = Δ*S*_{system} + Δ*S*_{surroundings}

If Δ*S*_{total} for a reaction is positive, the reaction will be feasible; if negative, it will not be feasible.
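This criterion can be expressed as a small helper function (a hypothetical sketch, not part of the tutorial), combining Δ*S*_{system} with Δ*S*_{surroundings} = −Δ*H*/*T*:

```python
def is_feasible(dS_system, dH, T):
    """Return True if dS_total = dS_system - dH/T is positive.

    dS_system in J K^-1 mol^-1, dH in J mol^-1, T in K.
    """
    return dS_system - dH / T > 0

# Ammonia-hydrogen chloride at 298 K: feasible despite dS_system < 0
print(is_feasible(-284, -176_000, 298))  # True
```

Note that *T* appears in the denominator, so a reaction with a negative Δ*S*_{system} and a negative Δ*H* can switch from feasible to not feasible as the temperature rises.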

### Additional information

This resource was originally published by the Royal Society of Chemistry in 2015 as part of the Quantum Casino website, with support from Reckitt Benckiser.
