Describing entropy

21st Dec 2019

You should already know the difference between the system and the surroundings, that atoms and bonds oscillate about fixed positions, and how σ-bonds rotate. Access to molecular models will help considerably.

Entropy is always a challenging idea to grasp at first. Nevertheless, it is an important concept because it helps us to predict what occurs. What follows is how I describe (as opposed to formally define) entropy in three ways, with some examples.

  1. Entropy is the extent to which heat and matter are spread out or dispersed

    The greater the spread, the higher the entropy. Picture this. When we heat one corner of a metal block (Figure 2.1), the temperature of the rest of the block increases over time because heat energy is transferred from the hotter region (the hot source) to the cooler region (the cold sink) of the block (Figure 2.2).

    Figure 2.1. Heat energy concentrated over a small area (lower entropy)
    Figure 2.2. Heat energy dispersed over a larger area (higher entropy)

    Relating this to the extent to which heat energy is spread out: the entropy of the block at the moment we first heat it (Figure 2.1) is low compared with the situation once more of the block has warmed up (Figure 2.2). The entropy of the system (here, the metal block) increases as the heat energy spreads out across it.
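
    The descriptions in this article are deliberately informal, but a rough numerical sketch can back this picture up. It assumes each region of the block stays at a roughly constant temperature and uses the standard result that a region gains entropy q/T when it absorbs heat q at temperature T; the temperatures and the amount of heat below are illustrative values, not measurements.

```python
# A rough sketch (not a formal treatment): treat the heated corner and the
# rest of the block as two regions at roughly constant temperature, and use
# dS = q/T for each. All numbers are illustrative.

q = 100.0        # heat transferred from the hot corner to the rest, in joules
T_hot = 400.0    # temperature of the heated corner, in kelvin
T_cold = 300.0   # temperature of the cooler remainder of the block, in kelvin

dS_hot = -q / T_hot      # the hot corner loses heat, so its entropy falls
dS_cold = q / T_cold     # the cooler remainder gains heat, so its entropy rises
dS_total = dS_hot + dS_cold

print(f"Hot corner:     {dS_hot:+.3f} J/K")
print(f"Cool remainder: {dS_cold:+.3f} J/K")
print(f"Overall:        {dS_total:+.3f} J/K")
```

    The loss from the hot corner is outweighed by the gain in the cooler remainder, so letting the heat spread out raises the entropy of the block overall.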

    Regarding matter, recall what happens to a pellet of dry ice when it is exposed to the atmosphere at room temperature. The pellet becomes smaller, and a white mist (condensed water vapour) forms as the cold carbon dioxide gas escapes into the surroundings. In this example, the carbon dioxide molecules spread out randomly (Figure 2.3). You can observe the same kind of result when you place a drop of liquid bromine in a colourless glass jar: the jar fills with an orange-brown vapour. In both cases, the extent to which matter is spread out or dispersed increases, and therefore the entropy increases.

    Figure 2.3. Molecules becoming more dispersed

    We can extend the idea about dispersion a little further. What happens if we use a larger glass jar? The capacity, or potential, for further dispersion of matter (bromine vapour in this case) is greater, and so the final entropy of a larger glass jar of bromine is higher than the final entropy of a smaller jar. How does this relate to the expansion of the universe? I will leave that thought with you. In chemistry, you will quite often encounter situations in which the expansion of a gas is accompanied by an increase in the entropy of the system.
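
    To put a number on the "larger jar" idea: if we assume the bromine vapour behaves as an ideal gas at constant temperature, the standard result ΔS = nR ln(V2/V1) gives the extra entropy gained when the same amount of vapour disperses through a bigger volume. The amount of vapour and the jar volumes below are invented purely for illustration.

```python
# A minimal sketch, assuming ideal-gas behaviour at constant temperature.
# dS = n * R * ln(V2 / V1) is the entropy increase when a fixed amount of
# gas spreads into a larger volume. Values are illustrative only.
import math

R = 8.314        # gas constant, J/(mol K)
n = 0.01         # moles of bromine vapour (illustrative)
V_small = 1.0    # volume of the smaller jar, in litres (illustrative)
V_large = 2.0    # volume of the larger jar, in litres (illustrative)

dS = n * R * math.log(V_large / V_small)
print(f"Extra entropy from dispersing through the larger jar: {dS:.4f} J/K")
```

    Doubling the available volume gives a positive ΔS, in line with the expansion of a gas being accompanied by an increase in the entropy of the system.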

  2. Entropy is a measure of how many ways one can configure a system

    The more ways in which a system can be configured, the higher the entropy. This is a more quantitative description of entropy and is quite useful when comparing different systems without considering temperature or heat. What do I mean by "configure"?

    1. Let us first assume that the atoms in each molecule are identical and frozen in position (bond angles are fixed), so the only choice left to us is where in the "container" to place each molecule. Comparing pure (molecular) hydrogen with a mixture of hydrogen and helium (containing the same total number of molecules; see Figure 2.4), we can see that the entropy of the hydrogen-helium mixture is greater than the entropy of the pure hydrogen. This is because there are more ways of configuring the mixture:

      Figure 2.4. Pure hydrogen on the left, hydrogen-helium mixture on the right

      Consequently, the more complex a mixture, the more configurations are possible and the higher the entropy. It also follows that the more molecules there are (e.g. the greater the amount of gas present), the more possible configurations there are and, consequently, the higher the entropy.
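
      If you like, you can make this counting concrete. The toy sketch below places molecules on a small, imaginary grid of sites, counts the distinct arrangements, and converts each count W into an entropy using Boltzmann's relation S = k ln W. The grid size and the numbers of molecules are invented purely for illustration.

```python
# A toy counting sketch: 16 imaginary sites holding 8 molecules in total.
# Pure hydrogen contains one kind of molecule; the mixture contains two,
# so there are more distinguishable ways to arrange it.
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
sites = 16           # imaginary lattice sites in the "container"

# Pure hydrogen: choose which 8 of the 16 sites hold an H2 molecule.
W_pure = math.comb(sites, 8)

# Mixture: choose 4 sites for H2, then 4 of the remaining 12 sites for He.
W_mix = math.comb(sites, 4) * math.comb(sites - 4, 4)

print(f"Arrangements of pure hydrogen: {W_pure}")
print(f"Arrangements of the H2/He mix: {W_mix}")
print(f"S(pure)    = {k_B * math.log(W_pure):.2e} J/K")
print(f"S(mixture) = {k_B * math.log(W_mix):.2e} J/K")
```

      In this toy example the mixture can be arranged in 900,900 ways against 12,870 for pure hydrogen, so its entropy comes out higher, in line with Figure 2.4.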

    2. If we compare individual molecules, then we need to consider how many ways we can position the atoms (without breaking bonds) by stretching bonds and/or bending the molecule. Start simple. A monatomic species, e.g. a noble gas, can only be configured by placing a single atom in space. A diatomic molecule, e.g. hydrogen chloride (HCl), can be configured with the H and Cl atoms placed a number of different distances apart. How? By stretching the H-Cl bond (Figure 2.5).

      Figure 2.5. Stretching the H-Cl bond

      This is a reasonable idea because we know that atoms are not fixed in position; bonds oscillate like mechanical springs (this motion is the basis of infra-red spectroscopy). So even within a single HCl molecule there are a number of ways of configuring the positions of the atoms.

      As molecules become larger, they give rise to more configurations and usually higher entropy. For example, with carbon dioxide (CO2) one can not only stretch the C=O bonds but also bend the molecule (Figure 2.6), so that the bond angle becomes temporarily smaller than 180°.

      Figure 2.6. Bending the CO2 molecule

      We can also rearrange the atoms in a molecule by rotating about σ-bonds. Using a ball-and-stick model, try rotating the C-C bonds of propane (Figure 2.7) and then try again with cyclopropane (without breaking any bonds).

      Figure 2.7. Rotating a C-C σ-bond of propane

      Evidently, it is possible to generate more configurations of a propane molecule compared to a cyclopropane molecule. As a result, one would expect the entropy of propane to be greater than that of cyclopropane. We can generalise this idea to other molecules (which need not be carbon-containing): straight-chain (linear) structures of chain length x have higher entropy than cyclic (ring) structures of ring-size x.
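
      One crude way to tally these internal "ways to configure" is to count vibrational modes, using the standard result that a molecule of N atoms has 3N - 6 of them (3N - 5 if the molecule is linear). The sketch below does this for the molecules discussed above; it is only a rough proxy, since (for example) it does not capture how a ring also restricts rotation about its σ-bonds.

```python
# A rough tally of vibrational modes, using the standard 3N - 6 rule
# (3N - 5 for linear molecules). This is a crude proxy for "number of
# configurations", not a calculation of entropy itself.
def vibrational_modes(n_atoms: int, linear: bool) -> int:
    # 3N degrees of freedom, minus 3 translations and 3 rotations
    # (only 2 rotations for a linear molecule).
    return 3 * n_atoms - (5 if linear else 6)

molecules = [
    ("HCl",          2,  True),   # diatomic, so linear
    ("CO2",          3,  True),   # linear triatomic
    ("propane",      11, False),  # C3H8: 3 C + 8 H atoms
    ("cyclopropane",  9, False),  # C3H6: 3 C + 6 H atoms
]

for name, n_atoms, linear in molecules:
    print(f"{name:13s} {vibrational_modes(n_atoms, linear):3d} vibrational modes")
```

      The count grows from 1 for HCl to 4 for CO2, and propane (27 modes) comes out ahead of cyclopropane (21 modes), consistent with larger, more flexible molecules usually having the higher entropy.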

      If you have access to a table of absolute entropies (from a data booklet), you should now be able to appreciate, by applying the above ideas, why the entropy of some substances is lower and that of others higher.

  3. Entropy is the amount of disorder

    The higher the amount of disorder, the higher the entropy. This description tends to be the most widely used of the three, and it is the one to remember if you can only remember one. The more random motion (disorder) a system has, the higher its entropy. There is more random motion in a gas than in a solid, hence the entropy of a gas is greater than the entropy of a solid.

Perhaps you can see how each description can be applied to the same system. Gas molecules tend to be more dispersed than the molecules in a solid (the first description). There are more ways to configure a given amount of gas than the same number of molecules in the solid state (the second description). And there is more random motion in a gas than in a solid (the third description). All three lead to the same conclusion: the entropy of a gas is greater than the entropy of a solid.

I hope this article has helped you appreciate the idea of entropy and where it can be applied. I suggest you take a minute or two to compare the entropy of other systems you can think of, using one or more of the above descriptions. For those of you who go on to university study, you will learn about more precise (rigorous) ways of defining entropy.

In future articles we will discuss changes in entropy, the important role entropy plays in deciding how likely a process is to occur, and why we propose that all reactions are ultimately reversible.