20.1: Entropy
Lesson Objectives
 Describe what it means to call a process spontaneous, and give examples.
 Define entropy.
 Be able to predict whether the entropy of a system is increasing or decreasing.
 Calculate ΔS°_{rxn} from appropriate data.
Vocabulary
 spontaneous: Describes a reaction in which the given conditions favor the formation of the products.
 entropy (S): A measure of probability or the degree of order.
 standard entropy: A measure of the entropy of a substance at 25°C and 1 atm of pressure.
 second law of thermodynamics: States that the entropy of the universe will increase for any spontaneous process.
 third law of thermodynamics: States that a perfectly regular crystal at a temperature of 0 K (absolute zero) would have an entropy value of 0.
Check Your Understanding
 How can we tell whether a reaction is exothermic or endothermic?
 What does the value of an equilibrium constant tell us about a reversible reaction?
 What information is needed to calculate the enthalpy change for a given reaction?
Introduction
Have you ever wondered why it's easy to let your bedroom get messy, but much more difficult to keep it neat? When your bedroom is neat and orderly, everything is in a place where it belongs. When it's messy, everything seems to be out of order. It's as if your bedroom naturally becomes messy, but you have to make an effort to keep it neat. Physical and chemical processes can be thought of in a similar way. There is a natural tendency for a physical or chemical process to progress in a certain direction. In this lesson, you will learn about the quantity used to describe orderliness and how to predict the reaction direction in physical and chemical processes.
What is a Spontaneous Process?
Chemists want to be able to predict the outcome of reactions. They would like it to be possible to predict what will happen when reactants are added together under a given set of conditions. The conditions of a reaction might include things like temperature, pressure, and concentrations of various reaction components. If the given conditions favor the formation of products, the reaction is said to be spontaneous.
It should be noted that just because a process is spontaneous does not mean that it occurs quickly. The rusting of iron is a spontaneous process that takes place over a long period of time. The combustion of gasoline in oxygen (also a spontaneous process) is extremely fast when provided with a spark, but gasoline can be stored in air for quite a while without spontaneously combusting. Thermodynamics predicts the direction in which a reaction will eventually proceed, but it does not tell us anything about the rate at which the reaction occurs. The rate of a reaction depends on many factors, including activation energy, temperature, concentration, and the presence or absence of a catalyst. Chemical kinetics focuses on the pathway between reactants and products, while thermodynamics considers only the difference between the initial and final states.
We know of many examples of processes that occur spontaneously. If the temperature is below 0°C, a glass of liquid water will freeze. If the temperature is above 0°C, a cube of ice will melt. A hot object will gradually lose heat to its surroundings. Many chemical reactions also occur spontaneously. Two molecules of hydrogen will react with one molecule of oxygen to form water, releasing a significant amount of energy in the process. The spontaneous reaction between aluminum and bromine to form aluminum bromide also releases energy.
Melting ice in the Beaufort Sea off the North Slope of Alaska
However, a process does not need to be exothermic in order to be spontaneous. For example, the melting of ice is an endothermic process, but it is still spontaneous at high enough temperatures. If we add ammonium nitrate to water, it will spontaneously dissolve, but the resulting solution will be cooler, indicating that energy in the form of heat was consumed in the process. To determine whether a process is spontaneous, we need to look not only at the change in enthalpy, but also the change in a factor called entropy.
What Is Entropy?
At its most basic level, entropy (S) is a measure of probability. States that have a high probability of occurring by random chance have a high entropy value, and states that are unlikely to occur by random chance have a low entropy value. There is a natural tendency for things to increase in entropy over time. An equivalent statement is that nature will spontaneously move toward the states that have the highest probability of existing.
Entropy can also be thought of as the number of possible arrangements that lead to a certain state. The more ways that a given state can be achieved, the greater the probability of finding that state, and the higher its entropy value. For example, think about the objects in your bedroom. Imagine every item being randomly placed at some location within the room. Now imagine this happening again and again. How many of the resulting arrangements would lead you to classify your room as "messy?" How many would qualify as a "clean" room? In this hypothetical example, every state has an equal possibility of happening, but because there are so many more ways to arrange items to make a messy room than a clean room, the "messy" state would have a higher entropy value than the "clean" state.
There are many examples in the chemical world of changes in entropy. Phase transitions are one obvious example. When a substance makes a transition from the liquid state to the gaseous state, the particles have many more possible arrangements, because they are no longer confined to a specified volume in which they are close to each other; gas particles can move freely throughout their container. Vaporization represents an increase in entropy. In the opposite direction, a liquid loses entropy when it freezes to a solid. Because solids have very ordered structures, there are fewer possible arrangements of particles that would result in the properties associated with a solid.
The Second Law of Thermodynamics
Recall that, according to the first law of thermodynamics, the total amount of energy in the universe is conserved for any given process. Entropy is not conserved; in fact, it is always increasing. Nature is constantly moving from less probable states to more probable ones. The second law of thermodynamics states that the entropy of the universe will increase for any spontaneous process.
To determine whether a given process is spontaneous, it is often helpful to break down the total entropy change as follows:


 ΔS_{univ} = ΔS_{sys} + ΔS_{surr}

where ΔS_{sys} and ΔS_{surr} represent the changes in entropy that occur in the system and in the surroundings, respectively.
To predict whether a given reaction will be spontaneous, we need to know the sign of ΔS_{univ}. If ΔS_{univ} is positive, the entropy of the universe increases, and the reaction is spontaneous in the direction that it is written. If ΔS_{univ} is negative, the reaction is spontaneous in the reverse direction. If ΔS_{univ} is equal to zero, the system is at equilibrium. To predict whether a reaction is spontaneous, we need to determine the entropy changes in the system and in the surroundings.
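The sign rules above can be summarized in a short Python sketch (the function name and tolerance are illustrative choices, not standard notation):

```python
def direction(delta_s_univ, tol=1e-9):
    """Classify a process by the sign of ΔS_univ (in J/K·mol)."""
    if delta_s_univ > tol:
        return "spontaneous as written"
    if delta_s_univ < -tol:
        return "spontaneous in reverse"
    return "at equilibrium"

print(direction(25.0))   # spontaneous as written
print(direction(-10.0))  # spontaneous in reverse
print(direction(0.0))    # at equilibrium
```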
Entropy of a System (ΔS_{sys})
Let’s consider the change of state for one mole of water from liquid to gas:


 H_{2}O_{(l)} → H_{2}O_{(g)}

In this case, the water is the system, and the surroundings are everything else. How does the entropy of the water change in this process? As we saw earlier, the vaporization process leads to an increase in entropy, because there are many more possible ways to arrange the individual water molecules when they are allowed to move freely about their container. The entropy of the system increases, so ΔS_{sys} will be a positive value.
Phase changes are one type of process for which we can reliably predict the sign of the change in entropy. Not all transformations are quite so obvious, but the direction of the change in entropy can be easily predicted for certain types of chemical reactions.
 If there is a difference in the number of gaseous components between the reactants and the products, the side with more moles of gas will most likely have a higher entropy value. This is because a greater number of moles indicates a greater number of gas particles and a greater number of arrangements of the gas particles. For example, consider the following reaction:
 2 H_{2}(g) + O_{2}(g) → 2 H_{2}O(g)
 Three moles of gaseous reactants combine to make two moles of gaseous products. Therefore, we would expect this process to result in a decrease in entropy. ΔS_{sys} will have a negative value for this reaction.
 Dissolved substances have a higher entropy value than their corresponding precipitate. For example, if we mix silver nitrate with sodium chloride, we have the following reaction:
 AgNO_{3}(aq) + NaCl(aq) → NaNO_{3}(aq) + AgCl(s)
 The ions from the aqueous components are free to move around the entire solution, but the silver and chloride ions in the solid AgCl are all clustered together into a precipitate, ordered into a specific pattern of alternating cations and anions. Overall, this transformation represents a decrease in the entropy of the system.
 All else being equal, more separate particles corresponds to a higher degree of entropy. This makes sense when we think about the arrangements available. Consider the following reaction:
 C(s) + O_{2}(g) → CO_{2}(g)
 Each side of this equation contains one mole of gas particles, so that will not be a deciding factor. However, there are more total particles on the reactants side than on the products side. Because there are more ways to arrange two moles of particles than one mole of particles, this process represents an overall decrease in entropy.
 If there is an increase in temperature, entropy will increase.
 So far, we have been thinking about entropy in terms of the ways in which particles can be distributed over a certain amount of space. However, other factors that are subject to random distributions also make contributions to the entropy of a system. As you know, an increase in temperature means that there is more overall kinetic energy available to the individual particles. This energy is distributed randomly through enormous amounts of collisions between particles. Having more energy available means that there are more ways that it can be distributed, so an increase in temperature also corresponds to an increase in entropy.
Entropy of the Surroundings (ΔS_{surr})
In general, the process of interest is taking place in the system, and there are no changes in the composition of the surroundings. However, the temperature of the surroundings does generally change. Entropy changes in the surroundings are determined primarily by the flow of heat into or out of the system. In an exothermic process, heat flows into the surroundings, increasing the kinetic energy of the nearby particles. For an exothermic reaction, ΔS_{surr} is positive. Conversely, heat flows from the surroundings into the system during an endothermic process, lowering the kinetic energy available to the surroundings and resulting in a negative value for ΔS_{surr}.
As it turns out, the amount of entropy change for a given amount of heat transfer also depends on the absolute temperature. We will not go into the exact derivation, but it turns out that the entropy change of the surroundings can be defined in terms of the enthalpy change of the system:



\begin{align*}\Delta S_{surr}=-\frac{\Delta H_{sys}}{T}\end{align*}

where T is the temperature in Kelvin. For an exothermic reaction, ΔH_{sys} is negative, so ΔS_{surr} would be a positive value. This makes sense, because heat is being released into the surroundings, increasing the amount of kinetic energy available to the surrounding particles. For an endothermic reaction, ΔH_{sys} is positive, so ΔS_{surr} would be a negative value.
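As a quick numeric sketch in Python (using the approximate enthalpy of vaporization of water, about +40,700 J/mol at 373 K, as an illustrative value):

```python
def delta_s_surr(delta_h_sys, temp_k):
    """Entropy change of the surroundings: ΔS_surr = -ΔH_sys / T.
    ΔH_sys in J/mol, T in kelvin; result in J/K·mol."""
    return -delta_h_sys / temp_k

# Vaporizing one mole of water at its boiling point (endothermic),
# so the surroundings lose entropy:
print(round(delta_s_surr(40_700, 373), 1))  # -109.1
```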
Entropy of the Universe (ΔS_{univ})
Substituting this into our earlier equation for ΔS_{univ}, we get the following:



\begin{align*}\Delta S_{univ}=\Delta S_{sys}+\Delta S_{surr}=\Delta S_{sys}-\frac{\Delta H_{sys}}{T}\end{align*}

This is a particularly useful equation, because it allows us to determine whether a process is spontaneous by looking only at the system of interest. It also helps to explain why not all exothermic reactions are spontaneous, and not all reactions that increase the entropy of the system are spontaneous. The enthalpy change, entropy change, and overall temperature all factor into whether a given transformation will proceed spontaneously.
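The interplay of enthalpy, entropy, and temperature can be illustrated with the melting of ice (a minimal Python sketch; ΔH_fus ≈ +6,010 J/mol and ΔS_sys ≈ +22.0 J/K·mol are approximate literature values used here for illustration):

```python
def delta_s_univ(delta_s_sys, delta_h_sys, temp_k):
    """ΔS_univ = ΔS_sys - ΔH_sys/T (entropies in J/K·mol, ΔH in J/mol)."""
    return delta_s_sys - delta_h_sys / temp_k

# Melting ice is endothermic (ΔH_sys > 0) but increases the entropy
# of the system, so its spontaneity depends on the temperature:
for T in (263, 283):  # -10 °C and +10 °C
    s = delta_s_univ(22.0, 6010, T)
    print(T, round(s, 2), "spontaneous" if s > 0 else "not spontaneous")
```

At 263 K the ΔH_sys/T term dominates and ΔS_univ is negative (ice stays frozen); at 283 K the entropy term wins and melting is spontaneous, consistent with everyday experience.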
The Third Law of Thermodynamics
When we discussed enthalpy, we always talked about changes in enthalpy, never about the absolute enthalpy of a substance. Even the standard enthalpy of formation value (ΔH°_{f}) is a measure of the change in enthalpy between a compound and its elements in their standard states. There is no absolute zero for enthalpy, but this is not true for entropy. The third law of thermodynamics says that a perfectly regular crystal at a temperature of 0 K (absolute zero) would have an entropy value of 0.
As the temperature of a perfect crystal increases, its particles start to vibrate slightly around their optimal positions, thus increasing the entropy of the system. The dependence of entropy on temperature varies by substance, so the only temperature at which all crystals have the same entropy is absolute zero. The standard entropy of a substance is a measure of its entropy at 25°C and 1 atm of pressure. Like standard enthalpy of formation values, standard entropies are tabulated for a wide range of substances. However, unlike enthalpy of formation values, all standard entropy values are positive, because the absolute zero for entropy is the most ordered possible state. Additionally, this means that pure elements in their standard states do not have a standard entropy of zero.
Because entropy changes are generally small compared to enthalpy changes, we generally express their units in terms of joules instead of kilojoules. Standard entropy values are most commonly given in units of J/K•mol. A few representative values are given in the following table:
Substance               Standard Entropy S° (J/K•mol)

H_{2}O_{(l)}            69.95
H_{2}O_{(g)}            188.84
carbon (graphite)       5.6
carbon (diamond)        2.377
carbon (vapor)          158.1
methane, CH_{4}(g)      186.26
ethane, C_{2}H_{6}(g)   229.2
propane, C_{3}H_{8}(g)  270.3
Note: When referring to standard entropy and standard enthalpy (heat) of formation, we use the notation with the degree symbol to indicate the standard conditions of 25°C and 1 atm. Without the degree symbol, these values are not necessarily measured at the standard state.
As expected, the entropy values for solids are low, the values for gases are high, and the ones for liquids are intermediate. Another observation can be made by looking at the three hydrocarbon gases at the end of the table. For similar molecules, a higher molecular weight generally leads to a larger standard entropy value. Although this is a drastic oversimplification, we can think of this in terms of the electrons that make up each molecule. A larger molecular weight generally means more protons, which also means more electrons. There are more ways to arrange a large number of electrons within a molecule than there are to arrange a smaller number. Although these arrangements are heavily constrained by the positions of the various nuclei, there is still an overall trend for larger molecules to have higher entropy values.
Calculating ΔS_{rxn}
Calculations of the change in entropy for a given reaction are analogous to those used to determine ΔH_{rxn}. The entropy change for a reaction can be calculated by taking the difference between the total of the standard entropy values of the products and those of the reactants:



\begin{align*}\mathrm{\Delta S ^\circ_{rxn}=\Sigma n S^\circ(products) - \Sigma n S^\circ(reactants)}\end{align*}

As with our enthalpy calculations, each standard entropy value is multiplied by the coefficient of the corresponding substance in the balanced equation. Extensive tables of standard entropy values can be found on the internet.
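This bookkeeping is easy to express in Python. The sketch below (the function name and the `(coefficient, S°)` pair format are illustrative choices) applies the formula to the vaporization of water, using the S° values from the table above:

```python
def delta_s_rxn(products, reactants):
    """ΔS°_rxn = Σ n·S°(products) - Σ n·S°(reactants), in J/K·mol.
    Each side is a list of (coefficient, S°) pairs from the balanced equation."""
    total = lambda side: sum(n * s for n, s in side)
    return total(products) - total(reactants)

# H2O(l) -> H2O(g): S° = 69.95 (liquid) and 188.84 (gas) J/K·mol
print(round(delta_s_rxn([(1, 188.84)], [(1, 69.95)]), 2))  # 118.89
```

The large positive result matches our earlier qualitative prediction that vaporization increases the entropy of the system.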
Example 20.1
Calculate ΔS_{rxn} for the following reaction:



\begin{align*}N_2(g)+3H_2(g) \to 2NH_3(g)\end{align*}
N2(g)+3H2(g)→2NH3(g)

\begin{align*}N_2(g)+3H_2(g) \to 2NH_3(g)\end{align*}

The standard entropy values for N_{2}, H_{2}, and NH_{3} are 191.6, 130.7, and 192.5 J/K･mol, respectively.
Answer:
Multiply the standard entropy value of each component by its coefficient from the balanced equation, and subtract the values of the reactants from those of the products.



\begin{align*}\mathrm{\Delta S ^\circ_{rxn}=\Sigma n S^\circ(products) - \Sigma n S^\circ(reactants)}\end{align*}
\begin{align*}\mathrm{\Delta S ^\circ_{rxn}=2S^\circ(NH_3)-[S^\circ(N_2)+3S^\circ(H_2)]}\end{align*}
\begin{align*}\mathrm{\Delta S ^\circ_{rxn}=2(192.5 \ J/K \cdot mol)-[191.6 \ J/K \cdot mol+3(130.7 \ J/K \cdot mol)]}\end{align*}
\begin{align*}\mathrm{\Delta S ^\circ_{rxn}=-198.7 \ J/K \cdot mol}\end{align*}

There is a substantial decrease in entropy over the course of this reaction. This could have been predicted simply by looking at the balanced equation. There are four moles of gaseous reactants and just two moles of gaseous products. In general, the side of the equation with more moles of gas has a higher total entropy.
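As a quick arithmetic check, the same calculation in Python (using the S° values quoted in the example):

```python
# S° values from Example 20.1, in J/K·mol
s_nh3, s_n2, s_h2 = 192.5, 191.6, 130.7

# ΔS°_rxn = 2·S°(NH3) - [S°(N2) + 3·S°(H2)]
ds = 2 * s_nh3 - (s_n2 + 3 * s_h2)
print(round(ds, 1))  # -198.7: entropy decreases, as the mole count of gas predicts
```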
Example 20.2
Predict whether the change in entropy would be positive or negative for the following reaction:


 \begin{align*}CaCO_3(s) \to CaO(s)+CO_2(g)\end{align*}

Then, use the standard entropy values for each substance to calculate the exact change in entropy.
Answer:
A single solid reactant is splitting into two products, one of which is a gas. We would expect the entropy of the system to increase over the course of this reaction. The exact change can be calculated as follows:


 \begin{align*}\mathrm{\Delta S ^\circ_{rxn}=\Sigma n S^\circ(products) - \Sigma n S^\circ(reactants)}\end{align*}
 \begin{align*}\mathrm{\Delta S ^\circ_{rxn}=[S^\circ(CaO)+S^\circ(CO_2)]-S^\circ(CaCO_3)}\end{align*}
 \begin{align*}\mathrm{\Delta S ^\circ_{rxn}=[39.8 \ J/K \cdot mol + 213.6 \ J/K \cdot mol]-92.9 \ J/K \cdot mol}\end{align*}
 \begin{align*}\mathrm{\Delta S ^\circ_{rxn}=160.5 \ J/K \cdot mol}\end{align*}

As predicted, ΔS_{rxn} is a positive value, indicating that entropy increases upon going from reactants to products.
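The arithmetic can again be checked with a one-line Python computation (using the S° values given in the example):

```python
# S° values from Example 20.2, in J/K·mol
s_caco3, s_cao, s_co2 = 92.9, 39.8, 213.6

# ΔS°_rxn = [S°(CaO) + S°(CO2)] - S°(CaCO3)
ds = (s_cao + s_co2) - s_caco3
print(round(ds, 1))  # 160.5: entropy increases, as predicted
```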
Lesson Summary
 A process is spontaneous if the formation of products is favored under the given conditions. Spontaneous processes may be fast or slow.
 Entropy (S) measures the probability of attaining a given state. Things naturally progress towards more probable states, so entropy has a tendency to increase.
 The second law of thermodynamics states that the entropy of the universe will increase during any spontaneous process.
 The change in the entropy of the universe can be broken down into the following components:


 ΔS_{univ} = ΔS_{sys} + ΔS_{surr}

 The sign of ΔS_{univ} tells us whether or not a process is spontaneous in the direction that it is written. If ΔS_{univ} is positive, then the forward reaction is spontaneous.
 The third law of thermodynamics states that the entropy of a perfect crystal at 0 K is zero.
 Standard entropy is the entropy of a substance at 25°C and 1 atm of pressure.
 The entropy change for a reaction can be calculated using standard entropy values.
Lesson Review Questions
 Define entropy.
 State the second law of thermodynamics.
 For each of the following situations, state whether there is an increase or decrease in entropy. Explain your reasoning in each case.
 liquid water freezes.
 a car is in a collision that completely demolishes it.
 wood burns.
 I_{2}(s) → I_{2}(g).
 2 Mg(s) + O_{2}(g) → 2 MgO(s).
 Use the entropy values from
 http://chemed.chem.wisc.edu/chempaths/TableofStandardMolarEntropies1184.html
 for the following calculations:
 Calculate ΔS°_{rxn} for the reaction H_{2}(g) + Cl_{2}(g) → 2HCl(g).
 Calculate ΔS°_{rxn} for reactions d and e of question 3, and comment on how well your predictions matched the actual entropy changes.
 Which form of carbon has a higher degree of organization: diamond or graphite? Explain your answer.
 Compare the S° of ethane with that of ethanol (159.9 J/K•mol). Explain why ethane has a higher standard entropy value than ethanol. (Hint: the boiling point of ethane is −89°C, and the boiling point of ethanol is 78°C. At what temperature are standard entropy values tabulated?)
Further Reading/Supplementary Links
 Some basic ideas about entropy: http://entropysimple.oxy.edu/content.htm
 Entropy values: http://boomeria.org/chemtextbook/cch20.html
 Table of standard molar entropies: http://chemed.chem.wisc.edu/chempaths/TableofStandardMolarEntropies1184.html
Points to Consider
 Can we reliably predict under what conditions a reaction will be spontaneous?