  • 39

    Entropy

    MatterEnergyHeatEntropy

Entropy (S) is a state function that can be related to the number of microstates for a system (the number of ways the system can be arranged) and to the ratio of reversible heat to kelvin temperature. It may be interpreted as a measure of the dispersal or distribution of matter and/or energy in a system, and it is often described as representing the “disorder” of the system. For a given substance, entropy depends on phase, with Ssolid < Sliquid < Sgas. For different substances in the same physical state at a given temperature, entropy is typically greater for heavier atoms or more complex molecules. Entropy increases when a system is heated and when solutions form. Using these guidelines, the sign of entropy changes for some chemical reactions and physical changes may be reliably predicted. The second law of thermodynamics states that a spontaneous process increases the entropy of the universe, ΔSuniv > 0. If ΔSuniv < 0, the process is nonspontaneous, and if ΔSuniv = 0, the system is at equilibrium. The third law of thermodynamics establishes the zero for entropy as that of a perfect, pure crystalline solid at 0 K. With only one possible microstate, the entropy is zero. We may compute the standard entropy change for a process by using standard entropy values for the reactants and products involved in the process.

    39.1 Entropy

    Learning Objectives

    By the end of this section, you will be able to:

    • Define entropy
    • Explain the relationship between entropy and the number of microstates
    • Predict the sign of the entropy change for chemical and physical processes

In 1824, at the age of 28, Nicolas Léonard Sadi Carnot (Figure 39.1) published the results of an extensive study regarding the efficiency of steam heat engines. A later review of Carnot’s findings by Rudolf Clausius introduced a new thermodynamic property that relates the spontaneous heat flow accompanying a process to the temperature at which the process takes place. This new property was expressed as the ratio of the reversible heat (qrev) and the kelvin temperature (T). In thermodynamics, a reversible process is one that takes place at such a slow rate that it is always at equilibrium and its direction can be changed (it can be “reversed”) by an infinitesimally small change in some condition. Note that the idea of a reversible process is a formalism required to support the development of various thermodynamic concepts; no real processes are truly reversible, and real processes are instead classified as irreversible.

    Figure 39.1

(a) Nicolas Léonard Sadi Carnot’s research into steam-powered machinery and (b) Rudolf Clausius’s later study of those findings led to groundbreaking discoveries about spontaneous heat flow processes.

A portrait of Rudolf Clausius is shown.

    Similar to other thermodynamic properties, this new quantity is a state function, so its change depends only upon the initial and final states of a system. In 1865, Clausius named this property entropy (S) and defined its change for any process as the following:

ΔS = qrev/T

    The entropy change for a real, irreversible process is then equal to that for the theoretical reversible process that involves the same initial and final states.

    Entropy and Microstates

    Following the work of Carnot and Clausius, Ludwig Boltzmann developed a molecular-scale statistical model that related the entropy of a system to the number of microstates (W) possible for the system. A microstate is a specific configuration of all the locations and energies of the atoms or molecules that make up a system. The relation between a system’s entropy and the number of possible microstates is

S = k ln W

where k is the Boltzmann constant, 1.38 × 10⁻²³ J/K.

    As for other state functions, the change in entropy for a process is the difference between its final (Sf) and initial (Si) values:

ΔS = Sf − Si = k ln Wf − k ln Wi = k ln(Wf/Wi)

    For processes involving an increase in the number of microstates, Wf > Wi, the entropy of the system increases and ΔS > 0. Conversely, processes that reduce the number of microstates, Wf < Wi, yield a decrease in system entropy, ΔS < 0. This molecular-scale interpretation of entropy provides a link to the probability that a process will occur as illustrated in the next paragraphs.
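Because the Boltzmann relation involves only the ratio of microstate counts, it is easy to evaluate numerically. Here is a minimal Python sketch (our own illustration; the function name delta_S is not from the text) that computes ΔS = k ln(Wf/Wi):

    import math

    k = 1.38e-23  # Boltzmann constant, J/K

    def delta_S(W_final, W_initial):
        """Entropy change for a change in the number of microstates."""
        return k * math.log(W_final / W_initial)

    # A six-fold increase in microstates (cf. Example 39.1 below):
    print(delta_S(6, 1))  # ~2.47e-23 J/K, positive as expected

Any increase in the number of microstates (Wf > Wi) gives a positive result, and any decrease gives a negative one.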

Consider the general case of a system comprised of N particles distributed among n boxes. The number of microstates possible for such a system is nᴺ. For example, distributing four particles among two boxes will result in 2⁴ = 16 different microstates as illustrated in Figure 39.2. Microstates with equivalent particle arrangements (not considering individual particle identities) are grouped together and are called distributions. The probability that a system will exist with its components in a given distribution is proportional to the number of microstates within the distribution. Since entropy increases logarithmically with the number of microstates, the most probable distribution is therefore the one of greatest entropy.

    Figure 39.2

    The sixteen microstates associated with placing four particles in two boxes are shown. The microstates are collected into five distributions—(a), (b), (c), (d), and (e)—based on the numbers of particles in each box.

    Five rows of diagrams that look like dominoes are shown and labeled a, b, c, d, and e. Row a has one “domino” that has four dots on the left side, red, green, blue and yellow in a clockwise pattern from the top left, and no dots on the right. Row b has four “dominos,” each with three dots on the left and one dot on the right. The first shows a “domino” with green, yellow and blue on the left and red on the right. The second “domino” has yellow, blue and red on the left and green on the right. The third “domino” has red, green and yellow on the left and blue on the right while the fourth has red, green and blue on the left and yellow on the right. Row c has six “dominos”, each with two dots on either side. The first has a red and green on the left and a blue and yellow on the right. The second has a red and blue on the left and a green and yellow on the right while the third has a yellow and red on the left and a green and blue on the right. The fourth has a green and blue on the left and a red and yellow on the right. The fifth has a green and yellow on the left and a red and blue on the right. The sixth has a blue and yellow on the left and a green and red on the right. Row d has four “dominos,” each with one dot on the left and three on the right. The first “domino” has red on the left and a blue, green and yellow on the right. The second has a green on the left and a red, yellow and blue on the right. The third has a blue on the left and a red, green and yellow on the right. The fourth has a yellow on the left and a red, green and blue on the right. Row e has 1 “domino” with no dots on the left and four dots on the right that are red, green, blue and yellow.

For this system, the most probable configuration is one of the six microstates associated with distribution (c) where the particles are evenly distributed between the boxes, that is, a configuration of two particles in each box. The probability of finding the system in this configuration is 6/16 or 3/8. The least probable configuration of the system is one in which all four particles are in one box, corresponding to distributions (a) and (e), each with a probability of 1/16. The probability of finding all particles in only one box (either the left box or right box) is then (1/16 + 1/16) = 2/16 or 1/8.
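These probabilities can be verified by brute-force enumeration. The short Python sketch below (an illustration assuming four distinguishable particles and two boxes, as in Figure 39.2) lists the five distributions and their microstate counts:

    from itertools import product
    from collections import Counter

    # All 2**4 = 16 microstates: each particle is in the left (L)
    # or right (R) box.
    microstates = list(product("LR", repeat=4))

    # Group microstates into distributions by the number of particles
    # in the left box (particle identities ignored).
    distributions = Counter(state.count("L") for state in microstates)

    for n_left in sorted(distributions):
        count = distributions[n_left]
        print(f"{n_left} in left box: {count}/{len(microstates)}")

The output reproduces Figure 39.2: counts of 1, 4, 6, 4, and 1 microstates, so the even split occurs with probability 6/16, and each all-in-one-box arrangement occurs with probability 1/16.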

As you add more particles to the system, the number of possible microstates increases exponentially (2ᴺ). A macroscopic (laboratory-sized) system would typically consist of moles of particles (N ≈ 10²³), and the corresponding number of microstates would be staggeringly huge. Regardless of the number of particles in the system, however, the distributions in which roughly equal numbers of particles are found in each box are always the most probable configurations.
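Even so, the entropy implied by such astronomical counts remains modest, because the logarithm tames the exponential. A quick back-of-the-envelope check in Python (assuming N = 10²³ particles and two boxes, so W = 2ᴺ):

    import math

    k = 1.38e-23  # Boltzmann constant, J/K
    N = 1e23      # number of particles

    # W = 2**N overflows ordinary floats, but S = k ln(2**N) = k*N*ln 2.
    S = k * N * math.log(2)
    print(f"S = {S:.2f} J/K")  # ~0.96 J/K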

    This matter dispersal model of entropy is often described qualitatively in terms of the disorder of the system. By this description, microstates in which all the particles are in a single box are the most ordered, thus possessing the least entropy. Microstates in which the particles are more evenly distributed among the boxes are more disordered, possessing greater entropy.

    The previous description of an ideal gas expanding into a vacuum (Figure 39.3) is a macroscopic example of this particle-in-a-box model. For this system, the most probable distribution is confirmed to be the one in which the matter is most uniformly dispersed or distributed between the two flasks. Initially, the gas molecules are confined to just one of the two flasks. Opening the valve between the flasks increases the volume available to the gas molecules and, correspondingly, the number of microstates possible for the system. Since Wf > Wi, the expansion process involves an increase in entropy (ΔS > 0) and is spontaneous.

    Figure 39.3

    An isolated system consists of an ideal gas in one flask that is connected by a closed valve to a second flask containing a vacuum. Once the valve is opened, the gas spontaneously becomes evenly distributed between the flasks.

A diagram shows two pairs of flasks connected by a right-facing arrow labeled “Spontaneous” and a left-facing arrow labeled “Nonspontaneous.” Each pair of flasks is connected by a tube with a stopcock. In the left pair of flasks, the left flask contains thirty particles evenly dispersed while the right flask contains nothing and the stopcock is closed. The right pair of flasks has an open stopcock and equal numbers of particles in both flasks.

A similar approach may be used to describe the spontaneous flow of heat. Consider a system consisting of two objects, each containing two particles, and two units of thermal energy (represented as “*”) in Figure 39.4. The hot object is comprised of particles A and B and initially contains both energy units. The cold object is comprised of particles C and D and initially has no energy units. Distribution (a) shows the three microstates possible for the initial state of the system, with both units of energy contained within the hot object. If one of the two energy units is transferred, the result is distribution (b), consisting of four microstates. If both energy units are transferred, the result is distribution (c), consisting of three microstates. Thus, we may describe this system by a total of ten microstates. The probability that the heat does not flow when the two objects are brought into contact, that is, that the system remains in distribution (a), is 3/10. More likely is the flow of heat to yield one of the other two distributions, the combined probability being 7/10. The most likely result is the flow of heat to yield the uniform dispersal of energy represented by distribution (b), the probability of this configuration being 4/10. This supports the common observation that placing hot and cold objects in contact results in spontaneous heat flow that ultimately equalizes the objects’ temperatures. And, again, this spontaneous process is also characterized by an increase in system entropy.
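The ten microstates and the 3/10, 4/10, 3/10 split can be confirmed by enumeration. Here is a Python sketch (assuming, as in Figure 39.4, two indistinguishable energy units shared among particles A and B in the hot object and C and D in the cold object):

    from itertools import combinations_with_replacement

    hot = {"A", "B"}  # particles in the hot object

    # Each microstate assigns the two indistinguishable energy units
    # to two (possibly identical) particles: 10 microstates in all.
    microstates = list(combinations_with_replacement("ABCD", 2))

    for units_in_hot in (2, 1, 0):  # distributions (a), (b), (c)
        count = sum(sum(p in hot for p in s) == units_in_hot
                    for s in microstates)
        print(f"{units_in_hot} units in hot object: "
              f"{count}/{len(microstates)}")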

    Figure 39.4

This shows a microstate model describing the flow of heat from a hot object to a cold object. (a) Before the heat flow occurs, the object comprised of particles A and B contains both units of energy, as represented by a distribution of three microstates. (b) If the heat flow results in an even dispersal of energy (one energy unit transferred), a distribution of four microstates results. (c) If both energy units are transferred, the resulting distribution has three microstates.

    Three rows labeled a, b, and c are shown and each contains rectangles with two sides where the left side is labeled, “A,” and “B,” and the right is labeled, “C,” and “D.” Row a has three rectangles where the first has a dot above and below the letter A, the second has a dot above the A and B, and the third which has a dot above and below the letter B. Row b has four rectangles; the first has a dot above A and C, the second has a dot above A and D, the third has a dot above B and C and the fourth has a dot above B and D. Row c has three rectangles; the first has a dot above and below the letter C, the second has a dot above C and D and the third has a dot above and below the letter D.

    Example 39.1

    Determination of ΔS

Calculate the change in entropy for the process depicted below.

A diagram shows one rectangle with two sides that has four dots, red, green, yellow and blue written on the left side. A right-facing arrow leads to six more two-sided rectangles, each with two dots on the left and right sides. The first rectangle has a red and green dot on the left and a blue and yellow on the right, while the second shows a red and blue on the left and a green and yellow on the right. The third rectangle has a red and yellow dot on the left and a blue and green on the right, while the fourth shows a green and blue on the left and a red and yellow on the right. The fifth rectangle has a yellow and green dot on the left and a blue and red on the right, while the sixth shows a yellow and blue on the left and a green and red on the right.

    Solution

    The initial number of microstates is one, the final six:

ΔS = k ln(Wc/Wa) = 1.38 × 10⁻²³ J/K × ln(6/1) = 2.47 × 10⁻²³ J/K

    The sign of this result is consistent with expectation; since there are more microstates possible for the final state than for the initial state, the change in entropy should be positive.

    Check Your Learning

    Consider the system shown in Figure 39.4. What is the change in entropy for the process where all the energy is transferred from the hot object (AB) to the cold object (CD)?

    0 J/K

    Predicting the Sign of ΔS

    The relationships between entropy, microstates, and matter/energy dispersal described previously allow us to make generalizations regarding the relative entropies of substances and to predict the sign of entropy changes for chemical and physical processes. Consider the phase changes illustrated in Figure 39.5. In the solid phase, the atoms or molecules are restricted to nearly fixed positions with respect to each other and are capable of only modest oscillations about these positions. With essentially fixed locations for the system’s component particles, the number of microstates is relatively small. In the liquid phase, the atoms or molecules are free to move over and around each other, though they remain in relatively close proximity to one another. This increased freedom of motion results in a greater variation in possible particle locations, so the number of microstates is correspondingly greater than for the solid. As a result, Sliquid > Ssolid and the process of converting a substance from solid to liquid (melting) is characterized by an increase in entropy, ΔS > 0. By the same logic, the reciprocal process (freezing) exhibits a decrease in entropy, ΔS < 0.

    Figure 39.5

    The entropy of a substance increases (ΔS > 0) as it transforms from a relatively ordered solid, to a less-ordered liquid, and then to a still less-ordered gas. The entropy decreases (ΔS < 0) as the substance transforms from a gas to a liquid and then to a solid.

    Three stoppered flasks are shown with right and left-facing arrows in between each; the first is labeled above as, “delta S greater than 0,” and below as, “delta S less than 0,” while the second is labeled above as, “delta S greater than 0,” and below as, “delta S less than 0.” A long, right-facing arrow is drawn above all the flasks and labeled, “Increasing entropy.” The left flask contains twenty-seven particles arranged in a cube in the bottom of the flask and is labeled, “Crystalline solid,” below. The middle flask contains twenty-seven particles dispersed randomly in the bottom of the flask and is labeled, “Liquid,” below. The right flask contains twenty-seven particles dispersed inside of the flask and moving rapidly and is labeled, “Gas,” below.

    Now consider the gaseous phase, in which a given number of atoms or molecules occupy a much greater volume than in the liquid phase. Each atom or molecule can be found in many more locations, corresponding to a much greater number of microstates. Consequently, for any substance, Sgas > Sliquid > Ssolid, and the processes of vaporization and sublimation likewise involve increases in entropy, ΔS > 0. Likewise, the reciprocal phase transitions, condensation and deposition, involve decreases in entropy, ΔS < 0.

    According to kinetic-molecular theory, the temperature of a substance is proportional to the average kinetic energy of its particles. Raising the temperature of a substance will result in more extensive vibrations of the particles in solids and more rapid translations of the particles in liquids and gases. At higher temperatures, the distribution of kinetic energies among the atoms or molecules of the substance is also broader (more dispersed) than at lower temperatures. Thus, the entropy for any substance increases with temperature (Figure 39.6).

    Figure 39.6

    Entropy increases as the temperature of a substance is raised, which corresponds to the greater spread of kinetic energies. When a substance undergoes a phase transition, its entropy changes significantly.

Two graphs are shown. The y-axis of the left graph is labeled, “Fraction of molecules,” while the x-axis is labeled, “Velocity, v ( m / s ),” and has values of 0 through 1,500 along the axis with increments of 500. Four lines are plotted on this graph. The first, labeled, “100 K,” peaks around 200 m / s while the second, labeled, “200 K,” peaks near 300 m / s and is slightly lower on the y-axis than the first. The third line, labeled, “500 K,” peaks around 550 m / s and is lower than the first two on the y-axis. The fourth line, labeled, “1000 K,” peaks around 750 m / s and is the lowest of the four on the y-axis. Each line gets increasingly broad. The second graph has a y-axis labeled, “Entropy, S,” with an upward-facing arrow and an x-axis labeled, “Temperature ( K ),” and a right-facing arrow. The graph has three equally spaced columns in the background, labeled, “Solid,” “Liquid,” and, “Gas,” from left to right. A line extends in a slight upward direction through the first column, then goes straight up at the transition between the first two columns. It then progresses in a slight upward direction through the second column, then goes up dramatically between the second and third columns, then continues in a slight upward direction once more. The first vertical region of this line is labeled, “Melting,” and the second is labeled, “Boiling.”

    The entropy of a substance is influenced by the structure of the particles (atoms or molecules) that comprise the substance. With regard to atomic substances, heavier atoms possess greater entropy at a given temperature than lighter atoms, which is a consequence of the relation between a particle’s mass and the spacing of quantized translational energy levels (a topic beyond the scope of this text). For molecules, greater numbers of atoms increase the number of ways in which the molecules can vibrate and thus the number of possible microstates and the entropy of the system.

Finally, variations in the types of particles affect the entropy of a system. Compared to a pure substance, in which all particles are identical, the entropy of a mixture of two or more different particle types is greater. This is because of the additional orientations and interactions that are possible in a system comprised of nonidentical components. For example, when a solid dissolves in a liquid, the particles of the solid experience both a greater freedom of motion and additional interactions with the solvent particles. This corresponds to a more uniform dispersal of matter and energy and a greater number of microstates. The process of dissolution therefore involves an increase in entropy, ΔS > 0.

    Considering the various factors that affect entropy allows us to make informed predictions of the sign of ΔS for various chemical and physical processes as illustrated in Example 39.2.

    Example 39.2

    Predicting the Sign of ∆S

    Predict the sign of the entropy change for the following processes. Indicate the reason for each of your predictions.

(a) One mole liquid water at room temperature ⟶ one mole liquid water at 50 °C

(b) Ag⁺(aq) + Cl⁻(aq) ⟶ AgCl(s)

(c) C6H6(l) + 15/2 O2(g) ⟶ 6CO2(g) + 3H2O(l)

(d) NH3(s) ⟶ NH3(l)

    Solution

    (a) positive, temperature increases

    (b) negative, reduction in the number of ions (particles) in solution, decreased dispersal of matter

    (c) negative, net decrease in the amount of gaseous species

    (d) positive, phase transition from solid to liquid, net increase in dispersal of matter

    Check Your Learning

    Predict the sign of the entropy change for the following processes. Give a reason for your prediction.

(a) NaNO3(s) ⟶ Na⁺(aq) + NO3⁻(aq)

(b) the freezing of liquid water

(c) CO2(s) ⟶ CO2(g)

(d) CaCO3(s) ⟶ CaO(s) + CO2(g)

    (a) Positive; The solid dissolves to give an increase of mobile ions in solution. (b) Negative; The liquid becomes a more ordered solid. (c) Positive; The relatively ordered solid becomes a gas. (d) Positive; There is a net increase in the amount of gaseous species.
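Several of these predictions turn on a single bookkeeping step: the net change in moles of gas. The Python sketch below (a deliberately crude heuristic of our own, not a rule stated in the text) automates just that one guideline:

    def predict_dS_sign(gas_moles_reactants, gas_moles_products):
        """Guess the sign of dS from the net change in moles of gas."""
        net = gas_moles_products - gas_moles_reactants
        if net > 0:
            return "positive"
        if net < 0:
            return "negative"
        return "no prediction from gas moles alone"

    # CaCO3(s) -> CaO(s) + CO2(g): 0 -> 1 mol gas
    print(predict_dS_sign(0, 1))    # positive
    # C6H6(l) + 15/2 O2(g) -> 6 CO2(g) + 3 H2O(l): 7.5 -> 6 mol gas
    print(predict_dS_sign(7.5, 6))  # negative

When the gas count does not change, the other factors discussed above (phase, temperature, mixing, molecular complexity) must break the tie.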

    Link to Supplemental Exercises

    Supplemental exercises are available if you would like more practice with these concepts.

    39.2 The Second and Third Laws of Thermodynamics

    Learning Objectives

    By the end of this section, you will be able to:

    • State and explain the second and third laws of thermodynamics
    • Calculate entropy changes for phase transitions and chemical reactions under standard conditions

    The Second Law of Thermodynamics

In the quest to identify a property that may reliably predict the spontaneity of a process, a promising candidate has been identified: entropy. Processes that involve an increase in entropy of the system (ΔS > 0) are very often spontaneous; however, examples to the contrary are plentiful. By expanding consideration of entropy changes to include the surroundings, we may reach a significant conclusion regarding the relation between this property and spontaneity. In thermodynamic models, the system and surroundings comprise everything, that is, the universe, and so the following is true:

ΔSuniv = ΔSsys + ΔSsurr

    To illustrate this relation, consider again the process of heat flow between two objects, one identified as the system and the other as the surroundings. There are three possibilities for such a process:

    1. The objects are at different temperatures, and heat flows from the hotter to the cooler object. This is always observed to occur spontaneously. Designating the hotter object as the system and invoking the definition of entropy yields the following:
ΔSsys = −qrev/Tsys and ΔSsurr = qrev/Tsurr
      The magnitudes of −qrev and qrev are equal, their opposite arithmetic signs denoting loss of heat by the system and gain of heat by the surroundings. Since Tsys > Tsurr in this scenario, the entropy decrease of the system will be less than the entropy increase of the surroundings, and so the entropy of the universe will increase:
|ΔSsys| < |ΔSsurr|
ΔSuniv = ΔSsys + ΔSsurr > 0
    2. The objects are at different temperatures, and heat flows from the cooler to the hotter object. This is never observed to occur spontaneously. Again designating the hotter object as the system and invoking the definition of entropy yields the following:
ΔSsys = qrev/Tsys and ΔSsurr = −qrev/Tsurr
      The arithmetic signs of qrev denote the gain of heat by the system and the loss of heat by the surroundings. The magnitude of the entropy change for the surroundings will again be greater than that for the system, but in this case, the signs of the heat changes (that is, the direction of the heat flow) will yield a negative value for ΔSuniv. This process involves a decrease in the entropy of the universe.
3. The objects are at essentially the same temperature, Tsys ≈ Tsurr, and so the magnitudes of the entropy changes are essentially the same for both the system and the surroundings. In this case, the entropy change of the universe is zero, and the system is at equilibrium (all three cases are illustrated numerically in the sketch following this list).
|ΔSsys| ≈ |ΔSsurr|
ΔSuniv = ΔSsys + ΔSsurr = 0
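A small numerical sketch in Python makes the three cases concrete (the temperatures and the 100 J of heat are arbitrary illustrative choices; q is the reversible heat leaving the system):

    def dS_universe(q, T_sys, T_surr):
        """dSuniv = dSsys + dSsurr = -q/T_sys + q/T_surr."""
        return -q / T_sys + q / T_surr

    # Case 1: hot system (500 K) loses 100 J to cooler surroundings (300 K)
    print(dS_universe(100, 500, 300))   # +0.13 J/K -> spontaneous
    # Case 2: hot system (500 K) gains 100 J from cooler surroundings
    print(dS_universe(-100, 500, 300))  # -0.13 J/K -> nonspontaneous
    # Case 3: equal temperatures
    print(dS_universe(100, 400, 400))   # 0.0 -> equilibrium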

    These results lead to a profound statement regarding the relation between entropy and spontaneity known as the second law of thermodynamics: all spontaneous changes cause an increase in the entropy of the universe. A summary of these three relations is provided in Table 39.1.

    Table 39.1

    The Second Law of Thermodynamics

ΔSuniv > 0: spontaneous
ΔSuniv < 0: nonspontaneous (spontaneous in opposite direction)
ΔSuniv = 0: at equilibrium

    For many realistic applications, the surroundings are vast in comparison to the system. In such cases, the heat gained or lost by the surroundings as a result of some process represents a very small, nearly infinitesimal, fraction of its total thermal energy. For example, combustion of a fuel in air involves transfer of heat from a system (the fuel and oxygen molecules undergoing reaction) to surroundings that are infinitely more massive (the earth’s atmosphere). As a result, qsurr is a good approximation of qrev, and the second law may be stated as the following:

ΔSuniv = ΔSsys + ΔSsurr = ΔSsys + qsurr/T

    We may use this equation to predict the spontaneity of a process as illustrated in Example 39.3.

    Example 39.3

    Will Ice Spontaneously Melt?

    The entropy change for the process
H2O(s) ⟶ H2O(l)

    is 22.1 J/K and requires that the surroundings transfer 6.00 kJ of heat to the system. Is the process spontaneous at −10.00 °C? Is it spontaneous at +10.00 °C?

    Solution

    We can assess the spontaneity of the process by calculating the entropy change of the universe. If ΔSuniv is positive, then the process is spontaneous. At both temperatures, ΔSsys = 22.1 J/K and qsurr = −6.00 kJ.

    At −10.00 °C (263.15 K), the following is true:

ΔSuniv = ΔSsys + ΔSsurr = ΔSsys + qsurr/T = 22.1 J/K + (−6.00 × 10³ J)/(263.15 K) = −0.7 J/K

ΔSuniv < 0, so melting is nonspontaneous (not spontaneous) at −10.00 °C.

    At 10.00 °C (283.15 K), the following is true:

ΔSuniv = ΔSsys + qsurr/T = 22.1 J/K + (−6.00 × 10³ J)/(283.15 K) = +0.9 J/K

ΔSuniv > 0, so melting is spontaneous at 10.00 °C.

    Check Your Learning

Using this information, determine if liquid water will spontaneously freeze at the same temperatures. What can you say about the values of ΔSuniv?

    Entropy is a state function, so ΔSfreezing = −ΔSmelting = −22.1 J/K and qsurr = +6.00 kJ. At −10.00 °C spontaneous, +0.7 J/K; at +10.00 °C nonspontaneous, −0.9 J/K.
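All four results follow from the same arithmetic, ΔSuniv = ΔSsys + qsurr/T. A quick Python check of the numbers above (a sketch of the example's arithmetic, not new data):

    def dS_universe(dS_sys, q_surr, T):
        """dS_sys in J/K, q_surr in J, T in K."""
        return dS_sys + q_surr / T

    # Melting: dSsys = +22.1 J/K, surroundings give up 6.00 kJ
    print(dS_universe(22.1, -6.00e3, 263.15))  # ~ -0.7 J/K, nonspontaneous
    print(dS_universe(22.1, -6.00e3, 283.15))  # ~ +0.9 J/K, spontaneous
    # Freezing: all signs reverse
    print(dS_universe(-22.1, 6.00e3, 263.15))  # ~ +0.7 J/K, spontaneous
    print(dS_universe(-22.1, 6.00e3, 283.15))  # ~ -0.9 J/K, nonspontaneous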

    The Third Law of Thermodynamics

The previous section described the various ways in which the dispersal of matter and energy contributes to the entropy of a system. With these contributions in mind, consider the entropy of a pure, perfectly crystalline solid possessing no kinetic energy (that is, at a temperature of absolute zero, 0 K). This system may be described by a single microstate, as its purity, perfect crystallinity, and complete lack of motion mean there is but one possible location for each identical atom or molecule comprising the crystal (W = 1). According to the Boltzmann equation, the entropy of this system is zero.

S = k ln W = k ln(1) = 0

    This limiting condition for a system’s entropy represents the third law of thermodynamics: the entropy of a pure, perfect crystalline substance at 0 K is zero.

    Careful calorimetric measurements can be made to determine the temperature dependence of a substance’s entropy and to derive absolute entropy values under specific conditions. Standard entropies (S°) are for one mole of substance under standard conditions (a pressure of 1 bar and a temperature of 298.15 K; see details regarding standard conditions in the thermochemistry chapter of this text). The standard entropy change (ΔS°) for a reaction may be computed using standard entropies as shown below:

ΔS° = ΣνS°(products) − ΣνS°(reactants)

    where ν represents stoichiometric coefficients in the balanced equation representing the process. For example, ΔS° for the following reaction at room temperature

mA + nB ⟶ xC + yD,

    is computed as:

ΔS° = [xS°(C) + yS°(D)] − [mS°(A) + nS°(B)]

    A partial listing of standard entropies is provided in Table 39.2, and additional values are provided in OpenStax's Appendix G. The example exercises that follow demonstrate the use of S° values in calculating standard entropy changes for physical and chemical processes.

    Table 39.2

    Standard entropies for selected substances measured at 1 atm and 298.15 K. (Values are approximately equal to those measured at 1 bar, the currently accepted standard state pressure.)

Substance        S° (J mol⁻¹ K⁻¹)

carbon
C(s, graphite)   5.740
C(s, diamond)    2.38
CO(g)            197.7
CO2(g)           213.8
CH4(g)           186.3
C2H4(g)          219.5
C2H6(g)          229.5
CH3OH(l)         126.8
C2H5OH(l)        160.7

hydrogen
H2(g)            130.57
H(g)             114.6
H2O(g)           188.71
H2O(l)           69.91
HCl(g)           186.8
H2S(g)           205.7

oxygen
O2(g)            205.03
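Because ΔS° is just a stoichiometry-weighted difference of tabulated values, the calculation is easy to script. The Python sketch below (our own illustration, using entropy values copied from Table 39.2) reproduces the results of the example exercises that follow:

    # Standard entropies from Table 39.2, J mol^-1 K^-1
    S_STD = {
        "H2O(g)": 188.71, "H2O(l)": 69.91,
        "H2(g)": 130.57, "C2H4(g)": 219.5, "C2H6(g)": 229.5,
    }

    def delta_S_standard(reactants, products):
        """dS° = sum(v*S°(products)) - sum(v*S°(reactants));
        each argument maps a species to its stoichiometric coefficient."""
        total = lambda side: sum(v * S_STD[sp] for sp, v in side.items())
        return total(products) - total(reactants)

    # Example 39.4: H2O(g) -> H2O(l)
    print(delta_S_standard({"H2O(g)": 1}, {"H2O(l)": 1}))  # ~ -118.8 J/K

    # Check Your Learning: H2(g) + C2H4(g) -> C2H6(g)
    print(delta_S_standard({"H2(g)": 1, "C2H4(g)": 1},
                           {"C2H6(g)": 1}))                # ~ -120.6 J/K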

    Example 39.4

    Determination of ΔS°

    Calculate the standard entropy change for the following process:
H2O(g) ⟶ H2O(l)

    Solution

    Calculate the entropy change using standard entropies as shown above:
ΔS° = (1 mol)(70.0 J mol⁻¹ K⁻¹) − (1 mol)(188.8 J mol⁻¹ K⁻¹) = −118.8 J/K

    The value for ΔS° is negative, as expected for this phase transition (condensation), which the previous section discussed.

    Check Your Learning

    Calculate the standard entropy change for the following process:
H2(g) + C2H4(g) ⟶ C2H6(g)

−120.6 J K⁻¹ mol⁻¹

    Example 39.5

    Determination of ΔS°

    Calculate the standard entropy change for the combustion of methanol, CH3OH:
2CH3OH(l) + 3O2(g) ⟶ 2CO2(g) + 4H2O(l)

    Solution

    Calculate the entropy change using standard entropies as shown above:
ΔS° = ΣνS°(products) − ΣνS°(reactants)
= [2 mol × S°(CO2(g)) + 4 mol × S°(H2O(l))] − [2 mol × S°(CH3OH(l)) + 3 mol × S°(O2(g))]
= {[2(213.8) + 4(70.0)] − [2(126.8) + 3(205.03)]} = −161.1 J/K

    Check Your Learning

    Calculate the standard entropy change for the following reaction:
Ca(OH)2(s) ⟶ CaO(s) + H2O(l)

    24.7 J/K

    Link to Supplemental Exercises

    Supplemental exercises are available if you would like more practice with these concepts.


    Previous Citation(s)
    Flowers, P., Neth, E. J., Robinson, W. R., Theopold, K., & Langley, R. (2019). Chemistry in Context. In Chemistry: Atoms First 2e. OpenStax. https://openstax.org/books/chemistry-atoms-first-2e/pages/12-introduction
