
16.2 ENTHALPY, ENTROPY, AND SPONTANEOUS PROCESSES: A BRIEF REVIEW

In the combustion of methane, for example, the potential energy stored in the chemical bonds of CH4 and O2 is partly converted to heat, which flows from the system (reactants plus products) to the surroundings:

CH4(g) + 2 O2(g) → CO2(g) + 2 H2O(l)    ΔH° = -890.3 kJ

Because heat is lost by the system, the reaction is exothermic and the standard enthalpy of reaction is negative (ΔH° = -890.3 kJ). The total energy is conserved, so all the energy lost by the system shows up as heat gained by the surroundings.

Because spontaneous reactions so often give off heat, the nineteenth-century French chemist Marcellin Berthelot proposed that spontaneous chemical or physical changes are always exothermic. But Berthelot's proposal can't be correct. Ice, for example, spontaneously absorbs heat from the surroundings and melts at temperatures above 0 °C. Similarly, liquid water absorbs heat and spontaneously boils at temperatures above 100 °C. As further examples, gaseous N2O4 absorbs heat when it decomposes to NO2 at 400 K, and table salt absorbs heat when it dissolves in water at room temperature:

H2O(s) → H2O(l)              ΔHfusion = +6.01 kJ
H2O(l) → H2O(g)              ΔHvap = +40.7 kJ
N2O4(g) → 2 NO2(g)           ΔH° = +55.3 kJ
NaCl(s) → Na⁺(aq) + Cl⁻(aq)   ΔH° = +3.88 kJ



All these processes are endothermic, yet all are spontaneous. In all cases, the system moves spontaneously to a state of higher enthalpy by absorbing heat from the surroundings.

Because some spontaneous reactions are exothermic and others are endothermic, enthalpy alone can't account for the direction of spontaneous change; a second factor must be involved. This second determinant of spontaneous change is nature's tendency to move to a condition of maximum randomness (Section 8.12).

Molecular randomness is called entropy and is denoted by the symbol S. Entropy is a state function (Section 8.2), and the entropy change ΔS for a process thus depends only on the initial and final states of the system:

ΔS = Sfinal - Sinitial

When the randomness of a system increases, ΔS has a positive value; when randomness decreases, ΔS is negative.

The randomness of a system comes about because the particles in the system (atoms, ions, and molecules) are in incessant motion, moving about in the accessible volume, colliding with each other and continually exchanging energy. Randomness—and thus entropy—is a probability concept, related to the number of ways that a particular state of a system can be achieved. A particular state of a macroscopic system, characterized by its temperature, pressure, volume, and number of particles, can be achieved in a vast number of ways in which the fluctuating positions and energies of the individual particles differ but the volume and total energy are constant.

We'll examine the relationship between entropy and probability in the next section, but first let's take a qualitative look at the four spontaneous endothermic processes mentioned previously (melting of ice, boiling of liquid water, decomposition of N2O4, and dissolving of NaCl in water). Each of these processes involves an increase in the randomness of the system. When ice melts, for example, randomness increases because the highly ordered crystalline arrangement of tightly held water molecules collapses and the molecules become free to move about in the liquid. When liquid water vaporizes, randomness further increases because the molecules can now move independently in the much larger volume of the gas. In general, processes that convert a solid to a liquid or a liquid to a gas involve an increase in randomness and thus an increase in entropy (Figure 16.3).



▲ The combustion of natural gas (mainly CH4) in air is a spontaneous, exothermic reaction.



Remember...
A state function is a function or property whose value depends only on the present state (condition) of the system, not on the path used to arrive at that condition. Pressure, volume, temperature, enthalpy, and entropy are state functions. (Section 8.2)






Figure 16.3  How molecular randomness—and thus entropy—changes when solids, liquids, and gases interconvert. Melting (solid → liquid) and vaporization (liquid → gas) lead to more randomness and more entropy (ΔS > 0); freezing and condensation lead to less randomness and less entropy (ΔS < 0).



The decomposition of N2O4 (O2N-NO2) is accompanied by an increase in randomness because breaking the N-N bond allows the two gaseous NO2 fragments to move independently. Whenever a molecule breaks into two or more pieces, the amount of molecular randomness increases. More specifically, randomness—and thus entropy—increases whenever a reaction results in an increase in the number of gaseous particles (Figure 16.4).

Figure 16.4  How molecular randomness—and thus entropy—changes when the number of gaseous particles changes. A reaction that increases the number of gas particles, such as N2O4 → 2 NO2, gives more randomness and more entropy (ΔS > 0); a reaction that decreases the number of gas particles gives less randomness and less entropy (ΔS < 0).

Remember...
Hydrated ions are surrounded and stabilized by an ordered shell of solvent water molecules. The stabilization results from ion–dipole attractions. (Section 11.2)



The entropy change on dissolving sodium chloride in water occurs because the crystal structure of solid NaCl is disrupted and the Na⁺ and Cl⁻ ions become hydrated (Section 11.2). Disruption of the crystal increases randomness because the Na⁺ and Cl⁻ ions are tightly held in the solid but are free to move about in the liquid. The hydration process, however, decreases randomness because the polar, hydrating water molecules adopt an orderly arrangement about the Na⁺ and Cl⁻ ions. It turns out that the overall dissolution process for NaCl results in a net increase in randomness, and ΔS is thus positive (Figure 16.5). This is usually the case for the dissolution of molecular solids, such as HgCl2, and salts that contain +1 cations and -1 anions. For salts such as CaSO4, which contain more highly charged ions, the hydrating water molecules are more strongly attached to the ions and the dissolution process often results in a net decrease in entropy. The following dissolution reactions illustrate the point:

HgCl2(s) → HgCl2(aq)                ΔS = +9 J/(K·mol)
NaCl(s) → Na⁺(aq) + Cl⁻(aq)          ΔS = +43 J/(K·mol)
CaSO4(s) → Ca²⁺(aq) + SO4²⁻(aq)      ΔS = -140 J/(K·mol)




Figure 16.5  Dissolution of sodium chloride. When NaCl dissolves in water, the crystal breaks up, and the Na⁺ and Cl⁻ ions are surrounded by hydrating water molecules. The polar H2O molecules are oriented such that the partially positive H atoms are near the anions and the partially negative O atoms are near the cations. Disruption of the crystal increases the entropy (ΔS > 0), but the hydration process decreases the entropy (ΔS < 0). For the dissolution of NaCl, the net effect is an entropy increase.



WORKED EXAMPLE 16.1
PREDICTING THE SIGN OF ΔS

Predict the sign of ΔS in the system for each of the following processes:
(a) CO2(s) → CO2(g) (sublimation of dry ice)
(b) CaSO4(s) → CaO(s) + SO3(g)
(c) N2(g) + 3 H2(g) → 2 NH3(g)
(d) I2(s) → I2(aq) (dissolution of iodine in water)

STRATEGY
To predict the sign of ΔS, look to see whether the process involves a phase change, a change in the number of gaseous molecules, or the dissolution (or precipitation) of a solid. Entropy generally increases for phase transitions that convert a solid to a liquid or a liquid to a gas, for reactions that increase the number of gaseous molecules, and for the dissolution of molecular solids or salts with +1 cations and -1 anions.

SOLUTION
(a) The molecules in a gas are free to move about randomly, whereas the molecules in a solid are tightly held in a highly ordered arrangement. Therefore, randomness increases when a solid sublimes and ΔS is positive.
(b) One mole of gaseous molecules appears on the product side of the equation and none appears on the reactant side. Because the reaction increases the number of gaseous molecules, the entropy change is positive.
(c) The entropy change is negative because the reaction decreases the number of gaseous molecules from 4 mol to 2 mol. Fewer particles can move independently after reaction than before.
(d) Iodine molecules are electrically neutral and form a molecular solid. The dissolution process destroys the order of the crystal and enables the iodine molecules to move about randomly in the liquid. Therefore, ΔS is positive.






PROBLEM 16.2  Predict the sign of ΔS in the system for each of the following processes:
(a) H2O(g) → H2O(l) (formation of rain droplets)
(b) I2(g) → 2 I(g)
(c) CaCO3(s) → CaO(s) + CO2(g)
(d) Ag⁺(aq) + Br⁻(aq) → AgBr(s)



WORKED CONCEPTUAL EXAMPLE 16.2
PREDICTING THE SIGN OF ΔS FOR A GAS-PHASE REACTION

Consider the gas-phase reaction of A2 molecules (red) with B atoms (blue):
(a) Write a balanced equation for the reaction.
(b) Predict the sign of ΔS for the reaction.

STRATEGY
To determine the stoichiometry of the reaction, count the number of reactant A2 molecules and B atoms and the number of product AB molecules. To predict the sign of the entropy change, see if the reaction increases or decreases the number of gaseous particles.

SOLUTION
(a) In this reaction, 3 A2 molecules and 6 B atoms are consumed and 6 AB molecules are formed (3 A2 + 6 B → 6 AB). Dividing by 3 to reduce the coefficients to their smallest whole-number values gives the balanced equation A2(g) + 2 B(g) → 2 AB(g).
(b) Because the reaction decreases the number of gaseous particles from 3 mol to 2 mol, the entropy change is negative.

CONCEPTUAL PROBLEM 16.3  Consider the gas-phase reaction of AB3 and A2 molecules:
(a) Write a balanced equation for the reaction.
(b) What is the sign of the entropy change for the reaction?



16.3 ENTROPY AND PROBABILITY

▲ Shaking a box that contains 20 quarters gives a random arrangement of heads and tails.



Why do systems tend to move spontaneously to a state of maximum randomness? The answer is that a random arrangement of particles is more probable than an ordered arrangement because a random arrangement can be achieved in more ways. To begin with a simple example, suppose that you shake a box containing 20 identical coins and then count the number of heads (H) and tails (T). It's very unlikely that all 20 coins will come up heads; that is, a perfectly ordered arrangement of 20 heads (or 20 tails) is much less probable than a random mixture of heads and tails.

The probabilities of the ordered and random arrangements are proportional to the number of ways that the arrangements can be achieved. The perfectly ordered arrangement of 20 heads can be achieved in only one way because it consists of a single configuration. In how many ways, though, can a random arrangement be achieved? If there were just two coins in the box, each of them could come up in one of two ways (H or T), and the two together could come up in 2 × 2 = 2² = 4 ways (HH, HT, TH, or TT). Three coins could come up in 2 × 2 × 2 = 2³ = 8 ways (HHH, THH, HTH, HHT, HTT, THT, TTH, or TTT), and so on. For the case of 20 coins, the number of possible arrangements is 2²⁰ = 1,048,576.

Because the ordered arrangement of 20 heads (or 20 tails) can be achieved in only one way and a random mixture of heads and tails can be achieved in 2²⁰ - 2 ≈ 2²⁰ ways, a random arrangement is 2²⁰ times more probable than a perfectly ordered arrangement. If you begin with an ordered arrangement of 20 heads and shake the box, the system will move to a state with a random mixture of heads and tails because that state is more probable. (Note that the state with a random arrangement of coins includes all possible arrangements except the two perfectly ordered arrangements.)
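This counting is easy to verify numerically. The short Python sketch below is an illustration only (the variable names are our own); it tallies the head/tail arrangements for 20 coins and compares the mixed arrangements with the two perfectly ordered ones.

```python
# Count head/tail arrangements for 20 two-sided coins (illustrative sketch).
n_coins = 20
total = 2 ** n_coins        # 2^20 = 1,048,576 possible H/T sequences
ordered = 2                 # only two perfectly ordered states: all H or all T
mixed = total - ordered     # every remaining, "random" arrangement

print(total)                # 1048576
print(mixed)                # 1048574, essentially 2^20
# A random arrangement is therefore about 2^20 times more probable than the
# single all-heads arrangement.
```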

An analogous chemical example is a crystal containing diatomic molecules such as carbon monoxide in which the two distinct ends of the CO molecule correspond to the heads and tails of a coin. Let's suppose that the long dimensions of the molecules are oriented vertically (Figure 16.6) and that the temperature is 0 K, so that the molecules are locked into a fixed arrangement. The state in which the molecules pack together in a perfectly ordered "heads-up" arrangement (Figure 16.6a) can be achieved in only one way, whereas the state in which the molecules are arranged randomly with respect to the vertical direction can be achieved in many ways—2²⁰ ways for a hypothetical crystal containing 20 CO molecules (Figure 16.6b). Therefore, a structure in which the molecules are arranged randomly is 2²⁰ times more probable than the perfectly ordered heads-up structure.

The Austrian physicist Ludwig Boltzmann proposed that the entropy of a particular state is related to the number of ways that the state can be achieved, according to the formula

S = k ln W

where S is the entropy of the state, ln W is the natural logarithm of the number of ways that the state can be achieved, and k, now known as Boltzmann's constant, is a universal constant equal to the gas constant R divided by Avogadro's number (k = R/NA = 1.38 × 10⁻²³ J/K). Because a logarithm is dimensionless, the Boltzmann equation implies that entropy has the same units as the constant k, joules per kelvin.

Now let's apply Boltzmann's formula to our hypothetical crystal containing 20 CO molecules. Because a perfectly ordered state can be achieved in only one way (W = 1 in the Boltzmann equation) and because ln 1 = 0, the entropy of the perfectly ordered state is zero:

S = k ln W = k ln 1 = 0

The more probable state in which the molecules are arranged randomly can be achieved in 2²⁰ ways and thus has a higher entropy:

S = k ln W = k ln 2²⁰ = (1.38 × 10⁻²³ J/K)(20)(ln 2) = 1.91 × 10⁻²² J/K

where we have made use of the relation ln xᵃ = a ln x (Appendix A.2).
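A few lines of Python reproduce this arithmetic. The sketch below is illustrative only, with Boltzmann's constant typed in by hand:

```python
import math

k = 1.38e-23                 # Boltzmann's constant, J/K
W_ordered = 1                # the perfectly ordered state can be achieved in one way
W_random = 2 ** 20           # ways to achieve the random state for 20 CO molecules

S_ordered = k * math.log(W_ordered)   # k ln 1 = 0 J/K
S_random = k * math.log(W_random)     # k ln 2^20 = 20 k ln 2

print(S_ordered)             # 0.0
print(S_random)              # about 1.91e-22 J/K, matching the value above
```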

If our crystal contained 1 mol of CO molecules, the entropy of the perfectly ordered state (6.02 × 10²³ C atoms up) would still be zero, but the entropy of the state with a random arrangement of CO molecules would be much higher because Avogadro's number of molecules can be arranged randomly in a huge number of ways (W = 2^NA = 2^(6.02 × 10²³)).



Figure 16.6  A hypothetical crystal containing 20 CO molecules. (a) The perfectly ordered "heads-up" structure (20 "heads," 0 "tails"). (b) The molecules arranged randomly in one of the 2²⁰ ways in which a disordered structure can be obtained (here, 9 "heads" and 11 "tails").






According to Boltzmann's formula, the entropy of the state with a random arrangement of CO molecules is

S = k ln W = k ln 2^NA = kNA ln 2

Because k = R/NA,

S = R ln 2 = (8.314 J/K)(0.693) = 5.76 J/K



Remember...
The dipole moment (μ) is a measure of the net polarity of a molecule and is defined as μ = Q × r, where Q is the magnitude of the charge at either end of the molecular dipole and r is the distance between the charges. (Section 10.1)
Dipole–dipole forces result from electrical interactions among neighboring polar molecules. (Section 10.2)



Based on experimental measurements, the entropy of 1 mol of solid carbon monoxide near 0 K is about 5 J/K, indicating that the CO molecules adopt a nearly random arrangement. Entropy associated with a random arrangement of molecules in space is sometimes called positional, or configurational, entropy.

The nearly random arrangement of CO molecules in crystalline carbon monoxide is unusual but can be understood in terms of molecular structure. Because CO molecules have a dipole moment of only 0.11 D, intermolecular dipole–dipole forces are unusually weak (Sections 10.1 and 10.2), and the molecules therefore have little preference for a slightly lower energy, completely ordered arrangement. By contrast, HCl, with a larger dipole moment of 1.11 D, forms an ordered crystalline solid, and so the entropy of 1 mol of solid HCl at 0 K is 0 J/K.

Boltzmann's formula also explains why a gas expands into a vacuum. If the two bulbs in Figure 16.1 have equal volumes, each molecule has one chance in two of being in bulb A (heads, in our coin example) and one chance in two of being in bulb B (tails) when the stopcock is opened. It's exceedingly unlikely that all the molecules in 1 mol of gas will be in bulb A because that state can be achieved in only one way. The state in which Avogadro's number of molecules are randomly distributed between bulbs A and B can be achieved in 2^(6.02 × 10²³) ways, and the entropy of that state is therefore higher than the entropy of the ordered state by the now familiar amount, R ln 2 = 5.76 J/K. Thus, a gas expands spontaneously because the state of greater volume is more probable.

We can derive a general equation for the entropy change that occurs on the expansion of an ideal gas at constant temperature by considering the distribution of N molecules among B hypothetical boxes, or cells, each having an equal volume v:

Volume per box = v
Number of accessible boxes = B
Total volume, V = Bv



Remember...
According to the kinetic–molecular theory, the kinetic energy of 1 mol of an ideal gas equals 3RT/2 and is independent of pressure and volume. (Section 9.6)



Since the energy of an ideal gas depends only on the temperature (Section 9.6), ΔE for the expansion of an ideal gas at constant temperature is zero. To calculate the entropy change ΔS = Sfinal - Sinitial using the Boltzmann equation, we have only to find the number of ways N molecules can be distributed among the B boxes.

A single molecule can go into any one of the boxes and can thus be assigned to B boxes in B ways. Two molecules can occupy the boxes in B × B = B² ways, three molecules can fill the boxes in B × B × B = B³ ways, and so on. The number of ways that N molecules can occupy B boxes is W = B^N.

Now suppose that the initial volume comprises Binitial boxes and the final volume consists of Bfinal boxes:

Vinitial = Binitial·v    and    Vfinal = Bfinal·v

Then the probabilities of the initial and final states—that is, the number of ways they can be achieved—are

Winitial = (Binitial)^N    and    Wfinal = (Bfinal)^N






According to the Boltzmann equation, the entropy change due to a change in volume is

ΔS = Sfinal - Sinitial = k ln Wfinal - k ln Winitial = k ln(Wfinal/Winitial)
   = k ln(Bfinal/Binitial)^N = kN ln(Bfinal/Binitial)

Because Vinitial = Binitial·v and Vfinal = Bfinal·v,

ΔS = kN ln[(Vfinal/v)/(Vinitial/v)] = kN ln(Vfinal/Vinitial)



Finally, because k = R/NA and the number of particles equals the number of moles of gas times Avogadro's number (N = nNA), then

kN = (R/NA)(nNA) = nR

and so the entropy change for expansion (or compression) of n moles of an ideal gas at constant temperature is

ΔS = nR ln(Vfinal/Vinitial)

For a twofold expansion of 1 mol of an ideal gas at constant temperature, ΔS = R ln 2, the same result as we obtained previously.

Because the pressure and volume of an ideal gas are related inversely (P = nRT/V), we can also write

ΔS = nR ln(Pinitial/Pfinal)

Thus, the entropy of a gas increases when its pressure decreases at constant temperature, and the entropy decreases when its pressure increases. Common sense tells us that the more we squeeze the gas, the less space the gas molecules have, and so randomness decreases.
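Both forms of the equation are easy to evaluate. The following Python sketch is an illustration only (the function names are ours, and R is typed in by hand); it confirms that a twofold expansion of 1 mol gives ΔS = R ln 2 whether computed from volumes or from pressures.

```python
import math

R = 8.314  # gas constant, J/(K·mol)

def delta_S_volume(n, V_initial, V_final):
    """Delta S = nR ln(Vfinal/Vinitial) for an ideal gas at constant T, in J/K."""
    return n * R * math.log(V_final / V_initial)

def delta_S_pressure(n, P_initial, P_final):
    """Delta S = nR ln(Pinitial/Pfinal) for an ideal gas at constant T, in J/K."""
    return n * R * math.log(P_initial / P_final)

# Twofold expansion of 1 mol: the volume doubles and the pressure halves.
print(delta_S_volume(1, 11.2, 22.4))     # about +5.76 J/K (= R ln 2)
print(delta_S_pressure(1, 1.00, 0.50))   # about +5.76 J/K (same result)
```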

PROBLEM 16.4  Which state has the higher entropy? Explain in terms of probability.
(a) A perfectly ordered crystal of solid nitrous oxide (N≡N-O) or a disordered crystal in which the molecules are oriented randomly
(b) Quartz glass (Section 10.10) or a quartz crystal
(c) 1 mol of N2 gas at STP or 1 mol of N2 gas at 273 K in a volume of 11.2 L
(d) 1 mol of N2 gas at STP or 1 mol of N2 gas at 273 K and 0.25 atm



16.4 ENTROPY AND TEMPERATURE



Thus far we've seen that entropy is associated with the orientation and distribution of molecules in space. Disordered crystals have higher entropy than ordered crystals, and expanded gases have higher entropy than compressed gases.

Entropy is also associated with molecular motion. As the temperature of a substance increases, random molecular motion increases and there is a corresponding increase in the average kinetic energy of the molecules. But not all the molecules have the same energy. As we saw in Section 9.6, there is a distribution of molecular speeds in a gas, a distribution that broadens and shifts to higher speeds with increasing temperature (Figure 9.12, page 328). In solids, liquids, and gases, the total energy of a substance can be distributed among the individual molecules in a number of ways that increases as the total energy increases. According to Boltzmann's formula, the more ways that the energy can be distributed, the greater the randomness of the state and the higher its entropy. Therefore, the entropy of a substance increases with increasing temperature (Figure 16.7).






Figure 16.7  A substance at a higher temperature has greater entropy than the same substance at a lower temperature. Higher temperature: greater molecular motion, broader distribution of individual molecular energies, more randomness, higher entropy. Lower temperature: less molecular motion, narrower distribution of individual molecular energies, less randomness, lower entropy.



A typical plot of entropy versus temperature is shown in Figure 16.8. At absolute zero, every substance is a solid whose particles are tightly held in a crystalline structure. If there is no residual orientational disorder, like that in carbon monoxide (Figure 16.6b), the entropy of the substance at 0 K will be zero, a general result summarized in the third law of thermodynamics:

Third Law of Thermodynamics  The entropy of a perfectly ordered crystalline substance at 0 K is zero.

(The first law of thermodynamics was discussed in Section 8.1. We'll review the first law and discuss the second law in Section 16.6.)



Figure 16.8  Entropy versus temperature. The entropy of a pure substance, equal to zero at 0 K, shows a steady increase with rising temperature, punctuated by discontinuous jumps in entropy at the temperatures of the phase transitions. The plot shows entropy versus temperature (K), with solid, liquid, and gas regions separated by jumps at the melting point (mp) and the boiling point (bp).






As the temperature of a solid is raised, the added energy increases the vibrational motion of the molecules about their equilibrium positions in the crystal. The number of ways in which the vibrational energy can be distributed increases with rising temperature, and the entropy of the solid thus increases steadily as the temperature increases.

At the melting point, there is a discontinuous jump in entropy because there are many more ways of arranging the molecules in the liquid than in the solid. Furthermore, the molecules in the liquid can undergo translational and rotational as well as vibrational motion, and so there are many more ways of distributing the total energy in the liquid. (Translational motion is motion of the center of mass.) An even greater jump in entropy is observed at the boiling point because molecules in the gas are free to occupy a much larger volume. Between the melting point and the boiling point, the entropy of a liquid increases steadily as molecular motion increases and the number of ways of distributing the total energy among the individual molecules increases. For the same reason, the entropy of a gas rises steadily as its temperature increases.



16.5 STANDARD MOLAR ENTROPIES AND STANDARD ENTROPIES OF REACTION

We won't describe how the entropy of a substance is determined, except to note that two approaches are available: (1) calculations based on Boltzmann's formula and (2) experimental measurements of heat capacities (Section 8.7) down to very low temperatures. Suffice it to say that standard molar entropies, denoted by S°, are known for many substances.



Remember...
The molar heat capacity is the amount of heat needed to raise the temperature of 1 mol of a substance by 1 °C. (Section 8.7)



Standard Molar Entropy, S°  The entropy of 1 mol of the pure substance at 1 atm pressure and a specified temperature, usually 25 °C.

Values of S° for some common substances at 25 °C are listed in Table 16.1, and additional values are given in Appendix B. Note that the units of S° are joules (not kilojoules) per kelvin mole [J/(K·mol)]. Standard molar entropies are often called absolute entropies because they are measured with respect to an absolute reference point—the entropy of the perfectly ordered crystalline substance at 0 K [S° = 0 J/(K·mol) at T = 0 K].

Standard molar entropies make it possible to compare the entropies of different substances under the same conditions of temperature and pressure.



TABLE 16.1  Standard Molar Entropies for Some Common Substances at 25 °C

Substance              Formula      S° [J/(K·mol)]
Gases
  Acetylene            C2H2         200.8
  Ammonia              NH3          192.3
  Carbon dioxide       CO2          213.6
  Carbon monoxide      CO           197.6
  Ethylene             C2H4         219.5
  Hydrogen             H2           130.6
  Methane              CH4          186.2
  Nitrogen             N2           191.5
  Nitrogen dioxide     NO2          240.0
  Dinitrogen tetroxide N2O4         304.3
  Oxygen               O2           205.0
Liquids
  Acetic acid          CH3CO2H      160
  Ethanol              CH3CH2OH     161
  Methanol             CH3OH        127
  Water                H2O          69.9
Solids
  Calcium carbonate    CaCO3        91.7
  Calcium oxide        CaO          38.1
  Diamond              C            2.4
  Graphite             C            5.7
  Iron                 Fe           27.3
  Iron(III) oxide      Fe2O3        87.4






It's apparent from Table 16.1, for example, that the entropies of gaseous substances tend to be larger than those of liquids, which, in turn, tend to be larger than those of solids. Table 16.1 also shows that S° values increase with increasing molecular complexity. Compare, for example, CH3OH, which has S° = 127 J/(K·mol), and CH3CH2OH, which has S° = 161 J/(K·mol).

Once we have values for standard molar entropies, it's easy to calculate the entropy change for a chemical reaction. The standard entropy of reaction (ΔS°) can be obtained simply by subtracting the standard molar entropies of all the reactants from the standard molar entropies of all the products:

ΔS° = S°(products) - S°(reactants)

Because S° values are quoted on a per-mole basis, the S° value for each substance must be multiplied by the stoichiometric coefficient of that substance in the balanced chemical equation. Thus, for the general reaction

a A + b B → c C + d D

the standard entropy of reaction is

ΔS° = [c S°(C) + d S°(D)] - [a S°(A) + b S°(B)]

where the units of the coefficients are moles, the units of S° are J/(K·mol), and the units of ΔS° are J/K.

As an example, let's calculate the standard entropy change for the reaction

N2O4(g) → 2 NO2(g)

Using the appropriate S° values from Table 16.1, we find that ΔS° = 175.7 J/K:

ΔS° = 2 S°(NO2) - S°(N2O4)
    = (2 mol)(240.0 J/(K·mol)) - (1 mol)(304.3 J/(K·mol))
    = 175.7 J/K

Although the standard molar entropy of N2O4 is larger than that of NO2, as expected for a more complex molecule, ΔS° for the reaction is positive because 1 mol of N2O4 is converted to 2 mol of NO2. As noted earlier, we expect an increase in entropy whenever a molecule breaks into two or more pieces.
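This products-minus-reactants bookkeeping is easy to automate. The sketch below is a minimal illustration (the dictionary and function names are our own, and the S° values are simply copied from Table 16.1):

```python
# Standard molar entropies at 25 °C in J/(K·mol), taken from Table 16.1.
S_STANDARD = {"N2O4(g)": 304.3, "NO2(g)": 240.0,
              "N2(g)": 191.5, "H2(g)": 130.6, "NH3(g)": 192.3}

def delta_S_reaction(reactants, products):
    """Return the standard entropy of reaction in J/K.

    reactants and products map species to their molar coefficients.
    """
    def total(side):
        return sum(n * S_STANDARD[species] for species, n in side.items())
    return total(products) - total(reactants)

# N2O4(g) → 2 NO2(g)
print(round(delta_S_reaction({"N2O4(g)": 1}, {"NO2(g)": 2}), 1))   # 175.7 J/K
```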

WORKED EXAMPLE 16.3
CALCULATING THE STANDARD ENTROPY OF REACTION

Calculate the standard entropy of reaction at 25 °C for the Haber synthesis of ammonia:

N2(g) + 3 H2(g) → 2 NH3(g)

STRATEGY
To calculate ΔS° for the reaction, subtract the standard molar entropies of all the reactants from the standard molar entropies of all the products. Look up the S° values in Table 16.1 or Appendix B, and remember to multiply the S° value for each substance by its coefficient in the balanced chemical equation.

SOLUTION
ΔS° = 2 S°(NH3) - [S°(N2) + 3 S°(H2)]
    = (2 mol)(192.3 J/(K·mol)) - [(1 mol)(191.5 J/(K·mol)) + (3 mol)(130.6 J/(K·mol))]
    = -198.7 J/K

BALLPARK CHECK
As predicted in Worked Example 16.1(c), ΔS° should be negative because the reaction decreases the number of gaseous molecules from 4 mol to 2 mol.
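As a quick check of the arithmetic, a two-line snippet (illustrative only, using the Table 16.1 values) reproduces the result:

```python
# Haber synthesis: ΔS° = 2 S°(NH3) - [S°(N2) + 3 S°(H2)], values in J/(K·mol).
dS = 2 * 192.3 - (1 * 191.5 + 3 * 130.6)   # products minus reactants
print(round(dS, 1))                         # -198.7 J/K
```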



PROBLEM 16.5  Calculate the standard entropy of reaction at 25 °C for the decomposition of calcium carbonate:

CaCO3(s) → CaO(s) + CO2(g)



16.6 ENTROPY AND THE SECOND LAW OF THERMODYNAMICS

We've seen thus far that molecular systems tend to move spontaneously toward a state of minimum enthalpy and maximum entropy. In any particular reaction, though, the enthalpy of the system can either increase or decrease. Similarly, the entropy of the system can either increase or decrease. How, then, can we decide whether a reaction will occur spontaneously? In Section 8.13, we said that it is the value of the free-energy change, ΔG, that is the criterion for spontaneity, where ΔG = ΔH - TΔS. If ΔG < 0, the reaction is spontaneous; if ΔG > 0, the reaction is nonspontaneous; and if ΔG = 0, the reaction is at equilibrium. In this section and the next, we'll see how that conclusion was reached. Let's begin by looking at the first and second laws of thermodynamics:

First Law of Thermodynamics  In any process, spontaneous or nonspontaneous, the total energy of a system and its surroundings is constant.

Second Law of Thermodynamics  In any spontaneous process, the total entropy of a system and its surroundings always increases.



The first law is simply a statement of the conservation of energy (Section 8.1). It says that energy (or enthalpy) can flow between a system and its surroundings but the total energy of the system plus the surroundings always remains constant. In an exothermic reaction, the system loses enthalpy to the surroundings; in an endothermic reaction, the system gains enthalpy from the surroundings. Because energy is conserved in all chemical processes, spontaneous and nonspontaneous, the first law helps us keep track of energy flow between the system and the surroundings, but it doesn't tell us whether a particular reaction will be spontaneous or nonspontaneous.

The second law, however, provides a clear-cut criterion of spontaneity. It says that the direction of spontaneous change is always determined by the sign of the total entropy change:

ΔStotal = ΔSsystem + ΔSsurroundings

Specifically,

If ΔStotal > 0, the reaction is spontaneous.
If ΔStotal < 0, the reaction is nonspontaneous.
If ΔStotal = 0, the reaction mixture is at equilibrium.

All reactions proceed spontaneously in the direction that increases the entropy of the system plus surroundings. A reaction that is nonspontaneous in the forward direction is spontaneous in the reverse direction because ΔStotal for the reverse reaction equals -ΔStotal for the forward reaction. If ΔStotal is zero, the reaction doesn't go spontaneously in either direction, and so the reaction mixture is at equilibrium.

To determine the value of ΔStotal, we need values for the entropy changes in the system and the surroundings. The entropy change in the system, ΔSsys, is just the entropy of reaction, which can be calculated from standard molar entropies (Table 16.1), as described in Section 16.5. For a reaction that occurs at constant pressure, the entropy change in the surroundings is directly proportional to the enthalpy change


