Chemistry 2e

16.2 Entropy
By the end of this section, you will be able to:
  • Define entropy
  • Explain the relationship between entropy and the number of microstates
  • Predict the sign of the entropy change for chemical and physical processes

In 1824, at the age of 28, Nicolas Léonard Sadi Carnot (Figure 16.7) published the results of an extensive study regarding the efficiency of steam heat engines. A later review of Carnot’s findings by Rudolf Clausius introduced a new thermodynamic property that relates the spontaneous heat flow accompanying a process to the temperature at which the process takes place. This new property was expressed as the ratio of the reversible heat (q_rev) to the kelvin temperature (T). In thermodynamics, a reversible process is one that takes place at such a slow rate that it is always at equilibrium and its direction can be changed (it can be “reversed”) by an infinitesimally small change in some condition. Note that the idea of a reversible process is a formalism required to support the development of various thermodynamic concepts; no real process is truly reversible, so all real processes are classified as irreversible.

Figure 16.7 (a) Nicolas Léonard Sadi Carnot’s research into steam-powered machinery and (b) Rudolf Clausius’s later study of those findings led to groundbreaking discoveries about spontaneous heat flow processes.

Similar to other thermodynamic properties, this new quantity is a state function, so its change depends only upon the initial and final states of a system. In 1865, Clausius named this property entropy (S) and defined its change for any process as the following:

ΔS = q_rev/T

The entropy change for a real, irreversible process is then equal to that for the theoretical reversible process that involves the same initial and final states.

Entropy and Microstates

Following the work of Carnot and Clausius, Ludwig Boltzmann developed a molecular-scale statistical model that related the entropy of a system to the number of microstates (W) possible for the system. A microstate is a specific configuration of all the locations and energies of the atoms or molecules that make up a system. The relation between a system’s entropy and the number of possible microstates is

S = k ln W

where k is the Boltzmann constant, 1.38 × 10⁻²³ J/K.

As for other state functions, the change in entropy for a process is the difference between its final (S_f) and initial (S_i) values:

ΔS = S_f − S_i = k ln W_f − k ln W_i = k ln(W_f/W_i)

For processes involving an increase in the number of microstates, Wf > Wi, the entropy of the system increases and ΔS > 0. Conversely, processes that reduce the number of microstates, Wf < Wi, yield a decrease in system entropy, ΔS < 0. This molecular-scale interpretation of entropy provides a link to the probability that a process will occur as illustrated in the next paragraphs.

Consider the general case of a system comprised of N particles distributed among n boxes. The number of microstates possible for such a system is n^N. For example, distributing four particles among two boxes will result in 2⁴ = 16 different microstates as illustrated in Figure 16.8. Microstates with equivalent particle arrangements (not considering individual particle identities) are grouped together and are called distributions. The probability that a system will exist with its components in a given distribution is proportional to the number of microstates within the distribution. Since entropy increases logarithmically with the number of microstates, the most probable distribution is therefore the one of greatest entropy.

[Figure: sixteen two-sided boxes showing every arrangement of four colored particles (red, green, blue, yellow) between a left and right side, grouped into rows (a) through (e) by the number of particles on each side.]
Figure 16.8 The sixteen microstates associated with placing four particles in two boxes are shown. The microstates are collected into five distributions—(a), (b), (c), (d), and (e)—based on the numbers of particles in each box.

For this system, the most probable configuration is one of the six microstates associated with distribution (c) where the particles are evenly distributed between the boxes, that is, a configuration of two particles in each box. The probability of finding the system in this configuration is 6/16, or 3/8. The least probable configuration of the system is one in which all four particles are in one box, corresponding to distributions (a) and (e), each with a probability of 1/16. The probability of finding all particles in only one box (either the left box or the right box) is then 1/16 + 1/16 = 2/16, or 1/8.
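The counting argument above can be checked by brute force. The short Python sketch below (variable names are illustrative, not from the text) enumerates all 2⁴ = 16 microstates of four distinguishable particles in two boxes and tallies the five distributions:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# Each of four distinguishable particles occupies the left (L) or
# right (R) box, giving 2**4 = 16 microstates.
microstates = list(product("LR", repeat=4))
print(len(microstates))   # 16

# Group microstates into distributions by the number of particles in
# the left box: 4, 3, 2, 1, 0 correspond to rows (a) through (e).
counts = Counter(state.count("L") for state in microstates)
# counts == {0: 1, 1: 4, 2: 6, 3: 4, 4: 1}

p_even = Fraction(counts[2], len(microstates))                 # 6/16 = 3/8
p_one_box = Fraction(counts[0] + counts[4], len(microstates))  # 2/16 = 1/8
print(p_even, p_one_box)
```

The distribution multiplicities (1, 4, 6, 4, 1) are simply the binomial coefficients for choosing which particles sit in the left box.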

As you add more particles to the system, the number of possible microstates increases exponentially (2^N for this two-box example). A macroscopic (laboratory-sized) system would typically consist of moles of particles (N ≈ 10²³), and the corresponding number of microstates would be staggeringly huge. Regardless of the number of particles in the system, however, the distributions in which roughly equal numbers of particles are found in each box are always the most probable configurations.

This matter dispersal model of entropy is often described qualitatively in terms of the disorder of the system. By this description, microstates in which all the particles are in a single box are the most ordered, thus possessing the least entropy. Microstates in which the particles are more evenly distributed among the boxes are more disordered, possessing greater entropy.

The previous description of an ideal gas expanding into a vacuum (Figure 16.4) is a macroscopic example of this particle-in-a-box model. For this system, the most probable distribution is confirmed to be the one in which the matter is most uniformly dispersed or distributed between the two flasks. Initially, the gas molecules are confined to just one of the two flasks. Opening the valve between the flasks increases the volume available to the gas molecules and, correspondingly, the number of microstates possible for the system. Since Wf > Wi, the expansion process involves an increase in entropy (ΔS > 0) and is spontaneous.

A similar approach may be used to describe the spontaneous flow of heat. Consider a system consisting of two objects, each containing two particles, and two units of thermal energy (represented as “*”) in Figure 16.9. The hot object is comprised of particles A and B and initially contains both energy units. The cold object is comprised of particles C and D and initially contains no energy units. Distribution (a) shows the three microstates possible for the initial state of the system, with both units of energy contained within the hot object. If one of the two energy units is transferred, the result is distribution (b), consisting of four microstates. If both energy units are transferred, the result is distribution (c), consisting of three microstates. Thus, we may describe this system by a total of ten microstates. The probability that the heat does not flow when the two objects are brought into contact, that is, that the system remains in distribution (a), is 3/10. More likely is the flow of heat to yield one of the other two distributions, the combined probability being 7/10. The most likely result is the flow of heat to yield the uniform dispersal of energy represented by distribution (b), the probability of this configuration being 4/10. This supports the common observation that placing hot and cold objects in contact results in spontaneous heat flow that ultimately equalizes the objects’ temperatures. And, again, this spontaneous process is also characterized by an increase in system entropy.

[Figure: ten two-sided boxes, the left side labeled A and B and the right side labeled C and D, arranged in rows (a), (b), and (c) according to how the two energy units (dots) are divided between the two sides.]
Figure 16.9 This shows a microstate model describing the flow of heat from a hot object to a cold object. (a) Before the heat flow occurs, the object comprised of particles A and B contains both units of energy, as represented by a distribution of three microstates. (b) If the heat flow results in an even dispersal of energy (one energy unit transferred), a distribution of four microstates results. (c) If both energy units are transferred, the resulting distribution has three microstates.
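The ten-microstate count and the probabilities above can likewise be verified by enumeration. The sketch below treats the two energy units as indistinguishable, matching Figure 16.9 (names are illustrative):

```python
from collections import Counter
from fractions import Fraction
from itertools import combinations_with_replacement

# Two indistinguishable energy units shared among four particles:
# A and B form the hot object, C and D the cold object.
microstates = list(combinations_with_replacement("ABCD", 2))
print(len(microstates))   # 10 microstates in total

# Classify each microstate by how many energy units sit on the hot object.
counts = Counter(sum(1 for p in state if p in "AB") for state in microstates)
# counts[2] = 3 (distribution a), counts[1] = 4 (b), counts[0] = 3 (c)

p_no_flow = Fraction(counts[2], len(microstates))   # stays in (a): 3/10
p_flow = 1 - p_no_flow                              # heat flows: 7/10
p_even = Fraction(counts[1], len(microstates))      # even dispersal: 4/10 (= 2/5)
print(p_no_flow, p_flow, p_even)
```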

Example 16.2

Determination of ΔS Calculate the change in entropy for the process depicted below.

[Figure: a single two-sided box with all four colored particles on the left side, followed by an arrow pointing to the six two-sided boxes in which the particles are split two and two between the sides.]

Solution

The initial number of microstates is one, the final six:

ΔS = k ln(W_c/W_a) = 1.38 × 10⁻²³ J/K × ln(6/1) = 2.47 × 10⁻²³ J/K

The sign of this result is consistent with expectation; since there are more microstates possible for the final state than for the initial state, the change in entropy should be positive.

Check Your Learning Consider the system shown in Figure 16.9. What is the change in entropy for the process where all the energy is transferred from the hot object (AB) to the cold object (CD)?

Answer:

0 J/K
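Both results, the one worked in Example 16.2 and the Check Your Learning answer, follow directly from ΔS = k ln(W_f/W_i); a minimal Python sketch:

```python
import math

k_B = 1.38e-23  # Boltzmann constant, J/K

def delta_S(W_initial, W_final):
    """Entropy change, Delta S = k ln(W_final / W_initial), in J/K."""
    return k_B * math.log(W_final / W_initial)

# Example 16.2: one initial microstate expands to six final microstates.
print(delta_S(1, 6))   # ≈ 2.47e-23 J/K

# Check Your Learning: distributions (a) and (c) of Figure 16.9 each
# contain three microstates, so transferring all the energy gives
print(delta_S(3, 3))   # 0.0 J/K
```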

Predicting the Sign of ΔS

The relationships between entropy, microstates, and matter/energy dispersal described previously allow us to make generalizations regarding the relative entropies of substances and to predict the sign of entropy changes for chemical and physical processes. Consider the phase changes illustrated in Figure 16.10. In the solid phase, the atoms or molecules are restricted to nearly fixed positions with respect to each other and are capable of only modest oscillations about these positions. With essentially fixed locations for the system’s component particles, the number of microstates is relatively small. In the liquid phase, the atoms or molecules are free to move over and around each other, though they remain in relatively close proximity to one another. This increased freedom of motion results in a greater variation in possible particle locations, so the number of microstates is correspondingly greater than for the solid. As a result, Sliquid > Ssolid and the process of converting a substance from solid to liquid (melting) is characterized by an increase in entropy, ΔS > 0. By the same logic, the reciprocal process (freezing) exhibits a decrease in entropy, ΔS < 0.

[Figure: three stoppered flasks, labeled “Crystalline solid,” “Liquid,” and “Gas,” with arrows between them marked ΔS > 0 (left to right) and ΔS < 0 (right to left), beneath a long arrow labeled “Increasing entropy.”]
Figure 16.10 The entropy of a substance increases (ΔS > 0) as it transforms from a relatively ordered solid, to a less-ordered liquid, and then to a still less-ordered gas. The entropy decreases (ΔS < 0) as the substance transforms from a gas to a liquid and then to a solid.

Now consider the gaseous phase, in which a given number of atoms or molecules occupy a much greater volume than in the liquid phase. Each atom or molecule can be found in many more locations, corresponding to a much greater number of microstates. Consequently, for any substance, Sgas > Sliquid > Ssolid, and the processes of vaporization and sublimation likewise involve increases in entropy, ΔS > 0. Likewise, the reciprocal phase transitions, condensation and deposition, involve decreases in entropy, ΔS < 0.

According to kinetic-molecular theory, the temperature of a substance is proportional to the average kinetic energy of its particles. Raising the temperature of a substance will result in more extensive vibrations of the particles in solids and more rapid translations of the particles in liquids and gases. At higher temperatures, the distribution of kinetic energies among the atoms or molecules of the substance is also broader (more dispersed) than at lower temperatures. Thus, the entropy for any substance increases with temperature (Figure 16.11).

[Figure: left, a plot of the fraction of molecules versus molecular speed at 100 K, 200 K, 500 K, and 1000 K, the curves peaking at higher speeds and broadening as temperature increases; right, a plot of entropy versus temperature rising gradually within the solid, liquid, and gas regions, with sharp vertical jumps labeled “Melting” and “Boiling.”]
Figure 16.11 Entropy increases as the temperature of a substance is raised, which corresponds to the greater spread of kinetic energies. When a substance undergoes a phase transition, its entropy changes significantly.
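The shift and broadening of the speed distribution with temperature can be illustrated with the most probable speed, v_p = √(2kT/m), from kinetic-molecular theory. The sketch below assumes N₂ molecules, since the figure does not specify which gas is plotted:

```python
import math

k_B = 1.38e-23       # Boltzmann constant, J/K
m = 28.0 * 1.66e-27  # mass of one N2 molecule, kg (assumed gas)

def most_probable_speed(T):
    """Peak of the Maxwell-Boltzmann speed distribution: sqrt(2kT/m), m/s."""
    return math.sqrt(2 * k_B * T / m)

for T in (100, 200, 500, 1000):
    print(f"{T} K: {most_probable_speed(T):.0f} m/s")
# The peak shifts from roughly 244 m/s at 100 K to roughly 771 m/s at
# 1000 K, consistent with the progressively broader curves in the figure.
```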

The entropy of a substance is influenced by the structure of the particles (atoms or molecules) that comprise the substance. With regard to atomic substances, heavier atoms possess greater entropy at a given temperature than lighter atoms, which is a consequence of the relation between a particle’s mass and the spacing of quantized translational energy levels (a topic beyond the scope of this text). For molecules, greater numbers of atoms increase the number of ways in which the molecules can vibrate and thus the number of possible microstates and the entropy of the system.

Finally, variations in the types of particles affect the entropy of a system. Compared to a pure substance, in which all particles are identical, the entropy of a mixture of two or more different particle types is greater. This is because of the additional orientations and interactions that are possible in a system comprised of nonidentical components. For example, when a solid dissolves in a liquid, the particles of the solid experience both a greater freedom of motion and additional interactions with the solvent particles. This corresponds to a more uniform dispersal of matter and energy and a greater number of microstates. The process of dissolution therefore involves an increase in entropy, ΔS > 0.

Considering the various factors that affect entropy allows us to make informed predictions of the sign of ΔS for various chemical and physical processes as illustrated in Example 16.3.

Example 16.3

Predicting the Sign of ∆S Predict the sign of the entropy change for the following processes. Indicate the reason for each of your predictions.

(a) One mole liquid water at room temperature ⟶ one mole liquid water at 50 °C

(b) Ag⁺(aq) + Cl⁻(aq) ⟶ AgCl(s)

(c) C₆H₆(l) + 15/2 O₂(g) ⟶ 6 CO₂(g) + 3 H₂O(l)

(d) NH₃(s) ⟶ NH₃(l)

Solution (a) positive, temperature increases

(b) negative, reduction in the number of ions (particles) in solution, decreased dispersal of matter

(c) negative, net decrease in the amount of gaseous species

(d) positive, phase transition from solid to liquid, net increase in dispersal of matter

Check Your Learning Predict the sign of the entropy change for the following processes. Give a reason for your prediction.

(a) NaNO₃(s) ⟶ Na⁺(aq) + NO₃⁻(aq)

(b) the freezing of liquid water

(c) CO₂(s) ⟶ CO₂(g)

(d) CaCO₃(s) ⟶ CaO(s) + CO₂(g)

Answer:

(a) Positive; The solid dissolves to give an increase of mobile ions in solution. (b) Negative; The liquid becomes a more ordered solid. (c) Positive; The relatively ordered solid becomes a gas. (d) Positive; There is a net increase in the amount of gaseous species.

Citation/Attribution

Want to cite, share, or modify this book? This book is licensed under the Creative Commons Attribution License 4.0, and you must attribute OpenStax.

Attribution information
  • If you are redistributing all or part of this book in a print format, then you must include on every physical page the following attribution:
    Access for free at https://openstax.org/books/chemistry-2e/pages/1-introduction
  • If you are redistributing all or part of this book in a digital format, then you must include on every digital page view the following attribution:
    Access for free at https://openstax.org/books/chemistry-2e/pages/1-introduction
Citation information

© Feb 14, 2019 OpenStax. Textbook content produced by OpenStax is licensed under a Creative Commons Attribution License 4.0 license. The OpenStax name, OpenStax logo, OpenStax book covers, OpenStax CNX name, and OpenStax CNX logo are not subject to the Creative Commons license and may not be reproduced without the prior and express written consent of Rice University.