
Lecture 04. Entropy.


Formal Metadata

Title
Lecture 04. Entropy.
Part Number
4
Number of Parts
27
License
CC Attribution - ShareAlike 3.0 Unported:
You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal and non-commercial purpose as long as the work is attributed to the author in the manner specified by the author or licensor and the work or content is shared also in adapted form only under the conditions of this license.

Content Metadata

Abstract
UCI Chem 131C Thermodynamics and Chemical Dynamics (Spring 2012) Lec 04. Thermodynamics and Chemical Dynamics -- Entropy -- Instructor: Reginald Penner, Ph.D. Description: In Chemistry 131C, students will study how to calculate macroscopic chemical properties of systems. This course will build on the microscopic understanding (Chemical Physics) to reinforce and expand your understanding of the basic thermo-chemistry concepts from General Chemistry (Physical Chemistry.) We then go on to study how chemical reaction rates are measured and calculated from molecular properties. Topics covered include: Energy, entropy, and the thermodynamic potentials; Chemical equilibrium; and Chemical kinetics. Index of Topics: 0:04:17 Boltzmann Distribution Law 0:15:37 Three Types of Ensembles 0:31:55 S = k ln W
Transcript: English(auto-generated)
Okay, everybody, I hope you had a good Easter weekend. So you did well on quiz one. I don't know if you had a chance to look at the grade book, but out of 113 of you who took the quiz, 98 got some flavor of A. The mean was 4.5 out of 5, so that's
an A. And so, that's fantastic. Keep it up. All right, quiz two is going to look a lot like quiz one in terms of the format. There will be five questions, and I'll have more to say specifically about what's on it
on Wednesday. There's a key to quiz one that's been posted on the results page of our website. If you want to take a look at that, you can, if you can decipher G. and Mark's handwriting and read that key.
Here's the histogram, thing of beauty. Okay, so what we want to do today is review what we've already learned in statistical mechanics briefly, okay, to make sure that we're all on the same page. The first thing that we learned right at the beginning of last week was isolated
systems of N molecules with discrete states, be they electronic, vibrational, rotational. We didn't say anything about the origin of these states, but of course the molecules, they can be any or all of these things. These molecules assume microstates that can be grouped into configurations having
a defined number of microstates. We called that W. All right, so we talked about this thing called microstates. We talked about families of microstates, called configurations. And one thing we didn't say is that it's a fundamental postulate of statistical mechanics that over a long period of time, each microstate of a system is equally probable.
I don't think we said that until now, but that's a rather profound statement. All right, and one thing that we did say last week is that this special configuration that is most probable, the reason it's most probable is because it has by far the most
microstates, and the reason that makes sense is because each microstate in this system is equally probable as a function of time. As N increases, as the number of molecules increases, a predominant configuration, PC, emerges that predominates over all of the other configurations.
The predominant configuration is the largest W of all possible configurations. We talked about three molecules containing three quanta of energy, and we found immediately a predominant configuration. We talked about four coin flips, and we found a predominant configuration.
Even though the system is really tiny, we can identify a predominant configuration in these systems. And we pointed out that as the number of molecules increases, it becomes more and more macroscopic, all right?
The degree to which this predominant configuration predominates becomes greater and greater and greater, to the point where imagine how sharp this distribution will be when this number is 10 to the 23 instead of 1000.
There's going to be an extremely small area, if you will, of configuration space that we're going to have to consider in terms of understanding the behavior of that system. And we said the Boltzmann distribution law describes the properties of this predominant
configuration. It doesn't describe the properties of all configurations, all right? It only describes the properties of the predominant configuration. In particular, it specifies how the energy levels within the predominant configuration are occupied as a function of temperature. Here's the Boltzmann distribution law in its simplest form.
Okay. And we also said that a particularly useful form of the Boltzmann distribution law involves the definition of something called the partition function Q. And here's that version of the Boltzmann distribution law, two different versions of it, all right? In this case, we're talking about the summation over i states.
Each energy level can have multiple states. We call that the degeneracy. Or we can take the summation over energy levels instead of states, right? And in that case, we have to multiply by the degeneracy explicitly, right? So these two equations are identical.
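As a quick numerical sketch (my own, not from the lecture), the equivalence of the state sum and the level sum can be checked for the NO example that comes up below, with levels at 0 and 121.1 wave numbers, each doubly degenerate:

```python
import math

# NO electronic levels (from the lecture): 0 and 121.1 cm^-1, each doubly degenerate
k_B = 1.380649e-23                     # Boltzmann constant, J/K
hc = 6.62607015e-34 * 2.99792458e10    # J cm, converts wavenumbers to joules
T = 300.0                              # K
beta = 1.0 / (k_B * T)

levels = [(2, 0.0), (2, 121.1)]        # (degeneracy, energy in cm^-1)

# Sum over energy levels, multiplying by the degeneracy explicitly
q_levels = sum(g * math.exp(-beta * e * hc) for g, e in levels)

# Sum over individual states: list each degenerate state separately
states = [e for g, e in levels for _ in range(g)]
q_states = sum(math.exp(-beta * e * hc) for e in states)

print(q_levels, q_states)              # identical, about 3.119 at 300 K
```

Either bookkeeping gives the same number; the level form just collects the degenerate states into one term.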
It just depends on whether we're talking about individual states or energy levels. No difference between the two equations. It's more convenient to use this one if we know about the degeneracy. Finally, we said that to first order, the partition function is the number of states
that are accessible to the system at a particular temperature. That's how we define the partition function. But that actually trivializes what it is. Because what it really is, is the analog in statistical mechanics of the wave function. Right? The wave function, as you know,
contains all of the quantum mechanical information of the system. You can use operators to extract information on the momentum, the position, and so on of our quantum mechanical system from the wave function. The wave function contains all of the quantum mechanical information of the system.
And the partition function does that for statistical mechanics. It contains all of the thermodynamic information on the system. Right? The partition function is a profound concept. Just like the wave function is in quantum mechanics, it has the same lofty status in statistical mechanics.
So, can that be true? All thermodynamic information embedded in the partition function? The partition function isn't that big a deal. It's a summation. Well, we showed on Friday that, for example, you can calculate the average internal energy
of the system using the partition function. We derived a nice expression for doing that. Here's the average internal energy of the system. It's just the total energy divided by the number of molecules. And you can do a little algebra. We can make a substitution for N from the Boltzmann distribution law.
And when we do that, we get this expression here. And then if we recognize that this guy is just a derivative of the partition function with respect to beta, we get these really simple expressions for the total energy, where we're multiplying by big N, and the average internal energy, where we're not doing that.
Right? And this is just a partial derivative of the partition function with respect to beta. And this is just one over the partition function. The partition function has this information in it. Right? If we know how to ask the partition function for the information. Right? If we know how to ask the question.
This is, here we're asking the question, what is the internal energy of a single molecule? Okay? So, we've learned about one case where we can extract this information from the partition function. We're going to be learning about other cases where this is also true.
So, we derived this equation on Friday. Now, let me just point out to you that this might be the least intuitive equation in all of chemistry. Right? By no means should you look at this equation and go, oh yeah. That's obviously the average internal energy of a molecule.
Alright? One over the partition function. Shouldn't the partition function get larger if the energy gets larger? But it's one over, oh, and there's a minus sign in front of it. Right? And it's the partial derivative here of the partition function with respect to beta, which is one over kT. No.
I have zero intuition regarding this equation. If you have more than zero, you have more than me. Alright? But we can derive this equation. We have to trust the mathematics that we've done. It wasn't a complex derivation. We just did it again in like three slides. Right? Even though this equation has no intuitive basis to it.
Right? It does in fact tell us what the average internal energy of a molecule is. Right? We can trust it. We derive it ourselves for goodness sakes. But don't imagine that you should have some magical intuition about this equation. Alright? I don't.
I don't expect you to. It's not an intuitive equation to me at all. Now, we didn't finish this example on Friday. We did A. We evaluated the term populations. Right? But we didn't calculate the electronic contribution to the molar internal energy at 300 degrees Kelvin because we ran out of time.
So let's do that because it takes us to this concept of energy. So we said NO has electronic states at zero and at 121.1 wave numbers and they're both doubly degenerate.
And so in principle, this is the equation that we should be able to use to calculate the electronic contribution to the molar internal energy. And one thing we did on Friday is we calculated this curve. This is the partition function as a function of temperature and we derived an equation for that that I'll show you in just a moment.
And we plotted that and here's what the partition function does and you'll recall that NO has four states and so the partition function has an asymptote at four. Can't be larger than that and it can't be smaller than two because we're always going to have two states that are thermally accessible.
And so this thing makes intuitive sense because it starts at two and it has an asymptote. It looks like it's not 100 percent clear but here's four and this is asymptotically going to approach four at very high temperatures. So qualitatively this partition function is doing what we expect it to do and we're
at 300 degrees Kelvin, so if I look at 300 on this plot and I just go to the curve that we've derived, the partition function at 300 is 3.119. I'm going to need that number in just a second. Okay, so if I want to evaluate now dq d beta, that term right there, I can do that.
I'm going to take d d beta of q. Here's the partition function that we derived on Friday. It's got two terms, one for each state, one for each energy level I should say. There's two, that's the degeneracy of the ground state, there's the energy of the ground state,
there's the degeneracy of the excited state, there's the energy of the excited state. So this is our partition function and if we're going to take the derivative, that zero is going to move out front here so this term is just going to go away for us and we're only going to consider this term right here.
So 121.1 is going to move out front and that minus sign is going to move out front as well and so this derivative, d d beta is just equal to minus 2 times that energy times e to the minus beta times 121.1 wave numbers. Now, we want to know what that is in joules and so we have to do the usual unit conversions
and so this is my clunky way of doing that which I find always works for me. There's Boltzmann's constant, there's the temperature that we're talking about and so if we plug these numbers in, this is what we get for dq d beta.
Here's n, here's q which we just pulled off that plot and there's a minus sign here that's going to cancel with that minus sign right there and so we're going to end up with 519 joules per mole for the total energy of 1 mole.
That's 6.022 times 10 to the 23rd because we're calculating per mole here. Now, were you told n was 1 mole?
Now, does this make sense? Always ask that question. Does this number make sense? Well, in this particular case we can figure this out in a rather detailed way because on Friday we worked out what the term populations are.
We said 36 percent of the molecules have an excited state or are in the excited state at 121.1 wave numbers. The rest of them are not. They're in the ground state. Well, these ground state molecules don't contribute to the internal energy because the ground state's got zero energy. Only the excited state contributes to the total energy of the molecule.
Okay, so if I take this .36 and I multiply by a mole of molecules and I multiply that by the energy of the excited state and convert it to joules, I should get the same answer that I just calculated for the internal energy and I get darn close. The difference between this and 519 is just rounding here.
So, yes, 519 joules per mole is probably correct because I can do this backdoor calculation of the same quantity. I can calculate the total amount of energy in this 1 mole of molecules based on how big the excited state is, what its energy is,
and what fraction of the molecules are in that excited state. And if I had carried a few more sig figs here, this would be exactly 519 instead of 521. Everybody see what I did?
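As a sanity check (my own sketch, not the lecture's slide), the whole NO calculation fits in a few lines: the partition function at 300 K, the molar internal energy from the minus one over q times dq d beta formula, and the back-door estimate from the excited-state population:

```python
import math

k_B = 1.380649e-23                     # Boltzmann constant, J/K
N_A = 6.02214076e23                    # Avogadro's number, 1/mol
hc = 6.62607015e-34 * 2.99792458e10    # J cm, converts wavenumbers to joules
T = 300.0
beta = 1.0 / (k_B * T)
eps = 121.1 * hc                       # excited-level energy, J

# Partition function: two doubly degenerate levels at 0 and 121.1 cm^-1
q = 2.0 + 2.0 * math.exp(-beta * eps)

# dq/dbeta, taken analytically (the ground-state term drops out)
dq_dbeta = -2.0 * eps * math.exp(-beta * eps)

# Molar internal energy: U = -(N_A / q) dq/dbeta
U_molar = -N_A * dq_dbeta / q
print(round(q, 3), round(U_molar))     # about 3.119 and about 520 J/mol

# Back-door check: excited-state fraction times the molar excited-state energy
frac = 2.0 * math.exp(-beta * eps) / q
print(round(frac, 2), round(frac * N_A * eps))   # about 0.36, same energy
```

The two routes agree to the last digit because they are algebraically the same expression, which is exactly the point of the back-door check.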
Okay, so we can use the partition function. We ask the partition function in the right way. It will tell us what the average internal energy is. We can calculate that for a mole of molecules or for a single molecule. No problem. Okay, now there's a confusing subject that's discussed on page 429 of your book.
Not that all of this isn't confusing. I mean, quite honestly it is. But this is more confusing than most of this, right? And it concerns the classification of molecular ensembles
as micro canonical, canonical, or grand canonical. And then what's constant when we consider such different canonical types of ensembles and what's the partition function and why does it have these different... If you're not confused by this, you're just not paying attention. This is very confusing.
So, let's talk about this in a little more detail. What we've been discussing so far in the last week is this thing here. Micro canonical ensembles. A micro canonical ensemble, which is what we've been talking about, applies to individual molecules.
One member of an ensemble of molecules. We've been talking about one molecule. The partition function asks how many states in each molecule are thermally accessible at a particular temperature. So, the micro canonical ensemble is the ensemble of states that could exist for this molecule
given that it has a certain number of quanta of energy. In some ways, it should be obvious to us that we've been talking about one molecule because, look, there's two states in this molecule that are always occupied,
four that could possibly be occupied. So, this Q, we've really been just talking about one molecule. If we've been talking about two, then this would be four, five, seven, and eight, not two, three, and four, right? There'd be a larger number of thermally accessible states
because we'd be talking about multiple molecules, all right? But that's not what we've been doing. We've been just talking about one molecule this whole time. The Boltzmann distribution law has two forms, blah, blah, blah, blah, but notice that in neither one of these two partition functions is the number of molecules mentioned.
We've just been talking about one. They're independent of N. They don't contain N. Okay, so in essence, this microcanonical partition function has just been asking the question,
what's the probability that N equals zero is occupied? What's the probability that N equals one is occupied? N equals two. Add up those probabilities, boom, you get the microcanonical partition function that we've been talking about so far. Okay, so now, let's define the partition function for an ensemble of molecules.
More than one. We're going to call it big Q instead of little q so that we never confuse these two things in spite of their intrinsic ability to confuse us.
Notice that this is the energy of a single molecule. This is the energy of all the molecules, big E, not little e, right? These summations run over the quantum states of the assembly of molecules, these summations right here. All right, so the question is how can we express big Q in terms of little q, right?
How can we express the canonical ensemble, the partition function for the canonical ensemble in terms of the partition function for the microcanonical ensemble? All right, let's figure this out because we want this to make sense
even though it's intrinsically confusing. Here, let's consider just two molecules, A and B. Let's write the big Q for these two molecules now, right? It's this summation that contains the energy for both molecules now and so we can write that in terms of molecule A and molecule B.
E sub I is just E A plus E B. That's all the energy in A and all the energy in B. Oh yeah, okay? So we've got two molecules in our system now. What does the summation refer to? Well, it refers to, all right, and there should be, okay, I left the parentheses out here.
So beta should be multiplying this whole thing here. Parentheses here, here. Also parentheses here before I left them out. Okay, so we want to consider all the possible states. Molecule A can be in its ground state, so can molecule B. Molecule A can be in its ground state
when molecule B is in its first excited state or second excited state. Likewise, molecule A can be in its first excited state when molecule B is in its ground state. And so on we have to consider all of these different combinations. That's what these summations are. Now it turns out that if I look at this, all right,
at the end of the day all I'm doing is I'm multiplying these two microcanonical partition functions together. All right? Here's the microcanonical partition function for A and the microcanonical partition function for B. If I just multiply them together, I've got little qA times little qB, that equals big QAB.
Okay, so for this two-molecule system, I've got two partition functions. If I've got N molecules, I'm going to take my microcanonical partition function to the Nth power. Here I'm taking Q to the second power, right?
If these two molecules are identical, it's just little q squared. All right, in general if I've got N molecules, I'm going to have an N in the exponent here, right? This is the appropriate expression when we've got N distinguishable molecules.
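This factorization can be demonstrated with a brute-force sketch (hypothetical energy levels, my own example): enumerate every joint state of a pair of molecules and compare against the product of the two single-molecule partition functions:

```python
import math
from itertools import product

beta = 1.0
e_A = [0.0, 1.0, 2.0]    # hypothetical energy levels of molecule A (arbitrary units)
e_B = [0.0, 1.0, 2.0]    # same hypothetical levels for molecule B

# Big Q: sum over every joint state (i, j) of the pair
Q_AB = sum(math.exp(-beta * (ea + eb)) for ea, eb in product(e_A, e_B))

# Little q for each molecule separately
q_A = sum(math.exp(-beta * e) for e in e_A)
q_B = sum(math.exp(-beta * e) for e in e_B)

print(math.isclose(Q_AB, q_A * q_B))   # True: big Q_AB equals q_A times q_B
```

The double sum over joint states factors exactly, which is why two identical distinguishable molecules give big Q equal to little q squared.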
N distinguishable molecules. What's a distinguishable molecule? Well, hypothetically if this molecule was located in a lattice and we could keep track of its position, then this would apply: positions on a crystalline lattice. This is rarely the case, as it turns out.
We're still rarely able to label molecules and keep track of individual molecules in the system. Understand that. So, what if they're not distinguishable? That's the more general case. What if we can't keep track of which molecule is which?
What happens? Well, for two distinguishable units, we can tell the difference. Check out these two states right here. All right, molecule A is excited and molecule B is not. Molecule A is not excited like molecule B is.
All right, these are two different states for the system. One molecule A is excited, molecule B is not. One molecule B is excited, molecule A is not. We can tell the difference between this and this because we can keep track of these two different atoms. All right, we can see this guy's excited, this guy's not, and this guy's excited,
and this guy's not. So these are two different states that we can identify. But if these are gas atoms and they're zooming around in this room, we can't keep track of them anymore. If our sample is a gas, the molecules will be zooming around. We won't be able to keep track of them. And in that case, these two systems
will be indistinguishable from one another. In fact, we can't tell the difference. That's just one state. We can say one molecule's excited, the other one's not, but we can't say which one's which. So if the states are indistinguishable because they're zooming around in the gas phase, for example, we'll have a smaller number of them
because here we have two states, but if we can't tell the difference, it's just one. Okay, so in our example above, big Q is going to equal one half of little q squared. There will be half as many states
if the molecules are indistinguishable as if they were distinguishable, if there's two molecules. There will be half as many states if there are two molecules. Let's consider another case. What if there are three molecules, A, B, and C?
Okay, let's say one has one quantum of energy, one has two, and one has three. Okay, one has one quantum of energy, one has two, and one has three, and these are all the different ways that we can configure that configuration. These are all the different microstates for that configuration, if you will. Six of them, all right?
This assumes that A, B, and C are locked into a lattice so that we can tell the difference between them. We can keep track of which molecule contains how many quanta of energy, but if they're zooming around, this is just one state. Okay, so if A, B, and C are distinguishable,
they're locked in the lattice, then there are six states here. If they are indistinguishable, there's just one. Three molecules, one over three factorial times q to the n. All right, so we multiply by one over n factorial to figure
out how many states we lose when the molecules become indistinguishable. One over n factorial, okay? So, that explains this nonsense.
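The one over N factorial counting argument can be sketched directly for the three-molecule example (my own illustration): distribute quanta of one, two, and three among molecules A, B, and C and count states both ways:

```python
from itertools import permutations
from math import factorial

quanta = (1, 2, 3)   # one molecule holds 1 quantum, one holds 2, one holds 3

# Distinguishable molecules A, B, C locked in a lattice: every assignment counts
distinguishable = set(permutations(quanta))
print(len(distinguishable))            # 6 microstates

# Indistinguishable gas molecules: only the multiset of energies matters
indistinguishable = {tuple(sorted(p)) for p in distinguishable}
print(len(indistinguishable))          # 1 microstate

# Dividing by N! = 3! recovers the indistinguishable count
print(len(distinguishable) // factorial(3))   # 1
```

Six lattice arrangements collapse to a single gas-phase state, which is exactly the factor of N factorial.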
Okay, for distinguishable particles, we've got big Q just equal to the microcanonical partition function raised to the Nth power, right? And if the particles are indistinguishable, we're going to reduce that by one over N factorial. All right, now this we're just going to forget about.
We don't need it, and this is already confusing enough. Okay, so we're going to talk about microcanonical ensembles and canonical ensembles. We don't need a grand canonical. All right, you with me? So, this is cryptic, but we now sort of understand it.
We can understand where these terms come from, right? Where big q comes from, right? Kind of important, kind of a central thing in statistical mechanics, but my goodness, more confusing than it needs to be. Okay, you ready for entropy?
This is a Kinney's shoe box. How many people have ever been to Kinney's? It's a shoe store. It's the only shoe box I could find in Google Images.
Okay, we're going to put nickels into this shoe box, a hundred nickels, all heads up. It takes a while. Turn them all over. They're all sitting in there, heads up. Okay, now I'm going to shake the shoe box
and then make sure all the nickels are flat again so I can see whether they're heads or tails. What's going to happen? They're going to stay heads. Who said that? Wrong answer. Shake, take inventory. All right, we didn't shake very hard, obviously.
All right, only a couple turned over. All right, 91 heads, 9 tails. Now I'm going to shake them again. All right, this time 72 heads, 28 tails. I'm going to shake them again, 59 heads, 41 tails.
All right, it will sort of fluctuate around 50-50, won't it, if I keep shaking? It will be unusual if I get exactly 50-50. All right, the system is never, if I shake this system right here, it never goes back in this direction. All right, the probability of me shaking this system
and getting that guy, I'll tell you what it is in just a moment. All right, does everyone agree that we always see this system evolving in this direction? We never see the opposite. Well, let's look at the number of microstates.
All right, 100 heads for 100 coins, right, there's only one way for that to happen. All right, how many ways are there to have 91 heads and 9 tails? Well, for 100 coins, 91 heads and 9 tails,
there's 1.9 trillion ways to do that. What if you have 72 tails and, sorry, 72 heads and 28 tails?
4.99 times 10 to the 24 ways to do that. Right, is that number twice as big as that number? No, it's a trillion times bigger. All right, 59 heads, 41 tails.
2 times 10 to the 28, factor of 10,000. All right, each one of these microstates is, to first order, equally probable as a function of time. According to statistical mechanics, right, this system always evolves in the direction of increasing W, right?
All systems do this. For any isolated assembly, we can always predict the direction of spontaneous change as that in which W increases, right? The number of microstates always increases for every probable reaction that we care about,
right, for every change of state, for every change in volume, all right? It's W increasing that determines the direction of spontaneous change. This is a very important idea.
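The microstate counts quoted for the shoe box can be reproduced with binomial coefficients (a quick sketch using Python's math.comb, not part of the lecture):

```python
from math import comb

# W = number of microstates for 100 coins with a given number of heads
for heads in (100, 91, 72, 59, 50):
    print(heads, comb(100, heads))

# 100 heads: exactly 1 way; 91 heads: about 1.9e12; 72 heads: about 5.0e24;
# 59 heads: about 2e28; 50-50 is the predominant configuration, about 1.0e29.
```

Each shake moves the system toward configurations with astronomically larger W, which is the direction of spontaneous change.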
So remember, any isolated system remains at constant energy, so the system is optimizing another parameter. There's no difference in energy between heads up and tails up. All right, this system has exactly the same energy as this system, right? It's, you know, the difference has got to be, I don't know why there would be any difference, all right?
It's very, very small, between a coin that's heads up and a coin that's heads down. Okay, so this has got nothing to do with energy. It's got nothing to do with energy, right? The direction of spontaneous change is optimizing W.
That's what's determining this direction of spontaneous change. So Boltzmann postulated that this thing called the entropy, all right, is the thing that's getting optimized
and he postulated the entropy is equal to his constant times log of the number of states and he had good reasons for doing this that I haven't the slightest understanding of, okay? But he called this thing S, the entropy,
and he said it's equal to K log W. This was, in his own mind, his most important contribution to science: the derivation of this equation. Okay, so should we do an example? Calculate the standard molar entropy of neon gas at 200 degrees Kelvin and at 298.15 degrees Kelvin.
The standard molar entropy of neon gas. We have an ensemble of atoms at constant temperature, because it's a molar entropy and it's a gas. These molecules are zooming around.
We're not going to have any chance to keep track of them. So they're indistinguishable particles, all right? Can't keep track of them. So this is what's going on, all right? It's a canonical ensemble of indistinguishable particles
and the partition function is equal to the microcanonical partition function to the N, divided by N factorial. Okay, so how do we solve this problem? Let's start with this expression for the entropy that was derived in your book on page 485.
All right, this is equation 15.2. It says the entropy is equal to the internal energy, measured relative to the energy of the ground state, right? So this is the total internal energy divided by temperature, plus the total number of molecules times k times the log of the microcanonical partition function.
This version applies when we're talking about a canonical ensemble of indistinguishable molecules. Notice that N is gone because N is contained within big Q. Now let's write big Q in terms of little q. Let's write this guy in terms of that guy.
Well, we have that already. I mean, we derived that earlier. We're just going to plug this expression, q to the N over N factorial, in for Q. Boom. Okay, and then we're just going to split this into two terms. We're going to pull this N factorial out, move it into the numerator
and put a minus sign in front of it. All right, there. We still have to have the k there. And then q to the N: I can move the N out front, so it's N times k log q now. Okay, so I've got two terms from one just
because I moved the numerator and denominator into different terms. And now we can do two things. We substitute for N; we call it Avogadro's number times the number of moles, where little n is the number of moles. And we use something called Stirling's approximation for log of N factorial.
There's something called Stirling's approximation that allows us to write log of N factorial as N log N minus N. Okay? So now we get this. What did I skip?
Oh, remember Avogadro's number times Boltzmann's constant is just the gas constant, and so I can make a substitution. I'm doing two things: I'm substituting N sub A times little n for N,
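Written out, the chain of substitutions just described (q is the molecular partition function, N = nN_A molecules, and N_A k = R):

```latex
\begin{aligned}
S &= \frac{U}{T} + k\ln Q, \qquad Q = \frac{q^{N}}{N!} \\
  &= \frac{U}{T} + Nk\ln q - k\ln N! \\
  &\approx \frac{U}{T} + Nk\ln q - k\left(N\ln N - N\right) && \text{(Stirling)} \\
  &= \frac{U}{T} + nR\ln q - nR\ln(nN_{A}) + nR
\end{aligned}
```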
and then I'm substituting for R also, right? That gets me this guy right here. I've got R's now instead of Boltzmann's constant. Okay, so what's Q? Well, for an atomic gas, what's the partition function for neon?
What do we have to think about at these temperatures, 200 kelvin and 298 kelvin? What states are accessible to neon at these temperatures? Well, does neon have rotational states? If it rotates,
we don't notice it's rotating. Does it have vibrational states? Nothing to vibrate against; it's a single atom. Does it have any electronic states that we can access at these temperatures? There's no way for you to know that, but the answer is no. It's got no low-lying electronic states
that you could get involved at these low temperatures. Right? So all neon can do is translate around, right? It can translate. Those are the only states it has available to it. Right? It's got no rotational, no vibrational, and no electronic states that are accessible
at these low temperatures. Now, there's no way that you could possibly know about the electronic states; you'd have to be told that. But you'd know about the rotational and the vibrational states. It's an atom, we're going to say. Okay? So, for an atomic gas, translation is the only possible
means by which energy can be stored by the system. Well, that's not quite true. I mean, if there were electronic states that were accessible, in principle they could contribute, but if they don't, then translation is the only way in which energy can be stored by the system.
And under these conditions, the Sackur-Tetrode equation applies. The entropy equals n times R times the log of e to the five halves, times kT over p, that's the pressure, divided by the thermal wavelength cubed: S = nR ln(e^(5/2) kT / (p Lambda^3)).
Did I skip the derivation of this? It's not obvious that this equation comes in any way from this equation right here. Did I leave out a couple of lines of algebra here?
Apparently, I did. Okay, trust me. This equation applies to a monatomic gas. All right, what do we need to know? Well, obviously we need to know the thermal wavelength. We need to know the temperature,
but we already know the temperatures we're working at. Are we missing any other information? We need to know the number of moles, we need to know the pressure, and we need to know the thermal wavelength. This thermal wavelength is given by this equation right here, Lambda = h over the square root of 2 pi m k T, which when I looked at it, the first thing I thought is: if this is a wavelength
and it's got units of distance, I'm missing where these units of distance are coming from. All right, does this mess have units of distance, meters? Well, if you're not sure, you should always do a dimensional analysis.
Because among other things, the dimensional analysis will not only tell you whether you're going to get the right units out of this mess, it'll tell you what units these other parameters have to have in order to get the right units out of the equation that you need. You need units of distance.
Okay, so if I look at this equation right here, what's h? h has units of joule seconds, right? What's m? That's mass, the mass of a single neon atom, units of kilograms. What are the units of Boltzmann's constant? Joules per kelvin. Units of temperature? Kelvin.
Okay, now what do I have to remember? Kilograms, of course, can be reconfigured by just remembering E equals mc squared, so mass is E over c squared. That's joules divided by meters squared per second squared, or joule seconds squared per meter squared. Boom. Joule seconds squared per meter squared,
that's a kilogram. I always just remember E equals MC squared and then joules, kilograms, meters per second. Okay? You get useful conversion factors that way, especially for mass. Okay? And so when I cancel all of these units here,
I get meters. All right, so if I use mass here in units of kilograms, h in joule seconds, and so forth, I do get meters. It's surprising, but you do get them. Okay, so now I can put the numbers in.
There's H. What is that guy? That's the molar mass of neon. All right, I want the mass of one neon atom, so I'm going to multiply by one over the number of atoms in a mole. All right, so that's going to be the mass of a neon atom in units of kilograms now, so remember that because that's the molar mass of neon
in units of kilograms, not grams. Okay, and if I run these numbers in my calculator, I get 2.748 times 10 to the minus 11 meters, or 27.5 picometers. That's a really short distance. All right, this thermal wavelength is tiny.
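That arithmetic is easy to check with a short script; a sketch using standard constant values. Note that the quoted 2.748e-11 m is what comes out at T = 200 K:

```python
import math

h = 6.62607015e-34   # Planck's constant, J s
k_B = 1.380649e-23   # Boltzmann's constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol
m = 20.18e-3 / N_A   # mass of one neon atom, kg (molar mass in kg/mol!)

def thermal_wavelength(T: float) -> float:
    """Thermal de Broglie wavelength, h / sqrt(2 pi m k T), in meters."""
    return h / math.sqrt(2.0 * math.pi * m * k_B * T)

print(thermal_wavelength(200.0))   # ~2.748e-11 m, i.e. 27.5 pm
print(thermal_wavelength(298.15))  # ~2.251e-11 m
```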
All right, even though we're not at a super high temperature here, all right, this is a tiny thermal wavelength, so we don't really have any intuition about what the thermal wavelength should be. All right, it's going to be tiny. It's going to be a tiny number. Now, this is maybe the first intuition that we have on that.
Okay, so now we can plug everything into this Sackur-Tetrode equation. All right, little n is just going to be one. R is 8.31451 joules per kelvin per mole. Boltzmann's constant, temperature.
What the heck is that? All right, and there's the thermal wavelength right there, cubed. And this number, what is that? It's the number of pascals in one atm. Now, we're talking about standard molar entropy.
That means one atm, and that means 101,325 pascals. All right, that's a good number to keep in the back of your mind. Okay, so standard means one atm. That's 101,325 pascals,
and a pascal times a meter cubed is a joule, and so if I run these in my calculator I get 138 joules per kelvin per mole. Entropy is always joules per kelvin per mole, not like the energy, which is joules per mole. It's joules per kelvin per mole if it's the entropy.
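The whole evaluation, at both temperatures in the problem, can be checked in a few lines; a sketch assuming the Sackur-Tetrode form S = nR ln(e^(5/2) kT / (p Lambda^3)) for one mole at 1 atm:

```python
import math

h = 6.62607015e-34   # Planck's constant, J s
k_B = 1.380649e-23   # Boltzmann's constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol
R = 8.31451          # gas constant, J/(K mol)
p = 101325.0         # 1 atm in pascals
m = 20.18e-3 / N_A   # mass of one neon atom, kg

def sackur_tetrode(T: float, n: float = 1.0) -> float:
    """Standard molar entropy of a monatomic ideal gas, J/(K mol) for n = 1."""
    lam = h / math.sqrt(2.0 * math.pi * m * k_B * T)  # thermal wavelength, m
    return n * R * math.log(math.exp(2.5) * k_B * T / (p * lam**3))

print(round(sackur_tetrode(200.0)))   # 138
print(round(sackur_tetrode(298.15)))  # 146
```

Both values match the ones worked out in the lecture: 138 J/(K mol) at 200 K and 146 J/(K mol) at 298.15 K.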
All right, and if I increase the temperature slightly to 298.15 room temperature, I get 146 joules per kelvin per mole. Okay, we're going to do more examples
of the entropy on Wednesday. Now, one confusing thing is that Chapter 13 doesn't say anything about the entropy, and neither does Chapter 14; it's in Chapter 15. We're jumping around a little bit between Chapters 13, 14, and 15. Is everyone sort of comfortable with that?
Hopefully it will be obvious what topics we're emphasizing as we do this. Okay, so we'll see you Wednesday.