
Statistical Mechanics Lecture 7


Formal Metadata

Title
Statistical Mechanics Lecture 7
Part Number
7
Number of Parts
10
Author
Leonard Susskind
License
CC Attribution 3.0 Germany:
You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.

Content Metadata

Abstract
Leonard Susskind addresses the apparent contradiction between the reversibility of classical mechanics and the second law of thermodynamics, which states that entropy generally increases. This topic leads to a discussion of the foundation of chaos theory.
Transcript: English (auto-generated)
Stanford University. All right. What I want to do tonight is just begin. Well, what we're going to do is eventually get to the second law and what the second law means,
how it's consistent. I will try to explain the second law to the best of my ability. There should be lots of questions which I will try to answer. I know a little bit about the second law.
There may be two or three people in the world who know more, but I've never met any. So we'll talk a little about the second law, what it means.
But before we do that, I want to just think about some physical examples that we've learned some information about by doing statistical mechanics. First, I just wanted to illustrate some of the just numerical facts about gases
and so forth by contemplating the speed of sound, speed of sound in a gas, and in particular in a fairly dilute gas. What is the speed of sound? There are two formulas for the speed of sound. But if you think about it for a moment and you start with a very dilute gas,
what would you expect the speed of sound to be? Let's say you make a little over-dense region. How fast does that over-density spread? That's the speed of sound. What would you guess?
Well, I know what I would guess. I would guess that it sure isn't going to be faster than the speed at which a molecule travels out to a given distance. In fact, for a very dilute gas, you might expect that that little over-dense region just spreads out with a velocity that's
not too different than the average velocity of the molecules. How does it spread out? It spreads out by the molecules moving out. How fast do the molecules move out? They move out with their velocity. What is their velocity? Now, then we have to work a little bit
to find out what the velocity is. Let's write down a formula. In thermal equilibrium at a temperature T in a dilute gas, every molecule has an average energy equal to 3 halves kT.
Now, I'm going to put the k in now because we want to do some laboratory numerology. 3 halves k Boltzmann times the temperature, where the temperature is measured in Kelvin. Kelvin means relative to absolute zero, but a difference of one degree corresponds
to one centigrade degree. Nine fifths of a Fahrenheit degree. So it's 3 halves. And you can see from this formula that the temperature goes to zero when the energy goes to zero. You see it right here. So temperature zero, energy zero.
And what is that energy? That energy must be the kinetic energy of the molecule. It's the only energy that it has for an ideal gas in a big box. So it's got to be 1 half the mass of the molecule times the velocity squared. And now that tells us what the average velocity is.
Multiply by 2. I didn't mean to throw away the 3. And now divide by m and take the square root,
but I'll leave it as velocity squared for the moment. Just the velocity squared. And you might expect that that's more or less the speed of sound in a gas. And that's pretty close. There is a more exact, let's see. There is a more exact formula. The more exact formula comes by actually studying
the mechanics of lumps of gas and the forces on them. The forces, I'm not going to work it out. We're not going to do this tonight. We could do it another night. It's not hard. It's easy to do. Newton was able to do it. Well, I mean, you know.
Newton did it. It's not too hard. You know, you take a box of gas, you take a region in the gas, and you first say what forces are on it. The forces have to do with the pressure from the outside.
And there's a net force if there's a gradient of pressure. If the pressure on this wall is the same as the pressure on that wall, there's no net force. So there's some force due to the gradient of pressure. You take that into account. What is the response to the variation in pressure? It's F equals MA. It's acceleration. So the box starts to move.
The acceleration depends on the mass in the box. So the density, the mass density comes into it. But in the end, you find a formula. And the formula, I'm just going to write the formula down. The formula is that the speed of sound squared, this is the velocity of a molecule.
The speed of sound squared is equal, and this is a fairly general formula. This formula is more general than just the ideal gas or an approximately ideal gas. It's c squared equals the derivative of the pressure with respect to the mass density.
Now I'm going to write the mass density as the product of the density of particles, the thing we've already called rho, the density of particles times the mass of an individual particle.
Mass of the individual particle times the density of particles is the mass density. So the standard formula for the square of the speed of sound is the derivative of the pressure with respect to the mass density, which is the same as the derivative of the pressure
with respect to the particle density divided by the mass. Let's see what the ideal gas formula has to say about this. The ideal gas formula that we derived is pressure times volume equals n times, let's put the k Boltzmann in, times the temperature. Or, dividing by the volume, pressure equals n over the volume times k Boltzmann times the temperature.
Now n over the volume is the particle density, so here we have the particle density. This is rho, but it's not the mass density.
If, well, all right, here it is. We have it here. Derivative of p with respect to rho times one over mass. So this is the formula. Here we have p as a function of rho. What is the derivative of p with respect to rho? Derivative of p with respect to rho is equal to k Boltzmann t, just differentiating.
And then we divide by one more mass, and we get k Boltzmann t over m. Not quite the same formula, but for the velocity itself, this is the square of the velocity.
This simplistic formula is wrong by a factor of square root of three. Square root of three is 1.7, so it's not so bad. But order of magnitude, it's essentially correct.
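Collected in one place, the two blackboard formulas being compared are, in the lecture's isothermal ideal-gas setting (an editor's reconstruction of the board work):

```latex
\tfrac{3}{2} k_B T = \tfrac{1}{2} m \langle v^2 \rangle
\;\;\Rightarrow\;\;
\langle v^2 \rangle = \frac{3 k_B T}{m}
\qquad \text{(naive molecular estimate)}

c^2 = \frac{\partial P}{\partial \rho_{\mathrm{mass}}}
    = \frac{1}{m}\,\frac{\partial P}{\partial \rho}
\;\;\xrightarrow{\;P = \rho\, k_B T\;}\;\;
c^2 = \frac{k_B T}{m}
\qquad \text{(isothermal ideal-gas formula)}
```

The ratio of the two speeds is the square root of three, as stated above.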
We could, let's work it out. Let's work it out for the air. I have some numbers written down. Let's use this formula.
You just stick in some numbers. Either formula, the velocity will be the same to within a factor of 1.7 or whatever it is. All right, so if you stick in, k Boltzmann equals 1.3 times 10 to the minus 23rd. The mass, what about the mass of an air molecule?
Air molecules are mostly made out of nitrogen, and nitrogen is one of these dimolecular, what do you call it, two atoms, a diatomic molecule, all right? Nitrogen has atomic weight 14, if I remember, and so the mass of a nitrogen or the mass of an air molecule
is about 30, 28, 30 times the mass of a proton, and you look that up, that winds up being about four times 10 to the minus 26th kilograms.
What else do we need to know? Oh, and we need to know room temperature. We're working at room temperature. Room temperature is 300, 300 degrees in the same units where k Boltzmann is equal to this number here.
So stick it in, what do you get? When I stuck it in, I got the velocity itself to be about 500, yeah, 500 meters per second.
That's what you get out of this formula. It's a little on the high side. The right answer is about 300 meters per second. Yeah, no, no, the neutrons are in there.
14 means 14 protons plus neutrons, that's in there. Yeah, atomic weight is 14, not the atomic number.
Well, no, maybe, okay, all right, good, good point, maybe, got to think about it.
But the point, yeah, you may be right, you may well be right. So, but in any case, the orders of magnitude are correct.
And I suppose this, I don't know who the first one to actually do this calculation was. Actually sit down and say, what is the speed of sound given the atomic weight of nitrogen, given Boltzmann's constant. Of course, they didn't need that, in order to use, where's the other equation?
The p by the rho, all I had to do was to measure the pressure as a function of density, so they didn't have to do anything fancy. But I'm not sure who was the first one to realize that it was basically the velocity of a molecule.
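As a numerical sanity check of the figures just quoted, here is a minimal Python sketch (an editor's illustration, not part of the lecture; the constants are standard SI values):

```python
import math

k_B = 1.38e-23   # Boltzmann constant, J/K
T   = 300.0      # room temperature, K
m   = 4.7e-26    # mass of an N2 molecule, about 28 proton masses, in kg

# Naive estimate: set (3/2) k_B T = (1/2) m v^2 and solve for v.
v_molecule = math.sqrt(3 * k_B * T / m)

# Isothermal ideal-gas sound speed: c^2 = dP/d(rho_mass) = k_B T / m.
c_sound = math.sqrt(k_B * T / m)

print(f"rms molecular speed:    {v_molecule:.0f} m/s")  # ~510 m/s
print(f"isothermal sound speed: {c_sound:.0f} m/s")     # ~300 m/s
print(f"ratio: {v_molecule / c_sound:.2f}")             # sqrt(3) ~ 1.73
```

These reproduce the lecture's rough numbers: about 500 m/s from the molecular estimate, about 300 m/s from the dP-by-d-rho formula.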
Question? All right, now, I want to come to a puzzle. Well, let's see, are we ready for the puzzle? No, we're not ready for the puzzle. We're going, yeah.
Which is constant? Yeah, I think it's a fairly good assumption. The, of course, yes, we did assume the temperature was constant.
I think that's probably very good for small amplitude sound waves. Imagine for small amplitude sound waves, the temperature doesn't vary very much as it vibrates. Now, again, I'm not absolutely sure about that.
I would guess that the temperature has very, very little variation in a near ideal gas for very, very low amplitude sound waves. So, I think that's probably okay.
What's that? The variation of pressure is also small, yeah. Yeah, the variation of pressure is small. The variation of density is small. But the variation of pressure with respect to density is not necessarily small.
I mean, usually, I mean, I expect that when something is, if you increase the pressure of it, you'd think that the speed of sound would go faster, because it's sort of like how a solid has a faster speed of sound than a liquid does.
Well, this dPd rho is calculated at the pressure that you're working at. dPd rho is not necessarily a constant. The pressure depends on the density in a complicated way. In particular, as the molecules get really squeezed together, the derivative of pressure with respect to rho gets very, very large.
So, you're right, but it's because this changes. Yeah. You'd expect v-naught to be a constant, but something related to... You'd expect, you'd expect what?
I mean, the speed of sound, you'd expect it to be related to pressure somehow. Yeah, sure it is. Sure it is. This quantity depends on pressure. No, no, no, no, this is at, I'm sorry, okay. That's right. This is at low pressure, or better yet low density, where the gas can be approximated as an ideal gas.
It doesn't, in the ideal gas range, it doesn't depend on pressure. But, as you, you know, the reason it doesn't is because if you go to the formula that pressure is equal to rho times the temperature,
at fixed temperature, pressure is just proportional to rho, but as the pressure gets larger and larger, this formula is going to break down. For example, it might start varying like rho squared, in which case, as you say, the speed of sound will depend on the pressure.
Just to come back to the idea that you started with where the velocity of sound is about equal to the velocity of a molecule. I imagine that idea would break down also as the density gets larger.
Probably. Yeah, I think it probably does. You know, I never thought about it very much. This is as much, this is the depth that I've gone into it. Yeah. Quick question. Sorry I missed out. Why is the mass 30 times the mass of the proton?
Well, I've made an approximation. 30 is equal to 28.
Let's move on. Let's move on to the harmonic oscillator. Now we have not a free particle in a gas, but I'm going to put into the gas or into the system, whatever it happens to be, whatever the heat bath is,
I'm going to put in a single harmonic oscillator. It's part of the system and I'm going to treat the rest of the system as a heat bath and ask about the property of the harmonic oscillator.
Now the harmonic oscillator could be a spring with a mass on it or anything else which naturally oscillates. It could be an electromagnetic wave in a cavity.
What else oscillates? What's another example of an oscillator? Sorry. Crystal. All sorts of oscillations take place in crystals. That's right. So there's lots and lots and lots of oscillating systems in nature.
Almost anything will oscillate if you just disturb it a little bit. So when we say the harmonic oscillator, we're studying an extremely wide class of systems. Anything which when disturbed a little bit away from equilibrium or away from its ground state will oscillate and that covers almost anything you can think about.
So let's do the statistical mechanics of the harmonic oscillator. Interesting question. What is the average energy of the oscillator? How does it depend on its mass? How does it depend on the spring constant and so forth?
And see if we can work out the answer to that question. So what do we do? We begin with an energy, with an expression for the energy. We're going to put it into the Boltzmann distribution and then we're going to calculate average quantities for the oscillator. Okay. What is the energy?
The energy is one-half mass times X dot squared, where X is the coordinate of the oscillator away from its equilibrium position, plus the spring constant times X squared over two.
This is not the Boltzmann constant. Now we are finished with Boltzmann. Boltzmann will fade into the background now. We no longer will keep track of the Boltzmann constant. K here is the spring constant. The Hooke's Law spring constant. Or equivalently, if we go to momentum instead of velocity, it becomes P squared divided by two M.
Momentum squared divided by two M plus the same KX squared over two. So that's our expression for energy.
This is our expression for energy. And now we want to write the Boltzmann distribution for this system here. And that's E to the minus beta. Beta is the inverse temperature. Now what determines the temperature?
The heat bath that it's immersed in. So if the oscillator was in this room, it would be 300 degrees. If it was in liquid helium, it would be three degrees, four degrees, whatever. If it was in the middle of the sun, it would evaporate and it wouldn't be there. But you know what I mean.
Okay. E to the minus beta. P squared over two M. I'm going to factorize it. E to the minus beta KX squared over two. I've simply used the fact that the exponential of a sum is just a product of exponentials.
So I've factorized it like this. That's the expression for E to the minus beta times the energy. What do we do with it? We calculate our friend, the partition function. And the partition function is our best friend in statistical mechanics. So we calculate it. And that's equal to the sum of all configurations, which in this case means DP DX.
Now in previous situations, the X integration just gave us the volume of the container that the box was in. Now that's not true, because the integrand actually depends on X.
DP DX. Two separate integrals. It's just a product of two integrals. We can write it. Let's call it the P integral and the X integral. I put up the P downstairs here because I didn't have room to write DP. But that's what I mean.
Integral DP. Integral DX. Okay. So all we have to do now is to do these two integrals. Remember that numerical constants in the partition function, multiplicative constants don't make any difference. Why not? Because when you take the logarithm of the partition function, they correspond to additive constants.
And since we're always going to be differentiating the partition function, we don't care about numerical constants. So, forgetting the numerical constants, we want to do these integrals. The P integration, we do the same way that we did before. I'll just remind you how we did it.
We said that P squared over 2M beta, the thing occurring here, we changed variables and we called that Q squared. And obviously that means that P is equal to the square root of 2M over beta times Q.
Once we do that, this integral here becomes a new integral. We have DP here. What is DP? DP is the square root of 2M over beta times DQ.
So, the first thing we do is pull out a factor: the square root of 2M over beta.
So first of all, we have the factor of square root of 2M over beta. That's coming from DP. Then we can write DQ. And this thing just becomes E to the minus Q squared. This is a definite integral. What are the ends of the integration?
Minus infinity to infinity. Well, the momentum can be anything. Minus infinity to infinity. And this is a number. If you happen to remember, the number happens to be the square root of pi. But who cares? It's a numerical constant that we don't care about.
But let's put it in there anyway: square root of pi. That's this integral and we're finished with it. We've seen it before. It's the same one that occurred in the ideal gas. The new integration that occurs, so this integral has been replaced by this.
What about this one over here? We play exactly the same game. Let's make it, this is DX. Change of variables. X squared beta spring constant over 2. We will call Y squared.
That means that X is equal to square root of 2 over beta kappa times Y. Alright, we have DX here.
So the DX is going to become a DY. It will give us an additional factor of the square root of 2 over beta k, the spring constant. That's the DX. No pi from this step; just the square root of 2 over beta k comes from here.
And then the integral just becomes E to the minus Y squared DY. By changing variables, we turn the integral into just a simple Gaussian integral of E to the minus Y squared. And that again is equal to the square root of pi. So that's our whole answer, very simple.
That's the whole partition function, Z. Let's write it in Z. And it's equal, first of all it has a 2 pi. The 2 pi is of no particular interest to us.
It contains also a square root of M over kappa. Square root of M over kappa. M from here, kappa from here. Anybody know what the square root of M over kappa is?
What about the frequency of the oscillator? Do you remember the frequency of a harmonic oscillator in terms of its mass and its spring constant? So we could just write this as 1 over the frequency and I'll do so. But of course the frequency is a constant.
But nevertheless I want to keep it for now. 2 pi over omega times 1 divided by beta. That's Z. Now let's calculate the energy of the oscillator. How do we do that? We calculate the logarithm of Z and differentiate it with respect to beta.
That's the formula that we've learned to use and to trust. So log Z is equal. There's a log of 2 pi over omega. That's a constant. So let's write it as a constant.
And then minus log beta. Minus log beta. That's equal to log Z. And now what we want to do is we want to calculate minus the derivative of log Z with respect to beta. What does that equal to?
Remember? The energy. The average energy. That's the energy: minus d log Z by d beta. You know, we're cruising now. We're not thinking. We're just chasing symbols. The constant gives us nothing. The minus sign takes away the minus sign from here.
And the derivative of log beta with respect to beta is just 1 over beta. 1 over beta, yeah. Which also is better known as the temperature.
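Restating the board work compactly (an editor's reconstruction):

```latex
Z = \int_{-\infty}^{\infty}\! dp \int_{-\infty}^{\infty}\! dx\;
    e^{-\beta p^2/2m}\, e^{-\beta \kappa x^2/2}
  = \sqrt{\frac{2m}{\beta}}\,\sqrt{\frac{2}{\beta\kappa}}\;\pi
  = \frac{2\pi}{\omega}\,\frac{1}{\beta},
\qquad \omega = \sqrt{\kappa/m}

\langle E \rangle = -\frac{\partial \ln Z}{\partial \beta}
  = -\frac{\partial}{\partial \beta}\bigl(\mathrm{const} - \ln \beta\bigr)
  = \frac{1}{\beta} = T
```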
Not so different than the 3 halves kT. Well, let's just write 3 halves T for the particle in the gas. Now, the first question is: why no 3?
Well, because this is just a one-dimensional oscillator, oscillating along an axis. So there's no 3. But then why is it T and not 1 half T? The reason is that there were 2 integrals, each of which gave us a 1 half log beta.
The first integral here gave us a 1 over square root of beta. The second one also gave us a 1 over square root of beta. If there would have been more integrals of this type, for whatever reason, more dimensions, more degrees of freedom, whatever, we would have gotten a 1 half log beta for each of them.
1 half log beta means 1 half T for each of them. So for each integration that we have to do, which has this form of E to the minus X squared,
they're called Gaussian integrals, for each integration like that that we have to do, there is basically a 1 half the temperature in the energy. What are the 2 kinds of energies, incidentally, that we're talking about? We're talking about the energy of an oscillator.
It's kinetic energy and potential energy. In fact, the average kinetic energy and the average potential energy are equal, each one equal to 1 over 2 beta, that is, 1 half the temperature. So the kinetic energy is 1 half the temperature. The potential energy is 1 half the temperature.
Together, they give you the temperature. Now, there are a couple of interesting things to notice about this, and they're both very similar. The first is that the answer doesn't depend on the mass of the oscillator. That was also true of the particle in the gas.
At a given temperature, the kinetic energy of the particle did not depend on the mass of the particle. The velocity did, because 1 half mv squared is the energy, and 1 half mv squared is the same for all of the different mass particles.
v squared, of course, will be smaller for heavy particles. The same is true here. The average kinetic energy is the temperature, and it's independent of the mass of the oscillator.
It's also independent of the spring constant. That's a little surprising if you think about it, because imagine making the spring constant huge. Imagine making the spring constant so large that no known force on Earth
can or elsewhere can stretch this spring away from its equilibrium point. One would think under such circumstances that it's not a spring. It's just a constrained thing which has a fixed length,
perhaps the length being zero, which cannot be changed. Under those circumstances, any sensible person would say, you can't excite it. You can't give it any energy. There's no way to start it vibrating.
There's no way to give it any structure at all. You can't give it any kinetic energy, because it's absolutely locked in place. You can't give it any potential energy, because you can't pull it away from that point, no matter how much force you exert on it.
Yet, the formula doesn't seem to care. The formula doesn't seem to care. There always seems to be energy kT for every oscillator, no matter how unphysical, no matter how hard it is to get that oscillator going. Now, there's something wrong.
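To see numerically that the classical answer really doesn't care about the mass or the stiffness, here is a small sketch (an editor's illustration; it evaluates the Boltzmann averages by brute-force quadrature, in units where k_B = 1):

```python
import numpy as np

def trap(y, x):
    # Simple trapezoid rule, to stay independent of numpy version details.
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def classical_avg_energy(m, kappa, beta, n=20001):
    # The Boltzmann weight factorizes, so average KE and PE separately.
    p = np.linspace(-10, 10, n) * np.sqrt(m / beta)      # covers the p-Gaussian
    x = np.linspace(-10, 10, n) / np.sqrt(beta * kappa)  # covers the x-Gaussian
    wp = np.exp(-beta * p**2 / (2 * m))
    wx = np.exp(-beta * kappa * x**2 / 2)
    ke = trap(p**2 / (2 * m) * wp, p) / trap(wp, p)
    pe = trap(kappa * x**2 / 2 * wx, x) / trap(wx, x)
    return ke + pe

beta = 2.0
for m, kappa in [(1.0, 1.0), (1.0, 1e6), (50.0, 1e12)]:
    print(f"m={m}, kappa={kappa}: <E> = {classical_avg_energy(m, kappa, beta):.4f}")
# Every line prints ~0.5000 = 1/beta: the mass and spring constant drop out.
```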
Question? If k goes to infinity, Z goes to zero. But k doesn't appear in the energy. K appears here, in the partition function, but it doesn't appear here, in the energy.
Once you take the logarithm, the k appears as an additive constant. Let's let k be 10 to the 500, okay? There's something wrong about the limits here. There's something wrong, but the something that's wrong is that we've ignored one very crucial feature of nature.
We have not ignored anything about classical mechanics. This is a correct classical mechanical conclusion. On the other hand, it's crazy. Let me give you an example of the kind of craziness that would be involved.
We talked about a particle. Before we move on, I just had a question about the partition function. When we did the analogous computation previously, and there was no dependence on the position, we then got the volume to the Nth power. But you divided by an N factorial to take out some repetition of states.
Yeah, yeah, but it didn't make any difference. All it did was add a multiplicative constant in the partition function. No, it's only one particle. It's only one oscillator. Yeah, well, one oscillator, that's all.
This is the world's simplest system, an oscillating system in a heat bath, okay? And what we find is the odd conclusion that the energy stored in that oscillator is independent of the spring constant and the mass. Let's say, especially the spring constant. Let's take the mass to be fixed. But now we just jack up the spring constant, not 10 to the 500,
10 to the 10 to the 500, all right? For all practical purposes, it must be that it's infinite, and yet the formula takes that infinity or that large number, puts it here, and still the derivative of log z with respect to beta is equal to the temperature.
Okay, so there's something funny going on. This occupied people around the turn of the twentieth century. They were very confused by it. And the reason they were confused by it is because the ideal gas law and the energy per particle seemed to work pretty well,
where the particle was treated as a point particle. But supposing that point particle, that molecule, was actually not a single point particle, which molecules are not. They might be a diatomic molecule.
Diatomic molecules can vibrate. They're little oscillators. They are little oscillators. And so one might have expected that, and of course they're rather stiff oscillators. They are rather stiff. They have a good solid spring constant.
So you have to give them quite a bit of energy to, well, you have to give them quite a knock, I should say, quite a knock to start them vibrating. But the formula seems to say that for a diatomic molecule,
you ought to have not 3 halves kT, but maybe 5 halves kT to account for an extra harmonic oscillator that could start to vibrate. Well, it was known that that energy wasn't there. So you could think of two conclusions. One conclusion is that the diatomic molecule is not really a diatomic molecule at all.
It's a point particle. But that's ridiculous. The other conclusion is that you're doing something wrong or you're missing some ingredient of physics here, which tends to keep that oscillator from having this much energy.
You of course all know what the ingredient is, right? Good. I won't tell you. It is of course quantum mechanics. Right. So what we want to do and to compare with this calculation is the quantum mechanical calculation, the corresponding quantum mechanical calculation of, again, a quantum mechanical oscillator in equilibrium.
How much do we have to know about quantum mechanics? Very little. We need the expression for the energy of a harmonic oscillator.
And most of you I'm sure know that harmonic oscillator energies are quantized. They come in discrete multiples and the discrete multiples are discrete multiples of Planck's constant times what?
Planck's constant times the frequency. So the energy of an oscillator is n units of energy where each unit of energy costs h bar omega.
The set of energy levels, the set of states of a quantum harmonic oscillator is just this discrete set of states all equally spaced. And that's all that we have to know about the quantum harmonic oscillator. The only quantum mechanics we need to know is that the energy of a single harmonic oscillator comes in discrete integer multiples of h bar omega.
Now we can calculate the partition function for the quantum mechanical oscillator. Remember what the answer is? The answer is always z is the sum of all states.
For the classical oscillator, we replace that by an integral. For the quantum mechanical oscillator, it's truly a sum. It's a sum over n of e to the minus beta times the energy of the nth state. And the energy of the nth state is n h bar omega.
I'm keeping h bar but throwing away the Boltzmann constant. I refuse to keep track of both of them. There's too much. Okay. So this is the formula for the partition function.
It looks like a complicated mess of exponentials. How do you sum that up? Boy, it looks hard. Not hard. It's very easy. E to the minus beta n h bar omega is e to the minus beta h bar omega to the nth power. So this is a series. The first term is just one. That's when n is equal to zero.
So this is a series that has the form one plus a number. Let's call it x, namely this number in here, plus x squared, plus x cubed, plus dot, dot, dot. And that's a geometric series.
That's a geometric series which is very easy to sum up. The answer is that this is equal to one divided by one minus x. That's the geometric series where x, where this thing in here is just x. So now we can sum it up.
Calculate the partition function. It's z is equal to one divided by one minus e to the minus beta h bar omega. And that's it.
That's the whole calculation of the partition function. It's fairly simple. You can write it in a number of ways, but I think this is probably the best way to write it. Just leave it in this form. There's other ways to write it, but this is fine.
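A quick check of the geometric-series sum (an editor's sketch; the truncated sum agrees with the closed form for any positive beta times h-bar omega):

```python
import math

beta, hbar_omega = 1.0, 0.5
x = math.exp(-beta * hbar_omega)        # the ratio of the geometric series
Z_sum = sum(x**n for n in range(1000))  # truncated sum over energy levels
Z_closed = 1.0 / (1.0 - x)              # 1 / (1 - e^{-beta hbar omega})
print(Z_sum, Z_closed)                  # agree to machine precision
```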
What do we want to do? Well, we want to calculate the energy. The energy is minus the derivative of log z, but in this case, it's a little easier just to take it to be minus one over z times the derivative of z with respect to beta.
This is equal to minus the derivative of log z with respect to beta, just using the property of the logarithm, that the derivative of the logarithm gives you a one over z. In this case, I think it's a little easier just to calculate directly and not bother taking the logarithm.
Okay, so what is dz by d beta? Well, there's some ugly expression in the denominator. When you differentiate the thing with the thing in the denominator, you get the denominator squared. So we get dz by d beta.
Let's just work it out. dz by d beta will contain a one divided by one minus e to the minus beta h bar omega all squared. That's just coming from differentiating a denominator. One over x, one over x squared. Yeah, minus. Okay, next we have to differentiate the argument.
We have to differentiate what's in the denominator with respect to beta. That's going to give us another minus sign from here and another two minus signs. I think the whole thing will have one minus sign in it. Yeah, there will be one minus sign left over.
You get a minus sign from here and you get another minus sign from here. But basically, it's just the derivative in the numerator here of the exponential. Well, what's the derivative of the exponential with respect to beta? It's h bar omega times e to the minus beta h bar omega.
That's it. Hm? Minus here. Yeah. Okay. Now we have to divide by z.
That means multiply by one minus e to the minus beta h bar omega. Multiply by that, and it gets rid of one factor here. And what is that? That's minus the energy. We want plus the energy.
So, let's put a minus sign here. By sheer luck, we wound up with a positive quantity. This is the answer.
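Written out cleanly, the differentiation just performed gives (an editor's reconstruction):

```latex
E = -\frac{1}{Z}\frac{\partial Z}{\partial \beta}
  = \frac{\hbar\omega\, e^{-\beta\hbar\omega}}{1 - e^{-\beta\hbar\omega}}
  = \frac{\hbar\omega}{e^{\beta\hbar\omega} - 1}
```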
This is the answer. What is this compared with? This is compared with one over beta. So, what's the relationship between them? Well, let's first go to high temperature. Let's first ask what this is like at high temperature. High temperature is a situation where the classical theory is good.
Why is the classical theory good? Well, each one of these little quanta of energy here is thought of as very small. A classical spring has a great many units of energy.
So, in that sense, the classical system in quantum units has a lot of energy. That must mean that it has very high temperature in some quantum sense. Quantum systems become classical when the temperature is very high.
When the temperature is high, they have a lot of energy and the quantization of energy becomes unimportant. So, let's go to high temperature and see what we have. Where did I? Ah, here it is.
This is the energy. Let's erase everything else and just call this the energy. High temperature means beta big. No, no, sorry. High temperature means beta very small. Beta is one over the temperature. So, high temperature means beta small.
Beta small, if we take the limit that beta gets very, very near zero, this is just approximately one and gets closer and closer to one as beta gets smaller and smaller. So, here's the answer. H-bar omega, this factor here just goes to one.
We could do the same thing with this factor here, but then we get in trouble. One minus one, bad thing to do. So, what we do is we expand this. It's one minus, and now e to the minus beta h-bar omega is one minus beta h-bar omega, to first order.
I'm using the expansion that e to the x is one plus x plus x squared over two and so forth.
And only keeping track of the first term. I have to keep track of the first, actually it's the second term; the first term cancels out. Okay, so it's not good to just take the first approximation, one.
The second term is to approximate the exponential by just x, which is minus beta h-bar omega, which just gives us beta h-bar omega in the denominator. And the next terms after that are proportional to beta squared, beta cubed, beta fourth and so forth.
And they're much smaller than this. They can be ignored when beta gets very small. What do we get? We get one over beta, one over beta. So, you see in the high temperature situation, it just reproduces the classical physics.
Okay, but let's ask about the low temperatures. What happens at low temperatures? At low temperatures, that means large beta. Where's the energy?
All right, small beta, sorry, low temperatures, large beta, large beta. Large beta, this exponential is small. E to the minus a large number is small. So, now we're going to low temperature, large beta.
This is negligible. If beta is 100 million or something, very, very low temperature, then this is negligible. We just get one in the denominator. So, the answer is H bar omega times E to the minus beta H bar omega.
But again, this exponential is extremely small. So, we see far from getting just the temperature, we get something that's exponentially small when the temperature goes to zero.
When the temperature goes to zero, which means beta gets large, this becomes enormously small because it's exponentially small. E to the minus beta, where beta is large, is very, very small. So, what do we find? We find that quantum mechanics tends to suppress the energy of an oscillator when the temperature is low enough.
Very low temperatures, it doesn't behave like the classical oscillator at all. It has much less energy than the corresponding classical oscillator would have had. The question is where is the crossover?
Where does it go from being quantum to classical? And if you look at a formula like this, you would know that the crossover is where the exponential goes from being small to being of order one. The biggest the exponential ever gets is one, e to the minus zero.
And when does that happen? That happens when beta H bar omega is about equal to one. The crossover between the high temperature behavior and the low temperature behavior is when what's in the exponential here is about one.
For very low temperatures, this quantity here is large. For very high temperatures, this quantity is small. And it's this exponential which controls whether the discreteness of the energy levels is important or not. So, let's write where the crossover is.
The crossover happens when beta, crossover means the transition from quantum behavior to classical behavior. It happens when beta H bar omega is about equal to one.
Now, let's see, which way does it go? If beta H bar omega is bigger than one, is that quantum or classical? That's quantum because it's low temperature. Bigger than one, quantum.
Beta H bar omega less than one, it starts to behave classically. Let's just write one other equation. Beta is one over the temperature.
And so that says H bar omega is equal to the temperature. That's where it crosses over. If you look at it, it makes a lot of sense. It says the crossover is when the energy of the oscillator classically is equal to one quantum's worth of quantum mechanical energy.
If you try to make the temperature lower than that, the oscillator has less than one quantum's worth of energy. The temperature is so low that it has less than one quantum's worth of energy.
It doesn't want to do that. It doesn't want to have less than one quantum's worth of energy. And so the partition function, or the expression for the energy, is almost zero, exponentially small. Here's the crossover point, or let's write it: h bar omega greater than the temperature is quantum, h bar omega less than the temperature is classical.
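Here is a small numerical illustration of the two regimes (an editor's sketch, in units where h-bar omega = 1 and k_B = 1):

```python
import math

hbar_omega = 1.0

def quantum_energy(T):
    # Average energy of a quantum oscillator: hbar*omega / (e^{beta hbar omega} - 1).
    beta = 1.0 / T
    return hbar_omega / math.expm1(beta * hbar_omega)

for T in [0.05, 0.2, 1.0, 5.0, 50.0]:
    print(f"T = {T:>5}: quantum <E> = {quantum_energy(T):.4g}, classical <E> = {T}")
# For T >> hbar*omega the two agree; for T << hbar*omega the quantum energy
# is exponentially suppressed -- the oscillator is frozen out. The crossover
# sits near T = hbar*omega = 1, exactly as argued above.
```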
Okay, so now we actually know the answer to the puzzle about the diatomic molecule. When the temperature is too low, that's when the temperature is too low, we're deep in the quantum regime.
And when you're in the quantum regime, the oscillator has exponentially small energy compared to what it would be classically.
On the other hand, when the temperature goes up, at some point, the temperature will get larger than H bar omega. At that point, you begin to activate the oscillator. You can't activate the quantum oscillator unless the temperature is such that it corresponds to more than one quantum's worth of energy.
That's what this says. So where are we? At low temperatures, up to some temperature, the diatomic molecule behaves like a monatomic molecule. You cannot get the vibrations going. But at some temperature, if you raise the temperature enough so that the temperature becomes bigger than the quantum of energy,
the diatomic molecule starts to behave like a diatomic molecule. Now, that temperature's pretty high. For a good stiff molecule, oh, here's another point.
The stiffness determines omega. The stiffness of the oscillator, what is it, the square root of K over M? The stiffer the oscillator, the larger the frequency. So the crossover point, the stiffer the oscillator, the higher the temperature of the crossover point. Well, you know, molecules are pretty stiff on the scale of rubber bands.
And the temperature at which the molecule goes from looking like a monatomic molecule to being a diatomic molecule is fairly high. But what happens if you raise the temperature even a little more?
Oh, incidentally, often before this happens, something else starts to happen. A molecule starts to rotate. Other things start to happen. This is just one example. What happens if you heat it even more? You activate even worse things. The atoms themselves start to vibrate, and eventually they'll get ionized.
So when they become ionized, it doesn't behave like an atom at all. It behaves like an electron and a proton. So this is an example of quantum mechanics solving a deep problem that physicists have been very, very confused about.
In particular, I'll tell you where this came up. Solids are simply systems of molecules where the molecules are arranged in a lattice. They vibrate around the lattice centers. So each atom is sort of an oscillator.
The energy of the oscillators as a function of the temperature, the amount of energy as a function of the temperature determines the specific heat of the crystal. The specific heat of a crystal with a very, very large number of oscillators, or the
heat capacity, was estimated by classical physics to be much, much larger than the actual answer. It was Einstein who basically figured out that what was going on is that
the quantum statistical mechanics of oscillators suppressed the oscillators until the temperature got high enough. So this is another Einstein contribution, one of his minor, one of his small minor contributions.
I thought he got the Nobel Prize for the photoelectric effect. I think so. Okay, but this is neither of them. So when was this done? When did he do this? I mean, quantum mechanics had already come in? Well, remember that Einstein invented quantum mechanics in 1905, so I'm not sure exactly.
Somebody look it up, specific heat of solids. No. It was early. It was early. It was long before people understood the foundations of quantum mechanics. He just knew that the quantum energy levels had to be proportional to frequency and to N.
No. No. Yeah. It was based on discrete states and the partition function, summing over that discrete set of states.
When we went from the sum to the integral, weren't we inherently assuming that...
I would say the ultimate justification was the quantum mechanics,
the quantum derivation of the, for example, ideal gas. But here's an example of where the quantum derivation reproduces in the high temperatures the classical answer.
So really the right logic, the real serious logic is to start with quantum mechanics and show that the quantum mechanical counting of states corresponds to the integral over phase space. And that's true. I just give you an example here. Do the calculation as classical physicists who would ordinarily do it, here it is, and we got, what do we get?
Let's go back actually to the partition function itself. Yeah, let's just look at the partition function itself, not
even take the energy. Let's just see how close we get. Okay, this partition function at high temperatures, meaning small beta, is one divided by one minus e to the minus beta h bar omega, which is approximately one over beta h bar omega, right?
H bar is a constant, incidentally so is omega, but H bar is a numerical constant that sits in front of all partition functions.
Let's not worry about it. One over beta omega. Now what was the partition function that we said for the classical case? Here it is. One over beta omega. The two pi is again a constant, not particularly interesting, but its dependence on beta and omega is one over beta omega here, one over beta omega here.
The quantum mechanics only comes in in this multiplicative constant one over H bar, which doesn't matter when you differentiate it. So here's an example of the derivation of the classical statistical mechanics from the quantum mechanics and that's really the right way to go.
But let me tell you, this formula was known long before quantum mechanics, so they were doing something right. They intuited that phase space was the right space, XP
space, momentum position space was the right way to think about classical systems. Yeah. 1907. Yeah, I figured it was very early. Right.
Well, excuse me. Classically or quantum mechanically?
Well, the energy stays equal to the temperature. Once it's above the quantum threshold, the energy is just equal to the temperature and it does not matter what the spring constant is.
It speeds up and makes larger oscillations as you raise the temperature. The point is that as you heat it from low temperature to high temperature, it's the paradox of saying that the oscillator
always has that energy independently of how stiff it is. What's true is the place where it begins to have that energy, in other words, the temperature at which the
classical behavior sets in gets to be a higher and higher temperature as a function of the spring constant. So, if you took a very, you know, this 10 to the 10 to the 10th spring constant, yes, it's sufficiently high temperature, it would have an energy equal to its temperature.
But at any kind of ordinary temperature, it would essentially have no energy. The threshold for it to start behaving classically would get higher and higher and higher as the spring constant gets stiffer and stiffer. Now, that's kind of interesting because what it means is that as you heat a system, you start to discover that it has more and more degrees of freedom.
Let's make a crazy molecule now. A crazy molecule consists of a spring with two atoms connected to it, on a very stiff spring at that. And then this atom over here turns out to be two particles connected with a really
stiff spring, a much, much stiffer spring. All right, so how does it look at low temperatures? At low temperatures, there's not enough temperature to overcome the quantum threshold for this oscillator to
start oscillating. So, it behaves like a point. It behaves with three-halves kT. You heat it up to the point where the frequency of this oscillator is such that the temperature activates the oscillations,
but still not enough temperature to activate this super-duper strong spring here. It behaves like a diatomic molecule. You heat it more, it starts to behave as if it had three particles connected by springs.
Instead of five-halves kT, I don't know, maybe a seven-halves, whatever the right number is. So, the lesson is as you heat the system, more and more degrees of freedom become activated and you start to discover the complexity of things. You don't discover the complexity of things at low temperature.
You discover the quantum behavior of things. As you heat it, you start to discover complexity and you start to discover degree of freedom by degree of freedom. They unfreeze themselves from a quantum constraint and start behaving classically. Now, some of you may realize this has to do a lot with
blackbody radiation, but we'll come to that. Well, to relate it, I won't relate it right now to blackbody radiation, but let's relate it to something else. A violin string. Take a violin string. A violin string consists mathematically of a large number
of oscillators. The oscillators are the violin string or the guitar string oscillating in its lowest harmonic. Plus, on top of that, you can superpose the first harmonic, the second harmonic, and however many
harmonics you like. Fundamentally, the mathematics of it is that a vibrating string is basically an infinite number of harmonic oscillators. The infinite number of them correspond to shorter and shorter wavelengths.
If each one of these oscillators had energy equal to the temperature and there are an infinite number of them, that string would have an infinite amount of energy when it came to thermal equilibrium. Something's wrong.
Now, the answer is that most of these oscillations have very high frequency. The shorter the wavelength, the higher the frequency of the oscillator. And therefore, most of these oscillators at any given temperature are frozen out by quantum mechanics, not free to oscillate. As the temperature goes up and up, more of these
oscillations begin to oscillate, begin to have energy. So that's another, similar paradox.
Just studying the specific heat of crystals, it's very obvious. In fact, I mean, it was obvious to the experimental physicists around 1900 that there was something wrong
because their crystals did not behave according to classical mechanics. So yes, it's extremely, extremely easy to see this. Yep. As you heat the crystal, the specific heat starts to behave
as if there was a certain number of kT, a certain number of T per oscillator. Yeah. This is really easy to see.
Right. Right. The partition function would be the product of all the partition functions. The logarithm would be the sum. Absolutely. Yes. The logarithm being the sum would mean that the energy is the sum of all the oscillators.
Right. But those whose, where's our transition point? Those whose frequency is too high above the temperature, those are in the quantum regime and they're not appreciably excited. Okay.
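The same point for a whole collection of modes, string harmonics or crystal vibrations, can be seen in a few lines (an editor's sketch; the mode frequencies are taken to be the harmonic series n times omega1, in units where h-bar = k_B = 1):

```python
import math

def total_energy(T, omega1=1.0, n_modes=10000):
    # Sum the quantum oscillator energies over modes of frequency n * omega1.
    beta = 1.0 / T
    total = 0.0
    for n in range(1, n_modes + 1):
        x = beta * n * omega1
        if x > 700:      # e^x would overflow a float; the mode is frozen out anyway
            break
        total += n * omega1 / math.expm1(x)
    return total

for T in [0.5, 1, 2, 5, 10, 20]:
    E = total_energy(T)
    print(f"T = {T:>4}: total E = {E:8.2f}, E/T = {E/T:.2f}")
# Classically all 10000 modes would carry energy T each, giving E = 10000*T.
# Instead E/T grows only linearly with T: roughly the T/omega1 modes whose
# quantum is below the temperature are active, and the rest are frozen out.
```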
That was fun. Yeah. Okay. Good, good, good, good, good. The reason that doesn't matter, you're right, we did ignore it.
It makes a multiplicative constant. Adding a constant to every energy level is equivalent to multiplying the partition function by a constant, and the result is no change in anything of interest.
Alright. So we start to talk about the second law. Why is the second law a puzzle? The second law is a puzzle because it, it says that the
world is irreversible and that something called entropy always increases or it stays the same and in fact you'd have to be infinitely careful with a big system to keep it from increasing and make it be the same.
On the other hand, Newton's equations of motion are completely reversible so anything that can happen in one direction can happen in the opposite direction. So there's a tension there. There's a tension between reversibility on the one hand of the fundamental equations and irreversibility of the
observational properties of complicated systems and that took some time to sort out and people are still confused about it.
So let's talk about it.
Let's imagine some phase space. The phase space is the space of states of some classical system and it's got a lot of dimensions. It's got a lot of particles or a lot of whatever it is that it has but we're going to just draw it. We're going to represent it simply as two dimensional.
One dimension will be the coordinates and the other direction will be the momenta. And as I said, there are a lot of them. This is a high dimensional space. And we start out with some probability distribution. We start out, let's say for example, let's start out
with a probability distribution which looks like that, meaning to say there's zero probability on the outside, constant probability on the inside of this blob. What it means to say is we know nothing except that the system starts out inside that blob.
We don't know anything more about it, so the probability density is constant within that blob. What is the entropy of that system? Anybody remember? No, no, it's not at a maximum. The maximum would be if the probability distribution was
uniformly spread over the phase space. In other words, if you knew nothing. The entropy is the logarithm of the volume of the blob. Now, that's, yeah, it's not constrained, we just
happen to know that it's in that blob. We did our best possible measurements and we discovered that the system is in that blob. Whatever measurements it took. Oh, no, it's not, no, no, no, no, it's not in equilibrium. No, and so we're not talking about the entropy of
equilibrium, we're talking about the entropy of a certain probability distribution. And the probability distribution is handed to us. Somebody tells you, somebody upstairs tells you the momenta of 10 to the 23rd particles and the positions of 10 to the 23rd particles are all within this blob.
For example, I mean, you know, this is not that unrealistic. You might start with all the air molecules in some tank, you know, an air tank. And it's up in the corner of the room. And then you open up the air tank and the air goes out. Well, originally the molecules were in that corner, in
that tank, so they were confined in a small space. And moreover, you had a pretty good idea of what their energy was because you knew the temperature of what's in there. So you had a pretty good idea of what the momenta of all the particles were. That might not quite correspond to a uniform probability
distribution in a blob, but it would certainly correspond to some blob-like configuration. The more you know, the smaller the volume of this region. And the entropy is the logarithm of the volume, so as the volume gets smaller and smaller, the entropy goes down.
All right, but in any case, there's an entropy which has to do with the size of this blob in the phase space. Now what happens to the blob as time goes on? Do you remember a theorem called Liouville's theorem? Liouville's theorem says the volume stays the same.
Therefore, the entropy stays the same. This is just another way of saying that if you follow the states, each state goes into a unique state. It has its analog in the discrete little games we played in classical mechanics with coins and dice, where we talked about how states evolve one to the next, a given state going to a unique state at the next instant, and another unique state at the instant after that. So if you have some probability distribution, the probability distribution just translates. Take a die with six states: if you know the die is in one of two states, and you also know the rule for going from one state to the next, then after the next step you'll also know that it's in one of two states. Not the same two states, but still one of two states. So the entropy in that sense stays the same. The probability distribution may change, because the states which are occupied become different, but the fact that there are two of them and that they have equal probability, that stays the same.
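As a minimal sketch of that die example (the particular update rule is an arbitrary permutation I made up), deterministic evolution just carries the probability along with the states, and the entropy, minus the sum of p log p, never changes:

```python
import numpy as np

# Six states of a die; the dynamical law is a permutation:
# state i hops to perm[i] at every time step.
perm = np.array([2, 0, 4, 1, 5, 3])

# All we know is that the die is in state 0 or state 3, equally likely.
p = np.zeros(6)
p[[0, 3]] = 0.5

def entropy(p):
    nonzero = p[p > 0]
    return -np.sum(nonzero * np.log(nonzero))

for step in range(5):
    print(f"step {step}: p = {p}, S = {entropy(p):.4f}")   # S stays at log 2
    q = np.zeros(6)
    q[perm] = p          # each state carries its probability to its successor
    p = q
```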
So on the one hand, that seems to say that entropy doesn't change, and this is true. This is a correct view of what is called microscopic entropy, or micro entropy. But there's another concept of entropy which does increase, and that other concept has to do with something called coarse graining. Typically, your experiments are such that you cannot
resolve points in the phase space. Even if your experiments were ideal, perfectly precise experiments, you still could not tell one state from another state if they were too close. There's some resolution of states.
Now in quantum mechanics, it turns out that that resolution is naturally built in: the smallest uncertainty is given by Planck's constant. But in other situations, it may just be that your resolving powers are not good enough. So instead of talking about points of phase space, you talk about resolved, or what we should call coarse-grained, points in phase space. Now, let's make an example. Suppose we started with this blob in phase space and it has a certain volume.
Let's follow it. And I'm going to tell you what it does, not because I've solved any mathematics problem, but I'm just going to make up a dynamics which preserves the volume, preserves the area on the plane here, but takes this phase volume here and turns it into a long snake which looks
like this. The same total volume, the snake is extremely thin and it looks like that. But unfortunately, because of our limited resolution powers, we cannot tell a phase point from a neighboring
phase point. So what we do when this happens is we take these phase points and we blobify them. We think of them not as points in the phase space, but as blobs.
And now we start going through it and guess what? The blobs overlap.
No, it doesn't violate Liouville, we're just losing information because we can't keep track of this fine structure. The actual true dynamics is such that the snake here has exactly the same volume as the blob over here.
But because we have fuzzy resolution, we're looking at these things through somebody else's eyeglasses. What do we see? We see one big blob. And if we were asked, what is the volume of this blob, in other words, where in phase space could the system be, we'd come to the conclusion that it's uniformly distributed over the whole region, because we couldn't follow the system carefully enough. If we could follow it with infinite precision and define
the probability density by defining it with infinite precision, the volume would not change. But in every real situation, in every real situation, we can't distinguish one state from a nearby state. So we do this process of coarse graining.
Coarse graining is replacing points of phase space by blobs of phase space. Once we do that, guess what? The volume increases. I always imagine a big piece of cotton. What is the volume of a piece of cotton this big?
Well, if you mean the volume of the cotton fibers all added up, you can squeeze them down to something small. If you let that cotton fiber go, the volume of the cotton fibers doesn't change very much, incidentally. They're pretty rigid. But the cotton sort of grows out and now you look at it
and it has this big a volume. If you look at it through somebody else's glasses and you can't see the cotton fibers, you will say that the volume of that piece of fiber stuff has increased. And that's what happens.
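Here is a minimal numerical sketch of that picture. I'm using Arnold's cat map, a standard volume-preserving chaotic map on the unit square, as a stand-in for the true dynamics; the blob size and grid resolution are arbitrary choices of mine. The exact set of points always has the same tiny area, but the number of coarse-grained cells it touches, which is all that fuzzy eyeglasses can measure, grows until it saturates the whole space:

```python
import numpy as np

rng = np.random.default_rng(0)

# A small "blob": many phase points inside a tiny square.
points = 0.05 * rng.random((50_000, 2)) + 0.2

def cat_map(p):
    """Arnold's cat map: a volume-preserving chaotic map on the unit torus."""
    x, y = p[:, 0], p[:, 1]
    return np.column_stack(((2 * x + y) % 1.0, (x + y) % 1.0))

def occupied_cells(p, n=50):
    """How many n-by-n grid cells the points touch: the coarse-grained 'volume'."""
    cells = np.floor(p * n).astype(int)
    return len({(i, j) for i, j in cells})

for step in range(8):
    print(f"step {step}: occupied cells = {occupied_cells(points)} of {50 * 50}")
    points = cat_map(points)
```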
So, yeah? Are you saying then that classically entropy only appears to increase, but quantum mechanically it really does, because of the limit of resolution? Let's forget the quantum mechanics. It's just limits of resolution.
Limits of resolution. Remember that entropy is not really a property of a system. It's a property of a system and what you know about the system. So, if you could follow in infinite detail every point in
the phase space, and all you knew in the beginning was that it was inside some region but not exactly where, then you would say afterwards that it's inside the snake somewhere but you don't know exactly where in the snake. Then entropy would be conserved. But if you take into account that you can't tell one
point from another because of limits of resolution, then you wind up saying, look, I know much less about where it is than I did to begin with. Right. So classically you can imagine that the entropy would remain the same, because you can imagine following it with perfect resolution. Yes.
That's right. Yes. But the problem is, given any degree of resolution, whatever your resolution is, let's suppose it was good enough to resolve the snake after 10 seconds. The way systems evolve in phase space is that they start to grow even littler snakes, and littler snakes, and littler snakes. Whatever your degree of resolution, this snakiness, we'll give it its right name, chaos is what its name is.
Systems evolving in phase space spread out tentacles without changing the volume; they grow fibers, and the fibers grow smaller fibers, and the smaller fibers grow still smaller fibers. And eventually, no matter how good your resolution is, you'll start seeing that the entropy increases. And that's the second law. That's what the second law says.
Now, let's imagine some phase space. We started out knowing a great deal. Let's talk about how big the available region of phase
space is for a minute. Let's think of it as a gas of a large number of particles in a box. If it's in a box, that means the x's are bounded. So, in a one-dimensional analog, it would say that the phase space is bounded between here and here, there's
the phase space. Can the momentum be anything? No. Usually, the energy constrains how much the maximum momentum can be. So, even if all of the energy of the system went into one particle, that one particle would still itself have a
finite energy. And so, the effective region of phase space which is available to you is some big box. Maybe it's not a box. It's got some shape, but it's finite. It's finite. It doesn't extend to infinity in any direction. Okay. Now, the phase point starts someplace, the exact precise
phase point. What tends to happen is that the subsequent motion is extremely sensitive to where you started. So, if you started over here, the system would evolve for a time.
If you started very close to it, the system would evolve in a very similar way for a while, but then it would start to depart. It would go off in some different directions. An example would be billiard balls, an idealized billiard table with no friction and how many people
played billiards here? Pocket pool? Chicago? Straight pool? What do you play? Three cushion is too hard.
Too much chaos. Yeah. So, you know that if you arrange the balls in a certain fashion and you are extremely accurate, starting with the cue ball exactly the same on two different occasions, and you shoot exactly the same way, then exactly the same things will happen. Now, what if you're a tiny, tiny bit off in your aim, everything else being kept perfectly fixed?
Well, if you're really just off by a truly, truly tiny amount, the difference in the trajectory of the whole system will be small for a while, but sooner or later it's going to grow exponentially. Errors compound, and sooner or later that nearby trajectory is going to deviate. That's the character of what are called chaotic systems, and most systems in nature are chaotic. And so that means that no matter how close together you start, you'll eventually depart and fill up a lot of phase space if you're not infinitely precise in watching. Two nearby points seem to spread apart even though the volume of phase space, strictly speaking, stays the same.
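A minimal sketch of that exponential divergence, using the chaotic logistic map as a stand-in for the billiard table (the map, the starting point, and the tiny offset are my own choices): the separation between the two runs roughly doubles at every step until it saturates at order one.

```python
# Two runs of the chaotic logistic map x -> 4x(1 - x), started a
# distance 1e-12 apart.
x, y = 0.3, 0.3 + 1e-12

for t in range(50):
    if t % 5 == 0:
        print(f"t = {t:2d}   separation = {abs(x - y):.2e}")
    x = 4.0 * x * (1.0 - x)
    y = 4.0 * y * (1.0 - y)
```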
Alright, this is the origin of the fact that if you start with a small blob in the phase space, it'll stay pretty coherent for a while, but then it'll start to develop some pieces that wander off in some new way, and then some other ones will start to grow out. These are the different regions of phase space beginning
to depart from each other because errors compound, and then two points over here, which are close together, nevertheless eventually depart from each other. The phase space volume gets extremely complicated, and eventually, if you wait long enough for the system to get to thermal equilibrium, this cottony stuff basically fills up the whole phase space, fills it up as cotton would. The actual volume of these fibers stays
equal to whatever the initial volume of the phase blob was, but it spreads out in a way that if you do any coarse graining of it at all, it basically fills the whole thing with uniform probability. All right, so now we watch a particular phase point.
A particular phase point, we start here, that means we start with a particular configuration of the cue ball and one, two, three, through 15, we stack them up and we let it run. It starts to go somewhere and then it goes somewhere else and then it goes somewhere else and then it goes
somewhere else and eventually, if there's no friction on that billiard table, eventually you will sample something close to every possible configuration. If you wait long enough, you will see every possible configuration in the phase space visited, not exactly,
but very close. You'll pass through just about every point in the phase space, very, very close to it. And so, what will happen is on the one hand, if you know something in the beginning, like for example, you might
know that the initial phase point is something very special. It's down here in the corner: all the molecules in the corner of the room with a certain definite velocity, going off in some direction. All right, they start to bump into each other, the phase points begin to separate, and eventually they fill up the whole thing here. And so I don't know anything anymore. It's reached its maximum entropy. There can't be any more entropy than saying I don't know anything.
But nevertheless, in a completely unpredictable and random way that you can't foresee, because you don't have enough detail, this phase point will eventually come back to this point here.
In other words, in the billiard ball analogy, you rack up the balls in a certain very, very special triangle, put the cue ball down in a definite place, shoot very precisely, and after a certain amount of time the balls begin to go off in funny directions. They begin to fill, not the pool table itself, but the space of possibilities. But if you wait long enough, and we can talk about how long it takes, those balls will come back and form a triangle again.
How long? Well, it's not infinite. It does depend on how precisely you require the original configuration to be reproduced.
If you allow even a very small tolerance, and you say, this is the original configuration to within a certain tolerance, then depending on that tolerance, it will take a finite time for the phase point to come back to that original configuration.
If you require infinite precision, it will never come back. But if you require precision to within some epsilon, then the time that it takes to pass back through that phase point to within tolerance epsilon is finite. In the case of a billiard table, right, you actually are
imparting energy by hitting the cue ball, right? Well, no, that's part of the starting point: the starting state is already the cue ball coming in with its energy, so when the system recurs, it recurs with the cue ball coming at the rack. Right. Okay. So on the one hand, the entropy increases.
It increases in this coarse-grained sense because you lose track. On the other hand, that does not say that you can't return to the original configuration. You can try to estimate how long it will take.
And maybe we'll do that next time. We'll make an estimate of how long it takes for a system of many degrees of freedom. Let's say the air; we'll try out this problem. Take the air in the room and put it all up in the corner, with its energy, its kinetic energy.
All the molecules are going in the same direction, very, very special state. They shoot off to the other wall, they bounce off the walls, they start hitting each other, they go crazy and the system is very chaotic. And after a while, they fill the room in the normal pattern of thermal equilibrium, and we sit and we wait
and we wait and we wait. How long does it take? Assume the room lasts long enough and it's a really, truly closed system. Well, it won't be a perfectly closed system, but we just don't want to leak molecules out, and it's kept at constant temperature. How long will it take on the average before all the
molecules will be up in the corner again, or in that corner, or in that corner? And that question can be answered. The answer is a long time. Don't you lose information if, in this case, it's not a reversible system? It is reversible. Because after some time, it goes back into the corner? Completely reversible. Anything that can happen, the opposite can happen. But, all right, we'll talk about it more.
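Here is a back-of-envelope version of that estimate (all the numbers are assumptions of mine, and it counts only positions, ignoring the momenta, so it is if anything an underestimate): if each of N independent molecules sits in a given fraction f of the room with probability f, the chance of catching them all there at once is f to the power N, and the waiting time is roughly one collision time divided by that.

```python
import math

# Rough recurrence-time estimate; N, f, and tau are assumed numbers.
N = 1e23      # number of molecules
f = 0.5       # "the corner" taken generously as half the room
tau = 1e-10   # seconds between independent tries (~ a collision time)

# waiting time ~ tau / f**N; compute its log10 to avoid overflow
log10_wait = math.log10(tau) + N * math.log10(1.0 / f)
print(f"waiting time ~ 10^{log10_wait:.3g} seconds")
# ~ 10^(3e22) seconds, versus ~10^17 seconds for the age of the universe
```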
We'll talk about the second law more. We'll talk about reversibility and how this paradox of reversibility and irreversibility gets resolved. You know, it was one of the great classic paradoxes of
physics that puzzled a lot of people, as you undoubtedly know from the history of physics; it drove Boltzmann to distraction. I don't know if it drove him to suicide, but it certainly depressed him.
Eventually, he came up with the right answer. The right answer being not that entropy always increases, but that given any configuration, it's most probable that the next configuration will have more entropy. Entropy probably increases, and the probability of the system finding its way back into some unusual, unexpected configuration is always small. Well, I'll give you some more examples. I'll try to make this a little more intuitive next time.
But we do owe the second law at least some kind of explanation. Well, there's an infinite number of chaotic systems, and there's an infinite number of non-chaotic systems. To compare them, we would have to start talking about measures on Hamiltonians and things like that. It's a good question. I'm sure the mathematical people could probably say what the right measure of chaotic versus non-chaotic systems is.
I mean, not all systems are chaotic. That's for sure. And in many cases you can perturb them a little bit, change them a little bit, and they will still remain non-chaotic. It's not the sort of situation where if you have a non-chaotic system and you change the rules, meaning the Hamiltonian, a tiny, tiny bit, it will become chaotic. Now, what does chaotic mean? Chaotic means that the phase trajectories tend to separate like this after a long enough time.
I didn't want to define it with any infinite precision right now, but it is this phenomenon of close-by points in phase space following each other for a while and then exponentially departing so that the system becomes
effectively unpredictable. It's a situation in which predictability effectively breaks down, because in order to predict the system for a length of time t, your precision in the initial conditions, and your precision in your knowledge of the evolution of the system, has to get better and better as the time over which you want predictability gets longer. So take the weather, defined simply as a collection of molecules doing whatever they're doing instantaneously,
and you ask, how long can I predict the weather? How long can you predict the weather depends on how precisely you know the initial conditions. And if you want to predict it for a longer period of time, you have to have more precise initial data, even
longer, more precise. Now, that's quite different than the harmonic oscillator. The harmonic oscillator in phase space just moves around in a circle with constant frequency. If you start a phase point over here, it travels around over here.
Start a phase point over here, they just go together. And they go together forever and ever. Close-by phase points never get far from each other. That's a very non-chaotic system. A particle in an orbit around the sun in a true inverse
square law, where we do exact Newtonian physics, is not chaotic. Two close-by orbits will remain close-by basically forever. The three-body problem is chaotic in general.
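For contrast with the chaotic cases, here is a minimal sketch of the oscillator (units chosen so that mass and frequency are 1): the exact phase flow is a rigid rotation of the (x, p) plane, so two nearby phase points keep exactly the same separation forever.

```python
import numpy as np

# Harmonic-oscillator phase flow with m = omega = 1:
# (x, p) -> (x cos t + p sin t, -x sin t + p cos t), a rigid rotation.
def flow(x, p, t):
    c, s = np.cos(t), np.sin(t)
    return c * x + s * p, -s * x + c * p

x1, p1 = 1.0, 0.0
x2, p2 = 1.0 + 1e-6, 0.0   # a nearby starting point

for t in (0.0, 10.0, 100.0, 1000.0):
    a, b = flow(x1, p1, t), flow(x2, p2, t)
    print(f"t = {t:6.0f}   separation = {np.hypot(a[0] - b[0], a[1] - b[1]):.3e}")
```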
Chaotic in general meaning that neighboring orbits eventually tend to depart from each other. So predictability is a function of how accurately you know the initial conditions. To predict for a length of time T, you must know the initial conditions to within an epsilon that depends on T.
As T gets longer, you need to know the initial conditions better. That's what it means to be chaotic. Don't you also need to know the laws of motion to high precision? Yes, you also need to know the laws of motion to high precision. Both the laws and the initial conditions.
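To put that in symbols: if nearby trajectories separate like delta(t) = epsilon * exp(lambda * t), then keeping the error below some tolerance for a time T requires initial precision epsilon = tolerance * exp(-lambda * T). A minimal sketch with made-up numbers:

```python
import math

lam = 0.7          # assumed Lyapunov exponent, per unit time
tolerance = 1e-2   # acceptable prediction error

# Required initial precision shrinks exponentially with the prediction time.
for T in (10, 20, 40, 80):
    epsilon = tolerance * math.exp(-lam * T)
    print(f"to predict for T = {T:2d}: need epsilon ~ {epsilon:.1e}")
```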
One is chaotic and one is not. It's a mathematical property. You've seen chaotic pendulums.
The double pendulum is chaotic. The ordinary pendulum is not chaotic. It's very, very predictable. Now you add a second pendulum that swings around the first one and you start it going. Have you ever seen the double pendulum in action?
It's quite fascinating. You can follow it for a little while, but then it does something really weird. All the energy will go into one of the modes, and very chaotic things happen, because the trajectory spreads through phase space.
You're asking a very hard question: why are some systems chaotic and other systems not? I would say the state of the art is that it's very, very hard to look at a Hamiltonian and know whether it's chaotic or not, except for the rule of thumb that if it's complicated, it's usually chaotic.
There are complicated ones which are not chaotic so this is not an easy question and it's something that mathematicians and mathematical physicists study and they don't have answers that are very general.
With no friction.
So if you knew the initial conditions to infinite precision, whatever that means, you could predict for infinite time.
Now, I think that's probably a meaningless statement. The right statement is a connection between the time over which you want to predict and how precisely you have to know the initial conditions. That has to do with how fast these trajectories depart
from each other. If the trajectories tend to stay close to each other for a long period of time, the system is rather predictable for a long period of time but eventually they will separate. So most systems are unpredictable in that sense.
Another way to say it is given any length of time that you want to predict, there exists a tolerance or a precision, an epsilon that would allow you, if you knew the phase
point within that precision epsilon, would allow you to predict for that length of time. So, you know, given t, there exists an epsilon such that, and so forth. Yeah, the Lyapunov exponents, right?
That's exactly what they are: Lyapunov, L-Y-A-P-U-N-O-V.
The Lyapunov exponent tells you how fast; it's the rate of exponential growth of the separation between trajectories.
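For a concrete case, here is a minimal numerical estimate of the Lyapunov exponent of the logistic map x -> r x (1 - x) at r = 4, where the exact answer happens to be ln 2; the exponent is the long-time average of ln |f'(x)| along the trajectory.

```python
import math

r, x, n = 4.0, 0.3, 100_000
total = 0.0
for _ in range(n):
    total += math.log(abs(r * (1.0 - 2.0 * x)))   # ln |f'(x)| at this point
    x = r * x * (1.0 - x)

print(f"Lyapunov exponent ~ {total / n:.4f}  (exact: ln 2 = {math.log(2.0):.4f})")
```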
Chaotic systems certainly don't have closed-form solutions as a rule. The thing with the double pendulum is that at some point the second pendulum will be on a path going straight up, ending up here, and then, depending on exactly where it is, it will fall this way or that way, and now you get completely different trajectories.
It's unstable. The orbits are unstable. The simplest kind of instability is just the top of a hill, as you were saying. If you go over here, you go this way. If you go over here, you go that way.
But that's just one point of instability at the very top of a hill. If you start anywhere else, nearby, let's say with zero velocity, the phase trajectories don't separate that much and that quickly. At the top of the hill here is a point of instability.
In these chaotic systems, basically every point is a point of instability. Every point in the phase space is unstable like that. Okay.
For more, please visit us at stanford.edu.