3/4 Mathematical Structures arising from Genetics and Molecular Biology
Formal Metadata
Number of Parts | 18
License | CC Attribution 3.0 Unported: You may use, change and reproduce the work or its contents in changed or unchanged form for any legal purpose, and distribute and make it publicly accessible, provided you name the author/rights holder in the manner specified by them.
Identifiers | 10.5446/16456 (DOI)
Transcript: English (auto-generated)
00:29
The starting point is that the constituents of what we see phenotypically are something
00:44
which you don't see, right? It's like with ordinary matter: you see atoms, or rather their manifestations, but not what they're made of, right? Not the subatomic particles.
01:00
And here the observable features are associated to genes, but each gene is composed, for a diploid or polyploid organism, of two or many units, say A and B, and you write it like a polynomial in A and B; what you observe is a function of it, right? The observable is a certain function of these two variables, and you are not concerned
01:26
with what this function is; you are only concerned with these components. There may be more of them, maybe C; you think of each of them as this kind of variable, and you have a distribution of them in
01:41
the population, so you have this kind of weighted sum in A, B, C. These are formal variables, you have polynomials in these variables, and you think about what happens in a randomly mixed, randomly mating population:
02:04
what do they exchange? Right, they don't exchange their whole genes, but only these units, the gametes. When you formalize this, as was done implicitly by Mendel, and more explicitly by Hardy in a special case, you arrive at a class of interesting dynamical systems, and you can forget where they came from,
02:23
so let me summarize what these systems are. They live in some topological algebra; in the case of genetics it is a truncated polynomial algebra. You consider polynomials in many variables divided
02:44
by the ideal generated by certain monomials, forcing the degree in each variable to be less than something, say one in this case, but it may be anything; you divide by this ideal, and you have the truncated polynomial algebra. And this is quite
03:05
a really well-shaped algebra. In particular, I want to mention one property of this algebra which is implicit in the computations of genetics, if you don't make it needlessly complicated: it has an exponential map, the exponential
03:23
map from this algebra to itself, which is almost surjective. If you have polynomials starting with a positive free term, they are in the image, so you have a logarithm. If you look at this property of the exponential explicitly, it's
03:41
not that obvious, but it's quite a remarkable feature of this algebra that the additive and multiplicative groups are essentially the same, and this is implicit in genetics.
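To make the exponential concrete, here is a minimal sketch (mine, not from the lecture), assuming the simplest truncation: three variables a, b, c with a² = b² = c² = 0. Every polynomial with zero free term is then nilpotent, so the exponential series terminates, and exp identifies the additive group of such polynomials with the multiplicative group of polynomials with free term 1; the helper names texp/tlog are hypothetical.

```python
# Truncated polynomial algebra R[a,b,c] / (a^2, b^2, c^2): nilpotent
# variables make exp and log finite sums that are mutually inverse.
import sympy as sp
from math import factorial

a, b, c = sp.symbols('a b c')
GENS = (a, b, c)

def truncate(p):
    """Reduce mod (a^2, b^2, c^2): drop monomials with any exponent >= 2."""
    poly = sp.Poly(sp.expand(p), *GENS)
    return sp.Add(*[coef * sp.prod(x**e for x, e in zip(GENS, mono))
                    for mono, coef in poly.terms() if max(mono) <= 1])

def texp(p):
    """exp(p) for p with zero free term; the series stops at degree 3."""
    result, power = sp.Integer(1), sp.Integer(1)
    for k in range(1, len(GENS) + 1):
        power = truncate(power * p)
        result += power / factorial(k)
    return sp.expand(result)

def tlog(q):
    """log(q) for q with free term 1, via log(1+u) = u - u^2/2 + u^3/3."""
    u, result, power = sp.expand(q - 1), sp.Integer(0), sp.Integer(1)
    for k in range(1, len(GENS) + 1):
        power = truncate(power * u)
        result += sp.Rational((-1)**(k + 1), k) * power
    return sp.expand(result)

p = sp.Rational(1, 2)*a + sp.Rational(1, 3)*b + c   # zero free term
assert sp.expand(tlog(texp(p)) - p) == 0            # exp and log are inverse
```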
04:01
But now, what are these transformations? The transformations go in two stages: first, endomorphisms A → A, and then products of them. How do these endomorphisms come to life in examples? This algebra is typically an algebra of functions on something, so it lives on a space, and the endomorphisms come from maps
04:23
of the space. In the case of Mendelian dynamics you have a Euclidean space with coordinates, and you take projections onto coordinate subspaces; these projections give you endomorphisms of your algebra, of course. And this is
04:44
a morphism with an interesting, important feature, which has consequences. You see, the whole story started when biologists realized that Mendelian logic, which was implicit in Mendel, tells you that variation stops
05:06
at the second round of reproduction: you have a mixed population, you have a new generation with different phenotypes, but when you mix again, the distribution of phenotypes doesn't change. So the slogan about survival doesn't apply; the change just
05:24
happens once, and then nobody dies anymore. This is why the people around Darwin were so unhappy: it completely destroyed everything they were saying, and justifiably, because what they were saying was not correct. Behind this,
05:42
on the first level, there is a projection, so these maps have the property that the square equals the map itself; on the second level you multiply these into endomorphisms, and you get a new kind of map, multiplicative morphisms, not of the algebra, but of its multiplicative group. And they are, as I said, quite non-trivial
06:06
and quite amusing maps, just as objects which you obtain. A typical map of this type, which I want to bring to light because it's quite remarkable, is the Segre map, and the most remarkable is the first one: it is a map
06:25
from projective space Pⁿ to a sphere, of dimension something like n(n+1)/2 I guess, but anyway, it is obtained by taking a linear form on a linear
06:40
space and taking its square, so it becomes a quadratic form. You map the space of linear forms to the space of quadratic forms, and the map factors through projective space, because a form and its negative have the same square. In the simplest example you have S², it factors through the projective plane, and
07:01
then goes to the sphere S⁴, and if you think about it, it's a quite remarkably symmetric sphere sitting there, an orbit of the orthogonal group inside the linear group.
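A quick numeric sanity check of this squaring map (my construction, not the lecture's): a unit linear form l on R³ is sent to the symmetric matrix l lᵀ; every image has trace 1 and Frobenius norm 1, so it lies on a 4-sphere inside the 5-dimensional trace-one slice of the 6-dimensional space of quadratic forms, and ±l land on the same point, so the map factors through the projective plane.

```python
# Squaring unit linear forms on R^3: the image sits on a 4-sphere
# (Frobenius-unit, trace-1 symmetric matrices) and identifies l with -l.
import numpy as np

rng = np.random.default_rng(1)
for _ in range(1000):
    l = rng.normal(size=3)
    l /= np.linalg.norm(l)                    # a unit linear form on R^3
    q = np.outer(l, l)                        # its square: a quadratic form
    assert np.isclose(np.trace(q), 1.0)       # lies in the trace-1 hyperplane
    assert np.isclose(np.linalg.norm(q), 1.0) # and on the Frobenius-unit sphere
    assert np.allclose(q, np.outer(-l, -l))   # +-l give the same point
```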
07:22
At the next level, all the symmetry enters because we assume independence of events. Independence is an assumption of symmetry: not just from a philosophical point of view; it automatically means you assume things are as symmetric as they can be, and so all the symmetry immediately comes in. And the next level: so far this concerned a single
07:42
gene, a single locus. The maps which you get this way, these multiplicative endomorphisms, include the Mendelian kind of maps, which Mendel emphasized, with square equal to M:
08:03
if you look at a fixed phenotypic feature associated with one gene, its distribution in the population changes on the first round of random mating, which has nothing to do with selection. So what you observe in a population is variation that has nothing to do
08:25
with selection, just the mixing of the present gametes. Then, on the next level, when you have these M's, you consider certain convex combinations of them, and at this
08:42
moment some analysis enters, because the coefficients of a convex combination must be positive; otherwise nothing works, you don't get anything manageable. Such a combination corresponds to having several genes,
09:02
and now the dynamics does not stabilize in one step, but it converges exponentially. The reason, again, is the kind of maps you see in this example: the fundamental endomorphisms are projections onto linear subspaces, out of which you make other things, and they are invariant under a very big symmetry group; actually
09:22
it's exactly what one would call a renormalization group. Because of the symmetry you can control them pretty well, and the basic result here is that this map, under a simple condition which I explained, has an
09:40
equilibrium position, and this fixed point is attractive, exponentially attractive. This is seen because once you know it's locally attractive, and because of the big renormalization group its differential is attractive, you know it's attractive in the directions transversal to the equilibrium positions, so everything
10:03
goes to the space of equilibrium maps, and the equilibrium maps are the ones which maximize entropy; at this moment entropy enters. Now, a couple of words about the biology should be said. This fundamental property of sexually reproducing
10:22
organisms, that it is not the features which are inherited, but the hidden units inside the gametes, which are invisible, ensures the stability of populations. If not for that, we would all be dead pretty soon; we would degenerate. Again, elementary
10:41
mathematics shows that we would die, nothing would be there; if you take the naive theory and apply the mathematics: boom, we should be dead, yet we are not. And this is very easy mathematics: the accumulation of mutations would eventually be the end of any population; when there is no horizontal gene exchange, when there is no sexual reproduction, a population tends to degenerate. Of course the time depends
11:07
on the scale, but essentially what makes populations incomparably more stable is that the population keeps units which do not manifest themselves; so if some become
11:20
somewhat deleterious, others will compensate. That theory is a completely different game, and I haven't looked carefully into its mathematics. In any case, in genetics there are two time scales: one is the one I described, from Mendel; the other, the long scale, corresponds to mutations, because these A's and B's may be subject to certain modifications,
11:42
and it is not at all trivial what they are; they are not just something random. In the traditional theory of evolution these became confused, confused by some people at that time, and became clearly transparent only relatively recently; there
12:03
was some clarity achieved, but it may be deceptive, because this has happened many times. When Darwin came to his theory, everybody, at least those who supported him, was very happy and said: oh, now we understand evolution. Then the next generation came with Mendel, and then the mathematics of Haldane, Fisher
12:23
and Wright, who developed the mathematical formalism and said: oh no, no, that was complete nonsense, now we understand it. This was called the modern evolutionary synthesis, in the 20s and 30s. Then in the 70s molecular biology started, and Monod was at the head of it:
12:42
well, now we understand evolution on the molecular level. And he was extremely sarcastic about everybody else, people out and about speaking about evolution and saying they understand something, because in his opinion, justifiably, they understood nothing. But now we say that Monod understood just as little, because his ground was very
13:03
little data. And now we can say: now we understand. The reason is that the molecular data runs to the petabyte scale,
13:22
I keep forgetting, maybe even quadrillions; the total length of the sequences now being analyzed is, I think, of the order of 10 to the 15, a petabyte. So this is data from which you can now make some non-trivial statements about evolution. And if you have something like 10 to the 5, which was the amount in Monod's time,
13:45
it's nothing; it's not comparable. So now people say: well, now we understand something; one can reconstruct the common ancestor of all living organisms, which lived about 3.5 billion years ago. But
14:03
on the other hand, it is explicitly stated now that we understand nothing here. It's a tiny little piece, and the situation is infinitely more complicated than people believed 200 years ago, 100 years ago. So much concerning data.
14:24
Now I want to return to entropy, because it enters here. But one last point: these were a very special kind of endomorphisms, and mathematically you immediately ask what transformations of this kind are available
14:43
when you have an algebra with this kind of dynamics in it, and what other examples there are. Another pronounced example is, of course, the normal law. The normal law comes from exactly the same scheme: you have the algebra of functions under convolution, you take the convolution square, and the thing gets rescaled;
15:05
you scale it correctly, and by the way, rescaling is always essential in all these things, though on the polynomial level it almost never matters. Then you have a fixed point, and you know it has a huge attractive
15:21
basin: this is the normal law. And the question is: what happens at other fixed points of these transformations, when are they sufficiently attractive, and what similar transformations are involved? I don't know how much has been studied here. Of course there are many other algebras, and many other examples immediately come to one's mind, but I haven't looked at them.
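A numeric illustration of this fixed point (mine, assuming a uniform starting law): iterate the "convolution square and rescale" map and watch the law flow to the Gaussian. Sampling stands in for the convolution here, since adding two independent copies of a random variable convolves its law with itself.

```python
# The convolution-square map f -> rescale(f * f) has the normal law as an
# attractive fixed point: the central limit theorem in dynamical form.
import numpy as np

def conv_square_rescaled(x, rng):
    """One step: law of (X + X') / sqrt(2), which preserves the variance."""
    return (x + rng.permutation(x)) / np.sqrt(2)

rng = np.random.default_rng(2)
x = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=200_000)  # variance 1
for _ in range(6):                                          # 2^6 = 64 summands
    x = conv_square_rescaled(x, rng)

# The fourth moment flows from 9/5 (uniform) toward 3 (Gaussian).
print(np.mean(x**4))    # ~ 2.98
```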
15:44
Even here, as I said, there is a lot of other mathematics, and the point is: if you want to make mathematics, you have to separate yourself from reality, because, of course, this idealization of genetics is never satisfied.
16:01
On the other hand, this is exactly why this mathematics is useful: you can understand what it is, and it applies roughly. Typically, in population dynamics you may consider more general mating, and general mating, if you don't assume anything, is just a quadratic map from a Euclidean space to itself, yeah?
16:23
And then what? Nothing. A quadratic map is just anything: any polynomial map may be seen as a quadratic map, because all algebraic operations are generated by addition and multiplication, which has degree two. So everything is quadratic, everything can be there,
16:43
and how can you say where what you want sits? That's the problem. On the other hand, you can imagine other models, for example of evolution with sufficiently separated scales. Another feature here which is very non-physical, opposite to what we shall see in a second, is scale separation.
17:01
Biology depends on the fact that there are many separate scales, and the mathematics of that is related, in part, to what is called tropical geometry; but again, I haven't looked at this myself carefully, and even what I did look at I don't want to discuss here. But now I want to turn to entropy and do a similar thing.
17:25
Our very naive starting point was: we have two groups of organisms, two fields of flowers of different colors. They were separated, then they mixed, and we see the distribution of colors, see how it develops, and see that it stabilizes too fast
17:41
to be accounted for by biology; it is a mathematical phenomenon. And the mathematics, amazingly, is already there: basically, algebras and endomorphisms and homomorphisms, something quite nice; the symmetries, the Lie groups, actually the symmetries here,
18:02
the renormalization, the important Lie groups, et cetera; it is all already there. Now, what happens with physics? Boltzmann and Mendel were almost contemporaries. Boltzmann was slightly younger than Mendel, but of course he had never heard of Mendel.
18:21
And they pursued the same idea: that the fundamental world is discrete rather than continuous. Mendel showed that the basic units of inheritance are discrete, and this again goes against the basic premises of Darwinian theory. So Mendel was right, and Darwin was wrong, at every point.
18:45
And Boltzmann insisted that the world was discrete on the level both of structure and of energy. Atomic theory was, of course, conjectured long before him, but he was promoting it; and he also suggested that energy is discrete.
19:00
This was a suggestion he made to Planck, and Planck then introduced his Planck constant in the discrete description. But what is the model, what is the mathematics he was trying to pursue, and what is amusing about it? Again, whether this mathematics comes as physical or biological, it is a very simple scheme.
19:21
But the mathematics you take is the one you know: the scheme is always the same, but your interpretation depends on your background. So what is the logic of Boltzmann? We have a system of particles. At that time atoms were semi-conjectural; some people still didn't believe in atoms,
19:41
justifiably, because as we know today, atoms in the classical framework cannot exist; it's an absurd, self-contradictory notion, which only makes sense in terms of quantum mechanics. But still, you can pretend they're there.
20:01
Then you want to understand a system consisting of many atoms, and you want to assign to it an entropy. Entropy, of course, came into physics along a different path, but let us stick to the ideology of Boltzmann,
20:23
as you can read in physics texts: entropy is just the log of the number of states. Let us just try to decipher this.
20:40
Try to decipher it: you have the system, but what are states, and what does this mean? You can say: aha. This is how it was done by mathematicians at that time; the interpretation came along with Cantor's theory of sets.
21:01
And actually Cantor's theory was also made much in this spirit: well-defined basic units, elements of sets. So you say: aha, there is a set of states, and entropy is the log of the cardinality of this set, of all possible positions of the system.
21:23
And then the immediate objection: this is an infinite set, it has no finite cardinality, it will be infinity; it just doesn't work. But of course this is because we are already trying to read this in a language which is not appropriate here. So what I want to explain is
21:41
that the way people were reading Boltzmann until recently was just completely arbitrary; it was a matter of language. For me, the turning point of the language, for this particular purpose, is somewhere around 1960: I would say pre-Grothendieck and post-Grothendieck. Of course, there were other people involved,
22:00
particularly, in this story, the people accompanying Grothendieck in categorical thinking. And another point around this time was non-standard analysis, which didn't exist before. What Boltzmann was saying can be interpreted completely rigorously in terms of non-standard analysis
22:23
and categorical language mixed together. All these infinities relate to the concept of the space of states. But there is no space of states; I mean, this is an abstraction a mathematician may state, but you may say: well, it's just not that;
22:40
there is no such thing. So what do physicists actually have? What they observe, and how they interpret what they observe. Let me explain this, and then how we transform it into the language, to the point of Grothendieck's language. So what you observe is as follows.
23:01
I have this physical system. For Boltzmann it was a gas; I prefer it to be a crystal, just because for the purpose of the discussion it's slightly easier. So it's a crystal: secretly you think of atoms positioned on a lattice, in a symmetric pattern in space,
23:20
with the three translations making the group Z³. But you don't know that; it's just your mental picture. So what you do is start measuring it: you take some machine, put it here, and see what you see. The machine is another physical system which has its own states; you think of it as something with windows,
23:42
with something blinking in these windows: this one, and this one, and this one. Then you take another such machine, put it somewhere else, and you have the same blinking; you can move it along the space and see what happens, and again they blink together. And this is all you see.
24:03
Moreover, you don't have colors or anything; you only have the frequencies of blinking. And if you can move one machine to another by a translation of the space, you say they're really kind of the same, and then you can compare corresponding windows. From that you develop the concept.
24:22
The only thing, we think, the physicists have is entropy. And I say immediately that my description will be not quite complete. I describe it in categorical terms, but probably the right language should be 2-categories, because there is in fact a protocol of how you do all that,
24:40
and these protocols are morphisms in a 2-category, and that would probably be a better formalism. So the, not the right, but the simpler, more adequate formalism from this point of view is the one which I'm going to describe. What is entropy, what is this physical system? I want to say that there is no set;
25:01
eventually, there is no set here. There is entropy, you have a number of states, but there is no space of states. And this number and this log must be understood in an appropriate way, exactly by translating what Boltzmann says first into categorical language
25:23
and secondly into the language of non-standard analysis, which is actually there. The right object is some Grothendieck group of a certain non-standard completion of a certain category. This is what I want to explain, and it is extremely simple; I think it's just words.
25:42
It's just words; you just translate what I said. So, again, we don't know what the system is. We have attached to it these little things, these kinds of windows, and you see the blinking. Sometimes you can move them, and sometimes you can put them together, and then maps get attached.
26:02
And each of them is a physical system itself, and so on, right? You can attach to it a smaller one and see what you see from that one. So you make this: we have this machine put here, then this machine connected to that machine, and so on. So what is the category? I'm describing a very simple category,
26:21
a kind of kindergarten category, but if you just say it in these terms, everything becomes extremely straightforward. And this basic category is that of finite measure spaces.
26:42
It's a very simple category: the objects are finite collections of stones with given weights, normalized to have total weight one, so you measure them in units. And the morphisms bring them together; as I said before, maybe not stones, but drops of water:
27:03
you can bring some of them together, and their masses add. So these are the morphisms. The immediate objection is: why have a category at all? But we shall see in a second that there are serious advantages, even in this example,
27:22
to speaking in categorical terms. Now, about entropy. Or maybe, before going to entropy, I'll just go ahead, and then return back to entropy.
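Here is a toy model of this kindergarten category (my own sketch, not the lecture's): an object is a tuple of positive atom weights summing to one, and a morphism is an index map under which the masses of merged atoms add; the helper names are hypothetical.

```python
# The category FM of finite measure spaces: objects are weighted "stones"
# (or drops of water) of total mass 1; morphisms bring atoms together.

def is_object(x, tol=1e-9):
    """An object of FM: positive weights with total mass one."""
    return all(w > 0 for w in x) and abs(sum(x) - 1) < tol

def apply_morphism(x, f, n_target):
    """Push the measure x forward along the index map f: atoms with the
    same image are brought together and their weights add."""
    y = [0.0] * n_target
    for i, w in enumerate(x):
        y[f[i]] += w
    return tuple(y)

X = (0.5, 0.25, 0.125, 0.125)         # four drops of water
f = [0, 1, 1, 2]                      # merge the two middle atoms
Y = apply_morphism(X, f, 3)
assert is_object(X) and is_object(Y)  # total mass is preserved
print(Y)                              # (0.5, 0.375, 0.125)
```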
27:41
These finite spaces correspond to the little machines, which have finitely many windows, and you have the frequencies of the blinking; these frequencies are the weights, so you have the measure space of the machine. The morphisms are when you go from one to a smaller one.
28:03
But then there is the big one, which is not of this nature: the original physical system is not like that, it's kind of infinite, while these are finite. There are finitely many windows, blinking, and the weights are these frequencies. And when you look at this with limited means, for example you only count the blinking here and here,
28:21
you have a morphism from one finite measure space to another. What about the big one? The big one is the one you measure by means of the small ones, and you know perfectly well what it is in categorical language: it is a covariant functor from this category of finite measure spaces to the category of sets.
28:42
You don't have to think; you know what it is. With this language, if you know it, you do it effortlessly, and then you go on. All measure theory, as we know it, comes effortlessly from there: all statements and theorems and proofs of measure theory are part of this categorical language.
29:01
The language just flows: you say all the standard sentences, absolutely trivially, without thinking, and you have measure theory in your hands, once you have this category. And this is what people do in measure theory usually, but their language is 19th-century language, and this creates problems. And historically the point is interesting.
29:20
For me the justification that this language is bad was that when entropy in dynamical systems was invented by Kolmogorov, in, I keep forgetting, 1958, his first paper contained mistakes. In this context it's so trivial that there is no room for mistakes; it happened exactly because he was using his own language,
29:42
the language of the measure theory he himself had created, and this language is not adequate here. It only creates problems; it's just weight you carry with you, absolutely unnecessary, and you make mistakes. Of course the ideas were completely right, and the ideas are like that, kindergarten ideas, but the language is very awkward. And so when you write in it,
30:02
you carry complete ballast, and you make mistakes in this ballast, because it is ballast. That was corrected by Sinai, and so it's called Kolmogorov-Sinai entropy. But this, again, concerns the ballast; it doesn't concern the core, which is what I said. Now let us turn to entropy. Eventually, the point is to understand
30:21
what the entropy of this kind of physical system, like our crystal, is, in these terms. The point, again, is that the physical idea of entropy
30:51
is of something which you measure many, many times, and then you average, and then you get it. The formula you would write for a finite measure space is −Σ pᵢ log pᵢ.
31:04
On the other hand, if you look at this picture, you measure something here, then you take an identical thing and measure it again, and take an identical thing and measure it again, and look at the coherence of these measurements. This, of course, is extremely messy stuff.
31:22
However, if you want to count this number of states directly, you have to repeat the experiment too many times. On the other hand, there is this very concise formula, and that's exactly what the formula is: it's not the definition of entropy, it's a very efficient way to compute it, suggested by Boltzmann and mistaken for the definition by mathematicians, because it was a nice thing to work with.
31:41
Actually, it was not Boltzmann but Planck who first wrote down this formula. But again, it's a formula, just a computational formula, not the definition. So what would be the definition? I give you first the shape of this definition,
32:01
and then decipher it. So I have my category of finite measure spaces, call it FM. Given any category, you can assign to it a Grothendieck group, or a Grothendieck semigroup, and we shall do that with a certain care.
32:23
We have this in the back of our minds, but you do it with some care. This is preferably assigned to morphisms: we take the group, or semigroup, generated formally by the morphisms,
32:40
with the relation that if you compose f and g, then the class of the composition in the Grothendieck group is f plus g. So it's an abelian group or semigroup; I prefer the semigroup in this context. Very simple: it's the abelianization of your category.
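In symbols, the single defining relation of this Grothendieck (semi)group, as just described: one formal generator per morphism, with composition turned into addition.

```latex
% Grothendieck semigroup of a category: one generator [f] per morphism f,
% with the single relation that composition becomes addition:
\[
  [f \circ g] \;=\; [f] + [g],
\]
% which is exactly the abelianization of the category.
```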
33:03
In a first approximation, the entropy of a finite measure space is its image in this Grothendieck group, and that's it. But then, a posteriori, this group or semigroup is topological. Actually, this category also has some topology,
33:22
which I suppress at this moment; I somewhat suppress it, but it will come up in a second. If you just say it like that, it will be a huge abelian group or semigroup, uncountable, something horrible; but still, by the way, if you just limit yourself to special spaces, it's not completely abstract.
33:40
But it's topological, so you only look at it, do everything, continuously. In any case, I want to kind of bypass this and come closer and closer to the core of it, to explain how it actually works and how to look at this
34:03
from a somewhat different perspective. That's the idea, because in fact the correct statement is that it is not the Grothendieck group of this category, but of a non-standard model of this category. Given any suitably defined category,
34:21
and we shall return to this, this connects to more modern results in ergodic theory, where you have to work with non-standard models of groups and of some more categories. But now I want to look at this in a more naive way, through the following example, which I love very much.
34:42
It surprised me a lot when I learned it. This is the Loomis, I'm not sure how to say it correctly, the Loomis-Whitney inequality. I already spoke about this inequality,
35:03
and I want to repeat it again, because here the idea is extremely clear and again forces you to think about other mathematical problems. You consider a measurable set in Euclidean space, in three-dimensional space; it's true in any dimension,
35:20
but this is the first non-trivial example. And you consider the three projections to the coordinate planes, the x, y, z planes: the x,y projection, the x,z projection and the y,z projection. So you have three domains; the set has a certain volume,
35:41
and the projections have certain areas a_xy, a_xz and a_yz. And the inequality says, not surprisingly, that the volume squared is less than or equal to the product of these three: vol² ≤ a_xy · a_xz · a_yz.
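A quick check of the discrete version of this inequality (my own sketch): for any finite set A in Z³, the squared cardinality is bounded by the product of the sizes of the three coordinate projections.

```python
# Discrete Loomis-Whitney: |A|^2 <= |P_xy(A)| * |P_xz(A)| * |P_yz(A)|
# for any finite A in Z^3, verified here on a random set.
import random

random.seed(3)
A = {(random.randrange(6), random.randrange(6), random.randrange(6))
     for _ in range(60)}
xy = {(x, y) for x, y, z in A}        # projection to the (x, y) plane
xz = {(x, z) for x, y, z in A}        # projection to the (x, z) plane
yz = {(y, z) for x, y, z in A}        # projection to the (y, z) plane
assert len(A)**2 <= len(xy) * len(xz) * len(yz)
```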
36:09
And this is a weak, I mean sharp, but in a certain sense not sharp, form of the isoperimetric inequality. Because if you have a domain
36:20
and you know its boundary area, of course the area of each projection is smaller than that area, and this bounds the volume. So you have the volume bounded by the correct power of the area, except that the extremal shape here will be not a ball, but a cube.
36:43
Which is somewhat amusing, because even in this form it's a very powerful isoperimetric-type inequality; it implies, for example, the Sobolev-type inequalities, half of mathematical physics, as trivial corollaries. And so how do you prove it? There is the proof by Whitney
37:06
and Loomis, which is just some simple computation with the inequality; it's very easy, and so what? It's kind of strange. But the right way, in my view,
37:21
the right way to think about this is as follows, and it explains this Grothendieck philosophy. You observe, first, that it has nothing to do with Euclidean space. All that is essential is that you have a product of three measure spaces,
37:46
which I call, according to my notation, X, Y and Z. I have a subset of the product, and it projects to X × Y, Y × Z, etc.
38:00
You look at these measures, and you have the same inequality. I never use the Euclidean structure here, because for measurable sets, measure-theoretically, it's all the same. Actually, I'm speaking here the measure-theoretic language, which, as I already said, is a very bad language. By the way, I said that a measure space is a functor from the category of finite measure spaces to the category of sets,
38:24
which I didn't quite explain yet. But on the other hand, this language is so sticky that we still use it, yeah? And there is a good reason not to use the measure-theoretic language, because it's really not correct: the way the facts are usually stated in measure theory, they're just wrong.
38:41
Everything said this way is wrong, because a measure space is not a set; its category is not a category of sets. If you want to say it's a set: it's not a set. On the other hand, you can say 'set' if you remember, if you know what you're really saying. Okay. And once you put it this way,
39:01
you observe: if I take Cartesian powers here, nothing changes, all these numbers just go to their n-th powers. Therefore, if you can prove this inequality for large n, even approximately, with some error,
39:20
then when you take the n-th root back, you get what you want. So you only have to prove it approximately, but after taking the power. Now let me say it carefully, and come back to this in a second; what I'm going to say is not quite true here, but almost.
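In symbols, the tensoring trick just described (my gloss): volume and projections are multiplicative under Cartesian powers, so an approximate inequality with any multiplicative error C self-improves to the sharp one.

```latex
% Multiplicativity under Cartesian powers:
\[
  \operatorname{vol}(A^{\times n}) = \operatorname{vol}(A)^{n},
  \qquad
  \pi_{xy}(A^{\times n}) = \pi_{xy}(A)^{\times n},
\]
% hence an approximate inequality for A^{x n} sharpens after an n-th root:
\[
  \operatorname{vol}(A)^{2n} \le C \,(a_{xy}\, a_{xz}\, a_{yz})^{n}
  \;\Longrightarrow\;
  \operatorname{vol}(A)^{2} \le C^{1/n}\, a_{xy}\, a_{xz}\, a_{yz}
  \;\longrightarrow\;
  \operatorname{vol}(A)^{2} \le a_{xy}\, a_{xz}\, a_{yz}.
\]
```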
39:41
When you go to higher and higher powers here, by the law of large numbers this set, an arbitrary set, will kind of converge to a cube. And in the limit you have this cube, where you have equality; which is not quite true, exactly as I said: it will converge to the cube up to something,
40:01
an error in your favor, and then in the limit you just say: huh, I have equality; I take the n-th root, and I have the proof. So it's a completely effortless proof, once you think a little bit about what happens at higher and higher powers and apply the law of large numbers. But now I want to formulate the law of large numbers,
40:23
because it motivates what we do; on the other hand, the definition with the Grothendieck group doesn't need it, it just comes at the end. So what is the point of the law of large numbers in this context? It applies to finite measure spaces. I have a finite measure space, which I want to write as a collection of atoms,
40:41
a finite collection of atoms. It says: I take the Cartesian power of this, which is quite simple, right: I have this thing and this thing, and in every cell I put the product of the two. You remember, this was exactly the essence of the
41:04
most elementary Mendelian type of map; it was doing exactly that. So we are not far from where we were. This Hardy-Weinberg theorem, or Mendelian principle, says: if I take a matrix
41:20
where the sum of all elements is normalized to equal one, I replace every entry by the product of the row sum and the column sum, and normalize the resulting matrix again so the sum is one. So I have a mapping from the space of matrices into itself, and the square of this map equals the map itself, M ∘ M = M.
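Here is this map in code (a minimal sketch with my notation): entry m[i, j] is the frequency of the genotype (i, j), random mating replaces it by the product of the marginal row and column sums, and one application already reaches the fixed point.

```python
# The Hardy-Weinberg / Mendelian map on normalized frequency matrices:
# replace each entry by the product of its row sum and column sum.
import numpy as np

def mendel_map(m):
    m = m / m.sum()              # normalize total frequency to 1
    p = m.sum(axis=1)            # allele (gamete) frequencies: the marginals
    return np.outer(p, p)        # genotype frequencies after random mating

rng = np.random.default_rng(0)
m0 = rng.random((3, 3))
m0 = (m0 + m0.T) / 2             # a symmetric "population" matrix

m1 = mendel_map(m0)
m2 = mendel_map(m1)
assert np.allclose(m1, m2)       # one round of mixing reaches equilibrium
```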
41:42
This was the paradox resolved by Hardy and Weinberg in a special case, though they never formulated it in these terms, just writing explicit formulas; I showed you last time this formula, if you write it for a two-by-two
42:03
symmetric matrix. And this is exactly what we are doing here: this is the square of a space, and now we take this power, and the point of the Bernoulli theorem is that when n goes to infinity, this space converges, in some sense, to a space with equal entries,
42:22
which I would call a homogeneous space. 'Equal entries' I don't want to say, because it depends on the particular category I'm speaking about, but being a homogeneous object in the category makes sense always, for any category: this concept of a homogeneous object in a category.
42:41
So every finite measure space, when you take a high power, becomes essentially homogeneous. And to say it precisely in a compact way, you have to say: I take P and take its n-th power, with n a non-standard, infinitely large number, and this space, as a non-standard space,
43:01
is homogeneous up to higher-order infinitesimals. And when you pursue this language, you see that this is really how it works. Now let me recall the precise definition, in what sense it converges. And moreover, and this is what is essential,
43:21
it's not only true for an individual space, it's true for morphisms. It works coherently: given any diagram of maps, when you go to a high power, the whole diagram is approximated by a corresponding diagram where all spaces are homogeneous, with morphisms between homogeneous spaces.
43:41
By the way, when the spaces are homogeneous, the morphisms between them are just decompositions of numbers, right? Atoms are projected onto atoms, and because you have equal weights, it means here you have the number p times q, and there the number p. So you throw away some factor;
44:02
you decompose the numbers into products. So the category becomes extremely simple, just a category of numbers, the multiplicative category of numbers. And I'm saying that when n goes to infinity, our category converges to this other category.
44:21
So in the limit, and actually it's a non-standard limit; we shall see that this formalism is essential by the end of this lecture. Okay, but now let's say it explicitly, because you have to decipher it once,
44:40
and then you can forget it, speak in this kind of language, and feel very comfortable; but once, we have to say explicitly in what sense it converges. And you don't have to know what non-standard analysis is; in a way it just says: follow your naive intuition about whatever, and just do it, right? As it was always done. And then there is this justification,
45:01
which, by the way, is not at all a true justification, because it depends on some axiomatics which may or may not be accepted, right? Some people wouldn't accept it. Okay. So what is the essential convergence? There are two topologies involved,
45:22
two types of equivalence, and they're really different, though eventually you mix them up: one corresponds to the additive structure on probabilities and numbers, the other to the multiplicative one. The additive one is very simple: when you have two finite measure spaces, you say they're close,
45:40
epsilon-close, if one is obtained from the other by removing or adding some measure epsilon. It may be many, many atoms, but of total mass epsilon, right? You keep some bunch of atoms, of total weight one minus epsilon, unchanged, and you do whatever you want to the others:
46:02
you spread them, condense them, whatever. So these two will be additively epsilon-close. And then there is the other one, multiplicative, and here we can use the normalization, I hope I say it correctly: you multiply everything by a constant,
46:23
all atoms multiplied by a constant, such that, I hope I'm not confusing it, the log of the constant divided by the log of the number of elements in your space goes to zero.
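The two equivalences in symbols; this is my reading, and the lecture itself hedges on the exact normalization.

```latex
% Additive: X ~ X' if they coincide outside atoms of total mass epsilon.
% Multiplicative: X ~ cX, every atom weight scaled by a constant c with
\[
  \frac{\log c}{\log N} \;\longrightarrow\; 0,
\]
% where N is the number of atoms (taken along the n-th powers, n large).
```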
46:41
Again, everything is applied for infinitely large n's; this ratio should be infinitesimal. This is what I'm saying: it's very inconvenient to speak all the time about sequences; it's one n, but infinitely large, which means you're really working with sequences. And I'm always confused whether you take the log of c or divide by it; we shall see in a second, when I formulate the law of large numbers,
47:00
and then it probably becomes correct. Except here there is one little point: you have two spaces, and they have different numbers of atoms, so which one do you divide by? You divide by the smaller one; but again this is conventional, because in truth you have a sequence depending on n,
47:20
and this is your normalizing element; you divide by this n log n. And this, I think, is an essential point; it goes in the direction of large deviations, large or small deviations, I always forget what it's called.
47:42
Now, one equivalence is independent of the additive structure: here I always multiply by a number, and there I just add or subtract some mass. Probability theory is exactly the play between the additive and the multiplicative structure; they're very different here, and all the equivalences have to be proved separately, something invariant for the additive,
48:02
and something invariant for the multiplicative. And then, when I have two spaces X and Y, I take their powers and see if the two sequences, for infinitely large n, are equivalent; if so, I call them Bernoulli equivalent. And the Boltzmann entropy is a class of Bernoulli-equivalent spaces.
48:25
By the Bernoulli theorem, and in a second I'll explain why, such a space for large n is homogeneous, and so the number of atoms is, in a sense, well defined.
48:40
You take the log of this number, and that is the entropy; spaces are Bernoulli equivalent if and only if their entropies are equal. And if you compute this entropy by going there, you immediately see, by a trivial computation, that it is given by the formula −Σ pᵢ log pᵢ, just because both behave correctly under products.
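A numeric illustration of this "entropy = log of the number of states" (my own, for the two-atom space (p, 1-p)): atoms of the n-th power with about pn letters of the first kind carry almost all the mass, and the normalized log of their number approaches H = −(p log p + (1−p) log(1−p)) as the window shrinks and n grows.

```python
# Asymptotic equipartition for the n-th power of a two-atom measure space.
import math

def logaddexp(x, y):
    m = max(x, y)
    return m + math.log1p(math.exp(min(x, y) - m))

def log_binom(n, k):
    return math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)

p, n, delta = 0.3, 5000, 0.02
H = -(p * math.log(p) + (1 - p) * math.log(1 - p))

mass, log_count = 0.0, None
for k in range(n + 1):
    if abs(k / n - p) <= delta:                   # a "typical" class of atoms
        lc = log_binom(n, k)                      # log(number of such atoms)
        mass += math.exp(lc + k * math.log(p) + (n - k) * math.log(1 - p))
        log_count = lc if log_count is None else logaddexp(log_count, lc)

print(mass)               # ~ 0.998: typical atoms carry almost all the mass
print(log_count / n, H)   # ~ 0.627 vs 0.611: close; equal in the limit
```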
49:02
Now, why the Bernoulli theorem, why is it the law of large numbers? Usually the law of large numbers is formulated differently, but this is exactly the law of large numbers for the random variable log p. Though you don't have to say it that way yet.
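What "the law of large numbers for the variable log p" means here, in symbols (my gloss): the normalized log-weight of a random atom of the n-th power space concentrates at the entropy of the original space.

```latex
% For independent draws i_1, ..., i_n from the weights (p_1, ..., p_k):
\[
  -\frac{1}{n}\,\log\bigl(p_{i_1} p_{i_2} \cdots p_{i_n}\bigr)
  \;=\; -\frac{1}{n}\sum_{j=1}^{n} \log p_{i_j}
  \;\xrightarrow[\,n \to \infty\,]{}\;
  -\sum_{i=1}^{k} p_i \log p_i \;=\; H.
\]
```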
49:21
From a certain point of view, this is a better law of large numbers than the usual one, because it involves no extra random variable; it's just one measure space, a property of the category itself. Though, of course, one must be a little careful; it is, of course, a little bit of cheating, because there are two measure spaces here, right?
49:42
Secretly, one has the weights p_i, and the other gives equal weights to all atoms. And indeed for Boltzmann there were two measures: there was Euclidean space, with its invariant measure, and the second measure, the distribution of his ensemble,
50:01
and that is where the entropy came up. So entropy is, in fact, a property of two measures. Yes, it's a measure space with an extra weight function, which gives you a second measure.
50:21
This was how Boltzmann thought about it, and this is the Shannon interpretation. I don't know exactly, because in what Boltzmann writes he of course doesn't say it explicitly, but in his arguing with mathematicians he shows that this is what he had in mind. And he had in mind, of course, non-standard analysis,
50:40
which was not, of course, available in rigorous form. In fact it wasn't well established; essentially, it goes back to Leibniz, but then came people like Weierstrass and Cauchy, who said it's no good and changed it to epsilon-delta, and of course everybody thought in those terms, and so
51:00
Boltzmann was just at the wrong moment, yeah. Nobody understood what he was saying; I mean, mathematicians had no instruments for understanding what he was saying, and argued with him in a silly way. Anyway, this is the point. Now I just want to explain
51:22
how we use this to establish the basic inequalities. What is the basic inequality? The Shannon inequality. So let me come back to this abstract measure space,
51:41
even though the inequality itself is about finite measure spaces. Why an abstract measure space? When you have an abstract measure space, in what sense does it define a functor on the category of finite measure spaces? The functor is extremely simple:
52:02
it is just the set of maps from X; for a finite measure space (A, p), it's the set of all measurable maps from X to it. We haven't said what measurable maps are, but that is exactly the point: you have this X as an object of some category, with the morphisms of this category, and there are no points. You see,
52:20
the point is that there are no points, because to define a measure space in the usual way you have to speak about all sets of measure zero, and the collection of these is more than a continuum. Therefore you need to invoke fully developed axiomatic set theory. And certainly no mathematician actually knows this theory, yeah? It's a very messy and unpleasant theory,
52:40
and who knows whether it's right or not; it's logic. And it secretly sits there. Of course you don't care, because, as I'm saying, you never actually use it. All this measure theory: categorically, all you have to know for our purpose is this functor, and in classical language
53:01
the convenient picture is still a partition into finitely many elements with different weights. But the point is that it's convenient, categorically, to give it a name like that and have a map into something. Then it doesn't have to be a set; it's just a functor. Another way to think about it: it maps to all finite measure spaces,
53:21
so here is the category of finite measure spaces, and it's convenient, as we shall see in a second, to take a small subcategory, so that it is a set, not just a class. At the top we have this X, and it can go to any space down here; so at the top of this pyramid you place this object. Now, what is the relevant
53:41
basic operation? For partitions, you can intersect them pairwise. So if you have two spaces under X, you can form P times Q. The meaning of that: if you have a physical system and you measure it with one
54:04
detector, and then take another one and consider them simultaneously, recording how frequently readings appear in pairs of windows, that corresponds to this operation. And again, the whole point is that you don't know what the big thing is.
54:22
You don't care; sets or no sets, it's absolutely irrelevant, it's not a set. Taking a set of everything is of course a rather ad hoc abstraction; we are just used to it, and it is in a way a useful operation, whatever meaning one assigns to the set of all of them.
54:41
But the experience of recent decades says it's much better to do this categorically; it just gives you a better language than set theory. Of course, either of them is only language. So the product corresponds to this intersection, and it again has to be defined categorically. What does it mean?
55:00
You have this functor, you regard it as an object of a category, and then it means the following: you have X going to P, X going to Q, and there is a kind of minimal object R here through which you can split this diagram. And this category
55:21
has this property. This is essential: the category of finite measure spaces has this property. No, I said it incorrectly: saying that X is a measure space means you have a functor with this property, so you always have this R. And then you know all you have to know
55:41
about measure spaces. Everything can be said in this language in an extremely simple way; everything translates here. Now, this is by the way interesting. Is it a pushout? Hmm? Well, this R will be exactly that.
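As a concrete toy model, in the set language that the lecture deliberately avoids (purely for illustration, with ad hoc names):

```python
from collections import defaultdict

# A finite measure space is modeled as a dict: atom -> weight.
# A "detector" assigns to each point of the sample space a window label.

def pushforward(mu, f):
    """The measure induced on the windows of a detector f."""
    nu = defaultdict(float)
    for x, w in mu.items():
        nu[f(x)] += w
    return dict(nu)

def join(f, g):
    """The product P.Q of two measurements: read both detectors simultaneously."""
    return lambda x: (f(x), g(x))

# Toy sample space X: four equally likely microstates.
mu = {0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25}
color = lambda x: x % 2    # first detector, two windows
parity = lambda x: x // 2  # second detector, two windows

P = pushforward(mu, color)
Q = pushforward(mu, parity)
R = pushforward(mu, join(color, parity))  # the minimal R refining both P and Q
print(P, Q, R)
```

Here R refines both P and Q and is minimal with that property: the diagrams X to R to P and X to R to Q split exactly as described.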
56:07
So this is actually a coproduct. And there is only this operation; any other operation is silly from this point of view. There is another operation, but it's completely silly. I mean, it exists, but it's set-theoretic, and it's a
56:21
perversion, sort of. It enters at some much more sophisticated level; you can state it, but I'd better not. It's alien to the structure, an artifact of something, so I'm not even saying it. Forget about it; it belongs to a completely different level,
56:41
by two orders higher, of structure, which is not needed here. So this is what we have. We have this category, and now I want to formulate and prove the basic inequalities. And the basic inequality here is
57:01
the Shannon inequality, which says that the entropy of P times Q is less than or equal to the entropy of P plus the entropy of Q.
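A quick numerical sanity check of the inequality on random joint distributions (an illustration, not a proof):

```python
import numpy as np

def ent(p):
    """Shannon entropy -sum p log p, ignoring zero weights."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)
for _ in range(5):
    r = rng.random((3, 4))
    r /= r.sum()                         # joint measure R = P.Q on a 3x4 grid
    p, q = r.sum(axis=1), r.sum(axis=0)  # the two projections P and Q
    assert ent(r) <= ent(p) + ent(q) + 1e-12
    print(f"ent(R) = {ent(r):.4f} <= ent(P) + ent(Q) = {ent(p) + ent(q):.4f}")
```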
57:21
Now, again, this is physically kind of obvious, right? We make two observations and count what we see, count the number of possibilities. If we make them together and they don't interact, don't obstruct each other: for example, you may have certain colors blinking here and there,
57:40
and if they blink completely independently, of course the numbers of possibilities multiply, so the logs add, right? But if they interact, some states may exclude others; you have fewer possibilities, and then the inequality is obvious, because you already have this picture of entropy being the log of the number of states. Now,
58:00
how can we see this in our language? I ask because of how the proof typically goes: what mathematicians do is write out the entropy as a sum of logs, use that log is concave, and they're quite happy. I personally can't accept that. Why is log concave? You have to differentiate, you have to know calculus, and you shouldn't have to know that. You can translate
58:20
this physical reasoning into categorical language, and then it becomes just as obvious without calculus, and in particular it proves that log is concave: the concavity of log follows from this physical language rather than standing behind it. And the proof is as follows. You have this inequality, and, you know,
58:40
again, you can raise everything to the power n, right? Because everything is multiplicative. Everything becomes homogeneous, and once it is homogeneous, the inequality reduces to a statement about cardinalities.
59:00
Let me maybe reformulate it slightly differently. If you state this inequality in terms of the two projections, to P and to Q, it says: we have some weights, and I project them here and here. Here is my P, here is my Q, and here is my R. So there is some measure here,
59:20
one projection here and one projection there, and this is R, R meaning P times Q. The inequality says something about these matrices: here is the projection giving the measure P, here the one giving Q, and the entropy of P plus the entropy of Q dominates the entropy of R.
59:41
Now, if I take Cartesian powers of all of them, they all become homogeneous, and what I see is that this R^n sits as a subset inside the product of P^n and Q^n; and because the cardinality of sets is monotone under injective maps, you have it. And this is indeed
01:00:00
strange, because in measure theory all morphisms are surjective, and here you use injectivity of the maps. This will be fundamental in the theory, a not at all obvious point, and it was understood rather recently,
01:00:21
in the work of Lewis Bowen. So how can injectivity be seen categorically, and what is injective here? The point is that it appears exactly in this diagram, R going to P and R going to Q. This diagram
01:00:44
is, as a whole, injective. You pass from the original category to the category of such pairs of diagrams, where a morphism is now a pair of morphisms. If you take these as the morphisms, you get a new category of flags, as one might call them,
01:01:09
and then this will be an injective morphism in the sense of category theory. That is another way to formulate it, and it is suggestive for other categories.
01:01:25
So entropy is monotone on the category of certain diagrams. You see, you have to pass from surjective morphisms to injective ones, and the proof is exactly like that; this Shannon inequality tells you that.
01:01:41
Another point: the other Shannon inequality which one usually uses is slightly different. It states an inequality for morphisms as well. And here I want to show you the advantage of these notations.
01:02:04
So, for morphisms: with the usual definition, what I call the entropy of f is traditionally called relative entropy, and it's written as an entropy of p comma q.
01:02:21
For me, it's a morphism: f is a morphism from P to Q. The trouble with the traditional notation is, of course, that you never know which one is p and which is q; and secondly, you have three symbols, p comma q, instead of the one symbol f, so the formalism becomes exactly three times as long. I actually wrote it down to check: what happens takes half a line instead of a line and a half.
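Written out (again a reconstruction of the board, in the notation just introduced), for a morphism f: P to Q, that is, a measure-preserving map collapsing the atoms of P onto those of Q:

$$\operatorname{ent}(f) \;=\; \operatorname{ent}(P) - \operatorname{ent}(Q) \;=\; \sum_{b \in Q} q_b \,\operatorname{ent}\bigl(f^{-1}(b)\bigr),$$

where the fiber f^{-1}(b) carries the conditional weights p_a / q_b. In the traditional notation this is the conditional entropy of the pair, written with p and q.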
01:02:44
Meanwhile, can I ask a question about the formalism? Are you allowing partially defined maps between your finite measure spaces? Partially defined maps between the finite measure spaces: in this context, no. But implicitly they are there. Because the measuring device, it
01:03:03
No, no, no: you measure the whole thing. The measuring device measures the whole system. However, technically it is convenient to have partially defined maps; they're just not in this category. As usual, because the subject is slightly analytic, the categorical language may need slight adjustment.
01:03:21
And indeed, you have to allow partially defined maps just for defining the topology and the limits that are needed. But the category itself is just the category as it is, with its defining properties. OK. So you see the difference between the formulas over there
01:03:41
and how you write it in the categorical way. This one is easier to remember, and that one is certainly a mess, already unpleasant; if you make computations, it makes a lot of difference. Also, and this is what's good about it, very typical for this categorical logic: relative entropy makes sense
01:04:04
even when the entropies of the individual P and Q make no sense. In the simplest case, the entropy of the morphism is the difference of the two entropies, if they exist. But it may happen that both P and Q are actually infinite spaces, so the entropies of P and Q are not defined, yet the difference is.
01:04:23
Just to follow up on this question: instead of defining entropy for functions on the sets, could you have partially defined functions? No, they are morphisms; I don't know what a function is. Or partial morphisms? Morphisms: you have a category, you have morphisms, period. They are not sets. What if you also have relations?
01:04:41
No, no: that is exactly the wrong language. There are no relations. There is a category and there are morphisms, period. That is exactly the power of this language: you don't need that kind of junk, partial maps and so on, I don't need this junk. A category is a very clean language. But in the end, if you have it for relations, you have it for the graph. No, I don't have a graph.
01:05:00
I don't have a graph; it's purely categorical language. That's the whole point, I insist: you can express everything in purely categorical language; you don't need set-theoretic language. All this business about relations, partial sets, functions: they all pop out of it. You secretly have them because you're used to them, you keep them at the back of your mind. But I prefer to keep the physical system at the back of my mind.
01:05:23
You say there are no relations in categorical language? Sure. I mean, you can bring them in, but they are a kind of junk. You see, the whole point of the categorical language is that it emphasizes the relevant relations and throws away the junk. Otherwise you carry all this junk with you,
01:05:42
you don't know what to do with it, and it accumulates. The categorical language is much cleaner. Not ideal, I guess; not perfect, and it has some drawbacks here, which we shall maybe see today. It's not the end of the world. One of the points is that you have to work in some completed category
01:06:02
when you bring in non-standard analysis and have to pass to the limit properly. But my point was to show you, just with this example, that the formulas become better. Usually, instead of ent(P), people write a single letter, which again for me is impossible, because you never remember which one; and they differed from period to period. I think Boltzmann was using S; today people use H.
01:06:21
But with formulas like these you don't need that; this is the most complicated formula we ever use. If you start elaborating the junk, of course, you get long formulas, and to fit in one line you're forced to use one letter. So I'll take the square root. What square root? There is no square root here. It's a relation. No, there is no square root here.
01:06:41
OK, that is about numbers; another story. So this is just another related formula for this kind of diagram, which is useful. And coming back to the original example:
01:07:04
what is actually proven here? One thing was Shannon's inequality, and another is a kind of obvious inequality which hardly needs stating: the entropy of any finite measure space is less than or equal to the log of the cardinality
01:07:21
of the space, with equality if and only if the space is homogeneous. This, again, is true by the same logic: you pass to the limit and watch one space sit inside another; you don't have to make any computation with log. However, here we should admit that this formalism has a disadvantage compared to formulas:
01:07:41
you cannot prove that equality holds only for homogeneous spaces, because the argument works by approximation, and for the equality case you need something like the Stirling formula; you need the analyticity of log. Not much, but to pin down the case of equality you need the analyticity of log. So log has a deep meaning.
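The inequality, with the counting argument compressed into one line (a standard reconstruction):

$$\operatorname{ent}(P) \;\le\; \log \operatorname{card}(P),$$

with equality exactly for homogeneous spaces. Indeed, the typical set of P^n consists of about e^{n ent(P)} atoms and sits inside the full set of atoms of P^n, which has cardinality card(P)^n; since a subset is no larger than the set, n ent(P) is at most n log card(P).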
01:08:01
And why take log, by the way? For physics it's reasonable: you have two separate chunks of matter, and you want entropy to be additive, to behave like mass or like energy. It's additive, so it's reasonable. But mathematically, why take log? There is, not an explanation, but a phenomenon
01:08:23
which tells you it's the right thing to do: the Fisher metric. Behind it there is some deeper reason at which I can only guess. I'll say it in a short while; for now we
01:08:42
have this property. Now, coming back to this Loomis-Whitney inequality: in fact, a much stronger inequality is true. Namely, instead of the cardinalities of the projections you can use their entropies. Because here, first,
01:09:01
you can take everything discrete. Discrete or continuous: I haven't described the continuous setting formally, but you can do it, or you can approximate everything by discrete, regarding the body as a subset of a discrete measure space. For a subset with uniform measure, log-cardinality and entropy agree. But the log-cardinalities of the projections can be replaced by entropies, which are smaller,
01:09:21
and the inequality still holds true, so it becomes a stronger inequality. When you come to analysis, this corresponds to the so-called log-Sobolev inequalities, which we prove later, and which are a strengthening of the usual Sobolev inequalities. So this contains much more meat.
01:09:43
But again, there is a big mystery here for me: what happens with the ball? There is no similar theorem in which the ball is the extremal case; there is no sharp log-Sobolev inequality whose extremizer is the ball.
01:10:01
The best you can get is the Gaussian distribution, that sort of thing, but not the ball, which is also quite non-trivial. Anyway, if you write the Shannon inequality in this example, it really is the Shannon inequality that gives you this. And in this kind of limiting argument, of course, it's much easier for me each time
01:10:21
to repeat the limiting argument: you go to the limit, you count sets, and you know the answer. You don't have to remember formulas; all you have to know is that the cardinality of a subset is smaller than the cardinality of the set. That is essentially the only formula you have to know. Now, how do you go from there to the next step?
01:10:41
Because we had an infinite system, and what are the invariants of an infinite system? My infinite system was a crystal, right? So I had this lattice, say Z^3, and at each node I had a particle.
01:11:01
Imagine each particle may be in finitely many states. So I have a finite measure space at each point, and I take the infinite product: the finite measure space raised to an infinite power, indexed by a countably infinite set rather than a finite one.
01:11:21
What is it? What kind of object is this, what are these products? Again, in the categorical language you just say what the admissible morphisms to finite measure spaces are, and the basic ones
01:11:42
are the projections. That is all you have to know; there may be others, right? From this I want to define entropy and prove the theorem of Kolmogorov. So let me recall this theorem of Kolmogorov, which was an open problem for quite a while, and which is stated in the traditional language of measure theory.
01:12:01
Put this way it is, formally speaking, a more general theorem, but in reality it is the same. It says: given a finite measure space P (Kolmogorov was concerned with the case of the integers), take P to the power G, and take another finite measure space Q and Q to the power G. So we look at all sequences; this P is a collection of atoms,
01:12:25
and this Q is a collection of some other atoms, so I have sequences weighted by p or weighted by q. The easiest example, maybe: one space has two atoms of equal weight, the other three atoms of equal weight. And the question in ergodic theory is
01:12:42
whether there is a measure-theoretic isomorphism between the two spaces which commutes with the action of the group. When you have a space with a group action, you have a new category, whose isomorphisms are those commuting with the action; again, a general categorical construction.
01:13:00
And you want to know this in this particular instance; it was unknown just for this example. Kolmogorov showed that if such an isomorphism exists, then the entropy of P equals the entropy of Q.
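In symbols, what was presumably on the board:

$$\bigl(P^{\mathbb{Z}}, \ \text{shift}\bigr) \;\cong\; \bigl(Q^{\mathbb{Z}}, \ \text{shift}\bigr) \quad\Longrightarrow\quad \operatorname{ent}(P) \;=\; \operatorname{ent}(Q).$$

For the example above, ent(1/2, 1/2) = log 2 while ent(1/3, 1/3, 1/3) = log 3, so the two-atom and three-atom Bernoulli shifts are not isomorphic.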
01:13:25
This was his theorem. And you can ask the same question for any group G: substitute any countable group here. And it's unknown.
01:13:40
It's still unknown. Though, as I said, recently great progress was made by Lewis Bowen, the next step in this. Kolmogorov proved it for Z, and, as I explained, everything is just the same for Z or for any amenable group; the difference is inessential. It's unclear to me whether the physicists implicitly had this
01:14:02
or not, because of course they were never looking at Z: a one-dimensional system without interaction, who cares? You cannot imagine physicists being concerned with a system of particles on a one-dimensional crystal which don't interact; it's not a really exciting system. They were studying three-dimensional systems with interaction. And there was a big literature, preceding
01:14:22
Kolmogorov's work by about five years, proving very deep theorems, much more complicated than this. Whether they implicitly had this invariance is not clear to me, because I haven't read those papers, of course; you have to read and understand them, and they are written in a different language, which is not so easy.
01:14:41
I'm curious whether the physicists knew it. But the gap was short: this was '58, and that earlier work was from the early '50s. You said countable; is that what you mean? The group is countable, yes.
01:15:01
Is it the same there? It's the same: countable groups and finitely generated groups give the same class of questions; a finitely generated group is in particular countable, and all we assume is that the group is countable. But I'm asking, is it known? It's unknown even for finitely generated groups. It's known for all kinds of examples of groups,
01:15:22
but nobody can show me a specific group for which it's unknown, because of this theorem of Bowen. I guess there are potential candidates of groups for which it may fail, but for any group you actually exhibit, it holds. And again, this is related to another question in group theory, exactly about these non-standard models.
01:15:41
So how does the proof go, and why non-standard analysis? Here non-standard analysis is absolutely crucial, but not for the Kolmogorov case. So let me explain the Kolmogorov case, which is very easy. Again, we just look at crystals. The difficulty with the group Z is that it's too primitive, too simple, to see anything;
01:16:02
Z^3 is much more fun. Actually I prefer the lattice Z^2, where you can make pictures. So we have, in principle, to define what the entropy of this kind of physical system means.
01:16:23
And what do you do? Actually, I have essentially shown you already how to do it; so, just some formulas, and an explanation of what they mean.
01:16:42
Certainly your system is infinite, right? And since it's infinite, of course the entropy must be infinite, so you have to normalize it per piece. You have a huge crystal,
01:17:02
you take some big chunk of matter, so it becomes finite. You take the entropy of this piece; just imagine this is what you observe on this piece, see what you see there. Take this entropy and divide it by the total number of atoms,
01:17:21
or by the volume, whatever: normalize, and go to the limit. The limit may or may not exist; if not, you take a sublimit. But in physics the limit should exist, yeah? In these examples it usually does. And if you have a crystal, then of course you are aided by the symmetry:
01:17:41
a piece and a shifted copy of it must have the same entropy, so the normalized quantity is essentially invariant as you go to the limit.
01:18:01
But the problem is: how do you make this measurement in practice? For this big piece, as I said, we have this detector,
01:18:21
and we can move the detector around by the symmetry group, right? I have this detector, a finite measure space, and I can move it to a new position and take P times P, and I can do this many, many times, take the measurement, and then normalize, right?
01:18:44
And I see how many states I observe. But the point is, of course, when I do that (I cover this chunk of matter by these translates, measure the entropy of whatever I see, and it grows), I can miss some of the states. My measurement may be blind to some states;
01:19:02
I'm just missing them. So I can add another, more powerful detector and measure with both together, and then I may see more. And this is one of the main points: entropy is what you see, not what is in there. There is no intrinsic entropy of a physical system.
01:19:20
That notion is completely absurd, because as you look deeper and deeper it becomes infinite anyway, yeah? It depends on how many degrees of freedom you measure; for example, it depends on the energy scale. It's not the states which are there, but the states you can observe, the ones that exchange energy all the time. If they don't exchange, you don't see them, right? You only see a state when it passes from one to another,
01:19:42
energy being emitted or absorbed, right? And this, by the way, is again not so easy to formalize mathematically; my formalization is only partial. But at least you move away from naive set theory. So entropy does, in fact, depend
01:20:00
on the class of measurements. And if you allow not only measurements but protocols of measurements, there will be two categories instead of one, which I don't exactly know how to formalize, but it may give you a somewhat different theory. The essential point is how many of these P's you need, right?
01:20:25
And you say: ah, I'm in good shape, we have a complete system of detectors, if whenever I add a new one to the measurements I get the same entropy in the limit,
01:20:40
which means that if I take this Q, measure together with P, and repeat many, many times, my entropy, up to order n (it grows proportionally to the number of terms), will be the same as if I had no Q. So if I add this new apparatus, all I see is what I have already seen.
01:21:01
Of course, if I start hitting my physical system with much stronger energy, I will see more and more; but at a given level, I see the same. Mathematically, it means that my abstract measure space, as a functor, is determined by the morphisms to these spaces alone;
01:21:22
I don't need extra spaces. Of course, I may change the class; it depends on how I look at the system. This is exactly what I'm saying: there is no set; there are different functors, and different functors on the same physical system can give you different objects. And the point, in particular,
01:21:43
in the case of product spaces is that, by their very definition, for an infinite product space, P to the power of a countable set S,
01:22:00
the projections to finite subsets give you a full system of measurements: if I take joins of these and then add any other measurement, my entropy doesn't grow.
01:22:21
So when I measure entropy this way, it has nothing to do with any particular theory; it's just a general property, more or less the definition of the product. In classical measure theory it's called the Lebesgue density theorem; it's essentially the definition of the measure, and has nothing to do with the theory.
01:22:40
The moment I have this, I can measure my entropy, say in the case of the group Z: I just measure the entropy on a segment of length n. The space attached to this segment is
01:23:01
P to the power n, and for P to the n the entropy just multiplies by n; if you have Q, it will be Q to the n. If the two systems are isomorphic, these entropies must agree up to a boundary term, because the group is amenable: the only thing used here is that (n + i)/n goes to 1 as n goes to infinity.
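A sketch of this computation in formulas (my reconstruction of the argument as stated):

$$h\bigl(P^{\mathbb{Z}}\bigr) \;=\; \lim_{n \to \infty} \frac{\operatorname{ent}\bigl(P^{\{1,\dots,n\}}\bigr)}{n} \;=\; \lim_{n \to \infty} \frac{n \,\operatorname{ent}(P)}{n} \;=\; \operatorname{ent}(P),$$

and an isomorphism commuting with the shift matches a window of length n on one side with a window of length n + i on the other, up to a boundary of bounded width i; so the normalized entropies agree in the limit precisely because (n + i)/n goes to 1.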
01:23:25
And this is the entire mathematical part of the Kolmogorov theorem; everything else is just this categorical language. In a way, the mathematical structure enters, amazingly, through this amenability of the group Z, and the statement is true for all amenable groups.
01:23:40
This is the theorem of Kolmogorov. And if you look a little closer, you see immediately what it shows, and this is an interesting story. If you have an amenable group, and you have P and Q and just a morphism like that:
01:24:02
sorry, I wrote it in the wrong order. You have the spaces P to the power gamma and Q to the power gamma, where gamma is an amenable group, and a factor morphism from the first onto the second; then the entropy of P is greater than or equal to the entropy of Q.
01:24:25
In particular, if there is an isomorphism, the entropies are equal, provided the group gamma is amenable. And that is how the story stood until relatively recently, when it was observed by Weiss and (I keep forgetting the second name) Ornstein
01:24:41
that this is not true for free groups: for a free group it is easy to construct such morphisms which actually increase entropy. Therefore it was believed that the Kolmogorov theorem is also false there. And then, a couple of years ago, it was proven by Bowen, for free groups and for a larger class of groups
01:25:01
which I'm going to describe now, that it is nevertheless true. What fails is monotonicity: entropy is not monotone under surjections, but it is still an invariant of isomorphism, and this for the reason of injectivity
01:25:21
of certain auxiliary maps, not of surjectivity. The proof is technical, and I will not explain it. Actually, in his latest papers he has a more transparent proof, which I haven't read; his first proof was a mess. I mean, the idea is clear, but there is indeed a combinatorial computation you have to carry out.
01:25:41
The point is that there is a class of groups for which P to the gamma isomorphic to Q to the gamma does imply that the entropies are equal. And so now
01:26:01
I want to say a few words about these groups. They include the free groups, all subgroups of linear groups, and in general what are called residually finite groups. Let me define this in a suitable language,
01:26:23
just to give the idea of where the non-standard analysis enters. First, what is a residually finite group from this perspective? These are groups gamma which
01:26:40
act faithfully and isometrically on a compact metric space, if I'm not mistaken. This is not the usual definition, but it is the one suitable for my purpose.
01:27:10
The logic, the way you have to think about it, is that when X is a compact space, the group acts approximately on a finite set; because if you cover X by finitely many small pieces,
01:27:23
you get an approximate action on a finite set, and with a certain effort you can make it an actual action on a finite set. So residually finite groups are the ones whose actions on finite sets fully disclose the nature of the group. The groups relevant here are called sofic groups,
01:27:45
which, by a definition that I shall elaborate in a second and explain in simple terms (and it is a very good one), are, roughly, groups of isometries of non-standard compact spaces.
01:28:02
So you have to say what a non-standard compact space is. A non-standard compact space is a compact space in the category of non-standard metric spaces, which means the objects you obtain when you take a non-standard model of the language of the real numbers.
01:28:20
You keep real numbers in your statements, so the metric is still understood as real-valued, but the spaces are non-standard. The idea behind this is that sofic groups also act approximately on finite sets, but in a way different from the one described
01:28:45
here. So let me give the definition, due to Weiss, of sofic groups. It is probably quite a restrictive condition:
01:29:01
all amenable groups are sofic, all residually finite groups are sofic, projective limits of sofic groups are sofic, and likewise residually amenable groups are sofic. There are more examples, but we don't know whether all groups are sofic; it's very unlikely that they all are. And the definition is quite simple.
01:29:24
If you decipher the non-standard language, to which I was already referring in a categorical sense, it means the following. Again, it's convenient to have one infinitely large number; this number, of course, really stands for a sequence going to infinity, but imagine you have one huge, huge number,
01:29:44
finite but large. You see, this boundary between finite and infinite is not so clear; it's just mathematical formalism, and we all think we know what is finite and what is infinite. Let me give a typical example
01:30:05
of a number of which you cannot say whether it is finite or infinite. Consider, say, the maximal running time of a Turing machine with a program of, say, 2,000 bits, or a million bits, whatever.
01:30:25
Fix whatever model of computer with an infinite tape you like. You write a program, and it may run forever or it may stop. Consider only those which stop, see how long each takes, and take the supremum over all programs of a given, sufficiently large, length.
01:30:41
From some point on it is immaterial how long the programs are, because all these numbers, for all practical purposes, are infinite: they are finite in one language and infinite in another. This is Gödel's theorem showing that our naive picture of finite numbers is not adequate. So this is exactly the kind of number you take,
01:31:01
a huge number like this, and a finite set of this cardinality. The group acts on it approximately, meaning that the elements of a given finite subset of the group are represented by transformations which are partially defined and which partially satisfy the relations, such that the number of points
01:31:20
where they fail to define an action, divided by the total number of points, is infinitesimally small; when you go to the limit, it becomes small. So they act up to a counting error. This, of course, depends on your non-standard model, and for every such model Bowen introduced his entropy.
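A toy illustration of counting the failure set of such an approximate action; everything here (the partial shift on {0, ..., N-1} standing in for the generator of Z, the sampling) is made up purely for illustration:

```python
N = 10**6  # a huge finite cardinality standing in for the non-standard number

def shift(k):
    """The generator of Z as a partial transformation, undefined at the last point."""
    return k + 1 if k < N - 1 else None

def power(k, m):
    """Apply the partial shift m times, propagating undefinedness."""
    for _ in range(m):
        if k is None:
            return None
        k = shift(k)
    return k

def defect(m, sample):
    """Fraction of sampled points where the word shift^m fails to act."""
    fails = sum(1 for k in sample if power(k, m) is None)
    return fails / len(sample)

sample = range(0, N, 997)        # a coarse sample of the space
for m in (1, 10, 1000):
    print(m, defect(m, sample))  # roughly m / N: infinitesimal relative to N
```

The failure set of the word of length m has size m, so its proportion m/N is infinitesimal as long as N is "infinitely large" relative to the words considered.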
01:31:43
In the case of Bernoulli shifts these entropies all coincide with the usual entropy, but the proof of this is not easy, even for free groups, even for residually finite groups. The proof is computational: unlike for the Kolmogorov theorem, you really have to count how many combinations there are, how they match, et cetera.
01:32:03
Now I want to justify the log: why is there a log? But first, back to this problem. Even if it's true for all sofic groups,
01:32:21
it's unknown whether some non-sofic group violates the property, and it's unknown whether all groups are sofic. So it's quite unclear what happens in general with this question of whether isomorphism implies equality of entropy. It looks like a nice, simple question,
01:32:41
because it's so general. So I repeat: we have infinite products of finite measure spaces, indexed by gamma, with the group acting; you consider two such spaces and ask whether a measure-theoretic isomorphism commuting with the group action
01:33:00
implies that the entropies are equal. This is wide open for general countable groups. It is, I think, a very nice question, because it's so simple, and it looks quite difficult. The two major results here are due to Kolmogorov and to Bowen.
01:33:29
By the way, for amenable groups, and in particular in the classical case, one knows that the converse is also true; it is a theorem of, what is his name,
01:33:46
the theorem that if the entropies are equal, then the systems are isomorphic. I always keep forgetting his name; it just came to my mind and then disappeared.
01:34:03
And this converse part is actually; well, the theorem was originally much harder. Ornstein, yes, exactly: it is the Ornstein theorem, with a much more elaborate proof than Kolmogorov's. But once it was proven, by a rather general argument
01:34:21
it extends to other groups rather effortlessly: whenever the group contains, say, a free abelian subgroup, it already holds. It turns out as one would expect, yeah; harder to prove while having much less structure inside.
01:34:41
But in general this remains unknown. Now about the log. This actually has a certain relation to mathematical biology, which, I must admit, I don't know well; but the concept I'm going to describe originates in mathematical biology.
01:35:01
Several names are attached to it, but the major player here was Fisher. So, this is about the log in the Shannon inequality. If you think a little about the Shannon inequality,
01:35:21
it transforms rather easily into the fact that entropy is a concave function of the variables p_i. So if you have p_0, ..., p_n (I hate this notation), you think of them as a point of the n-simplex,
01:35:42
the unit simplex: positive numbers with sum equal to 1. So entropy is a function from the simplex to the positive reals.
01:36:09
And the Shannon inequality is more or less equivalent (when you say equivalent, there is still a simple reduction, though in certain contexts it really is equivalent) to this function being concave.
01:36:23
Entropy is concave; minus entropy is convex. I prefer the convex function, even though it is negative; but it's immaterial which of the two you take.
01:36:42
What is essential is that the Hessian is sign-definite, so you can always make it positive: it defines a positive definite quadratic form on the simplex, the Hessian of this entropy. A positive definite quadratic differential form is a metric: a Riemannian metric.
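The computation behind this, in coordinates (a standard reconstruction): for the convex function given by the sum of p_i log p_i,

$$\frac{\partial^2}{\partial p_i\,\partial p_j} \sum_k p_k \log p_k \;=\; \frac{\delta_{ij}}{p_i}, \qquad\text{so}\qquad ds^2 \;=\; \sum_i \frac{dp_i^2}{p_i},$$

which is, up to normalization, the Fisher information metric on the simplex.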
01:37:04
You may ask: what kind of metric is this? Is it complete or not, positively curved or negatively curved; what kind of metric? From my own experience: when I thought about it,
01:37:22
I asked somebody, and he said: this is called the Shahshahani metric. From that moment on you can start searching the internet. Shahshahani is a mathematician who introduced this metric, in the context of population genetics, about thirty years ago.
01:37:43
When I looked into it, there were about ten other names attached to it; people kept rediscovering this metric and discussing it. I think the first was Fisher, in something like 1923,
01:38:00
maybe. And still people just couldn't figure out what kind of metric it was. But then I read somebody who had apparently discovered the metric himself and was annoyed: why is it called the Shahshahani metric? It's just the spherical metric.
01:38:22
So the simplex with this metric is isometric to a spherical simplex: take the part of the sphere in the positive cone; the simplex is isometric to it with its spherical metric. So the Hessian of this function, the sum of p_i log p_i: why should it have constant curvature?
01:38:43
It's incredible that this can be, by the way. What is so remarkable: the sphere has such a high degree of symmetry. The simplex only has the symmetry of the permutation group; the sphere has the symmetry of the orthogonal group. How could it be? And what is the map, the isometry between them? It is not the radial projection;
01:39:01
if you take the radial projection, it is not an isometry. The map is what one may call the Archimedes map: the coordinates x_i go to x_i squared. Then the points of the sphere, where the sum of squares is 1,
01:39:22
go to the simplex, where the sum is 1. So this maps the sphere to the simplex. What does it mean to be on the sphere? It means the sum of the x_i squared equals 1. What does it mean to be on the simplex?
01:39:41
Calling the new coordinates y_i, it means the sum of the y_i equals 1. And this map does the job, almost obviously: if you make the computation, you immediately get an isometry up to a constant factor of 2. And this map is what is called the real part of the moment map.
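The computation is indeed one line: with p_i = x_i^2 one has dp_i = 2 x_i dx_i, hence

$$\sum_i \frac{dp_i^2}{p_i} \;=\; \sum_i \frac{4\,x_i^2\,dx_i^2}{x_i^2} \;=\; 4 \sum_i dx_i^2,$$

so the Archimedes map pulls the Hessian metric on the simplex back to four times the round metric on the positive part of the sphere: an isometry up to the factor 2 in lengths.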
01:40:01
The moment map is a very remarkable map in mathematics, going back to Archimedes, who discovered that when you project the sphere to the interval, the map is measure-preserving. In fancier terms: for complex numbers, the map from z to z times z-bar,
01:40:25
from C to the positive reals, is also measure-preserving. These maps are not one-to-one; they don't preserve density, but they preserve measure, in the category of measure spaces. So this is the map.
01:40:41
And the immediate conclusion, or at least the immediate temptation, is that entropy must be invariant under the group of orthogonal transformations. This entropy was introduced by von Neumann. So how is it defined?
01:41:02
What are the objects on which it is defined? You have to redo everything about measure spaces, but in a way that is orthogonally invariant. So let's do that.
01:41:20
I'll say it in a slightly fancy way, but it's kind of nice, and the key points to the spirit of quantum mechanics. What I want to say is that this Boltzmann formula by itself implies the existence of quantum mechanics; mathematically, that is.
01:41:41
And I see no rational explanation for that. Why should statistics, and a formula which came from nothing but the formalism of the law of large numbers (that is the only mathematics there; the rest is language), why should the law of large numbers imply the existence of orthogonal symmetry in space?
01:42:02
You see, if it were the normal law, we would kind of understand it. But why the law of large numbers? That, for me, is unclear. However, once this has been said, how do we define entropy in this Euclidean setting?
01:42:25
A measure on a finite set assigns a number to each subset, additively under disjoint union. What will this become when you
01:42:41
replace sets by Hilbert spaces? The formalism of von Neumann, the formalism of quantum mechanics, is just this: instead of sets, say finite sets, you put finite-dimensional Euclidean spaces, and instead of infinite sets, infinite-dimensional spaces. You may do it over the real or over the complex numbers;
01:43:01
mathematically it makes no difference: in this particular case, if you know it for the reals you know it for the complexes, and vice versa. So instead of a finite set you have a Hilbert space, and you want to introduce something like a measure. But instead of measuring subsets you measure subspaces, and to each subspace you want
01:43:21
to assign a number. The condition of additivity is: if two subspaces are orthogonal, the numbers add on their sum. And the whole thing acquires, a posteriori, a certain degree of symmetry.
01:43:40
The way to do it is as follows: it must be positive, it must be additive, and if you insist on these properties there is essentially only one kind of solution. It is this: you take a positive definite quadratic form and normalize its trace to be 1, the sum of the eigenvalues equal to 1.
01:44:01
Then you measure subspaces using this form, and this is your measure. These are called, in physics, density matrices. Usually they live on a complex Hilbert space, but real is as good for us as complex. So: a positive definite matrix with trace 1.
01:44:20
Given a subspace, you restrict the form to it and take the trace there; this is the mass of the subspace. It is additive by the Pythagorean theorem; everything here, of course, is the Pythagorean theorem. The point is that the additivity of cardinalities is replaced by the Pythagorean theorem.
01:44:42
All this formalism of Hilbert spaces is just an unfolding of the Pythagorean theorem, and additivity is the essential property. Actually, the Archimedes theorem, the preservation of measures,
01:45:03
is also obtained by integrating the Pythagorean theorem. So one gets a fundamental extension of additivity to linear algebra. And what, then, will be the entropy?
01:45:22
First, the naive definition of the entropy. The objects replacing measure spaces are the positive definite quadratic forms with trace, the sum of the eigenvalues,
01:45:43
equal to 1, the total mass equal to 1: the so-called density matrices. In particular, as a manifestation of the Pythagorean theorem, if you take any orthonormal frame and sum the values of the form on the frame vectors, the result does not depend on the frame.
01:46:03
Again, it is exactly because of orthogonality that everything adds up independently of the frame. This additivity is very strong: it implies you can decompose in many ways and always get the same number; much more symmetry than for ordinary
01:46:23
measures. So what would the entropy be? Von Neumann defines it as follows: you have the eigenvalues, and you say that these are your weights;
01:46:41
it will be the sum of lambda_i log lambda_i, with a minus sign. You just diagonalize the matrix, it becomes a finite measure space, and you take its entropy as a measure space. Then you want to prove the basic properties, the Shannon-type inequalities,
01:47:01
needed to develop the theory, and here they become not completely obvious. That is interesting in itself. So let me show you one way to formulate it.
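A minimal numerical sketch of the definition and of the orthogonal invariance (the helper names are ad hoc):

```python
import numpy as np

def vn_entropy(rho):
    """von Neumann entropy: -sum lambda_i log lambda_i over the eigenvalues of rho."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return -np.sum(lam * np.log(lam))

def random_density(n, rng):
    """A random positive definite quadratic form normalized to trace 1."""
    a = rng.standard_normal((n, n))
    rho = a @ a.T
    return rho / np.trace(rho)

rng = np.random.default_rng(1)
rho = random_density(4, rng)

# Conjugating by an orthogonal matrix changes the frame but not the eigenvalues,
# so the entropy is invariant under the orthogonal group.
u, _ = np.linalg.qr(rng.standard_normal((4, 4)))
print(vn_entropy(rho), vn_entropy(u @ rho @ u.T))  # equal up to rounding
```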
01:47:36
The point is that in quantum mechanics, and you see this also mathematically,
01:47:47
there is no notion of passing from a big system to a small system, right? In classical physics we actually had our big system; you make a measurement, and you pass to the smaller system. But in quantum mechanics this doesn't quite work,
01:48:02
because the big system is not assembled in the classical way. Classically you have a big product system and you can project it to the small one; these are the reduction maps. In quantum mechanics the corresponding construction is the tensor product: the quantum mechanical object corresponding to a product A times B is
01:48:22
the tensor product of the spaces of functions on A and on B. And then the projection simply does not exist; you cannot reduce. Physically this has some meaning: I suppose you could say the quantum universe is not divisible.
01:48:43
You cannot have an isolated quantum mechanical system; the thing just doesn't exist. Whatever the physical meaning, this is what they say, and mathematically it means exactly that. Actually, remember, we had this in the Mendelian formalism, because there the spaces came with additional structure:
01:49:02
they were spaces of functions, and you could integrate along the fibers with a particular functional, which allowed the projection. Here, in principle, this is not allowed. What you can do, however, is symmetrize the states, yeah?
01:49:21
Your states, your probability distributions: if a group acts on the space by orthogonal transformations, you can apply the group and average, and you get something. This is systematically what happens in the world; you have this averaging. And it is almost the same as reduction in the classical case:
01:49:45
there, if you have this bunch of numbers and want to project it down, that projection is prohibited in the quantum mechanical world. However, imagine a group acting, permuting everything within the fibers.
01:50:00
If you average, the states become constant along the fibers, and that is as good as the projection, right? You can pass from one to the other; all formulas translate instantly. So, with this picture in mind, we can ask about the Shannon inequality when you apply this averaging,
01:50:21
and here is the basic inequality, the Shannon inequality of this setting: how entropy behaves under averaging. It was proven by Lanford and Robinson some forty or fifty years ago.
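What the statement says in its simplest instance can at least be checked numerically: averaging a state over a group acting by orthogonal transformations does not decrease the von Neumann entropy (a consequence of concavity). The demo below reuses vn_entropy, random_density, and rng from the previous sketch, with cyclic permutation matrices as the group:

```python
n = 4
rho = random_density(n, rng)

# The cyclic group Z/nZ acting by permutation (hence orthogonal) matrices.
P = np.roll(np.eye(n), 1, axis=0)
group = [np.linalg.matrix_power(P, k) for k in range(n)]

rho_avg = sum(u @ rho @ u.T for u in group) / len(group)
print(vn_entropy(rho), "<=", vn_entropy(rho_avg))  # entropy increases under averaging
```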
01:50:41
If you look at what happens in the classical example, it becomes the Shannon inequality. But here there is a condition: commuting is fine, but the representations must be irreducible. You may ask why irreducibility is needed. The proof is not terribly difficult, not terribly easy; still relatively simple.
01:51:01
It was proven, and then a conjecture was formulated. They stated it in a more traditional physical language, but in the terms I'm using, the irreducibility was obviously an annoying condition, and the conjecture said you should not need it. This corresponds essentially,
01:51:21
in slightly different language, to the following: they had the absolute Shannon inequality, and you want the relative one, for morphisms, because in this setting there are no morphisms. The trouble with reducibility is not that there are many components, but that some components occur with multiplicity,
01:51:40
and the multiples get completely mixed up: in a reducible representation one piece may appear many, many times, and this is what happens in these examples. Then there are too many symmetries, and the picture is genuinely different. And it was then proven, five years later, by
01:52:00
Lieb and Ruskai, that the inequality holds without this condition, and this is the basic fact here. One way to see the proof is exactly along the line I described: you redefine entropy the same way we did for finite sets.
01:52:21
You have a Hilbert space with a matrix Q on it, and you take its N-th tensorial power, where N is again a non-standard, infinitely large number. You reformulate everything in these terms, and again, by the law of large numbers,
01:52:41
this matrix becomes quasi-homogeneous. The "quasi" is a slightly annoying point: it means that in the limit, up to approximation, the matrix has the following nature. You have a homogeneous object, proportional to the original scalar product, all eigenvalues equal,
01:53:01
composed with the projection to a subspace. The presence of the projection means it is not the whole space: there is a kernel, a subspace where the form vanishes, so the form is semi-definite rather than definite. This is a small point which makes the argument slightly trickier. However, if you simply assume that your objects,
01:53:22
before and after averaging, are all quasi-homogeneous, the inequality becomes a kind of tautology, the same way as for sets; for a different reason, but you make the computation and it is obvious. The only thing to check is that this is really correct:
01:53:40
that the operation is sufficiently functorial to survive the limits. What was important in the classical case is that the Bernoulli theorem holds not only for objects but for morphisms, yeah? So you could pass coherently to the homogeneous limit. Here too you have to show you can pass coherently.
01:54:03
This coherence: well, one has to say exactly what it is, and that takes a couple of pages, on the one hand. On the other hand, the traditional proof is done by what is now called matrix convexity
01:54:21
(the term appeared later): the whole theory of convexity, in particular of convex functions, has a counterpart for operators. It is a different kind of convexity, stable under tensor products, about which, whatever has been said, we still do not know too much. But this is again an interesting,
01:54:41
interesting enough thing, yeah. Okay, so this is the end of the story about entropy.