
Information Biology - Investigating the information flow in living systems

Formal Metadata

Title: Information Biology - Investigating the information flow in living systems
Subtitle: From cells to dynamic models of biochemical pathways and information theory, and back.
Number of Parts: 165
License: CC Attribution 4.0 International: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.

Content Metadata

Abstract: How to apply Shannon's information theory to biology.
Transcript: English (auto-generated)
What happens if you mix Shannon's information theory and biological systems?
A dish better served hot. Please welcome our computational systems biology chef, who will guide you through investigating the information flow in living systems. Please welcome with a warm round of applause Jürgen Pahle.
Thanks a lot and thanks for having me. It's great that so many of you are interested in that topic, which is not about technical systems but actually biological cells.
So I'm leading a group in Heidelberg at the university there and we are mostly interested in how information is processed, sensed, stored, communicated between biological cells.
And we are interested in that because it's not obvious that they actually manage to do that in a reliable fashion. They don't have transistors; they can only use their molecules, mostly proteins, big molecules that act as little engines or motors in the cell
that allow them to fulfill their biological functions. If information processing fails in cells, you get diseases like epilepsy, cancer and of course others. Now, cellular signaling pathways have been studied in some detail,
mostly single pathways, more and more also networks of pathways, but surprisingly little conceptual work has been done on them. So we know the molecules that are involved, we know how they react,
how they combine to build these pathways, but we don't know how actually information is transferred or communicated across these pathways. And we intend to fill that gap in our group and of course first we have to model these networks,
we have to model these biochemical pathways and this is how we proceed. So you have a cell, you can't see that here, but on the upper left corner you have that scheme of a cell
with all the different components, you have volumes in this cell where chemical reactions happen. So chemical reactions take biochemical species, ions, proteins, what have you, and they convert them into other chemical species and these reactions happen in the different compartments.
Now it's very important to assign speeds or velocities to these reactions because these speeds determine how fast the reactions happen and how the dynamic behavior then results. And once you have done that you can translate all of that into a mathematical model like the one shown here on the right.
This is an ordinary differential equation system, I don't want to go into detail, I only have like two or three formulas that might be interesting for you. So this is just any mathematical model you have of these systems and then you can start analyzing them.
You can ask questions like how does the system change over time, that's simulation, which parts influence the behavior most, what are the stable states, do you have oscillations, do you have a steady state and so on. Now you don't have to do that by hand because we are actually also developing software,
that's just another thing, I guess you know that all models are wrong, we try to build useful ones. So I said you don't have to do this by hand because we are also into method development and we are building scientific software.
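As a toy illustration of what such an ordinary differential equation simulation boils down to, here is a minimal sketch, with a made-up reaction A -> B and a hypothetical rate constant rather than one of the speaker's models, integrated with explicit Euler:

```python
# Minimal ODE sketch (hypothetical reaction and rate constant, for
# illustration only): A -> B with mass-action rate k, explicit Euler.
import math

def simulate(a0=100.0, k=0.5, dt=0.001, t_end=4.0):
    """Integrate dA/dt = -k*A, dB/dt = +k*A from t=0 to t_end."""
    a, b = a0, 0.0
    for _ in range(int(t_end / dt)):
        flux = k * a * dt      # amount of A converted in this step
        a -= flux
        b += flux
    return a, b

a, b = simulate()
exact = 100.0 * math.exp(-0.5 * 4.0)  # analytic solution for A(t_end)
print(a, exact)  # the numerical result tracks the analytic decay
```

Real tools like COPASI use adaptive stiff solvers instead of fixed-step Euler, but the principle is the same: reaction speeds determine the fluxes, and the fluxes determine the dynamics.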
One of the software packages we build is called COPASI, the Complex Pathway Simulator. It's free and open source; you can all go to that website, download it, play around with it if you want. Because we also run more demanding computations which we send to compute clusters, we also developed a scripting interface for COPASI
which is called CoRC, the COPASI R Connector, and this allows you to use the COPASI backend, with all the different tools that are in COPASI, from your programming environment, and then you can build workflows and send them to the compute cluster.
We think it's easy to use; if you play around with it and get stuck, then just let me know. So this is software you can use and play around with. Where do we get the models? Well, there is a model database called BioModels.net, also free to use; you can go there and download models. At the moment they have almost 800 different manually curated models
and almost ten times that number that are built automatically. You can just download them in the so-called SBML format, the Systems Biology Markup Language, then import them into COPASI or other software and play around with them.
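Since SBML is plain XML, you can peek at a model's structure with nothing but a standard-library parser. The snippet below uses a hypothetical two-species toy model inlined as a string; a real workflow would use libsbml or COPASI's importer:

```python
# Format illustration only: parse a tiny, made-up SBML fragment and
# list its species ids with the standard library's XML parser.
import xml.etree.ElementTree as ET

SBML = """<?xml version="1.0" encoding="UTF-8"?>
<sbml xmlns="http://www.sbml.org/sbml/level3/version1/core" level="3" version="1">
  <model id="toy_calcium">
    <listOfSpecies>
      <species id="Ca_cytosol" initialConcentration="0.1"/>
      <species id="Ca_ER" initialConcentration="10.0"/>
    </listOfSpecies>
  </model>
</sbml>"""

ns = {"sbml": "http://www.sbml.org/sbml/level3/version1/core"}
root = ET.fromstring(SBML)
species = [s.get("id") for s in root.findall(".//sbml:species", ns)]
print(species)  # -> ['Ca_cytosol', 'Ca_ER']
```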
Okay, so coming back to biology, one of our favorite systems is calcium signaling. And calcium signaling works roughly like this, you have these little, I mean the oval thing is a cell, then you have these red cones that are hormones and other substances
that you have in your bloodstream or somewhere outside the cell. They bind to these black things, which are receptors on the cell membrane, and then a cascade of processes happens that in the end leads to an influx of calcium ions,
these blue balls, from the ER, which is not the emergency room but the endoplasmic reticulum, one of the compartments in the cell, into the main compartment, the cytosol of the cell. Calcium also streams into the cell from outside. And this leads to a sharp increase of the concentration of calcium
until it's pumped out again. There are pumps that take calcium ions and remove them from the cytosol and pump them out of the cell and back into the ER. This is very important because calcium is a very versatile second messenger, that's what they call it.
It regulates a number of very important cellular processes. If you move, your muscle contraction is regulated by calcium, learning, secretion of neurotransmitters in your brain, fertilization. A lot of different things are regulated by calcium
and if you simulate the dynamic processes, you get behavior like that. Here you can see it oscillates, it shows these regular spikes, so this is the calcium concentration over time. Now, if you actually measure this in real cells and this is data measured by collaboration partners of mine in England,
you see it's not that smooth, you get these differences in amplitude of the peaks, you get secondary spikes, you get fluctuations around the basal level and this is because you have random fluctuations in your system.
Intrinsic random fluctuations that are just due to random fluctuations in the timings of single reactive events, single reactions, biochemical reactions that happen. In order to capture this behavior, because this behavior is important, that can hamper reliable information transfer,
we have to resort to special simulation algorithms, for example, the so-called Gillespie algorithm. If you do that and apply it to the calcium system, you can see you can actually capture the secondary peaks and all the different other fluctuations you have in there. Now, this is just a Monte Carlo simulation.
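The Gillespie algorithm itself fits in a few lines. This is a toy sketch for a simple birth-death process with made-up rate constants, not the calcium model: draw an exponential waiting time from the total propensity, then pick which reaction fired in proportion to its propensity.

```python
# Toy Gillespie SSA sketch (hypothetical rates): birth 0 -> A at rate
# kb, death A -> 0 at rate kd * n, simulated event by event.
import random

def gillespie(kb=10.0, kd=0.1, n0=0, t_end=200.0, seed=1):
    random.seed(seed)
    t, n, trace = 0.0, n0, []
    while t < t_end:
        a1, a2 = kb, kd * n          # propensities of the two reactions
        a0 = a1 + a2
        t += random.expovariate(a0)  # exponential waiting time to next event
        if random.random() < a1 / a0:
            n += 1                   # birth event fired
        else:
            n -= 1                   # death event fired
        trace.append(n)
    return trace

trace = gillespie()
mean = sum(trace) / len(trace)
print(mean)  # fluctuates around the deterministic steady state kb/kd = 100
```

Every single reaction event is simulated, which is exactly why such runs are expensive and get farmed out to compute clusters.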
I say just, it's really time consuming and demanding because you have to calculate each and every single reactive event in the cell and that takes a lot of time, that's why we do that on a compute cluster. I told you already that calcium is a very versatile second messenger, so you have very many different triggers of a calcium response in the cell,
things that lead to a certain calcium dynamics and on the other hand downstream, calcium regulates many different things. So you have this hourglass or bow-tie structure, and that's why people have speculated about the calcium code. How can it be that the proteins, I should go back,
that actually do all these cellular functions, these green cylinders that bind calcium and are then activated or inhibited by it, how can it be that they know which stimulus
or which hormone is outside of the cell? They don't see them because there is a cell membrane around the cell, around the cytosol. So people have speculated, is there an information encoded in the specific calcium waveform? Is there calcium code?
And how can it be that the proteins actually decode that code? It's fairly established that calcium shows amplitude modulation, so the higher the amplitude of calcium, the more active get some proteins. It also shows frequency modulation,
meaning the higher the frequency of the calcium oscillations, the more active get some proteins. But maybe there are other information carrying features in the waveform, like duration, waveform, timing and so on. Now a doctoral student in my group, Arne Schoch, has looked into frequency modulation
and he actually showed that there are proteins, in that case NFAT, which is the nuclear factor of activated T cells, which are important in your immune system. They only react to calcium oscillations of a certain frequency. So they get activated in a very narrow frequency band, and that's why we call it bandpass activation.
Okay, so I guess you all know signaling speeds of technical systems, they're fairly fast by now. One of our results, because we quantify actually information transfer, is that calcium signaling operates at roughly 0.4 bit per second.
If you compare that to technical systems, that seems very low, but maybe that's enough for all the functions that a cell has to fulfill. So how did we arrive at this result? Well, we used information theory, classical information theory, pioneered by people like Claude Shannon in the 40s,
also by Hartley, Tukey and a few other people. So they looked at technical systems and they have this prototypical communication system where there is an information source on the left side; then this information is somehow encoded, it's transmitted over a noisy channel where the message is scrambled,
then it's received by receiver, decoded, and then hopefully you get the same message at the destination that was chosen at the information source. And in our case, we look at calcium as an information source and we study how much information is actually transferred to downstream proteins.
How do you do that? Well, information theory 101, information theory primer. In statistical information theory of the Shannon type, you look at random variables. You look at events that have a certain probability of happening.
So let's say you have an event that has a probability of happening, and then Shannon said that the information content of this event should be the negative logarithm, which is shown here.
The curve on the right-hand side should be the negative logarithm of the probability, meaning that if an event happens all the time, and I will show you an example later, there is no information content. The information content is zero. There is no surprise if that event happens, because it happens all the time.
It's like a sunny day somewhere in the desert. However, if you go to lower probabilities, then the surprise becomes bigger and the information content rises. Now, in a system, you have several events that are possible, and if you take the average uncertainty of all possible events,
you get something that Shannon called entropy. This is still not information, because information is a difference in entropy. So you have to calculate the entropy of a system, and then you calculate the entropy that is remaining after an observation, say, and this difference is the information gained by the observation.
Now, coming to a simple example, let's say we have a very simple weather system where you can only have rainy and sunny days, and let's say they are equally likely. So you have a probability of one half. For each of them, the average of the negative logarithm is one.
So you gain, when you observe the weather in this system, you gain one bit per day. You can also think of bits as the information you need, or a cell needs, to answer or decide on one yes or no question. Now, if it's always sunny and no rain,
then you get zero information content or uncertainty. The average is zero, so you don't get any information if you observe the weather in the desert, say. With an 80-20 split you get a certain number of bits per day,
in that case about 0.72 bit per day, and you can do that for Leipzig. In that case, Leipzig has 99 rainy days per year, according to the Deutscher Wetterdienst, and this gives you an information of 0.84 bit per day. You can do it in a general way,
so let's say you have one event with a probability of p and another event with a probability of 1-p, and then you get this curve, which shows you that the information content is actually maximal if you have maximal uncertainty, if you have equally likely events.
If you have more possible events, in that case, four different ones, sunny, cloudy, rainy and thunderstorm, you get two bit, and this is because of the logarithm, so if you have double the amount of events and they're equally likely, you get one bit more. Hope I didn't lose anyone.
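The weather numbers above can be checked directly from Shannon's formula:

```python
# Recomputing the talk's weather entropies in bits: H = -sum p*log2(p).
import math

def entropy(probs):
    """Shannon entropy in bits, skipping zero-probability events."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))         # 1 bit per day for 50-50 weather
print(entropy([1.0, 0.0]))         # zero: no surprise in the desert
print(entropy([0.8, 0.2]))         # ~0.72 bit per day for an 80-20 split
print(entropy([99/365, 266/365]))  # Leipzig, 99 rainy days: ~0.84 bit/day
print(entropy([0.25] * 4))         # four equally likely outcomes: 2 bits
```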
Now, we're always looking at processes, dynamic things, things that change over time, and if we look at processes, we have to look at transition probabilities, so we have to change probabilities to transition probabilities, and you can summarize them in a matrix, so let's say if we have a sunny day today,
it's more likely that it's also sunny tomorrow and less likely that it's raining, maybe only 25%, and if it's rainy today, you can't tell, it's equally likely. And these processes are also called Markov Process.
Markov was a Russian mathematician, and you have them everywhere. These Markovian processes are used in your cell phones, in your hard drives, they're used for error correction, the PageRank algorithm of Google is one big Markov Process. Okay, so you're using them all the time,
nothing technological would work nowadays without them. Now, because we have knowledge about today's weather, the uncertainty about tomorrow's weather decreases, so now we have an entropy rate instead of an entropy,
and the difference is, again, the information you gain by today's weather. So you can do the maths in our example, the entropy would be 0.92 bit per day, and the entropy rate, given that you know today's weather,
is less, it's 0.87 bit per day. Now, to complicate things a bit more, maybe we also look at a second process, in that case, air pressure, and you can measure air pressure with these little devices, the barometers, and maybe, if it's sunny today, and the air pressure is high, in 90% you get a sunny day tomorrow,
and only in 10% of the cases you get a rainy day, and so on, you can go through the table. In our case, I looked it up yesterday, we had high air pressure and it was raining, so in our little model system, it would mean that it's sunny today.
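The 0.92 versus 0.87 bit figures can be verified from the transition probabilities given above, P(sun tomorrow | sun today) = 0.75 and P(sun | rain) = 0.5:

```python
# Entropy vs entropy rate of the two-state weather Markov chain.
import math

def h2(p):
    """Binary entropy in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Stationary distribution: pi_sun = 0.75*pi_sun + 0.5*(1 - pi_sun) => 2/3
pi_sun = 2 / 3
entropy = h2(pi_sun)                               # ~0.92 bit per day
rate = pi_sun * h2(0.75) + (1 - pi_sun) * h2(0.5)  # ~0.87 bit per day
print(entropy, rate, entropy - rate)  # difference = information from today's weather
```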
Now, I told you information is a decrease in uncertainty, now how much information do we get by the barometer, by knowing the air pressure? And this is the difference in uncertainty without barometer, and with the barometer, in our case, we have to assume that the probability of high and low air pressure is the same,
and we get 0.39 bit per day, that we gain by looking at the air pressure. Now, what does that have to do with biological systems? Well, we have two processes, we have a calcium process that shows some dynamics, and we have the process of an activated protein that does something in the cell.
So we can look at both of these, and then calculate how much information is actually transferred from calcium to the protein, how much uncertainty do we lose about the protein dynamics if we know the calcium dynamics. And this is mathematically exactly what we are doing, and this is called transfer entropy,
it's an information theoretic measure developed by Thomas Schreiber in 2000. There are some practical complications that we are working on, and this is what we are actually using for the calculations. So in our case, we have data from experiments,
or we use models of calcium oscillations, and then we couple a model of a protein to these calcium dynamics. This gives us time courses, both of calcium and protein, stochastic time courses, including the random fluctuations, and then we use the information theoretic machinery to study them.
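A minimal plug-in estimator for transfer entropy on binary time series looks like the sketch below (history length 1, simple counting, and none of the bias corrections that real sparse, noisy data needs):

```python
# Toy plug-in estimator of transfer entropy TE(X -> Y) for binary
# series: TE = sum p(y1,y0,x0) * log2[ p(y1|y0,x0) / p(y1|y0) ].
from collections import Counter
import math, random

def transfer_entropy(x, y):
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_{t+1}, y_t)
    singles = Counter(y[:-1])                       # y_t
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_full = c / pairs_yx[(y0, x0)]             # p(y1 | y0, x0)
        p_self = pairs_yy[(y1, y0)] / singles[y0]   # p(y1 | y0)
        te += p_joint * math.log2(p_full / p_self)
    return te

random.seed(0)
x = [random.randint(0, 1) for _ in range(20000)]
y = [0] + x[:-1]  # y copies x with a one-step delay
te = transfer_entropy(x, y)
print(te)  # close to 1 bit: knowing x removes all uncertainty about y's next step
```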
And some of our results I want to show you. For example, if you increase the system size, if you increase the particle numbers, if you make the cell bigger, then the information that you can transfer is higher, meaning if the cell invests more energy and produces more proteins,
it can actually achieve a more reliable information transfer, which comes of course with costs for the cell. Also it seems that if you use more complicated dynamics, meaning not only spiking but maybe bursting behavior where you have secondary spikes, then you can transmit more information because the input signal carries more information,
or can carry more information in its different features. Another result is that proteins, a very interesting result, I think is that proteins can actually be tuned to certain characteristics of the calcium input, meaning with all the different calcium sensitive proteins in the cell,
they are tuned to a specific signal, so they only get activated, or these pathways only allow information transmission if a certain signal is observed in the cell by these proteins. So, in a way, the 3D structure of the protein defines how it behaves dynamically,
how quickly it binds and so on, how many binding sites it has, and then this dynamic behavior determines to what input signals that protein is actually sensitive. And on the right-hand side you can see some calculations we did. The peaks actually show where this specific protein,
which is a Calmodulin-like protein, you don't have to memorize that, it's a very important calcium sensitive protein, where these differently parameterized models actually get activated and allow information transfer. And this allows differential regulation because you have all the different proteins,
you have only one calcium concentration and only the proteins that are sensitive to a specific input get activated or do their things in the cell. Now, if you look at more complicated proteins, so Calmodulin, the one I just showed you, was only activated by calcium, more complicated proteins like protein kinase C, for example,
they are both activated and inhibited, so they show biphasic behavior where in an intermediate range of calcium concentration they get activated, with very high or very low concentrations they are inactivated, and you can actually see that these more complicated proteins
allow a higher information transfer, and again, producing these more complicated proteins might be more costly for the cell, but it can be valuable because they allow more information to be transferred. And this you can see in this plot where we actually scanned over the activation
and the inhibition constant of these model proteins and you can see that you have these sweet spots where you get a very high information transfer, so color-coded is transfer entropy. Now, coming to a different system, just quickly, we also looked at other systems, of course, calcium signaling is just one of our favorite ones, we also looked at bacteria,
and this is E. coli, a very famous model system for biologists, these are cells that can actually move around because they have little propellers at their end, and so they want to find sources of nutrients,
for example, to get food, so they swim into a direction and then they decide whether to swim, whether to keep swimming in that direction or whether to tumble, reorient randomly and swim in some other direction. The problem for them is they are too small, they can't detect a concentration gradient
of nutrients, of food, between their front and the back of the cell, so they have to swim in one direction and then they have to remember some nutrient concentration of some time back, and then they have to compare, is the nutrient concentration actually increasing,
then I should continue swimming, if it's decreasing, I should reorient and swim in some other direction, and this allows them to, on average, swim towards sources of food. Now, in order to compare over time the nutrient concentrations, they have to memorize,
they have to know how much nutrients were there some time ago, and for that they have a little memory, and the memory is actually in the, you can see on the left hand side, the receptor that actually senses these nutrients, they can be modified, these receptors,
we call that methylation, so they get a methyl group attached, and they have different states of methylation, five different ones in the model we are looking at, and this builds a memory. We looked into that and quantified it with information theory. This is a measure called mutual information;
it's not transfer entropy, it's another measure, in that case of static information. You can see, this is the amount of information that is actually stored about the nutrient concentration outside of the cell. This is in nats, not in bits; that's just a different unit for information, and you can translate between them.
And you can also see how the different methylation states, these are the colored curves, how they go through, how active they are at different nutrient concentrations.
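The nats-to-bits conversion is just a change of logarithm base, as a toy mutual-information calculation shows (the joint distribution here is hypothetical, not the E. coli model):

```python
# Mutual information of a made-up joint p(signal, memory) over two
# binary variables, in nats (natural log) and bits (log base 2).
import math

p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def mutual_information(p, log=math.log):
    px = {0: p[(0, 0)] + p[(0, 1)], 1: p[(1, 0)] + p[(1, 1)]}
    py = {0: p[(0, 0)] + p[(1, 0)], 1: p[(0, 1)] + p[(1, 1)]}
    return sum(pxy * log(pxy / (px[x] * py[y]))
               for (x, y), pxy in p.items() if pxy > 0)

nats = mutual_information(p)               # natural logarithm -> nats
bits = mutual_information(p, math.log2)    # base-2 logarithm -> bits
print(nats, bits, nats / math.log(2))      # dividing nats by ln 2 gives bits
```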
And this is ongoing research, so hopefully next time I can show you much more. Just to finish this: we also look at time scales, because the time scales have to be right. The system adapts, so if you keep the cell in a certain nutrient concentration, it adapts to that concentration
and goes back to its normal operating level. Now if you increase the nutrient concentration again, it again shows a swimming response. So it adapts, but it also has to compare the nutrient concentrations at different positions. The cells have to manage the different time scales of decision making
and memory or adaptation, and we are looking into that as well. Coming to the conclusions: I hope I could convince you that information theory can be applied to biology, that it's a very interesting topic, a fascinating area, and that we are just at the beginning.
I also showed you that in signaling pathways, the components can be tuned to their inputs, which allows differential regulation: even though you don't have wires, you can still specifically activate different proteins
with one signal, or multiplex if you want. We are of course in the process of studying which features of the input signal are actually information-carrying. So we are looking into things like waveform and timing, and we want to look into how these things change
in the diseased case. So if you have things like cancer, where certain signaling pathways are perturbed or fail, we want to find out exactly what that does to the information processing capabilities of the cell.
We also found out that estimating these information-theoretic quantities can be a very tricky business. Another project we are doing at the moment is on how to estimate and interpret them in a reliable manner from sparse and noisy data.
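The difficulty is easy to demonstrate. In the following sketch (a generic illustration, not the group's actual estimator), the two variables are independent, so the true mutual information is exactly zero, yet the naive plug-in estimate is clearly positive for small samples and only approaches zero as the sample grows:

```python
import numpy as np

rng = np.random.default_rng(1)

def plugin_mi(x, y, bins):
    """Naive plug-in mutual information estimate in bits."""
    joint = np.histogram2d(x, y, bins=bins)[0]
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x @ p_y)[mask])))

# x and y are drawn independently, so the true MI is exactly 0 bits.
results = {}
for n in (50, 500, 50_000):
    x = rng.normal(size=n)
    y = rng.normal(size=n)              # independent of x
    results[n] = plugin_mi(x, y, bins=10)
    print(f"n={n:>6}: estimated MI = {results[n]:.3f} bits (true value: 0)")
```

The upward bias scales roughly with the number of histogram cells divided by the sample size, which is why sparse, noisy biological data make these estimates so tricky.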
So that's also ongoing work. I would like to thank some of my collaborators, of course my own group, but also some others, in particular the QPASI team that has spread all over the world. And with that, I would like to thank you for your attention, and I would be happy to answer any question you might have.
Thank you. If you have questions, there are two microphones, microphone number one, microphone number two, and please speak loudly into the microphone. And I think the first one is microphone number two.
Your question, please. Has there been any work done on the computational modeling of G-protein-coupled receptors and the second messenger cascades there? Can you repeat that, sorry? Has there been any work done on computational modeling of G-protein-coupled receptors? G-protein? Yeah. Oh, yes, we are doing that, because calcium is actually,
I mean, the calcium signal is actually triggered by a cascade that includes the G-protein. Most of these receptors are actually G-coupled, or G-protein-coupled receptors. So that's what we are doing. Thank you. Microphone number two again. First of all, thanks for the talk. And I wanted to ask you to talk a little bit
about how different proteins get activated by different signals. And could you go a bit into detail about what kind of signal qualities the proteins can detect? So, are they triggered by specific frequencies or specific decays?
Like, which characteristics of the signals can be picked up by the different proteins? Well, that's actually what we study. We have another package, linked here as the last one, the oscillator generator. This is a package in R that allows you to create artificial inputs
where you have complete control over all the parameters: amplitude, duration of the primary and secondary peaks, frequencies of the primary and secondary peaks, refractory period and so on. At the moment we are also running scans
and want to find out which proteins are actually sensitive to which parameters of the input signal. What we know from calcium is that, for example, calcium calmodulin kinase 2, also a very important protein in the nervous system, shows frequency modulation.
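As a rough numerical illustration of such frequency sensitivity, here is a toy model in Python (the oscillator generator mentioned above is an R package, and this sketch is not the actual kinase model): a slowly decaying activation variable is driven by a pulse train, and with the same pulse amplitude and width it settles at a much higher mean activity when the pulses come more frequently.

```python
import numpy as np

def pulsed_signal(t, amplitude, period, pulse_width):
    """Square pulse train: amplitude, frequency (1/period) and pulse
    duration can each be varied independently, as with an artificial
    input generator."""
    return np.where((t % period) < pulse_width, amplitude, 0.0)

def readout(signal, dt, k_on=2.0, k_off=0.2):
    """Toy frequency-sensitive readout: activation a builds up during
    pulses (rate k_on) and decays slowly between them (rate k_off)."""
    a, trace = 0.0, []
    for s in signal:
        a += dt * (k_on * s * (1.0 - a) - k_off * a)   # Euler step
        trace.append(a)
    return np.array(trace)

dt = 0.01
t = np.arange(0.0, 100.0, dt)
# Same amplitude and pulse width; only the pulse frequency differs.
slow = readout(pulsed_signal(t, 1.0, period=20.0, pulse_width=1.0), dt)
fast = readout(pulsed_signal(t, 1.0, period=2.0, pulse_width=1.0), dt)
print(f"mean activation at low pulse frequency:  {slow.mean():.2f}")
print(f"mean activation at high pulse frequency: {fast.mean():.2f}")
```

Because the decay is slower than the rise, activation accumulates across closely spaced pulses but relaxes back between widely spaced ones; this captures the qualitative mechanism only, not the quantitative kinase kinetics.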
And this has also been shown experimentally: they immobilized the protein on a surface and then superfused it with solutions of different calcium concentrations in a pulsed manner. They measured the activity of the protein and showed that with increasing frequency,
the activation gets bigger. At the same time, it also shows amplitude modulation: it's also sensitive to the amplitude, meaning the absolute height of the calcium concentration. Thank you. And again, number two please. Hey, so you talked about a lot of on and off kinetics
and I wonder if you think about neurons, which not only have on and off, but also many amplitudes that play a big role in the development of cells and synapses. How do you measure that? So how do you measure baseline sporadic activity of calcium?
Well, in our case, there are different ways of measuring calcium. That's not what we are doing. Not really measuring, sorry, but more like: how do you integrate it into your system, because it's not really an on-off reaction, but more like sporadic miniature events.
Yeah, I mean, in the case of calcium, you have these time courses. And we look at the complete time course. So we have the calcium concentration sampled at every second or half second in the cell by different methods. So our collaboration partners, they use different dyes that show fluorescence,
say, when they bind calcium. Others show bioluminescence. And then we use these time courses. In the nervous system, it's a bit different. There you also get the analog mode, where neurons are directly connected and exchange substances.
But in most cases, you have action potentials. And I didn't go into neural systems at all because things there are totally different. You get these action potentials that are mostly uniform: they all have the same duration, they all have the same amplitude. And then people in neuroscience
or computational neuroscience mostly, they boil the information down to just the timings of these peaks. And they use this information. And mathematically, this is a point process and you can use different mathematical tools to study that. We are not really looking into neurons. We are mostly interested in non-excitable cells
like liver cells, pancreatic cells and so on; cells that are not excitable and don't show massive depolarization like neurons. Thank you. Thank you, and obviously again number two. Hi. So you mentioned CaM kinase II.
And I get that you don't work on neuroscience specifically, but I'm pretty sure you have quite extensive knowledge of the subject. What do you think about this hypothesis that was quite popular a few years ago,
I think in the US mainly, about the idea that the cytoskeleton of neurons can actually encode and decode memories through kinases in the cytoskeleton, like bits on a hard drive?
What's your feeling? I'm not going to speculate on that specific hypothesis because I'm not really into that. But I know that many people are also looking into spatial effects which I didn't mention here. The model I showed you is a spatially homogeneous model. We don't look at concentration gradients within the cell.
Our cells are homogeneous at the moment. But people do look at that, and then you can get into things like morphological computation, a newer topic, meaning that computations can also be performed spatially. But if you're interested in that, we can talk offline.
Do you buy into this theory? I can give you some pointers there. Do you have a good feeling about those theories, or do you think they are clueless? Well, I think that the spatial aspect is a very important thing, and that's also something we should look at. I mean, to me random fluctuations are very important,
intrinsic fluctuations, because you can't separate them from the dynamics of the system; they're always there, at least some of the fluctuations. Also the spatial effects are very important, because you not only have these different compartments where the reactions happen, but you also have concentration gradients across the cell.
Especially with calcium, people have looked into calcium puffs and calcium waves, because when you have a channel that allows calcium to enter, directly at that channel you of course get a much higher calcium concentration, and in some cases you get waves that travel across the cell. To me it sounds plausible that this also has a major impact
on the information processing. Thank you. Thank you. In this case Jurgen, thank you for your talk, and please give him a very warm round of applause. Thank you.