TIB AV-Portal

Effective behavior of random media

Video in TIB AV-Portal: Effective behavior of random media

Formal Metadata

Effective behavior of random media
Title of Series
Part Number
Number of Parts
CC Attribution 3.0 Unported:
You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Release Date

Content Metadata

Subject Area
Thank you, it is a pleasure to be back here. I was here for about two months and enjoyed a very productive stay, this year and last year. Now, I have to confess right from the beginning that what I am going to talk about in fact has nothing to do with signal transmission, information theory, or Shannon; I should apologize for that, but I never worked on that subject. There will be randomness coming up, but in quite a different way than in the previous talks. This being said, I want to start by telling you what one understands by a random medium and what one understands by effective behavior, starting with a historical example, and briefly mention what more recent applications in that area might be. Then I want to address two specific issues we have worked on. One of them is motivated by numerical analysis: really understanding the numerical errors of a very popular, ubiquitous engineering method. The second topic has to do with, I will be bold and say, uncertainty quantification: understanding the error one is making, understanding the fluctuations, characterizing the amount of uncertainty in the solution of a problem with random data. OK, so here (by the way, this is probably the pointer) is, as I said, a historical example: probably the first time in the physics literature that a problem of the effective behavior of a random medium was posed and solved, namely in the
famous work of Maxwell on electromagnetism. He was interested in the effective conductivity, or effective resistivity, of a medium, and in the back of his mind he had the following situation: a slab of a homogeneous conductor, a homogeneous conducting medium, on which he imposes a voltage difference between the bottom and the top surface. Since the medium is homogeneous, this voltage difference leads to a homogeneous current field, which here is indicated by the green arrows, and there is a proportionality between the strength of this current field and the voltage drop divided by the height of the slab. That proportionality factor is a material constant, the conductivity of the material; its inverse is the resistivity. So that is, in a sense, the nice homogeneous situation. Then he imagined a slab formed not of a homogeneous but of a heterogeneous material: a material made out of two bare materials, say a background matrix made of a material with bare conductivity k_1, with spherical inclusions of another material with bare conductivity k_2. So you have a mixture of a material with conductivity k_1 and a material with conductivity k_2, and clearly, since this material is no longer homogeneous, even if the voltage is kept constant down here and kept at a different constant up here, the current you get is no longer constant; the current density will in fact be some complicated vector field. Nevertheless, Maxwell imagined that if the height of the slab is large compared to the typical distance between these inclusions, so if there is a clear scale separation between the characteristic length scale of the heterogeneity and the length scale of the sample, then
the heterogeneous medium in fact behaves like an idealized homogeneous one, in the sense that there is a clear proportionality between the voltage drop and the average current which flows through the sample. And back then, of course, one could not resort to a computer to determine that effective behavior. So Maxwell made the assumption of a dilute regime: he assumed that the typical radius of these inclusions is much smaller than their typical distance, and in this dilute regime he was able to theoretically derive an asymptotic formula for the effective conductivity of this mixture in terms of the bare conductivities k_1 and k_2 and in terms of the volume fraction p. That is the first example in the physics literature of identifying an asymptotic formula for the effective behavior of a random medium. Nowadays people do not want to restrict themselves to these dilute situations, where you can understand the behavior, at least asymptotically, by paper and pencil. They want to treat realistic situations, like the permeability of porous rock or the elasticity properties of a certain mixture of polymers, and since there is no diluteness there, they really have to resort to numerical simulations to extract the effective behavior from the specification of the statistics. The engineering method which is used for this is the method of the representative volume element, and that was the starting point for Antoine Gloria and myself: we wanted to understand the error one makes with it. It was in fact a question which was brought to me, and there was still some work to do to understand, from a mathematically rigorous point of view, what the errors made in this very common method really are.
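Before the numerical story, Maxwell's dilute-regime result itself can be made concrete. The talk does not display the formula, so the sketch below uses the classical first-order expression for spherical inclusions in three dimensions (the Maxwell, or Clausius-Mossotti, form) as a stand-in; the function name is mine.

```python
def maxwell_dilute(k1, k2, p):
    """First-order effective conductivity of a 3D matrix of conductivity k1
    containing a small volume fraction p of spherical inclusions of
    conductivity k2 (classical Maxwell / Clausius-Mossotti formula)."""
    return k1 * (1.0 + 3.0 * p * (k2 - k1) / (k2 + 2.0 * k1))
```

For k_1 = 1, k_2 = 10 and p = 0.05 this gives 1.1125; the expression is only trustworthy to first order in the volume fraction p.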
So what I want to
do in this talk is, first, to tell this story: the error analysis of the representative volume element method. I am going to explain what the representative volume element method is, then tell you why its error analysis is interesting, and what the result of that analysis looks like. Then, if time permits, I want to address a second issue, which really has to do with characterizing the fluctuations of the solution, in this case the fluctuations of the current field; so, characterizing variances. OK, let us start with the representative volume element method. From the mathematical point of view, the heterogeneity of the medium is described by a coefficient field, a tensor field: something which expresses the possibly anisotropic microscopic conductivity of the medium, or whatever material property you are interested in. Typically, in these applications, it is what we in PDE theory call a uniformly elliptic coefficient field; a differential geometer would say a metric. Once you have this coefficient field you can form a differential operator, an elliptic differential operator, which contains the physics: for instance, sticking to the language of conductivity, this operator applied to a function is equal to zero precisely if that function is a potential that leads to a stationary current. From the mathematical point of view, that abstraction is what we are dealing with. Now, in random media you have incomplete information on what this coefficient field looks like: you just have statistical information, which anyway you think is much more appropriate than detailed local information. Mathematically, that means you are not working with a single coefficient field; you are
working with a probability distribution on coefficient fields. The coefficient is then a random field, something which depends both on the position in space and on the realization. For this talk it is perfectly fine to keep the following simple example in mind; the numerical simulations which I am going to show you are based on it. We call it a Poisson-type example. Think of d-dimensional space, with d equal to 2 or 3, say; the numerical simulations I will show are two-dimensional, but the theory does not care about the dimension. The red points here are distributed according to a Poisson point process on all of space, and around every one of these points you draw a ball, say of radius one quarter, while the density of the Poisson point process is 1; this means, roughly speaking, that the distance between the points is on average of order 1. You then look at the union of all these balls: that is a random set. As a very simple example, you take the coefficient field to be equal to 1 times the identity on the blue region and lambda times the identity on the balls, where lambda is one tenth in the experiments, or some other number different from 1. So that is a very simple example of such a random coefficient field.
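One realization of this Poisson-type medium is easy to sample. The following sketch is my own discretization (the grid size, box size and function names are assumptions), while density 1, radius 1/4 and contrast lambda = 1/10 are the values from the talk:

```python
import numpy as np

def sample_coefficient_field(L, n_grid, radius=0.25, lam=0.1, density=1.0, seed=0):
    """Rasterize one realization of the Poisson-inclusion medium on an L x L
    periodic box, on an n_grid x n_grid lattice: the (scalar) coefficient is
    lam inside the balls and 1 outside."""
    rng = np.random.default_rng(seed)
    n_pts = rng.poisson(density * L * L)          # Poisson number of centers
    centers = rng.uniform(0.0, L, size=(n_pts, 2))
    x = (np.arange(n_grid) + 0.5) * (L / n_grid)  # grid-cell midpoints
    X, Y = np.meshgrid(x, x, indexing="ij")
    field = np.ones((n_grid, n_grid))
    for cx, cy in centers:
        # periodic (torus) distance from each grid point to the center
        dx = np.abs(X - cx); dx = np.minimum(dx, L - dx)
        dy = np.abs(Y - cy); dy = np.minimum(dy, L - dy)
        field[dx**2 + dy**2 <= radius**2] = lam
    return field
```

Calling `sample_coefficient_field(8.0, 64)` returns one 64 x 64 snapshot of the two-valued medium; different seeds give the different realizations shown later in the talk.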
So, right before I go to the next slide: you have a simple, easily described random medium, given in very simple terms by three parameters. You have the number density of the Poisson point process, which I arbitrarily set equal to 1; the radius of the balls, which I chose to be one quarter; and the contrast of the medium, which I said is one tenth, so effectively a factor of 10. And that is it: three parameters. Now you want to extract the effective behavior, the large-scale effective resistivity or conductivity of this medium. So in a sense you want to go from three parameters to another finite number of parameters: you start with a very low-dimensional description and you want to extract a very low-dimensional piece of information, namely the d-squared entries of the effective conductivity matrix. But the map is not explicit: there is no simple formula which would relate these three numbers to the d-squared numbers you are interested in. Therefore, what is done in engineering is the
representative volume element method. In order to get an idea of what that is, it is really good to think in terms of the physical application. Think of the functions as electric potentials; the negative gradient is then, by Maxwell, the electric field; the coefficient field has the meaning of a conductivity, so if you multiply the electric field by the conductivity you get the electric current; and the current is stationary if it is divergence-free. So what is the effective conductivity of the medium supposed to be? It should provide a linear relationship between the average potential gradient and the average current: where the microscopic conductivity is the proportionality between the local field and the local current, the effective conductivity should give you the relationship between the spatially averaged field and the spatially averaged current. That is what it is supposed to be.
How do engineers do this in practice? They artificially introduce a finite-size domain; the simplest thing is to think of a torus, so you introduce a box and identify its boundaries. In this finite-size region they sample their medium according to the same specifications as in the whole space: they look at the Poisson point process with the same density, but on the torus, and do exactly the same construction. So now you have sampled what you think are the right statistics, not in all of space but on the torus. Then they solve d (where d is the dimension, so in two dimensions just two) simple linear elliptic partial differential equations on the torus: they seek a periodic function which solves the elliptic equation. Let me write this on the blackboard, because it is important: you require the divergence of a times (e_i plus the gradient of phi_i) to be equal to zero, with phi_i periodic; here e_i is just, as I am writing for my audience, the i-th unit vector. This function phi_i is called the corrector, because it does the following thing; you may also call the result a harmonic coordinate. You start from the i-th affine coordinate function x_i and you correct it, by adding the function phi_i, in such a way that the resulting function is a-harmonic. This is why these coordinates are sometimes called harmonic coordinates, and why phi_i is called a corrector. That is the problem which is solved. Then you take the current which belongs to these harmonic coordinates, so the field e_i plus the gradient of phi_i multiplied by the conductivity a, you average it over the torus, and that average is what you take as your approximation to the homogenized coefficient a_hom applied to e_i. That is what is done in practice.
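The corrector construction can be tried out in one dimension, where everything is explicit: periodicity forces the flux a(1 + phi') to be constant, so the averaged flux (the RVE output) is exactly the harmonic mean of a. The finite-difference sketch below is my own toy, not the speaker's code:

```python
import numpy as np

def effective_coefficient_1d(a):
    """Solve the periodic 1D corrector problem  ( a (1 + phi') )' = 0  by
    finite differences on a unit torus with N = len(a) cells, and return the
    averaged flux  < a (1 + phi') >, the RVE approximation of a_hom.
    In 1D this reproduces the harmonic mean of a, a classical sanity check."""
    N = len(a)
    h = 1.0 / N
    # periodic forward-difference matrix: (D phi)_j = (phi_{j+1} - phi_j) / h
    D = (np.roll(np.eye(N), 1, axis=1) - np.eye(N)) / h
    # stiffness system  D^T A D phi = -D^T A 1  (unit macroscopic gradient)
    K = D.T @ (a[:, None] * D)
    rhs = -D.T @ a
    # K is singular on constants; lstsq picks a particular representative
    phi = np.linalg.lstsq(K, rhs, rcond=None)[0]
    flux = a * (1.0 + D @ phi)
    return flux.mean()
```

In d >= 2 the same recipe applies with d corrector problems, but no closed formula survives; that is exactly why the numerical method is needed.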
That is the representative volume element method, and here is a numerical simulation, just to give you a flavor of what you are doing. You pick a coordinate direction, here the first coordinate direction in a two-dimensional model, so we are looking for a potential which is a periodic perturbation of a potential that grows linearly in the x_1 direction, such that its current is divergence-free. You can see the level lines of the potential: they clearly bunch up in the regions of low conductivity, and they spread out around the regions of high conductivity, much as the field is expelled from a good conductor. Those other lines are the flow lines; that is the current which belongs to this potential. Now you take this vector field, you average it over the torus, you get two numbers, and that is the first row of the effective matrix. Then you do the same thing in the second coordinate direction; in that case the level lines will essentially be horizontal and the gradient will be mostly pointing in the e_2 direction; you average that field, and you take the result as the second row of your matrix. That is exactly what is done here.
By the way, it is not surprising that this off-diagonal number here is exactly equal to that one: the symmetry of the resulting matrix is in some sense built in. Another question is how much you should trust these numbers, and it is clear that you should not trust them if this representative volume element is too small; then it certainly has little to do with the effective behavior which you want to capture. These numbers should become better and better the larger the representative volume element becomes, and that is the error analysis we were interested in. But before turning to that, one should point out that the answer which you get here is a random answer, because it depends on the realization; it is not yet a deterministic quantity but a random one. It is very clear that if you sample your medium once more, you will get a different solution and therefore a different average; so this is still a fluctuating, random number. On the other hand, it is also intuitive that as you make the size of this box larger (here I did not draw it as a larger box; instead I rescaled it back to unit length and made the balls smaller, which is mathematically equivalent) the variance should go down. That is what we call the random error, and that is indeed what
one sees in the numerical simulations. Here you have three different realizations of the random medium; of course you get three completely different solutions, three completely different current fields, and three somewhat different matrices, which differ, let us say, by about 10 percent. That is what I said: the answer which you get is what we would call a random variable, because it depends on the realization. But if you
look again at what we saw before, but now run the simulation with a much larger representative volume element, then you see that the fluctuations clearly go down, from about 10 percent to about 1 percent. [Answering a question from the audience:] The question is whether percolation plays a role here. It does not really, because we did not put the conductivity equal to 0 in the complement: remember that this coefficient field was defined to be equal to the identity outside the balls and equal to lambda times the identity on them, and lambda was chosen to be one tenth, which is positive. If lambda were allowed to be 0, then you would get into the situation of percolation, and that is of course also a well-studied problem, but it is not my topic here; it has additional difficulties and additional challenges, and I have not been addressing it in the past years. So whether the medium is percolating or not will play some quantitative role, but not a dramatic qualitative one. I hope that answers the question. OK, so
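The drop in the fluctuations from roughly 10 percent to roughly 1 percent can be reproduced on a one-dimensional caricature: in 1D the effective coefficient of a sample is exactly the harmonic mean of the cell conductivities, so the RVE output can be Monte-Carloed directly. The distribution and the sample sizes below are my own choices, not the talk's:

```python
import numpy as np

def rve_estimate(n_cells, rng):
    """Toy 1D 'representative volume element': with i.i.d. cell
    conductivities, the exact 1D effective coefficient is the harmonic mean."""
    a = rng.uniform(0.1, 1.0, size=n_cells)
    return 1.0 / np.mean(1.0 / a)

rng = np.random.default_rng(2)
small = np.array([rve_estimate(16, rng) for _ in range(2000)])
large = np.array([rve_estimate(256, rng) for _ in range(2000)])
# The answer is a random variable; its variance shrinks roughly like
# 1/n_cells as the box grows, mirroring the 10% -> 1% drop in the talk.
```

Comparing `small.std()` and `large.std()` shows the spread shrinking by about a factor of four as the sample becomes sixteen times larger.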
that is only one type of error. There is a second error which is slightly more subtle: even if you disregard the fact that this is a fluctuating, random quantity, for instance by taking its expectation, it still would not be the right value, because of the following phenomenon. When you pass to the torus you have falsified the statistics: if you unroll the torus onto the plane, you have a periodic coefficient field, which means you have introduced spurious long-range correlations. Instead of a Poisson point process on the whole space you are looking at a different ensemble; you are actually working with the wrong statistics. So that is a systematic error; it is a kind of bias.
And that is in fact what one sees. Suppose we just look at the expectation of this quantity; numerically, of course, we look at the empirical average, but over many, many samples, so that we effectively access the expected value. We know by symmetry considerations that it is isotropic, so it is just a number times the identity. Even this number, so even after taking the expectation, still depends on L and only converges to its final value as the size of the representative volume element becomes large. So there are two types of errors involved in this problem: there is a random error, which comes from the variance of the answer, and there is a systematic error, which has to do with using the wrong statistics, the wrong ensemble. Of course, the qualitative theory, which is thirty years old, tells us that both types of errors go to 0, and in fact we have a nice Pythagoras rule in probability space: the expected square error is the sum of the square of the random error and the square of the systematic error. Both of them go to 0, and we were interested in the rates: at what rate do these two errors go to 0, and why is that kind of question of practical importance?
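As an aside, the Pythagoras rule just quoted is an exact identity (expected square error = variance + squared bias), and both pieces can be watched on the same 1D toy: the harmonic mean of i.i.d. Uniform(0.1, 1) conductivities, whose exact homogenized value is 0.9/ln 10. The toy is my stand-in for illustration, not the talk's example:

```python
import numpy as np

a_star = 0.9 / np.log(10.0)   # exact 1D homogenized value: 1 / E[1/a]

rng = np.random.default_rng(3)
est = np.array([1.0 / np.mean(1.0 / rng.uniform(0.1, 1.0, size=16))
                for _ in range(5000)])

mse = np.mean((est - a_star) ** 2)      # expected square error
var = est.var()                         # (random error)^2
bias2 = (est.mean() - a_star) ** 2      # (systematic error)^2
# Pythagoras in probability space: mse = var + bias2, exactly;
# and here the systematic part is far smaller than the random part.
```

The last comment anticipates the result stated below: the squared bias decays faster than the variance, so with a small sample the random error dominates.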
Because it determines what type of numerical algorithm you use when you want to infer the effective behavior. When an engineer does this, he has two knobs to turn: he can either look at a few very large samples, or he can look at many moderately sized samples. And it is clear that looking at many samples will only affect the random error; it will attenuate the damaging effect of the random error, but it will have no effect on the systematic error. So if you have a certain computational budget to spend (because you do not want to wait three years for the result, but just three weeks) and you want to reach a certain error, you have an optimization problem: you can either do it this way or that way, and which way is better depends on the scaling of these two errors. Therefore this is a relevant question from the numerical point of view, and since it is relevant, people have looked at it for a while; but it is only in fairly recent years that one got a satisfying, optimal answer. Let me just mention, because it came up in the previous talk, that the mathematical tool by which we get many of these results is in fact the logarithmic Sobolev inequality: we use a concentration-of-measure argument based on such an inequality. But I will not talk much, or at all, about the methods. The final result we are getting is the following: the systematic error is much, much smaller than the random error; in fact the systematic error is about the square of the random error. So it definitely pays to look at many, many realizations: you first have to make sure that you make the effect of the random error small by looking at many realizations, and only
then do you have to worry about the systematic error. And since we wrote this, things have progressed: by now one even understands that the fluctuations are approximately Gaussian, so the error not only has this scaling, it also has its own structure. That is a little bit what I want to come to talk
about in the second part of my talk. (How much time do I have? Good, that is more than enough.) So the first part was about these error estimates for the engineering method, and the second part is about the fluctuations of the solutions; we are changing the perspective a little bit. Why does homogenization pay, why does this theoretical concept pay? It is related to a separation of scales, and that is something which I already mentioned in the context of Maxwell's example: there, the separation of scales was the separation between the typical size of the domain, which was the width of the slab, and the typical scale of the medium, which was the distance between the inclusions. So we have a macroscopic scale, which is set by the size of the domain or by the characteristic scale of the right-hand side of the equation, and a microscopic scale, which is set by the medium itself, by the diameter of the balls, say, or the distance between the balls; and let us set this one equal to 1, so we normalize things this way. We thus have a scale separation between a large, external, geometric scale, which we call L, and this much smaller intrinsic scale. Now think of a simple elliptic equation with a right-hand side, and let us put the L-scaling into the right-hand side: we look at a right-hand side which varies on a very slow, very large scale L, and whose amplitude we scale in such a way that the amplitude of the solution is of order 1. Homogenization tells you that instead of solving this very complex problem, where you would have to resolve at the same time the small scale of the coefficient field and the large scale of the domain and of the right-hand side, you can solve a much simpler equation with the effective behavior, with the homogenized coefficient. The qualitative theory tells you that this fluctuating solution, this random solution, here in
blue, is in fact pretty close to this non-random, non-oscillatory solution. And of course, there too you want to understand what the error is. In periodic homogenization that is very well understood: the typical fluctuations happen on scale 1, and the typical deviation from the limit is of the order of 1 over L. In fact, the same is true in the random
case. So when you really want to quantitatively compare the solution of your heterogeneous equation to the solution of the homogenized equation with the effective coefficient, you go back to this object, to the corrector, because it helps you to understand the error. Here again is the schematic picture which I drew: these functions phi_i correct the affine functions in such a way as to produce a-harmonic functions, the harmonic coordinates, the solutions of this equation. Therefore, instead of comparing u directly to the homogenized solution, you should compare u to a modulated version of the corrector. That is what I tried to explain with this picture: down here you have the true solution, which oscillates around the homogenized solution, and in fact you get a much better approximation of the true solution by taking the tangent to the homogenized solution and carrying out the corrector construction on top of that tangent. Then not just the solutions but also the gradients are close, and in fact the relative error is like in periodic homogenization; at this level you would not see the effect of randomness. So that is a result which tells you that random homogenization is not worse than periodic homogenization. But in fact there is much more structure in the random case, and that has to do with the fluctuations; that is the story I want to tell now. So suppose you are interested in our solution of the heterogeneous problem, which we do not really want to compute, so we seek a theoretical understanding of it, with a right-hand side which has the right scaling. And suppose we are not interested in the solution in a pointwise way, in understanding the solution at every point, but just in certain macroscopic observables, like, say, a flow rate.
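In formulas, the tangent-plus-corrector construction described above is the two-scale expansion; with summation over the repeated index i, and writing $\bar u$ for the homogenized solution, a standard way to record it (my notation, not copied from the slides) is:

```latex
% corrector: phi_i corrects the affine coordinate x_i so that
% x_i + phi_i is a-harmonic
-\nabla\cdot\bigl(a\,(e_i+\nabla\phi_i)\bigr)=0 ,
% homogenized equation and two-scale expansion
-\nabla\cdot\bigl(a_{\mathrm{hom}}\nabla\bar u\bigr)=f, \qquad
u \approx \bar u+\phi_i\,\partial_i\bar u, \qquad
\nabla u \approx (e_i+\nabla\phi_i)\,\partial_i\bar u .
```

The second approximation is the one behind the statement that the gradients, not just the solutions, are close.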
Such a macroscopic observable looks like this: you take some spatial average of the solution u, where the spatial average is on the same scale as the solution, on the macroscopic scale. And now the question is: can we get a better understanding of the fluctuations of this quantity? Here we are getting a bit into the business of uncertainty quantification: can we not just understand the size of these fluctuations, but really characterize them? A while ago we found out that they have the order which you would expect from central limit theorem scaling, namely 1 over L to the power d over 2. You see that this is quite different from the previous error estimates, which were pointwise: there the difference was 1 over L, whereas the averaged errors behave more like L to the minus d over 2. That of course suggested the next natural step: if we put this quantity on the right scale, can we understand its limit, can we characterize these fluctuations? The first thing we did (where "we" means the small community that is looking at these problems) was to plug in the two-scale expansion, the fact that you get a pretty good approximation to your heterogeneous, fluctuating solution by taking the homogenized solution and modulating it with the corrector. That was the first guess, and then you are drawn to understanding the covariance structure of the corrector. So what we first looked at was getting a better understanding of this covariance structure of the corrector, and together with Jean-Christophe Mourrat we were able to show that there is a limiting covariance structure on large scales, and that in a certain sense we could characterize it: it has the homogeneity of the Green's
function, that is, it is homogeneous of order minus (d minus 2). So here is the covariance function, and it has the homogeneity of the Green's function. Now, if you are a probabilist and you see a random field whose covariance structure is that of the Green's function, you think of the Gaussian free field; but in fact it is not Gaussian, and the covariance structure is not equal to some Green's function. That was the first interesting finding: it is more subtle. Then Jean-Christophe Mourrat, with Yu Gu from Stanford, even found out that what motivated us in the first place to look at this quantity, namely the idea that we could understand the variance of the solution by looking at the variance of the corrector, is not as simple as that, because both of these limiting variances exist, but they are not equal. So it was not the right idea to use the two-scale expansion in the simple way in order to reduce the variance of the solution to the variance of the corrector. That was a little bit of a puzzle; we didn't quite understand the covariance structure.
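In summary, the scalings mentioned so far, again as an editorial sketch in standard notation rather than a formula from the talk:

```latex
% Macroscopic observable: a spatial average of the solution against a
% smooth weight g living on the macroscopic scale L,
F := \int g\cdot\nabla u ,
% CLT-type fluctuations of the observable vs. the pointwise error:
\operatorname{Var}[F]^{1/2} = O\!\big(L^{-d/2}\big),
\qquad \text{pointwise error} = O\!\big(L^{-1}\big).
% Limiting covariance of the corrector: it shares the homogeneity of the
% Green's function, G(x) \sim |x|^{2-d} (sketched here for d \ge 3), yet
% it is not equal to a multiple of G, and the field is not Gaussian.
```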
Then very recently, with Mitia Duerinckx, who was a Ph.D. student at the Université Libre de Bruxelles, we found what I think is the right way to understand these fluctuations, at least to leading order, and to characterize them. What we introduced is something we call the homogenization commutator, an object which is built on the objects you have to compute anyway, the correctors: it looks at the difference between the current and the effective tensor applied to the field. So it's a very simple object, and we call it a commutator because in a sense it compares first applying the microscopic conductivity and then homogenizing with first homogenizing and then applying the effective conductivity. It is a random matrix field. The first observation is a deterministic one, namely that the fluctuations of our observable can be described, in a pathwise and pointwise way, by this quantity Xi, by this homogenization commutator, by solving the adjoint problem: remember that g is the weight defining the observable we are interested in; solving the adjoint problem on the homogenized level defines v-bar, and if we also have u-bar, the homogenized solution, then to leading order we can in a pathwise way characterize the fluctuations. The second part of the finding was that this strange object (which is not that strange) on large scales behaves like white noise, like Gaussian white noise: if you look at the correlation function of this tensor-valued object and put it on the right scale, then essentially you just see a peak at the origin and flat zero elsewhere. So on large scales this homogenization commutator behaves like white noise, Gaussian white noise, and it is therefore characterized
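In formulas, the objects just introduced look roughly as follows; this is an editorial sketch in the notation of the corresponding papers, not a transcript of the slides:

```latex
% Homogenization commutator in direction e_i:
\Xi_i := (a - \bar a)\,(\nabla\phi_i + e_i),
% i.e. the difference between the microscopic flux a(\nabla\phi_i + e_i)
% and the effective tensor \bar a applied to the same corrected field.
% Pathwise, deterministic reduction of the fluctuations of observables:
\int g\cdot\big(\nabla u - \mathbb{E}[\nabla u]\big)
  \;\approx\; \int \nabla\bar v \otimes \nabla\bar u : \Xi ,
% where \bar u solves the homogenized problem and \bar v the adjoint
% homogenized problem with right-hand side g. On large scales, \Xi
% behaves like a tensor-valued Gaussian white noise.
```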
by a single four-tensor; it is a four-tensor Q because this object is tensor-valued, so its covariance is a four-tensor. If you put these two results together, you get a complete asymptotic characterization of the variance in terms of this new object Q, which describes the covariance of that white noise. So that is, I think, a nice and also pleasing characterization of the limiting variance. And from a practical point of view, the take-home message for numerics is this: if you are interested not just in the solution but also in the fluctuations, in the variance, because you are interested in uncertainty quantification, you don't have to do any additional work. Because to get the homogenized, effective behavior you had to resort to your representative volume element method anyway: you had to solve for the correctors, to find the a-harmonic coordinates, to solve these problems about which I talked in the first part of the talk, because they give you the effective behavior, at least an approximate effective behavior. But if you have these objects, then you might as well look at the quantity Xi, the homogenization commutator, on the approximate level, and at the Green-Kubo-type formula, which is a proxy for the covariance structure. So without any additional numerical effort you have access to this four-tensor Q, and with this four-tensor you can characterize not only the leading-order behavior of the fluctuating solution but also the leading-order behavior of the variance of observables. So I would hope that this is an insight which can be used.
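To make the numerical take-home message concrete, here is a minimal toy computation (an editorial illustration, not from the talk): in one dimension the effective coefficient is exactly the harmonic mean, so a representative volume element of L i.i.d. cells can be "solved" in closed form, and one can watch the central-limit-scale fluctuations of the RVE estimate directly.

```python
import numpy as np

# Toy model: stochastic homogenization of -(a u')' = f in d = 1, where
# the effective coefficient is exactly abar = (E[1/a])^{-1}, the
# harmonic mean. A representative volume element (RVE) of L i.i.d. unit
# cells then reduces to a harmonic mean over L samples, and its error
# fluctuates at the CLT scale L^{-1/2} discussed above.

def rve_estimate(L, rng):
    """RVE approximation of the 1D effective coefficient on L cells."""
    a = rng.uniform(1.0, 4.0, size=L)    # i.i.d. cell conductivities
    return 1.0 / np.mean(1.0 / a)        # harmonic mean (exact in 1D)

# Exact effective coefficient for a ~ Uniform(1, 4): E[1/a] = ln(4)/3.
abar = 3.0 / np.log(4.0)

rng = np.random.default_rng(0)
for L in (100, 400, 1600):
    est = np.array([rve_estimate(L, rng) for _ in range(2000)])
    # The bias is of lower order; the standard deviation shrinks like
    # L^{-1/2}, i.e. it roughly halves each time L is quadrupled.
    print(L, est.mean() - abar, est.std())
```

Already this toy run shows the scaling from the talk: quadrupling the box size L roughly halves the standard deviation of the RVE estimate, while the systematic error stays much smaller.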
That brings me to the summary. What we have been working on quite extensively in the past years is making this type of homogenization of random media quantitative, and here I gave you two examples: understanding the error in the representative volume element method, and an example of uncertainty quantification to leading order, which turns out not to be computationally more expensive than what you have to do anyway to get the effective behavior.

[First audience question, largely inaudible: whether the same techniques can be used at the large-deviations scale.] That's a good question. I think to some extent that can be done, and I think that has in part already been tackled by the probability community. One piece of information which I did mention here: we characterize the variance, and we know the fluctuations are approximately Gaussian. But, as you say, looking at the large-deviations scale is another type of question. My feeling is that this could be done, and that this is in part done already in the probability community, but that it requires different types of techniques.

[Second audience question, largely inaudible: about an expansion in the volume fraction of the inclusions.] So now you mean an expansion in the volume fraction. I don't think that has been done in full, but two of my collaborators, Mitia Duerinckx and Antoine Gloria, have, in a similar model, looked at exactly that: they have proven
analyticity, showing that the effective coefficient is an analytic function of the volume fraction, and they developed, if you want, a kind of multipole way of getting all the terms of the series. So that can be done, though not with these tools. In any case, in most of our proofs we use, if you want, something like Malliavin calculus, because we are taking the derivative with respect to the noise, which in our case means taking the derivative with respect to the coefficients. We try to understand (and at first this is a completely deterministic question) how sensitively the solution at this place depends on changing the coefficient at that place, which is described by the non-constant-coefficient Green's function; that is really computing the Malliavin derivative. This we then put into a concentration-of-measure, logarithmic Sobolev inequality machinery, to get many of the estimates.
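The sensitivity calculus just described can be sketched schematically; again an editorial sketch in standard notation, not a formula shown in the talk:

```latex
% Perturbing the coefficient field a by \delta a, the solution responds
% to first order by
-\nabla\cdot a\,\nabla(\delta u) = \nabla\cdot(\delta a\,\nabla u),
\qquad
\delta u(x) = -\int \nabla_y G_a(x,y)\cdot \delta a(y)\,\nabla u(y)\,dy,
% with G_a the non-constant-coefficient Green's function: this is the
% Malliavin-type derivative with respect to the coefficients. Feeding
% such derivative bounds into concentration of measure, e.g. a
% spectral-gap or log-Sobolev inequality of the schematic form
\operatorname{Var}[F] \;\lesssim\;
  \mathbb{E}\!\left[\int \Big|\frac{\partial F}{\partial a}(y)\Big|^2\,dy\right],
% yields the stochastic estimates.
```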