
Unveiling the Universe with python


Automated Media Analysis

Transcript
Good morning everyone, and thank you for being here. First of all I would like to thank the organisers for the opportunity to speak. I will discuss how we can use Python in cosmology to unveil the universe, and in particular the dark universe. I am Valeria Pettorino, and I am
a physicist, so let me briefly introduce myself. I work in astrophysics and cosmology, that is, the study of the origin, the evolution, and the content of the universe. In particular I work for space missions, two space missions financed by NASA and the European Space Agency. The first one is the Planck satellite, which was launched in 2009 and whose cosmology data we released last year; the other is this beautiful one, the Euclid space mission, which will be launched in 2020, and you will hear more about it afterwards. I have also been working a lot on communication: I was in charge of internal communication for the Euclid space mission and of its public outreach for two years. And I am very much interested in data science: I worked remotely on a healthcare project for a start-up in London for some time, but that is a different story. I mention it because I also recently became an ambassador for the Science to Data Science (S2DS) programme, so
before we go to cosmology, let me tell you about this programme. S2DS is the largest data science boot camp in Europe. It is a five-week programme that takes place twice per year in London, and it aims at giving the academic community hands-on data science experience. The ambassador programme in particular aims to build a network between scientists inside and outside the data science community, and it supports talks and covers part of the expenses if one wants to organise an event. I will probably be moving to Paris in a few months, and a data science workshop may be organised there next year, so if you are interested in taking part in this community, please contact me or just look at the web page of the ambassador programme. OK, so let's look at cosmology, and let's first understand which distances, which scales, we are talking about. We human beings have a size of the order of a metre. If you go down to smaller scales, you reach the distances relevant for chemistry, for atomic physics, for nuclear physics, down to 10^-15 metres, and at even smaller scales those of particle physics, where the Large Hadron Collider at CERN operates, the very small distances probed by the gravitational-wave detection we heard about yesterday. But now I would like to take you in the other direction, to very large distances, beyond human beings, beyond the Sun, into the domain of astrophysics and cosmology. We start our journey across the cosmos from our planet Earth, which is one of the planets in our solar system, and the whole solar system sits here,
at the edge of a spiral arm of our galaxy, the Milky Way. Here we are at about 10^21 metres. But we can go farther; we have the power to look much farther than that. This next picture looks too noisy, too crowded to see anything, but it is in fact a photo taken by the Hubble Space Telescope, financed by NASA, in which every single point is a galaxy like our Milky Way. And we can go even farther. For that, could someone please dim the lights? Otherwise you won't see anything on the next slide; after all, this talk is about the dark universe. OK, that's already great, that's already much better. So that's the picture of all the galaxies around us.
And we can go even farther. Imagine you are somewhere near the centre of this image, near our galaxy, and you move far away from it. These are all galaxies which have been observed by the Sloan Digital Sky Survey collaboration, which observed about one million galaxies; they are simply placed in space as they were observed. As you go farther and farther from our galaxy, you can see that galaxies do not fill the whole of space uniformly: they actually form a web. They form voids, places where there is nothing, and filaments, places where there are lots and lots of galaxies. This is all really observed by the Sloan Digital Sky Survey, and it is what is called the cosmic web. In addition, we know that the
universe is expanding, and we have known this for a long time, in the sense that the distance between galaxies, space itself in between galaxies, grows. For a long time the expansion decelerated, going slower and slower due to gravity, which tries to pull things together and slow down the expansion, which is also
what you see here: the expansion slowing down. Then suddenly, about 5 billion years ago, the expansion started to accelerate, getting faster and faster. This was discovered only in 1998, and it was a huge surprise
that earned three people the Nobel Prize in Physics in 2011, for the discovery of the acceleration of the universe. Right now the universe is in this phase of accelerated expansion. Since 1998 there have been many experiments, with a lot of data coming from the ground and from space, from different collaborations looking at different things, and they all seem to point to the same surprising feature of the
universe: the universe is mainly dark. Ordinary atoms, ordinary matter, human beings and basically the stars, account for at most 5 per cent of the total energy budget of the universe. The rest is completely unknown. We know that about 25 per cent of it is in the form of dark matter, a form of matter that still feels gravity; it is like the glue that forms galaxies and keeps them together. Even more mysteriously, about 70 per cent of the energy budget of our universe is in the form of dark energy. It is dark in the sense that it does not emit light, and we have not actually detected the particles it may be made of, but we know that it is responsible for the accelerated expansion of the universe. Not understanding 95 per cent of the universe is almost embarrassing: it is the major challenge at the moment and for the next generation of experiments. This is the cosmic challenge of really having the big picture, of understanding the 95 per cent of the energy that surrounds us. But it is also a big data challenge that joins a lot of different communities together. There is already a new generation of experiments, among which the next one to be launched is the Euclid space mission, and they are going to use different probes to scan the sky, slicing it at different epochs in time. They are going to observe, for example, the shapes of billions of galaxies at different epochs in time, and this
is a huge challenge. It is a challenge from the technological point of view, because you have to build new detectors, new technologies. You need theoretical predictions precise enough to discriminate among all the possible theoretical models that can fit the data; you have to actually build the detector, to transfer the signal and compress it, which is a whole signal-processing challenge; to reconstruct the shapes of the galaxies; to compress the data that comes from space and then interpret it in terms of comparisons with the theoretical models; and finally, all together, to test gravity and fundamental physics at very large scales, just as we test forces and interactions at very small scales, like people do at the LHC at CERN. And I want to stress that this is not the work of a single astronomer, of a single person who writes strange equations on a blackboard or looks through a telescope somewhere. This is really an enterprise. This is
work that involves huge collaborations, so let me tell you something about the two I am involved in. The first one is Planck. This is a collaboration of about 100 scientific institutes in Europe, the US, and Canada, and it involves about 500 people. For Planck I have been leading an analysis that compares the data from the satellite to theoretical models that predict dark energy, and to theories of modified gravity, alternatives to general relativity.
Euclid is more than twice as big: at the moment it includes about 1300 people from about 120 labs in 13 European countries, plus laboratories in the US. For Euclid, apart from working on communication, I am in charge of the forecasting activity: determining a reliable pipeline that can tell us how well Euclid will perform in discriminating among different theoretical models. OK,
so let's now move to what we actually observe, how we actually analyse the data, and of course where we use Python, in particular for Planck. Planck was launched in 2009 and collected terabytes of data. It was sent 1.5 million kilometres away from Earth, orbiting around the second Lagrangian point, somewhere here on the opposite side with respect to the Sun, and it
scans the entire sky twice per year. The spacecraft spins at one rotation per minute, tracing circles in the sky and observing the radiation in all directions at different frequencies. It contains two instruments, one at low frequencies and one at high frequencies, and the high-frequency one had a very complex cryogenic system that cooled the detectors down to 0.1 kelvin, so it was literally the coolest place in the universe for a while. It observed the radiation in all directions, and what you see here is the emission from our own galactic plane, along these lines, which for us is actually foreground. We don't want to see the light from our galaxy; we want to see something which is much more challenging, something
which is much, much fainter: the light of the cosmic microwave background, which was emitted about 13 billion years ago. This map is one of the main results, one of the outputs, of the Planck collaboration. What you see is microwave radiation, and the different colours correspond to different temperatures, tiny, tiny differences in temperature of the radiation. The mean temperature is about 3 kelvin, so it is very, very cold; that is why the detector had to be even colder than that. But what we are actually interested in are these tiny differences in temperature as we look in different directions, all these hot and cold spots. We have such an amazing resolution on this map that we can understand how this light travelled down to us, and from there understand the evolution of the universe and reconstruct its content.
It is really similar to what you would do with a map of the temperature on Earth, where you would see where it is 40 degrees and where it is freezing cold, but on the sky. Here everything is around 3 kelvin, which is about minus 270 degrees centigrade, and you really see tiny differences of one part in 10^5 in something that was emitted 13 billion years ago. That gives you a resolution on the parameters that describe your universe, the amount of dark energy, the amount of matter, the expansion of the universe, at the per cent level, which I find almost astonishing. And most of the
analysis is actually in the processing of the data: in trying to get rid of all the other sources, all the individual point sources, for which we have catalogues and which we just remove. We remove the radio emission from the Milky Way (the Milky Way is really annoying here), and we remove all the dust emission, again from our lovely Milky Way, which is of course itself of great interest for other communities. All of that in order to get a clean view of the cosmic microwave background, and this was the result, which you might have also seen
several times, as it was kind of advertised on the front pages of the newspapers. What we actually get from the satellite, of course, is not this map directly. What we get is time-ordered data: this is an example of three minutes of the raw data that we get from the satellite, and most of the analysis is then the processing of these
data. For that we use several clusters all over the world. The main data-processing centres for Planck are in Italy and France, and they collected basically terabytes of time-ordered data; for the next generation of experiments, for example radio telescopes, we expect terabytes of data per minute. All this information comes from the satellite, arrives at the Mission Operations Centre in Germany, is transferred to Italy and France, where the data-processing centres are, and from there goes to the whole community, basically around the world, to different institutes. Different groups extract maps from the time-ordered data and clean them up, and there is then a challenge between the different groups to understand which pipeline performs better. From the maps we project onto spherical harmonics to obtain the dependence on the different angles from which we are looking at the sky. For all these processes, as was mentioned in the talk before this one, there are lots of different codes written by different people in different languages: for the extraction of the maps, for example, a lot of the code is actually in IDL, which unfortunately is not even open source, and there is also a lot of Fortran and C. From all of that, from terabytes of data, we can extract the power spectrum: on the y axis you see again the temperature perturbations, the temperature differences, at different scales, as a function of the multipole, that is, as a function of the angular scale; these are very large angular scales and these very small ones. And then there is the whole process in which we have to compare this with the theoretical models and fit the data.
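The projection onto spherical harmonics gives, for each multipole, a set of coefficients whose squared moduli, averaged over m, yield the power spectrum C_ell. That last reduction can be sketched in a few lines of numpy (a toy illustration, not the Planck pipeline, which works on full HEALPix maps):

```python
import numpy as np

def power_spectrum(alm, lmax):
    """Angular power spectrum C_ell from spherical-harmonic coefficients.

    `alm` is a list where alm[ell] holds the 2*ell + 1 complex coefficients
    a_{ell,m}; C_ell is their average squared modulus at fixed ell.
    """
    cl = np.zeros(lmax + 1)
    for ell in range(lmax + 1):
        coeffs = np.asarray(alm[ell])
        cl[ell] = np.sum(np.abs(coeffs) ** 2) / (2 * ell + 1)
    return cl

# Toy example: white noise in harmonic space gives a roughly flat spectrum.
rng = np.random.default_rng(0)
lmax = 4
alm = [rng.normal(size=2 * ell + 1) + 1j * rng.normal(size=2 * ell + 1)
       for ell in range(lmax + 1)]
cl = power_spectrum(alm, lmax)
```

In practice one would use a library such as healpy to go from a sky map to the a_{ell,m}; the point here is only the shape of the reduction from coefficients to spectrum.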
The curve that I am showing you is exactly the one that corresponds to the map I showed you before. And there is an animation, which I can probably show you here and which you can find online: depending on the amount of each ingredient that you put in, you get different kinds of predictions, different kinds of curves. For example, if you had 100 per cent of only atoms, you would get this kind of curve, which, as you obviously see, does not fit the data at all. In order to fit the data you need to decrease the amount of atoms even more and change the amount of dark matter, and as you see all the predictions change accordingly; there are other parameters as well, for the expansion, the initial conditions, and so on. Finally, if you have about 70 per cent of dark energy, then you can actually match the data. Now,
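Adjusting ingredients until the curve matches the data amounts to minimising a goodness-of-fit statistic over the parameters. Here is a minimal chi-squared sketch with a made-up one-parameter toy model and fake data (nothing here is the real CMB likelihood):

```python
import numpy as np

def model(ell, amplitude):
    # Toy "theory" curve: a damped oscillation standing in for a
    # power spectrum, with one free parameter (its amplitude).
    return amplitude * np.exp(-ell / 500.0) * (1 + np.cos(ell / 50.0))

# Fake "observed" data: the toy model at a known amplitude plus noise.
ells = np.arange(2, 1000)
rng = np.random.default_rng(1)
sigma = 5.0
data = model(ells, 70.0) + rng.normal(0.0, sigma, size=ells.size)

def chi2(amplitude):
    residuals = (data - model(ells, amplitude)) / sigma
    return np.sum(residuals ** 2)

# Scan the parameter on a grid and keep the best-fitting value.
grid = np.linspace(10.0, 150.0, 281)
best = grid[np.argmin([chi2(a) for a in grid])]
```

A grid scan like this is only feasible for one or two parameters; with the ten or more parameters of a real cosmological model the scan is replaced by the Monte Carlo sampling described later in the talk.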
obviously we don't do it this way by hand. We have to analyse the whole region of parameter space, and for that we use several tools; there is a whole collection of tools available on a cosmology website that I mention here. All these codes are open source and all of them are available, so you can all play with them; there are several of them. So,
for the future missions, and for Euclid in particular, the whole science ground segment, and also the forecasting activity that I lead, have chosen Python as the recommended language, so most of it will be in Python, at least for the interpretation part; for some other parts, such as the simulations, that is still under discussion. To compare the data with the predictions of the theoretical models we explore the whole parameter space with Markov chain Monte Carlo, and we use Bayesian analysis: we build chains that reconstruct the posterior distribution, that is, the probability of a given model given the data. We have several tools for that, but in particular
I want to mention this one because it is written in Python and it is open source: it is called Monte Python, a Monte Carlo code written in Python. You can find it on GitHub, where there is also the documentation; the main developers are Benjamin Audren and Julien Lesgourgues, together with many others. This, for example, will also be used for the forecasting activity. Cosmology requires us to deal with complex data, and also to combine data that come from different sources, from different experiments, which sometimes look at different things, at different parameters. You have to deal with several free parameters: the ones that describe your cosmology, the amount of matter, the amount of dark energy, the expansion and so on, of the order of 10 parameters per cosmological model, plus about 10 to 100 parameters that describe the instrument and all the systematics involved. So we need to sample very efficiently in parameter space, and there are different possible samplers, which are also integrated in Monte Python. For a long time the community has been using, and is still using, a code which was actually written in Fortran 90, CosmoMC; Monte Python is a more recent alternative written in Python, and
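Monte Python's samplers are considerably more sophisticated, but the core idea behind an MCMC chain, the Metropolis-Hastings random walk, fits in a few lines. A toy sketch with a one-dimensional made-up posterior (not a real cosmological likelihood):

```python
import math
import random

def log_posterior(x):
    # Toy log-posterior: a Gaussian centred on 0.3 with width 0.02,
    # standing in for, say, a matter-density parameter.
    return -0.5 * ((x - 0.3) / 0.02) ** 2

def metropolis(log_post, start, step, n_samples, seed=42):
    rng = random.Random(seed)
    chain, x, lp = [], start, log_post(start)
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)   # symmetric random-walk proposal
        lp_new = log_post(proposal)
        # Accept with probability min(1, posterior ratio).
        if lp_new >= lp or rng.random() < math.exp(lp_new - lp):
            x, lp = proposal, lp_new
        chain.append(x)                       # on rejection, repeat x
    return chain

chain = metropolis(log_posterior, start=0.5, step=0.03, n_samples=20000)
burned = chain[2000:]                         # discard the burn-in phase
mean = sum(burned) / len(burned)
```

The histogram of `burned` approximates the posterior, so quantities like the mean and credible intervals can be read off the samples directly; real runs add convergence diagnostics and tuned, correlated proposals over many dimensions.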
of course Python guarantees that it is much more concise with respect to the previous code, and it aims, among other things, to be much more stable and maintainable for Bayesian investigation of parameter space. It
also allows a much more modular structure. You have to think of it basically as an enhanced interface between different codes: the codes that deal, for example, with the data from different experiments, the different samplers for the same parameter space, or the different codes that solve the equations describing the evolution of the universe from the Big Bang down to us. All these modules are sometimes written in different languages, and they are all integrated within Monte Python. That is the sort
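That plug-in structure, one common interface that many experiment likelihoods implement, with the sampler seeing only the interface, can be illustrated with a stripped-down sketch (the class names and numbers here are hypothetical, not Monte Python's actual API):

```python
class Likelihood:
    """Common interface every experiment module implements."""
    def log_like(self, params):
        raise NotImplementedError

class GaussianPrior(Likelihood):
    """Stands in for one experiment constraining one parameter."""
    def __init__(self, name, mean, sigma):
        self.name, self.mean, self.sigma = name, mean, sigma

    def log_like(self, params):
        return -0.5 * ((params[self.name] - self.mean) / self.sigma) ** 2

def total_log_like(likelihoods, params):
    # For independent experiments the log-likelihoods simply add,
    # which is what makes combining data sources a matter of listing them.
    return sum(lk.log_like(params) for lk in likelihoods)

experiments = [
    GaussianPrior("omega_m", mean=0.31, sigma=0.02),  # made-up constraints
    GaussianPrior("h", mean=0.67, sigma=0.01),
]
logl = total_log_like(experiments, {"omega_m": 0.31, "h": 0.67})
```

The sampler only ever calls `total_log_like`, so adding a new experiment, or swapping the code that computes the theory, does not touch the sampling logic.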
of modularity of Monte Python. That is the part here: for example, it is integrated here with CLASS, which is the cosmological code, written in C, that solves the evolution of the background; the data come in from different data sources; and then there are the different samplers, here on the right-hand side. There is also one thing that was added
recently. If you go, for example, to the link on the top, you can also see part of the CLASS GitHub repository translated into IPython notebooks, and you can play with it; it includes examples, and a repository with previous results. What is not yet optimal in Python is the speed with which we can produce predictions, which is what we need for the forecasts. Some parts of the previous code were parallelised, so that one can run lots of different chains simultaneously and also run everything on a grid, which lets you investigate a large fraction of parameter space very quickly. This is not yet fully integrated into Monte Python: Monte Python uses MPI for now, but of course it would be much more useful to have something like OpenMP as well. So in fact, if you have any ideas on how we can improve that, we need input from the Python community,
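Running several independent chains at once, which the talk describes doing over MPI, can be sketched with the standard library alone. This stand-in uses a thread pool for brevity; for a CPU-bound likelihood you would swap in a `ProcessPoolExecutor` or mpi4py, since threads share the GIL:

```python
from concurrent.futures import ThreadPoolExecutor
import random

def run_chain(seed, n_steps=1000):
    """Toy stand-in for one MCMC chain: a seeded random walk.

    Giving each chain its own seed keeps the chains independent,
    which is what multi-chain convergence diagnostics rely on.
    """
    rng = random.Random(seed)
    x, samples = 0.0, []
    for _ in range(n_steps):
        x += rng.gauss(0.0, 0.1)
        samples.append(x)
    return samples

# Launch four chains concurrently and collect their samples.
with ThreadPoolExecutor(max_workers=4) as pool:
    chains = list(pool.map(run_chain, range(4)))
```

The same `pool.map` shape carries over to a process pool or to scattering seeds across MPI ranks; only the executor changes.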
and Python is also used a lot in all the codes for the analysis of the chains and for plotting the posterior credible regions, basically the regions in parameter space that identify how big each parameter can be. That is one example, and this is another
example of the plots that we usually look at, produced with Python, also combining different experiments. This, for example, is one of the results that I obtained when comparing the data from Planck with general relativity. The dashed line here corresponds to the model represented by standard general relativity, and you see that Planck alone, the blue contours, still roughly agrees with general relativity. There is, however, some tension when you combine Planck, information from the early universe, with information from the late-time universe, with other probes from surveys of galaxies: you basically just have to look at the red contours, which combine different data sources from different experiments, information from the early-time and the late-time universe, and their combination seems to prefer modified gravity with respect to general relativity. Of course, this is only at the 3.5 sigma level, so it is not what you would call a detection, but it is something of much interest that we will be able to test with the future generation of experiments, which have a much higher resolution. In
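Once the chains exist, extracting a credible region for a parameter reduces to taking quantiles of the samples. A minimal numpy sketch, using fake Gaussian samples in place of a real chain:

```python
import numpy as np

# Fake posterior samples for one parameter, standing in for an MCMC chain.
rng = np.random.default_rng(7)
samples = rng.normal(loc=0.3, scale=0.02, size=50000)

def credible_interval(samples, level=0.68):
    """Equal-tailed credible interval containing `level` of the samples."""
    tail = (1.0 - level) / 2.0
    return np.quantile(samples, [tail, 1.0 - tail])

lo68, hi68 = credible_interval(samples, 0.68)
lo95, hi95 = credible_interval(samples, 0.95)
```

The 2-D contour plots shown in the talk are the same idea in two parameters at once: density levels of the joint samples enclosing 68 and 95 per cent of the probability.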
addition, we can produce maps like this one: a full-sky map of the polarized emission from the dust of the Milky Way. It looks like some impressionist portrait, but it is really the polarization of the light emitted by the dust of the Milky Way. And that is important because, in a way, it is a foreground. The point is that the gravitational waves that we heard about yesterday can also have an impact on the polarization of the CMB, on the polarization of the light that gives us a picture of the early universe. This indirect detection of gravitational waves through the CMB, through this microwave radiation, has not happened yet; we have not seen it so far. But for the first time we have a full-sky map of the foreground sources that can mimic the same kind of signal, and in the next months there will also be new data from the ground, from other experiments looking at the polarization of the CMB, trying to detect gravitational waves in this way too. So there
is really a revolution coming in the next 5 to 10 years in the field of the dark universe. It is a huge challenge, a technological challenge and a big data challenge: we will have terabytes of data coming in per day for the current experiments, and per minute for the future generation of radio telescopes. There has already been a lot of investment from national funding agencies, from ESA and NASA, to understand this problem, and again it is a big data challenge, and we want to join different communities to get the best scientific
return. So it is not just about one person working alone somewhere; it is about joining expertise, because this will actually, in a way, determine what
happens at the biggest scale, the future of our universe: whether everything will be destroyed, whether we will expand forever, or whether we will just collapse again due to gravity. And this depends on how much dark energy there is in the universe. So overall we really want
to be sure that we look at the big picture and that we join expertise coming from different fields to understand what we are actually observing. Thank you.

Question: Excellent talk, really exciting. The image of the background radiation is not uniform noise; to me it looks like clouds in the sky, like a fractal. Does that mean anything?

Answer: That is a very good question, and indeed there might be something more to it. If you look in different directions, the radiation is mainly isotropic and homogeneous, in the sense that the mean temperature is everywhere about 3 kelvin. But what we are really interested in is exactly those tiny differences in temperature with respect to the mean, and these tiny differences are due to the fact that in the very early universe there were very tiny density perturbations, very tiny differences in density across space. It is really about the initial conditions of the universe. Just after the Big Bang there was a phase of very fast expansion, called inflation, in which very tiny differences in density, somewhat more in some places and less in others, were stretched to macroscopic scale, and this is reflected in the temperature of the radiation that was emitted at that time. So what we really see here is a picture of the initial conditions of the universe, a picture of the universe as it was 13 billion years ago; that is the farthest back we can go. And it was indeed surprising, because one would expect things to be uniform unless we had additional information; in some sciences, for example, we use
priors of uniformity or regularity when we don't know better. Question: So this says that people are working to find out why there were differences in the matter density, which is surprising to me. Answer: Yes; usually when you solve the equations for the universe you assume homogeneity and isotropy, and then you treat these as linear perturbations around the mean homogeneous and isotropic background. I think that is pretty amazing. Question: You mentioned before that you are looking for ways to accelerate some of the computational work. Have you tried Numba, for instance, or Cython? Answer: Cython is really used, for example to wrap, within Monte Python, the codes which actually solve the evolution of the universe, so that is already used a lot. The main problem is that the region of parameter space is really huge, especially if you want to test models beyond general relativity, which are absolutely allowed by the data, so it is really the process of sampling that I think should become faster. Moderator: If there are no more questions, thank you very much.


Formal Metadata

Title Unveiling the Universe with python
Series Title EuroPython 2016
Part 87
Number of Parts 169
Author Pettorino, Valeria
License CC Attribution - NonCommercial - ShareAlike 3.0 Unported:
You may use, modify, copy, distribute, and make the work or its content publicly available in unaltered or altered form for any legal, non-commercial purpose, provided that you credit the author/rights holder in the manner they specify and pass on the work or content, including in altered form, only under the terms of this license
DOI 10.5446/21233
Publisher EuroPython
Publication Year 2016
Language English

Content Metadata

Subject Area Computer Science
Abstract Valeria Pettorino - Unveiling the Universe with python I will describe a scientific application of python in the field of Astrophysics and Cosmology. How the publicly available package Monte Python is used to compare data from space satellite missions with theoretical models that attempt to describe the evolution and content of the Universe. The result is surprising, as it points towards a Universe which is mainly dark. ----- Python is widely used in Cosmology, which is the study of the Universe and all forms of energy in it. A large amount of data has been recently obtained through space satellite missions, such as Planck, financed by ESA/NASA. Planck has observed the radiation emitted about 13 billion years ago (the Cosmic Microwave Background, CMB), which gives us information on the content and space-time geometry of the Universe. Many competitive theoretical models have been proposed that aim at describing the evolution of the species contained in the Universe: therefore, cosmologists need a method to identify which theoretical model better fits the data. In order to compare data with theoretical predictions, cosmologists use Bayesian statistics and Monte Carlo simulations. Among the tools developed for the analysis, the package ‘Monte Python’ is publicly available and uses python to perform Monte Carlo simulations: this allows to determine the theoretical model that maximizes the likelihood to obtain the observed data. Such model is now the standard cosmological model and reveals a Universe that is very different from what scientists had ever expected. A Universe in which the atoms we are made of, constitute only 5% of the total energy budget. The rest is the so-called ‘Dark Universe’. I will illustrate the story of how cosmologists used python to analyse the data of the CMB and unveil the Dark Universe.
