Handling GPS Data with Python
Automated Media Analysis
The TIB AV-Portal applies the following automatic video analyses:
Scene recognition – shot boundary detection segments the video based on image features. A visual table of contents generated from this gives a quick overview of the video's content and provides targeted access.
Text recognition – intelligent character recognition captures, indexes, and makes written language (for example, text on slides) searchable.
Speech recognition – speech-to-text captures the spoken language in the video as a searchable transcript.
Image recognition – visual concept detection indexes the moving image with subject-specific and interdisciplinary visual concepts (for example, landscape, facade detail, technical drawing, computer animation, or lecture).
Keyword assignment – named entity recognition describes the individual video segments with semantically linked subject terms. Synonyms and narrower terms of entered search terms can thus be searched automatically as well, which broadens the result set.
Speech Transcript
00:03
Thank you. Welcome to my talk, Handling GPS Data with Python. My name is Florian, and I am a data scientist at inovex, a technology consulting company based in Germany with offices in all the major cities. I once worked on a project where the task was to relate GPS data to the wear of the brake pads of trucks: how much does the driving style affect them? That was the first time I had to deal with GPS data, because what we got from those trucks was GPS data plus the current condition of the brake pads. When I tried to find out which libraries could be used to fulfil this task, I found it quite hard, actually. Normally, whatever library you are searching for, you get a lot of good tutorials, but for GPS it was surprisingly difficult, and that is the reason for this talk: I want to share some of the things I found out during this task. Besides that, I am a mathematician and interested in the underlying algorithms, and it is always good to have a talk ready when you go to a conference like EuroPython. First of all, when you start to deal with GPS data, it needs to be stored somehow. The typical format for this is the so-called GPX format, the GPS Exchange Format, which is based on XML. It describes three different things: waypoints, routes and tracks. A route is the way you actually want to go, for instance when you plan a hike, while the track is the path you actually took, because maybe there was some kind of mountain in between. That is basically what this
format describes. To give you a little idea of the general structure: we have the waypoints, as shown before; we have the route, which is just a list of route points; and we have the track. The track is composed of several segments. You get a new segment, for instance, when you lose the GPS signal or when you hit pause on your device, and each segment contains the many track points which build up the track you actually took. For this presentation I will of course not use the customer data, but data from my own sports watch. It contains only a track, because if you go for a run you normally don't plan a route, you just go for a run. For each track point we have the latitude, the longitude, the elevation and, of course, the time when the measurement was taken. So how do we deal with those kinds of files? There is a library called gpxpy. It is a file parser for reading, and you can even write GPX files with it. It is licensed under the Apache 2.
0 license, and it contains a nice small command-line tool, gpxinfo, that prints some basic facts about a file; if you are just interested in, say, the overall velocity, you can simply use that. gpxpy, always written with small letters, is by Tomo Krajina. So how do you actually use it? It is really easy: you import gpxpy, open a file handle, and just pass it to gpxpy.parse, and basically that's it. What you get back is an object, and you can use its tracks attribute to access the different tracks; in this case we have only one track, which also has only one segment. Then, normally, you want to do something with this information, so from the track points I build a pandas DataFrame, set the time as the index, and that is our data frame. So far, so simple.
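In code, reading a file with gpxpy is just `gpx = gpxpy.parse(open('run.gpx'))`, after which `gpx.tracks[0].segments[0].points` holds the track points. To make the structure of the format tangible, here is a minimal hand-rolled parse of the same hierarchy using only the standard library; the inline GPX snippet and its coordinates are made up for illustration:

```python
import xml.etree.ElementTree as ET

GPX = """<?xml version="1.0"?>
<gpx xmlns="http://www.topografix.com/GPX/1/1" version="1.1" creator="demo">
  <trk><trkseg>
    <trkpt lat="53.5511" lon="9.9937"><ele>5.2</ele><time>2016-07-20T09:00:00Z</time></trkpt>
    <trkpt lat="53.5513" lon="9.9940"><ele>5.4</ele><time>2016-07-20T09:00:01Z</time></trkpt>
  </trkseg></trk>
</gpx>"""

NS = {"gpx": "http://www.topografix.com/GPX/1/1"}

def parse_track_points(xml_text):
    """Return a list of (lat, lon, ele, time) tuples from a GPX document."""
    root = ET.fromstring(xml_text)
    points = []
    # track -> segment -> track points, mirroring gpx.tracks[0].segments[0].points
    for trkpt in root.findall(".//gpx:trkseg/gpx:trkpt", NS):
        lat = float(trkpt.get("lat"))
        lon = float(trkpt.get("lon"))
        ele = float(trkpt.find("gpx:ele", NS).text)
        time = trkpt.find("gpx:time", NS).text
        points.append((lat, lon, ele, time))
    return points

points = parse_track_points(GPX)
```

gpxpy returns rich objects with attributes like `point.latitude` and `point.time` instead of plain tuples, which is what makes building the pandas DataFrame a one-liner.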
05:25
Of course, since GPS is something inherently visual, we want to plot it. To give you an idea, this was one of my runs in Hamburg: I just plotted the longitude and latitude with the help of matplotlib, and this is the track, just a line. What happens if we plot not only the line but also the actual points? We see that we have a lot of points, more than 10,000 actually. At this point you might think: 10,000 points for a small run is way too much. So how can we reduce the number of points, so that we deal with less data, without actually destroying the geometry of the track? A really simple trick
06:24
is to use only every 150th point, and in this case it works, because I was running at more or less the same speed all the time; it would not work if I had done this in a car. We also see that on straight lines, like here, you need fewer points to describe the line, actually only two, while in a curve you need more information to describe the shape. So we need a smarter algorithm to simplify a GPX track. One algorithm that is used quite often for this is the so-called Ramer-Douglas-Peucker algorithm, and I think it is best explained by showing how it works. Let's assume we have this GPS track and we want to simplify it. What does simplification mean? We allow some kind of error, an epsilon: we accept that kind of error, but not more than that epsilon.
07:32
The algorithm starts by connecting the first point and the last point with a straight line, and the epsilon spans an environment around that straight line. If all points in between lie inside this epsilon environment, then we can simply drop them. If that is not the case, we take the point which is the furthest away from the line and apply the same algorithm recursively
08:07
to the two segments we get by splitting at that point: one segment up to there, and the other one from there on. So we start by inspecting the first part; we see that all points are inside the epsilon environment, so we can just remove them. And now we do the
08:28
same on the other side. Here not all points are included, so we take the one that is furthest away and split there: first from here to there, then from there to there. The first part we can reduce, then we remove that point, here we have the same situation again, and so the algorithm works its way down recursively. We see that we have simplified the track and reduced the number of points, and if we flip back and forth between before and after, we see it even better. This idea is used quite often to simplify GPS tracks, and we will make use of it again later. In this case I just used this algorithm, with one word of warning: never use recursion in Python, you will always run into problems. If your recursion depth gets to more than about 1,000, you will get into trouble, so it is better to reformulate the algorithm iteratively, which makes it more complicated, but that is what I did here. I used an iterative implementation of Ramer-Douglas-Peucker, ran it on the longitude and latitude, and reduced the number of points from more than 12,000 down to less than 200. Here is the outcome: as expected, straight lines really are straight lines with few points, and in a curve, for instance, more points are used. That is it about simplification.
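The recursion-free variant just described can be sketched as follows: an explicit stack replaces the call stack, so Python's default recursion limit of about 1,000 frames is never an issue. This is my own illustrative implementation, not the code of any particular package:

```python
import math

def rdp_iterative(points, epsilon):
    """Iterative Ramer-Douglas-Peucker: keep only points that deviate
    more than `epsilon` from the line between their segment's endpoints."""
    def point_line_dist(p, a, b):
        # perpendicular distance of point p from the line through a and b
        if a == b:
            return math.dist(p, a)
        (ax, ay), (bx, by), (px, py) = a, b, p
        num = abs((bx - ax) * (ay - py) - (ax - px) * (by - ay))
        return num / math.dist(a, b)

    keep = [False] * len(points)
    keep[0] = keep[-1] = True
    # explicit stack of (start, end) index pairs instead of recursion
    stack = [(0, len(points) - 1)]
    while stack:
        start, end = stack.pop()
        # find the point furthest from the chord start--end
        dmax, index = 0.0, start
        for i in range(start + 1, end):
            d = point_line_dist(points[i], points[start], points[end])
            if d > dmax:
                dmax, index = d, i
        if dmax > epsilon:
            keep[index] = True        # this point must survive
            stack.append((start, index))
            stack.append((index, end))
    return [p for k, p in zip(keep, points) if k]
```

Collinear runs collapse to their two endpoints, while corner points are kept, which is exactly the "few points on straights, more in curves" behaviour seen in the plot.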
When I was doing the project with the trucks, I had only longitude and latitude, but of course I was interested in finding out the uphill and downhill distances the drivers were actually covering. To recreate that situation, I took a track from a bicycle ride I did in Italy; it does have elevation, but I will now just remove it, so that it is like it was in the project. For exactly this case there is a really cool extension to gpxpy called srtm.py. SRTM stands for Shuttle Radar Topography Mission; maybe some of you remember that in the year 2000 there was this Space Shuttle mission that used radar to determine the elevation almost everywhere on Earth, and this data is publicly available. srtm.py is just the Python interface for it, and you can use it really easily: you import it, get a data object, and tell it to add elevations to the parsed GPX object.
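The srtm.py calls themselves are roughly `elevation_data = srtm.get_data()` followed by `elevation_data.add_elevations(gpx, smooth=True)`. The smoothing that `smooth=True` performs, averaging each elevation with its neighbours, can be sketched like this; this is my own illustration of the idea, not srtm.py's actual code:

```python
def smooth_elevations(elevations, window=3):
    """Average each elevation with its neighbours to damp 1 m sensor jumps."""
    half = window // 2
    smoothed = []
    for i in range(len(elevations)):
        # clamp the window at the ends of the track
        lo, hi = max(0, i - half), min(len(elevations), i + half + 1)
        smoothed.append(sum(elevations[lo:hi]) / (hi - lo))
    return smoothed
```

A flat track stays flat, while alternating one-metre jumps are pulled towards their mean.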
11:54
What this call will do is start downloading the tile files containing the data and add the elevations to our GPX object. Additionally, we can ask it to smooth the elevation, in which case neighbouring elevations are averaged. If I compare the elevation that my watch measured with the SRTM data, we see that it is basically almost the same, so this works really well if you don't have elevation data but somehow need it. Of course, in a project you eventually want to present something to the customer, and the customer always likes nice pictures, so I was also facing the problem of how to visualize the data in a nice, customer-friendly way. I found a cool library called mplleaflet, and the nice thing about it is that it can take just about any matplotlib plot and put it into a Leaflet map, one of those map web pages where you have OpenStreetMap or Google tiles directly embedded. It is really easy to use, has a new-BSD license, and it integrates fantastically well with the Jupyter notebook. A word of caution: you should definitely simplify your tracks with Ramer-Douglas-Peucker first, because if you start using it with 10,000 points, it will not work. So how does it work? It is really simple: we start with the matplotlib plot we have seen before, and now we just project it onto OpenStreetMap: import mplleaflet and display the figure we created before, and it will be embedded into the notebook output. If we had said show instead, it would pop up a window, fully interactive, where you can zoom and pan. It is two lines of code to actually embed this somewhere, and this is why I really like it: it is just so easy. There is another track I want to talk about a little bit
14:37
more now. This is the bicycle ride I did in Austria and Italy, and when I was looking at the data of that track, I found some curious things. If I call the get_uphill_downhill method of the gpxpy track, it tells me that I actually climbed about 4,400 height metres, and about the same downhill. I was wondering: that sounds like a lot of height metres, and indeed the organizers said on their website that the route has roughly 2,700 height metres, so somehow my sensor must have measured something strange, or maybe I was doing something wrong. So I started investigating, and when I looked closely at the elevation values, I realized that there are bumps, and these bumps are quite unrealistic. We directly see that the elevation sensor measures with an accuracy, or rather a precision, of one metre, and that it sometimes just jumps up and down. If it does that a lot and you sum up the uphill distances, you collect hundreds of metres that are not real. So we have to smooth the data somehow, because this is an artifact of the sensor, not the actual position I was at at that time. Also, when I was looking at the speed, it felt quite strange: sometimes I was doing 230 kilometres per hour, which, on a bicycle... I mean, I was going downhill sometimes, but even downhill that is way too much. Here I was using gpxpy's get_speed function, which basically takes the times and the distances and calculates the speed, so it shouldn't be a mathematical problem; the problem is the noisy data, and we directly see that this speed is extremely unrealistic. But why do we know that it is unrealistic? Because we have a model in our heads: a bike has a position and some kind of velocity, and if you know the velocity
and the current position, you know roughly what the next position will be, based on the current position and the speed. If I see someone walking, I know that in the next second they are going to be over there, because I can see the direction they are going. So we have this kind of model, and this is exactly what the Kalman filter is all about. Rudolf Kálmán, by the way, passed away just a few weeks ago, a really beautiful mind. He came up with the idea of giving a model to all kinds of physical processes, and in this case we want to describe my bicycle ride. The idea is that there are states that you cannot directly observe, the real states: for my bicycle I can think of the state as me on my bicycle having a position and some velocity. The state transition projects one state into the next: if I have, as I said before, a position and a velocity, I can just take the velocity, multiply it by the time step, add it to the position, and I get a rough idea of the next position. Kálmán was a mathematician, so he made the formulation as general as possible: there is also a control input, which we do not need in this case, but if you look, for instance, at a falling ball, the acceleration due to gravity would enter there. And of course you always have an error term; when I am walking and slowly changing my direction, that is part of this process noise. Besides the state equation, we also have the measurement equation, and this is where the GPS idea comes into play: the state is something we cannot observe directly, what we observe is the measurement, and the measurement is generated from the state plus an additional error.
That is basically the main idea. On the one hand I have a model of how I think some physical process behaves, like my bicycle ride; on the other hand I get measurements, and those will never be exactly the same, so I need a method to bring them both together, and that method is the Kalman filter. It takes everything I know about the process up to a certain point, all prior measurements; this is called the a-priori state estimate. In the next step I get a measurement, and I want an a-posteriori state estimate which is better than the a-priori knowledge I had before. I get it by taking my a-priori knowledge plus a correction from the measurement: the difference between the actual measurement and the measurement I would expect given my a-priori state, multiplied by K, the so-called Kalman gain. The Kalman gain is optimal in the sense that it minimizes the error of the estimate. Once we have this, we can predict and correct, and do so iteratively with each new measurement; that is the basic idea of the Kalman filter.
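Written out in the standard notation (x̂ the state estimate, P its covariance, A the state transition, B the control input, H the measurement matrix, Q and R the process and measurement noise covariances), the predict-correct cycle just described reads:

```latex
\begin{aligned}
\text{Predict:}\quad \hat{x}_k^- &= A\,\hat{x}_{k-1} + B\,u_k, &\qquad
P_k^- &= A\,P_{k-1}A^\top + Q,\\
\text{Update:}\quad K_k &= P_k^- H^\top \bigl(H P_k^- H^\top + R\bigr)^{-1},\\
\hat{x}_k &= \hat{x}_k^- + K_k\bigl(z_k - H\,\hat{x}_k^-\bigr), &\qquad
P_k &= (I - K_k H)\,P_k^-.
\end{aligned}
```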
21:54
So what happens is: we make predictions using our state transition equation, our model of the physical process, and we also propagate the error covariance. Then the measurement comes in, we calculate the optimal Kalman gain (the formula looks complicated, but you can just ignore it), we do this optimal averaging of a-priori knowledge and measurement, and we get an update; then we go back and do the same for the next time step. For instance, when you are looking at your position in Google Maps and it loses the GPS signal, you see a circle around the point which starts to grow; that is exactly the prediction part. If you lose the GPS signal for five or six seconds, the uncertainty keeps increasing, and when measurements come in again, the circle shrinks. You can see this directly in many GPS applications. In our concrete case the state equation is simple: the next position is the current position plus the velocity times the time step, plus some noise; and the next velocity is the current velocity plus some error term, which means I am not accelerating, or at least not accelerating fast; it should be a smooth process. Writing this in vector notation just gives a small matrix. The sampling rate of my watch is one second, so the time step is just one second. The measurement equation is also really easy: with GPS we measure only the position, and only implicitly do we know something about the velocity. That means our state, which includes the position and the velocity, gets mapped to a measurement of only the position, plus some noise. So this
is a really easy measurement equation. We also know something about the precision error of GPS in this case: you can look it up, it is something like 10 to 30 metres, which relates to roughly 10^-4 in longitude and latitude degrees. For the elevation I assumed an error of 100 metres; never trust the elevation in a GPX file, it really is extremely imprecise. These values go into the observation covariance. So now we know everything about the model, and I hope you got a good understanding of the basic idea of the Kalman filter. As a small summary: you have a model, some equations, and some measurements, and you optimally average the two to get a better idea of the current state. A really cool library for this is pykalman. It is a Kalman filter, smoother, and expectation-maximization library; it is simple to use, really powerful, and comes with many examples and good documentation. Besides linear transition matrices, you can also define nonlinear state models. It is licensed under a BSD license and written by Daniel Duckworth. Now we are going to use it to actually fix the curiosities in the data, but before that, of course, we have to do some data wrangling. So again we start out with our DataFrame, and when I looked at the data in detail, I realized that it is not really a one-second sampling rate: the differences are sometimes less than a second, not always exactly one second. For this discrete Kalman filter it is really important that the time interval is uniform, so what I do here is round each timestamp to the next full second. But what about signal loss? Maybe I lost the GPS signal during my bike ride; how can we check that? Easily: we can just use the numpy function diff
on the time index: whenever the difference between two entries is not one second, we know that there was some kind of signal loss, and we see that we lost the signal three times. We need to fix this, because we need a uniform time interval, and we can fix it with the pandas resample functionality: we resample with one second and get some additional rows. They have no values, of course, because during the signal loss nothing was measured, but we need them, and the Kalman filter is going to fill in these values later. Since pykalman works on numpy arrays, we take our DataFrame with longitude, latitude and elevation and extract the values as a numpy array. But we don't use a plain numpy array, we use a masked array, because pykalman expects the missing measurements to be masked; that is how it knows they are missing, and that is exactly what I am doing here: the NaN values are masked directly. Additionally, I looked at the signal-loss points on the map to get an idea of where the signal was lost (here it was a tunnel, for instance), just to double-check. So now
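On a toy track, the wrangling steps just described (gap detection with numpy's diff, resampling to a uniform one-second grid with pandas, and masking the missing rows for pykalman) look like this; the coordinates are invented for illustration:

```python
import numpy as np
import pandas as pd

# toy track: per-second samples with a two-second GPS dropout at 09:00:02/03
idx = pd.to_datetime([
    "2016-07-20 09:00:00", "2016-07-20 09:00:01", "2016-07-20 09:00:04",
])
df = pd.DataFrame(
    {"lon": [9.9937, 9.9940, 9.9949], "lat": [53.5511, 53.5513, 53.5519]},
    index=idx,
)

# detect gaps: consecutive index differences that are not one second
gaps = np.diff(df.index.values) != np.timedelta64(1, "s")

# enforce a uniform 1 s grid; the dropout seconds appear as NaN rows
uniform = df.resample("1s").mean()

# pykalman marks missing measurements via a masked array
measurements = np.ma.masked_invalid(uniform[["lon", "lat"]].values)
```

`gaps.sum()` counts the signal losses, and the masked rows are exactly the ones the filter will later fill in by prediction.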
29:14
we are basically all set and ready to use the Kalman filter. We have our state transition matrix, the one we have seen before with a time step of one second; we have our observation matrix; and we have our observation covariance with the errors we just discussed. We also need an initial state, and as the initial condition I just take the first measurement. We give all of that to the KalmanFilter from the pykalman package. What we don't know is the transition covariance of the system: how fast do I, for instance, change my direction on the bike? That is something we cannot easily write down, but we can estimate it, and this is what I am saying here: the expectation-maximization variables, em_vars, should be the transition covariance. Then we fit it by calling em on the measurements with a number of iterations; I used 1,000 iterations just to be sure it really converged, and that takes a very long time, a few hours. Now that our model is fitted, we can use it to smooth all the measurements, and in return we get a mean estimation of the states I really was in at each point; we are going from the measurements to the hidden states I described before. Here the smoothed states are plotted on the right side and the measurements on the left, and we see that it actually looks much, much smoother: we got rid of all the bumps, and this looks fine. Of course we can also write the smoothed track back, which basically means iterating over all the segment points and writing the values back. Then we call get_uphill_downhill again, and we end up with 2,677 height metres, which is roughly the 2,700 height metres that the organizers stated on their web page.
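With pykalman the whole pipeline is essentially `kf = KalmanFilter(...)`, `kf = kf.em(measurements, n_iter=1000)`, `kf.smooth(measurements)`. To show what happens underneath, here is a plain-NumPy sketch of the same constant-velocity model in one dimension, including the prediction-only handling of lost-signal samples; the matrices, noise values and toy track are my own illustration, not the talk's actual parameters:

```python
import numpy as np

dt = 1.0  # the watch samples once per second
F = np.array([[1.0, dt], [0.0, 1.0]])   # next pos = pos + v*dt; next v = v
H = np.array([[1.0, 0.0]])              # we only measure the position
Q = np.eye(2) * 1e-4                    # process noise (pykalman estimates this via EM)
R = np.array([[1.0]])                   # measurement noise variance

def kalman_filter(zs):
    """Forward-filter a 1-D position series; np.nan marks lost signal."""
    x = np.array([zs[0], 0.0])  # initial state: first measurement, zero velocity
    P = np.eye(2)
    states = []
    for z in zs:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update only when we actually have a measurement
        if not np.isnan(z):
            K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
            x = x + K @ (np.array([z]) - H @ x)
            P = (np.eye(2) - K @ H) @ P
        states.append(x.copy())
    return np.array(states)

# a straight ride at 1 m/s with a bumpy sensor and a 3 s signal loss
t = np.arange(30.0)
zs = t + 0.4 * np.sin(3.0 * t)
zs[10:13] = np.nan
est = kalman_filter(zs)
```

During the NaN gap the filter keeps extrapolating with its last velocity estimate, which is exactly the growing-uncertainty behaviour of the prediction step.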
So we have finally, properly used the Kalman filter, without coming up with ad-hoc smoothing tricks. When I talked to other people, they said: why didn't you just take a window of points and average them somehow? But then you have a lot of parameters you need to fit, whereas with the Kalman filter I really only describe the physical process, the relation between velocity and position, and it turned out to be justified. Doing the same for the speed was a little bit more involved, since I had to be careful with the points where the signal was lost. The problem is that most GPS sensors already have some kind of Kalman filter inside. That means: if you lose the GPS signal, for instance when you ride into a tunnel, and you start getting your first measurements again at the end of the tunnel, the sensor at first still thinks you are in front of the tunnel, because the measurements are way off from where it expects you to be. Then it starts chasing the real measurements, realizes that they really are somewhere else, and you get jumps that are really, really fast. That is what happened here, so I had to delete a few points and use the Kalman filter to fill in more realistic ones. In the end it turned out really well: according to this, I was riding at most about 77 kilometres per hour, which is still fast, but a lot of it was downhill, and my bike computer said something about 70 as the maximum speed, so it is still a little bit off. What is also cool about the Kalman filter: you could even use additional data, for instance from a power meter on your bike, and merge the measurements from the different sensors to get an even more precise estimation of your state. That is beyond this talk, but it is absolutely possible.
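The speed computation itself boils down to distance over time difference for consecutive points. A self-contained version using the standard haversine great-circle distance (my own sketch, not gpxpy's exact code) makes it easy to spot unrealistic spikes like the 230 km/h above:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    R = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def speeds_kmh(points):
    """Speed between consecutive (lat, lon, unix_time) samples in km/h."""
    out = []
    for (lat1, lon1, t1), (lat2, lon2, t2) in zip(points, points[1:]):
        dist = haversine_m(lat1, lon1, lat2, lon2)
        out.append(dist / (t2 - t1) * 3.6)  # m/s -> km/h
    return out
```

On real data you would run this on the smoothed positions; on the raw positions it produces exactly the spikes discussed.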
All right, I want to give a short summary of all the libraries we have seen: gpxpy for reading and writing GPX files; srtm.py for imputing missing values like the elevation; mplleaflet for visualizing the tracks, which internally uses the Leaflet map library; for the Ramer-Douglas-Peucker algorithm there is actually an implementation, rdp, which you can just pip install, but be careful with it, it is programmed recursively; for my tracks it just worked. And then there is pykalman. I have also put the Jupyter notebooks that I created during the preparation of this talk online; you can find them, play around with them, and look at the individual parts to get the full picture. And of course, credit where credit is due: the Ramer-Douglas-Peucker animation and the elevation track profile are from Wikipedia; the first slide was made in GNOME Maps, which, for all the Linux users, is a really nice OpenStreetMap tool if you haven't used it; and the prediction-correction diagram is from a blog post. If you are interested in reading more about Kalman filters, there is a really good book, Kalman and Bayesian Filters in Python by Roger Labbe; he is also using Jupyter notebooks to show how Kalman filters behave. You can find this talk online; thanks for your attention, and I am ready for questions. [Question] Great talk. I was wondering whether the standard GIS libraries don't supply functionality like this;
do you know anything about libraries like GDAL, which are pretty standard GIS libraries? [Answer] I haven't actually checked those. They do have Python interfaces, but I don't know whether they also supply GPS-specific functionality; no idea, honestly. When I started this research, the most important point for me was that I could really easily access everything from Python, and I don't think you can use those libraries as easily from Python. [Question] What would you do if you had to deal with a proprietary binary format? Some devices just export their own format, and there may be a library for it, but it is unmaintained, two years old. Would you use something like GPSBabel to convert it? What would be your approach? [Answer] I don't know that particular format, so I am not sure, but the vendors' products usually allow you to upload and download everything as GPX, and in this case I would look for a converter. If it is a binary format instead of a human-readable format like GPX, it is of course more memory-efficient; with really long tracks a human-readable format can become a problem, but there is probably a converter for it. [Audience] I guess those files can be imported into the Garmin website and exported as GPX. It is a bit of a workaround, but it gives you the actual points. [Answer] OK, so it should be possible to find software that transforms it into GPX quite easily. [Question] Often a GPS device also reports a value for the accuracy of each point. I don't know whether that was available
in this system, but would it be possible to also use that data to make the estimation more accurate, if you know the accuracy of the GPS data? [Answer] That should definitely be possible. If you have, for each point, an estimate of the error, you can use that additional information, for instance in the measurement covariance, and use the reported error to correct accordingly; you could maybe even carry the error as an additional state variable. I would have to think about the details, but it should definitely be possible. Basically, with the help of the Kalman filter, whatever kinds of sensors you have, with different error variances, and even with measurements of those variances, you can merge them together to get a better estimation. [Moderator] Thank you again for the talk.
00:00
Velocity, Domain (network), Website, Subtraction, Bit, Point, Frame problem, Shared memory, Polar, Automated planning, Parser, Writing (data processing), NP-hard problem, Mathematics, Information, Combinatorial group theory, Computer animation, Task, Algorithm, Statistical analysis, Software library, Pole (mathematics), Root (mathematics), Data structure, Influencing factor, Goodness of fit, Profile (aerodynamics), Mailing list, Linker (computing), Electronic publication, Object (category theory), Right angle, Automatic indexing, Condition number, File format, Data flow, Computer-assisted translation, Order (mathematics), Diagonal (geometry), Instantiation, Reading (data processing), State of matter
05:23
Algorithm, Point, Number range, Spatial arrangement, Computer animation, Arithmetic mean, Drive (data storage), Information, Curve fitting, Bit rate, Help system, Line, Error message, Limit computation
07:30
Algorithm, Point, Heegaard splitting, Formal language, Word (computing), Programming environment, Line, Computer animation, Limit computation
08:27
Bit, Weighted sum, Point, Display terminal, Parser, Implementation, Number range, Measure extension, Web page, Code, Computer animation, Homepage, Algorithm, Interactive television, Notebook computer, Data type, Window (computing), Software library, Plot (graphics), Distance, Figurate number, Cut (graph theory), Line, Influencing factor, Attribute grammar, Interface, Electronic publication, Quicksort, Mapping (computer graphics), Generator (computing), Right angle, Drive (data storage), Word (computing), Projective plane, Surjectivity, Recursive function, Shape (computing), Instantiation
14:36
Covariance function, Insertion loss, Extreme point, Signed measure, System of equations, State of matter, Computer animation, Direction, Number system, Client, Forecasting method, Metropolitan area network, Goodness of fit, Prognostics, Flow direction, Digital filter, Bit rate, Service (computing), Right angle, Input/output, Measurement process, Instantiation, Error message, Velocity, Subtraction, Control structure, Noise, Discrete group, Form (screen), Information modeling, Game theory, Data type, Software library, Arrow of time, Distance, Smoothing, Variance, Observability, Matrix ring, Data model, Index calculation, Electronic publication, Hidden-surface removal, Mean, Matrix calculus, Point, Process (physics), Minimization, Cartesian coordinates, Indistinguishability, Uniform structure, Meter, Plot (graphics), Influencing factor, Linear functional, Nonlinear operator, Permutation, Process (computing), Data management, Global optimization, Data field, Automatic indexing, Mathematician, Projective plane, Gravitation, Rounding, Frame problem, Position operator, Physicalism, Group action, Number range, Vector space, Speech synthesis, Term, Data record, Open problem, Estimation, Circular area, Sampling, Beta function, Software piracy, Two, Conversion (computing), Quicksort, Area, Loop, Mereology
29:11
Matrix calculus, Tournament (mathematics), Covariance function, Bit, Conversion (computing), Process (physics), Point, Extreme point, Signed measure, Iteration, Recursion, System of equations, Binary code, Computer animation, Transition, Direction, Meter, Plot (graphics), Analytic continuation, Influencing factor, Interface, Enthalpy of transformation, Linear functional, Abstraction level, Recurrent state, p-block, Source code, Biproduct, Arithmetic mean, Slide rule, Kalman filter, Data field, Genetic programming, Read-only memory, File format, Measurement process, Information, Software library, State of matter, Reading (data processing), Instantiation, Error message, Velocity, Subtraction, Website, Position operator, Self-organizing system, Group action, Number range, Implementation, Cellular automaton, Initial value problem, Web page, Term, Data record, Information modeling, Variable, Expected value, Software, Notebook computer, Pi (number), Exterior algebra of a module, Arrow of time, Smoothing, Distance, Optimization, Inclusion (mathematics), Entire function, Variance, Help system, Power (physics), Sound processing, Estimate, Algorithm, Theory of relativity, Physical system, Electronic publication, Quicksort, Design by Contract, Mapping (computer graphics), Area, Hypermedia, Mereology, Memory dump, Traffic information
Metadata
Formal metadata
Title: Handling GPS Data with Python
Series title: EuroPython 2016
Part: 145
Number of parts: 169
Author: Wilhelm, Florian
License: CC Attribution - NonCommercial - ShareAlike 3.0 Unported: You may use, modify, and reproduce the work or its contents in unaltered or altered form for any legal, non-commercial purpose, and distribute and make it publicly available, provided you credit the author/rights holder in the manner they specify and pass on the work or its contents, including in altered form, only under the terms of this license.
DOI: 10.5446/21141
Publisher: EuroPython
Publication year: 2016
Language: English
Content metadata
Subject area: Computer Science
Abstract: Florian Wilhelm - Handling GPS Data with Python. If you have ever had to deal with GPS data in Python, you may have felt a bit lost. This talk presents libraries, starting from basic reading and writing of GPS tracks in the GPS Exchange Format to adding missing elevation information. Visualisation of tracks on OpenStreetMap data with interactive plots in Jupyter notebooks is also covered, and common GPS algorithms like Douglas-Peucker and the Kalman filter are explained. There are many libraries at various states of maturity and scope, so finding a place to start and actually working with the GPS data might not be as easy and obvious as you might expect from other Python domains. Inspired by my own experiences of dealing with GPS data in Python, I want to give an overview of some useful libraries: from basic reading and writing of GPS tracks in the GPS Exchange Format with the help of gpxpy, to adding missing elevation information with srtm.py. Additionally, I will cover mapping and visualising tracks on OpenStreetMap with mplleaflet, which even supports interactive plots in a Jupyter notebook. Besides the tooling, I will also demonstrate and explain common algorithms like Douglas-Peucker to simplify a track and the famous Kalman filter for smoothing. For both algorithms I will give an intuition about how they work as well as their basic mathematical concepts. Especially the Kalman filter, which is used for all kinds of sensors, not only GPS, has the reputation of being hard to understand. Still, its concept is really easy and quite comprehensible, as I will also demonstrate by presenting an implementation in Python with the help of NumPy and SciPy. My presentation will make heavy use of the Jupyter notebook, which is a wonderful tool perfectly suited for experimenting and learning.
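The Douglas-Peucker simplification mentioned in the abstract can be written recursively in a few lines. The sketch below is not the implementation from the talk: it treats coordinates as planar (an approximation for GPS latitude/longitude), and the names `douglas_peucker` and `_perp_distance` are illustrative.

```python
import math

def _perp_distance(pt, start, end):
    """Perpendicular distance of pt from the line through start and end."""
    (x, y), (x1, y1), (x2, y2) = pt, start, end
    dx, dy = x2 - x1, y2 - y1
    if dx == dy == 0:                       # degenerate segment
        return math.hypot(x - x1, y - y1)
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / math.hypot(dx, dy)

def douglas_peucker(points, epsilon):
    """Drop points that deviate less than epsilon from the start-end chord."""
    if len(points) < 3:
        return list(points)
    # Find the point farthest from the chord between the endpoints.
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = _perp_distance(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax <= epsilon:                     # everything is close enough
        return [points[0], points[-1]]
    # Otherwise split at the farthest point and simplify both halves.
    left = douglas_peucker(points[:index + 1], epsilon)
    right = douglas_peucker(points[index:], epsilon)
    return left[:-1] + right                # avoid duplicating the split point

track = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
print(douglas_peucker(track, 1.0))
```

The endpoints are always kept, and points on nearly straight stretches are discarded, which is why the algorithm works so well for thinning out dense GPS tracks.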