
From global observations to local information: The Earth Observation Monitor


Speech Transcript
OK. Welcome to my presentation about the Earth Observation Monitor, "From global observations to local information". This is a talk in the academic track.
So, what are the main challenges when working with Earth observation data, and with data in general? We have increasing global data volumes and an increasing number of data users, so we need to think about how to simplify time-series data access and also analysis. And there is an increasing need to monitor environmental changes, because the environment is changing and we have the data to monitor these changes. Several years ago there was already research going on, for example about the next-generation Digital Earth around 2008, which stated the need for easy-to-use interfaces and a problem-oriented focus. So this is not new, but we still face many issues when accessing time-series data and running analyses: there are a lot of steps involved, for example searching for data, downloading it, and processing the datasets. Newer initiatives also state that we need to make research accessible to all kinds of parties and that we need to develop useful tools for applying knowledge. So the awareness is there, and it is time to develop tools that make Earth observation data better accessible; this is the content of this paper and this presentation.
The objectives are, in general: to simplify time-series data access and analysis without the user having to process any data themselves; to make software tools available as web services, because then we can integrate these services into any kind of application, such as a mobile application, a web-based tool, or scripting languages; and to provide easy-to-use applications for all kinds of users, local stakeholders and scientists alike. In general, this leads to my statement "from global observations to local information": we have global datasets, but the user is usually interested in some local area. So in this figure we have our user with a local area of interest, in the middle our global Earth observation data, and on the right some analysis outputs, the information extracted from the Earth observation data.
The technical background behind this is, of course, a service-based infrastructure providing web services. At the top we have our client applications, as I said before: a web portal,
a mobile application, or any other software tool, all connected to the Internet. At the bottom we have our geospatial services infrastructure providing web services, in the best case OGC-compliant services from the geospatial domain: catalogue services for metadata and discovery, visualization services such as the Web Map Service, access services such as the Web Coverage Service, and especially processing services using the OGC Web Processing Service (WPS) specification, which Benjamin also introduced in the previous presentation. These processing services can be implemented in different ways: when using a Python-based Web Processing Service such as pyWPS, you can expose almost anything as a WPS; with the rpy2 library you have a direct connection from Python to the statistical language R; and you can also wrap command-line tools or any other available software package as a Web Processing Service. So these processes become available to all kinds of users.
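As a rough illustration of that last point: wrapping a command-line tool behind a processing service boils down, on the server side, to running the tool with the request parameters and returning its output. This is only a minimal sketch of the idea, not the real pyWPS API, and the command below is a harmless stand-in rather than an actual analysis tool:

```python
import subprocess
import sys

def run_tool_as_process(args):
    """Run a command-line tool the way a WPS Execute handler would:
    pass the request parameters as arguments and capture stdout as
    the process output."""
    result = subprocess.run(args, capture_output=True, text=True, check=True)
    return result.stdout.strip()

# Harmless stand-in for a real analysis tool (e.g. an R script or a
# GDAL utility); here we just have the Python interpreter print a value.
output = run_tool_as_process(
    [sys.executable, "-c", "print('trend=-0.01')"])
```

In a real deployment a WPS framework would handle the request parsing and output encoding around exactly this kind of call.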
Let me talk a bit more about processing services. There is a list of research topics that are still ongoing: distributed processing, semantic processing, process orchestration, processing in the cloud, moving code, and model generation. I listed just one paper for each topic, but they are all still up to date, so research is still going on there. But that is not all that needs to be considered. We should also think about what we provide processing services for: we can provide processing services for data access and data discovery, because we can benefit from different processing steps there, and we need to link data integration and data analysis tools. The aim is to provide analysis tools as web services, but first the user needs to have the data available, so these two steps, data integration (data access) and the analysis tools, must be linked. The feasibility and applicability of processing services also needs further investigation: what kinds of tools could we provide as WPS services? In my domain we work a lot with vegetation time-series analysis. This is a plot of a single pixel showing a clearly negative trend from the year 2000 to the year 2015; it shows the Enhanced Vegetation Index (EVI) from MODIS, and there is a clear change around 2011, a clearly low vegetation value. This comes directly from the satellite, but it is just data, and what we want is information. There are a lot of analysis tools available to extract information out of this data.
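To make the kind of statistic behind such a trend plot concrete, here is a minimal sketch of an ordinary least-squares slope over a yearly vegetation-index series. The EVI values below are invented for illustration; a negative slope indicates declining vegetation:

```python
def ols_slope(values):
    """Ordinary least-squares slope of values against their index 0..n-1."""
    n = len(values)
    mean_x = (n - 1) / 2
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(values))
    var = sum((x - mean_x) ** 2 for x in range(n))
    return cov / var

# Invented yearly EVI values, 2000-2015, drifting downwards:
evi = [0.60 - 0.01 * year for year in range(16)]
slope = ols_slope(evi)  # negative slope: declining vegetation
```

Real trend tools for satellite time series do considerably more (seasonality removal, significance testing), but the underlying slope estimate is this simple.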
These are plots from two tools that have been developed within the research community. The one on the top left is about breakpoint detection: analyzing the time series and looking for abrupt changes. It is computed from the same time series as shown on the previous slide, and a change was detected around 2010, a change in the seasonality of the vegetation. This is useful information: now we do not just have the data, we have the information about the date when the change occurred. There are also other analysis tools, like the one on the right, a spatial trend analysis tool. Maybe it is hard to see from the back, but it marks pixels with a negative trend in red and pixels with a positive trend in green, so you can clearly identify the areas with a negative vegetation trend and those with a positive one. There are further tools, and they are available as individual software packages, open source, so they can be used freely, but they are not connected to any dataset: you have to acquire and prepare the data yourself.
So let us have a look at what kinds of tools are already available as online time-series data access and analysis tools. There are a few; this is just an excerpt. There is, for example, one service providing analysis tools and access to MODIS datasets, and there is also a web service from Oak Ridge National Laboratory providing access to MODIS datasets, among others. But in general they all have limits: they cover just a few datasets, not all the MODIS datasets. And unfortunately most of them do not provide a standard-compliant service; only the last one does, providing a Web Coverage Service and a Web Coverage Processing Service, but it does not have the complete MODIS vegetation datasets available, for example. So I would say that using standard-compliant processing services like the OGC WPS is a very good technical solution for integrating services into external applications, making processing tools available to others so that they can use them in their own applications.
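The breakpoint-detection idea mentioned above can be illustrated with a toy version: find the split point that best divides a series into two segments with different means. Real tools such as BFAST also model trend and seasonality; this sketch, on an invented series, only shows the principle:

```python
def sse(values):
    """Sum of squared deviations from the segment mean."""
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values)

def breakpoint(values):
    """Index that splits the series so that the two segments'
    combined squared error is minimal."""
    best_i, best_err = None, float("inf")
    for i in range(1, len(values)):
        err = sse(values[:i]) + sse(values[i:])
        if err < best_err:
            best_i, best_err = i, err
    return best_i

# Stable vegetation signal, then an abrupt drop (series invented):
series = [0.6] * 10 + [0.3] * 6
change_at = breakpoint(series)  # index where the drop begins
```

Run over a per-pixel vegetation index series, the detected index translates directly into the date of the change, which is exactly the "information" extracted from the "data".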
This is the main idea within our concept of the Earth Observation Monitor. The main points are: easy access and analysis of spatial time-series data, and monitoring on a local scale. You cannot do any global analysis within our framework, because we are focusing on the local scale. And we provide web services based on geoprocessing services: everything that is done within the Earth Observation Monitor, whether it is processing of existing datasets or integrating new data, is exposed as a geoprocessing service. We are
using a lot of open source software for the web and mobile application development, such as jQuery, OpenLayers,
and jQuery Mobile, but also for the geospatial web services: we are using MapServer and pyWPS, and for the geospatial processing there is the statistical language R, Python tools for time-series analysis, and the TIMESAT tool for vegetation phenology.
The first component of the Earth Observation Monitor, the first step, is automated data access, because of course, if we want to run some analysis, we need the data first. So we have developed a data processing middleware,
a Python library that bridges the gap between users and data providers. At the bottom we have our different data providers, for MODIS data for example, but also for climate station data. We have built connectors to these external data sources; the middleware integrates the data into a common data format and then provides it via OGC-compliant services, not only the data itself for visualization but also the metadata information via a catalogue service.
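The middleware idea described here can be sketched as a registry of per-provider connector functions that hide how each source is accessed and return data in one common format. The provider names, the product name, and the record fields below are illustrative assumptions, not the real library's API:

```python
CONNECTORS = {}

def connector(provider):
    """Register a data-provider connector under a provider name."""
    def wrap(func):
        CONNECTORS[provider] = func
        return func
    return wrap

@connector("modis")
def fetch_modis(aoi, dataset):
    # A real connector would download and clip satellite tiles here.
    return {"provider": "modis", "dataset": dataset, "aoi": aoi,
            "values": [0.41, 0.43, 0.40]}

@connector("station")
def fetch_station(aoi, dataset):
    # A real connector would query a climate-station data service here.
    return {"provider": "station", "dataset": dataset, "aoi": aoi,
            "values": [11.2, 12.8, 10.9]}

def retrieve(provider, aoi, dataset):
    """Single entry point: same call and output format for any source."""
    return CONNECTORS[provider](aoi, dataset)

record = retrieve("modis", aoi=(50.9, 11.6), dataset="MOD13Q1")
```

Because every connector emits the same record shape, the analysis tools downstream never need to know which provider the data came from.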
As an overview of the available datasets: we have the MODIS vegetation index data, we have some MODIS land surface temperature data, and we have some climate station data available within the Earth Observation Monitor. The second step is then the data analysis and its link to the data integration. The first part here is, as stated, the automated
data access, the data retrieval based on the input from the user: the user only has to state the area of interest and the name of the dataset, and can also provide some further parameters depending on the dataset, like quality masking and so on. The datasets are then stored in a local processing directory. This is also a WPS, so we have one WPS for the data retrieval and another WPS for the data analysis, and the analysis uses the same local processing directory to directly access the datasets. All the steps for data preparation, for running the analysis, and for preparing the outputs have been automated and also somewhat optimized for performance, and all outputs are then provided via OGC-compliant services as well.
As I said before, the data themselves are also provided via standard-compliant services. Here we have a list of the different outputs for each of the analysis tools: the time series itself, transformed into a Sensor Observation Service; plain plot images; and, for the spatial analysis, a per-pixel shapefile provided as a Web Feature Service that can be used directly in the web portal, for example.
Here I have two or three examples of these processing services. The one for the data retrieval, for example,
is just a simple HTTP request with the dataset as an input, together with the coordinate of the pixel; the process is then executed, and the client can wait for the results. The second one, for the time-series data analysis, is nearly the same: one output of the data retrieval is a unique identifier, and with this unique identifier I can reference the dataset in the analysis, because it maps to the local processing directory, so the integrated datasets can be used directly within the data analysis tools.
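The two chained Execute requests described above might look roughly like the following in WPS 1.0.0 key-value-pair form. The host, process identifiers, parameter names, and the identifier value are invented for illustration:

```python
from urllib.parse import urlencode

BASE = "http://example.org/wps?"
COMMON = {"service": "WPS", "version": "1.0.0", "request": "Execute"}

def execute_url(process, inputs):
    """Build a WPS 1.0.0 key-value-pair Execute URL."""
    data_inputs = ";".join(f"{k}={v}" for k, v in inputs.items())
    params = dict(COMMON, identifier=process, DataInputs=data_inputs)
    return BASE + urlencode(params)

# Step 1: data retrieval, with dataset name and a point coordinate.
retrieval = execute_url("data.retrieval",
                        {"dataset": "MOD13Q1", "lat": 50.9, "lon": 11.6})

# Step 2: analysis, referencing the dataset through the unique
# identifier returned by step 1 (value faked here).
analysis = execute_url("timeseries.trend", {"uid": "job-0001"})
```

The point of the chaining is visible in step 2: the analysis never receives raw data, only the identifier that maps to the server-side processing directory.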
On top of this we have a web-based tool we developed where users can directly draw an area of interest, directly integrate the dataset, and run some analyses without any manual processing steps; this is all done by the server. The user always has the possibility to download all the data as an archive and to process it further offline on his own computer, for example. We also have a mobile application: directly in the field, with the GPS position for example, the application accesses the WPS, extracts the 15 years of time-series data, and provides some plots and analysis results. I have some example areas that we have tested, where we can see some changes: for example in the Bavarian Forest, where we have
the regrowth of vegetation beginning in 2004, after the bark beetle attack. Here you can use not only the data but also the information from the analysis results to detect exactly the date of the change. There are some further
possibilities: for example, here you have Landsat scenes from 1998, 2001, and 2014, and unfortunately there were no good Landsat scenes between 2001 and 2014. There was clearly a change in the forest, but you cannot discover it with the Landsat data; with the daily MODIS data, however, we can clearly identify when the change happened. And with that I come to my conclusion.
So, to summarize: the WPS definitely helps in establishing easy-to-use data access and analysis tools, and especially in reducing barriers for Earth observation time-series data. There are some ongoing activities, but for most datasets it is still very complex for all kinds of users, and also for scientists, for students, and so on; it must become easier. We also need to link the data access with the data analysis, because you cannot do an analysis without accessing the data first, and we need to hide the complexity of the data access and data processing tasks. As future work we will continue in this direction: providing metadata for processing services with semantic descriptions; integrating the moving-code approach, so that you can upload your own analysis, for example as Jupyter notebooks, to process data directly on the server; and extending the Earth Observation Monitor to Sentinel data. With this, I would like to thank you for your attention. Does anybody have any questions?
Question: On slide 15 you showed the Execute requests. What is the form of the data inputs? It looks like everything is encoded as key-value pairs, which is a bit of a flaw in the specification, since you can have everything in a single string. Is the input a string, or is it the area?
Answer: Yes, these are strings. The parameters are the dataset name and the latitude and longitude, and they are passed as literal data inputs.
Question: I was just wondering whether you have thought about adding some semantics to these inputs, so that machines can work with these services.
Answer: So far, as I said on the last slide, as future work we would like to describe every input parameter with semantic information, but this is not included at the moment. You could also think about WPS profiles here.
Question: Who are the targeted users? At the beginning this is basically a research topic; which users do you target later on?
Answer: As you said, at the beginning it is of course for research. But we have had requests to use the Earth Observation Monitor from other departments around the world, because they have the same issues when accessing datasets. For a later stage, we plan to make the software behind it available as open source; we are developing this as an open project with the possibility to set it up for regional projects. For example, I had a request from a water county department in the US, and they were very interested in having regional maps from MODIS data and from Landsat data with an automated processing chain. So the idea is to provide tools that automate the extraction for a given area of interest, a state for example, and to provide automated services so that they have access to the data clipped to their area of interest.
Moderator: Thank you very much.

Metadata

Formal Metadata

Title: From global observations to local information: The Earth Observation Monitor
Series Title: FOSS4G Bonn 2016
Part: 62
Number of Parts: 193
Author: Eberle, Jonas
Contributors: Jonas Eberle
License: CC Attribution 3.0 Germany:
You may use, modify, reproduce, distribute, and make the work or its content publicly available in unaltered or altered form for any legal purpose, provided you credit the author/rights holder in the manner specified by them.
DOI: 10.5446/20453
Publisher: FOSS4G, OSGeo
Publication Year: 2016
Language: English

Content Metadata

Subject Area: Computer Science
Abstract: Earth Observation (EO) data are available around the globe and can be used for a range of applications. To support scientists and local stakeholders in the usage of information from space, barriers, especially in data processing, need to be reduced. To meet this need, the software framework "Earth Observation Monitor" provides access and analysis tools for global EO vegetation time-series data based on standard-compliant geoprocessing services. Data are automatically downloaded from several data providers, processed, and time-series analysis tools for vegetation analyses extract further information. A web portal and a mobile application have been developed to show the usage of interoperable geospatial web services and to simplify the access and analysis of global EO time-series data. All steps from data download to analysis are automated and provided as operational geoprocessing services. Open-source software has been used to develop the services and client applications.
