
An image browser for the Planet


Speech transcript
Hello everyone. My name is Alessandro, and at Planet Labs I work on the team that builds the different tools users interact with to find and search for imagery. Tim will also be presenting; together we'll walk you through what we've been building, the challenges involved, and where we're looking to go in the future.

To get started, I think it's useful to look at what we're trying to do. Our mission is to image the whole world every day. To achieve this we have a constellation of satellites called Doves; right now we have about 30. They orbit the Earth in roughly 90 minutes and capture images, which we call scenes, that are downlinked at our ground stations, pass through our imaging pipeline, and become part of our image library. We plan to have 150 satellites by 2016, which is when we expect to reach daily coverage. That's a lot of data, and the problem we're tackling is: how are we going to let users find the images they care about within this vast library? We're trying to build a tool that you can use in the browser to do this. We call it Scenes Explorer, and it's meant to help you browse our large image library. Now, the interesting thing
about working on this problem is that the problem statement keeps changing as the operation grows. As more and more scenes become part of our library, we keep discovering what the real challenges are in presenting them to users and letting users access what they need. What we're presenting today is the third iteration of this product. You can see some tiny screenshots here that you probably can't make out — these are the first two iterations, and in fact the second one is the one we were working on when we submitted this talk.

Through all this iterating we learned something that seems obvious: space and time are really important. In typical programmer fashion, we had tried to generalize everything and treat space and time as filters like anything else. Our scenes carry a lot of metadata about when they were captured and what the capture looks like, and we were letting users filter by anything and everything, treating space and time as equals to the rest. But through user testing we realized that as humans we live in the dimensions of space and time, and those are particularly important, so we decided to treat them as such.

That led us to define what we currently think the core user experience needs to be. Essentially, users should be able to define an area of interest — that's the primary way the tool selects the images they'll work with. Once the area is selected, there's an interesting problem: our frequency of capture creates large stacks of images, and users need to be able to work their way through those stacks. Selecting the older scenes at the bottom becomes complicated unless we let you use time as a dimension to move through the stack. After that, you might want to refine your search further by other aspects of the imagery — for instance, you might want to exclude images that are particularly cloudy, or you might want a certain
ground sample distance at least, or to exclude images that are too far off-nadir. And last but not least, you'll want to be able to select and download the images you've chosen. So I guess at this point you're wondering what this looks like. Well, here it is: a gray box.
No — actually, this is what it looks like, and I hope you can see it well. The user is presented with a coverage map: the shaded areas on this map show our current coverage, with images being captured right now by the satellites in the areas that are specifically tasked at the moment. As I said, the first step is for users to select a location. For the purpose of this demo, let's search for a place.
Searching automatically creates an area of interest for the user. One could also use the polygon drawing tools to delineate it more carefully, but this works for now. As you can see, we're currently loading all images within this area — it's still loading, and it takes a while, but once they've all arrived we'll be able to go through and filter all those images. There we are. So now we're working with 60 to 100 images, more or less, and as I was saying, this is the tool you can use to work your way through time and analyze how the images stack up. Let's say we care about August — it's pre-selected here — so we can now look at the images for August. At this point, as I was saying, we
have an additional set of filters. This is the metadata carried along with each scene, and we can filter on it in real time as well, not just on time. We can, say, require a ground sample distance that isn't too coarse, an SNR that's excellent, and an outstanding image quality. That leaves us with a smaller selection of images we can browse through here, and there are some really nice ones. Even though these are pretty small thumbnails, we can order them by cloud cover and look at the images with the least cloud cover first.
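As a minimal sketch of this kind of client-side metadata filtering: each active filter can be modeled as a predicate, and a scene passes when every predicate passes. The field names and thresholds here are illustrative, not the actual Scenes Explorer schema.

```javascript
// Hypothetical scene metadata, shaped loosely after the filters in the demo.
const scenes = [
  { id: 'a', cloudCover: 0.05, gsd: 3.1, acquired: '2015-08-02' },
  { id: 'b', cloudCover: 0.60, gsd: 3.0, acquired: '2015-08-10' },
  { id: 'c', cloudCover: 0.02, gsd: 5.2, acquired: '2015-08-21' },
];

// Each active filter is a predicate; a scene passes when all predicates pass.
const activeFilters = [
  (s) => s.cloudCover <= 0.2, // not too cloudy
  (s) => s.gsd <= 6.0,        // ground sample distance at most 6 m
];

const matches = scenes.filter((s) => activeFilters.every((f) => f(s)));

// Ordering by cloud cover, least cloudy first, as in the demo.
const byCloudCover = [...matches].sort((x, y) => x.cloudCover - y.cloudCover);
```

Because the filters are plain functions, re-running them on every UI change is straightforward — which is exactly what makes their cost on large scene sets a concern, as discussed later in the talk.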
These all look like great images, so let's look at one up close — the coastal scenes are pretty nice. Here's a good-looking strip of imagery, and at this point we could decide to select the images we like and download them. We can download the analytic or visual products, or unrectified GeoTIFFs, and have the result sent to my e-mail. So that's what the tool looks like. Now let's step behind the scenes a little bit.

On the topic of what's behind the scenes, these are the pieces used to compose Scenes Explorer. Primarily there's the Planet API — I'll be talking a little more about it — the HTTP API we use to query the scenes catalog and get that metadata. The client, the whole application, is assembled with React: we have about 130 React components that make up the application you're seeing, and there's a sister application for browsing mosaics plus a couple of others for managing your account. Reflux is the library we use to manage data, dispatch actions, and encode the data flow in the application. OpenLayers 3 provides the whole mapping interface — the geometry model and the feature model — and does all the rendering. And, as you saw at the start, there's a coverage map; we're using CartoDB for that, so we sync all the scene data into CartoDB. In the initial versions — up through the second version of the Explorer — we also did all of our filtering that way: when you were zoomed way out, we used a template to send the filter parameters to CartoDB and had it render tiles. We ended up feeling that wasn't as interactive as we wanted. We really wanted to let users scrub through time quickly and see what imagery was available, so we've now gone with this more constrained workflow: you first select a small area, and we download all the data to the client. We still use CartoDB for the initial coverage map, along with many other open source libraries.

So, the Planet Labs API: this is something we encourage people to take a look at and get involved with. We have an Open Data sandbox project — you can come to our table and talk to us about it, sign up, get an API key, and start accessing and making use of the data.
Both Scenes Explorer and the sister applications are built on the Planet API. The API primarily exposes scenes — the scene resources we've been looking at — and also mosaics, the tiled renderings built from those scenes, as well as resampled thumbnails and other image products. In addition there's support for workspace management: you can save the filters you've been building and get notified when new imagery matches your workspace criteria, all through the API — you don't have to go through the UI.

A quick glance at the API itself: the scenes we've been looking at are called "ortho" scenes; we also have Landsat scenes, and we'll be introducing more scene types over time. The structure of the data is all GeoJSON — if you take one thing away, it's that — so it's easy to work with using the tools you're accustomed to, and you can drill down to get an individual scene. The same goes for mosaics: you can page through all mosaics and get individual ones; each mosaic is broken up into quads, so you can get quad data and drill down further to the scenes that went into each quad. I encourage you to check out the API documentation at planet.com/docs, and if you're interested in using it, please sign up for the Open Data sandbox.

To support accessing the API, we've put together a number of client libraries. The application we were just looking at uses a JavaScript client library that facilitates interacting with data from the Planet API. It also has a command-line component — a Node-based tool you can use to experiment with the API, download things, save workspaces, and do everything else you can do with the API. There's also a Python client that we've developed and are using internally. I encourage people to explore both; they're both available from the Planet Labs website.
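The paging mentioned here — following traversal links from one response to the next — can be sketched as below. `fetchPage` stands in for a real HTTP call, and the `links.next` structure is an assumption for illustration, not the exact Planet API response shape.

```javascript
// Fake in-memory "API": two pages of scene IDs chained by `links.next`.
const fakePages = {
  '/scenes?page=1': { features: ['scene-1', 'scene-2'], links: { next: '/scenes?page=2' } },
  '/scenes?page=2': { features: ['scene-3'], links: { next: null } },
};
const fetchPage = (url) => fakePages[url]; // stand-in for an HTTP GET

// Walk the `next` links until the catalog is exhausted.
function fetchAllScenes(startUrl) {
  const all = [];
  let url = startUrl;
  while (url) {
    const page = fetchPage(url);
    all.push(...page.features);
    url = page.links.next;
  }
  return all;
}

const allScenes = fetchAllScenes('/scenes?page=1');
```

This per-scene traversal information is exactly the payload overhead the speakers later say they would like to make optional for clients that only need the minimum metadata.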
So, the main challenge we faced in developing Scenes Explorer was getting good performance out of this real-time filtering and interaction — the ability to scrub through time, apply the metadata filters effectively, and have everything happen quickly in the browser. The problem is that the filtering is relatively slow. Relatively, in the sense that it takes about 100 ms when you're working with around 10,000 scenes on an average laptop. If you're filtering in real time — scrubbing through things, re-filtering on every mouse move — 100 ms is a long time, and everything gets very janky. That's what we were doing at first, of course, and it wasn't very good.

So we decided to move the filtering into a Web Worker. That would free up the application's main thread: everything would happen on the worker side, the UI would be super fluid, everything would be great. We did that, and it turned out not to be quite the case. The filtering was now happening on a different thread, but transferring the data back to the UI still took time, because data passed between threads goes through a copy operation, and receiving that data would block the UI thread — not for as long as before, but still noticeably.

The solution is transferable objects: ArrayBuffer objects can be handed between the two threads without any copy penalty, because ownership is transferred immediately. We decided to use this for everything crossing the worker/UI boundary. We use the geobuf library to encode the scenes — these are GeoJSON feature data — and that gives us the additional benefit that, beyond fitting into an ArrayBuffer, the data structure becomes small. The indexes are plain ArrayBuffers too, so everything crossing the thread boundary is transferred as ArrayBuffers.
As a last optimization, we only transfer incremental updates to the feature set and to the indexes the worker creates when filtering, so we move the minimal possible amount of data between the two threads.

This slide isn't great — I should work on it more — but it explains how this works. The Explorer's main thread sends messages to the Web Worker: "we need the scenes for this area", "the filters were updated to such and such". The Web Worker, if necessary, makes HTTP requests to the API using the client libraries Tim was talking about, receives the data, filters it, builds all the indexes the front end will need to display the filtered data, and sends everything back in those incremental responses — geobuf-encoded for the scenes data, or simply a raw ArrayBuffer for the indexes. That proved to be reasonably performant: we can handle 20,000 scenes easily without much UI jank on a regular laptop.

There has been some discussion among people doing this kind of front-end work about whether 20,000 scenes is a reasonable number. First of all, we want that to get better — we want to be able to use more scenes at once. But we also feel this is somewhat of a false problem: if you need to work with 20,000 scenes to begin with, there's probably something in your workflow that isn't right. We want to drive you to the point where you have a reasonably small pile of scenes to work through before you find what you want.

That said, there's plenty of room for further improvement. One thing we've prototyped, and may go forward with, is adding geobuf support to the API itself. Currently the API responds with GeoJSON, which ends up being a pretty verbose, pretty bulky payload, so we put together a prototype where we
have geobuf all the way through. The worker thread makes a request and gets back protocol-buffer-encoded scenes, which need to be parsed and turned into objects so they can be filtered; then that same protobuf is sent over to the main thread, where the UI parses it into OpenLayers feature and geometry objects. Comparing the two, we can squeeze roughly twice as many features into a geobuf payload as into the equivalent GeoJSON, so that's something we're looking into.

We've also learned a lot about how we use the API and where we could really trim down the payload. If you remember the slide from way back, a bunch of traversal information is sent with every scene — how to get to the next scene, how to page through the responses — and we may add a more compact form of the data where a client can say: skip the extra metadata, just give me the minimum I need to work with.

In addition, we want to move more work over to the worker thread. We're still doing a lot of sorting on the main thread: any time the user clicks a new heading in the metadata panels, we sort through potentially thousands of scenes in one go. Moving that sorting — generating the indexes for the different sort scenarios — over to the worker and sending the results back means we can show a new arrangement of the scenes really quickly.

And finally — significant for this crowd — we're excited about open-sourcing more of this. We started with the client libraries, which are open source, and we're working to make the components we used to build the application more generic and more reusable. We look forward to releasing those as open source and seeing whether we can provide components that others can use in similar applications. And with that, that's it from us.
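One way to picture the worker's filter and sort indexes — an assumed layout for illustration, not Planet's actual wire format — is as a typed array of matching positions, whose backing ArrayBuffer can then go straight into a `postMessage` transfer list.

```javascript
// Per-scene filter results computed in the worker.
const passes = [true, false, true, true, false, true];

// Positions of matching scenes, packed into a Uint32Array. Its backing
// ArrayBuffer is transferable, so the main thread receives it zero-copy
// and can render the filtered subset without re-running the filter.
const matching = [];
passes.forEach((ok, i) => { if (ok) matching.push(i); });
const index = Uint32Array.from(matching);

// index holds positions 0, 2, 3, 5, at 4 bytes per entry
```

Sort orders fit the same shape: a differently ordered array of positions per sort key, generated in the worker and shipped back the same way.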
Thanks for your attention. We have time for a few questions.

[Audience] Are you also open-sourcing the Planet Labs API itself?

[Speaker] The API itself is publicly accessible — public in the sense that if you sign up, you get an API key and can use it — but the code behind the API is not open source. We should pressure the folks on that side to do that; parts of it could be broken out into reusable libraries, and there are no real secrets in there. There are a lot of moving parts in how the API is currently deployed, so it would take some work to make it reusable, but I think it would be nice to do — the API would be useful for a lot of things.

[Audience] What are the licensing limits on the imagery?

[Speaker] There are licensing limits. I believe you have unlimited access through the sandbox, but you can only use the imagery for your own enjoyment and can't redistribute it — we'd have to check the exact terms.

[Remaining questions inaudible.]

Thanks for coming. Thank you.

Metadata

Formal metadata

Title An image browser for the Planet
Series title FOSS4G Seoul 2015
Author Isaacs, Alessandro
License CC Attribution - NonCommercial - ShareAlike 3.0 Germany:
You may use, modify, and reproduce, distribute, and make the work or its content publicly available, in original or modified form, for any legal and non-commercial purpose, provided that you credit the author/rights holder in the manner they specify and pass the work or content on, even in modified form, only under the terms of this license.
DOI 10.5446/32039
Publisher FOSS4G
Publication year 2015
Language English
Producer FOSS4G KOREA
Production year 2015
Production location Seoul, South Korea

Content metadata

Subject area Computer Science
Abstract Scenes Explorer is one of Planet Labs' end-user web applications, which allows clients to select imagery of interest from our vast and ever-growing satellite image library. Each one of Planet's images, referred to as a "scene", is searchable and downloadable in different product formats using Scenes Explorer. It was developed using Planet's public APIs, OpenLayers 3, and the React framework. We have leveraged these technologies to create an application that allows users to browse Planet's huge data library and identify imagery of interest, all while maintaining the quasi-native level performance that is expected of modern web applications. To achieve this we had to devise strategies that allow us to present, but more importantly multi-dimensionally filter large amounts of geographic data in real-time. The presentation will start by describing Planet's public APIs and how they can be integrated into a web-based mapping application. This will be followed by a deeper dive into the challenges of representing and dynamically filtering millions of image footprints in the browser and the tools, strategies, and UX we have developed to overcome them.
