
Satellite exploitation platform developed entirely with FOSS software


Formal Metadata

Title
Satellite exploitation platform developed entirely with FOSS software
Title of Series
Number of Parts
208
Author
License
CC Attribution 4.0 International:
You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifiers
Publisher
Release Date
Language

Content Metadata

Subject Area
Genre
Transcript: English (auto-generated)
So actually, I'm giving this presentation on behalf of a company called Terrasigna. I'm also affiliated with the Met Office, but that's a separate affiliation. So today I'd like to talk to you about a project I'm managing on behalf of this company in Romania, within an international consortium. I also want to acknowledge my co-authors.
Okay, what's the problem? For many years now, the European Space Agency has provided the community with a wealth of data coming from satellites. More importantly, over the last couple of years (I know most of you are from the US), in Europe we have a program called Copernicus, the space program of the European Union, whose main contribution so far is a series of satellites that are producing quite a big amount of data today. Each year, one or two more satellites become operational, and the amount of data is really huge. Just two of the satellites in this constellation, the Sentinel-1 radar satellites, generate 10 terabytes of products daily. And the community does not know how to deal efficiently with this amount of data, especially in Europe. So that's the challenge and the problem that this kind of project is trying to address.
So the project I mentioned, called EO4SEE, is trying to implement ESA's (the European Space Agency's) concept of regional exploitation platforms for Earth observation data.
This project is trying to facilitate the creation of a platform that enables users, mainly researchers in the first phase, to use big amounts of data without leaving the platform. So not downloading data to their own computer and using their own software, but having a platform where they can do this kind of satellite processing fast, with a huge amount of resources at their disposal, and just download the final result, put it into reports and so on, and that's it. So this EO4SEE project is a kind of pathfinder project. It will not produce an operational platform, but a prototype one.
And if it's successful, an operational platform will be built. So first it's a technical problem, an IT problem, to get this working. And then there's a thematic part, to demonstrate that it's useful. So the concept: as I mentioned, we have this huge amount of data.
But not only that, we also have data generated by numerical models, networks of sensors, in-situ data, very important and coming in big numbers today. So the idea is to have this cloud-based platform where we have storage capabilities and processing capabilities, and we have processors, toolboxes, applications that know how to deal with the data. We try to build this platform in an interoperable way, so we can talk to other similar initiatives. The data should be usable by others, mainly through OGC services, but not only. And in the end, we have some kind of interfaces, some kind of clients. We aim to have a web client for the users to access this large amount of data, do something with it and then view the results as 3D maps,
interactive charts and so on, and to do analytics on top of that. So for this project, we have a consortium formed by a number of companies and universities from three countries: Romania, the Czech Republic and Poland. We are leading this consortium. The geographic scope of the project (SEE stands for South-Eastern Europe) is the Black Sea, the Carpathian Mountains, the Danube Basin and the countries that are part of the consortium. So this is more or less the area we are trying to cover.
This area has a number of problems that are unique to this geographic space. It's a different challenge than in other parts of Europe or of the world, and that's the idea behind building a platform that should work for this area. The target topics, which were somewhat imposed by the European Space Agency, are integrated water management and flood risk prevention, environmental risk prevention, management of natural assets and protected areas, natural resource management, integrated coastal zone management, and cultural and archaeological heritage, so mostly environmental topics. We have quite an important range of potential users that are presented here in different
categories. I will not go through this because it's not that important. I'll just mention the strategic end users we have: the International Commission for the Protection of the Danube River (ICPDR), the Carpathian Convention and the Black Sea Convention. These are regional institutions that play an important role in the environmental policies of this part of Europe. And we also have a number of smaller national and local end users, different types of institutions, from research to emergency response.
So we are building this on top of quite a powerful infrastructure. It's called EO Cloud, or the EO Innovation Platform Testbed in Poland. It's a cloud built by ESA with ESA resources, so we did not start this project without hardware infrastructure. And on this infrastructure, we have around three petabytes of data: a full archive of Sentinel-2, quite an important archive of Sentinel-1, the full Sentinel-3 archive, a lot of Landsat data and some others. So just imagine, it's like three petabytes of data, and each day new data is acquired, especially from the Sentinels. But it's not only that; we have a long list of other datasets. Most of them are produced by the Copernicus Core Services and are related to the Black Sea, to air quality and so on.
Quite a number of datasets. Now, an important part of this project is the Sentinel Hub. One of our partners developed an instrument called Sentinel Hub, which basically gives you access through OGC services to the Sentinel-2, Sentinel-3 and Landsat data in a homogeneous way. You could have a world mosaic at any time for all this data, very easily. They had originally developed this for the Amazon platform, so their role in the project was to move it onto this infrastructure and adapt it to our needs. So the idea is that the Sentinel, Landsat and MERIS full archives are published through WMTS, WMS, WCS and some custom APIs. And the user is able to get the data based on an area of interest,
time of interest, cloud cover percentage, data format, mosaicking order. These are parameters that are specific to satellite data and not necessarily part of the OGC services. And there is a lot of functionality around this platform. You could get individual bands from the satellite image, and you could, on the fly, create derived data, some kind of indices; I put an example here for the NDVI. So you could create new data on the fly, and it would be displayed on the client side.
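The on-the-fly index computation mentioned above can be sketched in a few lines; the band arrays and reflectance values below are illustrative, not the platform's actual API.

```python
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - RED) / (NIR + RED)."""
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    denom = nir + red
    out = np.zeros_like(denom)
    # Avoid division by zero where both bands are 0 (e.g. nodata pixels).
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out

# Illustrative Sentinel-2 style reflectances: B04 (red) and B08 (NIR).
red = np.array([[0.10, 0.20], [0.05, 0.0]])
nir = np.array([[0.50, 0.20], [0.45, 0.0]])
print(ndvi(red, nir))
```

The same pattern extends to any band-algebra expression the user supplies, which is what makes serving derived layers straight to the client feasible.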
The architecture of the platform follows the model proposed by the European Space Agency. It's called the Exploitation Platform Open Architecture. It's a specification document, not tied to any technology. We just took the most important components from this architecture and customized them for our specific needs. It's quite a big document, available online, with a number of components and subcomponents. And the idea was to take this and try to put it into practice, because at that time it was just a document. And of course, that's why I'm here: we use only open source software to implement this.
There's a whole range of applications. For the geospatial stuff, we use, of course, Postgres and GeoServer. We use R, and Zoo for the WPS; OTB, the Orfeo ToolBox; and GRASS and GDAL for data processing. We use Marathon and Mesos to orchestrate the containers and to scale the applications, and Nagios and Prometheus to monitor the system. And Spark and GeoTrellis and a number of open source tools available on the market. And we try to stick close to the OGC standards. I know this is sometimes not very efficient, but it's something we went with from the start.
So everything we do is compliant, at least with OGC. In Europe, we have a directive related to geospatial data called INSPIRE. We are not really compliant with INSPIRE, but we are compliant with all the important OGC standards, and everything we produce is handled through the OGC standards. The exploitation scenario of the platform is like this.
So we have the users. They use a client; right now we don't have a fully-fledged web client, but they use a client. And with the help of the client, they define an area of interest and a time of interest, and they select the dataset they want to work with and a processor. Actually, it's the other way around: you have to select a processor first, and then the type of input that processor will get; the other inputs are, of course, the area of interest and the time of interest. Then this is passed to the WPS service, where we have the Zoo instance (not only one instance; Zoo is managing this), and it triggers the processor. The processor then looks at what data it needs for input, connects to the WCS or WFS service, gets the data for the area of interest and the time of interest, processes that data and throws the results back into the WMS/WCS server, or maybe into a database or a file, whatever; it depends on what type of results we are getting. And then the client knows how to retrieve the results and display them for the user. The user can inspect the results and, if they're happy, keep them; if not, they can run the whole thing again with different parameters or a different area.
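The request the client hands to the WPS service might look like this in KVP form; the endpoint, process identifier and input names below are hypothetical, chosen only to show the shape of an OGC WPS 1.0.0 Execute call, not the project's actual processors.

```python
from urllib.parse import urlencode

def wps_execute_url(base: str, process: str, inputs: dict) -> str:
    """Build an OGC WPS 1.0.0 Execute request in KVP (GET) form."""
    # DataInputs is a semicolon-separated list of key=value pairs.
    data_inputs = ";".join(f"{k}={v}" for k, v in inputs.items())
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": process,
        "DataInputs": data_inputs,
        # Run asynchronously and poll a status document, as a
        # long-running satellite processor would.
        "storeExecuteResponse": "true",
        "status": "true",
    }
    return base + "?" + urlencode(params)

# Hypothetical flood-mapping processor with an area and time of interest.
url = wps_execute_url(
    "https://example.org/zoo",  # placeholder endpoint
    "flood_map",                # hypothetical process id
    {"bbox": "28.0,44.0,30.0,45.5", "time": "2006-04-01/2006-05-15"},
)
print(url)
```

The asynchronous flags matter here: a processor chewing through Sentinel scenes can run for minutes, so the client polls the stored status document rather than holding the connection open.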
As for the architecture, as mentioned, we are running on this EO Cloud, an OpenStack-powered infrastructure. We have all the data in object storage or on NFS. Then we are using Marathon and Mesos to scale the applications automatically when more resources are needed. So we have GeoServer working like that, we have Zoo working like that, and Postgres and the geoportal work this way.
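As a sketch of how a service might be described to Marathon so Mesos can run and scale it, here is a minimal application definition of the kind POSTed to Marathon's `/v2/apps` endpoint; the image name, paths and resource figures are illustrative assumptions, not the project's actual deployment.

```python
import json

# Minimal Marathon application definition for a GeoServer service.
# Image, health-check path and resource figures are illustrative.
geoserver_app = {
    "id": "/platform/geoserver",
    "instances": 2,       # Marathon keeps this many copies running
    "cpus": 2.0,
    "mem": 4096,          # MiB per instance
    "container": {
        "type": "DOCKER",
        "docker": {"image": "kartoza/geoserver:latest"},
    },
    "healthChecks": [
        {"protocol": "HTTP", "path": "/geoserver/web", "portIndex": 0}
    ],
}
print(json.dumps(geoserver_app, indent=2))
```

Scaling then amounts to PUTting the same definition back with a larger `instances` count, which Marathon reconciles across the Mesos cluster.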
Yeah. And the front end, which for now is more of a command line, will be web-based only by the end of the project. And because we're working with the European Space Agency, they require us to work with the end users and to define some use cases, to prove that what we are developing is good, is working, and is feasible to be developed further. So we have a number of use cases, and each use case has a number of information services. I will not go through all of them; they are related to the Danube River Basin, the Black Sea, the Carpathian Mountains, urban monitoring, all kinds of classical things. I just want to present two or three of these information services, not the most important ones, because those are the normal ones, but I'll get a little emotional and present some of the things I consider important, even if they're not important for a large number of users. So excuse me if this is not so relevant.
So first of all, this is the river ice extent information service. We try to monitor the river ice extent on the Danube and in the Danube Delta. And this is why: our end user, the ICPDR, was presented with this situation last winter. We had very thick ice on the Danube, and if you look in the media, with ice almost four meters thick, the people transporting merchandise over the Danube were very upset, because they had some numbers here, millions of euros lost due to that. This was a big pressure, and it seemed the Danube has no service to monitor the ice using satellite data.
But that's not the main reason we selected this kind of information service. Close to the Danube Delta, we have this small village in the upper part, and the kids, for instance, and the people work in a bigger town in the south. There was a ferry each day to travel, and the ice blocked the ferry, of course. And if you see the picture, even the kids were crossing the ice over the river to go to school, and people were crossing to go shopping and to go to work, and this was quite dangerous. And if you look at the Danube Delta, it is a very, very beautiful area, but it's also very remote, and all the villages highlighted in red are accessible only by boat. And this was totally frozen; you can see it was like this for more than one month. And those people were isolated, and the authorities did nothing. Well, they did remove the ice for the big, important ships, but they did nothing for these people.
So that's why we had the idea to use SAR data and also optical data to create an automatic system that will monitor the ice and provide the authorities with the information. And I'll skip this. And in the end, to publish the data freely, and even to go further and include this in the navigation maps that are available in real time for the vessels. So this is the idea. This will be starting this winter. We have a very simple procedure using SAR data: we have a reference dataset, and when you detect a change, in that season it could only be ice.
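The change-detection step just described, comparing winter SAR backscatter against an ice-free reference, could be sketched like this; the 3 dB threshold and the sample values are illustrative assumptions, not the project's calibrated procedure.

```python
import numpy as np

def ice_mask(sar_db: np.ndarray, reference_db: np.ndarray,
             change_db: float = 3.0) -> np.ndarray:
    """Flag pixels whose backscatter departs from an ice-free reference.

    Over the river channel in winter, a strong departure from the
    reference backscatter can plausibly only be ice. The 3 dB default
    is an illustrative figure.
    """
    return np.abs(sar_db - reference_db) > change_db

reference = np.full((2, 2), -20.0)                  # ice-free water, in dB
scene = np.array([[-20.5, -12.0], [-19.0, -8.0]])   # two bright (icy) pixels
print(ice_mask(scene, reference))
```

In practice the reference would be a per-pixel seasonal composite rather than a constant, but the per-pixel comparison is the same.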
The amount of data we are using is quite high, but we are confident that it will give good results. The second example I want to show is related to the floods and to the Danube restoration, the ecological restoration. There is a big discussion right now in Romania about removing the dikes. The Danube flows between dikes, and we had some major floods in the past. If you look at the old map from 100 years ago, the floodplain of the Danube was like this; now, of course, it's very narrow between two dikes. In 2006 we had major floods. On the right, you can see the area covered by the floods in 2006, and that triggered this kind of discussion again. And there are a number of people in some institutions who are now in charge of providing information for the decision-makers on whether it's more important to remove the dikes or keep them. And the floods are the most important thing they are looking at, and they need a tool with which they could use satellite data to map the floods.
I don't have time to go through this example, but just going back to this major flood: you see how it was in the past, when there was an area full of lakes and swamps and the villages were not affected. And today we have a big problem with that. I had some explanation, but it's difficult now because of the time. The idea is that we also want to provide a tool for these people. This village was almost flooded and destroyed, and on TV I was seeing news that people had constructed things in areas where they were not supposed to, and you could see the flooded area. But if you look at this affected area in the south, and if you look at the 100-year-old map, you can see that the village didn't really change. So they didn't build anything. But it's the kind of thing that gets thrown into the media, and the poor people over there have no power to do anything. So we are using, and I'll skip the things we implemented,
on how we could produce this kind of flood map based on some indicators. It's a normal process, but once you do it from the browser, the user can adjust the threshold and the data used until they produce a good map.
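That adjustable-threshold step can be sketched with a simple water index; the use of McFeeters' NDWI here, and the band values, are illustrative assumptions rather than the project's actual indicator set.

```python
import numpy as np

def water_mask(green: np.ndarray, nir: np.ndarray,
               threshold: float = 0.0) -> np.ndarray:
    """Classify water by thresholding NDWI = (G - NIR) / (G + NIR).

    The index and the default threshold are illustrative; the point is
    that the user re-runs with a different `threshold` until the map
    matches local conditions.
    """
    g = green.astype(np.float64)
    n = nir.astype(np.float64)
    denom = g + n
    ndwi = np.zeros_like(denom)
    np.divide(g - n, denom, out=ndwi, where=denom != 0)
    return ndwi > threshold

green = np.array([0.30, 0.10, 0.25])
nir = np.array([0.05, 0.40, 0.20])
print(water_mask(green, nir))  # first and third pixels look wet
```

Exposing `threshold` as a request parameter is what turns a fixed product into the interactive, locally calibrated tool the speaker describes.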
And then we have analytics on top of that. For instance, in this example you could select a pixel of water and see how the NDVI behaved in the year of the flood and in other years, and not only NDVI, but other indices too. And the last example is related to the Black Sea. We are working with the Black Sea Commission.
They have a number of indicators for reporting on the status of the environment, and of course Earth observation data is very useful for that. We try to provide them some reference data on the platform, not for processing, just some maps. This one is for urban sprawl. I showed this example at a meeting three weeks ago in Istanbul, and I showed how badly Turkey did on the coastal zone. This is 1992; this is 2015. And they said: OK, maybe it's bad, but we want each country to make its own analysis, not to have it provided by you, because these are sensitive things. And yes, now we're working to provide a tool that would allow them to use the platform to do their own country. I'll not go through these examples. So at this moment, the project is in its last quarter. It's time to put all the puzzle pieces together, to finish the use case development, and to have a web client,
because now we have just small prototypes. We want to unify all the demos into a coherent whole, and to implement WCS 2.0. And we have started experimenting with different technologies to do the same thing; for instance GeoTrellis, which is very powerful, and we are amazed by what it can do. So, if you have questions: I think I've finished 15 seconds before my time. So, yes.
So much stuff. Where do I even start? Do you have a question? Yes, I'm... Um, the...
It's straight from satellite. No, no model; it's straight from satellite data. Okay. The thing is, there are a number of systems that can provide flood maps in Europe and in the world, but they are not calibrated to the conditions in Romania. The fields are very patchy, small, narrow, with all kinds of things that would influence this. So the people who have to produce these kinds of reports really need to be able to adjust to local conditions. And they know the terrain and so on. The global things are not really working. And the code for this,
it will be available in September on Bitbucket. I've put a link there, plus the project website and my email, if you want to ask other questions later or give me feedback. So this Pathfinder project will end in March, I guess. And then ESA will decide, if the feedback from the users is good enough, whether there will be an ITT for an operational platform. And we'd like to go and win that bid and implement the operational one. We went beyond the initial requirements in the Pathfinder; we have a lot of advantages because we did more than they asked of us. So we are confident that we'll be the ones to implement this operational platform.
So it depends. We are just providing a few use cases, but yes, all the data from the Sentinel archive and the Landsat archive is available via WCS. So if you have a processor that knows how to use this data, you could just deploy your processor, adapt the data input to our system, and it will work with whatever data you have. It depends on your processor. We don't have such a processor right now, but if you have one, it could be deployed. Yes, that's the idea.
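A deployed processor pulling its input this way might build a WCS 2.0 GetCoverage request like the following; the endpoint and coverage identifier are placeholders, and only the KVP encoding follows the WCS 2.0 specification.

```python
from urllib.parse import quote

def getcoverage_url(base: str, coverage_id: str,
                    lon: tuple, lat: tuple,
                    fmt: str = "image/tiff") -> str:
    """Build a WCS 2.0 GetCoverage KVP request for a spatial subset.

    Endpoint and coverage id are placeholders; a processor would fetch
    just its area of interest instead of downloading whole products.
    """
    parts = [
        "service=WCS",
        "version=2.0.1",
        "request=GetCoverage",
        f"coverageId={coverage_id}",
        f"subset=Long({lon[0]},{lon[1]})",  # trim axis: longitude
        f"subset=Lat({lat[0]},{lat[1]})",   # trim axis: latitude
        f"format={quote(fmt, safe='')}",
    ]
    return base + "?" + "&".join(parts)

url = getcoverage_url("https://example.org/wcs", "S2_L2A_B04",
                      lon=(28.0, 30.0), lat=(44.0, 45.5))
print(url)
```

Because the subset is expressed server-side, the processor only transfers the pixels it will actually process, which is the whole point of working on the platform instead of downloading archives.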