
Satellite Snow Cover Products Evaluation and Validation Platform Developed Entirely With FLOSS Software


Formal Metadata

Title: Satellite Snow Cover Products Evaluation and Validation Platform Developed Entirely With FLOSS Software
Author: Vasile Crăciunescu
Number of Parts: 183
License: CC Attribution - NonCommercial - ShareAlike 3.0 Germany:
You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal and non-commercial purpose, as long as the work is attributed to the author in the manner specified by the author or licensor, and the work or content is shared, also in adapted form, only under the conditions of this license.
Language: English
Production Year: 2015
Production Place: Seoul, South Korea

Content Metadata

Abstract
Monitoring snow cover extent is important for the management of natural resources and for the prediction of extreme events such as snowmelt floods and avalanches. At present, the network of weather stations in regions with seasonal snow cover is too sparse to provide reliable snow monitoring and impact applications. Remote sensing can regularly provide maps of snow cover extent, within the limitations imposed by satellite revisit cycles or cloud cover. A number of daily or synthesis snow cover extent products covering Romania, with different resolutions and specifications, are freely available (e.g. GLOBSNOW, CryoLand, H-SAF, IMS). These products were homogenized and included, along with reference and in-situ data, in an application that makes it possible for users to inspect, process, analyze and validate the information through a web-based interface. The platform, created by the National Meteorological Administration of Romania, offers services based on Open Geospatial Consortium standards for data retrieval (WMS, WCS, WFS) and server-side processing (WPS, WCPS). The services were built upon open source solutions such as GeoServer, OpenLayers, GeoExt, PostgreSQL, GDAL and rasdaman. The application is composed of several software modules/services, split into two categories: server-side modules/services, and client-side modules responsible for interaction with the user. A typical usage scenario assumes the following steps:

1. The user operates the client to select a temporal and spatial slice from a product cube (e.g. a 5-month archive of daily CryoLand FSC data);
2. The user selects a statistical method to be applied;
3. The request is sent to the server-side processing applications, wrapped as WPS or WCPS calls;
4. The process trims/slices the coverage cube and performs the statistical operation on the pixels within the ROI for each day in the selected time interval;
5. The results are sent back, encoded in a standard file format;
6. The web client displays the results in a relevant form.
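As a rough sketch of what steps 3 to 5 can look like in practice, a WCPS query against a rasdaman (petascope) endpoint could trim the cube and return one statistic per daily slice. This is a minimal example: the endpoint URL, the coverage name "snow_fsc", the axis labels and the extents are hypothetical, not the platform's actual identifiers.

```python
# Hedged sketch of the WCPS round trip (steps 3-5 of the scenario).
# Endpoint, coverage name and axis labels are placeholders.
import requests

WCPS_ENDPOINT = "http://example.org/rasdaman/ows"  # hypothetical petascope endpoint

# One mean value per daily slice of the trimmed space/time cube, CSV-encoded.
query = """
for c in (snow_fsc)
return encode(
  coverage daily_mean over $d ansi(0:150)
  values avg( c[ ansi:"CRS:1"($d), Lat(43.5:48.3), Long(20.2:29.8) ] ),
  "csv")
"""

resp = requests.post(WCPS_ENDPOINT, data={
    "service": "WCS",
    "version": "2.0.1",
    "request": "ProcessCoverages",
    "query": query,
})
print(resp.text)  # e.g. "41.2,40.7,..." -- one mean per day in the interval
```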
Transcript: English (auto-generated)
My name is Vasile Crăciunescu; I'm from Romania. I'm working at the Remote Sensing and GIS Laboratory of the Romanian National Meteorological Administration. I would like to present a few things we did, a few tools we have been working on recently for managing snow-related data, mostly satellite-derived products.
As you may all know, snow is a very important parameter, a very important resource, at least in my country. A number of things depend on the amount of snow we have: it matters for hydropower, we want to know whether we could have floods or avalanches, or whether you can go skiing during the winter, and so on. That's why it's necessary to have a good estimation of the amount of snow and the extent of the snow at any moment. Of course we have a network of meteorological stations, but they are sparse; we cannot cover all the areas, and especially in the mountains the extent and the depth of the snow vary a lot. So at this point remote sensing data can offer some extra information to help map the snow. In the framework of a few projects, we created a database of different products produced by other projects, a database just for our country, for our area of interest, and we created some tools to analyze and inspect this kind of multi-dimensional database, like a cube of data. We use only open source software for this, plus the standards from OGC; kind of the same things that were shown earlier by Andrea. As I mentioned, we did this work in some research projects. The main project in which we are doing this is Snowball, which started last year at this time, so the work is almost one year old. But we also rely on data sets and knowledge obtained from other projects, the most important being the CryoLand project, which I'll mention later. Now to the data: as mentioned, we have something like a cube of data for snow. There are online repositories with open access from which you can get snow-related parameters such as snow extent, snow water equivalent or snow temperature, at very different resolutions; mostly the resolutions are coarse, of course. There are a number of such repositories online, each with different specifications, so what we did was to get all this data into our own homogenized format. We have this kind of data for the last 10 to 15 years, with different temporal and spatial resolutions.
I mentioned CryoLand: that was an FP7 project that ended this year. We were a partner in this project, and the aim was to create a downstream service in Europe, a Copernicus downstream service, providing products related to snow and ice. Copernicus is the European programme for space. It has a number of core services, for land, marine, atmosphere and so on, and it has this concept of downstream services for more refined needs: on the same principle, you can have services that offer data that are more specialized or more refined in many ways. So if you want good satellite-derived data related to snow for Europe, you can use the CryoLand portal. It's all built on open source technology as well.
But we don't have only this kind of data. We also have data from the meteorological stations, from all kinds of field measurements, as well as interpolated data and outputs from the numerical models: snow depth, amount of precipitation, temperature and so on. As I mentioned, we have this service in our institution, and we created links to the existing services, to be able to connect to a service and integrate its data into ours, or to download data subsets and so on. In the case of CryoLand it's very easy, because they expose OGC services, so you don't really have to get the data onto your own machine, but it depends. For some others you just have to write some simple scripts to fetch the data via HTTP, FTP and so on. Once the data is in a form that can be accessed from our service, we can run all kinds of server-side processing that we define for this kind of data, and it can also be integrated with other meteorological parameters published through our SDI.
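As a sketch of such a retrieval step over OGC services, using OWSLib (the service URL, coverage identifier and bounding box are placeholders, not the real CryoLand values):

```python
# Hedged sketch: pull a spatial subset from an OGC WCS endpoint with OWSLib.
from owslib.wcs import WebCoverageService

wcs = WebCoverageService("http://example.org/cryoland/wcs", version="1.0.0")

response = wcs.getCoverage(
    identifier="FSC_daily",          # hypothetical coverage name
    bbox=(20.2, 43.5, 29.8, 48.3),   # rough lon/lat box around Romania
    crs="EPSG:4326",
    format="GeoTIFF",
    width=1000,
    height=600,
)

# Save the returned subset locally for further processing.
with open("fsc_subset.tif", "wb") as f:
    f.write(response.read())
```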
As I mentioned, we use a totally open source stack for this. We get the data from these kinds of sources: numerical models, satellite data, in-situ data and some reference data. On the other side we have the users, who for now are my colleagues in the institute; we don't have external users yet, as there was no need so far. They can use three types of clients: web clients (not much work on those until now), a desktop client and a command-line client. At this stage of the project most of the things you can do are through the command line, but that will evolve. With these clients you can query the database and get back a number of products: a map, a chart, some raw data, statistics, animations. On the server side, we keep data in plain files and in PostgreSQL/PostGIS. We have GIS servers: we use rasdaman to store our cube of data, and GeoServer with GeoWebCache for all the reference data we have. For the web processing services we use PyWPS. Then we have a catalog of products, and that catalog is based on GeoNetwork. And as I mentioned, it's not one single client; at this stage of the project we have some demos built as web applications, so besides the command line there are some very early web clients based on OpenLayers, ExtJS, GeoExt and Bootstrap; it's a whole mess of things here. We use D3.js for charting, jQuery and so on.
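To give an idea of the PyWPS side, a server-side statistic could be wrapped roughly like this (a PyWPS 4-style sketch; the process name, inputs and the compute_daily_stats helper are hypothetical, not the platform's actual code):

```python
# Hedged sketch of wrapping one server-side statistic as a WPS process.
import json

from pywps import Process, LiteralInput, LiteralOutput


class SnowCoverStats(Process):
    def __init__(self):
        inputs = [
            LiteralInput("start", "Start date (ISO)", data_type="string"),
            LiteralInput("end", "End date (ISO)", data_type="string"),
            LiteralInput("roi", "Region of interest as WKT", data_type="string"),
        ]
        outputs = [
            LiteralOutput("stats", "Per-day statistics as JSON", data_type="string"),
        ]
        super().__init__(
            self._handler,
            identifier="snow_cover_stats",   # hypothetical process id
            title="Daily snow cover statistics over an ROI",
            inputs=inputs,
            outputs=outputs,
        )

    def _handler(self, request, response):
        start = request.inputs["start"][0].data
        end = request.inputs["end"][0].data
        roi = request.inputs["roi"][0].data
        # compute_daily_stats() stands in for the real trim/slice plus
        # statistics step against the coverage cube (hypothetical helper).
        result = compute_daily_stats(start, end, roi)
        response.outputs["stats"].data = json.dumps(result)
        return response
```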
And of course we have some other core software that you will find on every server. To summarize: using one of the clients, our colleagues can get results as maps, charts (mostly charts), statistics or animations. As I mentioned, the functionality mostly consists of standalone demos right now; it's quite consistent on the command-line side, but otherwise we could not say we have a real client for everything that is there. As an idea of what we can do: interactive visualization of the data cube I mentioned, of course; we can integrate this data with other typical meteorological data sets; we can do all kinds of queries; we can navigate temporally through the data sets and extract data on these criteria. And of course we can download data and do some processing with it.
For example, we can extract slices, we can re-sample, we can create profiles, we have a raster algebra module, and we can compute syntheses. When you talk about products derived from optical satellite data, you always have the cloud problem, and usually daily data; most of the products we store are daily. But when you consume the data, it doesn't really work like that, especially during the winter: you will not find enough clear days to see what the extent of the snow is. So you have to create this kind of synthesis. That was a pain for my colleagues for some time; now we have this module. You just select your interval, say you want 10 years of 10-day syntheses, you run a simple command, and you get the files and can reuse them in other analyses and other software. That was one of the most important things we did.
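A minimal sketch of such a multi-day synthesis, assuming daily GeoTIFFs with clouds and gaps encoded as a nodata value (the file layout and nodata convention are assumptions, not the platform's actual module):

```python
# Hedged sketch: composite daily snow grids into one synthesis by keeping,
# per pixel, the maximum valid value over the window.
import numpy as np
import rasterio

def ten_day_synthesis(daily_paths, out_path, nodata=255):
    """Max-value composite of the daily rasters in daily_paths."""
    stack = []
    profile = None
    for path in daily_paths:
        with rasterio.open(path) as src:
            if profile is None:
                profile = src.profile
            band = src.read(1).astype(np.float32)
            band[band == nodata] = np.nan   # mask clouds / missing data
            stack.append(band)

    # Per-pixel maximum over the window; pixels cloudy on every day stay nodata.
    composite = np.nanmax(np.stack(stack), axis=0)
    out = np.where(np.isnan(composite), nodata, composite).astype(profile["dtype"])

    with rasterio.open(out_path, "w", **profile) as dst:
        dst.write(out, 1)
```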
We also compute all kinds of statistics. As I mentioned, we have some small web interfaces; they allow you to connect to the catalog and display some of the temporal slices, not very much more. You can integrate with all kinds of products, as I mentioned, from the entire catalog of the institution, and you can query the data directly. You can do a number of processing operations through the web interfaces, although most of them, the complex ones, are available only from the command line. But you can compute syntheses from the web interface, and you can do profiles. As an example, for the snow water equivalent products we are using at the European level: on the left, you can do a profile along one coverage, or you can pick a pixel, go through 10 years of data at that pixel, put it in a chart, and see what has happened with that parameter there. Or you can digitize an area on screen and get statistics for all the pixels inside that area.
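That pixel-profile idea reduces to sampling one location across the daily stack; a rough sketch, again assuming a stack of daily GeoTIFFs (paths and layout are illustrative):

```python
# Hedged sketch of a pixel "profile": sample one (lon, lat) location across
# a stack of daily GeoTIFFs to build a time series.
import rasterio

def pixel_profile(daily_paths, lon, lat):
    """Return the raster value at (lon, lat) for each daily file, in order."""
    values = []
    for path in daily_paths:
        with rasterio.open(path) as src:
            row, col = src.index(lon, lat)  # map coordinates -> array indices
            values.append(src.read(1)[row, col])
    return values
```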
Coming to my conclusions and future work: we now have almost 50,000 coarse-resolution coverages in this database, with very variable temporal and spatial resolutions. So it's not difficult to work with the data, which is roughly at one kilometre, though the cell size can go up to six kilometres. Our colleagues are pretty satisfied with what the system can do right now. We are trying to move as many of the processing operations as possible from the command line to web interfaces, to bring all the standalone demos into the same interface, and to migrate from the ExtJS clients we have now to something more friendly; we have a mock-up using Bootstrap, but it's not so easy. We also want to add more processing modules, plus the possibility to chain modules in one single operation, that is, to select a number of processing steps and run them as a chain. So, I think that's it. Thank you; are there any questions?
How do you find snow pixels in a remote sensing image? There are many remote sensing techniques to extract such values, like classification, mixture analysis... As I mentioned, we are not doing the processing of the remote sensing data ourselves; you saw the table of the repositories. For instance, the CryoLand project, in which I was involved: that European service was producing these snow extent, snow water equivalent or snow temperature maps, and there is quite a complicated infrastructure behind it, with algorithms tested and validated for Europe, to find those snow pixels; it's not so easy. In what I presented here we are not doing that; we just take the products already processed by other services.
The spatial resolution of your data is very low; have you compared it with high-resolution sensors? Yes; the validation was done with high-resolution data. The problem is that you don't have daily high-resolution data to produce snow maps. You can get MODIS data daily, at 250 by 250 metres, the maximum resolution of that sensor, but daily high-resolution imagery is expensive, or you simply cannot get that kind of temporal resolution, and snow is quite dynamic. You cannot build a monitoring service using, I don't know, monthly images. In my opinion, I don't think you need daily high-resolution data. No, not daily, but not monthly either. Also, it's quite expensive: covering the whole of Romania with SPOT data at 2.5-metre resolution is expensive, and it's not easy to create a mosaic; you will not make it in one day, as the satellite cannot take all the images in one day. I think Landsat is fine, because it's free and the resolution is much higher. Yes, but you can have clouds that day; you will not get a snow map from Landsat unless you're lucky, because otherwise it could be cloudy on that day. When you have daily data, you have a good chance, within 10 days or seven days, of getting a full image of the country, in our case. And there are many other problems, of course: in the forest maybe there is snow, but it falls from the trees, and so on; there are a lot of problems with this kind of data... Any more questions?
We have some time left; you promised to sing. I said that if I sing, people will leave, so... We have one more question. Great. The question is: are you sharing any snow maps on your site or somewhere? No, for now it's internal, but that's not a problem; it will be open. I think it's still open even if we don't advertise it; I think it's on a public server right now. But yes, it's all in Romanian. Are you collaborating with the Norwegian Meteorological Institute? Not the Meteorological Institute, but someone in Norway; not the Met Office, but in Tromsø, at the Norwegian Computing Centre. Okay, because I'm using snow maps for my calculation of UV radiation in Norway, so I know there's an institute in Norway working with a model for snow cover in Norway. Yes, this project, Snowball, is a collaboration between Romania and Norway, so it's probably the same institute. Yes, they specialize in developing algorithms to extract snow from remote sensing data. Yeah, okay, thanks.
I can see now.