
Global land cover mapping and assessments


Formal Metadata

Title
Global land cover mapping and assessments
Title of Series
Number of Parts
57
Author
License
CC Attribution 3.0 Germany:
You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifiers
Publisher
Release Date
Language
Producer
Production Place
Wageningen

Content Metadata

Subject Area
Genre
Abstract
Martin Herold is Professor of Geoinformation Science and Remote Sensing at Wageningen University. In his talk, he explored different projects and initiatives, including at the European level, that aim to enhance the availability of global land cover mapping and the progress in its assessment.
Transcript: English (auto-generated)
Thank you for the invitation to present here at this workshop, much appreciated. Also thank you to the OpenGeoHub team for giving me the opportunity to talk here and for organizing such an important and interesting event, virtually or hybrid, with a few people in the room and a lot of people online. I'm going to talk about global land cover mapping and assessments, an area I've been working on for a number of years. My name is Martin Herold; I'm from Wageningen University in the Netherlands and from the GFZ Helmholtz Centre in Potsdam, and I have been working in this field for a number of years. It's nice sometimes to get the opportunity for such a keynote talk, to reflect a bit on the latest progress, dive into some new developments and look a bit ahead at what's coming and what's new, and hopefully we can have a useful discussion afterwards.

I'm going to mainly talk about five different things, starting with some recent progress and then diving into some more specific aspects. We all know that we have been improving global land monitoring, and land cover monitoring in particular, for a number of years. We have been able to do that because we have a large array of satellite archives at hand. For most people working in the field, the longest and deepest archive is the Landsat archive, shown here on the left-hand side of this slide. You have the dense time series that is part of that archive, and based on this dense time series you can produce a land cover map, you can produce a land cover change map if you look at it consistently over time, and you can produce a land cover prediction at basically any of these points in time for the last 30 to 40 years, depending on how deep the archive is.
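To make the idea concrete that one dense time series supports both a land cover label and a change detection, here is a minimal, purely illustrative sketch in Python. The NDVI-like series and the nearest-centroid rule are invented for illustration; they do not reflect the actual Landsat processing chains discussed in the talk.

```python
import statistics

# Hypothetical annual "NDVI-like" series for three pixels over 30 years.
# A real analysis would use actual Landsat surface reflectance time series.
forest  = [0.80] * 30
crop    = [0.45, 0.60] * 15              # oscillating cover
changed = [0.80] * 15 + [0.30] * 15      # forest loss around year 15

def features(ts):
    """Per-pixel summary features: mean level and variability of the series."""
    return (statistics.mean(ts), statistics.pstdev(ts))

def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# "Train" a nearest-centroid classifier on the two stable example pixels.
centroids = {"forest": features(forest), "cropland": features(crop)}

def classify(ts):
    f = features(ts)
    return min(centroids, key=lambda c: dist(f, centroids[c]))

# A land cover map labels each pixel from its whole series; a change map
# compares labels of earlier and later epochs of the same series.
print(classify(forest))                                  # stable pixel
print(classify(changed[:15]), "->", classify(changed[15:]))  # epoch-wise change
```

The same splitting trick gives a label "at any point in time": classify a window of the series centered on the year of interest.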
And that's been basically our approach; that's been our workhorse in terms of data to make big progress in tracking land cover and land cover dynamics. We also see an evolution of moving these concepts into more of a service or operational context. For example, the Copernicus Global Land Cover Monitoring Service has been providing annual land cover data since 2015 at 100 meters globally. What's important here is that it provides not only discrete classes of land cover but also land cover fractions: tree cover, shrub cover, herbaceous cover, cropland cover and so on. Fractions are more flexible information, giving user communities the ability to create their own land cover data rather than being tied to fixed classes. What is also very important about that system is that the land cover generation system, which processes the satellite data and turns it into land cover information, is accompanied by a training data collection system and an independent validation system. Reference data for improving the training and for validation, continuously updated through the process, is an important part of the system. We should realize that with all the machinery and data we have at hand, the quality of our outputs very much depends on how well we can train our methods and, of course, how well we can independently assess them to tell users how good a map is for certain purposes, and why.

We're also moving to more detailed scales: the first global 10 meter land cover data based on Sentinel-1 and 2 will become available next month. It's a product for 2020, and it demonstrates that this is possible. Processing an annual 10 meter coverage of both Sentinel-1 and 2 globally is quite an effort, but it can be completed within basically half to three quarters of a year after the last data take, and I think that's quite a big achievement.
So these data can be produced more regularly now and will be available, and I think the expectation is that some of the Copernicus services will move from 100 meters to more detailed scales using the Sentinel data streams. We're also using the historical data and all kinds of other information to look back in time, combining it with other data sources. One study, for example, looked at global land cover and land use change over basically the last 60 years, and with those observation data you can look at land cover change processes in more detail, both in terms of classes and in terms of dynamics, for example gross versus net changes. What we are finding is that there's actually much more change in terms of land cover than we previously anticipated. Over these long time spans we also see phases of acceleration and deceleration of global land cover change, and I think we still have to explore further what's actually behind these trends.
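The gross-versus-net distinction here can be made concrete with a toy example: two categorical maps of the same pixels at two dates (labels invented for illustration). Net change per class can be zero while the gross change, the total area that actually changed, is substantial.

```python
from collections import Counter

# Toy land cover labels for the same ten pixels at two dates.
t1 = ["forest", "forest", "forest", "crop", "crop",
      "crop", "forest", "crop", "forest", "crop"]
t2 = ["crop", "forest", "forest", "forest", "crop",
      "crop", "crop", "forest", "forest", "crop"]

# Gross change: every pixel whose label differs between the dates.
gross = sum(a != b for a, b in zip(t1, t2))

# Net change per class: change in class totals between the dates.
net = {c: Counter(t2)[c] - Counter(t1)[c] for c in set(t1) | set(t2)}

print("gross change:", gross, "pixels")   # 4 pixels changed
print("net change:", net)                 # yet class totals are unchanged
```

Here deforestation in some pixels and regrowth in others cancel out in the net numbers, which is exactly why net-only accounting underestimates land cover dynamics.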
Of course, land cover changes are connected to certain drivers, and some of the big processes relate to land use change: agricultural expansion, deforestation, expansion of vegetation in dryland areas and so on. In terms of temporal detail, we also increasingly have near real-time data streams that we can use to update our land cover information. Here you see a dense time series of Sentinel-1 radar data, which always gives you an observation, and with that data you can check, any time a new observation becomes available, whether there is anomalous behavior in the time series that can hint at an ongoing land cover change process. These are the systems behind providing information at more of a weekly to monthly level. An example is shown here: a Sentinel-1 based forest disturbance monitoring system which runs on a weekly basis and tracks whether there are new disturbances, in this case in the Congo Basin forests. We see here a weekly evolution of a logging activity within the Congo Basin, starting with road building and the tree extraction surrounding it. This kind of data is becoming available now in a particularly valuable form, for example to track potentially illegal activities in areas where logging was not supposed to be happening. And this is not just a small-scale exercise; this is now done operationally for currently 35 countries in the humid tropics and will expand to the whole pan-tropics later this year.
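The weekly anomaly-flagging idea can be sketched as follows. This is a deliberately simplified baseline-plus-threshold rule on invented backscatter numbers, not the actual algorithm behind the operational Sentinel-1 alerts.

```python
import statistics

# Hypothetical Sentinel-1-like backscatter history for one stable forest
# pixel, in dB; values are invented for illustration.
history = [-7.1, -6.9, -7.0, -7.2, -6.8, -7.1, -7.0, -6.9, -7.1, -7.0]

baseline_mean = statistics.mean(history)
baseline_std = statistics.pstdev(history)

def is_anomalous(new_obs, k=3.0):
    """Flag a new observation outside mean +/- k standard deviations."""
    return abs(new_obs - baseline_mean) > k * baseline_std

# Forest loss typically lowers C-band backscatter markedly:
print(is_anomalous(-7.05))   # ordinary observation, no alert
print(is_anomalous(-11.5))   # strong drop: candidate disturbance alert
```

Operational systems typically require several consecutive anomalous observations before confirming an alert, which suppresses speckle-driven false positives; the single-observation rule above is only the core idea.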
So with that being said, we have seen a lot of progress. We are building on increasingly long and increasingly dense archives that not only give us good ideas about land cover characteristics but also dynamic information that can increasingly be linked to human activities, and that's what's interesting now for all the field communities: we're able to track certain activities that matter. When we think about it technically, though, most of these analyses still use relatively simple spectral or remote sensing indices. It's mostly an empirical approach, we're still dealing with a few broad classes, and in the end there's still relatively limited physical understanding of what's actually happening.

That brings me to my next point. What I see evolving is that we're moving to higher quality measurements of the land surface, in particular what I refer to here as more structural information, for example vegetation structure, but also hyperspectral information. For hyperspectral, with the EnMAP system launching next year and CHIME, one of the next Copernicus satellite missions, we're increasing the availability of deep spectral information for these kinds of analyses. We can expect that more spectral information lets us dive into certain classes in more detail, for example soil characteristics, vegetation species distributions, or water characterization in terms of different types of water, water quality and so on. We will also have more structural information; there's an animation here that's currently not working, but you can look at it on the website. Detailed structural information, so something about height and vegetation structure, is becoming available not only as one-time efforts, for example laser measurements from drones or from terrestrial laser scanning, but also more temporally. We're really getting temporal structural information that tells us about dynamics in vegetation in much more detail, and that refers back to what I mentioned: more physical understanding of what the dynamic earth surface really means and how it translates into the satellite data we are acquiring.

We also now have structural information coming from space. The GEDI NASA mission provides a sample-based approach to vegetation structure and height; this is just the data for the first six months of 2019, data is still being acquired today, and we are really developing a thick archive of vegetation structure information based on space-based lidar. Since this is a sample-based sensor, you can then use, for example, Sentinel-1 and 2 data to extrapolate it; here is a prediction of vegetation height from a combination of the GEDI height data with Sentinel-1 and 2 as proxies to spatialize it. So this is another dimension that we are going to be able to explore for large area applications in terms of improving land monitoring: more detailed information about physical characteristics, where changes in physical structure and spectral behavior tell us more about how and why land surfaces are changing. That goes in line with improvements in the new terrestrial sensing domains, with drone-based sensing providing the quality data to understand, for example, changes in structural and spectral behavior, which we can relate to some of the new satellite missions: GEDI, for which I already showed an example, EnMAP coming up, CHIME for hyperspectral, and also for example the ESA BIOMASS mission. That structural dimension and the hyperspectral dimension are really something we'll be able to
use much more in the future.

Then let me move on to the fourth point, which is about advancing approaches. This conference is of course a lot about how we can look into new approaches: deep learning and other data science approaches, big data analytics. We're talking about global analysis of increasing amounts of data, but also about deriving increasingly complex information. For example, here is a case looking at land use following deforestation, one of the few truly global or large-area studies that really looked at predicting land use. We know land use is relatively complex, much more complex than land cover: land cover is just the physical cover of the surface, while land use is basically a human action and has much more dimensionality. If you think about remote sensing signals, you need much more information to characterize land use: it has a lot to do with spatial context, the temporal dynamics play an important role, and so on. What this study showed is that you can use some of these deep learning models, and there are different options, to make good predictions for some of these land use types. That's encouraging, because if they are properly trained we will be able to get at much more land use and management information that so far we've not been able to easily derive from satellite data.

The same is actually true for the fractional predictions, the fractional cover of trees, shrubs, herbaceous vegetation and crops I already mentioned: they are also improving. The trick is, I think, that we're pretty good at predicting when there is a hundred percent of a certain cover, or zero percent, but the big uncertainties are in between: situations where we have trees, shrubs and grasses in the same location. Some of these fractional predictions are really helping there, and with these approaches and good training you really can improve, although uncertainties remain. This area of mixed land cover units is still one where further improvements can be made, but we're getting better if we use the right approaches, well trained and also validated.

That leads me to the next point: with novel approaches, the critical issue is that we need proper reference data to underpin them. We have good machines, so in terms of machine learning I'm not worried too much about the machine; I'm actually more worried about the learning. Investment in good training data and reference data is really the key, because at the moment that's the bottleneck, or one of the key bottlenecks,
that determines the quality of our outputs.

In terms of assessing and reducing uncertainties in multi-temporal data sets: we know that validation of land cover, including global land cover, has been done for many years, and there are CEOS (Committee on Earth Observation Satellites) calibration and validation protocols that have been published. I think most global land cover maps now come with a proper validation; it's a bit of a requirement, and in fact it's very hard to publish anything these days without one. I think that's good, because producing global land cover data with these large data streams can be done relatively fast and with relative ease, but the quality is really the question: how good is a certain map, particularly for a certain purpose? Now that we're moving from static maps to more multi-temporal mapping, we also have to adjust our validation systems a little bit. To show an example: validation usually requires a proper sampling design, so you need a good set of reference data, and if you want to validate multi-temporal products you also need a sampling design that takes into account that you have multi-year data, that your validation or reference database stays up to date, that you're really validating change, or that you optimize your sampling design to be able to capture changes. Then you can actually start to look at the stability of annual land cover maps over time; stability basically means how consistent the accuracy is over time, and that's one of the requirements users are clearly advocating. To give an example from the global Copernicus land cover service: the stability is actually quite good. One thing we noticed is that because the system started in 2015, with a map produced for 2015 and then 2016, 17, 18 and 19 produced in one go, we did see a little bit of instability between 2015 and the other data sets. That comes from the fact that we probably used a somewhat varying approach across these products; it's still within margins, but those are the kinds of features you see when you look at increasingly long annual time series. The concepts for this multi-temporal validation are really important,
and understanding how consistent the accuracy is over time also tells us a lot about how well change predictions can be made from these annual maps. That applies to land cover maps as it does to other things. As an example, here is global biomass mapping, something that's also evolving at the moment and that relates to land cover, particularly forests. Here you see a large plot-based reference database of biomass compared to multi-date biomass predictions from space-based data. For the lower biomass ranges there is a relatively good match, so there is relative consistency in terms of accuracy and in the differences between plot-based and space-based predictions. But for the higher biomass ranges, above 200 tons of biomass, there is still some variability between the years, and there we expect that any change predictions, or any differencing between the maps, will lead to false changes. That's why, for the time being, we have to be very careful about deriving biomass change from these annual biomass data. But this is also important information for the people producing these data, so they can increase the consistency and improve their change predictions for, for example, biomass change.

The same is actually true for land cover change, where we know we still have a way to go to get really accurate global land cover change predictions. We're getting there: the global Copernicus land service has produced validation of its change predictions, and they're higher than 50 percent, so most of the mapped changes are actually true, but there's still a good share of changes that are either omitted or falsely detected (omission and commission errors), and that has to be taken into account. So what we're seeing is these concepts for multi-temporal and change validation evolving, and we expect them to matter strongly as we move to more change prediction in the land cover and biomass domains, as in the example I've shown.

And then, of course, we're all doing this for a purpose. With global land cover data and the increasing detail coming from these data,
the more detail you have, the more users are actually interested in these types of data: not only global users but increasingly also regional, national and local users. I think we as a community have a role as advocates, explaining what we're doing, why, and for whom, and that's also about transparency of our data. We now have really good concepts for arranging more user-driven mapping. If you ask ten different users, they will want ten different maps, and in fact, with the data we have at hand, you can actually produce them: there's not one map that fits all purposes, but you can produce targeted maps based on certain user requirements. An agricultural user may want cropland distribution, but with more cropping classes; a forest user may want more detail in the forest, maybe not only forest cover but also forest biomass or height; a biodiversity user may really be looking at the diversity of different vegetation types. You can produce data sets for these different purposes, particularly if the user has a role in, for example, providing training and reference data and a role in the validation. Basically, we can let the machine produce targeted products, and with increasing computing facilities that becomes increasingly feasible.

We also see an increasing uptake of these technologies by countries. This is, for example, a recent assessment from the FAO's Forest Resources Assessment 2020, which is a global assessment, and when you look at the data sources countries have been using, particularly in the tropics, we see a large increase in the use of satellite data, for example Landsat data and increasingly also the Sentinels, for their national forest area and area change reporting. So we're not only seeing expert research or commercial progress; we really see operational uptake in country agencies as well, and it's good to see that this step of transferring technology into operational processes is really happening. Of course, that goes along with serious investments in capacity development,
where we as a community, I think, also have a role to play.

We also see this land cover and land cover change data underpinning more spatially explicit greenhouse gas inventories. This is a data set produced by the World Resources Institute with international partners, which tries to turn forest and land change data, combined with biomass maps, fire estimations and all kinds of other earth observation outputs, into an annual prediction of forest sinks and sources. It's actually one of the first large-area, earth-observation-driven greenhouse gas inventories, and we do see increasing demand from the policy side, globally but particularly in Europe, for more spatially explicit greenhouse gas inventories, because they are better suited to policy development, to the implementation of, for example, climate change mitigation actions in the forest sector, and to performance reporting of all kinds of activities. This is one of the first products that shows how this is becoming feasible. I would say it's a research product at the moment, but we know that such products can evolve over time: they will get more accurate and the methods will consolidate. These earth observation systems are modular in the sense that if you have improved data sources you can easily ingest them, and they're relatively open source, so people can work with them. It's not just black-box information; it's relatively well presented, explained and transparent in what is done. Some of these systems are now also being explored for uptake by countries, to integrate their own data sources and come up with more spatially explicit greenhouse gas predictions at an annual or even more detailed level.

Earlier I showed the forest disturbance alerts, and if you combine these alerts, which are available at a weekly level, with space-based biomass predictions, you can get to basically near real-time carbon losses for a certain area. You can also see the different temporal patterns: on the left-hand side you see a selective logging pattern, a multi-year logging operation with a certain seasonality of emissions, and on the right a shifting agriculture system with much more seasonality, much more driven by dry season availability. You can compare the different temporal patterns of emissions with these very temporally detailed data, and again, having this information very much
up to date and available very quickly is where earth observation input can make the difference: it can make this data much more actionable.

With that, I'd like to summarize my main points. We have seen a large array of global examples of improvements in land cover and land cover change monitoring, building upon the deep satellite archives, time series analysis algorithms, and big data analytics. Having worked in global land cover for almost 20 years now, I'd say we're basically in a golden age; it's never been this good. There's still room to improve, but we can now really produce a lot of the useful information asked for by the user community. In the next years we'll need to focus much more on incorporating the structural and the hyperspectral dimensions of the data; these data are becoming increasingly available, also for large area applications, and should further improve our ways to assess land cover and land cover change. The same is true for incorporating some of the new data science and AI type approaches, in particular for complex issues such as land use or certain complex land changes, and for reducing some of the uncertainties we still have, for example in mixed-unit classes. Accuracy estimation procedures are very important: it's very easy to produce maps these days, but it's the proper accuracy estimate, I think, that really makes them valuable for the users. There we also have to increasingly incorporate the temporal component and the assessment of changes, or change prediction, from the satellite data. Because the data are becoming more detailed in both space and time, the temporal detail in particular is very important to link this dynamic information to applications and to actual actions, for example for enforcement, for transparency, for exposing certain types of changes that are important and need to be tracked in high spatial and temporal detail.

With that, I'd like to close. Thank you very much.