Spatial tools for LiDAR based watershed management and forestry analysis integrated in gvSIG

Transcript
OK, let's start with the last talk of the session. Silvia Franceschi from Italy, a hydrologist at the company HydroloGIS, is going to talk about the spatial tools that she built for her research. She is also a PhD student at the Free University of Bolzano, and she will present watershed management and forestry tools which are based on LiDAR data. Thank you, and we look forward to this presentation.
Thank you, you already introduced me. I am an environmental engineer specializing in hydrology and fluvial morphology, and I am a co-founder of HydroloGIS, which is a small environmental engineering company based in Bolzano, Italy. From this year we are part of the gvSIG Association, and our daily work is mainly dedicated to the development of scientific models in hydrology, hydraulics and forestry. I am also a PhD student at the Free University of Bolzano. The tools that we developed over the years are integrated in a free and open source library called JGrassTools. This is an open source spatial processing library focused mostly on geomorphological analysis and environmental modelling in general, with also some forestry and hydrological tools. The library has been developed since 2002 with the contribution of the University of Trento, starting from the HortonMachine package in GRASS. There were some important steps in the evolution of the code. In 2005 the main development and coordination of the project passed from the university to HydroloGIS, and we decided to continue to develop the library. Since at first all the modules were command-line based, in 2007 we decided to move to the uDig framework and we integrated these tools in uDig. We then continued developing the library, integrating new models and new formats like NetCDF, and also an environmental modelling framework to link scientific models together and run simulations. More recently, in 2013, we integrated the support for the connection with mobile phones, preparing and reading the data for digital field mapping with geopaparazzi. Finally, in 2014, when I started my PhD, we started to implement the LESTO library,
which is the part of the library dedicated to LiDAR data analysis, in particular in the field of forestry. The last step has been done at the end of 2015 and in 2016, with the integration of the library in gvSIG: we did this integration by making the library available as a plugin for gvSIG.
In gvSIG you can download the JGrassTools library through the standard plugin installation of gvSIG. Once you install the library, you will find a new entry in the menu, the JGrassTools menu, in which you will find some other entries. One of these entries is the Spatial Toolbox.
If you select the Spatial Toolbox, a dedicated window opens on top of the main program window. As you see here, on the left side of the window there is the list of the modules. The modules are divided into sections and subsections, and you can select a module directly or search for it using the search field at the top of this section. At the bottom (you cannot see them here) there is the possibility to select some options. You can choose to show the experimental modules, which are modules that work but have not yet finished the complete testing procedure. There is the possibility to run in debug mode, which can help you to find a possible problem in a module, or to ask for support on the community mailing list. And, most important, there is the option to set the maximum memory: if you have more memory available on your computer, you can give more memory to the modules, so you can run a module on a bigger area or on data with a higher resolution. When you select a module on the left side, on the right side of the panel you will have all the parameter fields and the possibility to select the layers that are loaded in the gvSIG view. Finally you can run the module using one of the running options: you can run the module as it is, or create a script from the module, so that you can then edit the script, add some loops, or run an existing script that you already edited. During the next months we will also adapt the modules so that they will be available in the standard scripting language of gvSIG, which is Python, so maybe some of you are more familiar with that.
So let's see some of the tools that can be helpful for my analysis, for watershed analysis and in particular for the analysis of natural hazards in a watershed. The first section I will present is the HortonMachine. This is the core section of JGrassTools and it contains mainly modules regarding geomorphology. For example, there are modules for the extraction of the drainage directions and of the contributing areas; it is possible to extract the network, the watershed, and the hierarchical enumeration of network and watershed. Then there are modules for the evaluation of the hillslope distance to the outlet, and other tools for the evaluation of some hydrological and geomorphological indexes.
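As an illustration of the drainage-direction extraction mentioned here, a minimal D8 step can be sketched as follows (a toy example, not the HortonMachine implementation; the tiny DEM in the test is invented):

```python
import math

# Toy D8 drainage-direction step of the kind the HortonMachine section
# computes: each cell drains towards its steepest-descent neighbour.

def d8_direction(dem, r, c):
    """Return the (dr, dc) offset of the steepest downslope neighbour of
    cell (r, c), or None if the cell is a pit (no lower neighbour)."""
    best, best_drop = None, 0.0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            rr, cc = r + dr, c + dc
            if not (0 <= rr < len(dem) and 0 <= cc < len(dem[0])):
                continue
            # slope = elevation drop divided by the distance to the neighbour
            drop = (dem[r][c] - dem[rr][cc]) / math.hypot(dr, dc)
            if drop > best_drop:
                best, best_drop = (dr, dc), drop
    return best
```

From the per-cell directions, contributing areas and the channel network follow by accumulating flow along the pointers.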
Some important tools are also available inside the statistics section: for example, we have modules for interpolating data. One is kriging; we implemented two modules for two different interpolation approaches. One is based on a given variogram model, which you can also create with your data inside JGrassTools. The other one is called Jami, a simpler model which can be used especially for temperature, pressure, humidity and wind velocity, because it takes into account both the position of the meteorological stations and their difference in elevation. With this kind of meteorological data, the difference in elevation between stations is very important for the interpolation.
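As a rough illustration of elevation-aware interpolation (not the actual kriging or Jami code), here is a sketch using inverse-distance weighting on detrended temperatures; the station values and the lapse rate are invented for the example:

```python
import math

# (x, y, elevation_m, temperature_C) for three hypothetical stations
STATIONS = [
    (0.0, 0.0, 200.0, 14.0),
    (1000.0, 0.0, 800.0, 10.4),
    (0.0, 1000.0, 1400.0, 6.8),
]

LAPSE_RATE = -0.006  # degC per metre, an assumed atmospheric lapse rate


def interpolate_temperature(x, y, z, stations=STATIONS, power=2.0):
    """IDW on elevation-detrended temperatures, then re-apply the lapse
    rate for the target elevation z."""
    num = den = 0.0
    for sx, sy, sz, st in stations:
        # remove the elevation trend before the spatial interpolation
        t0 = st - LAPSE_RATE * sz
        d = math.hypot(x - sx, y - sy)
        if d < 1e-9:
            return t0 + LAPSE_RATE * z
        w = 1.0 / d ** power
        num += w * t0
        den += w
    return num / den + LAPSE_RATE * z
```

Without the detrending step, a valley point surrounded by high-elevation stations would inherit temperatures that are far too cold.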
The next module is Peakflow. It is a semi-distributed hydrological model for the evaluation of the maximum discharge in small catchments, for a given precipitation event, but also using statistical information on rain like the intensity-duration curves. Peakflow can give you the maximum discharge, the maximum of the hydrograph, for a section at different return times, so it is useful for creating the data for the flood directive, at least in Europe.
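The kind of estimate Peakflow produces can be illustrated with the much simpler rational method; the intensity-duration coefficients and catchment values below are invented and are not the module's parameters:

```python
# Minimal rational-method sketch of a peak discharge derived from an
# intensity-duration curve (illustrative stand-in for the semi-distributed
# model described above).

def idf_intensity(duration_h, a=30.0, n=0.5):
    """Intensity-duration curve i = a * t^(n-1), in mm/h (assumed a, n)."""
    return a * duration_h ** (n - 1.0)


def rational_peak_discharge(area_km2, tc_h, runoff_coeff=0.4):
    """Q = c * i(tc) * A, converted to m^3/s.

    1 mm/h over 1 km^2 equals 1/3.6 m^3/s, hence the division by 3.6."""
    return runoff_coeff * idf_intensity(tc_h) * area_km2 / 3.6
```

The design discharge grows with the catchment area and shrinks as the concentration time lengthens, because longer durations have lower intensities on the IDF curve.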
We also have a very simple one-dimensional hydraulic model integrated in JGrassTools. It is based on the Saint-Venant equations, and all the input and output data are GIS based: you give the geometries with all the attributes that are needed by the program, and also the output of the program is given as GIS data on the sections. It calculates the water depth, the velocity and the Froude number in each section. It is very simple, but in fact not so simple, because it can handle also lateral contributions from tributaries, and also confluences and offtakes, maybe some power plants, things like that. In the hydraulics section we also have a module to prepare the input data for HEC-RAS, the 1D model.
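The quantities the 1D model reports per section (velocity and Froude number) can be illustrated with a steady uniform-flow sketch based on Manning's equation for a rectangular section; the roughness and geometry values are invented:

```python
import math

G = 9.81  # gravity, m/s^2


def manning_velocity(depth, width, slope, n=0.035):
    """v = (1/n) * R^(2/3) * S^(1/2), with R the hydraulic radius of a
    rectangular section (area over wetted perimeter)."""
    area = depth * width
    wetted = width + 2.0 * depth
    radius = area / wetted
    return (1.0 / n) * radius ** (2.0 / 3.0) * math.sqrt(slope)


def froude_number(velocity, depth):
    """Fr < 1 means subcritical (tranquil) flow, Fr > 1 supercritical."""
    return velocity / math.sqrt(G * depth)
```

This is only the steady uniform-flow limit; the full Saint-Venant model additionally tracks how depth and velocity vary along the reach and in time.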
For the natural hazards we also have a module to evaluate the hillslope stability. With this module it is possible to obtain a map of the stability conditions for a given precipitation, but also a map with the critical precipitation in each pixel, so you can know which areas are stable, unconditionally stable, unconditionally unstable and so on.
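Stability maps of this kind are typically built from the classical infinite-slope factor of safety; the sketch below uses that standard relation with invented soil parameters, and is not the module's actual code:

```python
import math

def factor_of_safety(slope_deg, wetness, cohesion=2000.0, phi_deg=30.0,
                     soil_depth=1.0, gamma_soil=18000.0, gamma_w=9810.0):
    """Infinite-slope factor of safety (FS < 1 means unstable):

    FS = [c + (gamma_s - m*gamma_w) * z * cos^2(b) * tan(phi)]
         / [gamma_s * z * sin(b) * cos(b)]

    where b is the slope angle, m the wetness index (0 dry .. 1 saturated),
    z the soil depth; units are N/m^2 and N/m^3 (all values illustrative)."""
    b = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    resisting = cohesion + (gamma_soil - wetness * gamma_w) * soil_depth \
        * math.cos(b) ** 2 * math.tan(phi)
    driving = gamma_soil * soil_depth * math.sin(b) * math.cos(b)
    return resisting / driving
```

Sweeping the wetness until FS drops to 1 in each pixel is one way to obtain the critical-precipitation map mentioned above.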
Another one: since we live and work in a mountain region, we have a lot of debris flow events with a high volume of sediment, so we implemented a simple module for them. It is not a two-dimensional hydraulic model; it is based on morphology. It is used for the evaluation of the triggering points, the propagation in the network, and then the spreading of the debris over the alluvial fans. As you can see here, even if this module is very simple, it gives the idea of where the debris flow will go and the amount of sediment transported in the segments. Here is the result of the simulation, and in the image you can compare it with the previous real events simulated with this software.
In JGrassTools we also have a lot of tools for the evaluation, the analysis and the extraction of terrain attributes: for example, for rasters we have contour map calculation and mosaicking of rasters; there are tools for vector data; and for LiDAR data there are tools for the rasterization of the point clouds, and also tools to connect vector data and assign attributes to rasters.
There are two tools that are very useful in my analysis, and I will present them now. The first one is the profile extractor, and it is a bit different from the other GIS tools: you can give as input a shapefile with the lines along which you want to have the profile, and the output is a CSV file with a profile for each of the lines. You can then use the CSV file in your spreadsheet to analyse the profiles.
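The idea of the profile extractor can be sketched in plain Python; this is a toy illustration with an invented grid and sampling step, not the JGrassTools module itself:

```python
import csv
import io
import math

def sample_profile(grid, cellsize, p0, p1, step):
    """Sample a row-major grid along the segment p0 -> p1 every `step` map
    units (nearest-neighbour); return (distance, value) tuples."""
    length = math.hypot(p1[0] - p0[0], p1[1] - p0[1])
    out = []
    d = 0.0
    while d <= length + 1e-9:
        t = d / length
        x = p0[0] + t * (p1[0] - p0[0])
        y = p0[1] + t * (p1[1] - p0[1])
        # nearest cell for this point along the line
        col = int(round(x / cellsize))
        row = int(round(y / cellsize))
        out.append((d, grid[row][col]))
        d += step
    return out


def profile_to_csv(samples):
    """Write the (distance, elevation) pairs as CSV text, one profile row
    per sample, ready for a spreadsheet."""
    buf = io.StringIO()
    w = csv.writer(buf)
    w.writerow(["distance", "elevation"])
    w.writerows(samples)
    return buf.getvalue()
```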
The second one is very interesting, and it is called BobTheBuilder. We use it for modelling artifacts in the digital elevation models: you can create an artifact and then run a simulation considering a structure that does not yet exist in the DTM. And you can do editing as well, so you can delete whatever you want from the elevation model at each step of the modelling.
Then there is the mobile section, where you can prepare your own data to use as background in geopaparazzi, the application we developed for digital field mapping, and afterwards you can extract the data from the geopaparazzi project and transform them into shapefiles.
One of the last parts that we implemented is the LESTO library. LESTO stands for LiDAR Empowered Sciences Toolbox Opensource. It is developed in collaboration with the Free University of Bolzano, and it is mainly dedicated to the analysis of LiDAR data.
It contains different sections, as you can see here, dedicated to the filtering, handling and managing of LiDAR data, but also tools to rasterize the point clouds, for example based on different filters like the adaptive TIN or the inverse distance, to create digital terrain models and also digital surface models.
Since buildings usually are a problem and create some confusion in the elevation values of the vegetation, we decided to implement a model to extract the buildings from the LiDAR dataset. It is a two-step model, and the building polygons in output of the second step of the model are quite good.
Then there is the main section of the LESTO library, which is dedicated to the extraction of the vegetation. We decided to follow the approach of the individual trees, so we obtain the position and the main characteristics of each single tree. For now we have implemented two models based on the algorithm of the local maxima: one is raster based and one is point cloud based.
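A minimal raster-based local-maxima search of the kind mentioned here might look like this; the window size, the minimum-height threshold and the canopy height model values are illustrative assumptions, not LESTO's defaults:

```python
# Toy local-maxima tree-top finder on a canopy height model (CHM) grid.

def find_tree_tops(chm, min_height=2.0):
    """Return (row, col, height) for interior cells that are strictly
    higher than all 8 neighbours and above min_height (to skip shrubs)."""
    rows, cols = len(chm), len(chm[0])
    tops = []
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            h = chm[r][c]
            if h < min_height:
                continue
            neigh = [chm[r + dr][c + dc]
                     for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                     if (dr, dc) != (0, 0)]
            if all(h > n for n in neigh):
                tops.append((r, c, h))
    return tops
```

Each detected top can then seed a crown-delineation step (region growing or watershed) to recover the per-tree characteristics.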
The last class is a set of modules dedicated to the evaluation of the large wood contribution during flooding events. These were developed during the Google Summer of Code of 2016. They are GIS-based tools for predicting the magnitude of wood logs inside the rivers during flooding events. They consider two main processes that are related to the wood: one is the recruitment from the hillslopes, and the other is the transportation of the logs downstream.
These are the main characteristics of these modules: they are based on the cross-sections extracted directly from the DTM, but there is also the possibility to insert fixed sections with a fixed width, like bridges. They consider the hydraulic conditions: we run a simplified 1D model for the evaluation of the water depth and the velocity, to check whether the flow is able to carry the logs. The vegetation can be given as input to the model either as a canopy height model with the forest stand volume, or as a vector layer of single trees. The logs are then propagated downstream, and the propagation is based on a double check: on the length of the logs against the width of the section, and on the diameter of the logs against the water depth.
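The double check described here can be sketched as follows; the pass/stop rule follows the description above, while the section and log values in the test are invented:

```python
# Hedged sketch of the log-propagation check: a log passes a river
# cross-section only if it is shorter than the section width and its
# diameter is smaller than the water depth.

def log_passes(log_length, log_diameter, section_width, water_depth):
    return log_length < section_width and log_diameter < water_depth


def propagate_logs(logs, sections):
    """Walk each (length, diameter) log down a list of (width, depth)
    sections, ordered upstream to downstream; return for each log the
    index of the section where it stops, or None if it exits the reach."""
    stops = []
    for length, diam in logs:
        stop = None
        for i, (width, depth) in enumerate(sections):
            if not log_passes(length, diam, width, depth):
                stop = i
                break
        stops.append(stop)
    return stops
```

A stopped log in the real model also raises the clogging at that section, which is why the talk mentions logs forcing the ones behind them to stop; that feedback is omitted from this sketch.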
These are some of the results. This is the integration in gvSIG: you can see the input and output sections, and the section IDs, where you can see the clogged sections and the volume of the logs that have stopped in each section. Thank you. If you have any question, or if you want to download the tools, you can ask here and follow this link. Thank you.
Question: The building detection algorithm uses two steps. Could you give a bit more detail on how you detect the buildings?

Answer: Yes. The first step extracts the main polygons that could be buildings, but they could also be other objects. In the second step these polygons are refined, considering how many returns intersect them, how many of the points lie on the ground, and other checks like that.

Question: I was curious how well you are doing in finding single trees in your LiDAR data, also compared with areas of more isolated trees.

Answer: For now we have good results with these tools, because we also integrated a particle swarm optimizer, so we ran a lot of simulations for the calibration. The detection of single trees in coniferous forests is at least 80 percent, often much more than 80 percent. We are now finishing the implementation of another tool that is not based on local maxima, it is based on another algorithm, and its performance is better on coniferous forests. We are now also testing it on broadleaf forests: even if the structure there is multilayer, the results are good.

Question: Do you only take the points classified as first returns when you are detecting single trees, or do you take all points?

Answer: In general we take all points, but it really depends on the kind of forest and on the density of the forest.
We also do not use the classification given by the provider; we just do some preprocessing steps ourselves for our problem.

Question: Have you tried areas with mixed forests?

Answer: We are trying now, it is part of my PhD. I have good results, but not as good as with the coniferous forests. The algorithm is not yet published; the article is in preparation, so in the next months you will see it.

Question: How do these processes scale with the size of the datasets?

Answer: It depends. If you have big datasets they will take a lot of time, and you will need a powerful computer. If you want to analyse a small area, one or two square kilometres, it depends also on the memory you give to the modules: with more memory, more power, you can work also on a laptop. I do some elaborations on mine.

Question: Regarding the last tools, about the large wood contribution to the flooding: how do you define the volume of the wood, and where is it coming from?

Answer: It depends on the input vegetation data. If you have a canopy height model and forest stand data, there are simplified modules to evaluate the volume of the vegetation. For each river section we extract the subbasin referring to that section, so we have a set of subbasins, and the areas that are unstable inside the subbasin, or that are flooded, contribute to the wood. The vegetation in these areas contributes to the amount of wood that arrives in each section. We do a statistical elaboration on the heights to obtain the length of the logs, and we calculate the volume with an allometric function given as input to the model, so if you have your own allometric function you can put it in. We then calculate the volume and the average length of the vegetation in each section, and with the 1D hydraulic model we propagate the logs downstream, verifying the hydraulic and geometric conditions for passing each section. If in a section some logs are longer than the width of the section, or their diameter is bigger than the water depth, they stop, and they also force the logs coming behind them to stop.

Question: So do you have good algorithms for estimating how much material will be released from the hillslopes?

Answer: We consider the unstable and connected areas. We have two categories: one, like the one I showed for the hillslope stability, is based on the stability map, and the other on the distance from the network.

Moderator: OK, thank you very much, and thanks to all the speakers of this session.
Metadata

Formal Metadata

Title: Spatial tools for LiDAR based watershed management and forestry analysis integrated in gvSIG
Series Title: FOSS4G Bonn 2016
Part: 176
Number of Parts: 193
Authors: Franceschi, Silvia (HydroloGIS - Free University of Bolzano); Antonello, Andrea (HydroloGIS)
License: CC Attribution 3.0 Germany: You may use, adapt, and copy, distribute and transmit the work or its content in unchanged or adapted form for any legal purpose, as long as you credit the author/rights holder in the manner specified by them.
DOI: 10.5446/20315
Publisher: FOSS4G; Open Source Geospatial Foundation (OSGeo)
Publication Year: 2016
Language: English
Production Place: Bonn

Content Metadata

Subject Area: Computer Science
Abstract: In 2014 we started the development of the library LESTO (LiDAR Empowered Sciences Toolbox Opensource): a set of modules for the analysis of LiDAR point clouds with an Open Source approach, with the aim of improving the performance of the extraction of the volume of biomass and other vegetation parameters on large areas for mixed forest structures. LESTO contains a set of modules for data handling and analysis implemented within the JGrassTools spatial processing library. The main subsections are dedicated to: preprocessing of LiDAR raw data (LAS), creation of raster derived products, normalization of the intensity values, and tools for extraction of vegetation and buildings. The core of the LESTO library is the extraction of the vegetation parameters. We decided to follow the single-tree-based approach and implemented the extraction of tops and crowns from local maxima, the region growing method and the watershed method; all can be applied on LiDAR derived raster datasets as well as on point clouds of raw data. An automatic validation procedure has been developed considering an optimizer algorithm based on Particle Swarm (PS) and a matching procedure which compares the position and the height of the extracted trees with the measured ones and iteratively tries to improve the candidate solution by changing the models' parameters. On a watershed level, the resulting extracted trees with position and main characteristics can be used for forestry management or for the evaluation of natural hazards (hillslope stability, large wood transportation during floods).
Keywords: HydroloGIS; Free University of Bolzano
