
# New GRASS modules for Multiresolution Analysis with wavelets


Speech transcript

00:01

I want to show you a presentation of work which was carried out at the university. I will first present the background and then the implementation; it is about new GRASS modules for multiresolution analysis with wavelets.

01:37

The motivation was the creation of modules for the analysis of geographical data, so that the entire framework of GRASS could benefit from signal-processing techniques. In a first phase simple things were performed, just to prove the concept, such as compression or the removal of errors; more interesting are the annotation and the extraction of objects at a certain scale, and moreover the recognition of geomorphological phenomena. Wavelets give the ability to highlight certain features in a signal which would otherwise not be perceptible. Two well-known techniques are the Fourier transform, which is localized in frequency, and splines, which are localized in space. The problem is that if you want to analyze a signal which is variable in both space and frequency, you need a technique with localization in both domains; this is possible with wavelets, which give a good compromise of localization in space and frequency. Wavelets are a family of functions generated from a mother function by dilation and translation: if the mother function is ψ, and m and n are the indices for dilation and translation, the family ψ_{m,n}(x) = 2^{m/2} ψ(2^m x − n) forms a basis of L², the space of square-integrable functions. One example of a wavelet is the Mexican hat, which has the form of a sombrero; its dilated and translated versions appear as shown. The method used to define wavelets is multiresolution analysis, which consists in the definition of a ladder of nested subspaces V_m for which certain characteristics hold.
These characteristics are: the intersection of all the subspaces contains only the null function; the closure of their union results in the whole L² space; the translates of a certain function, the scaling function, form a basis of one of the spaces; and, as the last condition, the multiresolution condition itself: if a function belongs to the space V_m, then the same function dilated by a factor of two belongs to V_{m+1}. A classical example is the cardinal sine function, whose versions obtained by dilation and translation build a basis at every resolution. The wavelets are then constructed by defining a space W_m which is the difference between two subsequent levels of resolution: W_m is the orthogonal complement of V_m, so that the direct sum of V_m and W_m gives the next finer space, and W_m is orthogonal to V_m. It is possible to build a function whose dilates and translates span W_m and serve as its basis. It is the presence of all this orthogonality that makes such a computationally efficient scheme possible. To understand how multiresolution analysis works, consider the figure: at the first level we have the original signal in the discrete space; after the first decomposition we get a coarser version of the original signal plus the details; the same step is then applied to every sublevel. This is done by means of a recursive relation, the dilation equation. An example is given by the Haar wavelet,
whose coefficients are such that the computation can be carried out simply by averaging and subtracting. Since the number of samples stays the same at every level, no information is lost and the signal can be reconstructed exactly. So far this holds for an infinite, one-dimensional signal; the carrier has to be extended in two directions. First, the two-dimensional case, which is solved by the use of the tensor product of two one-dimensional multiresolution analyses; second, signals of finite length, which are handled by techniques of zero padding, extension by reflection, or periodization, and afterwards symmetric functions can be used to further improve the efficiency. As for the tools, an open-source library was used which was found in the MegaWave project, a French open-source project about image processing, for anyone who is interested. The code was adapted: interface functions were written where it was necessary, to obtain a stand-alone program whose output could then be brought back into GRASS. Modules were created for the decomposition and the reconstruction with orthogonal wavelets, and they were first tested with the Haar wavelet, whose behaviour is predictable.
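The averaging-and-subtracting scheme just described can be sketched in a few lines of Python. This is only an illustration of the principle with hypothetical helper names, not the code of the actual GRASS modules:

```python
def haar_analysis(signal):
    """One Haar level: pairwise averages (coarse signal) and pairwise differences (details)."""
    avg = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    det = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return avg, det

def haar_synthesis(avg, det):
    """Invert one level: every sample pair is (average + detail, average - detail)."""
    signal = []
    for a, d in zip(avg, det):
        signal += [a + d, a - d]
    return signal

x = [4.0, 2.0, 5.0, 5.0, 1.0, 3.0, 8.0, 0.0]
avg, det = haar_analysis(x)
print(avg, det)                       # [3.0, 5.0, 2.0, 4.0] [1.0, 0.0, -1.0, 4.0]
print(haar_synthesis(avg, det) == x)  # True: no information is lost
```

The coarse and detail parts together hold exactly as many samples as the input, which is why the reconstruction is exact, as stated above.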

09:37

A very trivial example: if you average a map of constant value you get the same value everywhere, and the subtracted details are exactly zero. In a second test, analysis and synthesis were carried out with three levels of decomposition.
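A multilevel decomposition like the three-level test mentioned here simply repeats the one-level step on the coarse part. A minimal sketch, again illustrative Python rather than the module code:

```python
def decompose(signal, levels):
    """Multilevel Haar analysis: collect the details of each level plus the final coarse signal."""
    details = []
    for _ in range(levels):
        avg = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
        det = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
        details.append(det)
        signal = avg          # recurse on the coarse part only
    return signal, details

def reconstruct(coarse, details):
    """Undo the levels in reverse order."""
    for det in reversed(details):
        coarse = [s for a, d in zip(coarse, det) for s in (a + d, a - d)]
    return coarse

x = [float(i % 5) for i in range(16)]
coarse, details = decompose(x, 3)
print(len(coarse), [len(d) for d in details])   # 2 [8, 4, 2]
print(reconstruct(coarse, details) == x)        # True
```

Each level halves the coarse signal while keeping its details, so the total sample count is preserved and the synthesis is exact at every depth.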

09:51

The difference between the original image and the reconstructed one was verified, and then come some of the applications I think everybody was waiting for: compression, structure removal and so on. The first one is a brief observation about compression; two methods were used. In the first, a threshold was applied after computing one level of decomposition: the aim was to put all the values inside a certain range around zero to exactly zero, so that the compression algorithm applied afterwards could gain a better compression factor. Different threshold values were used: with the value 0.001, blanking everything inside that range, 40% of compression was gained; the same procedure with 0.1 gained 60% of compression. A final judgment about the compression should take into consideration the measurement of the signal power that is lost, as a quality control. The second method was the conversion to JPEG, a format that handles the compression of smoothed images better: here the image was first decomposed, and the conversion then gained about 7%. For the second application, a synthetic situation was created in which the dimensions of the objects were known exactly: virtual objects of different sizes were placed on a surface whose resolution was 1 by 1 meter, so at the first level of decomposition we have a resolution of 2 by 2 meters, at the second level 4 by 4, and so on.
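The threshold-and-blank step described for the compression above can be sketched as follows. This is a toy Python illustration under assumed values; the 40% and 60% figures refer to the real maps, not to this sketch:

```python
def haar(signal):
    avg = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    det = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return avg, det

def unhaar(avg, det):
    return [s for a, d in zip(avg, det) for s in (a + d, a - d)]

signal = [10.0, 10.001, 10.0, 9.999, 20.0, 20.002, 5.0, 5.001]
avg, det = haar(signal)

threshold = 0.01
det_z = [0.0 if abs(d) < threshold else d for d in det]   # blank the band around zero
zeroed = det_z.count(0.0)

approx = unhaar(avg, det_z)
max_err = max(abs(a - b) for a, b in zip(signal, approx))
print(f"{zeroed}/{len(det_z)} details zeroed, max error {max_err:.4f}")
```

Long runs of exact zeros are what a general-purpose compressor exploits afterwards; the price is a reconstruction error that stays below the chosen threshold, which is the power-loss control mentioned in the talk.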
At the second level of decomposition we supposed that cleaning the details would remove the object whose size matched that scale. What happened was a proper cleaning: at the first level of decomposition nothing happened, since no object matched the scale of the first level; at the second level of decomposition what was expected happened, and cleaning the details of the second level made the smaller object disappear; and at the third level of decomposition the larger object disappeared as well. This second test is really important because it proves that you can manage and change an object at a certain location without interfering too much with the rest of the landscape. The next application deals with lidar data, which are very dense, with a point every 3 to 5 meters, but also somewhat error-prone. We started from a map in which high spikes can be seen which can be assumed to be measurement errors: they are high-frequency bursts embedded inside the landscape.
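The two-dimensional transform behind these map experiments is, as mentioned earlier, the tensor product of two one-dimensional transforms: one Haar pass along the rows, one along the columns. The sketch below (illustrative Python on a tiny assumed 4 by 4 "terrain", not the module code) also shows the locality observed in the synthetic test: cleaning the finest details only alters the 2 by 2 block that contains the feature.

```python
def haar_rows(m):
    """One Haar level along each row: (averages, details)."""
    avg = [[(r[i] + r[i + 1]) / 2 for i in range(0, len(r), 2)] for r in m]
    det = [[(r[i] - r[i + 1]) / 2 for i in range(0, len(r), 2)] for r in m]
    return avg, det

def unhaar_rows(avg, det):
    return [[s for a, d in zip(ra, rd) for s in (a + d, a - d)]
            for ra, rd in zip(avg, det)]

def transpose(m):
    return [list(c) for c in zip(*m)]

def haar2d(m):
    """One 2-D level as a tensor product: rows first, then columns -> four subbands."""
    a, h = haar_rows(m)
    aa, av = haar_rows(transpose(a))
    ha, hv = haar_rows(transpose(h))
    return transpose(aa), transpose(av), transpose(ha), transpose(hv)

def ihaar2d(LL, LH, HL, HH):
    a = transpose(unhaar_rows(transpose(LL), transpose(LH)))
    h = transpose(unhaar_rows(transpose(HL), transpose(HH)))
    return unhaar_rows(a, h)

terrain = [[1.0] * 4 for _ in range(4)]
terrain[1][1] = 9.0                        # one fine-scale feature on a flat landscape

LL, LH, HL, HH = haar2d(terrain)
assert ihaar2d(LL, LH, HL, HH) == terrain  # exact round trip

zeros = [[0.0] * 2 for _ in range(2)]
smooth = ihaar2d(LL, zeros, zeros, zeros)  # clean all finest details
print(smooth)  # only the 2x2 block around the feature changed; the rest is untouched
```

After cleaning, the feature's block holds its plain average while every other cell is bit-for-bit unchanged, which is the "change an object without touching the rest of the landscape" property.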

13:52

They are exactly the kind of feature that a wavelet model of the error can capture.

13:57

Some thresholding was applied to the details: in a range of about minus 1

14:01

to 2, every value inside was

14:04

again put to 0; on the resulting map you can see the spikes that were removed. A second try was then made, thresholding more heavily, with a range of only a few meters.
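The difference-map check used for these lidar errors can be sketched on a one-dimensional profile: reconstructing from the averages alone and comparing with the original makes the high-frequency burst stand out. Illustrative Python with assumed values only; the real modules work on full raster maps:

```python
def haar(signal):
    avg = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    det = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return avg, det

def unhaar(avg, det):
    return [s for a, d in zip(avg, det) for s in (a + d, a - d)]

# Gently rising profile with one spike, standing in for a lidar measurement error.
profile = [10.0, 10.4, 10.8, 11.2, 30.0, 12.0, 12.4, 12.8]
avg, det = haar(profile)

smooth = unhaar(avg, [0.0] * len(det))    # details cleaned
diff = [p - s for p, s in zip(profile, smooth)]
spike = max(range(len(diff)), key=lambda i: abs(diff[i]))
print(spike, diff[spike])   # the error dominates the difference map
```

A single outlier point produces a detail coefficient far larger than the terrain's own texture, so it is easy to localize in the difference between original and reconstruction.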

14:20

This removed errors that are much more prominent in the DEM. To better understand, you can look at the difference maps between the original and the reconstructed one: in the first case only the errors were removed, while in the second case also some of the minor features went away, which created artifacts on the surface. Another application is a geomorphological one: we tried to get some information about what happens if we remove some minor-scale features from the terrain. The map on the right localizes the convexities and concavities of the topography; it was carried out with the use of the Horton library, which will be presented tomorrow, for anyone who is interested. We wanted to see what happens going down in resolution, and we noticed actually some kind of self-similarity in the pattern of convexities and concavities across the resolutions.

15:41

The next application deals with badlands, which are fractal and a perfect example of the application to a different feature: they are a phenomenon that appears on one side of the valley line, here on the geographic map. Since the badlands develop directed along lines and are somehow recognizable by multiresolution analysis, the decomposition was performed and the details of the first level of resolution were overlaid on the map, where you can see how they follow the badlands.

16:34

The last application, and I think it is the more interesting one: we tried to recognize a certain shape automatically and objectively, again from the data seen before. The assumption was that, removing the minor-scale features, the shape would somehow remain: so the shape was smoothed down by a few levels of decomposition, and then a mathematical definition of the shape was attempted through the tangent and the curvature, computed on profiles obtained by sectioning. You can see the original and the smoothed version, which seems very nice and should have been suitable; but what happened is that the wavelet decomposition generated artificial patterns which made it really impossible to get a good definition of the tangent and the curvature. So the first thing to do in the future is to search for a wavelet family able to carry out this job. What I presented here is just a blueprint, meant to hint at the great possibilities that lie behind this approach. In conclusion: the modules can be used in many other areas, and future development and tests will prove the potential of the work. As future developments, the library should be enriched to make this kind of application possible, with the automatic shape recognition, which would be very important; and another thing is the creation of a compression procedure inside GRASS based on wavelets, with an analytical control of the power that is lost built in. OK, questions or comments?

19:28

I liked your presentation, and I like the problem of automated geomorphological mapping, I mean automated feature extraction from DEMs. I cannot ask any specific question about parts of the presentation because I just do not have the knowledge about the wavelet domain, but I would like to ask you: do you see any

19:58

possible future limitations of using the

20:01

wavelets for automated feature extraction? What might be the future problems or the limitations which would hamper the approach, I mean the future development of the application? — I think the biggest limitation in this case is the whole problem of the choice of the mother wavelet: to understand which wavelet you should choose you need a huge knowledge, and I have to admit I do not have such knowledge. To go on with the work and really get something good at the end, you need that knowledge to find the right wavelet. When we got in contact with the people from the MegaWave project, at the beginning it was "you can do this, you can do this", as if everything were possible; but then we recognized that it only becomes possible with much more knowledge behind the creation and the definition of the wavelet.

21:58

Maybe a comment from a slightly contrary point of view: I am

22:19

going to make a comment rather than ask a question. So far the limitation lies in the DEMs themselves, because they are of such bad quality that it is not possible to extract features in a proper manner: usually they have a great bias on the curvatures, and this is fundamental in many pattern-recognition techniques, so you can clearly see that some techniques fail simply because you do not have the good data that you would need. And then, regarding the

23:14

modules: if I intend to apply them to the kind of elevation models which have been derived, for example, from lidar data, which are available at high resolution, there are a lot of problems with the noise in the original data. The frequency-based approach looks very promising, but I do not know at which frequency or level I have to filter. Is there the possibility to fit the data in some unsupervised way, or do we have to get the knowledge about the things to be removed first?

24:06

You mean knowledge about the object you want to remove? — Yes; when I was looking at your synthetic example, when you

24:14

add or remove an object: if you have a really good elevation model which was derived from the raw data, you have all the buildings inside and whatever else there is on the surface, and the question is how to use such a module if you do not always have the knowledge of what to remove; that is the problem we have. — You first have to analyze somehow the differences between what was the original and the processed one. We made a few tries with a range of thresholds to understand what comes out, because a measurement error is a single point which is completely out of the range of the rest: the moment you build the DEM from it, it gets a bit larger because of the interpolation, but it still comes out at the minor scales and you can find it. We worked at high resolution because we wanted to see if it is effectively possible, and at the moment that is as far as it goes; I hope that future work will be carried out on this. — Thanks a lot for your work. I would like to ask you: I am especially interested in dynamic modelling, and

26:12

I wonder whether these methods are usable for temporal analyses; from my knowledge of using these methods I cannot tell. And the second, short question: is it possible to test your models, are they already

26:34

available for testing, for further improvements, or are they still being worked on? — For the first question, I do not know exactly how to answer: it would have to be tried, and it is somewhat different from what we did; what we have is just a blueprint to work with. For the second question, as remarked before, some parts are not implemented yet and will hopefully be implemented sometime.

27:29

OK, thank you for this presentation.


### Metadata

#### Formal metadata

Title | New GRASS modules for Multiresolution Analysis with wavelets |

Series title | Open source GIS - GRASS user conference 2002 |

Number of parts | 45 |

Authors |
Antonello, Andrea; Zatelli, Paolo |

License |
CC Attribution - NoDerivatives 3.0 Germany: You may use, copy, distribute and make the work publicly available in unchanged form for any legal purpose, provided you credit the author/rights holder in the manner specified by them. |

DOI | 10.5446/21765 |

Publisher | University of Trento |

Year of publication | 2002 |

Language | English |

#### Content metadata

Subject area | Computer science