LiveDroneMap - an Automatic Real-time UAV Mapping System
Formal Metadata

Title: LiveDroneMap - an Automatic Real-time UAV Mapping System
Number of Parts: 208
License: CC Attribution 4.0 International: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifiers: 10.5446/41020 (DOI)
Language: English
FOSS4G Boston 2017
95 / 208
Transcript: English(auto-generated)
00:00
This is Chang Chang from the University of Seoul. Today, I'll introduce LiveDroneMap, which is an automatic real-time mapping system. I'll explain the background and objective in the introduction, then explain LiveDroneMap
00:21
itself, present the experiments and their results, and end with the conclusion and future work. Natural disasters such as floods, wildfires, and earthquakes around the world
00:44
can dramatically change the landscape or infrastructure of an area. These disasters can lead to rerouted rivers, flooded or damaged roads, and collapsed bridges or buildings, making routes that first responders rely on unavailable.
01:07
It is crucial that first responders have an updated map to find safe routes in the affected areas and know which locations need help.
01:21
GIS provides functions for analysis, interpretation, and visualization based on geospatial information. It helps us understand the spatial trend of an event and plan responses to the event efficiently.
01:42
Usually, in a GIS, the existing aerial or satellite images are out of date and have low resolution. A mission area such as a disaster area may have no existing imagery, and acquiring new data takes much time.
02:03
The geospatial information should be generated rapidly, so that the GIS can be established and applied to the mission operation immediately. UAV mapping systems are a very effective means to acquire high-resolution data of the mission area.
02:24
So, we propose an automatic mapping system based on a UAV, which can operate in a fully automatic way from data acquisition to data processing.
02:42
LiveDroneMap is a UAV-based real-time mapping and sharing solution. We combine a real-time mapping solution with a cloud-based geodata sharing solution. Users in distant areas with Internet access can view the image maps updated in real time by the UAV system during flight.
03:07
The image maps are visualized in 2D and 3D with existing geodata, without any plug-in software, through a standard web browser on a desktop or even mobile devices.
03:24
The total system consists of the aerial segment and the ground segment. The aerial part acquires the sensory data during flight, and the ground part generates maps and visualizes them on the GeoPortal.
03:44
The aerial segment operates the UAV according to the flight plan and acquires the sensory data. It consists of the UAV, a digital camera, a GPS/IMU sensor, and a control board.
04:01
They are mounted on the UAV using a customized container, as in the images on the right. To control the sensors, the control board collects the sensory data with a GPS time
04:21
tag and transmits them to the ground server through long-range Wi-Fi. The long-range Wi-Fi has a 1-kilometer communication coverage. The sensory data is transmitted from the UAV to the server in real time.
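As a rough illustration of what one time-tagged telemetry record might look like on the wire, here is a minimal sketch; the talk does not describe the actual LiveDroneMap wire format, so the field layout and encoding below are assumptions:

```python
import struct

# Hypothetical record format for the control-board-to-ground-server link:
# GPS time tag, position (lat, lon, alt) and attitude (roll, pitch, yaw),
# packed as seven little-endian doubles. NOT the real LiveDroneMap format.
RECORD = struct.Struct("<7d")

def pack_record(t, lat, lon, alt, roll, pitch, yaw):
    """Serialize one time-tagged telemetry record to bytes."""
    return RECORD.pack(t, lat, lon, alt, roll, pitch, yaw)

def unpack_record(buf):
    """Deserialize a record produced by pack_record."""
    return RECORD.unpack(buf)
```

A fixed-size record like this is easy to frame over a lossy Wi-Fi link, since the receiver always knows how many bytes to read per record.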
04:40
The ground segment consists of two parts. One is the image processing part, which generates individual georectified images from the raw images and the position and attitude data of the UAV. The second is the visualization part, which visualizes the processed images on the GeoPortal.
05:03
It takes a few seconds to run these two procedures. The image processing algorithm is developed using the OpenCV library. In the image processing software, individual ortho images are produced automatically and rapidly
05:24
using images from the camera and position and attitude data from the GPS/IMU sensor. It takes several seconds after each image acquisition to generate an individual geo-rectified image.
05:41
So, before the next raw image is acquired, the previous individual geo-rectified image is already generated and visualized through the software. The image processing software performs the geometric correction process. Geometric correction is a process of transforming an uncorrected raw image
06:08
from an arbitrary coordinate system into a map projection coordinate system. The image pixels are positioned and rectified to align and fit into real-world map coordinates.
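As a minimal sketch of this idea (not the authors' implementation), the ground footprint of a nadir-pointing image over flat terrain can be computed from the camera's map position, altitude, and heading; a full pipeline would then warp the raw pixels into that footprint, for example with OpenCV's warpPerspective. All parameter values here are illustrative:

```python
import math

def georectify_corners(cam_e, cam_n, alt, yaw_deg, width_px, height_px,
                       pixel_size_m, focal_m):
    """Project the four image corners to ground (East, North) map
    coordinates, assuming a nadir-pointing camera over flat terrain."""
    # Ground sampling distance: metres on the ground covered by one pixel.
    gsd = pixel_size_m * alt / focal_m
    half_w = width_px / 2.0 * gsd
    half_h = height_px / 2.0 * gsd
    yaw = math.radians(yaw_deg)
    cos_y, sin_y = math.cos(yaw), math.sin(yaw)
    corners = []
    for dx, dy in [(-half_w, half_h), (half_w, half_h),
                   (half_w, -half_h), (-half_w, -half_h)]:
        # Rotate the footprint by the UAV heading, then translate it to
        # the camera's map position.
        e = cam_e + dx * cos_y - dy * sin_y
        n = cam_n + dx * sin_y + dy * cos_y
        corners.append((e, n))
    return corners
```

Given these four ground corners, the raw image can be warped into the map frame with a single perspective transform, which is roughly what "positioning and rectifying the pixels" amounts to.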
06:23
In the sharing and visualization software, the processed individual geo-rectified images are uploaded to the GeoPortal, Mago3D. At the same time, the geodatabase is updated.
06:43
So, the processed image maps are visualized through the GeoPortal. There are some advantages of LiveDroneMap. First, the user just specifies the region of interest on the map and the output options with the required resolution.
07:07
Second, all the remaining processes, from data acquisition through data processing to final product generation, can be performed automatically. So, the user does not need expert knowledge about UAVs, photogrammetry, or image processing.
07:31
LiveDroneMap is applied to this scenario of UAV field operation. First, a disaster occurs.
07:41
Second, the UAV system collects data in the disaster area and transmits the collected data to the cloud server in real time. Third, the transmitted data is processed and mapped on the GeoPortal in real time. In the end, many people can share the geospatial information about the disaster area on the 3D GeoPortal, even though they are not in the disaster area.
08:16
We conducted field experiments twice. One was in Korea on a riverside; the other was in Italy at a UN base camp.
08:26
These experiments were big tasks serving as demonstrations. We had already conducted several smaller tasks. These are the experimental steps.
08:41
First, data acquisition and transmission. Second, the data is stored on the server and image processing runs. At the end, visualization and updating of the geodatabase.
09:08
Before the flight, we confirmed the flight plan. The UAV acquired the sensor data according to the flight plan. The data acquisition cycle was 4 seconds.
09:23
We got raw images and position and attitude data of the UAV. These captured images are samples of the data.
09:41
Let me show you the whole procedure in a video. First, we got ready for flight. We checked the state of the drone, battery, and server in the software.
10:03
Then, we checked the weather and the surrounding environment for safety. The drone took off.
10:29
We checked on the ground server whether the data were transmitted.
10:44
The transmission worked well. Next, image processing and visualization: the transmitted data were processed and visualized on the GeoPortal.
11:07
This is Mago3D. The images were captured, processed, stored in the geodatabase, and visualized.
12:17
This is what you see while the drone is in the air.
12:30
And the drone landed.
12:43
During the flight, the processed individual ortho images are visualized on the GeoPortal. The left image is the result of the experiment in Seoul on a desktop computer. The right image is the result of the experiment in Brindisi on a smartphone.
13:04
In the first step, the sensors were not aligned well.
13:28
After that, the sensors were aligned well. When the sensor values are not correct, the images stack over the wrong positions.
13:46
On the right, you can see the difference between the raw images from the camera and the individual georectified images. The image pixels are positioned and rectified to align and fit into real-world map coordinates.
14:06
We checked the absolute accuracy of the individual georectified images using ground control points. The ground control points were acquired using GPS-RTK.
14:23
Their accuracy is 1 cm. Following these steps, we calculated the RMSE of all images. The average absolute accuracy of the images is 1.5 m.
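The RMSE against the GCP reference coordinates can be sketched in plain Python as follows; the exact error decomposition the authors used is not given in the talk, so this treats each check point as a 2D planimetric error:

```python
import math

def rmse(measured, reference):
    """Root-mean-square error between map coordinates measured on the
    georectified images and GCP reference coordinates (both in metres).
    Each element is an (easting, northing) pair."""
    sq = [(mx - rx) ** 2 + (my - ry) ** 2
          for (mx, my), (rx, ry) in zip(measured, reference)]
    return math.sqrt(sum(sq) / len(sq))
```

Averaging this per-image RMSE over all images would yield a figure comparable to the 1.5 m reported here.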
14:44
We also checked the relative accuracy among 5 images. Following this step, we calculated the standard deviation over all image sequences. The image sequences are the 1st to 5th images, the 2nd to 6th images, the 3rd to 7th images, and so on.
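The sliding five-image sequences described here can be sketched as below; this assumes one tie-point coordinate tracked across consecutive images, and the exact statistic the authors computed may differ:

```python
import math

def relative_accuracy(tiepoint_coords, window=5):
    """Average population standard deviation of a tie point's map
    coordinate over sliding windows of consecutive images
    (1st-5th, 2nd-6th, 3rd-7th, ...)."""
    stds = []
    for i in range(len(tiepoint_coords) - window + 1):
        win = tiepoint_coords[i:i + window]
        mean = sum(win) / window
        var = sum((v - mean) ** 2 for v in win) / window
        stds.append(math.sqrt(var))
    return sum(stds) / len(stds)
```

A perfectly consistent sensor stream would give 0; scatter in the sensor values shows up directly as a larger standard deviation.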
15:05
The average relative accuracy of the 5 images is 70 cm. We also checked the processing time at each phase of the system.
15:21
It takes much time to download the raw images from the server to the local directory because of the raw image size. If the ground sampling distance is 10 cm, it takes about 1.6 seconds to
15:41
visualize the individual ortho images on the GeoPortal after downloading the raw images to the local directory. LiveDroneMap can acquire multi-sensory data, produce high-resolution spatial information in real time, and visualize it rapidly.
16:10
With LiveDroneMap, we can quickly acquire high-resolution image maps of a mission area that has no map or only an old map. The accuracy of the image maps is about 1.5 m, depending on the flight altitude or the sensor values, without any GCPs.
16:33
These up-to-date maps will be useful for many important tasks in urgent operations, such as monitoring and restoring a disaster area.
16:48
We will try to utilize cloud processing more and test with a cheaper drone, such as a DJI, which is widely used in the field.
17:02
We will develop a function for stitching the individual geo-rectified images in real time with parallel processing using the GPU. Also, we plan to develop a function for adjusting the position and orientation of images through methods such as SLAM.
17:25
We would like to test LiveDroneMap in real mission areas next year with UN teams to improve it for more practical uses. Thank you.
18:07
We use reference data and a DEM for the geometric correction process.
18:28
Without the battery it is 7 kg; with the battery, 14 kg.
18:44
You are recording 12-megapixel images with the drone and you send them to the ground server?
19:05
No, we send the original images to the server. So it takes much time to download and upload them.
19:31
It does not use the internet to communicate between the server and the drone.
19:44
But the internet is needed to communicate between the server and my computer.
20:11
In Brindisi, there are two [inaudible].
20:27
So we can fly the drone. What is the pixel size?
20:46
The pixel size is 6 micrometers. As for the ground size of 1 pixel,
21:04
1 pixel corresponds to 1 cm, I think. Or 70 cm.
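The ground size of one pixel follows from the pinhole camera model: GSD = pixel size x altitude / focal length. The focal length and altitude below are assumed values chosen to reproduce the roughly 1 cm figure mentioned, not numbers from the talk:

```python
def ground_sampling_distance(pixel_size_m, altitude_m, focal_length_m):
    """Ground footprint of one pixel in metres (pinhole camera model)."""
    return pixel_size_m * altitude_m / focal_length_m

# With the 6-micrometre pixel from the talk and an ASSUMED 10 mm focal
# length, a ~17 m flight altitude gives roughly a 1 cm pixel footprint.
gsd = ground_sampling_distance(6e-6, 17.0, 0.010)
```

The same formula explains why GSD scales linearly with flight altitude, which is why the map accuracy quoted earlier depends on the altitude flown.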
21:28
You have 0.68 m in your relative accuracy between images. But that is not the ground size.
21:40
I can tell you after the talk. If you are considering SLAM, are you going to do both? We will use SLAM with the video.
22:04
It is not like an image sequence. It is a very basic system. You can make a real-time mosaic while the drone is in the air. The constraint is that the live link between your drone and your laptop on the ground is constrained to a 1 km range.
22:23
So my question is, presumably you can store the data and process it later if you are flying over a large area. Would there be a way to transmit the images, for example by radio or some other transmission method? We conducted an experiment for the communication between the drone and the server.
22:45
Now it uses long-range Wi-Fi, but we want to use LTE communication. It has, I heard, 10 km of communication coverage.
23:05
For disaster management, have you thought about flying with a thermal camera? Our lab uses a thermal camera for marine sites. But I have not experimented with a thermal camera in LiveDroneMap.