GStreamer on the Magic Leap One
Formal Metadata
Title of Series: FOSDEM 2020 (part 422 of 490)
Number of Parts: 490
License: CC Attribution 2.0 Belgium: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifiers: 10.5446/47283 (DOI)
Transcript: English (auto-generated)
00:12
So, we move on to our next talk. I remind you that if you want to ask any questions, either from the room or remotely, you can go to onlinequestions.org,
00:22
event 2147, and ask questions there, or vote for questions that have already been asked. Our next talk is about GStreamer and the Magic Leap One. Please welcome Xavier. Hi. So, I'm Xavier, working at Collabora.
00:42
Today, I'm presenting the work I did last year to port GStreamer to the Magic Leap One device, shown there. Those are augmented-reality glasses. Magic Leap is building those glasses
01:02
and selling them online. Augmented reality means glasses you can see through: you see your own real-world environment, and you can add virtual elements inside your living room or anywhere you are.
01:23
For example, you can add a TV screen on the wall of your living room and watch TV like that with the glasses. That's not the same thing as VR, because a virtual-reality headset is completely opaque.
01:42
You can't see anything of the real world; you only see virtual environments. So if you walk around, you can walk into things, and that doesn't happen with AR. The Magic Leap has three elements.
02:02
First you have the glasses, the Lightwear, and then the Lightpack, which is connected with a cable; you cannot remove the cable. The Lightpack is the computer itself, doing all the CPU and GPU work.
02:23
So, all the code I'm going to talk about runs on that small round computer. And, of course, you have a controller to manipulate virtual objects.
02:41
A bit of specs: the Lightpack is where the OS is running. It's an NVIDIA Tegra X2 chipset with six cores, and it's ARM64. The OS is called Lumin OS.
03:01
It's based on Android, but there is no Java. As far as I know, they probably did that to get vendor support from NVIDIA. The media stack is what I really concentrated on, and it comes straight from Android.
03:20
There is a public SDK with a complete C API for everything. On Android you have a Java API to write Java code, but here they wrote everything in C for the middleware. And there is a C++ API for all the UI toolkits
03:43
and the more advanced features. On the audio side, they have custom APIs, something I've never seen on Android, so they wrote that from scratch. And they also wrote their own build system from scratch,
04:03
called Mabu, because we don't have enough build systems yet. The project I've been working on at Collabora was sponsored by Mozilla. Mozilla has their new browser called Servo.
04:22
Servo is written in Rust. It's a brand-new browser, and they want to port it to various VR and AR devices. They had an issue: they had ported it to the Magic Leap device, but it could not render any media. Video and audio were not working,
04:43
because they use GStreamer on desktop to play all the media, but they couldn't port GStreamer to the Magic Leap device. So they contacted us at Collabora and we helped them port it.
05:05
So, Servo is written in Rust, and its build system is Cargo, yet another build system. They have various Python scripts on top of Cargo to drive the build.
05:22
Before contacting me, Alan Jeffrey had already done all the porting of Servo to the Magic Leap, except for the multimedia parts, so I won't be speaking about that part of the journey.
05:40
They already have a blog post telling the whole story on that side; I'm concentrating on the multimedia part here. So, for the video, they are using appsink. That means they don't let GStreamer render the video.
06:03
They get the frames out of GStreamer and render them themselves in their application. For the audio, on the other hand, they let GStreamer do everything itself.
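To illustrate the appsink approach, here is a minimal sketch in C of pulling decoded frames out of a pipeline; this is not Servo's actual code, and the pipeline description and callback body are only an example.

```c
#include <gst/gst.h>
#include <gst/app/gstappsink.h>

/* Called by appsink every time a decoded frame is ready. */
static GstFlowReturn
on_new_sample (GstAppSink * sink, gpointer user_data)
{
  GstSample *sample = gst_app_sink_pull_sample (sink);
  if (sample == NULL)
    return GST_FLOW_EOS;

  GstBuffer *frame = gst_sample_get_buffer (sample);
  /* Hand 'frame' over to the application's own renderer here. */
  (void) frame;

  gst_sample_unref (sample);
  return GST_FLOW_OK;
}

static GstElement *
build_pipeline (void)
{
  /* Example pipeline: decode a file and hand the frames to appsink. */
  GstElement *pipeline = gst_parse_launch (
      "uridecodebin uri=file:///tmp/video.webm ! videoconvert ! appsink name=sink",
      NULL);
  GstElement *sink = gst_bin_get_by_name (GST_BIN (pipeline), "sink");

  GstAppSinkCallbacks callbacks = { 0 };
  callbacks.new_sample = on_new_sample;
  gst_app_sink_set_callbacks (GST_APP_SINK (sink), &callbacks, NULL, NULL);
  gst_object_unref (sink);

  return pipeline;
}
```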
06:20
So, for the audio, GStreamer is supposed to detect the platform and plug in the right audio sink; of course, there was no audio sink for Magic Leap yet. GStreamer is a multimedia framework written in C. It used to have an autotools build system; thankfully, that's removed now and it's a fully Meson-based build now.
06:41
That's the third build system in my presentation. GStreamer already has support for Android, but it uses the Java API through JNI. For people who don't know GStreamer,
07:04
it's like a pipeline where you have elements you can connect together to build your media handling. So, the first step is to actually build GStreamer using the SDK. There are two possible ways.
07:21
Either you use gst-build or you use Cerbero. gst-build uses Meson and has many subprojects to build every single dependency it can, at least the hard dependencies, but there are some optional dependencies that it cannot build yet. On the other side, Cerbero
07:43
can build every possible dependency with its own build system, but it's more complicated and less integrated. Since I'm a Meson developer, I decided to go with the gst-build way of doing things.
08:02
I didn't use Cerbero for this work; one of the main reasons is that, luckily, I didn't need any external dependency. My project had a really small scope, so I didn't have to depend on any autotools library,
08:21
external libraries, et cetera. So I decided to go with gst-build and use only Meson for that. The first thing to do is write a cross file, because there is a toolchain in the Magic Leap SDK you can download.
08:42
You write a cross file, compile, and hopefully it works, right? The cross file is something like this, a bit simplified here. You pretend it's Android,
09:01
because GStreamer has many special cases for Android, and you want to use those special cases. You define that it's an ARM64 architecture. As you can see here, a little trick I did:
09:22
I don't write the full path to all those binaries; I've got a variable there, MAGICLEAP_SDK. At the beginning, I was using a file that was actually processed by sed
09:40
just to replace all the variables before passing it to Meson, but, spoiler alert, I have a merge request on Meson to support that kind of syntax inside Meson itself, so you don't have to repeat the full path to where you installed your toolchain anymore.
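As a sketch, such a cross file could look roughly like this with current Meson; the [constants] section is the syntax the merge request mentioned above added, and the SDK path and toolchain file names here are illustrative, not necessarily the exact ones from the Magic Leap SDK.

```ini
# magicleap-cross.ini (illustrative sketch)
[constants]
mlsdk = '/opt/magicleap/mlsdk'    # wherever the Magic Leap SDK is unpacked

[binaries]
c     = mlsdk / 'tools/toolchains/bin/aarch64-linux-android-clang'
cpp   = mlsdk / 'tools/toolchains/bin/aarch64-linux-android-clang++'
ar    = mlsdk / 'tools/toolchains/bin/aarch64-linux-android-ar'
strip = mlsdk / 'tools/toolchains/bin/aarch64-linux-android-strip'

[host_machine]
system     = 'android'    # pretend it is Android so GStreamer's Android code paths are used
cpu_family = 'aarch64'
cpu        = 'aarch64'
endian     = 'little'
```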
10:00
The first issue when building GStreamer is that GLib depends on iconv, and the Magic Leap SDK actually has iconv.h, the header, but surprisingly, it doesn't have the implementation.
10:20
Usually, the implementation comes from the libc, but there are no symbols there; I don't know why. So I had to build iconv from the GNU project. Of course, that's yet another build system, autotools now, sad face, but I can handle it. Configure, make, make install, all the good stuff, and it works.
10:45
It builds. You install it into a prefix somewhere, and you add the -L and -I flags in your Meson cross file to pick up the iconv implementation you just built.
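For example, assuming GNU libiconv was configured, built and installed into an illustrative prefix such as /opt/iconv, the cross file can gain something like the following (newer Meson puts these under [built-in options]; older versions used the [properties] section instead):

```ini
[built-in options]
c_args      = ['-I/opt/iconv/include']
c_link_args = ['-L/opt/iconv/lib']
```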
11:03
Next step: with that iconv issue fixed, you can now actually build the full GStreamer, and the problem is that it does it too well. You get many plugins built, more than 100, but you don't care about most of them.
11:22
One trick I use, a really nice feature you have in Meson, is that you can disable all the features altogether with auto_features=disabled. With gst-build, if you do that, it will disable every single plugin and build really the minimum,
11:41
just the GStreamer core and nothing else. Then you can add the exact plugins you want to enable yourself, and it will build just those. You save a lot of time and a lot of space, because the full build takes almost 200 megabytes
12:03
for the application you want to ship, but if you enable only the few things you need, it's down to less than 10 megabytes. There are a few options you have to pass: you want to enable GL, of course, the GL support in GStreamer.
12:24
On the Magic Leap device, they have GLES2 on the EGL platform. And the window system I fake to Android, because Android is already implemented in GStreamer and it's really similar on Magic Leap.
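Putting those options together, the configuration step could look roughly like this. The option names below are the usual gst-build subproject options rather than something copied from the talk's slides, and the exact set of plugins to enable depends on what your application needs.

```sh
meson setup build-ml --cross-file magicleap-cross.ini \
    -Dauto_features=disabled \
    -Dgst-plugins-base:playback=enabled \
    -Dgst-plugins-base:gl=enabled \
    -Dgst-plugins-base:gl_api=gles2 \
    -Dgst-plugins-base:gl_platform=egl \
    -Dgst-plugins-base:gl_winsys=android \
    -Dgst-plugins-bad:magicleap=enabled   # Magic Leap codec/audio support, discussed later
ninja -C build-ml
```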
12:47
And another trick, an idea I had while working on this project: when GStreamer builds itself, you get every plugin as a separate library,
13:03
and that's not really convenient if you want to package it in your application, because you have many files to copy into your package, and the Magic Leap device was actually not really happy with that, because they don't let you dlopen
13:21
any shared object outside the binary folder. So you cannot do what distributions do, where you have a gstreamer-1.0 subdirectory in /lib and GStreamer only looks for plugins there; you cannot do that trick on Magic Leap,
13:41
because they have security rules that forbid it, so you cannot dlopen those files. So one trick I did, and that's a patch still waiting for review on GitLab if one of the GStreamer developers wants to give it a try:
14:02
if you build GStreamer with default_library=static, it will build every plugin as a static plugin, and at the end, a bit of code in the Meson file takes all those static libraries,
14:23
links them into a single libgstreamer-full shared library, and it also generates a small C file that registers all the plugins you just built. So you have a single function to call
14:42
to initialize all the static plugins, you can use one single shared library, and you don't have to manage all those plugins and dlopen anymore. You just link directly to that library.
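Conceptually, the small generated C file is built on GStreamer's existing static-plugin macros; a rough, hand-written equivalent could look like this (the plugin list here is illustrative, the real file is generated from whatever you enabled):

```c
#include <gst/gst.h>

/* Declarations for plugins that were built statically. */
GST_PLUGIN_STATIC_DECLARE (coreelements);
GST_PLUGIN_STATIC_DECLARE (playback);
GST_PLUGIN_STATIC_DECLARE (app);
GST_PLUGIN_STATIC_DECLARE (opengl);

/* Single entry point exported by libgstreamer-full. */
void
gst_init_static_plugins (void)
{
  GST_PLUGIN_STATIC_REGISTER (coreelements);
  GST_PLUGIN_STATIC_REGISTER (playback);
  GST_PLUGIN_STATIC_REGISTER (app);
  GST_PLUGIN_STATIC_REGISTER (opengl);
}
```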
15:01
Now that we have GStreamer built, a small version of GStreamer with one single library, the next step is to build that into a Magic Leap application package. As I said, that's the Mabu build system, custom to Magic Leap.
15:20
I don't think anyone else is using it; I think they wrote it themselves. Luckily, it's pretty easy to use. You define the include paths you want, so I gave it the path where I installed my GStreamer, GLib, and everything that gst-build builds,
15:45
and you can just copy some files. That's the data list; those files are just copied into the application package. As you see, there are only two files to copy: the gstreamer-full shared library
16:04
and the iconv shared library, because they don't ship it. As I said, in your main application, you just have to call gst_init() and that new gst_init_static_plugins() function.
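On the application side, that is really all the GStreamer-specific initialization there is; a minimal sketch:

```c
#include <gst/gst.h>

/* Provided by the libgstreamer-full library generated by gst-build when
 * building with default_library=static, as described above. */
void gst_init_static_plugins (void);

int
main (int argc, char *argv[])
{
  gst_init (&argc, &argv);
  gst_init_static_plugins ();   /* registers every statically linked plugin */

  /* ... build and run the pipeline as usual ... */

  return 0;
}
```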
16:22
That new function is implemented in the gstreamer-full shared library. And that's it: with that, you can already run GStreamer on your device, but you have no codecs.
16:41
For the codecs part, of course, you want to use the codecs from the platform; you don't want to use software decoders. If you look inside the public SDK, you will find MediaCodec and MediaCodecList. And if you open them, it's really surprising:
17:01
you actually have one-to-one exactly the same API as the Android Java API. They just translated it into C, and it's exactly the same API. That's actually a good thing for me, because GStreamer already has
17:21
that Android MediaCodec (amc) plugin working. The only issue is that GStreamer used the Java API, so I had to move all that JNI code behind a wrapper layer,
17:41
and implement that wrapper layer with the Magic Leap API, so at build time you can select whether you are building it for Magic Leap or for Android. All the rest of the code is exactly the same, so that's a really good way of sharing the code.
18:02
Those patches are already merged in GStreamer master, so that's coming soon in 1.18. So now there's an extra option you have to pass to Meson
18:21
to enable, in gst-plugins-bad, the Magic Leap option that will select that new implementation for the codecs. The video sink now. As I said, the codecs produce a GL texture,
18:43
an external OES texture, but Servo was expecting a 2D texture. I did not work on that part; Mozilla had to modify their application to actually support both texture formats,
19:03
so they have their appsink that gets those GL textures, and they had to modify their code to support that. To render it, you write a Magic Leap application with a planar resource object. That's like a widget that exposes an EGL context to you,
19:23
so you can draw on that surface yourself with the EGL API. You have to handle the EGL context sharing with GStreamer, because GStreamer has its own context for the decoder,
19:45
and the decoder produces a texture with its own context, but you have to tell GStreamer about your application context, so GStreamer can share both contexts and you can use the texture inside the application context as well.
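The standard way to do that with GStreamer's GL library is to answer the pipeline's need-context queries from a bus sync handler. Below is a hedged sketch; how the application wraps the EGL display and context it got from the Magic Leap planar resource into GstGLDisplay/GstGLContext objects is only hinted at in comments, and this is not necessarily the exact code Servo ended up with.

```c
#include <gst/gst.h>
#include <gst/gl/gl.h>

/* Wrapped around the application's own EGL display and context, e.g. with
 * gst_gl_display_egl_new_with_egl_display() and gst_gl_context_new_wrapped(). */
static GstGLDisplay *app_gl_display;
static GstGLContext *app_gl_context;

static GstBusSyncReply
bus_sync_handler (GstBus * bus, GstMessage * msg, gpointer user_data)
{
  if (GST_MESSAGE_TYPE (msg) == GST_MESSAGE_NEED_CONTEXT) {
    const gchar *context_type = NULL;
    gst_message_parse_context_type (msg, &context_type);

    if (g_strcmp0 (context_type, GST_GL_DISPLAY_CONTEXT_TYPE) == 0) {
      /* Tell the pipeline which GstGLDisplay to use. */
      GstContext *ctx = gst_context_new (GST_GL_DISPLAY_CONTEXT_TYPE, TRUE);
      gst_context_set_gl_display (ctx, app_gl_display);
      gst_element_set_context (GST_ELEMENT (GST_MESSAGE_SRC (msg)), ctx);
      gst_context_unref (ctx);
    } else if (g_strcmp0 (context_type, "gst.gl.app_context") == 0) {
      /* Share the application's GL context so textures are usable on both sides. */
      GstContext *ctx = gst_context_new ("gst.gl.app_context", TRUE);
      GstStructure *s = gst_context_writable_structure (ctx);
      gst_structure_set (s, "context", GST_TYPE_GL_CONTEXT, app_gl_context, NULL);
      gst_element_set_context (GST_ELEMENT (GST_MESSAGE_SRC (msg)), ctx);
      gst_context_unref (ctx);
    }
  }
  return GST_BUS_PASS;
}

/* Installed with: gst_bus_set_sync_handler (bus, bus_sync_handler, NULL, NULL); */
```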
20:05
Sadly, you cannot use glimagesink directly. I tried to use it, but there is a small missing API there, because they don't expose the native window.
20:21
I'm pretty sure that inside that planar resource there must be a native window, but the API doesn't give you a pointer to it, so you cannot pass it. Usually, you would pass that native window down to glimagesink, and the sink would do all the rendering for you,
20:41
but since you don't have that pointer, you cannot do that, so you must use appsink and do the rendering yourself. A bit frustrating, because it's really just one single getter that's missing. Maybe they will add it later.
21:00
The audio sink part. There are two completely different header files, MLAudio.h and AudioNode.h. One is a low-level C API, and the other one is a high-level C++ API. The reason for that is
21:21
that there are two ways of rendering audio on the Magic Leap device. The device itself has stereo speakers right next to your ears, but since the source of the audio is somewhere in the 3D world, you can either just play the audio plainly
21:41
as stereo, but then you cannot really tell where the audio comes from, and that's what the C API implements. The C++ API is smarter, because it's actually a widget that you can plug into your UI. It's an invisible widget,
22:03
but it tells the audio stream exactly where in space the audio comes from, and that widget is capable of modifying the audio you send so that it sounds like it comes from that position. So if there is a video on the wall here
22:22
and you turn your head, you hear the audio coming from there. It's really impressive; it makes all the difference for the immersion. If you look at those headers, it's weird, because it's exactly the same API implemented twice,
22:41
once in C and once in C++, with exactly the same calls. I wrote a wrapper in GStreamer, again, so you can pick one of them at runtime. If you want to use the 3D spatial audio,
23:06
you have to pass that audio node object, the C++ object, down to GStreamer so GStreamer can use it instead of the plain C API.
23:21
So I wrote a new element in GStreamer called mlaudiosink, and when that audio sink wants to render audio, it posts a message on the bus, and the application is supposed to reply to that message with the pointer
23:43
to that audio node object. If you do that, it's going to use the C++ API and you get really nice spatial audio.
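A hedged sketch of what that application-side handling can look like; the element name follows the talk, but the bus message name and the property used to hand the pointer back are purely illustrative here (check the element's documentation in gst-plugins-bad for the real names):

```c
#include <gst/gst.h>

static gboolean
bus_watch (GstBus * bus, GstMessage * msg, gpointer user_data)
{
  /* 'user_data' is the C++ audio node widget owned by the application's UI. */
  gpointer audio_node = user_data;

  if (GST_MESSAGE_TYPE (msg) == GST_MESSAGE_ELEMENT &&
      gst_message_has_name (msg, "request-audio-node")) {   /* hypothetical name */
    /* Hand the audio node back to the sink, here via a hypothetical property. */
    g_object_set (GST_MESSAGE_SRC (msg), "audio-node", audio_node, NULL);
  }
  return TRUE;   /* keep the bus watch installed */
}
```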
24:02
Upstreaming: thanks to Mozilla for sponsoring all that work, and they even sponsored the upstreaming of all of it. Everything is already merged, so you can use it with GStreamer master right away. Thanks to Olivier Crête, who did all the review. You can also find a standalone demo application at that URL;
24:22
that demo is not using Servo, it's just a plain video player you can try, and we had a demo at IBC showing it. All of that is coming in the next GStreamer release.
24:40
And now the demo, because GStreamer is always tested with videotestsrc. As you can see, the Magic Leap is capable of detecting the surfaces and placing the video in the right place.
25:07
There we go. Any questions? We have no questions online. We have a question from the people
25:20
who attend remotely, asking if you can put the slides in full screen. It's difficult to do that all the time because we want to see you. We try, but your slides are downloadable from the FOSDEM website, because we're well behaved. That's good; all speakers should do that. Any questions from the floor?
25:41
We can take one question. Yes, please. Because the Magic Leap platform is closed source. Oh, sorry.
26:00
So the question is: why don't I just make a pull request to add the missing getter, to be able to use glimagesink? The reason is that the platform is closed source, so I cannot just do a pull request. They do have an issue tracker, actually; some of the APIs
26:21
I'm using right now were missing, so I filed some requests to get them. But yeah, you're right, I could request that API as well. I don't think I've reported that issue yet. Very quickly: is this augmented-reality headset widely available?
26:42
Not widely, but I think you can buy it, in the US only, for just a few thousand bucks, so anyone can buy it. Thank you.