
GStreamer on the Magic Leap One


Formal Metadata

Title
GStreamer on the Magic Leap One
Number of Parts
490
License
CC Attribution 2.0 Belgium:
You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.

Content Metadata

Abstract
The Magic Leap One is a pair of augmented reality glasses. Let's run an open-source browser (Mozilla Servo) on it using the GStreamer multimedia framework. The Magic Leap One runs a custom OS called Lumin OS, derived from Android with Java stripped out. Servo is Mozilla's browser written in Rust; it uses GStreamer to render multimedia content.
Transcript: English (auto-generated)
So, we move to our next talk. I remind you that if you want to ask any questions, either from the room or remotely, you can go to onlinequestions.org,
event 2147, and ask questions there, or vote for questions that have already been asked. Our next talk is about GStreamer and the Magic Leap One. Please welcome Xavier. Hi. So, I'm Xavier, working at Collabora.
Today, I'm presenting GStreamer — the work I've done last year to port GStreamer to a Magic Leap One device, shown there. Those are augmented reality glasses. Magic Leap is building those glasses
and they are selling them online. Augmented reality glasses are glasses you can see through. So, you see your own real-world environment, and you can add virtual elements inside your living room or anywhere you are.
And, for example, you can add a TV screen on the wall in your living room and you can watch TV like that with the glasses. That's not the same thing as VR, because virtual reality is a completely opaque device.
So, you can't see anything from the real world; you see only virtual environments. And so, if you walk, you just walk into the wall, and that's not happening with AR. The Magic Leap One has three elements.
First you have the glasses — the glasses are the Lightwear — and then the Lightpack, which is connected with a cable. You cannot remove the cable. The Lightpack is the computer itself, doing all the CPU and GPU work.
So, all the stuff I'm going to show is going to run on that small round computer there. And, of course, you have a controller to manipulate virtual objects.
So, a bit of specs. The Lightpack is where the OS is running. It's an NVIDIA Tegra X2 chipset with six cores, and it's ARM64. The OS is called Lumin OS.
It's based on Android, but there is no Java. As far as I know, they are doing that to get vendor support — from NVIDIA, probably. The media stack is what I really concentrated on, and it comes straight from Android.
There is a public SDK. It has a complete C API for everything. So, on Android you have a Java API to write Java code, but here they wrote everything in C for the middleware. And you have a C++ API for the UI toolkit
and the more advanced features. For the audio side, they have custom APIs. That's something I've never seen on Android — they wrote it from scratch. And they also wrote from scratch their own build system
called Mabu, because we don't have enough build systems yet. The project I've been working on with Collabora was sponsored by Mozilla. Mozilla has their new browser, called Servo.
Servo is written in Rust. It's a brand new browser, and they want to port it to various VR and AR devices. And they had an issue: they had ported it to the Magic Leap device, but it could not render any media. The video and audio were not working,
because they are using GStreamer on desktop to play all the media, but they couldn't port GStreamer to the Magic Leap device. So, they contacted us at Collabora and we helped them to port it.
So, Servo is written in Rust, and the build system is Cargo — so, yet another build system. They have various Python scripts on top of Cargo to drive the build.
Alan Jeffrey, before contacting me, had already done all the porting of Servo to Magic Leap, except for the multimedia parts. So, I won't be speaking about that journey from them.
They have a blog post already telling the whole story for that side; I'm concentrating on the multimedia part here. For the video, they are using appsink. That means they don't let GStreamer render the video:
they get the frames out of GStreamer and render them themselves in their application. For the audio, on the opposite side, they let GStreamer do everything itself.
So, GStreamer is supposed to detect the platform and plug in the right audio sink. Of course, there was no audio sink for Magic Leap yet. GStreamer is a multimedia framework written in C. It used to have an autotools build system; thankfully, that's removed now, and it's a fully Meson-based build system.
So, that's the third build system in my presentation. GStreamer already has support for Android, but it uses the Java API through JNI. For people that don't know GStreamer,
it's like a pipeline where you have elements you can connect together to do your rendering. So, the first step is to actually build GStreamer using the SDK. There are two ways possible:
either you use gst-build or you use Cerbero. gst-build uses Meson and has many subprojects to build every single dependency — at least the hard dependencies — but there are some optional dependencies that it cannot build yet. On the other side, Cerbero
can build every possible dependency with its own build system, but it's more complicated and less integrated. So, since I'm a Meson developer, I decided to go with the gst-build way of doing things.
I've not been using Cerbero for this work; one of the main reasons is that, luckily, I didn't need any external dependency. My project had a really small scope, so I don't have to depend on any autotools library,
external libraries, et cetera. So, I decided to go with gst-build and use only Meson for that. The first thing to do is to write a cross file, because there is a toolchain in the Magic Leap SDK you can download.
You write a cross file, compile, and hopefully it works, right? So, the cross file is something like that — a bit simplified here. You pretend it's Android,
because GStreamer has many special cases for Android, and you want to use those special cases. You define that it's an ARM64 architecture. And as you see here, a little trick I did:
I don't write the full path to all those binaries; I've got a variable there, MAGICLEAP_SDK. At the beginning, I was using this file processed by sed,
just to replace all the variables before passing it to Meson — but spoiler alert, I have a merge request on Meson to support that kind of syntax inside Meson itself. So, you don't have to repeat the full path to where you installed your toolchain anymore.
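A cross file along the lines described might look like the sketch below. This is a hedged example: the SDK install path and the exact compiler binary names are illustrative placeholders, not the real Magic Leap SDK layout, and should be adjusted to your toolchain.

```ini
# Hypothetical Meson cross file for the Magic Leap toolchain.
# Paths and binary names are illustrative; adjust to your SDK install.
[binaries]
c = '/opt/mlsdk/tools/toolchains/bin/aarch64-linux-android-clang'
cpp = '/opt/mlsdk/tools/toolchains/bin/aarch64-linux-android-clang++'
ar = '/opt/mlsdk/tools/toolchains/bin/aarch64-linux-android-ar'
strip = '/opt/mlsdk/tools/toolchains/bin/aarch64-linux-android-strip'

[host_machine]
# Pretend to be Android so GStreamer's Android special cases apply.
system = 'android'
cpu_family = 'aarch64'
cpu = 'aarch64'
endian = 'little'
```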
So, the first issue when building GStreamer is that GLib depends on iconv, and the SDK from Magic Leap actually has iconv.h, the header, but, surprisingly, they don't have the implementation.
Usually, the implementation comes from the libc, but there are no symbols there — don't know why. So, I had to build iconv from the GNU project. Of course, that's yet another build system — autotools now, sad face — but I can handle it. Configure, make, make install, all the good stuff, and it works.
It builds. You can install that in a prefix somewhere, and you can add the -L and -I flags in your Meson cross file to pick up the iconv implementation you just built.
Next step. So, with that iconv issue fixed, you can actually build the full GStreamer. It passes — and the problem is it does it too well. You have more than 100 plugins built, but you don't care about most of them.
One trick I use — a really nice feature you have in Meson — is that you can disable all the features altogether with -Dauto_features=disabled. With gst-build, that will disable every single plugin and build really the minimum,
the GStreamer core, and nothing else. And then you can add yourself the exact plugins you want to enable, and it will be just that. So you save a lot of time and a lot of space, because the full build takes almost 200 megabytes
for the application you want to ship, but if you enable only the few things you need, it's down to less than 10 megabytes. So there are a few extra options you have to pass. You want to enable GL, of course — the GL support in GStreamer.
On the Magic Leap device, they have GL ES 2 on the EGL platform. And for the windowing system, I fake it to Android, because Android is already implemented in GStreamer, and it's really similar on Magic Leap.
And another trick — I had the idea when working on this project. When GStreamer builds itself, you get every plugin as a separate library,
and that's not really convenient if you want to package that in your application, because you have many files to copy inside your package. And the Magic Leap device was actually not really happy with that, because they don't let you dlopen()
any shared object outside the binary folder. So you cannot split things like on a distribution, where you have a gstreamer-1.0 subdirectory in your /lib, and GStreamer will look for plugins there. You cannot do that trick on Magic Leap,
because they have security rules that forbid it, so you cannot dlopen() those files. So one trick I've done — that's a patch still waiting for review on GitLab, if someone from the GStreamer developers wants to give it a try —
is that if you build GStreamer with -Ddefault_library=static, it will build every plugin as a static plugin, and at the end, it adds a bit of code in the Meson files that takes all those static libraries,
builds them into a single libgstreamer-full shared library, and also generates a small C file that registers all the plugins you just built. So you have a single function to call
to initialize all the static plugins, and you can use one single shared library. You don't have to manage all those plugins and dlopen() anymore; you just directly link to that library.
So now that we have GStreamer built — a small version of GStreamer with one single library — the next step is to build that into a Magic Leap application package. As I said, that's the Mabu build system, custom to Magic Leap.
I don't think there is anyone else using it; I think they wrote it themselves. Luckily, it's pretty easy to use. You define the include paths you want — so I gave it the paths where I installed my GStreamer, GLib, and everything that gst-build builds —
and you can just copy some files. That's the data list: those files are just copied inside the application package. So as you see, there are only two files to copy: the libgstreamer-full shared library
and that iconv shared library, because they don't have it. As I said, in your main application, you just have to call gst_init() and that new function, gst_init_static_plugins().
That new function is implemented in the libgstreamer-full shared library. And that's it — with that, you can actually already run GStreamer on your device, but you have no codecs.
So for the codecs part, of course, you want to use the codecs from the platform; you don't want to use software decoders. So if you look inside the public SDK, you will find media codec and media codec list headers. And if you open them, it's really surprising:
you have one-to-one exactly the same API as the Android Java MediaCodec API. They just translated it into C, and it's exactly the same API. That's actually a good thing for me, because GStreamer already has
that Android MediaCodec (amc) plugin working. The only issue is that GStreamer used the Java API, so I had to move all that JNI code behind a wrapper layer,
and implement that wrapper layer with the Magic Leap API, so you can, at build time, select whether you are building it for Magic Leap or for Android. All the rest of the code is exactly the same, so that's a really good way of sharing that code.
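The build-time selection described can be sketched as a preprocessor switch: the same front-end code compiles against either a JNI backend or a Magic Leap C backend. The identifiers below are invented for illustration, not the real amc wrapper symbols.

```c
/* Sketch of build-time backend selection: one shared front end,
 * two backends picked by a compile flag. Names are illustrative,
 * not GStreamer's actual amc wrapper layer. */
#include <assert.h>
#include <string.h>

#ifdef USE_MAGICLEAP
/* Magic Leap build: backed by the C media-codec API. */
static const char *backend_name(void) { return "magicleap"; }
#else
/* Android build: backed by the Java MediaCodec API through JNI. */
static const char *backend_name(void) { return "jni"; }
#endif

/* Shared front-end code — identical on both platforms; only the
 * backend behind it changes. */
const char *amc_codec_backend(void)
{
    return backend_name();
}
```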
And those patches are already merged in GStreamer master, so that's coming in 1.18. So now there's an extra option you have to pass to Meson
to enable, in the GStreamer plugins, the Magic Leap option that selects that new implementation for the codecs. The video sink now. As I said, the codecs produce a GL texture —
external OES — but Servo was expecting texture 2D. I did not work on that, but Mozilla had to modify their application to actually support both formats for the textures:
they have their appsink that gets those GL textures, and they had to modify their code to support that. To render it, you actually write a Magic Leap application with a planar resource object. That's like a widget that exposes an EGL context,
so you can draw yourself with the EGL API on that surface. You have to handle the EGL context sharing with GStreamer, because GStreamer has its own context for the decoder,
and the decoder produces a texture within its own context, but you have to tell GStreamer about your application context, so GStreamer can share both contexts, and you can use the texture inside the application context as well.
Sadly, you cannot use glimagesink directly. I tried to use it, but there is some API missing there, because they don't expose the native window.
I'm pretty sure that inside that planar resource there must be a native window, but the API doesn't give you a pointer to it, so you cannot pass it on. Usually, you would pass that native window down to glimagesink, and the sink would do all the rendering for you,
but since you don't have that pointer, you cannot do that, so you must use that appsink and do the rendering yourself. A bit frustrating, because it's really missing just one single getter. Maybe they will have it later.
The audio sink part. There are two completely different objects, in the header files MLAudio.h and AudioNode.h. One is a C low-level API, and the other one is a C++ high-level API. The reason for that is that
there are two ways of rendering audio on the Magic Leap device. The device itself has stereo speakers right over your ears, but the source of the audio is somewhere in the 3D world. Either you can just play the audio plainly
as stereo audio — but then you cannot really know where the audio comes from — and that's what the C API implements. The C++ API is smarter, because it's actually a widget that you can plug inside your UI. It's an invisible widget,
but it tells the audio stream exactly where in space the audio comes from, and that widget is capable of modifying the audio you send to sound as if it comes from that position. So if there is a video wall here
and you turn your head, you hear the audio coming from over there. It's really impressive; that makes all the difference for the immersion. If you look at those headers, it's weird, because it's exactly the same API implemented twice,
once in C and once in C++ — exactly the same calls. I wrote a wrapper in GStreamer, again, so you can pick one of them at runtime. If you want to use the 3D spatial audio,
you have to pass that audio node object — the C++ object — down to GStreamer, so GStreamer can use that object instead of using the plain C API.
So I wrote a new element in GStreamer called mlaudiosink, and when that audio sink wants to render audio, it posts a message on the bus, and the application is supposed to reply to that message with the pointer
to that audio node object. And if you do that, it's going to use that C++ API, and you get really nice spatial audio. Upstreaming. So thanks to Mozilla for sponsoring all that work —
they even sponsored the upstreaming of all that work. So everything is already merged, and you can use it with GStreamer master right away. Thanks to Olivier Crête, who did all the review. You can also find a standalone application demo at that URL,
and that demo is not using Servo; it's just a plain video player you can try. We also had a demo at IBC showing that, coming with the next GStreamer release.
And now the demo — because GStreamer is always tested with videotestsrc. As you can see, Magic Leap is capable of detecting the surfaces and putting the video at the right place.
There we go. Any questions? We have no questions online. We have a question from the people
who attend remotely, asking if you can put the slides in full screen. It's difficult to do it all the time, because we want to see you. We try, but your slides are downloadable from the FOSDEM website, because we're well behaved. That's good — all speakers should do that. Any questions from the floor?
We can take one question. Yes, please. Because the Magic Leap platform is closed source — oh, sorry.
So the question is why I don't just make a pull request to add the missing getter, to be able to use glimagesink. The reason is that the platform is closed source, so I cannot just do a pull request. They do have an issue tracker — actually, some of the APIs
I'm using right now were missing, so I did some requests to get them. But yeah, you're right, I could request that API as well; I don't think I reported that issue yet. Very quickly: is this augmented reality headset widely available?
Widely, no, but I think you can buy it — in the US only, for just a few thousand bucks — so anyone can buy it. Thank you.