
Real World HoloLens Mixed Reality Development with Unity


Formal Metadata

Title
Real World HoloLens Mixed Reality Development with Unity
Subtitle
Be Part of the Future
Title of Series
Number of Parts
96
Author
Schulte, Rene
License
CC Attribution - NonCommercial - ShareAlike 3.0 Unported:
You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal and non-commercial purpose as long as the work is attributed to the author in the manner specified by the author or licensor and the work or content is shared also in adapted form only under the conditions of this license.
Identifiers
Publisher
Release Date
Language

Content Metadata

Subject Area
Genre
Abstract
With the fast development of powerful Augmented, Mixed and Virtual reality devices like HoloLens, science fiction movie technology is becoming reality for consumers. In this session, Rene Schulte will talk to you about the challenges that AR and VR pose and why 3D is an essential part of this experience. Comprehensible demos will show how every developer can develop outstanding HoloLens solutions with Unity and be part of this computing revolution. When you leave this session, you will understand how to set up your Unity environment, what skills you need to create compelling HoloLens applications and what best practices will help you move forward quickly. Starting with a simple “MR Hello World” demo, we will use this to understand all of the pieces required to run your app. Last, but not least, we will demonstrate some of the applications that Rene and his team have been working on in the past few months, to give the viewers a sense of what can be accomplished with the right skills targeting one of the most anticipated devices in a while. When you leave this session, you will know the challenges we faced while building HoloLens apps and how we solved them. You will also have learned best practices and recommendations to avoid pitfalls and you will hopefully be inspired to build your own HoloLens apps.
Transcript: English (auto-generated)
Hey, welcome. My name is Rene Schulte. I'm Director of Immersive Experiences at IdentityMine, and I'm also a Microsoft MVP for Windows development, and I have a background in computer graphics and virtual reality and augmented reality and those kinds of things, since many years actually.
I have a few open-source projects. One of them is an AR toolkit port, and it was like a few weeks ago that I had the 5-year anniversary of porting this AR toolkit over to Windows Phone Mango. Does anyone remember Windows Phone Mango? Yeah, you're my man. That was a good time, right? So they finally added a camera access API so I could do some computer vision with that stuff.
But that's history now. We're talking about HoloLens, and today I want to speak about HoloLens development, which we have been doing since last year actually. So first I want to set some terminology straight: what is VR? What is AR? What is MR? Because it's often mixed up.
Then I will tell you a bit about the HoloLens device, the current development kit I have here with me, and then I will tell you how you can develop for it. We will do a nice demo using Unity, from scratch, and build a nice little game or app if you will. Yeah, and then at the end I want to spend some time to...
Okay. Yeah, cool, awesome, let's give a good hand to the technician. So, yeah, let's get started. At the end I want to talk a bit about the experience, about the things we learned while building HoloLens apps over the last couple of months, since last year actually.
Actually, a few words about the company I work for: IdentityMine is headquartered in Seattle in the United States, but I actually work out of Dresden in Germany. IdentityMine has always been at the bleeding edge of technology, so I've been working with those big multi-touch tables, those PixelSense tables.
We're doing Xbox One and Xbox 360 apps, Kinect for Windows; we actually have Kinect for Windows solutions deployed in the real world, not just as try-outs but in actual real use cases. And we were also brought into an early program from Microsoft for HoloLens, so we're developing for HoloLens since last year.
Here's a quick video I want to show you about the company it's just one minute It shows some of our HoloLens stuff we're building. So I think it's worth it. Okay enough of the marketing
Let's talk about some content. So what is virtual reality? Virtual reality is a fully immersive multimedia solution, which means you're fully inside a computer-generated world and you don't see the outside, the real world, anymore. And virtual reality has been around for many years actually, think about flight simulators, so many decades actually.
But now we have consumer devices that are available for the masses, and on the low end we have things like Google Cardboard, and in the high-end spectrum we have devices like the Oculus Rift and HTC Vive, which are not too expensive compared to what we had like five or ten years ago, and they provide a very good quality. And with quality
I especially mean latency because latency is a big challenge with those VR headsets because what you want to do is you want to render the scene very realistically and on the other hand You also need to provide it as a fast feedback So when the user rotates the head you need to quickly render a new updated frame and send it back to the device
to show it to the user. So this is the latency, and we're talking about three milliseconds. So actually anything that's taking longer than three milliseconds can make a user sick: they can throw up, they can get headaches and other things you probably want to avoid for your users. Also important: we humans don't just have eyes, right? We also have ears, we have hands and so on.
So spatial sound is something that those devices already implement, and there are also those data gloves which have little motors inside where you can give haptic feedback, and there are even experiments with virtual smelling, which could be fun but could also be very awkward. I'm thinking about the VR apps for that, right? This will happen, I'm sure.
So what is AR, what is augmented reality, and what is mixed reality, on the other hand? It's not fully immersive, this is clear. So you still see the real world; you don't just see the virtual world. You see the real world, which is augmented with virtual objects. So this is the main difference compared to VR here.
The Microsoft HoloLens I have here is called a mixed reality device, although it's actually also an augmented reality device. It seems like everyone has their own definition of what mixed reality means, right? But I just want to tell you why Microsoft calls it mixed reality: in order to differentiate it from the existing augmented reality solutions we have these days.
So you probably all have a smartphone with some kind of AR app on it, an augmented reality app, and what they usually do is take the camera stream, analyze the camera stream, run some computer vision on top of that, and then augment it with virtual objects. And you see the real world not with your own eyes but through another screen, and you also see it monoscopically, as the camera is monoscopic.
It's a cyclops view basically, just one eye, so you don't see it stereoscopically and you see it through another screen. But with the HoloLens you see it with your own eyes. It's mixed, basically, because you have those nice see-through lenses here where you can see the real world with your own eyes, and then the virtual objects are faded in there. So Microsoft wants to differentiate, kind of, so they call it mixed reality.
It's even more challenging to have low latency, because you also need to analyze the real world. You don't just generate virtual objects, you also need to run the computer vision, basically. So it's even more challenging to keep a good frame rate there, and of course you want to seamlessly merge the real world with your virtual world, and this is
Yeah, quite challenging if you want to have realistic rendering Cool, so let's talk about the HoloLens and I have the current development edition here with me The HoloLens is a mixed reality head-mounted device from Microsoft and it has a bunch of sensors integrated So it has an IMU unit, an Inertial Measurement Unit, which is basically used for the head rotation, right?
Then it has a bunch of cameras like those here. Those are environmental cameras, which basically scan the room. They are used to provide a spatial mapping of the room. So this is nice; we will use that in a demo later, where we use the spatial mapping of the room to bounce some physical objects off it.
It has an RGB camera, it has a depth camera, and what else, a microphone array. So it has a bunch of microphones here, which are used for speech recognition, and it does a very good job at speech recognition. Yeah, and everything is self-contained, so the device contains everything inside here. It's actually a computer, it's not just a display, it's a computer. There's a CPU in here, a GPU and a new coprocessor Microsoft calls the HPU, the holographic processing unit,
which is basically responsible for doing the spatial mapping, speech recognition, gesture recognition and so on. Yeah, and another cool thing is that multiple people can wear a HoloLens and they can see the same holograms. So if you would all have a HoloLens... you all get one from me. I'm just kidding.
So if you all would wear a HoloLens, we could basically see each other, right? This is also a nice differentiating factor compared to VR, because with VR you're just in your own world. With AR, you still see the real world. So multiple people can wear a HoloLens, they can see each other, they can still see the room, and they could also see the same holograms. So this kind of multi-lens collaboration feature is a very, very nice unique point of the HoloLens.
And it's a Windows 10 device, so you can even pin your Microsoft Edge browser tabs on your walls and stuff like that. Cool. So what is the input and output paradigm of the HoloLens? The HoloLens uses the so-called GGV input paradigm, which stands for Gaze, Gesture and Voice.
So when I wear a device like this and I rotate my head, this is where I'm looking at, right? So this is where I'm gazing. I'm basically setting a gaze vector, a ray And this ray is giving me the interaction focus So this is where I know, okay, this is where the user is looking, this is what he's interested in So think about in a desktop world as a mouse move basically
And the next thing is the air tap. So I can have a couple of different gestures, and one of them is the air tap gesture. So I hold my index finger up like this and then I tap like that. So this is, in your desktop world, the mouse click basically. So you're gazing, setting the interaction focus, then air tapping to trigger the action, like a mouse click if you will.
And since this is quite limited in terms of input Voice and speech recognition is a very important natural user interface Input mechanism for hololens as well, and it does a very good job It's actually better than Xbox One speech recognition. It's using Cortana inside. It's Windows 10 like I said
For output you have those two see-through lenses here. Each of them has one megapixel resolution, and it's actually not just a simple flat screen or flat lens, it's actually multiple layers. So the color stream is split up into channels like red, green and blue, and each of them is fed into its own layer basically.
And spatial sound, and we will talk about spatial sound in a second. Those tiny speakers provide a very nice spatial sound, which is an important addition to the visual aspect. So how can you develop for HoloLens? First of all you can just use DirectX directly, Direct3D 11 with C++,
or even SharpDX, which is a C# wrapper around DirectX. Or you can use a middleware like Unity. And Unity is not just a 2D or 3D game engine, where it comes from; it's actually also used to build applications. And you have a ton of amazing components already available, like a very good global illumination system where you can have very realistic shading,
And also physics and so on You have this nice high efficient workflow, you can be very productive with Unity and develop quickly using Unity And of course it's cross-platform, they have right now 28 platforms they support as target output Which is pretty impressive They have built-in VR support since a few versions, so the Oculus SDK is integrated already
So you don't need to install a separate plugin to Unity, you can just use it right out of the box And Unity is a first-class citizen for HoloLens development, that's for sure Because Microsoft puts out all the tutorials and the holographic academy, it's all built using Unity basically
And actually a few of the applications you find in the Windows Store for HoloLens are also built by Microsoft And they are also using Unity for a few of them, so you can build some real cool stuff And yeah it's still free for personal use So I was at the Unite conference last week in Amsterdam And they announced a different pricing model for the professional and the plus version
But they're still keeping the personal version for free So you can use the free version for non-commercial use case And you can download the SDK for the HoloLens, the emulator, and Unity And all of that for free and can get started, it's pretty nice Cool, enough talking, let's build something, right?
What we will do, I will start with a fresh new Unity scene And I will configure it so it can be deployed to the HoloLens And we will implement the things we just talked about Like gazing, gestures, spatial mapping, and speech input And we will do this with a nice little setup of physics objects So we will have a nice plane, we stack some cubes on top of that
And then we can shoot a sphere from the camera Which is the user's head with the HoloLens basically And we can shoot spheres into the scene And they will bounce off physically correct basically Let's switch to Unity So this is Unity, you probably have seen it I don't want to go too much into detail about Unity here Brian had a nice session yesterday about it
But this is basically your scene view You have the hierarchy of the scene here And this is the game view And this is the assets view where you see all your files So first thing we need to configure for HoloLens is the main camera Because if we have a fresh new Unity scene The camera has a set position of 0, 1, and minus 10 Which can cause an offset if we use it for the HoloLens
Because like I said, if the user walks around with the device, the position of the device is mapped to the main camera in Unity, right? So we don't want to have an offset here. So we set it to 0, 0, 0 instead of 0, 1, -10. And also the head rotation is mapped to the camera rotation.
Okay, so we set this. The next thing we need to set is the clear flags to a solid black color, because every frame we want to clear the frame, and the HoloLens uses additive blending, which means the virtual objects are blended onto the screen using additive blending, which means black is transparent. You cannot see black with the device.
So we clear it to black And we set the near clipping plane to 50 centimeters Because we want to avoid that the user's eyes are crossing basically So if the virtual objects are rendered too close to the camera The user's eyes will start to cross Which is also uncomfortable So we want to clip off the rendering at 50 centimeters or 80 centimeters And we also don't need 1,000 meters
We need 10 meters And you notice it's all in metric space, right? This is also nice with the HoloLens All the spatial mapping is already in metric space in the real world scale Cool So we have our scene set up here And now let's also enable the player settings for HoloLens
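Before we do that, here is a minimal sketch of the camera settings we just configured, applied from a script instead of the Inspector (a hypothetical CameraSetup helper, not something shown in the talk; the values follow the demo):

using UnityEngine;

// Hypothetical helper: applies the HoloLens camera settings described above from code.
[RequireComponent(typeof(Camera))]
public class CameraSetup : MonoBehaviour
{
    void Awake()
    {
        var cam = GetComponent<Camera>();

        // The device position and rotation are mapped onto the main camera,
        // so start at the origin to avoid an offset.
        transform.position = Vector3.zero;
        transform.rotation = Quaternion.identity;

        // Clear to solid black: the display blends additively, so black is transparent.
        cam.clearFlags = CameraClearFlags.SolidColor;
        cam.backgroundColor = Color.black;

        // Clip holograms that come too close (eye strain) and keep the far plane short.
        cam.nearClipPlane = 0.5f;   // 50 cm, as in the talk (0.85 m is another common choice)
        cam.farClipPlane = 10f;     // 10 meters is plenty for this demo
    }
}

Okay, on to the player settings.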
You can find them here And we need to switch to this tab Which is Windows Store Applications or Universal Windows Platform And we need to enable just this checkbox So the Windows Holographic SDK is included in our target output Another thing we need to set is the publishing settings And if you have done UWP apps before
You probably are familiar with those capabilities, and we need to enable some checkboxes here: Internet Client, because we also want to use the emulator for a bit, which is connecting with the hosting operating system; also Microphone, since we want to use speech recognition; and Spatial Perception, since we want to use spatial mapping. Okay, so we have everything set up. Let's actually add some objects.
What I usually do I create an empty game object as container Where I can place objects that are belonging In the same physical location together and group them And I will tell you at the end why I do this And why it makes sense So reset the position And set it a bit further away from the camera So we set it like 50 centimeters lower
And 2 meters in front of the camera Which is usually a nice way when you start an application That the holograms are starting 2 meters in front of the users Cool So let's add the plane I mentioned So this is our plane A bit too large So let's reduce the size here And like this
And yeah, we want to have a different color, so we create a material, and since we're good citizens, we create a folder for that. So let's add a folder for materials. And yeah, let's go with... [Cortana chimes in: "Something's not right. Try again in a little bit."]
Okay, wait a bit, Cortana, I'm good. So let's use a green one, a green plane. Drag and drop. That's so nice about Unity, it's so easy to build that stuff. Yeah, that was funny. A cube: so we will add a cube onto our plane like I mentioned, and also reduce the size of this one a bit,
Like that Pull it up a bit Like this This is good And add a rigid body component So physics, rigid body component from Unity basically Tells the Unity runtime that it should run rigid body physics calculations here
And so we just duplicate this one Place the other one here And the other one on top Like this And then we can hit the play button And see if our physics stuff works They should fall down on the plane
Once it's compiled Yep, that works Okay, nice So we have our basic scene set up Let's add some actually HoloLens functionality So we will add another folder for that For our scripts basically Because we will write some C Sharp scripts for accessing the HoloLens APIs So let's call it Canon behavior Because like I said
We want to shoot spheres from the camera into the scene So like a cannon, like a cannonball And we can attach that script to our camera I just drag and drop it to the camera here And you see it's attached here And then I double click So it opens in Visual Studio 2015 I really like that with Unity 5 they finally added full Visual Studio support
So you can use Visual Studio for editing your scripts And you can actually also debug your scripts Which is very nice Just set a breakpoint in Visual Studio Attach to the Unity process And you can debug your scripts Very convenient if you're a Windows developer So that's nice And you can use C Sharp scripting like I said And also JavaScript actually So that's nice
We're using C# here; that's my preferred language for this kind of stuff. Okay, cool. So this is the basic Unity scene, sorry, the basic Unity script that is generated for us. We have the Start method for initialization, which is called once when the script is instantiated the first time, and the Update method, which is called every frame basically.
So let's remove that stuff, because I have some code snippets already prepared, which we will plug in, and then we talk about those. So let's add the gesture recognizer for the air tapping. And I have this class here, where I instantiate a GestureRecognizer, and like the name implies, this is a HoloLens-specific API which is responsible for, well, recognizing gestures.
It has a few events, like this tapped event, which is of course fired when the user does this tap gesture. And then we can also define what kind of gestures we're interested in; in this case we just want to listen to air tapping, the single tap. So that's fine for our use case here, but you could also set a few more gestures of course.
And then we call start capturing gestures, which is nice because we can also start and stop this kind of gesture capturing, the recognition, as we want. Okay, let's add the tapped event here. This is the event handler which is triggered once the user taps; we call the shoot method.
And what the shoot method does is it creates a sphere in code So just like we created those cubes in Unity editor We can also create those dynamically in code And since we attached that script to the camera I thought those spheres are like you're shooting out your eyeballs right So that's why I'm calling the variable eyeball So let's set the scaling a bit lower
So it's not a huge ball but smaller And we attach the rigid body component to it as well To have a rigid body physics calculations We give it a mass which is a bit lower And we set the position the initial position of the rigid body to the transform dot position And this transform in the script is the camera transformation
Since we attached the script to the camera So the transform position is basically the camera position right And then we give it an impulse into the direction the user is looking So transform dot forward is the vector the user is looking And we multiply it with a constant factor Which is defined up here as 300 newton
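Putting those pieces together, a minimal sketch of the CannonBehavior script described above could look like this. It assumes the Unity 5.4-era UnityEngine.VR.WSA.Input namespace (later Unity versions moved it to UnityEngine.XR.WSA.Input and changed the Tapped event signature slightly), and the scale and mass values are just placeholders, so treat it as a sketch rather than the exact demo code:

using UnityEngine;
using UnityEngine.VR.WSA.Input;   // UnityEngine.XR.WSA.Input in later Unity versions

public class CannonBehavior : MonoBehaviour
{
    // Public fields show up in the Inspector and are serialized by Unity.
    public float ShootingForce = 300f;   // the constant factor mentioned in the talk

    private GestureRecognizer recognizer;

    void Start()
    {
        recognizer = new GestureRecognizer();
        // We only listen to the single air tap here; more gestures could be added.
        recognizer.SetRecognizableGestures(GestureSettings.Tap);
        recognizer.TappedEvent += (source, tapCount, headRay) => Shoot();
        recognizer.StartCapturingGestures();
    }

    public void Shoot()
    {
        // Create a sphere dynamically in code, just like the cubes in the editor.
        var eyeball = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        eyeball.transform.localScale = Vector3.one * 0.15f;   // assumed size

        // Rigidbody physics: start at the camera (this script is attached to it)
        // and push the sphere in the direction the user is looking.
        var body = eyeball.AddComponent<Rigidbody>();
        body.mass = 0.5f;   // assumed mass, "a bit lower" as in the talk
        body.position = transform.position;
        body.AddForce(transform.forward * ShootingForce);
    }
}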
And if you have not done Unity You might wonder why I define public fields here You will see in a second once we switch back to Unity So let's save it here And we can switch back to Unity Give it a few seconds to update And there we go And you see that field is now here So this is nicely done without any attributes or something
It just surfaces here in the Unity Inspector, and they do this with public fields because they also serialize and deserialize them. Cool. So this is our basic setup; let's build it for the HoloLens. So: Build Settings, then we select Windows Store; since it's a Windows 10 device we select UWP 10.
We want to render with Direct3D because we want to have 3D holograms, and then we hit the Build button, select the folder and let it create the build output. And what it does now: it basically generates a new Visual Studio solution for us, which is the HoloLens application, because the HoloLens application in the end is a UWP app.
That's what it is, with Direct3D rendering. So it generates it for us, and I did this before, so it should be a bit faster now, and I already opened the other solution. So once that is done you will see this Reload All here, so we can hit Reload All. Yeah, reload it. And then we launch it in the emulator. So you can select Release.
I usually just go with Release mode, x86, because the HoloLens has a 32-bit processor, so we use x86. And then you can select the target output. So you can go with Device if you have the device connected via USB, or also Remote Machine: if you have the device's IP address, you can deploy over the air basically.
It's very nice And we will just go with the HoloLens emulator for this case So just hit the button here And I already opened the HoloLens emulator before So it's a bit faster And you have the nice Windows start menu here And you can simulate gazing basically with the mouse Just left mouse button down And mouse move is gazing
And right mouse button down is simulating an air tap So that's very convenient So our app launches now And there we go We have our cubes on the plane And then I can hit the right mouse button And shoot at those guys Right
But you see an issue here It's very hard to aim It's very hard to see where I'm actually shooting So I don't have an indication of my gazing So I probably want to have some indication of gazing So let's fix that Let's add a gaze cursor Okay Stop it here Switch back to the editor Yep Yeah
All of that Let's add a gaze cursor Plug it in What I define here is another reference We will set up from our Unity scene So we can create another game object And just plug it into our script here And you will see that in a second So we use another game object for showing the gazing For showing like a cross or something
You can see where you're basically looking at Okay And I will do the gaze update in the update method So this update method like I said is called every frame by Unity And what I'm doing here I will do array cast all Which is a built-in method By Unity So I can shoot basically a bunch of rays into the scene
From the camera position in the direction the user is looking at And where those rays hit in virtual object I get a ray cast hit back So and I sort those by distance Because I'm interested in the one object that is the closest to where I'm looking at So then I get this first hit basically And I use that ray cast hit the position where this was hit
To set the position of our gaze cursor to that place And also I want to orient the gaze cursor nicely to the surface of the object So I use the ray cast hit normal as a forward vector for our gaze cursor Okay let's save it Switch back to Unity And add our actually gaze cursor game object
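Roughly, that gaze update added to CannonBehavior could look like this (the GazeCursor field is the game object we create next and wire up in the Inspector; OrderBy needs "using System.Linq;" at the top of the file):

// Added to CannonBehavior: a public reference to the gaze cursor object and an
// Update that places it wherever the user's gaze hits something.
public GameObject GazeCursor;

void Update()
{
    // Shoot a ray from the camera (this transform) in the gaze direction
    // and collect everything it hits.
    var hits = Physics.RaycastAll(transform.position, transform.forward);
    if (hits.Length == 0)
    {
        return;
    }

    // We only care about the closest object the user is looking at.
    var firstHit = hits.OrderBy(h => h.distance).First();

    // Place the cursor at the hit point and align it with the surface normal.
    GazeCursor.transform.position = firstHit.point;
    GazeCursor.transform.forward = firstHit.normal;
}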
So what we will do here We will add another cube just a simple cube We call it a flat cube Because we will make it a bit smaller Let's go with Yeah like this
And this So very small flat cube And you see if we would just use the white color and on the white cubes We wouldn't see why we're gazing So let's add a different material to this one And I also want to remove the box collider Because I don't want to have like physics interaction with the gazing It's just useful visual indication Add a new material
Create material red There you go And just apply it So we have our gaze cursor here Another thing we need to do of course We need to add the reference here So I just can drag and drop the game object From here to there
So our script has a reference to this game object basically. Cool. But before we deploy it, I want to add some more features. I also want to add spatial mapping. So right now we have air tapping, we have gazing; now I want to have spatial mapping. And spatial mapping is quite complex to implement, there are many lines of code you would have to write, and this is a session on its own actually.
But I made it a bit easier, and there's a nice project by Microsoft called HoloToolkit. It's up on GitHub, an open-source project, and they have a bunch of very nice scripts, a few for spatial mapping as well. So you can just use them, because someone already implemented it for you. Very nice. So I created a custom package,
which I can just import here. Here you go. So this is a Unity package where I just extracted the stuff from the HoloToolkit which we are interested in, like the spatial mapping collider, and we will talk about this in a second. So let's import those.
I have a prefab here I just plug it into my scene And this has the two scripts we're interested in already attached And there are two scripts from the Holo Toolkit Which I'm using here One is the spatial mapping collider Which basically generates a spatial mapping collision mesh So which we can use for physics interaction, right?
We can use the real world spatial map for physics interactions And this has a bunch of properties So I can say what kind of bounding volume I'm interested in And I set it to a sphere of five meters So everything that's five meters around me That's what will be part of the spatial mapping mesh Then you can also define level of detail Low, medium, and high Which basically means higher precision
But also takes more processing power of course And then the default for time between updates is two and a half seconds Which basically means this is the time The spatial mapping mesh will get updated So if lots of people are walking here by fast They won't be part of it But if I place the table in here and move it out This will be part of the updated mesh
Yeah, and there's also the spatial mapping renderer Which is similar to the spatial mapping collider But this is used for visualizing the spatial mapping mesh And I can use material here So I have a wireframe material attached to this one Which is just like a wireframe Looks like a net And you will see it in a second how this actually looks like
And I could also set it to occlusion. So if I set occlusion, you won't see the actual mesh, you won't see the spatial mapping mesh, but your virtual objects will be occluded; they will be hidden by real-world objects, right? So this is probably something you want to do. Okay, cool. So we have gazing, gestures, spatial mapping. Let's add the last thing for the session.
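As a quick aside before that last piece: if you don't use the HoloToolkit prefab, attaching the two spatial mapping components from code looks roughly like this. This sketch assumes the built-in UnityEngine.XR.WSA components of later Unity versions (the HoloToolkit scripts used in the demo expose the same ideas); the bounding volume, level of detail and update interval are then configured in the Inspector exactly as described above.

using UnityEngine;
using UnityEngine.XR.WSA;

public class SpatialMappingSetup : MonoBehaviour
{
    void Start()
    {
        // Builds a collision mesh of the real room so physics objects can bounce off it.
        gameObject.AddComponent<SpatialMappingCollider>();

        // Visualizes the scanned mesh (e.g. with a wireframe material) or, in
        // occlusion mode, hides holograms behind real-world geometry.
        gameObject.AddComponent<SpatialMappingRenderer>();
    }
}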
Let's add speech recognition, and we'll do this with a nice little script. So I have a C# script, another one, and I just call that one SpeechHandler. There you go. Let's remove that stuff
and plug in some code I prepared. Yeah, it doesn't... yeah, ReSharper doesn't show me what namespaces I should add, so let's switch here and copy those in. There you go. Yeah, it's Windows Speech.
I always forget this one, it's Windows.Speech, so the namespace is here. Okay, cool. This is the KeywordRecognizer. This is similar to the GestureRecognizer, but of course it's used to recognize keywords, right? And the nice thing is you just give it a string array, basically. So you can see I defined some string variables up here.
These are just normal C# strings, right? Just a normal C# string, like "hide plane", "shoot", "reset scene", and I pass those into the constructor of the KeywordRecognizer. And if you have done speech recognition before with other platforms, you probably had to define some XML and the grammar and whatnot,
and this is really what I like: they made it very easy. You just use C# strings and there you go. And once it recognizes one of those keywords, it triggers that OnPhraseRecognized event, and we have an event handler here, and this gets passed in the arguments and the text, which is basically the recognized string. So I just do a very simple string compare.
It's so easy. Yeah, really simple. So I compare it with my hide plane command. [Cortana chimes in again: "Something's not right. Try again in a little bit."] It's all good, Cortana, calm down. So for the hide plane command I can set the plane deactivated, and for shoot, I'm shooting the cannon.
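A minimal sketch of that SpeechHandler could look like this (the Plane reference and the CannonBehavior from the earlier sketch are assumed to be wired up in the Inspector, as shown in a moment):

using UnityEngine;
using UnityEngine.SceneManagement;
using UnityEngine.Windows.Speech;   // the namespace that is easy to forget

public class SpeechHandler : MonoBehaviour
{
    // Wired up in the Inspector.
    public GameObject Plane;
    public CannonBehavior Cannon;

    // Plain C# strings, no grammar XML needed.
    private const string HidePlaneCommand = "hide plane";
    private const string ShootCommand = "shoot";
    private const string ResetSceneCommand = "reset scene";

    private KeywordRecognizer recognizer;

    void Start()
    {
        recognizer = new KeywordRecognizer(
            new[] { HidePlaneCommand, ShootCommand, ResetSceneCommand });
        recognizer.OnPhraseRecognized += HandlePhrase;
        recognizer.Start();
    }

    private void HandlePhrase(PhraseRecognizedEventArgs args)
    {
        // args.text is the recognized keyword, so a simple string compare is enough.
        if (args.text == HidePlaneCommand)
        {
            Plane.SetActive(false);
        }
        else if (args.text == ShootCommand)
        {
            Cannon.Shoot();   // the Shoot method from the CannonBehavior sketch above
        }
        else if (args.text == ResetSceneCommand)
        {
            SceneManager.LoadScene(SceneManager.GetActiveScene().name);
        }
    }
}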
And for the reset scene command I'm resetting the scene I'm just reloading it basically Cool Saving it Always important to save the scripts And then switch back to Unity We have our script here I just attach it to the camera as well So drag and drop There you go And then we plug in the references Like the plane
And the cannon The cannon is attached also to the camera So I just attach this one here And yeah, let's build it Again for the emulator But we will also see it in the HoloLens later Okay, let's build it for a second You notice it's taking quite some time to build
The deployment cycle is taking quite a bit to test some stuff And at the end I will share some best practices How you can avoid that And how you can be more productive and faster inside Unity Without having to wait always for the build time and the deployment cycle Cool Launch the emulator again with our updated stuff
And the emulator is really nice It's included in the SDK And you can actually also load different spatial maps So you have this room tab here Where I can basically load a spatial mapping I created before So I can take the device and save the spatial mapping of a room And then load it into the emulator Pretty cool
Also for testing Okay, cool So you see the gazing here I have this little flat cube This is now shown at the position I'm looking at And also nicely orients to the surface of the object I can also use speech commands Hide plane You see And you see another thing They bounce off in the real world And if you look closely you might be able to see a sofa here
So this is the default room that comes with the HoloLens emulator Okay, cool So this is the emulator Let's do something very brave Let's switch to the device And you should be able to see what I see now
Let's put it on And you will get the code from the demo all in GitHub So I put all the sample code up on GitHub Good Let's disable also audio Yep, there we go
Okay, let's open the HoloLens start menu I have the finished app already pinned here So I open it Place it somewhere like this And then it loads And like I said, you can get all the source code from that little demo at the end of the session It actually also has audio collision
So we have spatial sound as well There you go I have a different gaze cursor You see, I made two little cubes And I have someone else here This is me, a 3D scanned hologram Yeah, I know it's weird Yeah, and since we're in the Scandinavian region
I thought instead of shooting spheres, we'll do something cooler So I thought we shoot some meatballs like scuttballer Is it correctly pronounced? I hope so So let's shoot some scuttballer This guy, I hate him Let's shoot him away Hide plane
You see that? And you can all get your köttbullar from the HoloLens. There you go. And you see them? They bounce off here in the real world, right? This is pretty amazing with the HoloLens. The HoloLens can do the spatial mapping, right? And so we can,
yeah, we can shoot here, and I could do this the whole day, basically. And it's almost lunchtime, right? Who didn't have his köttbullar yet? Yeah, okay, cool. Glad that worked out, because sometimes this doesn't work, right, with the Wi-Fi. But I actually have my own Wi-Fi access point,
so that's good. Cool, glad the demo worked. Like I said, you can get the source code of this, maybe not with the köttbullar texture, but with the rest; you can get all of that on GitHub later on and you can play around with that. Cool. I also had a video prepared, just in case it doesn't work.
So this is the video I recorded, I recorded it at home. And you see, for this one I actually used eyeball textures. And yeah, the nice thing is, you see how it adapts, how the spatial mapping adapts to the environment, those floors, those planes. You see how it bounces off and rolls down the stairs, so this is recognized by the HoloLens.
Yeah, good fun Okay, cool So shooting eyeballs and scuttballen is really, really fun But a very small niche, right? So let's talk about some real-world applications And we have built a couple of apps right now
So we did an engagement with a museum So we tried out some different stuff What you can see in a museum So we put some dinosaur models into the office Which is very amazing actually Because you can see those dinosaurs at the real-world scale And you actually can figure out in metric space, right? So you can figure out how big they actually are
Really, really nice experience And we also did some stuff for automotive industry For construction as well And plane maintenance and a couple of other Actually real business use cases I want to spend a bit of time to talking about an app I was working on last couple of months It's called Holoflight
And I want to share some best practices Some learnings and stuff with you from the applications As we build it So Holoflight is a real-time flight data visualization basically And we take the real flight data and visualize it in 3D And if you think about air traffic controllers What they use these days is They are looking on flat 2D screens, right?
But planes and flights are actually flying in 3D They have an altitude So we figured let's put them in the HoloLens And visualize them as holograms So that gives you another relationship between the flights Because you see them nicely stereoscopic So you get the relations between the different flights And we can also visualize a usable invisible information
Like flight trails and so on Here's a quick video Showing you the app Yeah, let's turn down the volume a bit What is that? Okay, that's a bit slow
What is that? Let's restart the video Okay, that's better I think Cool, so you can see the Hawaiian Islands here And we have the flight space of the Hawaiian Islands basically visualize And you can gaze at those planes that are flying
Like I said, this is real-time flight data, and you can see flight information like callsign, altitude and whatnot. You can also hear air traffic control conversations in spatial sound, and we can also visualize the airport weather information: wind speed, wind direction, all of that stuff. And we use the spatial mapping of the HoloLens,
so the user can basically pin those information panels in different places, like on different walls, on a table and whatnot, and basically lay out the workspace. We also have different levels of detail for the terrain. The terrain is, by the way, built using Bing Maps API data, so this is real topographic data. Yeah, like I said, we have real-time flight data,
but we also have our own Azure backend where we can cache the data, so we can play it back at different speeds basically. What you can see here: we play it back faster, and then you can see those flight trails visualized, which you usually don't see in flight information. So that's nice and adds another value of course. Okay, so what are the challenges?
First of all, usually flight information is visualized in 2D, but if you visualize it in 3D, you also open another dimension of errors basically, which you don't see in 2D. Yeah, you also need to be careful with the holographic frame size, because you cannot put too much information in front of the user. Because flight information is very dense, you have a lot of information, you want to show a lot of information, but on the other hand you need to be careful that you're not putting too much stuff there.
The spatial mapping of the HoloLens, as you have seen, is really amazing; it can basically scan the room, so you want to use that in some way in your app. Gazing at and selecting small objects like those tiny planes can be very hard, to gaze at them and select them, because they are pretty small, so we had to fix that as well.
And yeah, you want to make it an awesome immersive experience And spatial sound is also one important factor there So how did we solve those? First of all, finding the right flight data And we partnered with a company that provides us the flight data In a nice REST-based web API So we get the data as JSON
Easy, everyone can parse that. But then we noticed a bunch of errors in the data actually, which you don't see in 2D like I mentioned. For example, we had some crazy altitude drops, like, phew. And this happens just a few times in reality, but not as often as we have seen it in the data, so there were definitely some data glitches, right? Because if it happened as often as we have seen it,
No one would fly anymore Right? So we had to fix that And you can basically go with two approaches You can do it offline Or you can do it online Since we want to do it in real time We did it online So we invented some algorithms there To fix the data To smooth out the data To make more sense out of the data basically
And avoid those issues Then of course you need to visualize it in 3D So you have geographical coordinates And you need to map that to an unwrapped planar rendering And we also want to map the altitude And if we would map the altitude linearly Like the 30 to 35,000 feet of flight space We would waste a lot of rendering space With uninteresting flight information
Because the most interesting flight information Is in the first like 3 to 5,000 feet close to ground, right? So what we do We use a non-linear mapping basically of the altitude So we give the more interesting flight space Also more rendering space Then we get a bunch of positions for each of the plane in the flight space And they can even have different timestamps A lot of different data basically
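The exact mapping isn't shown in the talk, but as a purely hypothetical illustration (UnityEngine's Mathf assumed), a square-root curve already gives the lower altitudes a much bigger share of the rendered height than a linear mapping would:

// Hypothetical illustration of a non-linear altitude mapping, not HoloFlight's actual formula.
// Low altitudes get proportionally more of the rendered height.
static float MapAltitude(float altitudeFeet, float maxAltitudeFeet, float renderHeightMeters)
{
    float normalized = Mathf.Clamp01(altitudeFeet / maxAltitudeFeet);
    // The square root stretches the lower range: 10% of the altitude gets about 32% of the height.
    return Mathf.Sqrt(normalized) * renderHeightMeters;
}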
So we need to normalize them So they make sense when we visualize them all together In one flight space rendering And the right size is important of course Because we are visualizing those planes And if you're stepping it further away They can become very small So you just see a few pixels actually So what we do there
We use level of detail to swap out the plane model With just a cube And you don't notice this Because you just see a few pixels anyway And we also make sure if the user steps even further away The cube will stay at the same size You can still see a few pixels there in the back And you know there's something going on there Yeah UI We had a couple of iterations for the UI To make sure that we are like I said
Not polluting the holographic frame Not showing too much information there And this was the first iteration That's my fault It's developer UI So super ugly And the next one was in-place billboards So we had those little billboards Which were shown directly at the plane position So you could pin and enable them Multiple billboards basically And you see the issue in the screenshot
You have those overdrawing You have those overlay issues Also not very nice Then we figured out Let's use a curved screen UI Which is nice Because the human head is also a bit curved So you have all these curved TVs right So we figured let's use a curved piece But this thing then kept on growing and growing We wanted to show more information
Like flight information And weather information And whatnot So it was also not fitting the holographic frame anymore So then we split it up And we polished it To what you have seen in the video And we have those independent panels basically And the user can pin those Separately in the room basically
And lay out his workspace And we're using the spatial mapping of the HoloLens To allow the user to place those Cool Yeah size matters Also for ray casting Like I said Those planes are very small And if you want to gaze If you want to select them The gaze rate will often hit nothing basically
So we figured let's do something else And we use basically just a simple sphere collider So we use a sphere collision volume Which is a bit larger And you can also think about to dynamically adapt it So if the user is stepping further away You can grow it a bit To a certain amount And if the user is getting close So you can make it smaller And then actually switch to the real mesh collider of the plane
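As a rough sketch of that idea (a hypothetical component, not the HoloFlight code): a sphere collider whose radius grows with the distance to the camera, so far-away planes stay easy to hit with the gaze ray.

using UnityEngine;

// Hypothetical sketch: enlarge a plane's sphere collider as the user steps away,
// so the gaze ray still has something reasonably sized to hit.
[RequireComponent(typeof(SphereCollider))]
public class GazeTargetScaler : MonoBehaviour
{
    public float MinRadius = 0.05f;             // meters, when the user is close
    public float RadiusPerMeterDistance = 0.02f;
    public float MaxRadius = 0.3f;

    private SphereCollider sphere;

    void Start()
    {
        sphere = GetComponent<SphereCollider>();
    }

    void Update()
    {
        float distance = Vector3.Distance(Camera.main.transform.position, transform.position);
        sphere.radius = Mathf.Clamp(
            MinRadius + distance * RadiusPerMeterDistance, MinRadius, MaxRadius);
    }
}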
And the sphere collider is also a nice performance gain, because testing a ray against a sphere is really cheap, it's really easy and doesn't cost a lot of performance. Yeah, spatial sound. There are some crazy experiments out there, as you can see on the slide.
Really, I'm sure it has a super amazing spatial sound with all of this, but it might also be a bit heavy on the head, so I rather prefer this one. And the HoloLens has those tiny speakers up here, you have those small speakers here, and of course they don't provide a very bass-heavy sound,
not very low frequencies, but what it does with those tiny speakers is super impressive, it's really good. And yeah, it's perfect brain trickery actually, because you know where the sound is coming from. And in HoloFlight we actually use it for this: "Speedbird 182 heavy, right on Mike Alpha, right on Bravo, then Papa." So we use that to play back air traffic control conversations.
And those are not just adding value from the ATC conversation itself But also from the spatialness From the spatial sound Because we're playing back that sound in 3D at the plane position So even if you're not seeing the hologram Like rotating your head You don't see the hologram which is somewhere here But you hear the spatial sound
And your brain knows where to turn You know where the sound is coming from Right And this is what they have really done very nicely with the HoloLens They have a very good algorithm there to compute those spatial sound Really good And if you have a use case in your applications Make sure to use spatial sound It's really adding the topping on the cake if you will
Cool Some further best practices So use fading and transitions It's an important one Because in the real world objects don't just appear or disappear Maybe ghosts if you believe in such things But I don't so And virtual objects you can just enable disable them right You can just make them appear or disappear
But of course you want your virtual objects to behave like real-world objects, so you need to fade them in, move them in, grow them in size, shrink them in size and so on. And I have three short clips here where I want to show you the differences between those three approaches, what you can do basically. And so we have all the Hawaiian Islands here,
And then we want to switch to just one island right So we see all islands Just switching to one island without any transition So this is not very nice Very awkward Then you could do a cross-fading like an alpha blending between the terrains So you can do this Which is a bit nicer It's a bit smoother But on the other hand you're losing the context
So as a user you don't know where you're looking; you don't know which island you're actually zooming into, right? You see all islands and then just see one island, but which one? So what we did in the app is basically scaling up the mesh of all the islands, sorry, scaling up the texture of the all-islands mesh, and then doing a cross-fade to just the one island.
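As a tiny sketch of such a cross-fade (a hypothetical helper; it assumes both materials use a transparent shader whose color exposes an alpha channel):

using System.Collections;
using UnityEngine;

// Hypothetical cross-fade helper: fades one renderer out while another fades in.
public class CrossFader : MonoBehaviour
{
    public IEnumerator CrossFade(Renderer from, Renderer to, float duration)
    {
        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            float blend = t / duration;
            SetAlpha(from, 1f - blend);
            SetAlpha(to, blend);
            yield return null;   // wait one frame
        }
        SetAlpha(from, 0f);
        SetAlpha(to, 1f);
    }

    private static void SetAlpha(Renderer renderer, float alpha)
    {
        Color c = renderer.material.color;
        c.a = alpha;
        renderer.material.color = c;
    }
}

You would start it with something like StartCoroutine(crossFader.CrossFade(allIslands, oneIsland, 1f)).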
So this is a bit nicer, and it's a nicer transition for the user, to actually know where to look. Cool. So let's talk about my top 10 HoloLens development recommendations. First of all: you can be a really good developer, you can be a great 3D developer, the best one in the world,
but if you don't have any 3D content to show, well, you don't have anything to render, right? And at IdentityMine we're lucky to have a bunch of very talented designers and also 3D artists who can make very nice and good 3D models, and also with a reasonable triangle count. And I say reasonable because the HoloLens is a mobile device.
Right Hololens is a mobile device And compared to virtual reality headsets like Oculus Rift and HTC Vive With those they are just displays basically With a bunch of sensors But the computing is all done on a computer In a full-blown desktop PC Right For the Oculus Rift you need a really high-end PC
Which of course can compute like amazing scenes Can compute really nice renderings But you're always connected with a cable You're basically always on the leash Right So with the Hololens I much prefer the Hololens approach Because it's self-contained Everything This is a computer right It has everything inside here And I don't need an extra cable I can just put it in and walk freely around
And can interact with other people as well So this is much nicer But on the other hand you have limited computing power of course Because you cannot put in your full-blown desktop graphics card in that device It would probably get very hot if you would do that And this is something you want to avoid as well So limited computing power
And what we noticed is that the HoloLens is mostly fill-rate bound. You can render tens of thousands of triangles; 50,000, 60,000, 70,000 is not an issue. But if those are rendered close to the eye, close to the camera, so they take up a lot of pixels, and you have a heavy pixel shader running for each pixel,
then your performance will drop. You run into issues really quickly there. So you want to draw your pixels very cheaply, and for example don't use the Unity Standard shader, because it's too heavy, it's doing too much. What we noticed is that you can mostly get away with vertex lighting, so you can just use a simple vertex-based lighting model and a super cheap pixel shader.
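As a rough illustration only, here is how you could swap a hologram's materials over to Unity's built-in Mobile/VertexLit shader, which does its lighting per vertex and keeps the pixel work cheap. In practice you would rather assign an optimized shader (for example one from the HoloToolkit) directly in the material assets instead of at runtime:

    using UnityEngine;

    // Illustrative only: switch every renderer under this object to the
    // built-in Mobile/VertexLit shader to keep per-pixel cost low.
    public class UseCheapShader : MonoBehaviour
    {
        void Start()
        {
            Shader vertexLit = Shader.Find("Mobile/VertexLit");
            foreach (Renderer r in GetComponentsInChildren<Renderer>())
            {
                foreach (Material m in r.materials)
                {
                    m.shader = vertexLit;
                }
            }
        }
    }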
Another thing you want to avoid is overdraw. This applies to every mobile device basically: you don't want meshes layered one behind another in a way that causes the same pixels to be drawn multiple times.
You want to avoid those overdraw issues. Large transparent objects are an issue too, so you need to be careful there. Because you want to render at 60 frames per second. This is really important: render your holograms at 60 frames per second, because if they drop to 30 frames they can become unstable. You have seen they're pretty stable in the room; even if I turn my head they stay at the same position.
But if the frame rate drops, they can become unstable and drift, and this can make users sick. Microsoft actually did some user research there, and some people really get headaches and can throw up. So you really want to hit 60 frames and optimize everything. Still, the HoloLens has a multi-core CPU,
and in HoloFlight we use that for the data fetching and the data clearing. All that processing is done in a background thread basically, so we can keep the UI thread free from that work and it can keep up with the rendering loop.
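Here is a minimal sketch of that pattern (not HoloFlight's actual code): heavy work runs on a background task and results are handed back to the Unity main thread through a queue drained in Update. It assumes a runtime where the Task Parallel Library is available, i.e. the UWP player rather than the editor's old Mono:

    using System;
    using System.Collections.Generic;
    using System.Threading.Tasks;
    using UnityEngine;

    // Heavy work (e.g. fetching and parsing flight data) runs off the main
    // thread; Unity objects are only touched on the main thread via the queue.
    public class BackgroundDataLoader : MonoBehaviour
    {
        private readonly Queue<Action> mainThreadActions = new Queue<Action>();

        public void LoadData(string url)
        {
            Task.Run(() =>
            {
                string result = FetchAndParse(url);   // hypothetical heavy work
                lock (mainThreadActions)
                {
                    mainThreadActions.Enqueue(() => ApplyToScene(result));
                }
            });
        }

        void Update()
        {
            lock (mainThreadActions)
            {
                while (mainThreadActions.Count > 0)
                {
                    mainThreadActions.Dequeue()();
                }
            }
        }

        string FetchAndParse(string url) { /* hypothetical */ return url; }
        void ApplyToScene(string data) { /* hypothetical */ }
    }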
If you have never done any 3D programming, or it's been a while, you probably want to brush up on some of your math skills. As you have seen, there's a bunch of stuff going on with vector algebra and so on. You don't have to implement it all yourself; Unity or whatever game engine you're using helps you a lot, they have everything built in. But of course you need to be familiar with what it means: what is a transformation,
what is a matrix calculation. This is important. Anchor your holograms. The HoloLens has an API which is called World Anchor. If you remember the demo, I grouped those cubes and the plane in one container in the scene hierarchy. And what I could do
is apply a World Anchor to that container, and then the HoloLens runtime would basically give that World Anchor its own coordinate system. Which means the HoloLens keeps those World Anchors very stable: even if I leave the room and come back, the holograms will still be at the same position.
This is done with the World Anchor. And the coolest part about it is you can actually persist those. You can save them in a global store on the HoloLens, you can save the World Anchor with an ID, and once you reload your app or restart the device and then load the World Anchor, it will be at the same location.
And when I'm flying back home, hopefully tomorrow (I don't know, I heard some really interesting things about a strike or something), well, if I'm flying home I will hopefully see some holograms in my office at the same position where I left them. This is done with the persistence, so you can persist those World Anchors. Pretty cool.
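A minimal sketch of anchoring and persisting a container object with Unity's World Anchor API. The namespaces below follow the Unity version of that timeframe (UnityEngine.VR.WSA; newer versions use UnityEngine.XR.WSA), the anchor ID is arbitrary, and a production app would wait until the anchor is located before saving:

    using UnityEngine;
    using UnityEngine.VR.WSA;              // WorldAnchor
    using UnityEngine.VR.WSA.Persistence;  // WorldAnchorStore

    public class AnchorContainer : MonoBehaviour
    {
        public string anchorId = "hologram-container";   // hypothetical ID

        void Start()
        {
            WorldAnchorStore.GetAsync(OnStoreLoaded);
        }

        void OnStoreLoaded(WorldAnchorStore store)
        {
            // Restore a previously persisted anchor if we have one.
            foreach (string id in store.GetAllIds())
            {
                if (id == anchorId)
                {
                    store.Load(anchorId, gameObject);
                    return;
                }
            }

            // Otherwise anchor the container here and persist it.
            // (Ideally check anchor.isLocated before saving.)
            WorldAnchor anchor = gameObject.AddComponent<WorldAnchor>();
            store.Save(anchorId, anchor);
        }
    }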
Leverage level of detail, like you've seen; you can save some rendering time there as well. You don't need to show the highly detailed model when it's meters away. The gaze cursor is important. As you have seen, the main input paradigm of the HoloLens is gazing and gestures,
so you want to make sure that your gaze cursor is very stable and very nicely done, because the user will see it most of the time. And one part of that is to smooth it, because the gazing is based on head rotation, which comes from the IMU, which means it's based on sensor data.
Like every sensor in the world, it contains noise if you use the raw data, which makes the gaze cursor jitter, always slightly move. And this is something you want to avoid; you don't want the user to always have this jittering thing in front of them. So you want to smooth it. And the HoloToolkit I mentioned
has a bunch of reusable scripts and also a smoothing algorithm implemented, so you can use that one. We actually developed our own because it gave us a bit better results with less lag; it's a bit faster to react. The one we implemented also has some prediction, so it gives us better results.
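A basic sketch of the idea: low-pass filter the head's gaze direction before raycasting, so raw IMU noise doesn't jitter the cursor. This is a simple exponential smoothing, not the prediction-based algorithm mentioned above:

    using UnityEngine;

    // Places a gaze cursor where the smoothed head ray hits the scene.
    public class SmoothedGazeCursor : MonoBehaviour
    {
        [Range(0.01f, 1f)]
        public float smoothing = 0.25f;   // lower = smoother but laggier
        public float maxDistance = 10f;

        private Vector3 smoothedForward;

        void Start()
        {
            smoothedForward = Camera.main.transform.forward;
        }

        void Update()
        {
            Transform head = Camera.main.transform;

            // Low-pass filter the gaze direction.
            smoothedForward = Vector3.Slerp(smoothedForward, head.forward, smoothing);

            RaycastHit hit;
            if (Physics.Raycast(head.position, smoothedForward, out hit, maxDistance))
            {
                transform.position = hit.point;
                transform.rotation = Quaternion.LookRotation(hit.normal);  // hug the surface
            }
            else
            {
                transform.position = head.position + smoothedForward * maxDistance;
                transform.rotation = Quaternion.LookRotation(smoothedForward);
            }
        }
    }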
Another thing about the gaze cursor is the hand-ready state. The user is air tapping, and the cameras of the HoloLens of course need to see the hand. So if the user air taps down here, it won't work, because the device doesn't see the hand. It needs to see the hand somewhere up here, or over there also works, but not down there. So you want to give that feedback to the user,
that the device is now seeing the hand and they can interact with the piece they are gazing at. What most apps do, and also the HoloLens start menu, is show an open ring gaze cursor when the hand is in view, to tell the user okay, you can now interact, and show a flat circle, something like this,
when the hand is not in view, to give the user the information: open cursor, you can interact; the other cursor, nope. Those details really matter. They make a good experience, and you probably want a good experience, so pay attention to those details.
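A sketch of that hand-ready feedback: switch the cursor visual when a hand is detected. The event names follow the Unity API of that era (UnityEngine.VR.WSA.Input; later versions renamed them to InteractionSourceDetected and so on), and the two cursor objects are hypothetical:

    using UnityEngine;
    using UnityEngine.VR.WSA.Input;   // InteractionManager

    public class HandReadyCursor : MonoBehaviour
    {
        public GameObject openRingCursor;   // hypothetical: shown when a hand is in view
        public GameObject flatDotCursor;    // hypothetical: shown otherwise

        private int handsInView;

        void Awake()
        {
            InteractionManager.SourceDetected += OnSourceDetected;
            InteractionManager.SourceLost += OnSourceLost;
            UpdateCursor();
        }

        void OnDestroy()
        {
            InteractionManager.SourceDetected -= OnSourceDetected;
            InteractionManager.SourceLost -= OnSourceLost;
        }

        void OnSourceDetected(InteractionSourceState state)
        {
            if (state.source.kind == InteractionSourceKind.Hand) { handsInView++; UpdateCursor(); }
        }

        void OnSourceLost(InteractionSourceState state)
        {
            if (state.source.kind == InteractionSourceKind.Hand) { handsInView--; UpdateCursor(); }
        }

        void UpdateCursor()
        {
            bool ready = handsInView > 0;
            openRingCursor.SetActive(ready);
            flatDotCursor.SetActive(!ready);
        }
    }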
Use animations and transformations to let your virtual objects behave like real objects. And here's another nice one. If you have a bunch of Unity projects, you probably also have a few reusable scripts. The naive approach would be to copy those script files between all the different projects, but that's not nice, because when you have a bug
or want to change something, you have to copy all the stuff again. So what I did instead is create a central C# solution where I have all the reusable scripts in one place, and I can then just build a DLL and copy it into a special folder in the Unity project.
The Unity project has that Assets folder, right? And there's a special folder you can create called Plugins where you can put DLLs, and then you can use those scripts from the DLL as well. This works out quite nicely, because I also have a post-build step in my C# solution, so I just hit build and all my projects are updated with the latest stuff.
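One way to wire up such a post-build step is a post-build event in the class library project that copies the built DLL into the Unity Plugins folder; the project path here is purely illustrative:

    xcopy /Y "$(TargetPath)" "$(SolutionDir)..\MyUnityProject\Assets\Plugins\"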
There's one gotcha, though. The Unity editor uses the Mono runtime, and it's actually a five or six year old version of Mono, so it's quite outdated. I think this is being fixed now since Microsoft acquired Xamarin. I was at the Unite conference last week and they basically announced
that it's on the roadmap to update the Mono runtime they're using inside Unity. Which is great, because right now you don't have the TPL or anything in the Unity editor; you don't have the Task Parallel Library. But on the HoloLens it's running the Universal Windows Platform, so it's running UWP with .NET Core, the latest one.
So what I have to do is basically have two C# projects: one I'm building for the Unity editor and its Mono runtime, the other one I'm building for UWP. I'm just sharing the scripts, and sometimes you have some preprocessor directives, like #if UNITY_EDITOR, do this, do that, those kinds of things.
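An example of the kind of preprocessor split this leads to (a sketch, not the actual shared library): NETFX_CORE is defined by Unity when building for the Windows Store/.NET Core backend, so you can branch between the TPL on the device and a plain thread in the editor's old Mono:

    // Shared helper that runs work in the background on both runtimes.
    public static class BackgroundWork
    {
        public static void Run(System.Action work)
        {
    #if NETFX_CORE
            // UWP player: the Task Parallel Library is available.
            System.Threading.Tasks.Task.Run(work);
    #else
            // Unity editor (old Mono): fall back to a plain thread.
            new System.Threading.Thread(() => work()).Start();
    #endif
        }
    }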
But anyway, having this central C# solution with all the scripts in one place really is a huge benefit for maintenance. Also, avoid the long deployment cycle. As you have seen, it takes quite some time to deploy something to the HoloLens. First you need to change your Unity scene,
then build the Visual Studio output, then build that again, and then deploy it to the emulator or the HoloLens. That's really slow if I just want to test some small stuff. So what I did here is write a custom script where I can simulate the gazing and gestures already inside Unity.
I can do the same thing the emulator does: I can use my mouse for gazing, I can use the right mouse click for air tapping, and I can also use different gestures, but I can do this already inside Unity. So for every little change I don't have to deploy to the device or the emulator. It's a huge, huge time saver. The HoloLens also has a bunch of other gestures I didn't mention,
like the air tap I showed, then there's double tapping, and tap and hold, so you can do things like scrolling. The nice thing is it's three-dimensional, right? Not just this, but also like this. Those gestures I can also simulate inside Unity with the custom script I wrote, so you can stay productive most of the time inside Unity.
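A rough sketch of such editor-only input simulation (not the speaker's actual script): the mouse ray stands in for the gaze and the right mouse button stands in for the air tap, so gaze and gesture logic can be exercised without a deploy. The OnAirTap message name is a hypothetical convention:

    using UnityEngine;

    public class EditorGazeGestureSimulator : MonoBehaviour
    {
    #if UNITY_EDITOR
        void Update()
        {
            // Simulated gaze: ray from the camera through the mouse position.
            Ray gazeRay = Camera.main.ScreenPointToRay(Input.mousePosition);

            RaycastHit hit;
            if (Physics.Raycast(gazeRay, out hit))
            {
                // Simulated air tap: right mouse button, like the emulator.
                if (Input.GetMouseButtonDown(1))
                {
                    hit.collider.SendMessage("OnAirTap", SendMessageOptions.DontRequireReceiver);
                }
            }
        }
    #endif
    }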
But of course you also want to test it on a device, if you're lucky enough to have one, because nothing comes as close to this device as the real device itself, you know, the performance and spatial mapping and so on. But the emulator is really good. As you have seen, it supports speech recognition, I can load spatial mappings, and I can basically test most of the stuff already with the emulator.
Cool. So I think the next chapter of computing is really happening right now, and it's a great time to be a developer. It's really good to be working in that space again. And the HoloLens is one of a kind, I think. We have a bunch of virtual reality devices,
and the HoloLens is often compared to those, but of course it's not a VR device, as you just learned. It's a mixed reality device, so it's totally different. And the difference is not just in the rendering, in augmenting the real world with holograms; it can also do the spatial mapping, for example,
and you can have the collaboration features and so many other things. I'm pretty sure the HoloLens will change how we interact with computers. It really has huge potential. And you can even run 2D apps on it: you can develop Universal Windows Platform applications and they will also run on the HoloLens, but they will run in a window basically, so you have a 2D window.
Then you can pin it here or there or wherever, which is also nice. But of course, with stereoscopic 3D rendering, 3D is king, so you want to have 3D content. And you can do this by going straight to Direct3D 11, which we also do by the way, or you can use Unity. Unity is a nice tool, you can get started quickly
and be very productive, and there's a use case for each approach. For example, I wouldn't build Skype with Unity; I would build it straight with Direct3D and C++, that's for sure. But for a really nice proof of concept, Unity is great. And not just for that: like I said, a bunch of the HoloLens applications in the store
built by Microsoft were actually built using Unity. You can get a few links here. The HoloLens SDK you can download for free; it has the emulator inside, and also the special Unity build that supports HoloLens development. The HoloToolkit for Unity is also available
on GitHub. It contains a bunch of scripts, prefabs and shaders. They actually have some nice optimized shaders you can use, like for vertex lighting and so on. Pretty good stuff, so you really want to grab a copy of that one once you install the SDK. And on my blog you can find the top 10 development recommendations as a longer write-up with more details,
and also the slides and the demo code. I will put the link for the demo code on my blog as well, and the demo code is actually on GitHub. Cool, so we just have a few seconds left and I don't want to overrun, because there's lunch, I think, and we're all hungry. Those earlier were just holograms, so they weren't real.
Hopefully they have some real ones; I need to try them. Anyway, you can shoot me an email, or tweet at me, or I will stick around here for a bit, so just ask me there if you have some questions. And with that, thank you for your attention.