Supercharge your Data Science workflow with Notebooks, VS Code, and Azure
Formal Metadata

Title: Supercharge your Data Science workflow with Notebooks, VS Code, and Azure
Title of Series: EuroPython 2020
Number of Parts: 130
Author: Mew, Jeffrey
License: CC Attribution - NonCommercial - ShareAlike 3.0 Unported: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal and non-commercial purpose as long as the work is attributed to the author in the manner specified by the author or licensor and the work or content is shared also in adapted form only under the conditions of this license.
Identifiers: 10.5446/49981 (DOI)
Release Date: 2020
Language: English
EuroPython 2020, talk 22 of 130
Transcript: English (auto-generated)
00:06
We're gonna be kicking things off here in just a minute with a talk by Jeffrey Mew of Microsoft. And speaking of Microsoft, if you all haven't noticed in the track name, I should probably mention that Microsoft is one of our generous sponsors that makes EuroPython 2020 online possible.
00:25
They've got some really great tools for Python development. I can say firsthand they're awesome because I do all my Python development in VS Code. It's a fantastic tool. So yeah, we've got people coming in here and this is gonna be great.
00:40
So Jeffrey Mew is a program manager at Microsoft working on the Python data science and AI experience in VS Code, which means he's responsible for building some of the stuff I like. Specifically, he focuses on making the lives of data scientists easier across our ecosystem. He holds a bachelor's degree in computer engineering from the University of Waterloo.
01:02
He's a lover of dogs and Python (the language, that is; he's still a little bit unsure about the snake). So Jeffrey, thanks for joining us. Thank you for having me. So where are you streaming from?
01:21
So I'm streaming from Seattle in the US, so it's actually in the morning for me. Yeah, same here. I'm over in Spokane, so we're only about 300 miles apart. Okay, really close. Awesome. Yeah, is it warm there too? Yeah, it's pretty good weather. That's what I love about the West Coast weather.
01:40
Just really nice all the time. Normally you don't hear people say that about Seattle, but we kind of like the rain, you know, sometimes. All right, well, I will turn this over to you then and rock and roll. Sure, thank you so much for having me. Let me just quickly share my screen.
02:02
We're gonna be talking about how to supercharge your own data science workflow with notebooks, VS Code, and Azure. And like Jason mentioned, my name is Jeffrey. I'm a program manager here at Microsoft and I work on all the data science and AI tools within Visual Studio Code. And there's some of my contact info if you want to get in touch.
02:22
So obviously, I just want to start with a quick demo of what we're actually building today. And along the way, like I mentioned, we're gonna be showcasing all the cool data science tools that we have in VS Code. So I've actually created this website, and it's already hosted in Azure Web Apps; this is what our final product is going to be. It's just a website that takes an image URL and passes that image to my model, which I've already predefined.
02:46
It's also hosted in Azure, and it will run the pet breed detector and give me a prediction of what it is. So I can just search for images of my favorite pet breed, which is Shibas, because they're so cute. Let's just pick this image. I can right-click, copy the image address, go back to my website, and paste it in.
03:04
It gives me a preview of what it is, and if I click Submit, it passes this image to my model and gives me a prediction. So you can see here, it actually predicted Shiba. So this is what we're essentially going to be building today. So how are we going to classify dogs and cats, and more importantly, pet breeds?
03:23
Well, it's really easy for us as humans to detect the difference, especially between dogs and cats. But with pet breeds, you can see between these two it might be a lot harder, even for humans. So to do this, we're actually going to be training a computer vision model to do the same thing. As humans, we've been trained to essentially tell the difference between these pet breeds and dogs.
03:43
But we're going to be doing the same thing with our computer vision model. So to do this, we're going to be going through our traditional machine learning workflow. So this is something that almost all data scientists go through. So we're going to start with our data exploration phase. So this includes things such as getting our data set, doing the data cleaning.
04:01
We're going to move on to our training step, which is where we're going to actually create our training script. We're going to be doing this in Jupyter notebooks in VS Code, and defining some compute. Obviously, you probably don't want to run this on your local machine or it'll take forever, so we're going to be running it on a more powerful GPU system. And then finally, we're going to productionize our code and deploy it to Azure in the cloud.
04:21
Like you just saw, I had the website hosted and this is our final step. So the first step, like I mentioned, is data exploration. So to do this, we're going to be using something called the Oxford Pet Data Set. So it's a data set from the University of Oxford. It has around 37 different categories of pet breeds and around 200 images per class.
04:40
So this is something we're going to be using in VS Code. So we're going to go back into our demo and go into our favorite editor and IDE, Visual Studio Code. You can see here the notebook I've already created. But to get the data science features within VS Code, you'll obviously first need to install Visual Studio Code, which is completely free and cross-platform.
05:03
It works on Mac, Windows, and Linux. And more importantly, you need to install the Python extension, because that is where all of our data science goodies are held. To do this, you just quickly click on the Extensions tab and then search for the keyword Python. It should be the first one that pops up, authored by Microsoft, and then you can quickly install it.
05:20
I've already installed it for the sake of this demo. But once you actually have it installed, you'll see this start page pop up. And here you can do things like creating your own notebook, if you want. But I already have an existing notebook, which I can click on to open up in what we call our notebook editor. So this is a brand new feature that we released near the end of last year.
05:43
And you can see it's your traditional Jupyter notebook UI-type view. So you have your cells, you have your input and output; if you have any output cells, they'll show below. You can have markdown and everything. And what's really great about the notebook editor in VS Code is that it combines the flexibility of Jupyter notebooks as a data science or just general Python editing tool,
06:05
because you can run cells out of order and everything, with the power of VS Code as an editor and IDE. One example of this is that you have full IntelliSense and autocomplete support within your cells in VS Code. So if I do, for example, import pandas as pd, and then type pd.,
06:25
you can see a lot of these autocompletions popping up as I'm writing. And if I also type os., it gives me the top suggestions of what it thinks I want to do with the os package. So if I want to make a path with something, I can type that as well.
06:42
And then it gives me these suggestions, which is really great, because traditional Jupyter notebooks often don't support this. Great. So the first step, like I mentioned with the data exploration, is actually importing our dataset onto our local machine. To do this, I have these cells; the first cell is just some generic imports I needed.
07:01
But the next part is actually importing the dataset onto my machine. So I have my dataset URL, which is pointing to that same Oxford dataset. And I've written a bunch of helper functions, as you'll see throughout this presentation in my notebook, just to clean up the code a little bit so you don't have to worry too much about it. But if you're ever interested in learning more specifically about how all the code works
07:20
and what these helper functions actually do, I'll be including a link at the end of this talk to the GitHub repo, so you can look at the code yourself in your own time. But in this cell, all the code does is download the dataset and save it into my local workspace. So you can see here, if I go to my Files tab,
07:40
you can see I have this images subfolder. I've already run the cell and pre-downloaded the data just for the sake of time. It's separated into training and validation sets, and you can see all the different pet breeds are here. So once we actually have our dataset downloaded onto our machine, one thing we want to do as data scientists is make sure that the data is correct.
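For reference, the download step described above might look roughly like the sketch below. The talk's real helper functions are in its GitHub repo, so the function name and extraction layout here are illustrative assumptions; the URL points at the Oxford pet dataset's public images archive.

```python
import tarfile
import urllib.request
from pathlib import Path

DATASET_URL = "https://www.robots.ox.ac.uk/~vgg/data/pets/data/images.tar.gz"

def download_dataset(url: str = DATASET_URL) -> Path:
    """Download the Oxford pet images and extract them into ./images."""
    archive = Path("images.tar.gz")
    if not archive.exists():
        urllib.request.urlretrieve(url, archive)  # fetch the tarball once
    with tarfile.open(archive) as tar:
        tar.extractall(".")  # creates the images/ subfolder
    return Path("images")
```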
08:01
So just do a sanity check. To do this, I've written a really simple function using matplotlib, and what the function does is plot random samples from each of the categories of images. So you can see here, I can just do a quick sanity check to make sure that these images look right: there are no really weird images, no images of, I don't know, maybe a snake or a pig.
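A sanity-check helper like the one described could be sketched as follows; the folder layout and function name are assumptions rather than the talk's actual code.

```python
import random
from pathlib import Path

import matplotlib.pyplot as plt
from PIL import Image

def plot_random_samples(data_dir: str = "images/train", n: int = 6) -> None:
    """Show one random image from each of n randomly chosen breed folders."""
    breeds = random.sample(
        [d for d in Path(data_dir).iterdir() if d.is_dir()], n)
    _, axes = plt.subplots(1, n, figsize=(3 * n, 3))
    for ax, breed in zip(axes, breeds):
        ax.imshow(Image.open(random.choice(list(breed.glob("*.jpg")))))
        ax.set_title(breed.name, fontsize=8)
        ax.axis("off")
    plt.show()
```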
08:21
So we can make sure that these images are correct. But another really great way is that we also fully support custom ipywidgets. You can see this plot using matplotlib, which is a really great library, is really static; there's no real interaction with it, you just look at it. But with ipywidgets, if we run this cell,
08:42
ipywidgets give you an essentially more interactive plotting experience. So you can see here, if I want to look at my training images more interactively, I can actually scroll through and look at my different pet breeds. So it's kind of like a more customized, interactive UI. And we fully support this within VS Code as well.
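An interactive browser along the lines of the demo could be built on ipywidgets' interact; the folder layout below is an assumption.

```python
from pathlib import Path

import matplotlib.pyplot as plt
from ipywidgets import interact
from PIL import Image

breed_names = sorted(
    d.name for d in Path("images/train").iterdir() if d.is_dir())

@interact(breed=breed_names, index=(0, 20))
def show_breed(breed: str, index: int = 0) -> None:
    """Scroll through the training images one breed at a time."""
    images = sorted(Path("images/train", breed).glob("*.jpg"))
    plt.imshow(Image.open(images[index % len(images)]))
    plt.title(f"{breed} [{index}]")
    plt.axis("off")
```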
09:02
Great. So now that we've actually imported our dataset and quickly looked through it to make sure all the data is correct, we can start transforming our data into tensors. So like I mentioned, we're building this pet breed detector, and to train it we're going to be using a Python framework called PyTorch,
09:22
which is a really popular deep learning framework. But PyTorch doesn't actually understand these image files; they need to be converted into something called tensors, which you can just think of as matrices of numbers. And we need to convert them to tensors before we can actually pass them to the model.
09:41
So to do this, usually you just want to apply some transforms. Here I'm cropping them, because all these images can be different sizes and I want to make everything uniform and standard before I pass them to the model. I'm going to crop each image, apply some transformations, and then convert it to a tensor. Again, I can run the cell by just clicking this run-cell icon,
10:01
or by using Jupyter hotkeys, which we fully support as well; if I type Shift+Enter, it'll also run the cell. Now, with these transforms it's really easy to make a quick mistake that's hard to spot, like an off-by-one error, because these are just numbers you're applying in the transforms.
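The transform pipeline being described might look roughly like this torchvision sketch; the 224x224 crop matches the size mentioned below, while the exact composition in the notebook may differ.

```python
from PIL import Image
from torchvision import transforms

transform = transforms.Compose([
    transforms.Resize(256),      # make the sizes uniform first
    transforms.CenterCrop(224),  # crop to the 224x224 the model expects
    transforms.ToTensor(),       # PIL image -> float tensor in [0, 1]
])

image = Image.open("images/train/shiba_inu/sample.jpg")  # hypothetical path
tensor = transform(image)
print(type(image).__name__, image.size)     # e.g. JpegImageFile (500, 375)
print(type(tensor).__name__, tensor.shape)  # Tensor torch.Size([3, 224, 224])
```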
10:20
So a really great way to check whether these transforms are actually performing properly is a feature we recently added called Run by Line. You can think of Run by Line as a simplified notebook debugging experience. I'll just show you with an example: there's this Run by Line icon here, and what Run by Line does is
10:40
step through your code line by line and show you the state of all your variables at each line of code. So rather than having to write print statements between all of these, maybe something like print(image.shape), and generally taking more time,
11:01
you can run Run by Line instead. You can see it's stopping at the first line of code, and it'll step through each line of code and tell you the state of all your variables, showing that state in what we call our Variable Explorer. You can see I opened up the Variable Explorer at the top of my notebook here, and I wanted to look at the variable image
11:22
because that's the one I'm most interested in. So if I look at image, you can see right now it's just a JPEG image file because in the previous cell, I just opened up a random sample image just to test to see if my transforms are working. So as you can see, if you pay attention, let me just make this bigger because I zoomed in on VS Code
11:42
just so everyone can see better. But you can see the image right now is around 500 by 375 pixels. If you remember, the first step is that I crop it to 224 by 224. So you can see as this is running, yeah, now the size is 224 by 224. So the crop actually worked,
12:02
and I can continue to step through my code. If there were an error, it would just stop and tell me, but so far there are no errors, which is great. Next I transform it into a tensor, and you can see the image afterwards is transformed: it's now a tensor type, where before it was an image type. So now I know that this tensor transformation is actually working.
12:21
And then finally at the end, it just finishes, and because I got no errors and everything looks right so far, I know I can now apply this transformation to all my images. So I just wanted to test on one image before I did it on everything. Great. So let me just close my Variable Explorer again. And the next line is where I've written a function to do the same transformation,
12:42
but instead of just that one image, it does it on all the images in my dataset. Great. So we've now finished the data exploration phase. Like you saw, we imported our dataset and did some data cleaning, where we quickly checked all our data with matplotlib just to see if everything looked right.
13:00
And then we converted everything into something that the computer, and more importantly PyTorch, will understand when we do the training steps. So now we're gonna head to the training phase, and again, we're gonna go back to VS Code for this. In the training phase, all we're doing is applying something called transfer learning.
13:20
And to do this, we're just taking a pre-trained neural network, something called ResNet-18, and setting up the model within VS Code; this is all from PyTorch as well. And then we're going to actually train the model. I've already written this training function, and again, if you want to learn more about how this training function works, please check out the GitHub repo at the end of this presentation.
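The model setup for this kind of transfer learning typically looks like the sketch below: load a pre-trained ResNet-18 and swap its final layer for the 37 breed classes. The talk's actual training function is in its GitHub repo; the frozen backbone and the optimizer choice here are assumptions.

```python
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(pretrained=True)  # pre-trained backbone

for param in model.parameters():  # freeze the pre-trained weights
    param.requires_grad = False

model.fc = nn.Linear(model.fc.in_features, 37)  # new 37-class head

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
```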
13:40
But for the sake of this demo, I've just trained it for one epoch, and you can think of epochs as iterations: the more iterations you do, generally the better accuracy you're gonna get. So I only did one, just to show a proof of concept. But you can see that training it for one iteration took almost 40 minutes, which is extremely slow
14:01
because I'm running it just on my personal laptop, and I think we can do a lot faster than this. To do this, we can leverage Azure VMs, because they have GPU compute,
14:20
and that is where these deep learning models actually thrive. To do this, I just need to download the Azure Virtual Machines extension. I can search for Azure Virtual Machines, and you can see here I already have it installed. Once you have this extension installed, you can quickly go into your Azure tab.
14:41
This Azure tab will now pop up. You can go into Virtual Machines, and you can see I've already created a virtual machine for EuroPython. What's also great is that VS Code has really great remote SSH support, so you can connect to your remote machines and do everything you're doing on your local machine live on that remote machine.
15:01
So what we're gonna do is also install the Remote - SSH extension. And then from here, you'll see this button up here that lets you connect to your remote machine. I already have this set up, but for the sake of time, I'm just gonna quickly skip over this. It will just pop up a new VS Code window.
15:22
It will look exactly the same as what you saw already, but it's now gonna be running on your remote machine. And on that remote machine, I ran the exact same training code, the exact same file, and got the same accuracy. There might be some errors with my connection,
15:40
but I got the same accuracy while running it for 10 iterations instead of one. So on my local machine it took 40 minutes to run one iteration, but on that remote Azure machine it took around 15 minutes to run 10 iterations, which is insane in terms of how much speed-up I got from using the remote machine versus my local machine.
16:03
So now that we've actually trained our model, we need to save it, which is the final portion. So I just have this basic function, which just saves the model. And once we have the model saved, we can start deploying it to the cloud.
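The save step is standard PyTorch; a minimal sketch, with the filename and helper name as assumptions:

```python
import torch

def save_model(model: torch.nn.Module,
               path: str = "model/checkpoint.pth") -> None:
    """Persist only the trained weights (the state dict), not the class."""
    torch.save(model.state_dict(), path)
```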
16:21
So we've now completed our data exploration and our training, and we're ready to productionize our model. To do this, we have this really great feature that we just released recently called Gather, and you can see the button for Gather right here. Gather is essentially a dependency analysis and code cleanup tool. What it does is look through all the cells and code in your notebook
16:40
and only extract the relevant code and lines of code, only extract the relevant cells and lines of code that are required to generate the cell. So if I click on this button, it will just give me a new file called the Gather Notebook. And you can see it only contains the code that is required to generate my model. So you can see it only contained the dataset, transforming the data and the training.
17:00
And it left out all these plots, because it realizes these are just intermediate steps that aren't needed. It left out a bunch of my markdown cells, because those aren't needed either, and it left out a bunch of imports; you can see it only kept the key ones. Finally, once you're actually in this gathered notebook
17:20
and want to productionize your code, you usually wanna convert it to a Python script, because you can't really do much with a notebook when deploying. Instead of having to copy and paste your code over, we have a feature called Export, where you can just quickly export as a Python script, and boom, it's now a Python file where you can start refactoring your code into something that
17:42
is ready for production and can be deployed to the cloud. So speaking of the cloud, we're gonna get to the deployment step now; here's the outline, which is just showing you how we're gonna do this. Previously we used VS Code and Azure Virtual Machines to leverage the compute, but now we're gonna actually make use of three Azure services:
18:03
Azure Storage, Azure Functions, and Azure Web Apps. We're gonna be using Azure Storage to actually store our model, because we want it to be kind of like a microservice where we don't have to update the entire web app each time; we can just upload the model directly here, and our API will read from this microservice.
18:22
Next we're gonna use Azure Functions, which you can think of as an API service; this is where we're actually gonna host our API endpoint for the model, and this is what the website's actually gonna call. And finally, we're gonna be using Azure Web Apps, which is the front end.
18:41
So that's the front end we saw at the beginning of the presentation, the demo where I hosted the website that actually did the prediction of the pet breed. And what's really great is that because Azure is so tightly coupled with VS Code, I can do everything to deploy from within VS Code; I don't need to go anywhere else. So again, to do this,
19:00
I just need to install the relevant Azure extensions, and again, these extensions are all completely free. You can see I already have them installed: Azure App Service, Azure Functions, Azure Storage. And the first step, like I mentioned, is to deploy my model to storage. So if I just open my Storage tab,
19:20
I can see I want to create a container to actually store my model. I've already created one called pet detector, but if you want to create your own, you can just right-click and click Create. And to deploy the model, you just right-click and click Upload Block Blob. Then from here, you can quickly scroll to where your model is. So I have my model folder right here, and it's a checkpoint.pth,
19:41
because that's what PyTorch saves it as. I can just quickly click it and then click Upload. But again, for the sake of time, you can see I've already uploaded this previously; once you click Upload, it'll show up here.
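The demo uploads through the VS Code UI; for reference, the same Block Blob upload can be scripted with the azure-storage-blob package. The container name mirrors the one in the demo, and the connection string is a placeholder.

```python
from azure.storage.blob import BlobClient

blob = BlobClient.from_connection_string(
    conn_str="<your-storage-connection-string>",  # placeholder
    container_name="pet-detector",
    blob_name="checkpoint.pth",
)
with open("model/checkpoint.pth", "rb") as data:
    blob.upload_blob(data, overwrite=True)  # re-uploading replaces the model
```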
20:02
So now that I have my model in storage, the next part is actually creating my API endpoint. To do this, I go into my Functions tab and create a new function. And if I go into my folders, you can see my function is actually contained within this inference subfolder, and there are kind of a lot of files here. It might seem a little bit daunting, but what's great is that with this Azure Functions extension, you can just click this one button here, which creates a new project. I can define where I want it to be; in this case, I created the inference one.
20:22
And here is where it automatically generates all these files for me, so I don't need to do anything. This predict.py is the only thing you need to write. When you actually create the template, it's kind of blank, and this is what you edit to make your custom API. So you can see here, this is where I'm actually just getting the image URL
20:42
and passing it through the model that I'm getting from Azure Storage, and then returning a response with what that prediction actually is. And again, once you have your Azure function created, you can quickly deploy it by right-clicking the function and clicking Deploy to Function App, and it'll just deploy to that function app.
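A predict.py along the lines described might look like the sketch below; the real handler is in the talk's GitHub repo. Here the weights are loaded from a local checkpoint for simplicity (the demo reads them from Azure Storage), and the ?img= query-parameter name is an assumption.

```python
import io
import json
import urllib.request

import azure.functions as func
import torch
from PIL import Image
from torchvision import models, transforms

# Build the model once per function instance and load the trained weights.
model = models.resnet18()
model.fc = torch.nn.Linear(model.fc.in_features, 37)
model.load_state_dict(torch.load("checkpoint.pth", map_location="cpu"))
model.eval()

transform = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
])

def main(req: func.HttpRequest) -> func.HttpResponse:
    image_url = req.params.get("img")
    if not image_url:
        return func.HttpResponse("Pass an image URL via ?img=", status_code=400)
    with urllib.request.urlopen(image_url) as resp:
        image = Image.open(io.BytesIO(resp.read())).convert("RGB")
    with torch.no_grad():
        logits = model(transform(image).unsqueeze(0))
    # The real app maps this class index back to one of the 37 breed names.
    return func.HttpResponse(
        json.dumps({"prediction": int(logits.argmax())}),
        mimetype="application/json",
    )
```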
21:00
Again, I've already deployed it for the sake of time. And again, you don't have to worry too much about the code; I'm going to be linking it at the end of this presentation if you want to take a closer look in your own time. Finally, the last part is App Service, where we're going to be hosting our front end. I just created a basic HTML file.
21:22
That was what you saw earlier. All this basic HTML file does is call the API on Azure Functions that you just saw and then display the result. And again, it's super simple to deploy: I just go to my Azure tab, where I can right-click to create a new web app, but I already have one here called Jamie PyTorch web app.
21:43
All I need to do is right-click, click Deploy to Web App, and then point it to that website folder. You can see all this folder contains is that index.html, just one file; I click Select, and then it'll deploy to the cloud. And as you can see, all of this was done within the context of VS Code.
22:03
So I'm just going to jump through a quick summary, because we're running low on time. We started off with our data exploration, which we did with the Python extension within Visual Studio Code. Next we went through our training; here, again, we stayed within Visual Studio Code, but we used Azure Virtual Machines for the compute.
22:20
That's where we leveraged GPU compute to speed things up. And then finally, to productionize our code, we used the different Azure services: Azure Functions as the API, Azure Storage to store our model, and Azure Web Apps to actually host our front end. And the key point is that this was all done within VS Code. That's why VS Code is so great:
22:40
it has all our data science tooling and meets our needs, and I didn't have to leave VS Code for anything. So what's next? Here's the link for the GitHub repo if you're interested. And if you want to try our VS Code notebooks, you can just go to aka.ms/notebooks. Those are the only two links you need to remember from this presentation. And that was it. Thank you for putting up with the technical difficulties
23:01
in the beginning, but thank you so much for your time and listening.
23:22
I'll leave the slide up if people want to take notes as well. Sorry, I think you're muted. Oh, thank you. I'm sorry about that. Thank you so much for that. Yeah, so Microsoft, Harry has a question. He said, Microsoft has announced
23:41
that Azure Notebooks will be discontinued soon. Is the VS Code editor recommended to replace the Jupyter IDE? So the Azure Notebooks service is being discontinued, but we're replacing it with something called the Azure Notebooks component, which is going to be basically a version two, a better version of Azure Notebooks, embedded within different components of Azure services.
24:02
So one example could be within Azure Machine Learning. But I would also point to Codespaces; that's something I forgot to bring up because I didn't have time, but I can quickly show it right now. I don't know if I'm still sharing my screen. Yeah, there's a thing called Codespaces, which is VS Code in the browser, and this is a really great alternative to Azure Notebooks,
24:22
where you have the same notebooks experience that you just saw in VS Code, but now it's within your browser and backed by a virtual machine. So it's called Codespaces. Awesome. Well, thank you again. Thank you very much.