Python and Async programming
Formal metadata

Title: Python and Async programming
Series: EuroPython 2016, part 6 of 169
License: CC Attribution - NonCommercial - ShareAlike 3.0 Unported
DOI: 10.5446/21200
Language: English
Transcript: English (automatically generated)
00:00
Can everyone hear me? Yeah, great. I think there was supposed to be one of the organizers here to announce me, but I don't see anyone and it's time, so I guess I'll just start. So, when I started looking into asyncio I noticed that a lot of people thought it was quite complicated to use, and especially to get started with.
00:22
So I gave this talk the subtitle "do you need to be a wizard to use it?", and the answer is: a little bit. It's not that hard, but when you're just getting started there are quite a few problems that you'll run into. Well, concurrency is intrinsically complicated, so it's not anyone's fault, but there are a lot of new concepts,
00:44
there are things you need to learn that are not just standard Python, and it has a very long history, all the way from the backport to 2.7 to the new keywords in 3.5. The syntax has changed a lot and the things you do have changed a lot, so when you try to figure out how to do something, there's not one way to do it:
01:06
there are different ways in different versions, they change quite a lot, and sometimes the documented best practice you will find is maybe deprecated now. So yeah, that can be a problem, and if you don't know the internals some things will just seem weird,
01:26
but you get used to those quite quickly. Fortunately the asyncio code on GitHub is quite easy to read, so you can look at the internals and figure those out. And there's a lot of incompatible terminology, like coroutines are generators except when they're not generators, and
01:46
not all generators are coroutines, and you have APIs like Future that implement the same API but cannot really be used interchangeably, so you'll run into trouble with those things. And since it's so new,
02:00
there's this bottom-up approach to learning asyncio: it's very common that you start by first understanding coroutines, which were originally generator-based coroutines, and have to understand all the internals before you actually start writing code with asyncio. So my idea was to try to simplify this and go through how I learned asyncio,
02:23
how I started writing things and the errors I ran into, and see how other people can start writing code with asyncio from that. I work at a web agency, and I came to asyncio because sometimes the things we have to do are concurrent in nature.
02:44
And I've noticed that some of my co-workers just started going to other languages, mainly Go, saying, well, this is too complicated to do in Python, I'll just write it in Go, create a service, and then we can call that service to solve the problem. But
03:02
I realized that Python just added this whole thing for doing concurrency that has changed the way concurrency is supposed to be done in Python, and we were not really using it. So I tried to see, well, can we do this with asyncio instead? And some of the problems we're trying to solve are things like
03:23
creating a back-end service for real time with server-sent events or WebSockets. That would look something like this: a very simple echo server, but it's not very far from something you could actually use to run a WebSocket server.
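A minimal sketch of such an echo server, close to the example in the asyncio documentation (the exact slide code isn't in the transcript, so details like the host and port are mine):

    import asyncio

    async def handle_echo(reader, writer):
        # Read a chunk from the client and write it straight back.
        data = await reader.read(1024)
        writer.write(data)
        await writer.drain()
        writer.close()

    loop = asyncio.get_event_loop()
    # start_server gives back a Server object once it is listening.
    server = loop.run_until_complete(
        asyncio.start_server(handle_echo, '127.0.0.1', 8888))
    try:
        loop.run_forever()
    finally:
        server.close()
        loop.run_until_complete(server.wait_closed())
        loop.close()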
03:42
We could, for example, try to fetch data from several APIs concurrently, which I think is something very common, especially when you're doing service-oriented architecture. You may have tons of APIs that you want to get data from, and if each one of these takes about a hundred milliseconds and you want to query five to ten of them, you might end up spending a whole second
04:05
just waiting for IO. Well, you could do this concurrently. This is quite easy to do with asyncio; it would look something like this. You don't have to pay too much attention to this example, I'll look at these in more detail later.
04:22
You can do some sort of pipeline processing, which is very similar to before: instead of just fetching the URLs, you may want to process them afterwards. Maybe you want to put them in Redis, maybe you want to parse them as JSON, and all these things you can do asynchronously.
04:41
You could copy a site to disk: again, fetch a bunch of URLs, then lock a file and write to that file. Or you may want to have a complex data structure in memory that you want to access, but every now and then it gets loaded from a store somewhere,
05:02
and that would look something like this. This is just sample code, but it would work in roughly the same way. So as you can see, all of these examples follow kind of the same pattern, and that pattern is the concurrency
05:21
model that we're going to be using. There are several models you could use, different languages implement different ones, and they're normally a combination of different concurrency models. So I think asyncio is basically a formalization of how we should do concurrency in Python.
05:45
Now, I think there are two parts to a concurrency model. The first one is how we write and understand our code; I think this is the most important one, the one we should focus on. You can see that as an API: if you wanted to write an API for concurrency,
06:03
you could come up with something like this: execute something in the foreground, execute something in the background, wait for a result, or schedule execution for later, and you could write some functions to do that. This is just pseudocode of something we might want to do if we wanted that concurrency model.
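As a rough illustration only, that hypothetical API might read like this; none of these functions exist anywhere, they are pseudocode for the four operations just mentioned:

    # Hypothetical concurrency API (pseudocode, not a real library).
    result = run_foreground(job)          # run now, block until it is done
    handle = run_background(job)          # start it and keep going
    result = wait_for_result(handle)      # block until a background job finishes
    schedule_later(job, delay=5.0)        # ask for it to run at some later time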
06:21
If we adopted that concurrency model, it would yield programs that look a certain way; asyncio gives us programs that look the asyncio way, and that becomes a unified model of how we do concurrency in Python. The other part of a concurrency model is when the code runs,
06:45
and it can run traditionally in threads and processes, it can run on different machines if you use something like MPI, or it can run on an event loop, and the event loop is a big part of what the asyncio model builds upon. I'm not going to look too much into this part of things, because I don't think it
07:04
matters that much. I'm going to look into the other one, that is, how the code is going to look and how we're going to understand that the code is correct and does what we want from the point of view of concurrency; these things might be interchangeable later. So let's start looking a little bit at Python's answers to these questions.
07:22
But before we do that, I want to tell you what async programming in Python is not, and that is just one thing: it's not a solution to your GIL problems. So if you are writing code that spends a lot of time in the CPython interpreter,
07:41
you're still going to run into the global interpreter lock, and this is not going to save you from it. There are some projects, like the Gilectomy and PyPy-STM, which uses software transactional memory, that are trying to fix this in different ways, but we'll have to see what they come up with. For now, if you're going to
08:00
do something where the GIL is getting in the way, then you'll just have to go and run it in different processes, like we've been doing for years. So, okay, let's get started, let's write some code. But before we do that, I wanted to discuss this one just to get it out of the way:
08:23
we said that things can run in threads and processes, and that's fine. We can create thread pools, and with thread pools we can tell them to execute something, and it will look something like this.
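A hedged sketch of that kind of thread-pool code, using concurrent.futures (the function being submitted is my own placeholder):

    from concurrent.futures import ThreadPoolExecutor

    def blocking_work(n):
        # An ordinary, blocking function.
        return n * n

    with ThreadPoolExecutor(max_workers=4) as pool:
        future = pool.submit(blocking_work, 10)   # runs in a worker thread
        print(future.result())                    # blocks until the result is ready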
08:43
But we can also run things on an event loop, and this is the part that is new. If we want to run something in an event loop, we first need to know what an event loop is. Anybody who has worked with JavaScript probably knows this, and many of you probably do, but in case you don't, it looks something like this: it's just a loop that will run essentially forever,
09:00
and it will pick a task based on some policy and ask that task to be executed, and that task will execute until it decides to suspend execution. As you can see, there's nothing here that is preemptive like with threads: the task will execute, and eventually that task will have to tell
09:22
the loop that it has returned from a function so that the loop can continue running; otherwise the loop will be blocked right there. Once the task has suspended execution, the loop can reschedule it, decide whether the task has a result and whether it wants to do something with that result (maybe we want to store it), or otherwise just put the task back in the queue and select it again when we loop back around.
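In pseudocode, the cooperative loop being described is roughly this; it is a conceptual simplification, not asyncio's actual implementation:

    # Conceptual event loop, not the real asyncio code.
    while True:
        task = pick_next_task(queue)           # chosen according to some policy
        task.run_until_it_suspends()           # runs until the task itself yields
        if task.done():
            store_result(task.result())        # keep the result if we care about it
        else:
            queue.append(task)                 # otherwise put it back for the next pass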
09:44
If we want to use an event loop in Python, it looks kind of like this: you have a function, and you can define that function with the async keyword in Python 3.5, and when you
10:00
define a function that is supposed to run asynchronously, the async keyword marks it as such, so that function will not execute unless a loop calls it. Then we have the await keyword to tell the loop when it's supposed to suspend execution. So we get an event loop and we tell it to run that function until it completes, and that actually works.
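A minimal sketch of that first example (the coroutine name is mine):

    import asyncio

    async def say_hello():
        await asyncio.sleep(1)   # suspend here; the loop is free to do other work
        print('hello')

    loop = asyncio.get_event_loop()
    loop.run_until_complete(say_hello())
    loop.close()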
10:24
If we wanted to run these things on threads instead, we don't write asynchronous functions; we take a normal blocking function that we want to run on a thread, and we have this utility called run_in_executor that allows us to just run that function, and it returns something awaitable that the loop knows how to handle.
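A sketch of that, assuming some blocking function of our own called blocking_io:

    import asyncio
    import time

    def blocking_io():
        time.sleep(1)            # a normal, blocking call
        return 'done'

    loop = asyncio.get_event_loop()
    # None means "use the loop's default executor", which is a thread pool.
    result = loop.run_until_complete(loop.run_in_executor(None, blocking_io))
    print(result)
    loop.close()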
10:42
So this gives us the flexibility of being able to write code that looks the same way, and then choose whether it's going to run in threads, in processes, or on an event loop, and have everything controlled through this event loop abstraction.
11:03
Now, let's start writing some async code. What do we need to know? We need to know the async and await syntax. If we want to have a function that is not asynchronous, that just blocks, then that's fine: we just write a regular function and call it whenever we need it. If we want to have a function that runs on an event loop and that can suspend execution,
11:24
then instead we use the async keyword, and that function will not run by itself. Inside that function we can use await, and await is very similar to a return, except that when the awaited thing comes back, execution continues from there. So we tell the loop that we will suspend execution until
11:44
some other function or awaitable has finished; in this case, it will suspend execution on that executor where we're running the blocking function. One of the advantages of this is that this way we can specify
12:04
explicitly when our code will suspend and when it will execute. If we were to run this with threads, the operating system would decide when the function stops and when it doesn't, so we would have to be a lot more careful in terms of synchronization, whereas here the function will continue to run until it decides by itself that it will stop.
12:25
The other building block that we will need with this is futures. A future is basically a wrapper that has three methods: you can check if it's done, you can check if it has a result, or you can add a result to it.
12:44
So when you have a future and you check if it's done, it normally isn't, unless you have added a result to it. If you ask for a result and it doesn't have one, it will raise an exception; otherwise you can set a result and then you will get that result back.
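That behaviour, sketched with asyncio.Future:

    import asyncio

    loop = asyncio.get_event_loop()
    fut = asyncio.Future(loop=loop)
    print(fut.done())      # False: no result has been set yet
    # fut.result() at this point would raise InvalidStateError
    fut.set_result(42)
    print(fut.done())      # True
    print(fut.result())    # 42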
13:03
Now that we know this background information, let's try to get a loop and start running things. We have a loop, and we want to create it, run it, stop it, and schedule things on it. To create a loop we use asyncio.get_event_loop, and that's it: now we have a loop. It's not doing anything, but we have one.
13:23
Now if you want to run it, you can just call run_forever on it, and it will do as it says: it will run forever, unless there's an exception inside the loop, in which case it will just throw the exception. Now, if we want to stop a loop, and I think this is a little bit confusing,
13:42
a loop can only be stopped while it's running. So if you have a loop and you want to tell it to stop, after you ask it to be marked as stopping you need to ask it to run once, otherwise the loop will not be considered stopped. So if you want to stop a loop that is not currently running, you have to ask it to run forever, and it will run once and then stop.
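The quirk looks like this (a sketch; the exact behaviour depends a bit on the asyncio version):

    import asyncio

    loop = asyncio.get_event_loop()
    loop.stop()          # only marks the loop as stopping; nothing runs yet
    loop.run_forever()   # runs one iteration, notices the stop request, returns
    loop.close()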
14:04
This is confusing; it's the type of thing where you have to look at the internals to understand how it works. But most of the time you don't have to do this. Instead, what you do is something like this: you run things on a loop instead of managing it manually.
14:23
So let's say we have this function that we want to run: it just prints a bunch of numbers and waits one second between each number. You ask the loop to run it until it completes, and that's it, it prints what we expected it to. It's just a little bit boring.
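A sketch of that first, boring example (the coroutine name and output are mine):

    import asyncio

    async def countdown(label, n):
        while n > 0:
            print(label, n)
            await asyncio.sleep(1)   # hand control back to the loop for a second
            n -= 1

    loop = asyncio.get_event_loop()
    loop.run_until_complete(countdown('task', 3))
    loop.close()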
14:42
So instead, since we've been talking about concurrency, what we want to do is run something more like this: we want to have two of those functions running concurrently, and to do that we can use ensure_future, which will just create a future and attach it to the loop. We write an extra go function, because so far we only know how to run one thing, so we say, well,
15:02
okay, loop, run this go function. And when we execute that, it actually stops very quickly and gives us an error. The reason this is happening is that we created these two futures, but nobody is waiting for them. So when the loop runs, it runs the go function,
15:22
and it knows that it will stop when the go function completes. That function creates one future and doesn't wait for it, creates the second future and doesn't wait for it, and then it completes, so the loop stops running. And if you're debugging asyncio, you will get these nice, helpful messages.
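The failing attempt would be something like this, reusing the countdown coroutine from the sketch above; go returns immediately, so the loop stops and complains about tasks that were never awaited:

    import asyncio

    async def go():
        # Schedule two countdowns on the loop, but never await them.
        asyncio.ensure_future(countdown('a', 3))
        asyncio.ensure_future(countdown('b', 3))
        # go() ends here, so run_until_complete stops the loop right away.

    loop = asyncio.get_event_loop()
    loop.run_until_complete(go())   # warns about pending tasks being destroyed
    loop.close()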
15:40
So instead, what we can do is just await them: we use await, and we tell the loop, well, I want to run these two things but I want to wait for them, don't end go until they are complete. If you do that it will work, but if you look at the result you can see that it is actually running one task first and the second task afterwards,
16:01
so it's not really concurrent, it's just sequential, and you might as well just have written regular functions. So instead, what we can do is try to figure this out, go back to the specification, and see that we have asyncio.as_completed. We can create the tasks that we want to run and then ask asyncio
16:22
to give them back to us in a for loop in the order they complete, and when one of them completes we get the future, we wait for its result, and then this works. Except it's really ugly: this doesn't really look very Pythonic, and it's very complicated for just running a couple of things concurrently.
16:43
So what we have instead are these aggregators, or helper tools, of asyncio, where we can do things like asyncio.wait, which will just take a bunch of coroutines or a list of tasks and not return until all of them have completed.
17:02
So if we run something like that, we're actually running concurrently now. It still looks quite ugly, but we can realize that if we're awaiting asyncio.wait, then that means asyncio.wait is an
17:21
awaitable, and that means we can run it directly on our loop. Then it would look something like this, and this looks a lot more like an API we would want to work with: create a bunch of tasks and ask the loop to wait for all of them and execute them concurrently.
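Roughly like this, again reusing the countdown coroutine from above:

    import asyncio

    loop = asyncio.get_event_loop()
    tasks = [
        asyncio.ensure_future(countdown('a', 3)),
        asyncio.ensure_future(countdown('b', 3)),
    ]
    # asyncio.wait is itself awaitable, so the loop can run it directly;
    # it only returns once every task in the list has completed.
    loop.run_until_complete(asyncio.wait(tasks))
    loop.close()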
17:42
Okay, I wanted to make an aside here, and remember the internals and how they can be complicated. We also have this function for a loop to run once, instead of running forever or until something completes. We shouldn't call it, it's an underscore function, but it's there. So you call it, and then you can see that you can run a loop step by step.
18:04
So whenever you're debugging asyncio, you can just start running your loop step by step and printing stuff, and you'll see what's actually happening. Okay, but those are quite boring functions, they're just printing stuff, so let's try to build something that retrieves some results.
18:20
So I changed the function a little bit: now it's a countdown, and it just multiplies the numbers as it goes down and returns the result. We create our list of tasks and we wait on them, and we assume the tasks are going to have some information for us when we get back. Thing is, what we get back are these coroutines, and they're coroutines that have already executed, so they're not very useful;
18:43
we can't really get any information from them. So we can go back and say, well, I remember futures were something I could use for knowing the results of things that haven't happened yet. So we create them as futures instead and run that, and yeah, now we have these tasks, and a task is just
19:04
a coroutine wrapped in a future that the loop is using internally, and that's good: we can access things from tasks. So we iterate over them and that's it, we get the results there. It's just that we made this look ugly again.
19:21
So instead, what we can do is go back to our list of asyncio aggregators (they're not actually called aggregators, by the way, that's just the name I give them), and we have this one called gather that does something very similar to wait, except it will actually return all of the results.
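A sketch with the countdown variant that returns a value, as described:

    import asyncio

    async def countdown(n):
        total = 1
        while n > 0:
            total *= n               # multiply the numbers on the way down
            await asyncio.sleep(1)
            n -= 1
        return total

    loop = asyncio.get_event_loop()
    # gather runs the coroutines concurrently and returns their results in order.
    results = loop.run_until_complete(asyncio.gather(countdown(3), countdown(5)))
    print(results)                   # [6, 120]
    loop.close()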
19:45
And since the loop returns whatever the function it was running returns, we can just print those results, and it does the same thing as before, except it just looks a lot better now. Okay, what have we learned so far? We have async and await: we can create awaitable functions whose execution can be suspended, and
20:04
with await we can suspend until another awaitable returns. We have a loop that we can create; we can ask it to run something or run forever, and we can stop it. And we have these functions that allow us to wait for things,
20:20
so that we can run many things concurrently and at the same time get their results afterwards, or just make sure that we wait for all of them. So okay, that's good. With that, we can probably start writing some slightly more realistic code, and we can
20:43
write something like this. This is the example from before, just fetching a bunch of URLs: you want to get those URLs, you want to get them all at the same time, or concurrently, and then just gather their results. You could do something like that.
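A hedged sketch of that fetch-everything example, assuming aiohttp as the HTTP client (the slide may have used something else):

    import asyncio
    import aiohttp

    async def fetch(session, url):
        async with session.get(url) as response:
            return await response.text()

    async def fetch_all(urls):
        async with aiohttp.ClientSession() as session:
            # Fetch every URL concurrently and collect the bodies in order.
            return await asyncio.gather(*(fetch(session, url) for url in urls))

    urls = ['http://example.com', 'http://example.org']
    loop = asyncio.get_event_loop()
    pages = loop.run_until_complete(fetch_all(urls))
    loop.close()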
21:00
And this works: it will fetch all the URLs, and all the URLs will be there at the end. The only thing is that maybe we don't care about all of those URLs, so we can try to do something a little more complex and say, well, maybe I just care about the first URL to return. So let's write something that solves that: we can write first_completed,
21:20
and it's very similar to before. We just go back to this as_completed idea, which takes a timeout, and we wait for those coroutines to complete, and when the first one completes we just return it, and that's it. This looks like it would do the job.
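My reconstruction of that naive first_completed, with a dummy delayed coroutine standing in for the real API calls:

    import asyncio

    async def first_completed(coros, timeout=None):
        # Return the result of whichever awaitable finishes first.
        for future in asyncio.as_completed(coros, timeout=timeout):
            return await future

    async def delayed(value, delay):
        await asyncio.sleep(delay)
        return value

    loop = asyncio.get_event_loop()
    print(loop.run_until_complete(
        first_completed([delayed('slow', 2), delayed('fast', 0.1)])))
    # Prints 'fast', but the 'slow' task is left dangling: the cleanup
    # problem discussed next.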
21:43
Except when we execute it, we get a lot of errors. It is actually returning something (you can see the HTML up there that was supposed to be returned), but then we get all these tasks that have been destroyed and the loop that hasn't been closed. The thing is, with asyncio you have to do explicit cleanup: you're supposed to close your own loops and clean up your own mess.
22:05
So let's start just by closing the loop manually, so that it will internally clean up some stuff for us, and that looks better: at least we got rid of one error, but we still have these tasks pending. The reason those tasks are pending is that, if we
22:26
go back to our original code, you can see that we're returning after the first task completes.
22:41
So the other ones are still in the loop and are supposed to be executed, but we're never waiting for them; no one is waiting for those tasks. So what we can do instead is something like this: we say, well, we're going to execute this exact same code, except we're going to wait for all the tasks before this function returns.
23:01
And that works, that gives us the right result. The only problem is that this takes as long as the longest wait, and if some APIs are slower than others, say some return in one millisecond and some return in 100 milliseconds, then you're going to get the first one, that's the result you're returning, but you're going to have to wait 100 milliseconds until all the other ones complete.
23:22
So instead, what you can do is something like this: as you start working with it, you write another function and say, well, now I want to cancel all my tasks, and then within your cleanup you first run your function that returns what you want, great, and then in the cleanup, before closing the loop, you
23:44
cancel all the existing tasks. And yeah, that works. It's not nice, it's not very Pythonic anymore, I think. So instead, there is something else we have if we go back to the API:
24:01
we used wait before, and wait actually has other parameters, like a timeout and return_when, and we can tell wait to return when the first task completes. What wait returns is actually two sets of futures: one for the tasks that have completed and one for the tasks that are still pending.
24:22
So we do something like that: for the pending tasks we ask them to cancel, and for the completed ones we just get the first future and return it. This will work; it prints what we want and gets us the result we want.
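A sketch of that version, reusing the delayed helper from above:

    import asyncio

    async def first_completed(coros):
        done, pending = await asyncio.wait(
            coros, return_when=asyncio.FIRST_COMPLETED)
        for task in pending:
            task.cancel()              # we no longer care about the slow ones
        return done.pop().result()     # the (here: only) completed future

    loop = asyncio.get_event_loop()
    print(loop.run_until_complete(
        first_completed([delayed('slow', 2), delayed('fast', 0.1)])))
    # Depending on the asyncio version you may want to give the loop one more
    # iteration so the cancellations are actually processed before closing.
    loop.close()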
24:43
The only issue is that this example is very well matched to what wait does. If we wanted to return not just the first completed but maybe the first two, then this doesn't do the job anymore, so we would probably have to go back to something like cancel-all, or write our own function that deals with these things. But it's good to have these
25:02
separate options: we can either do wait and then deal with the pending ones, or we can just write our function and then cancel the other ones, or we could cancel them manually somewhere else. But yeah, whenever we're working with asyncio, the
25:24
important thing is that we have to do cleanup, and it's not always trivial. We have to close everything: whenever we use sockets, readers, executors, everything has to be closed manually; we shouldn't expect the framework to close things for us, and
25:41
we should know where our tasks are. So okay, I'm going to make this a little bit more fun before we end. Now we have this first_completed API, and let's say this is a very critical part of our code: we normally query
26:01
hundreds of APIs, and it's very important that we just get the first one quickly. We might have something else in our stack that is not Python and wants to use this, so let's just build a server that returns this. For this we're going to use the same code
26:22
as before, and then we're going to try to get a server that serves first_completed. So we go to the documentation and we get this: it's basically the easiest server you can build, taken right from the documentation and slightly modified.
26:42
Right now this is an echo server; it runs on asyncio, it runs on a loop, and if we want to build our own function into it, we can just do something like this instead, where everything is exactly the same as the echo server
27:02
we worked on before; we just changed the code here in the middle so that it calls first_completed and awaits it. And this works: you can run this code, it will run a server, and you can netcat to it.
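A sketch of that wiring, assuming the first_completed coroutine from before plus some fetch_url coroutine and URLS list of our own:

    import asyncio

    async def handle_request(reader, writer):
        await reader.readline()        # whatever the client sends, we ignore it
        result = await first_completed([fetch_url(u) for u in URLS])
        writer.write(result.encode())
        await writer.drain()
        writer.close()

    loop = asyncio.get_event_loop()
    server = loop.run_until_complete(
        asyncio.start_server(handle_request, '127.0.0.1', 8888))
    try:
        loop.run_forever()
    finally:
        server.close()
        loop.run_until_complete(server.wait_closed())
        loop.close()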
27:20
And the good thing is that, right from the documentation example, just by adding a function there, you have a working server that executes and can handle a lot of requests. But if we wanted to do HTTP instead of using just plain sockets,
27:40
we can start looking at asyncio libraries, for example aiohttp. Again, it's just the same: this is the basic aiohttp server, we just hook our function in there, and we have a completely asynchronous server that returns results while running and fetching things concurrently.
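Roughly, with aiohttp's web server (a sketch; the handler name and the first_completed, fetch_url, and URLS helpers are the same assumptions as before):

    from aiohttp import web

    async def handle(request):
        result = await first_completed([fetch_url(u) for u in URLS])
        return web.Response(text=result)

    app = web.Application()
    app.router.add_route('GET', '/', handle)
    web.run_app(app, port=8080)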
28:04
So okay, is this overkill? Yeah, probably; we don't really need to build a server for this. But it's good that we can, and we can do it while running everything within one thread, on one loop, while using the same
28:22
framework that we would use for all concurrency in Python. For other use cases, maybe something we would want to do instead is just run the loop on a thread, and we can do that too: we can have our synchronous code, your normal Python code that will be running, and we can put the loop on a separate thread and then assign things to it for it to
28:46
run them, and then get them back. Things get a little bit more complicated here, but this will work; you just have to make sure you clean things up, and make sure you always call into the loop in a thread-safe way.
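One way that pattern can look, using run_coroutine_threadsafe (details such as the daemon thread are my own choices):

    import asyncio
    import threading

    loop = asyncio.new_event_loop()

    def run_loop():
        asyncio.set_event_loop(loop)
        loop.run_forever()

    thread = threading.Thread(target=run_loop, daemon=True)
    thread.start()

    # From the synchronous code: hand a coroutine to the loop thread-safely
    # and get a concurrent.futures.Future back.
    future = asyncio.run_coroutine_threadsafe(asyncio.sleep(1, result=42), loop)
    print(future.result())                 # blocks this thread until it is done

    loop.call_soon_threadsafe(loop.stop)   # shut the loop down cleanly
    thread.join()
    loop.close()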
29:07
There's a lot more to asyncio that I'm not going to have time to go into: there are low-level APIs, there are queues specifically for synchronization, which are quite useful, you can build your own protocols for your servers, and there are a lot of cool hacks. You should go check out the internals: if you go to
29:26
github.com/python/asyncio you have the actual code there, which is very easy to read, and that's basically the whole implementation. Out of the external stuff for asyncio, the one thing I wanted to mention
29:41
specifically is the possibility of creating alternative event loops: we can use this framework while using event loops that do not come with asyncio, and one of those is uvloop, which,
30:01
yeah, replaces a lot of things internally, but it claims to be very, very fast, and if you look at their benchmarks it's up there, faster than Node.js and about as fast as Go. And this is something that you can just plug in.
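Plugging it in is just a couple of lines; this is uvloop's documented usage:

    import asyncio
    import uvloop

    # Every event loop created from now on is a uvloop one; nothing else changes.
    asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())

    loop = asyncio.get_event_loop()   # now backed by uvloop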
30:24
You can run any of the code we've been writing, just change the loop and use uvloop instead, and it will work with a different loop. So the flexibility this gives us is that we can write the same code and understand the code using this framework, but at the same time we can change the internals that run it in different ways and change how it executes.
30:44
And there are a lot of third-party libraries for pretty much everything you want. So if you want to start playing with asyncio, get some of these and start writing some asynchronous code. Thank you.