Kotlin coroutines
Formal metadata

Number of parts: 52
License: CC Attribution 3.0 Unported: You may use, modify, and reproduce the work or its contents for any legal purpose, and distribute and make them publicly available in original or modified form, provided you credit the author/rights holder in the manner they specify.
Identifiers: 10.5446/47755 (DOI)
Transcript: English (automatically generated)
00:05
Does the sound work correctly? Is it on? Okay, so let's start. As I was introduced, my name is Svetlana Isakova and I'm a developer advocate at JetBrains. Today we're going to talk about Kotlin coroutines.
00:21
Coroutines are the key new feature introduced in the latest version of Kotlin, release 1.1, and their goal is to simplify asynchronous programming. It's important to note that we didn't invent this: the Kotlin team didn't invent the concept of coroutines.
00:42
The concept has existed for quite a while already: it was first introduced in the 1960s. At that time, coroutines were used to model concurrency before we had threads. In a single-threaded environment, languages used coroutines to model what we now call asynchronous programming.
01:05
What we now do with threads. In the classic literature, coroutines, independent parts of the program which can interact with each other, are opposed to a main routine that just calls and uses subroutines.
01:21
So then there is a question: why did we need to bring this concept back from the past now, when we already have threads and other means of doing asynchronous programming? The goal of this talk is to answer this why question. We'll start with motivation.
01:43
And we'll look at async/await. Some of you are probably already familiar with what async and await do: async/await is a feature that is already present in some languages, like C#. But if you aren't, I will explain it in detail in this talk.
02:03
So again, async/await is a feature that is already used in other languages. What do we use it for? Here is a very trivial, straightforward example: we're just loading an image and then showing it. It's very simple and very straightforward, but it's wrong.
02:23
Why is it wrong? Because loading an image is a time-consuming operation, and our user will be blocked if we try to do this loading on the main thread. That's a common problem, and there are common solutions to it. There isn't just one solution, but a whole pile of different
02:47
solutions, and they all use the callback approach: you extract the rest of your computation into a callback and explicitly say that something has to be done after this time-consuming
03:02
operation finishes. Async/await provides another solution to the same problem. Now, instead of using callbacks, you can just await the computation. You can say: okay, at this point, I will await this time-consuming computation without blocking my
03:23
main thread. So there is no callback. Now we can solve this problem without callbacks. I haven't yet explained how async/await works; we will return to it a little later. My point here is that our goal is to avoid callbacks and to provide
03:45
you a more direct way to do asynchronous programming. A nice goal. Now we're ready to discuss how we can achieve this goal, how we support this async/await. But before that, as I already mentioned, async and await are a feature already available
04:03
in different languages like C#. In C#, async and await are keywords, language keywords, so they are built into the language. If you are familiar with C# and with this feature, you can use it in Kotlin in a similar way, in a similar fashion.
04:24
Because we have the same async and await. However, in Kotlin, they are not keywords in the language; they are just regular functions. You see here that the code looks very similar, so you can express the same ideas. But in Kotlin, it's not a built-in async/await feature, it's a feature of coroutines.
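A minimal sketch of async and await as plain library functions, in the spirit of the talk's image example. The `loadImage` name here is a hypothetical stand-in, and the snippet assumes the kotlinx-coroutines-core library in its current form (the talk's 1.1-era experimental API used different package names):

```kotlin
import kotlinx.coroutines.async
import kotlinx.coroutines.runBlocking

// Hypothetical stand-in for the talk's image loading; a real version would do I/O.
suspend fun loadImage(name: String): String = "image:$name"

fun main() = runBlocking {
    // `async` is an ordinary library function that starts a new coroutine...
    val deferred = async { loadImage("cat") }
    // ...and `await` is an ordinary suspend function on the object it returns.
    val image = deferred.await()
    println(image)  // prints "image:cat"
}
```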
04:48
So in Kotlin, we provide the basic support for coroutines at the language level, and we can implement features from different languages, like async/await, in the
05:00
language library. Again, async/await is a feature that has proved useful in C# for quite a while already. We know from that world that it works, and now we can use it in Kotlin, in our JVM world. So now let's discuss what a coroutine is, how it provides the support for async/await,
05:24
and how it all works.
05:42
We're going to compare threads and coroutines: how a coroutine is similar to a thread, how it is different from a thread, and how they interact with each other. As a first level of understanding of what a coroutine is: a coroutine is very similar to a thread.
06:01
It's not a strict definition of a thread, just a description, but the same description works for a coroutine as well. From the point of view of a developer, a coroutine is just a sequence of instructions. However, different coroutines can run independently and interact with each other, and they're all managed:
06:22
threads are managed by the scheduler, coroutines are managed by the Kotlin compiler. The first level of understanding is that a coroutine is something like a thread. However, it's more lightweight: a coroutine takes much less memory and requires far fewer resources than a thread.
06:42
A thread usually requires one to two megabytes of memory; a coroutine requires more like kilobytes, so much less, and you can create many more coroutines in your application than threads. We have a very simple example in our documentation: if you try to create
07:02
100,000 threads, your program throws an out-of-memory error; however, it works fine with coroutines. So now you're intrigued and can go read our documentation. In general, yes, coroutines require less memory and you can have more of them in an application.
07:21
Okay, so now we know that a coroutine is similar to a thread, and we're going to discuss how it is different from a thread and how they interact with each other. A convenient way to think of a coroutine is as a computation that can be suspended.
07:43
I will use the following notation during this presentation: a line to draw a thread, like here, and a block to draw a computation, a general computation that can be a coroutine or something else. And a coroutine is a computation that can be suspended.
08:02
So somehow we can take this coroutine and put it away from our thread. Why? Why suspend? To answer this question, let's first discuss how, in general, we can do asynchronous computations.
08:23
A very naive approach would be to start a new thread for every new computation. This works, and it is very simple and straightforward; however, it's too expensive: threads are expensive and you just cannot create new threads all the time.
08:43
So in general, we as developers would like to reuse threads, and an executor helps us here. An executor represents a construct with a fixed number of threads to which we can submit our computations,
09:01
and in this case, threads will be reused. That's better in terms of performance and everything. The only difficulty is managing dependencies. Before Java 8, before CompletableFuture, and without Rx, it's really difficult to express dependencies between your computations.
09:27
For instance, imagine an example where we have one general computation that wants to start two other actions, preferably asynchronously, and then wait for their results,
09:42
and we want to implement it using an executor, using threads. Without RxJava or CompletableFuture, it's difficult. With RxJava or CompletableFuture, you just cut your computation into two parts: you store the rest of your computation in a special callback, and it works.
10:05
You can say: after these computations are completed, please call this callback. And it works. However, let's take a step back and think about
10:20
what was wrong with just blocking the thread. The picture is very simple, and the code for it is rather straightforward: just start two things and then use the results. But if we do it with threads, it would again be too expensive, because the first thread will be idle.
10:44
It won't be reused for anything else, and that's not good in terms of performance. And if our main thread is the UI thread, it's even worse, because in this case our user will be blocked. So that's not how we do things. To have better performance and better responsiveness,
11:03
we somehow split these computations into callbacks, into several computations, et cetera. But let's imagine: what if we could take the same approach, but with something more lightweight than threads?
11:20
And here you can see the power of coroutines. In essence, you can express this simple pattern in very straightforward code, but with a concept that is more lightweight than threads. With coroutines, you can write direct code, like in this example,
11:44
and instead of blocking the thread, you just suspend the coroutine: you take your computation, put it away somewhere, and then, when the result is ready, you continue it. That's why you don't need the callbacks.
12:02
The compiler does all this work for you; you can express the code in a straightforward manner. And with UI threads, it works in the same way. Okay, I suppose you are a bit tired of just pictures, so now we are going to look at some code.
12:22
First, I want to emphasise that Kotlin's support for coroutines comes with support for the suspend keyword. In Kotlin, with coroutines, you can mark functions as suspend,
12:41
and that means that such a function represents a computation that can be suspended, a computation that can be put away somehow. Now, let's go back to the code of our image example and see how it works in terms of these suspended computations.
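A sketch of such a declaration, with hypothetical names; `delay` (from kotlinx.coroutines) stands in for any long-running work:

```kotlin
import kotlinx.coroutines.delay
import kotlinx.coroutines.runBlocking

// `suspend` marks a function whose execution can be paused and resumed.
suspend fun loadImage(name: String): String {
    delay(100)           // suspension point: the coroutine is parked, its thread is freed
    return "image:$name" // execution resumes here once the delay completes
}

fun main() = runBlocking {
    println(loadImage("cat"))  // prints "image:cat" after roughly 100 ms
}
```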
13:04
So here is our example. We have two functions, async and await, and now we are ready to understand what exactly they do and how they work. Async, in essence, just starts a new computation. When you say async something, for now
13:26
we don't think about where it starts or on what executor; we will return to that a bit later. In general, you can think of it as: okay, it just creates this new computation and starts it somewhere.
13:40
And await is the point that suspends the computation. Await is called a suspending call, and actually await is declared as a suspend function. You might ask how you can call this await, but if we look at this example, you can see that
14:00
we call this loadImageAsync. loadImageAsync actually returns an object of the Deferred type, and this Deferred declares a suspend method await. So await is just a suspend function.
14:20
It's a library function declared as a suspend function, and it serves as the marker that tells the compiler to suspend the computation. For simplicity, let's extract the result of loadImageAsync into a separate variable so we can understand more easily what's going on.
14:41
So it's the same code, and it's very similar to how the code is generated, because when the Kotlin compiler generates the bytecode, it extracts it into a variable anyway. Now let's follow along and understand what's going on here. At first, we just start a new computation; we now know that async starts a new computation,
15:02
and this loadImageAsync is some new computation started somewhere, and we just have a reference to it. And when execution reaches this await call, await is the marker that tells the compiler:
15:22
okay, now suspend this computation until the result is ready. So when the await is reached, the blue processImage computation is put aside, and we continue until the image is loaded.
15:44
Await suspends the computation and continues it when the result is ready. When the image is loaded, when the green computation completes, the generated code understands that and returns the processImage computation to a thread.
16:04
It continues it. So now you have straightforward code without callbacks that works much like it would with a thread, but instead of blocking the thread, it suspends your computation using coroutines.
16:20
There can be a question: on which thread does it continue the computation? The answer is that you specify it. And I have to admit here that I somewhat lied on the previous slide, because, at least for now, you cannot use async without this first parameter
16:41
that specifies where you want to start your coroutine. For now, you have to specify the executor on which you want to start your coroutines, and here we can use CommonPool, which contains a regular fork-join pool under the hood.
17:04
And that means that your coroutines can be started and continued on any available thread. Here you can see that a coroutine can be started on one thread and continued on another thread that is free at that point.
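A sketch of specifying where the coroutine runs. The talk's 1.1-era experimental API spelled this `async(CommonPool)`; the current kotlinx.coroutines API names the shared background pool `Dispatchers.Default`:

```kotlin
import kotlinx.coroutines.*

fun main() = runBlocking {
    // The shared pool: the coroutine may start on one pool thread
    // and, after a suspension, resume on a different one.
    val deferred = async(Dispatchers.Default) {
        println("started on ${Thread.currentThread().name}")
        delay(10)  // suspension point
        println("resumed on ${Thread.currentThread().name}")
        42
    }
    println(deferred.await())  // prints 42
}
```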
17:21
And that means that the coroutine saves all the state needed to be continued: the state of all the local variables, et cetera. I hope you're starting to understand how it works. For Android and other environments where there is a main UI thread, you can specify: I want to start this computation
17:45
and I want it to be run on the UI thread only. You just say async UI, and that means that your computation will be started and continued only on the UI thread. And you see that while this blue computation, the second image, is suspended,
18:01
the UI thread can be used for something else: another request from the user, some red computation, is started, and the blue one will be continued on the main UI thread after the red one is complete. And there are not just these two executors; you can create your own that suits your needs better.
18:23
So there is some flexibility when you work with coroutines, and there is a way to customise things. Note that suspension might not actually happen if the result is already available. Here is an example.
18:41
Again, we start this loadImageAsync and imagine we do some other work while the image is loading. For instance, our image is cached or it's very small, so the result is already available. In this case, when we call await, the result is ready and we can just continue.
19:00
So no overhead is created. Let's now look at the example with two asynchronous computations, the images that I showed you before; now there is actual code that starts these two asynchronous computations. We say that we want to overlay two images,
19:25
and we have a function that returns a new image which is the result of overlaying the first and the second one. We want to asynchronously load these two images, probably in parallel, if we can do that.
19:40
And after that, we show the result. To do this, we call loadImageAsync for the first, green image; this starts the first, green computation. Then loadImageAsync for the red one; this starts the second, red computation. After that, when we call the first await, our main overlayAsync computation is suspended.
20:04
It's put away somewhere, and when the result is ready, we can bring this computation back. Actually, under the hood, it can be suspended two times because we have two await calls, two suspension points,
20:20
possible suspension points, but these details don't matter for now, and I hope that this picture illustrates well what's going on here. There is also an example of starting a computation on the UI thread: we can just launch something in the UI thread,
20:43
and here you see that we can start another computation in the context of the UI thread. Then we call await, and when we call await, again, our computation is put away. It's no longer on the UI thread,
21:00
and after we have the result, it's put back. So I hope that it should now be somewhat clear how it all works, what async and await do, and what coroutines do. The next question: what about handling exceptions?
21:22
How do we process errors? And the answer is that await just rethrows the exception, so a regular try-catch works. Note that you can throw an exception in one coroutine and then catch it in another one
21:41
and handle it there, on another thread. In this example, we throw it in the green coroutine and catch it in the outer coroutine, which was suspended, for instance, and now it resumes and handles this exception. So you just write your regular code
22:01
like you would with threads, using regular constructs, without these callbacks and other advanced ways of handling exceptions. The other question that may arise at this point: how can you cancel a coroutine?
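Before turning to cancellation, here is a sketch of the rethrow behaviour just described. The error message is hypothetical, and note that the current kotlinx.coroutines API (newer than this talk) also propagates a child's failure through its enclosing scope, hence the `coroutineScope` wrapper:

```kotlin
import kotlinx.coroutines.*

fun main() = runBlocking {
    val result = try {
        coroutineScope {
            // The exception is thrown inside the inner coroutine...
            val deferred = async { error("image failed to load") }
            deferred.await()  // ...and await rethrows it here, in the outer one
        }
    } catch (e: IllegalStateException) {
        "fallback"  // an ordinary try/catch handles it across the suspension
    }
    println(result)  // prints "fallback"
}
```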
22:22
And yes, you can cancel it: you just say job.cancel(). I probably forgot to mention this: we have different functions here. Before, we had only async, but now we see launch as well. It does absolutely the same thing, with the difference that it doesn't return a value.
22:41
Launch just starts the computation: you can join it, you can cancel it, but it isn't meant to return any result. Cancellation works similarly to threads: when you have a thread and you want it to be cancellable,
23:03
you have to check explicitly whether it was cancelled or not, and the same approach works with coroutines: you just check, is it active or was it cancelled? Library functions like await, for instance,
23:20
check for cancellation explicitly. So if you use await somewhere, then at the point when you call it, if your coroutine is cancelled, it will notice. And you can also run a part of the code, for instance in the try block, without cancellation checks. For such blocks, it's a regular case
23:41
when you want it to be performed independently of whether the coroutine was canceled. Okay. So now we have plenty of time, or at least some time, I suppose, and now we are going to discuss something important.
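The cancellation pattern above can be sketched like this, assuming the current kotlinx.coroutines API; `Dispatchers.Default` keeps the busy loop off the main thread:

```kotlin
import kotlinx.coroutines.*

fun main() = runBlocking {
    // `launch` returns a Job: no result value, but it can be joined and cancelled.
    val job = launch(Dispatchers.Default) {
        var iterations = 0L
        while (isActive) {   // cooperative check, much like polling Thread.isInterrupted
            iterations++
        }
        println("stopped after $iterations iterations")
    }
    delay(100)
    job.cancel()  // ask the coroutine to stop
    job.join()    // wait until it actually has stopped
}
```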
24:06
So now you have a basic understanding of how coroutines and async/await work. Now I want to share with you the novelty of the Kotlin approach. Async/await is a feature that exists in other languages like C#.
24:25
However, Kotlin doesn't introduce async and await keywords; it introduces coroutine support. And with the support of coroutines, it allows you to define your own suspend functions and organize your code in such a way that all the places,
24:46
all the points in the code that can be suspended, are marked with this suspend modifier. So the first question is whether you can define your own suspend function. And yes, you can.
25:00
Let's look at a new example. This example presents simple sequential logic. Say we have three operations: we can log in, just loading some user identifier; then, as a separate operation, we can load the user data.
25:21
And as the last operation, we can show this data. When you write this code in a straightforward manner, first getUserId, then getUserData, and at last showData, you have two points where your computation, if you do it with threads, can be blocked.
25:42
Something that can take time: we can imagine that it does some network requests, and the connection can be bad, et cetera. So they are marked; you have two points that can cause problems. The first way is to rewrite this code.
26:03
Now we want to do better: we want to avoid this blocking, and the first approach is to rewrite this code with async and await. To do this, instead of returning the user ID and user data from your functions, you refactor them to return the special Deferred type.
26:23
We saw it before: async returns a Deferred. And Deferred is another word for CompletableFuture, for promise; it's just a synonym, and in essence it does the same thing. In your function, instead of a user ID
26:42
and user data, you now have Deferred objects, so you have to await them. After rewriting the code like this, it works. That's fine. And actually, this is how async/await works in C#: when you code in C#, you always have to define this async and await.
27:02
You define these async functions and always await their results. However, in Kotlin, you can do even better, because in Kotlin you can define your own suspend functions, and that means you can declare all these functions as suspend functions.
27:21
So instead of returning Deferred, you now return the result directly. And you can declare this showUserInfo also as a suspend function. And inside the suspend function, you can just write the code that we saw
27:40
on the first slide with threads. This is exactly the code that we had initially, but now it works with suspensions: instead of blocking the user when the connection is bad, you suspend your computation while awaiting the result.
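The rewritten sequential logic might be sketched like this, with hypothetical names mirroring the talk's login example; `delay` stands in for the network calls:

```kotlin
import kotlinx.coroutines.*

// Each step is a suspend function returning its value directly:
// no Deferred, no callbacks.
suspend fun getUserId(): String { delay(10); return "id-1" }              // e.g. a network call
suspend fun getUserData(id: String): String { delay(10); return "data($id)" }

suspend fun showUserInfo(): String {
    val id = getUserId()        // possible suspension point
    val data = getUserData(id)  // possible suspension point
    return data                 // the same straightforward shape as the blocking version
}

fun main() = runBlocking {
    println(showUserInfo())  // prints "data(id-1)"
}
```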
28:02
The code looks the same, but it works. There might be a question again: where can you call a suspend function? And the answer is that you can call a suspend function either inside other suspend functions or inside so-called coroutine builders.
28:22
We actually already saw coroutine builders: coroutine builder is just the formal name for the launch and async functions, and their goal is to start a coroutine. There is one more, runBlocking, which actually interacts with the regular blocking world.
28:43
And runBlocking will be the entry point to all this coroutine programming. So coroutine builders start a coroutine. Looking a little bit under the hood, we actually use the suspend modifier not only to declare suspend functions,
29:02
but also to mark lambdas as suspend lambdas. A coroutine builder is any function that takes such a suspend lambda as a parameter, and in essence, this suspend lambda is what represents a coroutine.
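For instance, launch, async, and runBlocking each take such a suspend lambda as their final parameter; a minimal blocking-world entry point might look like this (current kotlinx.coroutines API):

```kotlin
import kotlinx.coroutines.async
import kotlinx.coroutines.runBlocking

fun main() {                       // regular, blocking code
    val answer = runBlocking {     // blocks this thread until its coroutine completes
        async { 6 * 7 }.await()    // suspend calls are allowed inside the builder's lambda
    }
    println(answer)  // prints 42
}
```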
29:21
So when you start this new coroutine with a coroutine builder, its body is marked as being within a suspend context. You can also declare a nested coroutine: you can start a coroutine inside another coroutine. In this example, I launch the first coroutine in the UI thread.
29:41
Then we have async, which starts another coroutine inside the first one. And here you see these symbols, this arrow symbol: they represent the suspension points.
30:05
So, you can see them in IntelliJ or Android Studio. They show you directly where your computation might be suspended. So, they are placed wherever you call a suspend function.
30:24
A little bit of implementation detail. Actually, when you declare a suspend function, the Kotlin compiler secretly adds an extra parameter: a continuation.
30:41
And in essence, continuation represents a generic callback interface. So, in the end, it's still programming with callbacks. However, in this case, they are hidden by the Kotlin compiler. So, you can think of it like that.
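A sketch of that view: Continuation<T> in the standard library is essentially a generic callback with a resumeWith(Result<T>) method, and you can (though normally never would) start a suspend function by handing it such a callback yourself:

```kotlin
import kotlin.coroutines.Continuation
import kotlin.coroutines.EmptyCoroutineContext
import kotlin.coroutines.startCoroutine

suspend fun answer(): Int = 42

// What the compiler hides: a suspend function is completed through a
// Continuation, i.e. a generic callback receiving a Result<T>.
fun main() {
    val callback = Continuation<Int>(EmptyCoroutineContext) { result ->
        println("resumed with ${result.getOrNull()}")
    }
    ::answer.startCoroutine(callback)
}
```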
31:03
And another internal detail: a coroutine is cheap. It is represented by just one object. And its body is compiled to a state machine, and every suspension point represents one of the states.
31:20
So, when your coroutine reaches some suspension point, it is put away, and it's stored in the first state. Then you resume it; it reaches the second suspension point, it's put away, it's stored in the second state, et cetera. Again, in our documentation, you can find the exact example of the bytecode the Kotlin compiler generates for the state machine.
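As a purely illustrative, hand-written sketch of that compilation strategy (this is not actual compiler output, just the shape of it: one object, one label field, one state per suspension point):

```kotlin
// One object carries the label (which state we are in) plus the locals that
// must survive across suspension points.
class UserInfoStateMachine {
    var label = 0
    private var token: String? = null
    var result: String? = null

    // In real generated code this role is played by resumeWith(Result<T>):
    // each call jumps to the state stored in `label` and advances it.
    fun resume(value: String?) {
        when (label) {
            0 -> label = 1                                    // parked at the first suspension point
            1 -> { token = value; label = 2 }                 // resumed with the first result
            2 -> { result = "${token}:${value}"; label = 3 }  // final state: done
        }
    }
}

fun main() {
    val machine = UserInfoStateMachine()
    machine.resume(null)         // start; reaches the first suspension point
    machine.resume("token")      // resumed with the first result
    machine.resume("user-data")  // resumed with the second result
    println(machine.result)
}
```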
31:42
I will leave it to you to find it, because anyway it doesn't fit on a slide. You have to think calmly for some time to observe it and digest it to understand how it works. But we have it. And I still have some time.
32:01
So, a couple more words about the interaction between coroutines and the regular world. You now have your existing code bases: some Java code, some Kotlin code. How do they interoperate with each other? Because Kotlin is about interoperability.
32:21
So, we have to think about interoperability in this case as well. First question for you: do you think you can call a suspend function from Java? Because actually I've already shown you. And the answer is that probably you don't want to call it directly, because to call it, you have to provide this continuation parameter.
32:42
And the Kotlin compiler somehow does all the magic, but in Java you have to do it by hand. And that's not the way it is intended to be done. So, it's possible, but probably not the best idea. However, there is an easy way to interact with Java. So, if you have your suspend function declared in Kotlin,
33:02
you can just wrap it into some kind of future or promise or Deferred, and return it to Java. So, if you have Java 8, you can wrap it into a CompletableFuture. If you use RxJava, you can wrap it into a Single. And afterwards, you can use it from Java in this way.
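For instance, here is a minimal standard-library-only sketch of such a wrapper (the kotlinx.coroutines integration modules ship production-ready builders for this; the runAsFuture name below is made up):

```kotlin
import java.util.concurrent.CompletableFuture
import kotlin.coroutines.Continuation
import kotlin.coroutines.EmptyCoroutineContext
import kotlin.coroutines.startCoroutine

// Start a suspend block and expose its eventual result as a CompletableFuture
// that plain Java code can consume.
fun <T> runAsFuture(block: suspend () -> T): CompletableFuture<T> {
    val future = CompletableFuture<T>()
    block.startCoroutine(Continuation(EmptyCoroutineContext) { result ->
        result.fold(
            onSuccess = { future.complete(it) },
            onFailure = { future.completeExceptionally(it) }
        )
    })
    return future
}

suspend fun loadGreeting(): String = "hello"  // hypothetical suspend function

fun main() {
    val future: CompletableFuture<String> = runAsFuture { loadGreeting() }
    println(future.get())  // Java callers use the familiar future API
}
```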
33:23
In a regular way, working with the CompletableFuture. Another thing: if you are working with some third-party library, and this library provides a callback-based API, you can, again, instead of calling it directly,
33:40
and providing the callbacks, wrap it into a suspend function. This function will use the special intrinsic Kotlin function suspendCoroutine. I don't want to dive into the details of how it's implemented. Again, you can just research it and think about it for a while.
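As a rough sketch of that template, with a made-up callback-based LegacyApi standing in for the third-party library:

```kotlin
import kotlin.coroutines.Continuation
import kotlin.coroutines.EmptyCoroutineContext
import kotlin.coroutines.resume
import kotlin.coroutines.resumeWithException
import kotlin.coroutines.startCoroutine
import kotlin.coroutines.suspendCoroutine

// Hypothetical third-party, callback-based API.
object LegacyApi {
    fun fetchUser(id: Int, onSuccess: (String) -> Unit, onError: (Throwable) -> Unit) {
        if (id > 0) onSuccess("user-$id") else onError(IllegalArgumentException("bad id"))
    }
}

// The wrapper: suspendCoroutine hands us the current continuation, and
// resuming it is what completes the suspend function.
suspend fun fetchUser(id: Int): String = suspendCoroutine { cont ->
    LegacyApi.fetchUser(
        id,
        onSuccess = { cont.resume(it) },
        onError = { cont.resumeWithException(it) }
    )
}

fun main() {
    // Stdlib-only driver; a coroutine builder would normally do this part.
    var user: String? = null
    val block: suspend () -> String = { fetchUser(1) }
    block.startCoroutine(Continuation(EmptyCoroutineContext) { user = it.getOrNull() })
    println(user)
}
```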
34:02
But the idea is that it's doable. So, it's just a template that you can follow. And you can work with all these third-party APIs using suspend functions. The same can be done with RxJava, so you can wrap it similarly.
34:25
However, with RxJava, there is already an await function defined, a suspending await extension function defined in the library. So, you can just call await on a Single, or on a CompletableFuture.
34:41
So, you know, Kotlin has extensions, and with extensions we can provide this very native experience. await is not a keyword, it's just a function. You can declare this function for all the types of futures that you have in your projects.
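A sketch of declaring such an await yourself, here for CompletableFuture using only the standard library (kotlinx.coroutines provides ready-made await extensions for futures and the Rx types; this version is just to show the shape):

```kotlin
import java.util.concurrent.CompletableFuture
import kotlin.coroutines.Continuation
import kotlin.coroutines.EmptyCoroutineContext
import kotlin.coroutines.resume
import kotlin.coroutines.resumeWithException
import kotlin.coroutines.startCoroutine
import kotlin.coroutines.suspendCoroutine

// await as a plain suspending extension function: suspend until the future
// completes, then resume with its value or its failure.
suspend fun <T> CompletableFuture<T>.await(): T = suspendCoroutine { cont ->
    whenComplete { value, error ->
        if (error != null) cont.resumeWithException(error) else cont.resume(value)
    }
}

fun main() {
    var result: Int? = null
    val block: suspend () -> Int = { CompletableFuture.completedFuture(21).await() * 2 }
    // Stdlib-only driver; with kotlinx.coroutines you would use a builder instead.
    block.startCoroutine(Continuation(EmptyCoroutineContext) { result = it.getOrNull() })
    println(result)
}
```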
35:01
And that will work. I expected this question. So, we have, like, five more minutes, and I expected you to ask: what about RxJava? So, we now want to use RxJava, and how does it interact with coroutines? And I want to illustrate my answer with the same example
35:24
that we used so far. So, we had this code. We rewrote it with suspend functions. It looked very similar to the straightforward version that you write by hand when you first look at the problem. It can be rewritten with CompletableFutures,
35:40
with a bunch of operators. It can be written with RxJava, with a bunch of operators and with the callbacks that you pass to all these operators. And the point I'm trying to make here is that you probably don't need Rx for all kinds of tasks.
36:02
So, there exist tasks where, if you have to strive hard to add Rx to your application, if you have to search for a place where you could use Rx and have to invent this observable,
36:20
then coroutines will probably suit you better for this case. And in general, it's not a question of Rx versus coroutines. There is a great number of tasks where RxJava really solves the problem. It provides you a great way to manipulate observables,
36:42
to respond to changes that happen somewhere. But there is another really big pile of tasks where you just want some sequential business logic. And in this case, it's probably more intuitive to use the direct style of programming with coroutines.
37:02
Coroutines won't block your thread and you can use them directly. And there is also a bunch of tasks where you can probably use both. So, you saw already that if you for some reason want to call suspend functions from Java, you can wrap them into a Single.
37:20
Or there is the case where you are writing custom operators for Rx. Coroutines allow you to write them in a much easier fashion, in a much simpler way. Again, you can find in the documentation how the classic examples are written with coroutines,
37:44
and they are much more understandable in this case. Because, as you know, with backpressure and the rest, it's hard to provide your own custom operators for Rx. Okay. So, now I have to move on, and we're almost done. So, just some wrapping up and a couple of references.
38:04
And I showed you the first part of this slide at the beginning, when I illustrated that, okay, on the language level we have coroutines, and we have async and await defined in the library. But in fact, we have much more.
38:22
We also have support for channels, support for actors. We even have support for yield, which is not about asynchronous programming but is implemented using the same concept of coroutines. You can find it all, again, in the documentation. So, this coroutines feature is something very powerful.
38:42
It provides us a lot of means to do asynchronous programming in a new way. For now, coroutines are released with experimental status. In general, it is a very new approach, so we want as much feedback as we can gather at this point.
39:05
So, we want the community, and that means you, to try it and to give feedback. And I'm used to answering the "when" question, and I can't say when, because for now, the approach is that the more you try it
39:22
and the more feedback you give us about what works and how it works, the faster it will be released. But anyway, at JetBrains, we try hard to provide easy means to migrate your code.
39:41
So, you can use it safely now. And when we release it, a support library will be provided, so the code that you write now with coroutines will continue to be supported. And I think that everything that I've discussed in this talk won't be changed. So, the basics won't change. Some deeper things in the library may change.
40:03
That's why we want this probation period. But anyway, it's up to you now. Here are the links. And I want to mention that this library is mainly written by Roman Elizarov. And if you want to go deeper, there is a brilliant guide,
40:22
a guide to coroutines by example, also written by Roman. And if you want to dig more into this topic, you can Google his talks. Here is the guide, and, in a minute of advertisement, you can also find his talks with a deeper dive into coroutines.
40:44
Yes, I know, there are 30 seconds more, and I have two more slides. Another bit of advertisement is the Kotlin book. Probably you know it; now it's finished. It doesn't cover coroutines in any sense, but it covers all the rest of Kotlin. Yeah, and I forgot to mention that at this conference, I will do a workshop that also
41:04
doesn't cover coroutines, but covers all the rest of Kotlin. And thank you. Have a nice Kotlin. I must say, when you gave your Kotlin talk two years ago, you were already very convincing
41:26
and everything. But now you're obviously glowing from inside as a mark of success. Thanks a lot. Before we get into the Q&A, is Gautier already in the room? Yes, he's here. Sid, Gautier is here. No need to look for him any longer.
41:41
Because Gautier is the one who's going to give our next talk. But we have time for one or two questions for Svetlana. Are there any? Yes, I have a question. So, besides the quote you mentioned, you didn't give a real opinion on efficiency, coroutines or Rx. Can you say anything more about that?
42:00
So, you said that both are fine, but is there no difference in efficiency? Is that correct? Actually, for this, we need a real application, we need a real comparison, and we need benchmarks. So, I don't have big applications in mind with an exact comparison.
42:22
But in essence, the coroutines implementation is quite efficient, because coroutines are lightweight and cheap. But for the real answer, we need to wait for benchmarks and a real comparison. And especially for Android applications, there are some particular circumstances. So, we need to wait for that. Okay, thank you.
42:41
I think this guy was in front of me. Hello. Thank you for the talk. I'm already a big fan of yours; I bought the idea. And my question also aims at emphasising where the performance improvement comes from. In the example by Roman, he outputs 100,000 dots, and he uses coroutines, and he
43:06
suggests the reader use the same number of threads. But that is not really fair. A fairer suggestion would be to use an executor with the same number of threads as the common
43:22
pool in coroutines, and then, in a loop, run, well, 100,000 runnables. That would be fair. Yeah, and I can say that when we compare threads with coroutines, if you try to, for instance, decrease the amount of sleep in that example, you will still
43:44
see the same difference, the same behaviour. So, I would say that that example just tries to convince you to read more about coroutines. And, as I said before, we just need the benchmarks.
44:04
It's like, I can say coroutines are better, but the other guy will say threads are faster, so we need something real in the future. Now, we have time for three more very brief questions. Actually, just a small announcement: I will be here, I think, for some time, so probably we will give the time to the next
44:23
talk and the next speaker, and you can ask me all these questions right there, just after the talk, or any time afterwards. Thanks a lot. Okay, thank you. So, Svetlana, thank you.