What's new in Upipe
Formal Metadata

Title: What's new in Upipe
Number of Parts: 644
License: CC Attribution 2.0 Belgium: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifiers: 10.5446/41741 (DOI)
Language: English
Transcript: English (auto-generated)
00:05
So we're going to start the next presentation. So it's yet another multimedia framework called Upipe, a product I know a bit about. But this year, Rafaël has agreed to do the presentation. So please give a warm welcome to Rafaël Carré.
00:22
Hello. Thank you. So a little bit about myself. I started contributing to VLC in 2006, and I joined Upipe three years ago. So what is Upipe? It's a GStreamer for broadcast.
00:44
So the big question is what is broadcast exactly? I've worked for six years in broadcast, and I'm still not sure exactly what it is. So if I try to guess, it's old-school TV. I think when the first TVs were created in black and white,
01:03
it was the distribution of video or radio to millions or billions of people. So what do we need in broadcast? We need high quality of service, 24 hours per day,
01:20
good quality, because people watch TV on bigger and bigger screens. So it's not like on the phone where you can have 144p video. The quality needs to be very good. So, as Christophe said, he created the project around 2012,
01:46
so about 15 years after creating VLC. So all the mistakes that were made in VLC, which we still suffer from today, they're gone. Upipe is perfect.
02:01
Upipe is a pipeline, and of course it comes with all the modules, the plugins, which you can connect together to do anything you want. So it's fast, of course.
02:23
It uses an event-driven model. So when you're not receiving anything, the pipeline is waiting on a file descriptor. So it's not using any CPU, and as soon as something comes in, you're woken up by a poll, so you can handle it immediately.
02:43
So on Linux we use libev, but on Windows you could use the enlightenment libecore. But Windows support is still not there. There was some work done, but the guy left.
03:02
So if you know about Windows and need a broadcast framework, come and we'll help you at this part. So it's a bit weird for something modern to use no threads, but actually it's a lie.
03:20
Each module runs on the main thread, so you don't have synchronization problems in your pipeline, but what you can do is offload one module or part of the pipeline to another thread, and we already have synchronization using lockless structures and atomics when needed.
03:43
So you can offload part of your pipeline and get a thread for free, but from the view of the pipeline you don't see the thread. Everything runs in sync. Wrong button. So of course it's dynamic, so each module can add its own requirements,
04:07
and so on and so on. For example, if your hardware video needs some specific strides, it needs to be aligned on that many bits to be able to perform SIMD,
04:21
or of course zero-copy. The pipeline can render directly in the final buffer at every step. So the pipes themselves are really dumb. They just take something in and out and give you an interface for configuration,
04:42
but all the logic is done in the application, so you just use the stupid pipes and you tell them what to do. As I said, it's broadcast, so it needs to be very reliable,
05:01
because you don't want to be sitting in front of the TV watching some news, and suddenly it goes black screen for two seconds, then it comes back because something crashed in the data center. So we have a test suite, which is not complete, but we are working on making it better, and we use Valgrind and AddressSanitizer (libasan), which was mentioned,
05:24
and we are fuzzing individual pipes, especially the framers, since they receive untrusted input. So we use AFL, and that's it.
05:44
So, what's new? Since last FOSDEM, I think we had a release around last FOSDEM, and another one is coming today. So, ten contributors, not many new people, but hopefully you will be interested in joining us.
06:04
So about one commit per day, blah blah blah, some stats. The pipeline itself didn't see many changes, there were some improvements, but no bug fixes. So this is cool, because it means that the pipeline itself is,
06:22
as I said, perfect. Oh, that laugh... it's true. And what we did is create many new modules, and I'll talk about them. So the first one, maybe the biggest one: the DeckLink sink, for the Blackmagic cards,
06:41
which do SDI output. It was a lot of work, because if you don't know these cards, the manufacturer provides an SDK which is cross-platform, it runs on Linux, Windows, and Mac. But the problem is, it's proprietary.
07:03
Both the kernel driver and the userspace library, and it's buggy. You have many, many problems with synchronization. If you add a reference clock, the signal will drop four seconds later or something, so you have to test and guess what's wrong.
07:22
You ask them: why doesn't your stuff work? Ah yes, we will fix it in the next release. But then they give you a new release which adds more bugs. So we had to work around many, many bugs, and now it's stable, so
07:41
it's actually working on your TV, so if you're watching your TV, that module might be behind it. We support Teletext in the vertical ancillary (VANC) space; vertical ancillary is a technology from the old days.
08:02
It's been there for 50 years, and after the switch to digital TV, they kept it, because the guys had learned to work with the old technologies and didn't want to change, so we had to follow. So Teletext is mostly used in Europe; in the U.S. it's closed captions, which we support too.
08:22
We also do SMPTE 337. What is it exactly? It's a method for transporting compressed audio over a PCM channel. It's all digital, so you have a start code,
08:41
which is an audio sample value that would not make sense in the real world, in real audio, and when you notice the start code in the audio, it means the PCM samples that follow are actually compressed data. So in a stereo track,
09:01
so two channels of PCM, you could transport compressed AC-3 with 5.1, or Dolby E, which is a codec that is only used in broadcast. And there was a decoder for it added to FFmpeg last year.
09:20
It's not much known. And also in SMPTE 337, you can transport PCM. So you have PCM, you add the start code, and then PCM behind, so it doesn't make sense. But you can do it. We also added the ASI sink. ASI is a bit like SDI,
09:41
it's a format for transmitting video. We already had a source module to receive the ASI signal, and now we can transmit it, a transport stream for example. I mean,
10:01
it's designed for MPEG-TS, because it uses the same 27 megahertz clock. We have DVB-CSA, which is an encryption system for satellite, so for pay-TV. We can do both sides, encryption and decryption.
10:20
And it's actually a fork. The original version came from VideoLAN, and a guy forked it, added SIMD, and devised a bit-slicing implementation, so it's very, very fast, I think a couple of gigabits per second,
10:41
or more. So, we added FEC, for error correction, which uses the SMPTE 2022 standard. It's part of the series which defines SDI over IP,
11:01
but it's not specific to SDI over IP. To any RTP stream you can add FEC, and we can receive it. You can choose the size of your matrix. And sometimes FEC is not enough. You have too many losses,
11:20
so you can't reconstruct the matrix perfectly. You're losing video. So we also added ARQ. I mean, at the last NAB, there was a new solution by Haivision called SRT, Secure Reliable Transport. And SRT is a mix of FEC and retransmission.
11:43
But we chose to use the existing standard, although we don't know any other implementation of this standard, so we can't test it in the real world, except against ourselves. If you test your software with your software, it's always going to be compatible.
12:00
So we are looking for another ARQ implementation. So the mode is NAC. So each time you lose a packet, you ask for a retransmission. And that's why you have a longer latency than with FEC.
12:21
So really, both ways of protecting against packet loss are good, but they have their drawbacks. So it depends on the kind of jitter and loss your network has. And since it's a pipeline, we can actually put one behind the other.
12:42
But really, we should merge them, so each one knows about the other; they need to share a common internal state. For example, if you can correct a packet with FEC, you shouldn't ask for retransmission, because it's going to raise your bitrate and maybe cause more packet loss.
13:00
So we still did not merge them, and actually, we don't know exactly how to do it efficiently. So maybe we'll use them either one or the other, but never together. So I was talking about SMPTE 337.
13:22
We actually had an FFmpeg-like moment where two people were working on competing implementations without telling each other. And we actually committed both versions, but they are a little bit different, because one is a parser, so it will extract only the payload,
13:43
and remove the SMPTE 337 header, which you might not need anymore. But in our case, since we are retransmitting on the SDI outputs, we need to keep the header. So they work a bit differently.
14:00
Also, you can have 16-, 20-, or 24-bit modes in SMPTE 337, and the parser only handles 16 bits, and the other one 20 bits, but you could modify them if you needed to support another bit depth.
14:20
We added a FreeType pipe for text rendering. It's very basic, because we don't deal with positioning. If you have a P, for example, the tail of the P goes below the screen, so you don't see it. But since we only needed something simple, we write the text in all caps,
14:41
and that works for us. So that's cool. Blank source. What this does is generate an empty stream, so you can create your outputs and start transmitting nothing,
15:03
which is useful if you want to switch programs. So you start the output on an empty source, and then you replug your actual program to the output which already exists, and it's going smooth.
15:24
AVSync. It's a raw mixer, so when you deal with SDI, each audio frame is mixed together with the video, and you have to send them as one packet, one block.
15:44
So we need to synchronize them at the sample level. So we can use a blank source for the video, but for the audio, we are actually based on the output clock.
16:05
So since the input and output clocks differ, you need to resample the audio so it falls exactly right each time. And we use Speex, because with the Speex resampler you can do fractional resampling, going up and down as you need without hearing cracks.
16:24
If you lose one audio sample, you would hear a crack. So with this module, we have perfect-sounding output. We added a V210 decoder, after the encoder. So that's the SDI video format.
16:41
And we made it fast. Actually, James made it fast with AVX2 assembly. And yeah, we did more smaller pipes. Zone plate is a test pattern, which is used in TV. So instead of a blank source, which is black,
17:02
we could use this one. XO65 encoding, which is still slow. Grid module, which is used with blank sources. So this is a seamless switchover. You create your output, which would be your TV channel, and you can switch the programs. You connect one source and the other,
17:21
and they flip in an instant without artifacts. DTSDI as well; it's a file format used in the DekTec tools, if you do raw SDI captures. It's a simple format. Block-to-sound, which was the first contribution of a new contributor to Upipe.
17:43
So what it does is let you set the parameters you want, the ones from the WAV header that you don't have, since it's raw PCM. And so you can make a PCM stream from a file. What's coming? A DVB satellite source.
18:01
SDI over IP, which is working on air now, but not yet committed. So SMPTE 2022, the old standard, which is working. And we are working on the new one, which was just released last year, and it should be working soon.
18:22
I hope so. And we'll also work on the subframe latency encoders and decoders. So we can actually transmit or decode the video before the rest of the frame has arrived. So let's say if you have a progressive scan on your camera, you can start working
18:42
on encoding and transmitting the first lines before the scan is done. So that would be cool. And of course, I hope that some of you will be interested in joining the project. So we are open. We are cool. We are a small project.
19:00
We are cool. Only nice people. So if you are not nice, don't come. So any questions? So are there any questions? I can pass the mic.
19:22
Yes. It's more of a comment. On the SRT thing, it doesn't do FEC. And one of the big features is that they try to guess the bandwidth of the link so that you can inform your encoder.
19:41
OK, I thought it was doing FEC as well. They found that FEC is not useful over the internet, that it's more efficient to do retransmissions. OK, so that's what they do over the internet. Yeah, it depends on the link. Maybe on a dedicated link FEC would be useful. For low latency, you have to do FEC. Yeah.
20:01
So... Yeah, the definition of low latency is not that low, basically. I was talking about the... We use the standards, but when doing the implementation, I realized that you don't know the bitrate on the receiving end of the retransmission, so we had to add a custom RTCP packet
20:21
to send back the bitrate to the sender and to do some stats. So maybe they tested this RFC too and saw that it wasn't usable for them. Yeah. I know they did a lot of research before. It's something that was developed as proprietary software for five or six years, and they shipped before they open-sourced it.
20:43
Okay. Someone else? Is there any other question? I have a question. I'm coming, I'm coming. I'm sorry, I'm coming over because you have to be on camera. Otherwise, people at home don't hear you.
21:02
I just wanted to ask if you're using the Dolby E decoder in your workflow. No, because what we do is transmission. So we transmit the Dolby E as Dolby E. That's the point of using SMPTE 337: all the equipment in the chain
21:21
understands this format, so we don't need to decode it to PCM, because Dolby E would be, for example, six channels, but if we send it in compressed form in SMPTE 337, we can fit six channels in two. So decoding would not make sense because we don't have any...
21:42
So did you ever test it? I'm just curious because I wasn't really able to test the decoder. I didn't, I didn't. Okay, thank you. I could share some parts, maybe. That would be cool. Okay, just send me. My email was in the beginning of the presentation. You can ask me.
22:02
Hi. Hi. You were mentioning wanting to find other... I mean, testing interoperability for retransmission. So there's GStreamer, which is the Upipe for everything else. No, but you can try, I mean, essentially there's RTX support in it.
22:21
Okay, is it using the RFC? Pretty much. I mean, the RFC that you mentioned is not an actual implementation of retransmission; it's a whole bunch of recommendations on how you should behave. Okay, I will have a look at the GStreamer retransmission.
22:41
Thank you. Any other questions? No? Last chance? Well, thank you, Rafaël.
23:01
Thank you for your attention.