FOSDEM Video Box
Formal Metadata

Number of Parts: 490
License: CC Attribution 2.0 Belgium. You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose, as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifier: 10.5446/47397 (DOI)
Transcript: English (auto-generated)
00:05
So guys, this will be a bit of a bumpy ride today. These slides were almost finished an hour ago, and we had 20 minutes to talk together. Bye, Vasil! He will be on one of the slides, you'll recognize him in a bit. He knows I've shown him a few minutes ago, yeah.
00:23
He was not... well, he was happy and not happy, so yeah. The slides were only finished an hour ago, and we had 20 minutes to go over them last night when they were still in an unfinished state because of all the stuff we had to do. So it will be a bit of a bumpy ride, and I have a bit of an idea of the structure.
00:40
These guys have no idea of the structure anymore. Surprise! And I don't know whether the transitions will be okay or not. We will see. So over there is Gerry and he can start. Hi, I'm Gerry. I've been part of the FOSDEM organisation for 16 years now. I took up a role in the video team five-ish years ago
01:04
when we tried to come up with a new solution for our streaming. As you've seen, our video history has been a bumpy ride so far. But in the last couple of years, we've managed to get it to work quite well. And what you're about to see, the boxes you see in the back,
01:22
well, the ones up here in the front, are mostly the reason for that. So we'll give a brief overview of how we got all of that to work, and then we'll tell you about our future plans. Mark, introduce yourself. Well, I am Mark. I've been... I'm too old to remember.
01:42
I'm too old for this shit. How long I've been on the FOSDEM team, that is. Yeah, I started playing with video together with Gerry, I think. Yeah, but I remember you running the AW building in 2010. No? Yeah. I had Coreboot and X.org then, and then you were running the building.
02:02
So we first started chatting, yeah. 2010, yeah? So I'm Luc Verhaegen. I'm mostly known as libv, mostly known for graphics drivers such as, well, modesetting, the free ATI driver called RadeonHD, which was actual C code, Lima, linux-sunxi, flashrom for ATI graphics cards, that sort of thing.
02:23
I've been running a dev room here since 2006 with a few hiatuses in between. I think at this point with Philipp de Swert stopping with the embedded dev room, I'm the oldest dev room manager around. So a few years ago, we didn't have enough talks.
02:42
In 2016, there weren't enough talks for the X.org dev room, so I decided on another hiatus, and then I helped these guys out with a bit of video, which was interesting, and then I slowly got dragged more and more into the whole thing. So, you all know this: FOSDEM is indeed insane. Nobody does what these guys are doing, or I only do a small part,
03:02
but these guys, they are core organizers, and they have no time, and they don't sleep much in the weeks before the event, and you can't reach them anymore on the days themselves, and if you see them, they're always running. So the numbers you see here: we're now at 29 parallel tracks. Last year, we were at 28. Nobody does that.
03:23
I think we are three times as large as the next largest event on the planet, something like that. Of the 835 talks accepted in Penta, there were about 1,800 proposals, so more than half got refused,
03:41
and we're the only people that are streaming this out live, and 29 live streams? Nobody does that. Nobody does that. Even if it's only 720p, nobody does that. And you'll be surprised about the budget. We won't say any numbers, but you will probably spend more on public transportation,
04:01
going back to the inner city and back and forth two times, or four times, than FOSDEM spends per visitor, pretty much. So it is quite insane. It's all the beer event, probably a bit of money from the food trucks; there are big sponsors, but they're also limited in the amount of money that they can give,
04:24
so that it stays a proper grassroots project and doesn't change its nature at all. One of the mantras is small changes every year: tiny changes, tiny changes, don't change anything big. Like one dev room, or small changes to HDMI. Well, that is exactly one mistake that we made at one point,
04:42
is when we scaled up from five rooms to 21 rooms. Don't do that. It doesn't work. When was that? I think that was 2014. Oh, for video. For video, yeah, not for all dev rooms. No, don't do that. Yeah, for me it was amazing. Before that it was always Michael Larabel from Phoronix sitting in front with an HD USB webcam.
05:06
There was like one dev room in the H building where there's only one door, no windows, no ventilation, no nothing, and he would just sit there for two days straight, and then it's just insane. So all this stuff is quite amazing. So this is how our setup is working.
05:22
Those boxes, well, they're wooden boxes, you'll see a picture. Oh, I forgot to bring one of the boxes upstairs. Yeah, there's no changing that anymore. There's a picture of a box in there as well. So the speaker's laptop is all the way on the... Is there a cursor there?
05:41
That's the speaker's laptop in this image. This is the slides box, which is a bit special. It goes out to a projector. The audio is going straight into the camera, and it then gets fed, as HDMI audio over the camera, into the other box, and this all gets combined in a switch and then goes out to the FOSDEM network,
06:01
and you can see the cable running over there to the back, and it goes here into, at least in this building, it goes here into the ULB network, and then it's into a video VLAN, and the guy who was setting up the audio earlier is sitting downstairs watching a big monitor, and this year we have three guys watching monitors, right? So that's already a big change.
06:22
I didn't put your suggestion in the slides because, well, this one-laptop setup is quite cool, and there's only a small machine in between doing the actual streaming, right? So downstairs there are a few laptops. There will be a slide about this. You don't know the structure yet, right? This is a bit of a surprise for you guys. And if you've ever talked here or if you've been running a dev room,
06:44
then you know that since a few years, a few minutes after the talk is finished, you get an email, and you get told that you can review your talk, and you can say the start and the end, and then minutes later, if there's nothing going wrong and the network is okay, this will be published already, and it's all being streamed live, and the publishing is already,
07:02
whether it's on Belnet and a few other sites, probably. We have about ten mirrors that hold our content, so we upload it to our master server, and then the mirrors just sync from there using rsync. That's nothing magic, and we use a redirection service, MirrorBits actually, to redirect you to the closest or most useful mirror for you.
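The mirror flow described here (upload to a master, mirrors pull with rsync, a redirector sends each client somewhere sensible) can be caricatured in a few lines of Python. This is only a toy sketch of the redirect step: the real service, MirrorBits, also tracks which mirrors are up to date, and every name and URL below is an invented placeholder, not a real FOSDEM mirror.

```python
# Toy version of the redirect step; MIRRORS and MASTER are placeholders.
MIRRORS = {
    "be": "https://mirror.example.be/fosdem/",
    "de": "https://mirror.example.de/fosdem/",
}
MASTER = "https://video.example.org/"

def redirect(path, client_region):
    """Return the URL a client should fetch, preferring a nearby mirror
    and falling back to the master server."""
    base = MIRRORS.get(client_region, MASTER)
    return base + path.lstrip("/")
```

A real redirector would, of course, also weigh mirror freshness and health before answering, which is exactly what MirrorBits adds on top of this idea.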
07:25
So as soon as the talk is reviewed, it will be available to viewers within the next half an hour or so. So the next few slides will give you a bit of an idea of the insanity and of the scale of the current setup, and most of the insanity we will keep.
07:40
We will just fix this one thing and take one of the biggest issues. So this laptop is a laptop downstairs, and there's a few more of those. It has friends. Mark? Well, yeah, we first attempted to recycle some old servers. They were more efficient as space heaters basically.
08:03
We had five or six of them in a garage, and it was the middle of winter, and it got pretty hot in there just trying to do eight or nine streams, and then they fell over and ran into kernel bugs.
08:21
And we were under time pressure for the first time. We were trying to do this software setup ourselves. So I said, hey, I have this laptop that has this Voctomix stuff that we use to do the mixing. I tested with it, and it seemed to work.
08:44
Let's just try with a laptop and see if it can stably do the work. Do the encoding and mixing, et cetera, in a stable way. You have network. You have a power supply. You have built-in UPS basically.
09:00
It's dirt cheap. So we buy a bunch of these before the conference, about 50, and we get rid of them after the conference and during the conference. People buy them, right? They were sold out in one hour. Did you make money, or did you just break even, or did you lose a tiny bit?
09:20
We don't make money on the... Well, we make a couple of cents on the laptops. So we buy them in bulk, and we sell vouchers at the info desk. Well, I did that yesterday. I sold 38 because I had reservations for a couple of others, and we keep a few ourselves for use during the year.
09:42
And they were gone in about an hour. We sold all of these for 150 euros apiece, which is approximately the price we paid for them. They're Lenovo X250s this year, but we started off with X220s a couple of years back, and we repeat the same cycle every time. So this costs us nothing at all, and we get to...
10:02
Well, we have them from October approximately, so it gives us time to set them up, test them all, and use them at the event and at no cost actually. So this is underneath the big chairs of the big room downstairs? Next to the network room. So you first pass some really smelly sewage pipes,
10:24
and then you get to a warm room, and there's lots of humming going on, and that's where they are standing. So on the other side of this there are actually a few server racks, and I think a few of FOSDEM's machines are in there all year long, right? So we rent most of the other gear.
10:41
Like the cameras are rented, the microphones are rented, and the stands are rented, and there's a big mess every year trying to get the inventory of that done, trying to get them tested, and then finding out that this year a few HDMI connectors were broken on the cameras, and the microphones were broken. This is always fun. But yeah, this is the slide pile. They come from Holland somewhere.
11:01
They fill up half a T5 Transporter, so that's quite a big pile. Now this is only 35 of our boxes. In the FOSDEM office there are usually 58 lined up, but there's a 3D printer on this side of it, and then on the other side there are just more boxes.
11:21
And the only picture I got is this one from Ghosty; the other side was all blurred. And I have one picture with you in it, but it's showing you from your better side, so I was not using that slide. You were just looking into a box at that point, so no, we're not using that slide. So they're all packed up in flight cases.
11:41
This is the exact coffin that is over there. These things get quite, quite heavy, and we have to drag them down from one floor up, from the FOSDEM office all the way down, every Thursday before FOSDEM. And tomorrow evening we will have to carry them all up together with all the boxes that we have,
12:01
all the kit, which is also about between 80 and 100 boxes that we have, like this big. And these things you need four people to get those up the stairs or down the stairs as well, so they're quite bulky and quite heavy. So we have several of them. This is a photo taken yesterday morning, so six and a half of these boxes,
12:21
and in every building usually one box travels off, and that's sometimes also where the video people will live for the weekend when they check streams and make sure that the cameras keep on running. So this is one of the video boxes. You've probably seen them everywhere already. We are now currently using two in the back because we're testing an open source hardware kit.
12:40
It's open source hardware, right? Yes. We're testing an open source hardware camera and our usual camera, so we now have two camera boxes. There are two different ones. There is one for slides, there is one for camera. We will get to that in a bit. The Axiom people are right next to it, by the way. These guys are doing something really cool. It's a 4K camera, which is completely open source hardware.
13:02
So it's not cheap, but you're working on it. But next year you guys should have a talk about this in this exact room. Let's do that. Okay. Yeah. And invite the guys who do the full HD version as well. So there's two different boxes. This is what they're, what? And next year you should bring 30 cameras maybe.
13:23
For free. Just a suggestion. So yeah, Mark, you used to dabble with 3D printers for a bit, so you did all the casework for this, and you thought it was going to be fancy in all 3D. Well, all laser cut stuff, yeah.
13:41
And then you started screwing them together, and then you noticed that maybe for 58 boxes this is not the best idea. So this was a lot of work for somebody. You get the idea. Not me, not me. I outsourced it. You outsourced it. And the connectors in front are also mighty expensive. I think all the connectors per box cost more than what
14:00
our box will be in the future, pretty much. This is this Neutrik stuff. It's on the lower end of 100 euros. Just the connectors. Anyway, so you've all walked in here. You've seen the Frankenbox that I built over the last month with parts that I scrounged from the office like a month ago.
14:20
And yeah, if I would just leave the slide at this, then you would have learned nothing, because all you see is cables. So I've done something slightly fancy. So where the signal from the laptop first comes in is an active HDMI splitter. It's one of those cheap Chinese things that you bought, what, a hundred of immediately? Yeah, we have about 100 of them.
14:42
Now, this is not very smart, because it latches onto the monitor identification data (EDID) of the first HDMI connection that it sees and keeps feeding that out. So this is always a bit of a gamble. Now, from here it goes to the projector, and it goes through the scaler. And I have a slide about this in a bit,
15:03
because this is a very important and also blocking item for us. From the scaler, it goes to a hardware H.264 encoder. There's also a slide about this one. And from there, it goes to a Banana Pi with a status display in front. And a very cheap, at this point very old, disk
15:21
that's enough to store the data for the whole weekend. From there, it goes to a five-port gigabit switch. And from there, it goes either to the front box or wherever we find network. And then it goes into the FOSDEM video VLAN, and it goes on from there, right? Exactly. Maybe just to give you an idea of what one of these boxes costs: the total price for all of this,
15:43
what you saw until now, is about 650 to 700 euros, of which the bulk is actually the H.264 encoder, which goes for about 400 euros. So just to give you an idea of what one of these costs. So this is a standard ATX power supply. We have the power-on signal, of course,
16:03
tied so that it's always on. The second it sees power, the switch is flipped and it's on. And then we have five-volt connectors and one 12-volt connector for the H.264 encoder, which runs quite hot. Yeah, we standardized on those so we could use the ATX power supply. So no nine volts, nothing else.
16:22
Nothing special. So this is one of our biggest blocking items at this point. We've been using these since 2014 as well. It's an anything-to-720p HDMI scaler. So we have VGA, we have HDMI. It can do full HD. Right, but we use it at 720p because we
16:41
can't do full HD for everything, and 720p is good enough. It used to be dirt cheap Chinese shanzhai hardware that is, yeah, questionable in some ways, but stable; but it's almost impossible to get anymore. Yeah, so 2015, and it's Chinese,
17:01
so it's there one moment and gone the next. And well, you can go and have a look at it later, because I didn't manage to fill the gap. So the big gap over here is where another one of the shots should go, and I didn't have time to put it in today; there are already too many pictures in this whole thing. So this box grabs HDMI and outputs what?
17:23
A scaled version of it. So actually it can grab VGA, HDMI, or composite, and it outputs HDMI at a normalized resolution. You need something to capture that, right? Yes, so we'll get to that. Okay, cool. Because the same company manufactures also...
17:42
Yeah, yeah, we know, but we are not sure whether it's the same CPU being run in there. We haven't tried to look. I don't think many people have this scaler. People have been working with the other devices, the HDMI to Ethernet version. People have been reverse engineering that. That's the one that I'm using.
18:00
Exactly, but we don't think it's the same hardware, actually, because this chip, the markings are lasered off, and it has UARTs, but it's encrypted. So if you can break it, it would actually help. Yeah, not for us anymore, hopefully, in a year, but no. Anyway, so then the Blackmagic H.264 encoder,
18:20
the most expensive thing in there, the hottest thing in there, and the most awkward to use. Yeah, a black box. Luckily it was reverse engineered already before you guys started with it, because I wasn't there yet at that time, to at least get the video stuff out. So if you look at the slides later on, you can click on the link and see the work that Timo did.
18:42
And you all know the drill from the Broadcom driver and a few other Wi-Fi and other drivers: you have to go and extract the firmware and upload it, and even then it's very picky about the resolutions it takes, because the resolution I need for our test hardware is also not working on that thing. So we came into the project at the point where,
19:00
like, into the FOSDEM video project at the point where we already had this hardware, and we wanted to get something as close to free and open as we possibly could. So the first thing we did was, well, look at just extracting the firmware and running it — well, just throwing in a binary blob,
19:22
but it was the best we could do, and at least with a mainline kernel and stuff like that. So the next thing, for me as a sunxi developer, as one of the founders of sunxi, is the biggest eyesore in the whole thing, and I've been complaining each and every time. Even the first time I saw it, I was like, why did you do this?
19:41
But these guys were not responsible, because otherwise we would have chosen something very obvious — somebody talked an hour ago, Tsvetan, and of course Olimex is what we should have done — but it was all set in stone back then already. We have like 60, 70 of these devices laying around. So yeah. Any reason not to use this magic board that they do for linux.conf.au,
20:04
or some Debian conferences? So we are in active communication with almost all these people, like the DebConf people in particular. There are several.
20:22
It's expensive. It's difficult to deploy on such a massive scale, because it needs to be, well, it needs to be tended to like a little child. I'm sorry, but like, yeah. I never used it. I just read about it,
20:42
and was always curious to try it, but I ended up with that HDMI to Ethernet grabber, because it cost like 20 times less than the board. Yes, exactly. Of course, and you're right to use it. So one of the reasons why we made so many compromises
21:00
on so many things that we picked out is ease and speed of deployment. So we get to the venue — well, we have the venue available from noon on Friday, and by the evening, all 30 rooms must be working. And we were pretty good this year; by 17:00 we were done. Exactly. So speed and efficiency and predictability
21:22
is what we're looking for, and that's why we made a lot of compromises. Yeah, I want to stress that I really admire the work that goes into this FPGA-based stuff that mithro and other people are doing — I know, I know.
21:42
So definitely it's a really, really good thing what they are doing, but our use case is a tiny bit special, with the focus on fast deployment, massive scale deployment, and a very, very, very low budget. So yeah. So yeah, this runs the bmd-tools from Timo,
22:03
talks to the H.264 encoder, and uploads the firmware and this and that. It streams over its gigabit Ethernet port, which is usually... are there any A20 devices without gigabit? Does anybody do that? You do that? Okay. Nobody does that at all? Okay.
22:21
And then we needed to set up local backup, because at one point we lost some videos because some disk was gone somewhere a few years ago. So it's being stored in multiple locations: off-site, on-site, and in the box itself. We use an SD card for the file system at this point.
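The "store it in the box as well" idea can be sketched with ffmpeg's tee muxer, which fans one input out to several outputs without re-encoding. This is only an illustration of the concept: the boxes actually run bmd-tools, and the URLs and file names below are placeholders.

```python
def backup_and_stream_cmd(source_url, stream_url, local_path):
    """Assemble an ffmpeg command that duplicates one input to two
    outputs: an upstream stream and a local backup copy. A sketch of
    the multi-location storage idea, not the actual box software
    (the boxes run bmd-tools); all URLs and paths are placeholders."""
    tee_spec = "|".join([
        f"[f=mpegts]{local_path}",   # local copy, e.g. on the box's disk
        f"[f=mpegts]{stream_url}",   # upstream towards the mixing laptops
    ])
    # '-c copy' avoids re-encoding; the tee muxer fans the result out.
    return ["ffmpeg", "-i", source_url, "-c", "copy", "-f", "tee", tee_spec]
```

With something like `backup_and_stream_cmd("tcp://encoder:9999", "udp://mixer:8898", "talk.ts")` you get a command line that keeps a local recording even if the network towards the mixing machines drops.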
22:41
We display the status — it's this black thing you saw in front of the box. It's a status display, we will talk about it a bit later, and it controls the scaler, which is probably the most interesting hack of the last few years, but we will get to that in a bit. And since last year we are running a mainline kernel, because of Paul Kocialkowski,
23:00
because of all the work of the sunxi community; we threw Paul Kocialkowski at it, and he implemented that for us. He was our hardware enablement manager the last two years, and we miss him, and we hope that he comes back next year. Yeah. So this is Gerry's special feature — over to you.
23:24
So the tape, for correctness, the tape is all me, that's here. Here he has created a 3D printed USB holder for this little thing up there, and it goes into the USB socket that you see up there. It's genius.
23:40
But the real genius comes now. Right. So in the first year that we were using the boxes, one of the things that really frustrated me was controlling the scaler. You need to input some settings so that it will output the right resolution and will actually use the HDMI input instead of the VGA input, et cetera, et cetera.
24:01
The scaler being inside the box is hard to reach. I mean, it has basically two control interfaces. It's got four buttons on the front, hardware buttons, and it's got an IR receiver. So first I thought, well, you know, probably the best way to approach this is to solder something to the buttons
24:21
and then make something to control the buttons via the GPIO pins of the Banana Pi. But then, on top of opening all the boxes, that would require a lot of soldering. I would have to take out all of the scalers, do the soldering, do the wiring, et cetera, et cetera. It would not be fun.
24:41
Exactly, it wouldn't be fun. So I thought of a more stupid way of approaching this. And I was thinking maybe I could do something with the IR receiver on the board. Now, one thing about the Banana Pi is that it also has an IR receiver,
25:01
but it's not got an IR transmitter. So that was a bit annoying. First I thought that, well, maybe I could use USB and, you know, control an IR transmitter that way, something, something. But that would require, you know, getting IR transmitters, et cetera, et cetera. I wasn't really looking forward to that. So I thought a bit further
25:21
and figured that an IR signal is actually just a waveform. Now, what on the Banana Pi could emit a waveform easily, without too much fuss? So I built an IR receiver
25:43
that goes into the audio-in of my laptop, and I captured all of the IR signals that were emitted by the remote control. By the actual remote control. You were going through the menus with the remote control. Exactly. And I captured the waveforms in just plain wave files.
26:02
I edited them a bit so that I had the proper signal. Audacity to the rescue. Exactly, I used Audacity for that. And then I had a couple of wave files that I could play on my computer. They would make funny sounds if I played them through my speakers. But if I attached an IR transmitter to my audio out port,
26:20
so to my audio plug, the audio-out jack on my laptop, the IR transmitter would actually emit the right pulses, and I could control the scaler using the sound card in my laptop. And so I opened up all the boxes, installed an IR LED right in front of the IR receiver,
26:43
using just a 3.5 mm jack going to the Banana Pi. And now we can remote-control the scaler just by playing some wave files from the Banana Pi. And I can cycle through the entire menu of the scaler just from the Banana Pi.
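The trick of turning captured IR pulses into an audio file can be sketched roughly like this. Everything here is illustrative: the carrier frequency, the pulse timings, and the filename are my assumptions, not the actual FOSDEM waveforms, which were recorded from the real remote and cleaned up in Audacity.

```python
import struct
import wave

RATE = 44100        # sample rate of the laptop sound card
CARRIER_HZ = 19000  # assumed carrier; real IR remotes use ~38 kHz, which a
                    # sound card cannot reproduce directly, so this is a sketch

def burst(duration_s, on):
    """One mark (carrier on) or space (carrier off) as 16-bit samples."""
    n = int(RATE * duration_s)
    if not on:
        return [0] * n
    period = RATE / CARRIER_HZ
    return [32767 if (i % period) < period / 2 else -32768 for i in range(n)]

def encode_pulses(pulses):
    """pulses: list of (seconds, carrier_on) tuples -> flat sample list."""
    samples = []
    for duration, on in pulses:
        samples.extend(burst(duration, on))
    return samples

# Hypothetical command: a long header mark/space, then eight short pairs.
command = [(0.009, True), (0.0045, False)] + \
          [(0.00056, True), (0.00056, False)] * 8

with wave.open("scaler_cmd.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)
    w.setframerate(RATE)
    data = encode_pulses(command)
    w.writeframes(struct.pack("<%dh" % len(data), *data))
```

Playing the resulting file out of the headphone jack into an IR LED is all the "transmitter" there is.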
27:00
And I can see the output on the stream. Yeah, with a bit of a delay. With a bit of a delay, so it's not perfect. But instead of opening all the boxes and using a screwdriver to punch all the buttons in the right way, I can now do it remotely. So I was happy. So one of the big issues we have with the boxes is this double scaling that we're doing.
27:21
So this is why speakers normally get told to do 4:3, because our scalers are being set by our waveforms to do 4:3 conversions. So if your laptop is running 16:9, then you'll have this first letterbox effect of the... What do you want to say, Kitty?
27:42
It's not entirely true. So our scalers are actually set to... So the output that goes to the encoder is actually 720p. And it will... Exactly. So the problem is, it's a bit complicated to explain. So imagine you have a 4:3 projector.
28:04
So a projector projecting a 4:3 image, which this one isn't. And remember what Luke said at the beginning, that the EDID transmitted to the laptop of the speaker is the first one the splitter picks up. You will probably see the projector there.
28:23
So your laptop will see a 4:3 projector and will adapt its resolution to that. And imagine that you have made 16:9 slides because you thought that this was a modern time and we would all use 16:9. You were wrong.
28:42
So now you have 16:9 slides that you made, outputted at a 4:3 resolution, and then sent into the box. Now, on our end, we want 720p. So our scaler will convert your 4:3 signal, which already has two black borders added by your laptop. Stop here. Yeah, so that's the second one.
29:02
And it will add two borders on the side again. So now we have a double letterbox effect. So if you see on the stream that there's lots of black borders around it, that is the reason why that is. And it's a bit silly, but it is the way it is. We can't change it in this design.
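The double letterbox works out with a few lines of arithmetic. The concrete resolutions here (1920x1080 slides, a 1024x768 projector mode, a 1280x720 stream) are illustrative picks of mine, not necessarily what any given room negotiates:

```python
def letterbox(src_w, src_h, dst_w, dst_h):
    """Fit src into dst preserving aspect ratio;
    return (fitted_w, fitted_h, border_x, border_y)."""
    scale = min(dst_w / src_w, dst_h / src_h)
    w, h = round(src_w * scale), round(src_h * scale)
    return w, h, (dst_w - w) // 2, (dst_h - h) // 2

# Step 1: the laptop fits the 16:9 slides into the 4:3 mode it picked.
w1, h1, bx1, by1 = letterbox(1920, 1080, 1024, 768)
# Step 2: the scaler fits that 4:3 signal into the 720p stream.
w2, h2, bx2, by2 = letterbox(1024, 768, 1280, 720)

print(w1, h1, by1)  # 1024 576 96  -> black bars top and bottom
print(w2, h2, bx2)  # 960 720 160  -> black bars left and right, on top
```

The slide content ends up at 960x540 inside a 1280x720 frame, framed by borders on all four sides.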
29:20
So every November there is a video hackathon going on in the Leuven office, where all the boxes usually live most of the year. And last year we started chatting a bit. We had both seen this Adafruit adapter that is built for HDMI-to-LCD conversion, so that you could use an HDMI connection
29:41
for a Raspberry Pi, for instance, because the Raspberry Pi doesn't have much else, and then drive an LCD with that. Well, what is coming out of that is a parallel signal. It's parallel RGB with the two syncs and a display enable and maybe a backlight enable. But this is usually also a signal that you can take in on any SoC
30:01
that has a camera interface that is a bit wider than anything else. And this is how we got thinking, back in November 2018, about maybe we can do something else. What if we feed this directly into the SoC? In that case, we will be able to drive this projector directly and do all sorts of fancy things, get rid of all the borders.
30:22
We can provide our own EDID so that everybody uses the exact same resolution. If we get an onboard H.264 encoder, then we have that problem solved as well. If we then use open-source hardware, like for instance the Olimex stuff, then we have open-source hardware,
30:40
mainline support comes with that, and we can make it small and cheap. And we get all our wishes granted. So that was the state last year. Last year, we had a few chats here. I had a chat with Tsvetan, with Vasil, with you guys about, we should do this, we should do this. And the lucky thing was that on Monday morning every year,
31:04
me and Egbert Eich and Uwe Bonnes go for the, we've been thrown out of the hotel, let's have breakfast. And I started explaining what we had been talking about all weekend. And Uwe, his face lit up, and he said, well, I work at the Darmstadt University. I do electronics there. I could prototype this for you.
31:22
I could design this for you. Because this was the only thing that we couldn't solve before that. Because drivers and stuff, that's what I usually do, and I could try to marshal in other people from the sunxi project. But Uwe said, I could do that. And Uwe is actually here, and he's been designing our board. So please put your hands together for Uwe.
31:42
So he solved the last question we had then. And ever since then, it's just: can the hardware do it, and can we get the right resolution? The design resolution is 720p because, well, we don't need that much more. We could maybe do it in full HD, but 720p is bad enough, and people are already abusing the many pixels that they have,
32:02
because not every person develops slides that can be read in the back of the room. So the SoC is naturally an Allwinner A20, because it's probably one of the most open-source-friendly chips out there, even though Allwinner isn't always being that open-source friendly. It's one of the chips with the fewest restrictions,
32:21
no DRM stuff. There are almost no binary blobs around anymore, but most of you will know that if you're in this room. We ended up choosing the Olimex Lime2, but there will be a separate slide about that. And we first thought about the TFP401 from Texas Instruments, which Adafruit sells as a module.
32:40
But the issue with that is that it's actually a DVI-to-parallel decoder. So no audio. So no audio. And we thought that we could maybe use a splitter or something and then use one of those HDMI-to-I2S decoders. I2S is the digital sound interface that Philips developed ages ago, but it turns out that these are just abusing the cabling of HDMI
33:03
and doing nothing else. So we looked a bit further, and then we found the ADV7611 from Analog Devices. And we had a chat at Embedded World as well with some Analog Devices guy in there. And apparently Analog Devices is one of the most open source friendly makers of these sort of chips.
33:21
As an old-time display driver developer, I can say that wasn't the case in the early 2000s. So things have changed there. Now, the second we found out that we wanted to use this chip, Google turned up something. Something quite interesting. Oh, yeah. Me and my funny slides. So this is where the plan got together.
33:43
So the Lime2 is what we're using, of course. It's sitting right here in front. You all saw it just now. Luckily, it exposes almost all the pins on its tiny connectors, which are a bit difficult to work with, but are also very stable. So when you have the daughterboard attached to this,
34:01
this will never, ever come loose by itself. So it's a bit of a disadvantage, and it's a massive advantage at the same time. And we only had to swap the backlight and display-enable signals for the LCD with something else so that we could have full 24-bit RGB capture on the Lime2.
34:24
So this is the board that we're designing our first-generation hardware around. So when I googled for the ADV7611, we found this, and this happened on the sunxi mailing list, just at a time when I was no longer doing that much sunxi.
34:40
So in the second half of 2014 is when I wound down there a bit. These guys, Jörg Höttinger, Jörg Löpich, and Gabriel Lucas, were designing pretty much what we were trying to do as well. They were still using an A10 Lime, and they were having issues with getting the signal right,
35:01
but that's something for slightly later in this talk. And they ended up kind of losing steam. They had other things in their lives happening, and it kind of died down. But the second I sent them an email, I got an email back, and a week later, we had the test chips that they got from Analog Devices, and we have one of their boards as well, which is in my backpack down there,
35:22
because I wasn't prepared enough. So we have this here as well. This is the exact board that I photographed this morning. At some point, they will get this back, but this is how friendly they are, and we will be reusing their open-source hardware design for our future designs as well. We will just adapt it to our own needs. So the work they've done, we will continue with that, and we will finish it.
35:45
So our plan: low cost, the hardware design. So yeah, you can all read. And I just had to put the word chockablock in there. Just look at the brick: that's where we want to go. We want to have it brick-sized. We want to be able to send people
36:02
to every building over the whole campus with a backpack with at least two devices in there, and have all the rooms have two devices as well, so that people are no longer running around with one of the two. We have two spare boxes, and thanks to this one, we have three spare boxes, and the rest is being spread around. And if something breaks, then there is one guy running with one of those boxes all across the campus.
36:22
So that is usually not a nice thing to have. It's not easy taking some of these into your hand luggage on the plane, you know? So this would already be big for what we want to have. So if it's already this size, we would be very happy, but we can go a lot smaller than that. And then it's just two in a backpack
36:41
and then send off a video guy, and we don't have to physically see them again the whole day anymore. I mean, we just talk on IRC. That doesn't mean they can lose the screwdrivers and the cables and the clippers and whatever, because you still need to fix these things when something gets loose inside. And if there are that many boxes spread around,
37:03
then you just have enough in your backpack. Then there are no more worries. You just swap one for the other. So the big advantage, for me as a display driver guy, is that we can drive the projector correctly. So this thing is what I photographed this morning when it was still dark outside.
37:21
It's the only box in the whole of FOSDEM that, when there's no cable attached, actually handles it properly. Otherwise, it's the scaler saying "I have no signal". So we can use the native resolution, keep the aspect ratio as it is, and we can do very interesting things, because we have so many display layers going on here
37:42
that this is now just a PNG shown on one layer, but we can make it all dynamic so that it could show: the next talk is coming up, it's going to be that. We could, for instance, down here, show this many minutes left, so that the speaker can immediately see it, or that time's up. If zombies attack, then we can also put that on there, or stuff like FOSDEM's over.
38:03
FOSDEM's over, everybody leave, or clean up, and we can control all of that remotely. So it will solve at least the projector side of things as well and make this whole thing a whole lot nicer to look at. Then we get full control over the MPEG-4 encoding.
38:20
Yeah, we're not tied to AAC, because the CCC guys, they're probably in here as well, they want to move to something other than AAC. We don't care. The hardware encoder does H.264. Whatever it's packaged in, it doesn't care. That's something that we do through probably FFmpeg or something like that, but we're far away from, well, a bit away from that still. This is another display thing that's going on.
38:42
On this side, you see what's happening for each of the standard FOSDEM boxes. Let me just use the cursor, if you want to wake up. This is all dynamic. It changes every second. Basically, as fast as the FFmpeg process on the Banana Pi
39:00
can grab a frame from the live stream, render it to a PNG on the CPU, and then show it on the screen. It's dynamic, and it updates every five-ish seconds maybe. When it doesn't have an IP connection, it will show that as well. On this side, this is just me showing off what we can do with our future project,
39:22
because all of this is just a PNG to show off what's going on. The whole display engine has a dark background; black, it has a background color. Then it has a 40-percent-opacity, real-time, full-RGB 720p rendering at 60 frames per second.
39:43
This is actually eating quite a lot of memory bandwidth, because we're scaling down and we're thrashing the cache all the time, but this is real-time. This is 60 frames per second instead of one per second. These things are just PNGs, and this is also two PNGs being shown there. There are four layers happening there already,
40:01
and we can do a lot more. We can draw individual lines and individual bits of framebuffer and then show them, which will make it a lot faster. We are using very little CPU with this. We're not doing anything with it. It's basically a few percent. The last one, and this is the last one we will solve for our first-generation hardware,
40:22
is that in this room it's okay because we only have one microphone. The microphone is attached to the camera. If you have a handheld microphone, it will have another receiver as well, and then they have to be mixed together, which they usually do in front, because then they can drive the speakers
40:42
that are in there or in the bigger rooms, and then there is a big XLR cable going all the way back, just taking the exact same route as the Ethernet cable, and it's always a bit of fun and games dragging these XLR cables across the campus and then putting them in each and every room, and we want to get rid of that in time as well.
41:01
If we can get the latency down, then we might be able to have the box in the back also take two microphone channels and then send them to the front and out over the speakers as well. But wait and see if we ever get there. Well, we will get there; the question is when. This is the board that,
41:22
so last May we had a meeting in Frankfurt, which is just about halfway for all of us. So it was us two, me and Uwe, who lives just, what, 20 kilometers away or something like that. And then we chatted a bit about what we're really going to do and how we're going to attack this, and in the end we decided to go back
41:42
to the Adafruit TFP401 module, the DVI decoder, as something that would give us hardware quickly so that I could start developing the software for this. So Uwe designed this in the space of, what, a week and a half or so? It's just a connector, there are a few resistors in there,
42:02
and then one voltage shifter for VGA probably, but we're not using the VGA yet. No, I think it's totally passive. It's totally passive, okay. So you had some space left, and that's why you added the JTAG adapter. I first tried to use the CSI-0 connection.
42:23
It's not MIPI CSI, it's the Chinese version of it, so it's still old-school parallel. It's called CMOS sensor interface. Thank you, Allwinner. And this was a dead end, so I spent three weeks trying to get that to work and then cleaning up the register tables and then thinking, no, I'm not going to use that.
42:42
It's just 12 registers. I've dealt with a few more, so let's just write it from scratch and see where we are. I will at some point have to feed this back to the existing CSI driver, but I needed to have this thing working faster to verify what we were doing. It also has an LCD connector.
43:00
We have the VGA connector, so we have VGA out for free. So why not just implement that as well? One thing that hasn't happened yet: nobody has written the software for that. Maybe it's something for future development; it's not that important. The Lime2 provides the HDMI output, the Adafruit adapter the input. Oh, in that case, let's move to the next slide.
43:22
It's this bit on the side, and this is what we've been running all weekend. Now you can tell that we've been in this room. Yeah, only in this room. There's only one setup that we have running. We have seven of these boards, seven of these setups implemented. There are two in Leuven. Uwe, you have one.
43:40
I have two, and then Vasil and Marjan have one each as well. So we are running this, and this thing is very stupid. You can't talk to it, the DVI decoder. And it doesn't deal well with quarter-megahertz dot-clock resolutions, and you can tell, because the first four pixels up there are a bit late.
44:03
And our capture engine is a bit weird. It doesn't have a display enable. It needs to be told when the syncs happen, and it needs to know: this many pulses after the horizontal or vertical sync is when we start capturing. And because this module is a bit stupid, the first line is always messed up.
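That "this many pulses after the sync" configuration is just modeline arithmetic. A sketch using the standard CEA-861 720p60 timing; the constant names are mine, not the driver's:

```python
# CEA-861 1280x720p60 modeline figures (active, front porch, sync, back porch)
H_ACTIVE, H_FRONT, H_SYNC, H_BACK = 1280, 110, 40, 220
V_ACTIVE, V_FRONT, V_SYNC, V_BACK = 720, 5, 5, 20

h_total = H_ACTIVE + H_FRONT + H_SYNC + H_BACK  # 1650 pixel clocks per line
v_total = V_ACTIVE + V_FRONT + V_SYNC + V_BACK  # 750 lines per frame

# With no display-enable pin, the capture engine has to be programmed with
# "start sampling this long after each sync pulse begins":
h_start = H_SYNC + H_BACK  # 260 pixel clocks after hsync
v_start = V_SYNC + V_BACK  # 25 lines after vsync

print(h_total, v_total, h_start, v_start)  # 1650 750 260 25
```

Get those offsets wrong by even one clock and the capture shifts by a pixel, which is exactly the kind of first-line glitch being described here.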
44:22
I've had this running for a full day, capturing full HD at 60 frames per second. I think it was like 14 terabytes of data that I tested, with special overlays that I developed so that I could figure out, here and in the other corners, whether all the pixels were correct. And we lost 400 pixels, always that first pixel up there,
44:44
over millions of frames. And that's over a flat cable like this as well. So if we move to the ADV, which we can tell to boost the signal, and if we have this all on the same board, we will not lose a single pixel at full HD resolution.
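A rough sanity check on those numbers, with my own back-of-envelope figures:

```python
frame_bytes = 1920 * 1080 * 3  # one 24-bit RGB full-HD frame, ~6.2 MB
captured_bytes = 14e12         # the ~14 TB of test data quoted in the talk

frames = captured_bytes / frame_bytes
hours = frames / (60 * 3600)   # at 60 frames per second

print(f"{frames / 1e6:.1f} million frames over {hours:.1f} hours")
# 400 bad pixels over a couple of million frames really is a
# one-pixel-in-millions-of-frames error rate
```

So "millions of frames" checks out: 14 TB of raw 24-bit 1080p60 is on the order of two million frames, roughly a working day of capture.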
45:03
The issue we do have is that the Allwinner A20 is pretty limited in memory and bus bandwidth. And we run into that mainly because we're using the scaler badly on the status screen. So for the upcoming design we will reuse the work that Jörg and Gabriel did
45:24
and add our own connectors to that. So the LCD connector we want to keep because status is very important. VGA is nice to have, and we've mostly implemented it already. In hardware it's an easy thing to do. VGA? Okay.
45:42
Software will be a bit more work, and we will also have to build the audio breakout connector. So this is the whole pipeline we're using now. One other interesting thing about the CSI-1 engine, the capture engine, is that it doesn't do proper RGB as we know it.
46:00
It does planar, so we have a separate plane for each channel: a separate bit of memory for red, a separate bit of memory for green, a separate bit of memory for blue. Nobody does that. I think yesterday somebody said, it was Daniel or something, that Amigas used to do it, but Amigas were a few years ago.
46:21
So, what's next? Okay. So V4L2 doesn't know about it, and DRM doesn't know about it either. One interesting thing is that no display engine can usually handle that, but if you have YUV, you have color-space conversion, and you have a three-by-three matrix.
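The matrix trick can be shown in miniature: run planar R, G, B through a 3x3 "color-space conversion" whose matrix is the identity, and the pixels come out untouched, so the YUV path doubles as a planar-RGB path. This is a pure-Python sketch of the idea, not driver code:

```python
def csc(planes, matrix):
    """Apply a 3x3 matrix per pixel across three separate planes,
    the way a YUV->RGB color-space converter would."""
    p0, p1, p2 = planes
    out = ([], [], [])
    for px in zip(p0, p1, p2):
        for row, dst in zip(matrix, out):
            dst.append(sum(c * v for c, v in zip(row, px)))
    return out

identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

# Three tiny "planes": separate memory for red, green, and blue.
planes = ([10, 20], [30, 40], [50, 60])
print(csc(planes, identity))  # -> ([10, 20], [30, 40], [50, 60])
```

With the identity matrix loaded, the hardware "converts" planar RGB to interleaved RGB without actually changing any values.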
46:40
If you put a unity matrix in there, you have the exact same result. So that's what we do there. Then I've been having tons of fun and games with the display engine. We've been driving this poor status LCD for three months at 154 hertz because the clocks of this pipeline
47:02
and this pipeline ended up being tied together. One changed and then ruined the clock for the other, so I've been fixing that sort of stuff. I've been adding overlays, more overlays than KMS can handle. KMS has a limit, because it was originally,
47:20
well, planes for KMS were originally made for Intel, at the time when Sandy Bridge happened. And at the time of Sandy Bridge, they had finally gotten to the stage of: we can do it all in the 3D engine, and we had one or two overlays left, or three, something like that, very, very few. Back in that day, I was actually consulting for them for a bit,
47:42
and I had to go and get, in 2011, an older EeePC, to be able to get an older display engine where I could play with planes a bit more, to implement one version of Hardware Composer that has fallen by the wayside in the meantime. But yeah, each of these display pipelines
48:02
supports 37 overlays, and KMS supports 32 in total for the whole driver. So we have four base layers, we have a hardware cursor, we have a background color, and we have sprites, which are not scalable and are a single color format, but it's like 2K by 2K,
48:21
and then it's just a memory bandwidth issue, and then we can play all sorts of display games. Now for the encode pipeline: since we have this really weird color format, we have to tie a few engines together. So the G2D part is the one that will convert from R8G8B8 to NV12,
48:42
which has half-resolution chroma, so 12 bits per pixel. It's YUV 4:2:0, in case anybody knows that sort of stuff, or wants to know that sort of stuff. In between, there's also a part of the encode engine, the media ISP, and it can't handle the resolution. What it can do is thumbnailing,
49:01
so it will take one picture and then rescale it. I haven't gotten it to work yet for any scaling, but that's what we hopefully can get to work and make use of for our status display, so that we're no longer thrashing the bus. And for the H.264 encode, I have been doing a bit of archaeology. By this time it was already almost October, November.
49:21
I got Jens Kuske from the linux-sunxi team, his old code from 2014, and all the various forks that removed the copyright and removed those commits. I got them all figured out and did a bit of archaeology, because I like those guys. And this is how open source actually should work. And in the end, I moved all the code
49:41
from user space into the kernel, and we are hitting the limits with the bus being thrashed. For a 40-second file, we require 20 seconds to do the encoding at 720p, so we're at twice real-time. If we stop thrashing the bus, we are at four times real-time,
50:00
so we have space left to maybe, if we stop thrashing the bus, do 1080p, but I can't promise this at this point. But all of these things have been verified, so it's time to move on. All of these engines, the fact that the capture could do this resolution with a parallel connection, the fact that the MPEG encoder could do,
50:22
the H.264 encoder could do this, all of this were still unknowns, and we've now tested it all, and it's all been working pretty well. So we are moving forward, slowly. So, the work done. I think I went over most of the things already. There's a tool that is written specifically for this, using KMS and V4L2.
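Backing up to the encoder numbers for a second, the headroom estimate works out like this. The factor-of-two gain from fixing the bus thrashing is the talk's own estimate, not a measurement of mine:

```python
clip_seconds = 40    # length of the test file
encode_seconds = 20  # time the A20's H.264 engine needed at 720p

speed_720p = clip_seconds / encode_seconds  # 2.0x real time today
speed_fixed = 2 * speed_720p                # ~4x once the bus isn't thrashed

# 1080p has 2.25x the pixels of 720p, so the projected 1080p margin is thin:
pixel_ratio = (1920 * 1080) / (1280 * 720)
print(speed_fixed / pixel_ratio)            # ~1.78x: plausible, not promised
```

Which is exactly why 1080p is "maybe": even the optimistic four-times figure leaves less than 2x margin once you scale the pixel count up.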
50:42
It's now heavily threaded, which will change, but it's just keeping all the buffers in flight. And we have so many things happening on overlays, we have all the data coming in and going to different places, that I don't think anybody on this planet will ever be able to write a GStreamer pipeline for that.
51:01
We can't even debug standard GStreamer pipelines; this one would be something special. No, so this will probably be a custom tool that does all sorts of stuff that is very specific to conferences, but which will show anybody else who wants to play with it how to do this. Also, this tool is being built up at the same time that the drivers are being built up
51:22
and verified, so these things are growing together, which is what it is. Yeah, so high memory bandwidth is our biggest problem. So what we're going to do now: I need to go test the HDMI active signal. At this point, whenever there are no frames coming in, because my code is that bad,
51:42
the threading code, if it doesn't see frames, then it waits for 1/60th of a second, and if it still doesn't see any frames, then it will say, oh, let's show this other image, the one that you see when the speaker unplugs. And this is six months of work to have this one feature here. That's a nice thing, right?
52:01
So six months of work for that. So I need to verify I2S coming from the ADV, from the HDMI decoder, and verify the audio, and then Uwe can go and finish the design of the actual board and start prototyping it and getting it made. And then there's a few other bits and bobs
52:21
that I need to go do. There's a lot of work ahead still. But everything is looking like it will work. Everything that we've wanted it to do, it has done so far. And it's just a matter of doing it, and we are doing it. So the future, with the external audio connectors that we will have,
52:41
is that we will have balanced audio over XLR, so that we can use the standard microphones and this and that and plug them in directly. We will need to have phantom power for that, which is going to be a bit of an issue, going from five volts to that, but that's what Uwe can work on after he's done with the first thing. One other thing we want to do is the separate switch that we have.
53:01
Oh, apparently time's up. Okay. Yeah. Then let's have the next talk. So the deployments, very quickly. This is how we're going to deploy this year. We still haven't talked to the FrOSCon guys. We've been too busy. Maybe we should have done that before. Yeah, yeah, yeah. But the CCC guys are doing most of FrOSCon, and the dev rooms are not being recorded,
53:21
and we will try to take that over and test our hardware. Then we will do OpenFest, because we're already involved with that anyway. Then, a year from now, there is this conference that nobody has heard of where we might want to roll this out. By the way, the dates you saw there are not confirmed, so don't book anything. No, no. It's just me guessing.
53:42
Here are some URLs, so that they're in the slides. Question time. I'm really sorry, but questions will have to go outside.