
Blender projects for 2020


Formal Metadata

Title
Blender projects for 2020
Number of Parts
490
License
CC Attribution 2.0 Belgium:
You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.

Content Metadata

Abstract
An in-depth look at the development process of virtual reality in Blender. This project started in 2014 and illustrates well the development channels, how Blender does onboarding, how the development team collaborates with other interested parties, and the role the community plays in the projects.
Transcript: English (auto-generated)
My name is Dalai. Nowadays I work as development coordinator for Blender, in Amsterdam. I started coding for Blender 11 years ago, when I got my first patch into Blender.
And although it would be fascinating to talk about the Blender projects for 2020, you can find all of those in our official communication channels, which a lot of you probably don't follow. So I recommend going to code.blender.org; it's our main communication page from developers to developers. We just recently posted something about the main ten projects for this year.
But for this moment here, I think it's more interesting to set all those projects aside for a bit; you can read for yourselves how they are going to affect end users and what they will bring to Blender. Instead, let's talk about one particular project, which for me is a pet project: an area of technology in Blender I'm quite passionate about, virtual reality. Virtual reality, or augmented reality, or XR, which is an umbrella term for all of those.
And through this project, I'll try to illustrate a little bit how we get new developers into Blender, how we get new features into Blender, how our communication works and where it happens, and how it is to be transparent, open, and big at the same time. So let's start our journey. In 2014, we started to play with VR in Blender.
This was the Gooseberry project; at the time, one of the open movies we make at Blender was starting. Who here knows about the open movies? Okay, good answer. They're open projects made by the Blender Institute, the Blender Animation Studio.
And I was playing with one of their files and started to render stereoscopic VR. Basically, you render two panoramas, you put on an Oculus, and you can look around. That was 2014, about six years ago. It was actually really well received. At the time, the mother of the director of Gooseberry, of Cosmos Laundromat, was there, and she was fascinated: oh my god.
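The two-panorama trick he describes comes down to an equirectangular projection rendered once per eye, with the camera origin shifted by roughly the interocular distance. A minimal sketch of that mapping in plain Python; the helper names are mine, and this is not Blender's actual renderer code:

```python
import math

def equirect_to_direction(u, v):
    """Map normalized equirectangular image coordinates (u, v in [0, 1])
    to a unit view direction (x, y, z). The image center looks down -Z."""
    lon = (u - 0.5) * 2.0 * math.pi   # longitude: -pi .. pi
    lat = (0.5 - v) * math.pi         # latitude:  pi/2 .. -pi/2
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = -math.cos(lat) * math.cos(lon)
    return (x, y, z)

def stereo_eye_origins(interocular=0.065):
    """Camera origins for the left/right eye panoramas, in meters.
    0.065 m is a commonly used average interocular distance."""
    half = interocular / 2.0
    return {"left": (-half, 0.0, 0.0), "right": (half, 0.0, 0.0)}
```

Rendering the scene once from each origin, over all (u, v), yields the two panoramas a headset then displays per eye.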
And for me, it was special because it did not start as a feature for VR. It started as something for fulldome rendering, which is a different kind of technology. I started working on it as a contributor, as a volunteer, and at some point it made it into mainline Blender.
In 2016, we said, okay, if you can render in VR (and what if you could give a headset to Ton as well? He'd be very thankful), what if you could also author in VR? What if you could experience and see how the project is going before you go all the way to the final rendering? So we started to experiment with storyboarding in Blender, putting on one of those headsets and looking around. At that time, it was a plugin using the Oculus SDK.
It was totally non-GPL-compatible, totally experimental, non-official. But it got things going: the technology was very interesting, and we started to get some traction. In 2016, the Blender Animation Studio was making Caminandes, one of their open productions. And the question came up: what if we partnered with Google? Google at the time was promoting their Cardboard headsets a lot, and they wanted content, and they wanted authoring tools. DCCs, they called them: Digital Content Creation tools. And Blender, as a big umbrella, was never afraid of partnering with someone as big as Google, as long as it was of common interest. There was nothing there violating the openness or the freedom; quite the other way around. It allowed Blender to move forward a little bit with its VR agenda and to produce some really interesting content.
At the time, they were using OpenHMD, a reverse-engineered library providing HMD/VR support for Blender on Linux. Why am I saying that? Because one of the experiments I was showing used the official Oculus SDK, and the other used reverse-engineered Linux support. But how does it become mainstream? How can we take something like this and get it into Blender? The interesting thing is, there were a lot of people interested in that. The project I was showing here was something I was developing personally, myself,
in a research facility in Brazil, because people from Oculus Story Studio, a small group inside Oculus (the company had been bought by Facebook), wanted to experiment with storyboarding for VR experiences. The alternative was to draw something, animate it, compile the game, see how it looks, and then do it over and over again. Whereas if the same tool is the authoring tool for the game, it can also be used to preview the game. It's actually very impressive. But we also had people like the MPX project, people who were already involved in Blender. Who here knows what the grease pencil is in Blender?
Grease pencil is what allows people to do those 2D drawings in a 3D environment. And the whole grease pencil team is actually composed of people who are not on the payroll of the Blender Foundation; they're contributors, and they've been involved for a few years already. And they went to the next level and thought: what if we could also draw in VR? You could be working in Blender, put on the Oculus, work a little bit, and take it off again. That was the whole idea. And what they wanted was a seamless integration.
They wanted to not have to learn a new UI, not have to learn a new way to interact. So, for example, they wanted the whole Blender interface inside the VR space. And they made it: they had someone hacking their way around Blender, and it was a really nice prototype. We also had people like BlenderXR, from a company called MARUI. They developed a VR plugin for Maya, basically a plugin where you could do all sorts of operations immersed in VR. So it's only UI, it's only UX. And they wanted to start supporting Blender. But they had the same problem we had in the other projects: how can we do something that's compatible with the GPL,
with Blender's license, and at the same time compatible with the industry SDKs for the Oculus, or the Vive, or the Microsoft HoloLens? An interesting problem. BlenderFX is a group in Germany that has also been involved with Blender development as contributors: users, artists who use Blender, giving us feedback, sitting together with developers to think about the features. They were taking some of the open movies and re-rendering them as panoramas, with a small TV that you see while immersed, playing the video. And they were also interested in using Blender in VR for scene inspection. They were doing architecture reconstruction,
wanting to be able to walk around and look around before rendering the final thing. They wanted to use Blender; they are using Blender. And, surprisingly, Ubisoft. More recently, they joined the Blender Development Fund. If you watched Ton's talk, he probably covered most of the history of how the funding happened for Blender and where we are today. But basically, Ubisoft not only joined the Development Fund, giving Blender money, but also promised to allocate some development time from their own team, Ubisoft France, to actually implement features in Blender. And they are particularly interested in VR because they want tools for set dressing. So you have your set, and you want to place different furniture, or pebbles, or stones. Or, as a director, you want to see how a shot is going to look before you render it out. Since everything nowadays goes to the computer, goes to 3D, Maestro also has authoring tools for the director to be immersed.
So how do we consolidate all those different tools? And what's the role of the foundation in all of that? Because if you think about it, the foundation can only grow to a certain point. The fundamental role of the Blender Foundation, I would say, is to make sure the collaboration can happen: to provide the infrastructure, to provide the onboarding, and to make sure everyone can work together.
Luckily for everyone involved, last year, in June, came the first release of OpenXR 1.0. OpenXR is a standard by Khronos, the same group behind Vulkan and behind OpenGL, all those standards from Khronos we got used to. And they are trying to unify this whole ecosystem for VR. Before, every time a piece of software wanted to support either the Oculus, or the Vive, or the Microsoft HoloLens, it needed to support their SDK, it needed to be compatible with their SDK, yada yada.
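That per-vendor SDK coupling is exactly what an abstraction layer removes: the application codes against one interface, and each runtime plugs in behind it. An illustrative pure-Python sketch of the idea, with hypothetical class names; the real OpenXR is a C API and looks nothing like this:

```python
class XRRuntime:
    """Stand-in for any vendor VR runtime behind a common interface."""
    name = "generic"

    def head_pose(self):
        """Return the tracked head position (x, y, z) in meters."""
        raise NotImplementedError


class OculusRuntime(XRRuntime):
    name = "oculus"

    def head_pose(self):
        return (0.0, 1.7, 0.0)  # dummy value; a real runtime reads the HMD


class MonadoRuntime(XRRuntime):
    # Monado is the open-source OpenXR runtime typically used on Linux.
    name = "monado"

    def head_pose(self):
        return (0.0, 1.6, 0.0)  # dummy value


def render_viewpoint(runtime: XRRuntime) -> str:
    """The application depends only on XRRuntime, never on a vendor SDK."""
    x, y, z = runtime.head_pose()
    return f"render from {runtime.name} at ({x}, {y}, {z})"
```

Swapping Oculus for Monado then touches no application code, which is the whole point of the standard.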
OpenXR creates a whole abstraction layer where Blender only needs to worry about OpenXR. And the same way OpenGL is integrated at a low level in the operating system, it's license-compatible. I don't know the details; it shouldn't matter. But it allowed the Blender Foundation to say,
you know what, we can now officially help bring VR into Blender. So last year we had Julian Eisel participate in the Google Summer of Code; we still use Google Summer of Code to bring new developers on board. And he got quite far: whole scene inspection working. This is, again, one of the open movies we had at Blender Animation Studio, and he was using OpenXR to do the whole thing, only on Windows, I believe, because on Linux they still didn't have head tracking. But we said, you know what, we can, as the Blender project, support the basics of VR.
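In released Blender this later shipped as the "VR Scene Inspection" add-on (Blender 2.83 and later), where starting and stopping the OpenXR session is a single operator call. A guarded sketch, since the `bpy` module only exists inside a running Blender:

```python
try:
    import bpy  # only available inside Blender's embedded Python
    IN_BLENDER = True
except ImportError:
    IN_BLENDER = False


def toggle_vr_session():
    """Start or stop Blender's OpenXR session. Requires an active OpenXR
    runtime on the system (e.g. Oculus, SteamVR, WMR, or Monado) and the
    VR Scene Inspection add-on enabled."""
    if not IN_BLENDER:
        raise RuntimeError("toggle_vr_session() must run inside Blender")
    bpy.ops.wm.xr_session_toggle()
```

The add-on is enabled under Edit, Preferences, Add-ons, by searching for "VR Scene Inspection"; the headset then mirrors the active 3D viewport.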
VR is a very niche area, and it would be a bit overkill to dedicate (we have the whole Dev Fund, right?) core money to chasing feature after feature, trying to get a really smooth VR experience. Who here has a VR headset? Oh my god, that's actually a lot of people. We are a very biased sub-sample. But it's a very interesting topic, and we developers, geeks, tinkerers, we like to think about that. So the idea was to at least get the basics running, the fundamentals.
Julian had actually been a contributor to Blender for years already, about six years, on and off around the Blender project. He participated in two Google Summer of Codes; his first commit was in 2014. And after the Summer of Code, since it was a nice project and Julian was available, we said: you know what, come on board. We had enough funding at the time to have him working full-time. And he's not there only to do VR; he also works on the user interface and the basics, on fixing, triaging, everything else. But that's also part of the process here: it is working well, so let's get him involved,
and let's commit, on both sides, to continue this relationship. Of course, again, we don't do anything only by ourselves, and for ourselves. So for instance, at the Blender Conference we actually got a small representation of those groups I was talking about: people from BlenderFX, Daniel from MPX, people from Ubisoft. We could then sit together and say: okay, what can the foundation do, and what can everyone else do? So we agreed that the first two milestones are on the foundation: basic OpenXR support with basic scene inspection, and an API for drawing in VR. But what's going to be the experience, the usability in VR? No one knows. It's going to take a few years until that's consolidated. So Daniel is going to lead a third milestone toward drawing and sculpting interaction.
MARUI, which is not represented here, will make sure their plugin can run on top of whatever API we come up with. And then you have Ubisoft keeping us in check, because at some point we're going to say: hey, the basics are there, go have fun, and give code back. Give code back. But we cannot have people gathering physically all the time.
It's not practical. And at the same time, Blender, as a project, has how many years? 20, 25? Way too many years. 18 years since it's been open source, but it's been online since 1998. And the whole communication infrastructure of Blender was built, mainly at least since 2002, on top of IRC, which in a way didn't age so well when it comes to competing with Twitter, with Facebook; people are going to use the channels they're used to using. So we are trying to modernize a little bit. This is a Rocket.Chat-based alternative to IRC: the website is blender.chat, and anyone can go there. It's where developers now talk among themselves.
It's where we work. As an example, we even have a whole VR channel, dedicated only to that topic. And it's really the place where people are supposed to work. We have devtalk, which is a Discourse-based website, where, again, we try to separate user feedback and usability from general development. But the idea is that the cycles module, the grease pencil module, the VR group, should be able to use that among themselves, and everyone can read, can follow, and can, in a way, interact. We keep everything open and transparent. Every Monday, we have a development meeting,
and we keep everything there, posted: what happened, what didn't happen, the progress reports of every single developer. Jesus Christ, shame on me, there's only one more slide. And, as mentioned before, we have a nice blog, where we encourage not only our own people to post. It's not a place for PR; we're not here to talk about marketing. It's a place where people can share development. Take the grease pencil team, which works from Spain and Argentina: we kind of keep tabs on their work and have some collaboration, and they can go there and own that space. The Blender infrastructure is a place for everyone
who contributes to the Blender project to really take ownership of. And, of course, we also use this for outreach to the community, using YouTube, using Twitter. So, overall, we are in a bit of a transition, going from a tradition of IRC, and mailing lists, and lonely developers working on their own little projects, to trying to be bigger and to get more people to collaborate. So, basically, we hope this gave you an insight into the project, and that more people can join us. Thank you.
I also believe we might have five minutes for questions. So, if anyone has any questions or comments. Were you here from the beginning? In the beginning, sorry.
More or less. Maybe the first Blender one or so, yeah. No, that wasn't fun. You made us talk later. If you go to code.blender.org, we posted this last Monday: there's an overview of the ten big core projects. There's even more than that, but those are the projects that are everyone's responsibility. Every single developer in the core team is responsible for them. The criteria we didn't put there, but it's basically: if a developer goes away while leading one of those projects, someone would fill in for them. Whereas if the VR developer goes away, maybe we won't have VR for another three years. Yeah, that's the reality, right? Those are the core projects we're going to make sure are delivered. I see a hand here. When you actually use VR, and you see the same thing on your screen and then in VR, it's actually really different.
I mean, the feeling of seeing the models is really different from the screen. So, exactly how does VR help productivity? Is it just for making VR content, for example, using VR to help you make a movie later? I'm just curious how VR helps productivity, because I would assume that if I make something on the screen, people will look at it on the screen, but if you make it in VR, then it's probably going to look different on the screen. So my question is how... Could everyone hear it? Let me repeat the question: what's the point of VR if you're not only making VR content?
If you're making VR content, it's very obvious, of course. But again, Ubisoft, for example: when they're immersed, they can still have a traditional virtual camera there, like a preview, in the virtual set. If you're doing architecture modeling (that's more AR), you are here in Blender, you put on your glasses, and while modeling this lecture room you can see the whole lecture room, and then go back. If you're doing character modeling, you've probably seen those making-ofs from Disney with the sculpted characters; you can also use this to inspect them. I'm totally biased, but for sculpting: can I actually sculpt in a more natural medium, right? Where you actually see and move around and touch. But the real answer is: no one knows. We're willing to give the technology a try as long as there is this other part, the community, embracing it as well. That's kind of the process.
We need the creative people to come on board too. Maybe there's a good middle ground, a good compromise between digital and analog. Do you have a question?
For 2D? You don't need it to use Blender. Oh, to use Blender, for 2D people: there's a whole roadmap now for storyboarding with tools like Blender.
Even for 2D, it's so handy to have a camera you can actually pan around, and you can reuse the assets for different shots. But that's a whole discussion. Do we have time for one last question? Will it happen? Not as such.
We do have a developer, Richard Antalik, who has been hired to work full-time on triaging, to help the infrastructure as a whole, but also to try to tackle the video sequence editor. But it's one of those projects where, if he for whatever reason decided to walk away, we wouldn't be able to prioritize it. Still, it's on the agenda. Vulkan, for instance, is on the agenda as well. More storyboarding for 2D is on the agenda, more sculpting tools, texture painting tools. There's a lot of work being done; we had to draw the line at some point. That much we can promise. Everything else is circumstantial.
Again, anyone is welcome to contribute to Blender. Ideally, look at Blender's roadmap and try to help on top of that; otherwise, it might get too complicated. Everyone's welcome. It's a bit of an open end what's going to be there; we can only tell when it's ready. Thanks, everyone, for your time.
We have stickers here. That's all the time we have. Thank you.