WebPageTest: Licensing and Update
Formal Metadata

Title: WebPageTest: Licensing and Update
Number of Parts: 637
License: CC Attribution 2.0 Belgium: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifiers: 10.5446/52830 (DOI)
Transcript: English(auto-generated)
00:06
All right, hi everyone. I'm Pat Meenan. I'm here to chat with you about WebPageTest. It's been a fairly eventful year. I want to talk a little bit about the licensing changes, the Catchpoint acquisition, and sort of some of the background on why we picked the license
00:22
we did; particularly since this is a FOSS conference, I thought it would be interesting for everyone. And also some updates on what we've delivered since then, sort of where we're going and what excites me about it. So really quickly, an overview of WebPageTest as a project and sort
00:40
of as an ecosystem. There's the open source code on GitHub, which is sort of the core of WebPageTest. It has the web server code, the agent code, and a bunch of utility scripts for installing; everything you need to build a WebPageTest is up there on GitHub. Up until recently it was all under a BSD license, so you could do whatever
01:06
you want with it, and anyone could do whatever they wanted with it. There's a public instance of it, so sort of what you know of WebPageTest: webpagetest.org. That's the public WebPageTest instance that I have historically run out of my basement, with partners running test locations
01:22
globally. There's the HTTP Archive, which is probably one of the biggest WebPageTest instances, which is also publicly available and managed by a team of us. It runs 14 million URLs monthly, give or take, on its own private instance, and collects all the
01:45
data and makes it available. There's a bunch of internal-use private instances that companies around the world run. They just take the code or the available images and run their own WebPageTest, either within their firewall or just for doing a lot of testing. Usually it's API testing stuff
02:04
that you couldn't do with the public instance because it was running on my infrastructure in my basement, or you needed to do at-scale testing, or you needed to test behind your firewall. And then there are a few commercial services that are built on top of the WebPageTest code as well,
02:20
so they use WebPageTest for doing all of the underlying testing and then they build their sort of value-add on top of the underlying WebPageTest. And sort of in addition to the free code that's up on GitHub, there are images that I provide on AWS and GCE,
02:40
primarily for the agents (there are server instances as images as well), so you can scale testers up and down as needed in the clouds, in whatever location you want. Those are used both by the public WebPageTest and by private instances, and some of the commercial services are using those as well. And so middle of last year, give or take, I think
03:06
towards September, Catchpoint acquired webpagetest.org, the public instance of WebPageTest that I run. And so as part of that acquisition, we're sort of figuring out what comes next. We're definitely focused on
03:21
keeping it open and free for everyone to use. If you're running private instances, we want to continue to support that market. We want to try and keep the community as engaged as it has been over the years and add engineering resources to it as well, but part of what we also want to do is build a commercial service around WebPageTest. Somewhat like what you see with a lot of the
03:46
commercial monitoring, but maybe we can put our own spin on it in a way that is WebPagetest unique. And so they need to make some of their money back from the acquisition and make it worthwhile, but not at the cost of the community. And so what we've been balancing
04:03
is trying to figure out how do we walk that line, and how do we make it as open as possible and keep the community as engaged as it has been over the years, and still have a commercially viable path for Catchpoint. And so when you're building a commercial monitoring service,
04:22
if you would, the cost structure behind it: a good chunk of cost comes from the actual infrastructure, right? Running instances in the cloud, in data centers, running the server, storage of test results. Like right now the public instance I think has somewhere around
04:40
30 terabytes of test data stored historically; on S3 that gets to be fairly expensive. On top of that you have sort of the product engineering and support for the value-add wrappers that you put around the testing. And then there's the engineering,
05:00
if you would, the costs around building the agents, supporting the browsers, adding new features, dealing with the Chrome changes that come out every six weeks, supporting the instances, and that kind of stuff. And so Catchpoint, in a top-to-bottom stack, has to absorb
05:20
all of those costs, right? And so if we continue to make the agent available for free for all of our competitors, and we absorb all of the engineering costs for the agent development, all of a sudden we're at a competitive disadvantage even though we're sort of building the technology. If all of our competitors get to use all of our engineering for free and
05:44
they don't have to invest in that, they can have a lower cost basis and undercut us on price. Which doesn't seem fair, so we wanted to figure out sort of where that line falls. And I mean, to be clear, somewhere around 95% of the code contributions to the WebPageTest agent code,
06:06
or the WebPageTest code in general, come from me. Around 4.9% come from the community, and it's awesome to see the engagement. And around 0.1% of the code comes from other commercial providers. And so that 95-to-0.1% split is sort of what we're talking about here.
06:27
And if it was more of a 50-50, then we'd all be sort of absorbing the costs. But we need to figure out a way to distribute those costs that's fair without compromising the openness for the community and the ability for everyone to just continue using the code in cases where
06:43
they're not competing with Catchpoint. And so we looked at a bunch of options, right? The easiest option, obviously, is to do a fork, which is what all of the commercial vendors do, and do all of our development in secret. That doesn't feel like a great solution
07:00
for WebPageTest in general because it cuts off the community and everything else. And so we didn't want to cut off all of that... we want to continue to contribute for the community, for everyone running private instances, to be able to continue to benefit from that work. And so a private fork didn't seem like a good idea. We did look really hard at all of the OSI-approved
07:24
licenses, trying to keep it as open source as possible. Unfortunately, they're all sort of from the days of licensed software that you distribute, put on floppies, or send around. And so they protect code distribution; they protect against someone redistributing your code as part of their product.
07:45
But they don't work very well for a SaaS offering. And so if we protect it with, like, an AGPL license or something like that, there's nothing stopping someone from building a complete SaaS offering on the same code and never distributing it. And so it didn't seem like a great
08:06
solution. We couldn't find any license that protected use rather than distribution. And so another option, and this is what Mongo and Redis ended up having to do with their service offerings and their code, is to build a custom license. And we looked long and hard
08:26
at that. If we could avoid it, we'd like to avoid a custom license; we'd like to try and stay as standard as possible and have as little explaining to do about what's special about our license. And so we came across, with help from lawyers in this space,
08:45
the Polyform project, which is a set of licenses that came out of the work that Mongo and Redis have had to do to protect SaaS offerings. And so it's what they call a code available license. It's not technically open source because it's not OSI approved, but it makes
09:03
all of the code available with various different restrictions depending on which one of the licenses you picked. And so we picked the Polyform Shield license, which was the most permissive that we could find, that would still give us protections against a competitor
09:22
using our code without contributing to either the code or to the costs of development. And so what that ends up doing is, well, we branched the GitHub code. So there's an Apache branch of the code, which the community is more than welcome to continue to contribute to. It
09:44
continues with the Apache 2 / BSD open license you can use to do whatever you want with. But the main trunk of the code is now under a Polyform Shield license, which is as permissive
10:01
as possible. So you can use it for anything you want as long as you're not using it in a way that competes with Catchpoint. And so if you're using it for internal use, you can continue using private instances and continue using all of the code that we develop. We're going to try and keep it as open and as up to date on GitHub as we develop it.
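For that internal, private-instance use, driving your own server looks much like driving the public one: you hit the same runtest.php HTTP API. A minimal sketch follows; the instance hostname, API key, and location name are invented placeholders, and the parameter names follow the documented WebPageTest API, so check your own instance for valid locations and whether a key is required.

```python
# Sketch: kicking off a test against a private WebPageTest instance
# through the same HTTP API the public server exposes. The hostname,
# API key, and agent location below are hypothetical placeholders.
from urllib.parse import urlencode

def build_runtest_url(server, page_url, api_key=None, location=None, runs=1):
    """Build a runtest.php request URL for a WebPageTest server."""
    params = {"url": page_url, "f": "json", "runs": runs}
    if api_key:
        params["k"] = api_key       # API key, if your instance requires one
    if location:
        params["location"] = location
    return f"{server.rstrip('/')}/runtest.php?{urlencode(params)}"

url = build_runtest_url(
    "https://wpt.example.internal",   # hypothetical private instance
    "https://www.example.com/",
    api_key="YOUR_API_KEY",
    location="ec2-us-east-1:Chrome",  # hypothetical agent location
)
print(url)
# Fetching this URL returns JSON with a testId you can poll for results.
```

This is the same request shape commercial wrappers and CI integrations use; only the server name changes between the public instance and a private one.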
10:26
If you're running one of the public instances, you're running HTTP Archive for example, we're going to continue to work with them on using the latest and greatest code. If you're running a commercial service, that's where you're going to have to either
10:40
contribute and work with the community on the Apache branch, or work with us on a commercial license for the Polyform main tree, so that it sort of evens the playing field. And that's sort of a lot of the background on why we picked the license that we did. We're trying really hard to keep this as open as possible for the community to
11:06
continue using the way they have been, and to minimize the disruption to as many people as possible, while still keeping it fair for everyone involved. And so off of the licensing
11:21
background and into sort of some of the more exciting features: you know, what's been happening, what have we been working on with WebPageTest since the acquisition. And so, you know, Core Web Vitals, they're the latest and greatest. They're coming out, they're going to be impacting search results. We've been trying to figure out how to expose
11:45
what's going on underneath the Core Web Vitals more to developers. And so one of those is Largest Contentful Paint. Of the Core Web Vitals metrics, it's the one that tracks when the core content has become visible. And so
12:02
one of the recent additions is that in the filmstrip view of WebPageTest, if you ask it to highlight the Largest Contentful Paint events, the candidate events as the page is loading will get highlighted in blue. And then the last Largest Contentful Paint candidate, which is the one that ends up being reported as, okay, this is the largest thing that was loaded
12:25
in the loading sequence, gets highlighted in green. And so you can see what the LCP triggering event was. And you can do some validation. There have been browser issues where, you know, it reports the wrong thing or the content is visible way before LCP triggers. And
12:43
it sort of gives you a way to validate that as well. But it also lets you see what it's triggering on, so you can figure out, okay, how do I optimize getting that piece of content loaded sooner? Cumulative Layout Shift debugging: there are sort of a few aspects to this. And this
13:02
is the Core Web Vital that wants to make sure that the content on the page doesn't shift around a whole lot after the user loads the page. And so the first thing that we landed was: you can select "highlight layout shifts" at the bottom of the filmstrip, and it'll highlight in red a box around the piece of the content that moved since the previous frame.
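The score behind those highlighted shifts can be sketched out: per the Layout Instability spec, each shift scores its impact fraction (how much of the viewport was affected) times its distance fraction (how far the content moved), and those scores sum into the cumulative layout shift. The frame values below are made up for illustration, not taken from a real trace.

```python
# Toy sketch of how per-frame layout-shift scores roll up into CLS.
# A shift's score is impact_fraction * distance_fraction; values here
# are invented, roughly matching the filmstrip example described.

def shift_score(impact_fraction, distance_fraction):
    """Score one layout shift: fraction of the viewport affected,
    times how far the content moved (both in the range 0..1)."""
    return impact_fraction * distance_fraction

# Hypothetical filmstrip frames: (impact, distance) for each shift.
frames = [
    (0.80, 0.01),  # lots of content moved, but only a tiny distance
    (0.10, 0.05),  # a small element shifted mid-load
    (0.90, 0.30),  # a video header pushed everything down
]

contributions = [shift_score(i, d) for i, d in frames]
cls = sum(contributions)  # cumulative layout shift for the page load

for (i, d), score in zip(frames, contributions):
    print(f"impact={i:.2f} distance={d:.2f} -> {score:.3f}")
print(f"CLS: {cls:.3f}")
```

The third frame dominates the total, matching the intuition in the talk: a large piece of content that moved a long way contributes far more than a large piece that barely moved.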
13:24
So you can visually see what's going on. What that doesn't tell you is how much the content moved and how much it contributed to the overall CLS. And so what we also added is a number below the filmstrip that tells you: on the right is the total cumulative layout shift
13:43
for the whole page load, and on the left is how much this frame contributed to that. And so, like in this case, in the first frame it looks like a whole lot of the content moved, but it actually moved a very small distance, so it doesn't contribute to the CLS a whole lot. Whereas the last frame, where it put a sort of video header across the
14:05
whole thing and moved all of the content down, that one was both a large piece of content and it moved a lot, so it contributed significantly. That's the main source of the CLS in this page load. And then we get into sort of what really excites me, which is expanding the browser
14:23
footprint. The Chrome team is awesome. They've got awesome developer tools, debugging tools, and remote management support, and so a lot of testing is done with Chrome. Unfortunately, that
14:40
doesn't cover all of the market. There's a huge amount of the market that is on Safari, and I've been trying to figure out, okay, what can we do to get better testing on Safari and make it easier to test on Safari? You can emulate an iPhone on Chrome, but it's a different rendering engine. It's a different networking stack and prioritization. It's a different
15:10
content that's targeted at the iPhone user agent string. It doesn't actually behave like Safari would, and so my first pass at that is we added support for Epiphany in Linux. So you can test
15:23
with WebKit. It gets you close. It gets you the rendering engine and the JavaScript engine from Safari, but it doesn't get you the networking stack, and to me that's kind of a big deal. And so much more recently, and it should be hopefully announced by the time this comes out,
15:44
is that we've added support for the iOS simulator. And so WebPageTest has had support for iOS device testing, but scaling devices is hard: you need a lot of devices and Raspberry Pis, and the networking setup is complex. We can now run WebPageTest in the iOS simulator,
16:03
testing every device form factor that's available, using the real Safari engine with the real networking stack and the whole shebang. And we get all of the detail that we get when we normally run Safari on real devices. So we get the full waterfall. We get almost
16:22
every feature that's available in WebPageTest is available with Safari as well. And the really exciting piece to me is Apple Silicon. So we can now run WebPageTest on M1 Macs using the native binaries. The Chrome ARM build is interesting, but what's really interesting is the iOS simulator
16:46
on an M1 Mac, because it's running the same CPU that the phones are running. And an iOS simulator versus an iPhone 12 returns identical results. And so with one Mac Mini
17:05
now you can test every device form factor, running the real silicon that iPhones run, running Safari. And so you can do all sorts of really scalable iOS and Safari performance testing now. And so I'm really excited to see where we can take that.
17:23
And that's it. I've got a couple of minutes for questions and I'll hang around. So thank you very much. And you can always ping me on Twitter as well, at patmeenan.