
Python Packaging - current state and overview


Formal Metadata

Title
Python Packaging - current state and overview
License
CC Attribution - NonCommercial - ShareAlike 3.0 Unported:
You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal and non-commercial purpose, as long as the work is attributed to the author in the manner specified by the author or licensor and the work or content is shared, also in adapted form, only under the conditions of this license.

Content Metadata

Abstract
Python Packaging - current state and overview [EuroPython 2017 - Talk - 2017-07-10 - PythonAnywhere Room] [Rimini, Italy] Historically, Python packaging has been a source of significant pain for even the most devoted Python enthusiasts. I've found myself in the situation where I knew the basic concepts behind the tools, but despite that the only thing I could do was follow tutorials. It was time to change that, and that's the reason this presentation was written. In this talk, I'll provide a quick overview of the current state of Python packaging tools. I'll mostly focus on setuptools, pip and wheels, putting an emphasis on their superiority over their precursors. I'll also list the honourable mentions of tools worth knowing. Then I'll share examples of how you can use the features of the setuptools library - those that are well known, and those we use when pip-installing packages but most of us can't name. The point of this presentation is to explain how to use tools which are all there, just waiting to make developing, testing, and distributing our Python packages easier. It doesn't matter whether you're a Python expert or a beginner - the knowledge covered by this presentation will be useful regardless of your level.
Transcript: English (auto-generated)
Hi. So we've already had a little bit of introduction in the keynote today about a few things I'm going to say, but of course I'm going to say more about them. Let me introduce myself. My name is Jakub. I live in Poland, in Krakow. I'm very active in the local Python community,
and I'm also involved in PyCon PL, which will be held a month from now, so everybody is invited; we speak mostly English there. And recently I started teaching children: I started organizing conferences that invite children into the world of technology.
And this happens in Koderik and Stata, the two biggest cities in Poland. So that's a little bit about me, and here is why I decided to give this talk. A few years ago I put my first package on PyPI, and I had no idea how I did it.
I just followed some tutorial, copied some files, pasted them in, ran some commands, and it worked. The package is still on PyPI. I don't think anyone has downloaded it yet; it's a minor package. But that was everything I knew about setuptools. And for the three years since then, I've been waiting for someone
to give a good talk about Python packaging at any conference I've attended. And no one did. Even here I was the only one who proposed such a talk. So finally I decided that, okay, I need to learn it by myself, read about it, consult people who use it every day, and give this talk on my own. And that's why I'm here.
So don't see me as an expert. I'm not a Python packaging authority; I'm just someone who read a lot about it and tried to pack it all into one 45-minute presentation. So I guess there are people in this room who know this topic better than I do.
When I finally decided I needed to dig into this topic... Usually we have a simple architecture when we work on a project: one project with a few dependencies. And that's what I was usually dealing with. But when I started working at YouGov a year and a half ago, the architecture was much more complicated.
And even that picture isn't the whole truth, because there are also a lot of common dependencies. So having all of these systems that we use, the services, with their requirements written in requirements.txt files, keeping them separately and storing them on some shared storage, that would be awful.
And that is why we use setuptools, and every package that we use has a well-written setup.py file. When I first saw it, I thought, whoa, that's amazing. I have no idea how it's working, but that's amazing. Now I have a better understanding of it.
The other purpose of this talk is to say a little bit about the current state of Python packaging, because it's still moving. Right now there are a lot of changes going on, and what I'm really glad about is that these are changes to existing systems. These are proposals for how we should evolve.
This is not like in JavaScript, where someone just adds a new library and says this is the new one everybody should be using, except there are five people doing that simultaneously, so no one knows what to use. In Python, we are evolving. There is packaging.python.org, maintained by the Python Packaging Authority, which shows the current state.
And it lists the tools that we should use as Python developers. It is divided into two parts. The first one is the installation tool recommendations, which is of course pip and virtualenv. There are more proposals in this area.
For example, conda for managing virtual environments on your local machine, and some more. And there are the packaging tool recommendations, which are setuptools, the main topic of this talk, bdist_wheel, which I will also cover, and Twine, which some time ago I thought would surely be dead soon, but it is not.
Twine is a tool for uploading packages to PyPI. And the biggest advantage of Twine for some time, at least when I started preparing this presentation about a year ago, was that it uses HTTPS, which setuptools also does as of some Python version.
But Twine has some more advantages and will surely be used more in the future. So this is what I'm going to talk about today. First of all, if you have your setup.py file, you can of course run it with --help, which will briefly say what you can do.
But a much more helpful option is --help-commands, which really shows you what commands you can run on your setup.py file. So, just to know what the level is: who here has used a setup.py command?
Okay, and who has written a setup.py file? Okay, a lot of you. So I will be brief in the next section. What does a setup.py file look like? It is just a setup() function call with a lot of keyword arguments.
This one is taken from an example PyPI project. I know the keyword arguments here have spaces around the equals signs, and that's not PEP 8, but I only noticed that yesterday, so I didn't manage to fix it. So what do we have in the setup.py file? Of course, a name for the project, which must be unique on PyPI.
And it should be a name that says something about the project, not just some random artifact. Then the URL: the URL of the package will probably just be the GitHub page where you host it, or, if you have a dedicated project page, that might be it.
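For reference, a minimal setup.py along those lines might look roughly like this (the package name, URL and dependency are invented purely for illustration):

    from setuptools import setup, find_packages

    setup(
        name="sample-sound-tools",          # hypothetical name, must be unique on PyPI
        version="0.1.0",
        url="https://github.com/example/sample-sound-tools",  # hypothetical project page
        author="Jane Doe",
        author_email="jane@example.com",
        description="A short one-line summary of the package",
        packages=find_packages(),
        install_requires=["requests"],      # hypothetical dependency
    )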
Then the version of the project. And this is something I'm going to stop at for a while. You can write a version like that, and you can claim that you will always remember to bump the version when you upload the package, but you won't. I also believed I would always remember, and I did forget once or twice.
So that is where a handy package comes in, and this is setuptools_scm. It has a cool feature that manages your version automatically, and that's only one of the features of that tool.
Of course, your project is kept in Git, because everyone uses Git, and setuptools_scm reads the version from the Git tag. You just need to add use_scm_version=True in your setup call and set setup_requires. setup_requires is a list of packages that will be installed automatically whenever you run any setup.py command.
And I've heard it said that it's not best to put setuptools_scm in setup_requires, because you might not have an internet connection when you want to run setup.py commands. That's true, but in that case you just need to have setuptools_scm hosted somewhere locally and install it from there.
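A minimal sketch of that setup, assuming your repository carries Git tags like v0.2.0 (the package name is hypothetical):

    from setuptools import setup, find_packages

    setup(
        name="sample-sound-tools",          # hypothetical
        use_scm_version=True,               # version is derived from the latest Git tag
        setup_requires=["setuptools_scm"],  # pulled in automatically for any setup.py command
        packages=find_packages(),
    )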
As for package versions, it is not exactly SemVer, it is PEP 440. Usually you just use number dot number dot number, but it is also possible to use
beta, dev, and all the other naming conventions, and it's all well described in the PEP. So this is only one feature of setuptools_scm, but there will be more. Then author and author_email, which I think are pretty obvious. It's good to put real data there so someone can reach you; don't give a fake email.
Then the description, and here is another good practice: keep the long description the same as the README file. Don't paste it in here, because again, one day you will forget to update it. read() here is of course some I/O helper function that just reads the README file that ships with the project.
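A sketch of that convention, assuming the README sits next to setup.py (file and package names are illustrative):

    import os
    from setuptools import setup

    here = os.path.abspath(os.path.dirname(__file__))

    def read(fname):
        # Read a file relative to setup.py, e.g. the README
        with open(os.path.join(here, fname)) as f:
            return f.read()

    setup(
        name="sample-sound-tools",             # hypothetical
        version="0.1.0",
        description="Short one-line summary",
        long_description=read("README.rst"),   # the long description lives only in the README
    )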
That's a good convention to follow. Also, always try to add a license. If you don't add any license, then no one is really allowed to use your package.
So if you don't care much about the license, just add something like BSD or a Creative Commons license, and that means people will be allowed to use it. I'm not a lawyer, so I don't really know the details of those licenses, but it's worth reading a little bit about them and adding one to your project.
Then the keywords and classifiers; these are things that say more about the package. Keywords are used for better search of the project, and classifiers are a list of things that group our package, that classify it.
This is an example from setuptools itself. It has some keywords set and a list of categories, and these categories are just the classifiers. And my personal favourite classifier is Private :: Do Not Upload; not everybody knows that PyPI won't accept your package if you have this as a classifier.
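A sketch of keywords and classifiers in setup.py (the first two classifier strings are real trove classifiers; the package name is invented):

    from setuptools import setup

    setup(
        name="internal-sound-tools",   # hypothetical, internal-only package
        version="0.1.0",
        keywords="sound audio effects packaging-example",
        classifiers=[
            "Programming Language :: Python :: 3",
            "License :: OSI Approved :: BSD License",
            # PyPI rejects uploads with unknown classifiers, so a classifier like this
            # protects an internal package from being published by accident.
            "Private :: Do Not Upload",
        ],
    )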
So you won't accidentally upload some bad version of your package if strange things happen. Next, the packages. You can list your packages explicitly in the setup.py file, but why, if you can just write find_packages()?
Even plain packages=find_packages() will work, but you can also point it at a source directory or tweak it as you go. Just good to know that it's there.
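A sketch of find_packages() pointed at a src/ layout (the layout itself is an assumption, just one common convention):

    from setuptools import setup, find_packages

    setup(
        name="sample-sound-tools",                  # hypothetical
        version="0.1.0",
        # Look for packages under src/ instead of the project root
        package_dir={"": "src"},
        packages=find_packages(where="src"),
    )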
And there's more. You probably know requirements.txt; it's not an official standard, but most people use a requirements.txt. For Python packaging, what you usually keep in requirements.txt goes into install_requires. That is just the list of packages that are dependencies of our package. They can be pinned to a specific version or to a range of versions,
exactly like in requirements.txt. And what is interesting, you can still have a requirements.txt, and you're encouraged to do that. You can write a dot on a line of your requirements.txt, and that dot will install the packages from the install_requires of your setup.py file.
I don't have slides for that, but that's something you can do. So that is often a way to keep a requirements.txt with some additional dependencies that you only want locally for development.
But install_requires is the thing that matters when someone else runs pip install with the name of your package: everything from install_requires will get installed on their computer. So don't add things that aren't really needed; let's not be rude and make people download more data than they need.
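A sketch of how the two files can relate (package names and pins below are only examples):

    # setup.py
    from setuptools import setup, find_packages

    setup(
        name="sample-sound-tools",        # hypothetical
        version="0.1.0",
        packages=find_packages(),
        install_requires=[
            "requests>=2.0,<3.0",         # a range of versions
            "click==6.7",                 # a pinned version
        ],
    )

    # requirements.txt (for local development)
    .        # the dot installs this package plus everything in install_requires
    ipdb     # extra tooling you only want on your own machine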
There are also extras. This is an example from IPython: you can define extras which say that if you want, say, the notebook flavour of IPython, that will pull in some additional dependencies for you. And you run it with pip install ipython[notebook], with the extra name in square brackets.
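A sketch of extras_require loosely modelled on that idea (the extra names and dependencies are invented):

    from setuptools import setup, find_packages

    setup(
        name="sample-sound-tools",        # hypothetical
        version="0.1.0",
        packages=find_packages(),
        install_requires=["numpy"],
        extras_require={
            # pip install sample-sound-tools[notebook] pulls these in as well
            "notebook": ["notebook", "ipywidgets"],
            # a conventional place for test-only dependencies
            "testing": ["pytest", "pytest-cov"],
        },
    )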
You can also add tests_require, which will be used when you run python setup.py test, but that needs a bit of configuration. Actually, I talked to a guy from the Python Packaging Authority, and he said he doesn't encourage using tests_require
anymore; it's much better to use an extra called testing or test, and use tox to install those packages. That is probably the direction things are going: not using tests_require that much anymore.
And tox has ways of being configured to read those testing extras, so I won't go deep into that. And the last thing in the setup.py file is entry_points, and that is something I absolutely love. I didn't know that when I run virtualenv with the name of the environment, I'm actually using an entry point.
So I guess everybody here has used an entry point at least once in their life. You can add console scripts that simply save someone from typing those six additional characters, seven including the space, every time they would otherwise have to run python virtualenv.py.
And for your package, you can write any entry point that you want. This is just to save people time. You can describe them in the README to summarize the most common usages of your application, and that's a really great way to make your package more user-friendly.
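A sketch of a console-script entry point (the command name and module path are made up):

    # setup.py
    from setuptools import setup, find_packages

    setup(
        name="sample-sound-tools",        # hypothetical
        version="0.1.0",
        packages=find_packages(),
        entry_points={
            "console_scripts": [
                # typing "sound-demo" on the command line calls main()
                # in sample_sound_tools/cli.py (a module you would provide)
                "sound-demo = sample_sound_tools.cli:main",
            ],
        },
    )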
Now, that was setup.py, and there might also be a setup.cfg. That's a separate file. When I looked at it for the first time, I had no idea what all those lines were, but it's pretty simple.
The name in the brackets, like [global], says when that configuration should be applied. Global means every time: every time you run any setup.py command, verbose = 1 means it will add --verbose to your action.
The second one, [bdist_wheel], means that every time you build a wheel, it will add --universal. You won't see it, but it will be happening. And you can set values other than a simple flag; for example, under [easy_install], every time you run
python setup.py install, it will add --index-url pointing at the company's devpi. Pretty handy for people who don't remember to type those options, or if you want to encourage their use. You can also attach some metadata, like the license file, or even configure some tools like pytest in there.
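A sketch of a setup.cfg along those lines (section and option names mirror the command-line flags with dashes turned into underscores; the index URL is a placeholder, and the metadata/pytest sections are just examples of the extra configuration the file can carry):

    # setup.cfg
    [global]
    verbose = 1

    [bdist_wheel]
    universal = 1

    [easy_install]
    index_url = https://devpi.example.com/root/prod/+simple/   # placeholder company index

    [metadata]
    license_file = LICENSE

    [tool:pytest]
    testpaths = tests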
And the last file needed alongside setup.py is MANIFEST.in. When you do a project, you don't always do it purely in Python. You might have some HTML or some images that you want to ship, or any static assets.
You list these in MANIFEST.in, and that means that when you build the package, this data will be in the package, and when someone downloads it, they will also get the files listed in MANIFEST.in.
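A sketch of a MANIFEST.in covering that kind of static content (the paths are hypothetical):

    # MANIFEST.in
    include README.rst
    include LICENSE
    recursive-include sample_sound_tools/templates *.html
    recursive-include sample_sound_tools/static *.png *.css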
And that is the second great thing about setuptools_scm: everything tracked by Git and not in .gitignore will be picked up automatically, as if it were listed in MANIFEST.in. So just keep your .gitignore well written, and setuptools_scm will take care of the rest. And now I'm getting to the point where I really love using setuptools and the functionality it provides.
Currently I'm working on four or five services which talk to each other. Usually I develop one of them, sometimes two. So I have an environment that has all of these packages installed, and they are running some locally, some against staging, and so on.
When I want to make changes to some package, when I want to edit it, I just enter the directory of that project, run python setup.py develop, and automatically I start using my local files of that project.
When I'm done with development, I run python setup.py install, and I'm back to using the most recent released version of that package. For me it's like a switch between developing the package and using it as a dependency.
And that's been working pretty well for more than a year, and I haven't found any disadvantages. So for me that is the core gain. But there are other advantages too. I also want to say a few words about eggs versus wheels.
Eggs are considered deprecated, so you shouldn't use eggs anymore. You should use wheels. And wheels have some advantages over eggs that I want to cover briefly. A wheel is just a way of shipping the binary distribution of your package.
It has an official PEP, which eggs never had, and its naming convention says a lot about the package. So there is the name of the package, dash the version of the package, dash an optional build tag,
so wheels themselves can be versioned, dash which Python you want to use (2.7, 3.5, or it can be both of them), then the ABI tag and the platform tag (usually 'any', but you can make Mac-only or Linux-only packages), and the .whl extension.
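For example, a hypothetical wheel filename decoded according to that convention:

    sample_sound_tools-0.1.0-py2.py3-none-any.whl
    # sample_sound_tools  -> package name (hypothetical)
    # 0.1.0               -> package version
    # py2.py3             -> Python tag: works on both Python 2 and 3 ("universal")
    # none                -> ABI tag: no compiled extensions
    # any                 -> platform tag: not tied to one operating system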
So what are the advantages? There are no .pyc files inside; they are generated at installation time. That's why one wheel can work on both Python 2.7 and 3.4.
There is the naming convention, which I've just covered. And installing C components doesn't require a compiler, so you can use wheels on pretty much any system better than you could have used eggs.
And it is now strongly encouraged to upload binary distributions as wheels. There is a pythonwheels website; this screenshot is not quite up to date. Right now they have 254 of the 360 most-downloaded packages covered. They still haven't covered pycrypto and SQLAlchemy; those are the best-known packages I've picked from the list.
I guess that's caused by some problems with the C components, and that's why it's mostly these crypto and database packages that are missing. But it's moving forward. When I started preparing this presentation, it was something like 200 of them.
Now it's 254; it's going forward. Now a little break. Do you know what will happen when you run this command? Anyone knows? This is a mistake. Yeah.
Yeah. But it will work: there is a package literally called requirements-dev.txt on PyPI. Someone mentioned it at a conference, and I think it's a lovely idea. And this is one of the issues on that package.
So that was a short break, and now on to devpi. As I mentioned, we use a lot of packages that are connected to each other, and then comes the problem of where to store them.
Of course we could have some cloud storage or a server where we put the packages' wheels, download them from there and just pip install the wheel. But that would be problematic. And here devpi comes to help. devpi is basically a private index that proxies PyPI.
So you can put your packages on your devpi, and they will only be visible to you, to users of that devpi. But if a package is not on your devpi instance, it will fall through to PyPI. So you can use it nearly the same as if you had put your package on PyPI.
But in a company you want to keep your packages private; you don't want to upload them publicly, because that is company stuff, of course. So how do you use devpi? First of all, the most important part is the .pypirc file, which is required for all the setup.py upload work.
That's where you set the upload target for your packages. You set the index server to your devpi, and you specify it further with the repository URL of that devpi
and your username and password. If you don't provide a password here, it will prompt you whenever you try to upload or register a package. And you can register and upload packages using python setup.py register, first of all; you need to register your package before you upload it with setup.py.
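A sketch of a .pypirc pointing at a company devpi index (all names and URLs are placeholders; leaving the password line out makes the tools prompt for it instead):

    # ~/.pypirc
    [distutils]
    index-servers =
        company-devpi

    [company-devpi]
    repository = https://devpi.example.com/root/prod/   # placeholder URL
    username = jakub
    # password = ...   better left out, or supplied some other way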
I will get to Twine in a second. Then, whenever your package is ready, you just run sdist upload. And there is one important thing: you should make sure you have no local changes when you do that. If you have local changes, they will also get uploaded,
and your version of the project will get a hash suffix, meaning you have altered the current version. I have done that once or twice, and then you need the devpi client tool to remove that package from devpi, which is a hassle. So that's the point where you just need to remember to git stash all your changes before you upload your package.
But you can also do this with Twine. You still use setup.py to create your distributions, like sdist or bdist_wheel; sdist stands for source distribution, bdist_wheel builds a binary distribution.
And then you just run twine upload on whatever distribution you want. That's just it. And you can set it up even better. The thing I hate is that you need to have your password either in that file or type it every time.
But with Twine you can set an environment variable. So if you have KeePass, Keychain, or any tool that stores your passwords on your PC, you can have it set the password as an environment variable and not worry about your password at all.
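A sketch of that flow; TWINE_USERNAME and TWINE_PASSWORD are real Twine environment variables, and the repository name matches the hypothetical .pypirc above:

    # build the source and binary distributions
    python setup.py sdist bdist_wheel

    # supply credentials via the environment instead of .pypirc
    export TWINE_USERNAME=jakub
    export TWINE_PASSWORD="..."   # ideally injected here by your password manager, not typed in plain text

    # upload everything in dist/ to the company devpi index
    twine upload -r company-devpi dist/*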
That's still probably less secure than being prompted and typing your password manually, but it's certainly more secure than keeping your password in plain text in a .pypirc. Now for testing your packages.
There are a few things we might want to remember. Of course, whenever we want to upload a package, we test it; everybody does that. So there are a few steps we can take. The first one is simply to generate your distribution file and install it.
I didn't even realize before preparing this presentation that you can pip install the tar.gz with the package, but you can. And you can also pip install just the wheel. So you can build your wheel, move it somewhere else, prepare a clean virtualenv and try installing your wheel. That should work.
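A sketch of that first step (paths and file names are hypothetical):

    # build both distribution formats
    python setup.py sdist bdist_wheel

    # try installing them into a fresh, throwaway virtualenv
    virtualenv /tmp/pkg-test
    /tmp/pkg-test/bin/pip install dist/sample_sound_tools-0.1.0.tar.gz
    /tmp/pkg-test/bin/pip install --force-reinstall dist/sample_sound_tools-0.1.0-py2.py3-none-any.whl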
That's the first step: before you go public, make sure your package is packed well. The second step you can take, and it's also not very commonly known, is that you can put TestPyPI in your .pypirc. So you can upload your packages to the TestPyPI instance,
which does not share its resources with the production PyPI. You set up the test index in your .pypirc just like earlier with devpi. And that is something... there will be a demo later on; I finished preparing it yesterday at 2 a.m.,
so it might not be perfect, but I've learnt a bit while doing it. So testpypi.python.org is considered deprecated; we're now moving to pypi.org, and there is test.pypi.org. When you want to upload to test.pypi.org,
you upload to the /legacy/ endpoint. And that's still a bit... well, it's evolving right now, because I will be covering pypi.org later on, and I found some drawbacks of moving to it, but that will come up during the demo. Still, we should slowly be using pypi.org more
than pypi.python.org. And whenever you're done with the preparation of your package, you just run pip install -i, where -i stands for the index URL, so the URL of your TestPyPI instance, plus your package name, and that should install the package for you.
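A sketch of that final check against TestPyPI (the package name is hypothetical; the index URL is TestPyPI's simple index):

    # install only from TestPyPI, ideally into a clean virtualenv
    pip install -i https://test.pypi.org/simple/ sample-sound-tools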
If that works, then the production PyPI will work for you too. So we have the package tested, and the last thing I have coming is PEP 20, last line: namespaces are one honking great idea. You can have namespaces in your packages.
So let's say we have a library for formatting and editing sounds. This library might be huge, because sound is a huge topic. So we might want to divide it into a few main areas, like formats, effects, and filters.
Every directory, like formats, effects, and filters, might be a separate package that you upload, such as sound.formats, and they can be maintained separately. And you can install them normally and just import sound.effects, and it works.
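A sketch of how one piece of such a split might be declared with setuptools (the project and module names are invented; this uses the pkg_resources-style namespace declaration that setuptools supported at the time, rather than PEP 420 native namespaces):

    # project "sound.effects" -- directory layout:
    #   setup.py
    #   sound/__init__.py          <- contains only the namespace declaration below
    #   sound/effects/__init__.py  <- the actual subpackage

    # sound/__init__.py
    __import__("pkg_resources").declare_namespace(__name__)

    # setup.py
    from setuptools import setup

    setup(
        name="sound.effects",              # hypothetical distribution name
        version="0.1.0",
        packages=["sound", "sound.effects"],
        namespace_packages=["sound"],      # tells setuptools that "sound" is shared
    )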
One thing you should never mix in is easy_install: you should never install namespace packages with easy_install, because that's horrible. And actually you should never use easy_install at all. These are just some advantages of pip, and the biggest one, I guess, is simply that easy_install
is not an easy way to install stuff; pip is a much easier one. And it has requirements-file installs and pip freeze, which are great, great advantages. So what's next? There is a roadmap from the Python Packaging Authority,
and it's pretty long; you can read it if you want. They have a lot of things on the horizon. There is an idea to keep requirements in a Pipfile, which will be written in TOML, but that is probably the future.
And of course the new PyPI. We are used to going to pypi.python.org, but the new PyPI, which will no longer be the Cheeseshop but the Warehouse, is on its way. You can use it right now. You can upload packages to it with Twine,
and Twine is the recommended way to upload packages there. And it's alive. So I think that's the biggest change happening right now, literally right now. So it should work. Let's see. I don't have much time,
but I have a package which has an app.py; it's just a Flask app. And its setup.py has a version, name, entry points, and install_requires. So, not much time, let's just build a source distribution
and a binary distribution. What happens now? I have the build directory, which has some stuff left over from building, the dist directory, which holds the source and binary distributions, and an egg-info directory which has some more or less interesting metadata, like the requirements.
And all of this is something you can set up on your own in five minutes and explore when you have more time than I have right now. What can we do next? I can put it into develop mode. And that's fine.
I can run demo, and it's working. demo is an entry point. So right now I have this package installed in my virtualenv, and demo is an entry point which just starts the app. But moving further, I want to twine upload to the test index,
where test is the... sorry, not this one, that's an old one. The EuroPython one, I can show this one, but I want to upload the wheel. The wheel. I want to upload the wheel.
It asks me for a password, which I provide, and it says it's uploaded. So let's check whether it really is. I'm on test.pypi.org, projects, EuroPython show. And it's here, and it says it's me, so I guess it all worked fine.
So let's see if it really worked. I have a clean virtualenv, nothing in here, and I want to pip install the EuroPython show package from test.pypi.org.
And something's wrong with the connection. Okay, but the package is there; you can see that it's there.
I've modified my .pypirc file a bit, but I can show it to you, because there is no plain-text password in that file. So that's just the .pypirc. I'm not sure about the download; probably you need your password set up
as an environment variable, but that's something I won't do right now on the big screen while being recorded. But considering the demo was working, and it used to work earlier, I have another entry point which just points to localhost:5000
and shows, well, the internet is being difficult, an about-me page, which means that we're coming to an end. So any time you want to know something about me, you can read it there.
I will cover that one too. So if you want to read a little bit about the presentations I've given at different conferences, there is a list on the about-me page. Every week I get a report that zero people have visited this page, so hopefully that will change. And there's some recommended reading,
so you can take a photo of that. These are the most important things I based this talk on: packaging.python.org, which has a lot of interesting, mostly current stuff, some information about wheels versus eggs, and two tutorials on how to start. They might not be fully up to date right now
while we're moving to the new PyPI. And of course pypa.io, the Python Packaging Authority site, for what's going to change in the near future. And that's it.
Are there any questions? I will see, but there was someone first.
Thank you. Right now, if we need some private package, we just put it on our private GitHub, and then in the requirements we just link to the repo. What are the advantages of using devpi, or that second tool, I don't remember the name,
over this approach of just using a separate URL, even with a version? I would say that devpi, and this was going to be the next question, is a good way to keep your packages. They are versioned there, so you can,
in a situation where we have a lot of packages, consider that one package is one service and these services talk to each other, pin dependencies to specific versions of a package. It's a great way to just bump the version of one of our services in setup.py, and that will pull it from devpi
and bump the version. So we have more flexibility in what we use. If we held that data on some server, in some directory, then I guess we would need more manual steps to, for example,
bump a package version. Okay. Are there any thoughts, moving up to the next level, zooming out from a single package and managing lots of packages which are usually used to define your project? So going back from just handling a single package,
stepping up to project management. Is there anything? Not really, because I can't find any reason why you want to bulk update your packages. Because you are modifying an application consisting of maybe 20 or 50 packages,
and so you want to upload a whole bunch of packages, push them into production environment or whatever. Okay, I guess that I might find some reasons why you want to do that, but I would go for a single package approach more,
because you have more information and more control on your package when you're just editing one of them. For example, again, if there are some services and different services use one of your packages,
then you might want a separate approach for that package only. Thank you. You mentioned devpi, which kind of mirrors the PyPI package archive. I used a product years ago called Python EggBasket,
which is just a purely local repository for your packages when you were in developing, you just had a copy of it locally. Are you aware of anything like that, that is purely local, that runs on your machine? I know there's been some attempts before I joined the company
on using different package hosting services, but the reason we use DevPy might be that one of the Python packaging authorities works in the company and he was helping in development the setup tools and the DevPy, so I guess that was why we chose DevPy,
and it's working well. This is the area that no one really wants to touch, and as long as it's working fine, I guess that there won't be an approach to move to a different end. I can recommend DevPy because it's pretty stable and reliable, but if you know that there are other tools, well, I recommend that everybody
reads a little bit about those tools and choose which fits the best for the purpose. Okay, I guess that's all, so thanks a lot.