Private Python package management

It might not be the solution for you, but I'll tell you what we do.

  1. We prefix the package names and use namespaces (e.g. company.product.tool).
  2. When we install our packages (including their in-house dependencies), we use a requirements.txt file that includes our private PyPI URL. We run everything in containers, and we install all public dependencies into them when building the images.
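For step 2, a requirements.txt along these lines does the job; the internal index URL below is a placeholder, and --extra-index-url tells pip to consult the private index in addition to pypi.org:

```
# requirements.txt -- internal index URL is hypothetical
--extra-index-url https://pypi.internal.example.com/simple/
company.product.tool==1.4.2
requests>=2.31
```

Note that --extra-index-url does not give the private index priority over pypi.org; if name collisions with public packages are a concern, prefixed/namespaced names (step 1) are what keep you safe.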

We use VCS for this. I see you've explicitly ruled that out, but have you considered using branches to mark your latest stable builds in VCS?

If you aren't interested in the latest commit on master or the dev branch, but you do run test/QA against commits, then I would configure your test/QA pipeline to merge passing commits into a branch named something like "stable" or "pypi-stable". Your requirements files then look like this:

pip install git+https://gitlab.com/yourorg/yourpackage.git@pypi-stable

The same reference works in setup.py dependency declarations (which allows for chained internal dependencies).
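To make the chaining concrete, here is a minimal sketch of such a declaration (package name and URL are hypothetical). pip accepts PEP 508 "direct reference" requirement strings, so a git branch can be resolved straight from install_requires:

```python
# Hypothetical internal dependency pinned to the stable branch.
# The "name @ url" form is a PEP 508 direct reference, which pip
# accepts inside setup.py's install_requires list.
internal_dep = (
    "yourpackage @ "
    "git+https://gitlab.com/yourorg/yourpackage.git@pypi-stable"
)

# In setup.py this would appear as:
#   setup(..., install_requires=[internal_dep])
```

If yourpackage itself declares further internal dependencies the same way, pip resolves the whole chain, which is the chaining mentioned above.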

Am I missing something?


Your company could redirect all requests to pypi.org to a service you control first (perhaps just via your build servers' hosts file(s)).

This would potentially allow you to:

  • prefer/override arbitrary packages with local ones
  • detect such cases
  • cache common/large upstream packages locally
  • reject suspect/non-known versions/names of upstream packages
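A lighter-weight way to get the same effect (hostname below is hypothetical) is to point pip at the proxy index explicitly via pip.conf on the build servers, rather than touching DNS or hosts files:

```
# /etc/pip.conf on the build servers -- proxy hostname is a placeholder
[global]
index-url = https://pypi-proxy.internal.example.com/simple/
```

Off-the-shelf tools such as devpi already implement most of the behaviors listed above (local overrides, caching, filtering upstream packages), so you may not need to build the proxy yourself.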
