July 26, 2021 - 5 min
- how could I do a yarn upgrade-interactive --latest to update dependencies?
- where’s the dependency lockfile?
- why do we need to compile C libraries for Python dependencies?
Luckily for me, my colleague Ondrej patiently explained how to approach these questions.
pip installs packages globally unless we activate a virtual environment.
In Python, the default package manager is called pip, and packages are published on PyPI (pypi.org): the counterparts of npm and npmjs.com.
By default, pip installs packages system-wide. This contrasts with npm, which needs an explicit --global flag to do the same.
To remedy that, we can use a local Python virtual environment (a bit like having our node_modules folder next to our code) by creating a venv folder in our current directory:

```
# Create the venv folder containing the virtual environment
$ python3 -m venv venv
```
To make sure pip knows it should install packages in this virtual environment, we must activate it. Depending on your shell, you should source the appropriate activate file in venv/bin:
- activate, for Bash and ZSH,
- activate.fish, for Fish Shell,
- activate.csh, for csh,
- Activate.ps1, for PowerShell.
In my case, I will source the one for ZSH:

```
$ source venv/bin/activate
```
From now on, pip install will install packages in the local folder instead of targeting the global scope. My “Python node_modules folder” is the venv folder.
pip-compile to generate a lockfile and pip-sync to install dependencies from that lockfile
To deal with dependencies, we must install the pip-tools package in our virtual environment:

```
$ pip install pip-tools
```
This will allow us to use pip-compile and pip-sync later on.
Once that’s done, we could for example decide to start a Django project, “the web framework for perfectionists with deadlines”. The latest version of the package at the time of writing is 3.2.5. Let’s create our Python equivalent of a package.json file to express that our project needs the Django package pinned to version 3.2.5:
```
# requirements.in
django==3.2.5
```
Note: a closer equivalent to a package.json would be a setup.py file, since it contains additional fields like the name of the local package, its version, etc., but here we just want to write down the dependencies.
EDIT #1: it’s apparently not common practice to pin dependency versions in requirements.in, unless you really need to (if there are breaking changes, for example). Thanks for pointing it out, Ondrej.
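To illustrate (this range is my own example, not from the post), requirements.in could use a semantic range instead of a pin, letting pip-compile resolve and record the exact version in the lockfile:

```
# requirements.in
# accept any 3.2.x patch release, but stay below Django 4
django>=3.2,<4
```

After running pip-compile, the generated requirements.txt would still pin one exact resolved version, so installs remain reproducible.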
To generate a corresponding lockfile of the transitive dependencies of Django, we can run the following command:
```
$ pip-compile --output-file requirements.txt requirements.in
```
This will create the following requirements.txt:

```
#
# This file is autogenerated by pip-compile with python 3.8
# To update, run:
#
#    pip-compile --output-file=requirements.txt requirements.in
#
asgiref==3.4.1
    # via django
django==3.2.5
    # via -r requirements.in
pytz==2021.1
    # via django
sqlparse==0.4.1
    # via django
```
Now that we have this file, we can finally retrieve and install the packages by running pip-sync.
Note: if you try to run pip-sync without a requirements.txt, you will get the following error:

```
No requirement files given and no requirements.txt found in the current directory
```
It is indeed necessary to run pip-compile first to generate the lockfile.
Like npm outdated, we can list the outdated Python dependencies with:
```
$ pip list --outdated
```
For example, after downgrading Django to 3.2.3 in my requirements.in, regenerating the lockfile with pip-compile, and re-installing the dependencies with pip-sync, the command above will show:
```
Package    Version Latest Type
---------- ------- ------ -----
Django     3.2.3   3.2.5  wheel
setuptools 44.0.0  57.4.0 wheel
```
To update the dependency, I can change the version back to 3.2.5 in the
requirements.in and go through the same dance:
```
$ pip-compile --output-file requirements.txt requirements.in
$ pip-sync
$ pip list --outdated
Package    Version Latest Type
---------- ------- ------ -----
setuptools 44.0.0  57.4.0 wheel
```
EDIT #2: a batteries-included equivalent to Yarn is called Poetry. Thanks for sharing, Dániel!
You might wonder what this “wheel” type is for the Django and setuptools packages.
They are still mysterious to me, but here’s my current understanding: in Python, some dependencies need to be built for specific architectures, as they might be written in Cython, “a superset of the Python language that additionally supports calling C functions and declaring C types on variables and class attributes”. Wheels and Eggs are packaging formats for these.
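One concrete detail that helped me: a wheel’s filename encodes which Python versions, ABIs, and platforms it supports (the convention comes from PEP 427). A quick sketch, using Django’s actual pure-Python wheel filename as the example:

```python
# Wheel filenames follow: {name}-{version}-{python tag}-{abi tag}-{platform tag}.whl
# (PEP 427). Pure-Python wheels end in "none-any": no compiled code, any platform.
filename = "Django-3.2.5-py3-none-any.whl"
name, version, python_tag, abi_tag, platform_tag = filename[: -len(".whl")].split("-")
print(name, version)   # Django 3.2.5
print(platform_tag)    # any
```

A package with C extensions instead ships one wheel per supported platform (with tags like manylinux in the filename), and pip falls back to compiling the source distribution when no matching wheel exists.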
In particular, it is not a good idea to use Alpine Docker images for Python projects, as a lot of packages don’t have pre-built wheels for Alpine’s musl-based platform: your Docker build step will need to compile the packages from source, slowing down your entire process (and requiring you to install the necessary build tools for this compilation to happen!).
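As a sketch of the safer default (the image tag and layout are illustrative, not from the post), a Debian-based image lets pip use pre-built wheels directly:

```dockerfile
# python:3.8-slim is Debian-based (glibc), so PyPI's pre-built wheels apply;
# python:3.8-alpine (musl) would often force compilation from source.
FROM python:3.8-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
```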
All in all, a project needs to:
- specify external dependencies and their desired versions with semantic ranges
- have a way to retrieve said dependencies
- keep track of which versions were effectively installed to improve reproducibility.
Despite all these measures, the Dockerfile of an unmaintained 3-year-old project could not build. Software rots!
Personal blog written by Robin Cussol
I like math and I like code. Oh, and writing too.