Normal Classic Pip VS Pip-Compile & Pip-Sync
# Perhaps it’s time to make a change

Why should you care? Well, more companies and teams are starting to use `pip-compile` and `pip-sync` instead of the classic `pip freeze` workflow. So this could be the new norm in a couple of years. This article discusses:
- the difference in generating `requirements.txt`
- the difference in installing dependencies from `requirements.txt`
First things first, Python virtual environment

```shell
python -m venv env            # create a virtual Python environment named 'env'
env\Scripts\activate.bat      # activate env on Windows
source env/bin/activate       # activate env on macOS/Linux
```
It is good practice to use a Python virtual environment so that we can install certain versions of our libraries without these installations messing with our main Python interpreter.
Normal pip (classic pip)
Generating `requirements.txt` using `pip freeze`:

```shell
pip freeze > requirements.txt
```

^ this command creates a text file `requirements.txt` which contains all our necessary dependencies. Every installed library in `env` will be listed here.
Installing the stuff inside `requirements.txt`:

```shell
pip install -r requirements.txt
```

The `-r` flag is short for `--requirement`, and tells pip to read from a requirements file: this command installs every single line inside `requirements.txt`.
The problem with normal pip
Let’s talk about two kinds of dependencies:
- the stuff we need directly for our project — the main dependencies
- the stuff our main dependencies depend on — the sub dependencies
I’m going to `pip freeze` a virtual Python environment with just numpy and pandas installed:

```
numpy==1.25.2
pandas==2.0.3
python-dateutil==2.8.2
pytz==2023.3
six==1.16.0
tzdata==2023.3
```
Here, we can see that apart from our main dependencies `numpy` and `pandas`, we have other stuff like `python-dateutil`, `pytz`, and so on. These are our sub dependencies.
Now, where do these sub dependencies come from? We can’t really tell whether they are from numpy or pandas here. Which gets infinitely more annoying if we have 50 other main dependencies.
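One rough way to trace this by hand is `pip show`, whose output includes a Requires field (a package’s direct dependencies) and a Required-by field (which installed packages depend on it). A small sketch, assuming the numpy/pandas environment from above:

```shell
# `pip show` prints metadata for an installed package. The "Requires:"
# field lists its direct dependencies, and "Required-by:" lists the
# installed packages that depend on it.
pip show pandas    # Requires: numpy, python-dateutil, pytz, tzdata
pip show six       # Required-by: python-dateutil
```

This works, but doing it package by package across dozens of dependencies is exactly the tedium pip-compile removes.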
But why do we need to care? Well, sometimes we need to upgrade our main dependency versions. And by extension, we need to upgrade our sub dependency versions or stuff might go horribly wrong.
Which is a problem, as `pip freeze` doesn’t differentiate between the main dependencies and sub dependencies.
Installing pip-tools
```shell
pip install pip-tools
```

`pip-tools` contains both `pip-compile` and `pip-sync`, which we will now explain.
Enter pip-compile
Let’s say our project only needs `numpy` and `pandas`. Let’s first create a `requirements.in` text file:

```
numpy
pandas
```
Next, let’s run `pip-compile requirements.in`. This automatically generates a `requirements.txt` file with the content:
```
#
# This file is autogenerated by pip-compile with Python 3.11
# by the following command:
#
#    pip-compile requirements.in
#
numpy==1.25.2
    # via
    #   -r requirements.in
    #   pandas
pandas==2.0.3
    # via -r requirements.in
python-dateutil==2.8.2
    # via pandas
pytz==2023.3
    # via pandas
six==1.16.0
    # via python-dateutil
tzdata==2023.3
    # via pandas
```
^ here, we generate a `requirements.txt` file from our input `requirements.in` file. And here we can tell where our dependencies are coming from, and by extension, differentiate between our main dependencies and sub dependencies.

To install the stuff inside `requirements.txt`, we use the same command as before:

```shell
pip install -r requirements.txt
```
Pip-compile from pyproject.toml
Previously, we created our `requirements.txt` based on `requirements.in`. We can also create `requirements.txt` from inputs other than `requirements.in`. One popular example would be a `pyproject.toml`:
```toml
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
name = "my-project"
version = "0.0.1"
description = "some description"
requires-python = ">=3.11"
authors = [
    {name = "myname", email = "[email protected]"}
]
dependencies = [
    "numpy",
    "pandas"
]
```
We can use the command `pip-compile pyproject.toml` to generate a `requirements.txt` based on our `pyproject.toml`. This is what our `requirements.txt` looks like now:
```
#
# This file is autogenerated by pip-compile with Python 3.11
# by the following command:
#
#    pip-compile pyproject.toml
#
numpy==1.25.2
    # via
    #   my-project (pyproject.toml)
    #   pandas
pandas==2.0.3
    # via my-project (pyproject.toml)
python-dateutil==2.8.2
    # via pandas
pytz==2023.3
    # via pandas
six==1.16.0
    # via python-dateutil
tzdata==2023.3
    # via pandas
```
- Each dependency’s source is still listed
- We can differentiate between main and sub dependencies
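pip-compile can also resolve optional dependency groups declared in `pyproject.toml`. As a sketch (the `dev` extra and `pytest` here are hypothetical additions, not part of the project above), we could extend the file with:

```toml
[project.optional-dependencies]
dev = [
    "pytest"
]
```

Running `pip-compile --extra dev pyproject.toml` would then pin the `dev` dependencies alongside the main ones in the generated `requirements.txt`.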
Enter pip-sync
I use `pip-sync` in place of `pip install -r requirements.txt` now.
- it updates a Python virtual environment to follow a `requirements.txt` file exactly
- dependencies that are not installed are installed
- dependencies that aren’t supposed to be there are uninstalled
Yeah, it saves us the time we might have needed to uninstall certain outdated libraries.
```shell
pip-sync
```

^ just run this command, and your existing Python virtual environment will automatically be synced up with your `requirements.txt`.
Conclusion
I foresee that this way of dealing with pip could replace the old way in a couple of years (it’s slowly replacing it now!), as I’m seeing more and more Python projects use this in place of `pip freeze` and `pip install`.
Hopefully this piece of knowledge turns out to be somewhat useful for you some time in the near future!
Some Final words
If this story was helpful and you wish to show a little support, you could:
- Clap 50 times for this story
- Leave a comment telling me what you think
- Highlight the parts in this story that resonate with you
- Sign up for a Medium membership using my link ($5/month to read unlimited Medium stories)
These actions really really help me out, and are much appreciated!
Ebooks I’ve Written: https://zlliu.co/ebooks
LinkedIn: https://www.linkedin.com/in/zlliu/
My Workspace: https://zlliu.co/workspace