requirements.txt vs setup.py
Solution 1:
requirements.txt:
This helps you to set up your development environment. Programs like pip can be used to install all packages listed in the file in one fell swoop. After that you can start developing your python script. It is especially useful if you plan to have others contribute to the development or use virtual environments.
This is how you use it:
pip install -r requirements.txt
It can be produced easily by pip itself:
pip freeze > requirements.txt
pip automatically tries to only add packages that are not installed by default, so the produced file is pretty minimal.
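For illustration, a freeze-generated requirements.txt pins exact versions, one package per line (the package names and versions below are just placeholders):
# requirements.txt (illustrative output of pip freeze)
requests==2.31.0
urllib3==2.0.7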
setup.py:
This helps you to create packages that you can redistribute.
The setup.py script is meant to install your package on the end user's system, not to prepare the development environment as pip install -r requirements.txt does. See this answer for more details on setup.py.
The dependencies of your project are listed in both files.
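For example, a minimal setup.py might declare its runtime dependencies via install_requires (the package name, version, and dependency below are placeholders, not anything prescribed by the answer above):
# setup.py -- minimal sketch
from setuptools import setup

setup(
    name="my_package",
    version="0.1.0",
    packages=["my_package"],
    install_requires=["requests>=2.0"],  # runtime dependencies, analogous to requirements.txt entries
)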
Solution 2:
The short answer is that requirements.txt is for listing package requirements only. setup.py on the other hand is more like an installation script. If you don't plan on installing the python code, typically you would only need requirements.txt.
The file setup.py describes, in addition to the package dependencies, the set of files and modules that should be packaged (or compiled, in the case of native modules (i.e., written in C)), and metadata to add to the python package listings (e.g. package name, package version, package description, author, ...).
Because both files list dependencies, this can lead to a bit of duplication. Read below for details.
requirements.txt
This file lists python package requirements. It is a plain text file (optionally with comments) that lists the package dependencies of your python project (one per line). It does not describe the way in which your python package is installed. You would generally consume the requirements file with pip install -r requirements.txt.
The filename of the text file is arbitrary, but is often requirements.txt by convention. When exploring source code repositories of other python packages, you might stumble on other names, such as dev-dependencies.txt or dependencies-dev.txt. Those serve the same purpose as requirements.txt but generally list additional dependencies of interest to developers of the particular package, namely for testing the source code (e.g. pytest, pylint, etc.) before release. Users of the package generally wouldn't need the entire set of developer dependencies to run the package.
If multiple requirements-X.txt variants are present, then usually one will list runtime dependencies, and the other build-time or test dependencies. Some projects also cascade their requirements files, i.e. one requirements file includes another file (example). Doing so can reduce repetition.
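As a sketch of the cascading approach, a development requirements file can include the runtime one via pip's -r directive (the filenames below are just a common convention, not ones mandated by the answer):
# dev-requirements.txt (hypothetical)
-r requirements.txt   # pull in the runtime dependencies first
pytest
pylint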
setup.py
This is a python script which uses the setuptools module to define a python package (name, files included, package metadata, and installation). It will, like requirements.txt, also list runtime dependencies of the package. Setuptools is the de-facto way to build and install python packages, but it has its shortcomings, which over time have spurred the development of new "meta-package managers", like pip. Example shortcomings of setuptools are its inability to install multiple versions of the same package, and its lack of an uninstall command.
When a python user does pip install ./pkgdir_my_module (or pip install my-module), pip will run setup.py in the given directory (or module). Similarly, any module which has a setup.py can be pip-installed, e.g. by running pip install . from the same folder.
Do I really need both?
Short answer is no, but it's nice to have both. They achieve different purposes, but they can both be used to list your dependencies.
There is one trick you may consider to avoid duplicating your list of dependencies between requirements.txt and setup.py. If you have written a fully working setup.py for your package already, and your dependencies are mostly external, you could consider having a simple requirements.txt with only the following:
# requirements.txt
#
# installs dependencies from ./setup.py, and the package itself,
# in editable mode
-e .
# (the -e above is optional). you could also just install the package
# normally with just the line below (after uncommenting)
# .
The -e is a special pip install option which installs the given package in editable mode. When pip install -r requirements.txt is run on this file, pip will install your dependencies via the list in ./setup.py. The editable option will place a symlink in your install directory (instead of an egg or archived copy). It allows developers to edit code in place from the repository without reinstalling.
You can also take advantage of what's called "setuptools extras" when you have both files in your package repository. You can define optional packages in setup.py under a custom category, and install those packages from just that category with pip:
# setup.py
from setuptools import setup

setup(
    name="FOO",
    # ...
    extras_require={
        'dev': ['pylint'],
        'build': ['requests'],
    },
    # ...
)
and then, in the requirements file:
# install packages in the [build] category, from setup.py
# (path/to/mypkg is the directory where setup.py is)
-e path/to/mypkg[build]
This would keep all your dependency lists inside setup.py.
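The same extras can also be installed with pip directly, without going through a requirements file; for example, from the directory containing the setup.py above:
pip install ".[dev]"       # the package plus the 'dev' extras
pip install -e ".[build]"  # editable install plus the 'build' extras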
Note: You would normally execute pip and setup.py from a sandbox, such as those created with the program virtualenv. This will avoid installing python packages outside the context of your project's development environment.
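As a minimal sketch, such a sandboxed workflow might look like this (the built-in venv module works just as well as virtualenv, and the .venv directory name is arbitrary):
virtualenv .venv                 # or: python -m venv .venv
source .venv/bin/activate        # on Windows: .venv\Scripts\activate
pip install -r requirements.txt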
Solution 3:
For the sake of completeness, here is how I see it from 4 different angles.
- Their design purposes are different
This is the precise description quoted from the official documentation (emphasis mine):
Whereas install_requires (in setup.py) defines the dependencies for a single project, Requirements Files are often used to define the requirements for a complete Python environment.
Whereas install_requires requirements are minimal, requirements files often contain an exhaustive listing of pinned versions for the purpose of achieving repeatable installations of a complete environment.
But it might still not be easy to understand, so in the next section there are 2 factual examples to demonstrate how the 2 approaches are supposed to be used differently.
- Their actual usages are therefore (supposed to be) different
- If your project foo is going to be released as a standalone library (meaning, others would probably do import foo), then you (and your downstream users) would want to have a flexible declaration of dependencies, so that your library would not (and it must not) be "picky" about the exact versions of YOUR dependencies. So, typically, your setup.py would contain lines like this:
install_requires=[
    'A>=1,<2',
    'B>=2',
]
- If you just want to somehow "document" or "pin" your EXACT current environment for your application bar, meaning, you or your users would like to use your application bar as-is, i.e. running python bar.py, you may want to freeze your environment so that it would always behave the same. In such a case, your requirements file would look like this:
A==1.2.3
B==2.3.4
# It could even contain some dependencies NOT strictly required by your library
pylint==3.4.5
- In reality, which one do I use?
- If you are developing an application bar which will be used by python bar.py, even if that is "just a script for fun", you are still recommended to use requirements.txt because, who knows, next week (which happens to be Christmas) you might receive a new computer as a gift, and you would need to set up your exact environment there again.
- If you are developing a library foo which will be used by import foo, you have to prepare a setup.py. Period. But you may still choose to also provide a requirements.txt at the same time, which can:
(a) either be in the A==1.2.3 style (as explained in #2 above);
(b) or just contain a magical single .
The latter is essentially using the conventional requirements.txt habit to document that your installation step is pip install ., which means "install the requirements based on setup.py" without duplication. Personally I consider this last approach kind of blurs the line and adds to the confusion, but it is nonetheless a convenient way to explicitly opt out of dependency pinning when running in a CI environment. The trick was derived from an approach mentioned by Python packaging maintainer Donald in his blog post.
- Different lower bounds.
Assuming there is an existing engine library with this history:
engine 1.1.0  Use steam
...
engine 1.2.0  Internal combustion is invented
engine 1.2.1  Fix engine leaking oil
engine 1.2.2  Fix engine overheat
engine 1.2.3  Fix occasional engine stalling
engine 2.0.0  Introducing nuclear reactor
You follow the above 3 criteria and correctly decide that your new library hybrid-engine would use a setup.py to declare its dependency engine>=1.2.0,<2, and that your separate application reliable-car would use requirements.txt to declare its dependency engine>=1.2.3,<2 (or you may want to just pin engine==1.2.3). As you see, your choices for their lower bounds are still subtly different, and neither of them uses the latest engine==2.0.0. Here is why; a small sketch of the two resulting files follows the two bullets below.
. And here is why.-
hybrid-engine
depends onengine>=1.2.0
because, the neededadd_fuel()
API was first introduced inengine 1.2.0
, and that capability is the necessity ofhybrid-engine
, regardless of whether there might be some (minor) bugs inside such version and been fixed in subsequent versions 1.2.1, 1.2.2 and 1.2.3. -
- reliable-car depends on engine>=1.2.3 because that is the earliest version WITHOUT known issues, so far. Sure, there are new capabilities in later versions, i.e. the "nuclear reactor" introduced in engine 2.0.0, but they are not necessarily desirable for project reliable-car. (Yet another new project of yours, time-machine, would likely use engine>=2.0.0, but that is a different topic.)
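To make the two choices concrete, here is a hedged sketch of the two files; everything besides the engine version ranges quoted above (package names, version numbers, module names) is a placeholder:
# hybrid-engine/setup.py -- illustrative sketch
from setuptools import setup

setup(
    name="hybrid-engine",
    version="0.1.0",
    packages=["hybrid_engine"],
    install_requires=["engine>=1.2.0,<2"],  # flexible range, suitable for a library
)

# reliable-car/requirements.txt -- illustrative sketch
engine>=1.2.3,<2    # or pin exactly: engine==1.2.3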