How do you use pip, virtualenv and Fabric to handle deployment?
Solution 1:
"Best practices" are very context-dependent, so I won't claim my practices are best, just that they work for me. I work on mostly small sites, so no multiple-server deployments, CDNs etc. I do need to support Webfaction shared hosting deployment, as some clients need the cheapest hosting they can find. I do often have to deploy sites multiple times in different environments, so repeatable scripted deploys are critical.
- I don't use pip bundles; I install from a requirements.txt. I do run my own chishop server with sdists of everything I need, so there aren't multiple single points of failure in the build process. I also use PIP_DOWNLOAD_CACHE on my development machines to speed up bootstrapping project environments, since most of my projects' requirements overlap quite a bit.
- I have Fabric scripts that can automatically set up and configure nginx + Apache/mod_wsgi on an Ubuntu VPS, or configure the equivalent on Webfaction shared hosting, and then deploy the project.
- I do not use --no-site-packages with virtualenv, because I prefer having slow-moving compiled packages (Python Imaging Library, psycopg2) installed at the system level; too slow and troublesome to do inside every virtualenv. I have not had trouble with polluted system site-packages, because I generally don't pollute it. And in any case, you can install a different version of something in the virtualenv and it will take precedence.
- Each project has its own virtualenv. I have some bash scripts (not virtualenvwrapper, though a lot of people use that and love it) that automate deploying the virtualenv for a given project to a known location and installing that project's requirements into it (see the sketch after this list).
- The entire deployment process, from a bare Ubuntu server VPS or Webfaction shared hosting account to a running website, is scripted using Fabric.
- Fabric scripts are part of the project source tree, and I run them from a local development checkout.
- I have no need for SCons (that I am aware of).
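For concreteness, here is a minimal sketch (not the author's actual scripts) of what the per-project virtualenv and requirements step could look like in a Fabric 1.x fabfile; the host, paths and project name are assumptions:

```python
# fabfile.py -- minimal sketch of per-project virtualenv setup (Fabric 1.x style).
# The host, project root and file names below are hypothetical.
from fabric.api import env, run, cd

env.hosts = ["myproject.example.com"]
env.project_root = "/srv/myproject"      # the "known location" for this project

def setup_virtualenv():
    """Create the project's virtualenv and install its requirements into it."""
    # No --no-site-packages: slow-moving compiled packages (PIL, psycopg2)
    # installed at the system level remain visible inside the virtualenv.
    run("virtualenv %s/env" % env.project_root)
    with cd(env.project_root):
        # Install from requirements.txt; point pip at a private index
        # (e.g. a chishop server) with --index-url if desired.
        run("env/bin/pip install -r requirements.txt")
```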
Deployment
At the moment a fresh deployment is split into these steps:
- fab staging bootstrap (server setup and initial code deploy)
- fab staging enable (enable the Apache/nginx config for this site)
- fab staging reload_server (reload Apache/nginx config)
Those can of course be combined into a single command line: fab staging bootstrap enable reload_server.
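For illustration, a rough sketch of how these tasks could be laid out in a Fabric 1.x fabfile follows; the hostnames, paths and commands in the task bodies are assumptions, not the original scripts:

```python
# Sketch of the task layout (Fabric 1.x style); task bodies are simplified guesses.
from fabric.api import env, run, sudo

def staging():
    """Select the staging environment: fab staging <task> [<task> ...]."""
    env.hosts = ["staging.example.com"]       # assumed hostname
    env.deploy_root = "/srv/example/staging"  # assumed path

def bootstrap():
    """Server setup and initial code deploy."""
    sudo("apt-get install -y nginx apache2 libapache2-mod-wsgi")
    run("git clone https://example.com/myproject.git %s" % env.deploy_root)

def enable():
    """Enable the Apache/nginx config for this site."""
    sudo("ln -sf %s/deploy/nginx.conf /etc/nginx/sites-enabled/myproject"
         % env.deploy_root)

def reload_server():
    """Reload Apache/nginx so the new configuration takes effect."""
    sudo("service nginx reload")
    sudo("apache2ctl graceful")
```

Chaining works because Fabric runs the named tasks left to right against the same env, so staging sets the hosts and paths before bootstrap, enable and reload_server execute.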
Once these steps are done, updating the deployment with new code is just fab staging deploy.
If I need to roll back an update, fab staging rollback. Nothing particularly magical in the rollback; it just rolls back the code to the last-deployed version and migrates the database to the previous state (this does require recording some metadata about the migration state of the DB post-deploy; I just do that in a text file).
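The deploy/rollback pair might look roughly like this; the bookkeeping file names, VCS commands and migration tool are assumptions for illustration, since the answer only says that the migration state is recorded in a text file:

```python
# Sketch of deploy/rollback (Fabric 1.x style); the bookkeeping files and
# commands below are hypothetical.
from fabric.api import env, run, cd

def deploy():
    """Deploy new code, recording enough metadata to roll back later."""
    with cd(env.deploy_root):
        run("git rev-parse HEAD > previous-revision.txt")   # last-deployed code
        # Record the current DB migration state in a plain text file.
        run("env/bin/python manage.py migrate --list > previous-migrations.txt")
        run("git pull")
        run("env/bin/pip install -r requirements.txt")
        run("env/bin/python manage.py migrate")

def rollback():
    """Roll the code back to the last-deployed version and migrate the DB back."""
    with cd(env.deploy_root):
        run("git checkout $(cat previous-revision.txt)")
        # Then migrate each app back to the migration recorded in
        # previous-migrations.txt; the exact command depends on the migration
        # tool in use (with South: manage.py migrate <app> <migration_name>).
```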
Examples
I haven't used the Fabric scripts described in this answer for a few years, so they aren't maintained at all and I disclaim responsibility for their quality :-) But you can see them at https://bitbucket.org/carljm/django-project-template - in fabfile.py in the repo root, and in the deploy/ subdirectory.
Solution 2:
I use fabric to build and deploy my code, and I assume a system that is already set up for that. I think a tool like puppet is more appropriate for automating the installation of things like apache and mysql, though I have yet to really include it in my workflow.
Also, I usually have a different virtualenv per project. They are created from a 'base' install of python where, as Carl pointed out, you can leave some global python libraries installed.
So in terms of workflow that would be:
- puppet to install required services (web server, database, ssh server, ...)
- puppet to set up required users and base folders
- fabric to create virtualenv for the application (see the sketch after this list)
- fabric to pip install from requirements.txt
- fabric to deploy your app
- fabric to deploy configuration files (web server, ...)
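A rough sketch of the Fabric half of that workflow (steps 3-6), assuming the puppet steps have already created the user and base folders; the host, paths and config file names are made up for illustration:

```python
# fabfile.py -- sketch of the Fabric portion of the workflow (Fabric 1.x style).
# Host, paths and config file locations are hypothetical.
from fabric.api import env, run, put, cd

env.hosts = ["app.example.com"]
env.app_root = "/home/app/myproject"

def create_virtualenv():
    run("virtualenv %s/env" % env.app_root)

def install_requirements():
    with cd(env.app_root):
        run("env/bin/pip install -r requirements.txt")

def deploy_app():
    # Fetch or update the application code (assumes the repo is already cloned).
    with cd(env.app_root):
        run("git pull")

def deploy_config():
    # Push configuration files (web server, etc.) to the server.
    put("deploy/nginx.conf", "%s/nginx.conf" % env.app_root)

def full_deploy():
    """Run the Fabric steps of the workflow in order."""
    create_virtualenv()
    install_requirements()
    deploy_app()
    deploy_config()
```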