How to set target hosts in Fabric file

I want to use Fabric to deploy my web app code to development, staging and production servers. My fabfile:

from fabric.api import env

def deploy_2_dev():
    deploy('dev')

def deploy_2_staging():
    deploy('staging')

def deploy_2_prod():
    deploy('prod')

def deploy(server):
    print 'env.hosts:', env.hosts
    env.hosts = [server]
    print 'env.hosts:', env.hosts

Sample output:

host:folder user$ fab deploy_2_dev
env.hosts: []
env.hosts: ['dev']
No hosts found. Please specify (single) host string for connection:

When I create a set_hosts() task as shown in the Fabric docs, env.hosts is set properly. However, that is not a viable option for me, and neither is a decorator. Passing hosts on the command line would ultimately mean wrapping the fabfile in some kind of shell script; I would prefer to have one single tool do the job properly.
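For reference, the set_hosts() pattern from the Fabric docs looks roughly like this (the host names are placeholders):

from fabric.api import env, run

def set_hosts():
    env.hosts = ['host1', 'host2']

def mytask():
    run('ls /var/www')

Invoked as fab set_hosts mytask, this sets env.hosts before mytask runs.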

The Fabric docs say that 'env.hosts is simply a Python list object'. From my observations, that is simply not true.

Can anyone explain what is going on here? How can I set the hosts to deploy to?


I do this by declaring an actual function for each environment. For example:

from fabric.api import env

def test():
    env.user = 'testuser'
    env.hosts = ['test.server.com']

def prod():
    env.user = 'produser'
    env.hosts = ['prod.server.com']

def deploy():
    ...

Using the above functions, I would type the following to deploy to my test environment:

fab test deploy

...and the following to deploy to production:

fab prod deploy

The nice thing about this approach is that the test and prod functions can be used before any fab task, not just deploy, which makes them incredibly useful; see the sketch below.
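For example, assuming a hypothetical restart task in the same fabfile (the service name is made up), the same environment selectors work unchanged:

from fabric.api import sudo

def restart():
    # Runs against whatever env.hosts/env.user the preceding
    # environment task set; 'myapp' is a placeholder service name.
    sudo('service myapp restart')

fab prod restart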


Use roledefs

from fabric.api import env, run

env.roledefs = {
    'test': ['localhost'],
    'dev': ['[email protected]'],
    'staging': ['[email protected]'],
    'production': ['[email protected]']
} 

def deploy():
    run('echo test')

Choose role with -R:

$ fab -R test deploy
[localhost] Executing task 'deploy'
...
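If you'd rather not pass -R every time, the same roledefs can be bound to a task in the fabfile itself with the @roles decorator. A minimal sketch:

from fabric.api import env, roles, run

env.roledefs = {
    'test': ['localhost'],
}

@roles('test')
def deploy():
    run('echo test')

With the role baked in, a plain fab deploy targets the 'test' hosts.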

Here's a simpler version of serverhorror's answer:

from fabric.api import run, settings

def mystuff():
    with settings(host_string='192.0.2.78'):
        run("hostname -f")

I was stuck on this myself, but finally figured it out. You simply can't set the env.hosts configuration from within a task. Each task is executed N times, once for each host specified, so the setting is fundamentally outside of task scope.

Looking at your code above, you could simply do this:

from fabric.api import hosts

@hosts('dev')
def deploy_dev():
    deploy()

@hosts('staging')
def deploy_staging():
    deploy()

def deploy():
    # do stuff...
    pass
Which seems like it would do what you're intending.

Or you can write some custom code in the global scope of the fabfile that parses the command-line arguments manually and sets env.hosts before your task functions are defined. For a few reasons, that's actually how I've set mine up.
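The answer doesn't show its code, but a minimal sketch of that global-scope idea might look like this (the environment names, hosts, and no-op tasks are assumptions for illustration):

import sys

from fabric.api import env, run

# Hypothetical environment-name -> host-list mapping.
HOSTS_BY_ENV = {
    'dev': ['[email protected]'],
    'prod': ['[email protected]'],
}

# Global scope: this runs when fab imports the fabfile, i.e. before
# any task executes, so env.hosts is already set by task time.
for _name, _hosts in HOSTS_BY_ENV.items():
    if _name in sys.argv[1:]:
        env.hosts = _hosts

# No-op tasks with matching names so fab accepts them on the
# command line, e.g.: fab dev deploy
def dev(): pass
def prod(): pass

def deploy():
    run('echo deploying')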


Since Fabric 1.5 there is a documented way to dynamically set hosts:

http://docs.fabfile.org/en/1.7/usage/execution.html#dynamic-hosts

Quoting from the docs:

Using execute with dynamically-set host lists

A common intermediate-to-advanced use case for Fabric is to parameterize lookup of one’s target host list at runtime (when use of Roles does not suffice). execute can make this extremely simple, like so:

from fabric.api import run, execute, task

# For example, code talking to an HTTP API, or a database, or ...
from mylib import external_datastore

# This is the actual algorithm involved. It does not care about host
# lists at all.
def do_work():
    run("something interesting on a host")

# This is the user-facing task invoked on the command line.
@task
def deploy(lookup_param):
    # This is the magic you don't get with @hosts or @roles.
    # Even lazy-loading roles require you to declare available roles
    # beforehand. Here, the sky is the limit.
    host_list = external_datastore.query(lookup_param)
    # Put this dynamically generated host list together with the work to be
    # done.
    execute(do_work, hosts=host_list)
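Arguments are passed to tasks with Fabric's colon syntax on the command line, so an invocation of the task above might look like this ('db_servers' is a made-up example; the lookup parameter is whatever external_datastore expects):

$ fab deploy:db_servers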