Config deployment on multiple servers

You can use any of the modern change-automation tools (Puppet, Chef, CFEngine, Bcfg2, and so forth) for this. Any of them can deploy files and restart services when the files they manage are modified.
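As a rough illustration of the core behavior these tools give you, here is a minimal Python sketch (the paths and service name are hypothetical): deploy a managed file and restart the owning service only when the content actually changed. Puppet expresses the same idea declaratively, with a file resource that notifies a service.

```python
# Minimal sketch of what a config management tool does for each managed file:
# install it and restart the service only if the content changed.
# Source path, destination, and service name here are hypothetical.
import filecmp
import os
import shutil
import subprocess

MANAGED = [
    # (source in repo, destination on host, service to restart)
    ("files/sshd_config", "/etc/ssh/sshd_config", "sshd"),
]

for src, dst, service in MANAGED:
    changed = not (os.path.exists(dst) and filecmp.cmp(src, dst, shallow=False))
    if changed:
        shutil.copy2(src, dst)  # deploy the new file, preserving mode and times
        subprocess.run(["service", service, "restart"], check=True)
```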

I've had great success with Puppet over the last few years in several environments.

Once you start using the tool for everything, it has the added benefit of documenting both your process and infrastructure.

Back it with a version control tool such as git or svn, and now you have... a versioned infrastructure.


I generally agree with bdha's answer: use a configuration management tool to manage your changes. Another point I want to make is that you should strive to use your system's package management tool as much as possible for everything that isn't a configuration file. It is much easier to manage a system that is a collection of installed packages than a system with a bunch of manual file edits (or a bunch of automated file edits via Puppet).

If you have configuration files that never change, those are also candidates for inclusion in system packages. Learn how to build packages with your system's packaging tools, and how to stage them in a centralized repository, so you can then use tools like yum to install and manage them.
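One payoff of the package-centric approach is that drift becomes a query rather than a file-by-file diff. A hedged Python sketch (the package names are made up) that audits installed RPM versions against a desired list:

```python
# Hypothetical audit: compare installed RPM versions against a desired list,
# so drift from your central repo is easy to spot. Package names are made up.
import subprocess

DESIRED = {"mysite-httpd-config": "1.4", "mysite-motd": "2.0"}

for pkg, want in DESIRED.items():
    query = subprocess.run(["rpm", "-q", "--qf", "%{VERSION}", pkg],
                           capture_output=True, text=True)
    have = query.stdout.strip() if query.returncode == 0 else None
    if have != want:
        print(f"{pkg}: have {have}, want {want}")
```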

Also, carefully consider your software push system. A lot of people use Puppet or CFEngine for this, but again there are more specialized tools that may scale better as your environment grows. Examples of this type of tool include Capistrano and Pogo.
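The core trick of such push tools is running the deploy step on many hosts in parallel. A hedged Python sketch of the idea (the host list and deploy command are assumptions, not anyone's real setup):

```python
# Hypothetical parallel push in the spirit of tools like Capistrano or Pogo:
# run the same deploy command on many hosts concurrently over ssh and
# report per-host status. Inventory and command are made up.
import subprocess
from concurrent.futures import ThreadPoolExecutor

HOSTS = ["web01", "web02", "web03"]            # assumed inventory
DEPLOY_CMD = "sudo /usr/local/bin/deploy-app"  # hypothetical deploy script

def deploy(host):
    result = subprocess.run(["ssh", host, DEPLOY_CMD],
                            capture_output=True, text=True)
    return host, result.returncode

with ThreadPoolExecutor(max_workers=10) as pool:
    for host, rc in pool.map(deploy, HOSTS):
        print(f"{host}: {'ok' if rc == 0 else 'FAILED'}")
```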


If you have a large number of servers, you should definitely look at Puppet or Chef. They are the best solutions for taking care of all your requirements, and they can even reload a service's configuration as soon as the new one is picked up.

If you find that a bit overkill, you could just write a script on a central box that pushes the config out over ssh (with ssh keys set up across the hosts). If I were you, I would put that central repo under Mercurial or Bazaar to track changes and be able to roll back easily in case things go bad.
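A minimal sketch of that approach, assuming key-based ssh access from the central box and a Mercurial repo holding the configs (hosts, file names, and the reload command are made up):

```python
# Hedged sketch: commit the config change in the central Mercurial repo,
# then push it to each host over ssh and reload the affected service.
# Hosts, file names, and the reload command are hypothetical.
import subprocess

HOSTS = ["web01", "web02"]
SRC = "nginx.conf"                   # file tracked in the central repo
DEST = "/etc/nginx/nginx.conf"

# Record the change first so it can be rolled back later.
subprocess.run(["hg", "commit", "-m", "update nginx.conf"], check=True)

for host in HOSTS:
    subprocess.run(["scp", SRC, f"{host}:{DEST}"], check=True)
    subprocess.run(["ssh", host, "sudo service nginx reload"], check=True)
```

Rolling back is then just `hg update -r <rev>` in the central repo followed by the same push.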