When should I create a new user account to run software on a server?
In general, when should one create a new user account to run a piece of internet-facing software on a server?
For instance, suppose I'm using a shared Debian server (e.g. via Dreamhost) and I want to run some websites using WordPress, some using Redmine, some using Ruby on Rails, maybe some using Django, and I'd like to serve Mercurial repositories too.
On Dreamhost servers and many other similarly set-up servers, this could all be done under a single user account, but I can see some drawbacks to that approach:
- A lengthier .bashrc
- If that one account becomes compromised, so do all the sites running under it.
On the other hand, having lots of user accounts could become a bit of a pain to keep track of, especially if some of them have identical requirements in terms of installed software. For instance, having one account for each website running WordPress might be overkill.
What's the best practice? Is it simply a question of reducing the number of hosted sites (or hosted repositories, etc) per user account proportionately to one's level of paranoia?
Please post your opinions on this, giving your reasons for them.
Also, if you have any reasons for thinking that the approach taken on a private server or VPS should differ from the approach taken on a shared server, please outline what they are and, again, your reasons for them.
Solution 1:
I am generally a fan of "one user for anything that opens a listening socket on the network" -- one for Apache, one for mail, one for DNS, etc.
This is (as of the last I heard) still best current practice, and the reasoning behind it is plain and simple paranoia: these services are exposed to the Big Bad Internet, and if someone finds a vulnerability and exploits it before I have a chance to patch the software, at least the intruder is confined to one user account, with only the privileges required to run the single service that account is responsible for.
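To make that concrete, here's a minimal sketch of creating such a locked-down service account on a Debian-style system (run as root; "svc-web" is an illustrative name, not a standard account):

```shell
# Create a system account with no home directory and no login shell,
# so a compromised service can't be turned into an interactive session.
# "svc-web" is an assumed, illustrative name.
useradd --system --no-create-home --shell /usr/sbin/nologin svc-web

# Verify: the seventh passwd field should be the nologin shell.
getent passwd svc-web
```

The daemon is then started as that user (via its init script or its own configuration's user setting), so a compromise is limited to whatever that one account can read and write.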
Generally speaking I consider this level of isolation sufficient to protect the system, though each application remains an island of vulnerability: for example, if someone installs a vulnerable WordPress plugin, everything Apache has access to (i.e. all the websites) is effectively vulnerable in the event of a compromise.
An extended version of that argument can thus be made for sandboxing each shared-hosting client's website with its own Apache configuration and user (you don't necessarily have to install a full web stack for each site, just a separate Apache configuration specifying a different user). The downside is that each site now runs its own set of Apache processes, so your RAM usage just went up substantially, and anything world-readable is still exposed if any single Apache instance/user gets compromised.
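The per-site configuration described above essentially boils down to each instance getting its own `User`/`Group`, listen port, and log/pid paths. A minimal sketch of one client's instance config -- all names, ports, and paths here are illustrative assumptions:

```apache
# /etc/apache2/site-alice.conf -- one Apache instance for one client.
# "alice", port 8081, and the paths below are assumed examples.
User alice
Group alice
Listen 8081
PidFile /var/run/apache2-alice.pid
ErrorLog /var/log/apache2/alice-error.log
DocumentRoot /home/alice/public_html
```

Each such instance is started separately (e.g. `apache2 -f /etc/apache2/site-alice.conf`), typically behind a front-end proxy that maps hostnames to the per-site ports -- and each instance's worker pool is where the extra RAM cost comes from.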
Extending further, the argument can be made for putting each Apache instance in a chroot (or a jail, if you're on BSD systems) for even more security. But now you're talking about additional disk space, since each chroot/jail needs all the software required to run the site it contains (and that software must be updated in every jail when patches come out, rather than in one master copy on the server), plus the same RAM overhead as with separate users/Apache instances.
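As a toy illustration of the mechanism (not a full site jail, which would need the whole web stack copied in), here's a sketch that builds a chroot containing just a shell and its libraries -- it assumes a Debian-style filesystem layout, a dynamically linked `/bin/sh`, and root privileges:

```shell
# Build a throwaway chroot holding only /bin/sh and the libraries it
# links against; a real jail would also need Apache, PHP, etc. --
# exactly the disk-space and patching cost described above.
jail=/srv/jail-demo
mkdir -p "$jail/bin"
cp /bin/sh "$jail/bin/"

# Copy every shared library /bin/sh depends on into the jail.
for lib in $(ldd /bin/sh | grep -o '/[^ ]*'); do
    mkdir -p "$jail$(dirname "$lib")"
    cp "$lib" "$jail$lib"
done

# A process started this way can only see files inside $jail.
chroot "$jail" /bin/sh -c 'echo inside the jail'
```

Multiply that copied-in software by the number of sites and you can see where the disk and maintenance overhead comes from.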
This mitigates everything except an OS/kernel bug that lets users break out of the chroot (which becomes the argument for running every site on a separate physical server -- which in turn becomes the argument for separating the sites into different VLANs/subnets, etc.).
As with all risks, you can't eliminate it: you can only mitigate it down to an acceptable level based on the potential harm/cost of a compromise, the likelihood of a compromise, and the cost of each level of mitigation.
For my money, for a non-critical, non-e-commerce shared hosting environment, the basic "one user for Apache, one for DNS, one for mail, etc." safety net is enough. If there's a need for security beyond that level, your users should be seriously considering their own hardware.
Solution 2:
Generally what I do is have one user for external-facing services that is not allowed to log in ("nobody", for example), and one account that is allowed to log in and use su or sudo. Naturally, make sure your usernames are different and not easily guessable.
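A sketch of that two-account split on a Debian-style system (run as root; "svc-www" and "deploy" are assumed names -- per the advice above, pick something less guessable in practice):

```shell
# Service account: owns the daemon, cannot log in.
useradd --system --no-create-home --shell /usr/sbin/nologin svc-www

# Admin account: interactive login plus sudo via Debian's "sudo" group.
useradd --create-home --shell /bin/bash deploy
usermod -aG sudo deploy
```

The service runs as the first account; day-to-day administration happens as the second, escalating with sudo only when needed.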
I don't see having one user per service as necessary unless you are running a shared hosting environment where each customer has a login. If you realistically see yourself as a very attractive target for hacking, you may as well isolate as much as possible. However, unless you are doing something very controversial or hosting financial data, you aren't really that attractive a target.