Linux hardening - web servers

What is your checklist/routine when setting up a Linux web server?

What do you recommend to achieve maximum security?

Is there any preferred way to perform repeated maintenance?


Solution 1:

  • First of all, be aware that any scripting ability in Apache (PHP, CGI, Ruby, ...) is potentially equivalent to a shell account with the privileges of the user running the script.

  • If the server is shared by multiple users, consider using suexec (or ITK MPM, suggested by David Schmitt) so that not every script runs as the same Apache user.
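
As a sketch of the ITK MPM approach, each VirtualHost can be pinned to its own user with `AssignUserID`. The hostname, paths, and the `site1` user/group below are placeholders, not recommendations:

```apache
# mpm-itk runs each VirtualHost as its own user/group via AssignUserID,
# so a compromised script in one site can't touch the others' files.
# "site1" and the paths are illustrative.
<VirtualHost *:80>
    ServerName site1.example.com
    DocumentRoot /var/www/site1
    AssignUserID site1 site1
</VirtualHost>
```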

  • Virtualize or chroot Apache, so that any compromise is at least somewhat contained by an additional layer of security. Be aware that chrooting Apache can make maintenance harder, since you end up moving libraries into the jail, etc. On FreeBSD you can use a jail instead, which is much easier to maintain: you can install Apache from ports and run portaudit from within the jail, without worrying about library dependencies or moving files around manually (which always becomes an ugly mess), and you keep using the package management system (ports). (On GNU/Linux you can also use VServer for virtualization — suggested by David Schmitt.)

  • (Obviously) Keep up with updates and patches, not only for Apache but also for PHP, Ruby, Perl, etc. Don't just trust your OS to give you all the updates, either; some distros are extremely slow with their patches. Limit exposure time to 0-day vulnerabilities as much as possible. Stick the milw0rm feed in your RSS reader, subscribe to the insecure.org mailing lists, etc. Not only will this help you learn about vulnerabilities before your OS gets around to releasing a patch, you will also learn about vulnerabilities in, for example, PHP CMS applications, which may not be managed or patched by your OS at all.

  • Use something like tripwire/aide, audit, or mtree (on BSD) to keep track of changes on your filesystem. This one is really important. Have any changes mailed to you regularly and review them manually, every day. If a file changes that shouldn't change, investigate why. If malicious JavaScript somehow gets inserted into your pages through whatever method, you WILL catch it this way. This saves not only your server but also your users, since your own web pages can be abused to infect your visitors. (This is a very common tactic: the attackers often don't even care about your server, they just want to infect as many of your visitors as possible before they're discovered, and they usually don't even bother to hide their tracks. Catching a compromise like this as quickly as possible is very important.)
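
To illustrate the idea (not a replacement for aide/tripwire themselves), a hash-snapshot-and-diff sketch might look like this; the watched directory and the "injection" are fabricated for the demo:

```shell
#!/bin/sh
# Minimal sketch of the idea behind aide/tripwire: snapshot file hashes,
# then diff against the snapshot to spot changes. A real setup would use
# aide's own database, run from cron, and mail you the diff daily.
WATCH_DIR=$(mktemp -d)
BASELINE=$(mktemp)
echo '<html>ok</html>' > "$WATCH_DIR/index.html"

# 1. Take the baseline snapshot (run once; store it somewhere read-only).
find "$WATCH_DIR" -type f -exec sha256sum {} + | sort > "$BASELINE"

# 2. Simulate a malicious injection, then re-scan and compare.
echo '<script src="http://evil.example/x.js"></script>' >> "$WATCH_DIR/index.html"
find "$WATCH_DIR" -type f -exec sha256sum {} + | sort \
  | diff "$BASELINE" - > /dev/null || echo "CHANGES DETECTED"
```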

  • Using something like suhosin to protect PHP helps. But also learn to understand it, and tweak its config to your application's expected parameters.
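
For example, a few suhosin directives in php.ini might be tightened like this. The values are illustrative, not recommendations — they should come from what your application actually needs:

```ini
; Illustrative suhosin tweaks; tune each limit to your application.
suhosin.post.max_vars = 200
suhosin.get.max_value_length = 512
suhosin.executor.disable_eval = On
; Log-only mode while tuning, so legitimate traffic isn't blocked;
; turn this off once the limits are right.
suhosin.simulation = On
```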

  • A kernel patch such as PaX can help protect you from many buffer-overflow vulnerabilities, even if your software is vulnerable. (It doesn't make you invulnerable; it's just yet another minor layer.)

  • Don't get over-confident when using some security tool. Understand the tools you use, and use common sense. Read, learn, keep up with as much as you can.

  • Consider using mandatory access control (eg: SELinux). It allows you to specify, for each application, what it is allowed to do, in great detail: what files it is allowed to access, what kernel calls it is allowed to make, etc. This is a very involved process and requires lots of understanding. Some distros provide pre-made SELinux policies for their packages (eg: Gentoo). This suggestion somewhat contradicts the one below, but it is still valid nevertheless.

  • Keep things simple. A complex security strategy may work against you.

  • In Apache, set up very restrictive default rules (Options None, Deny from all, etc...) and override them as needed for specific VirtualHosts.
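
A sketch of such defaults, in the same Apache 2.2 syntax as above (directory paths are placeholders):

```apache
# Lock everything down at the filesystem root...
<Directory />
    Options None
    AllowOverride None
    Order deny,allow
    Deny from all
</Directory>

# ...then re-enable only what a specific site actually needs.
<Directory /var/www/example>
    Order allow,deny
    Allow from all
</Directory>
```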

  • Deny access to all dotfiles (which also immediately covers .htaccess files)
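
One way to do this in Apache (again 2.2 syntax) is a FilesMatch block on names beginning with a dot:

```apache
# Deny any file whose name starts with ".", which covers .htaccess,
# .htpasswd, stray .svn/.git metadata, and so on.
<FilesMatch "^\.">
    Order allow,deny
    Deny from all
</FilesMatch>
```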

  • Always use https anywhere there is any sort of password authentication.
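
For instance, a minimal TLS vhost plus a redirect for the login path. The hostname, certificate paths, and the /login path are all placeholders:

```apache
<VirtualHost *:443>
    ServerName www.example.com
    SSLEngine on
    SSLCertificateFile /etc/ssl/certs/example.pem
    SSLCertificateKeyFile /etc/ssl/private/example.key
</VirtualHost>

# Make sure credentials never travel over plain http.
<VirtualHost *:80>
    ServerName www.example.com
    Redirect permanent /login https://www.example.com/login
</VirtualHost>
```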

  • Your firewall should use a deny-by-default policy. Build some specific rules into your firewall to log specific traffic.
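
A deny-by-default ruleset in iptables-restore format might look like this; the open ports and the LOG rule are illustrative, so adjust them to the services you actually run:

```
# Sketch of /etc/iptables/rules.v4: drop inbound by default, allow
# loopback, established traffic, ssh, http/https; log what gets dropped.
*filter
:INPUT DROP [0:0]
:FORWARD DROP [0:0]
:OUTPUT ACCEPT [0:0]
-A INPUT -i lo -j ACCEPT
-A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
-A INPUT -p tcp --dport 22 -j ACCEPT
-A INPUT -p tcp --dport 80 -j ACCEPT
-A INPUT -p tcp --dport 443 -j ACCEPT
-A INPUT -j LOG --log-prefix "DROPPED: "
COMMIT
```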

  • Set up log parsing scripts to scan your logs for anomalies. (The Prelude IDS suite can do this, but honestly, I recommend building up your own scripts over time, as that will help you understand your own tools and rules better.)
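
As a toy example of the kind of script you might grow over time, this one counts failed ssh logins per source IP. The sample log lines are fabricated; a real cron job would read /var/log/auth.log and mail the result:

```shell
#!/bin/sh
# Toy log-parsing sketch: count "Failed password" lines per source IP
# and flag IPs above a threshold. Sample data is fabricated.
LOG=$(mktemp)
cat > "$LOG" <<'EOF'
Jan 10 03:11:01 web sshd[101]: Failed password for root from 203.0.113.5 port 4022 ssh2
Jan 10 03:11:04 web sshd[102]: Failed password for root from 203.0.113.5 port 4023 ssh2
Jan 10 03:12:19 web sshd[103]: Failed password for admin from 198.51.100.7 port 5511 ssh2
EOF

# Pull out the IP after "from", tally per IP, report repeat offenders.
grep 'Failed password' "$LOG" \
  | awk '{for (i = 1; i <= NF; i++) if ($i == "from") print $(i+1)}' \
  | sort | uniq -c | sort -rn \
  | awk '$1 >= 2 {print "suspicious:", $2, "("$1" failures)"}'
```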

  • Have the server mail you daily reports on last logged in users, active connections, bandwidth used, etc...

  • Have a cron job scan for suid binaries, world-writable files, and the like, and have the results mailed to you.
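
A self-contained sketch of such a scan; a real cron job would scan / and mail the output, but here a throwaway directory is seeded so the example runs anywhere without root:

```shell
#!/bin/sh
# Permissions-audit sketch: find setuid binaries and world-writable
# files. SCAN_DIR and the seeded files exist only to make the demo
# self-contained; a nightly job would scan / and mail this output.
SCAN_DIR=$(mktemp -d)
touch "$SCAN_DIR/normal"
touch "$SCAN_DIR/setuid-ish" && chmod 4755 "$SCAN_DIR/setuid-ish"
touch "$SCAN_DIR/world-writable" && chmod 666 "$SCAN_DIR/world-writable"

echo "== setuid binaries =="
find "$SCAN_DIR" -type f -perm -4000
echo "== world-writable files =="
find "$SCAN_DIR" -type f -perm -0002
```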

  • For any of the stuff you set up that gets mailed to you, you should build up a list of exceptions over time (folders to ignore filesystem changes in, 777 files to allow, suid binaries to allow). It is important that you only get notified of things that shouldn't happen. If you get a mail every day full of trivial stuff, you will start to ignore them, and they will become pointless.
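
A simple way to implement the exception list is to filter the report through a whitelist file before mailing it. The file names and entries below are placeholders standing in for the output of whatever scan you run:

```shell
#!/bin/sh
# Whitelist-filter sketch: subtract known-good findings from a report
# so the mail only contains surprises. Entries here are placeholders.
REPORT=$(mktemp); ALLOW=$(mktemp)
printf '%s\n' /usr/bin/passwd /usr/bin/sudo /tmp/strange-suid > "$REPORT"
printf '%s\n' /usr/bin/passwd /usr/bin/sudo > "$ALLOW"

# -x: match whole lines, -F: fixed strings, -v: keep non-matches.
NEW=$(grep -v -x -F -f "$ALLOW" "$REPORT")
[ -n "$NEW" ] && echo "unexpected: $NEW"
```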

  • Have a good, solid, layered, redundant backup strategy. And don't just assume that making an image or copy of everything works. For example, if MySQL is in the middle of writing to a table during your backup, its binary files may be corrupted when you restore that backup. So you will need a cron job that runs mysqldump on your databases, on top of regular images, nightly tarballs, version control, or whatever else you have set up. Think about your backup strategy. I mean, REALLY think about it.
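
For the MySQL part, a crontab sketch might look like this. The schedule and backup path are placeholders; `--single-transaction` gives a consistent snapshot for InnoDB tables without locking:

```
# Nightly consistent SQL dump alongside whatever image/tarball backups
# you already run ("%" must be escaped as "\%" inside crontab).
30 2 * * * mysqldump --single-transaction --all-databases | gzip > /backup/mysql-$(date +\%F).sql.gz
```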

  • Don't rely on lists like this for security :) Seriously! You'll find lots of these all over the internet, go read them all, research every suggestion, and use common sense and experience to make up your own mind. In the end, experience and common sense are the only things that will save you. Not lists, nor tools. Do read, but don't just copy without understanding.

Solution 2:

I recommend the Linux Security Checklist from SANS. I use that, plus some in-house procedures. The checklist might be a bit outdated, but many of the key points still hold true.