Simple web server monitoring (alive)
Any tips on software to monitor whether a web server is up and running on Linux? It should be able to run knowing nothing more than the URL, and it must be able to send an email alert when the site goes down. It would not be hard to write a script for this myself, but that seems pointless if something nice already exists.
Note that I am going to monitor internal servers, so this needs to be a tool that runs on my machine on the same network, not an external web-based service.
And note that small and simple solutions are preferred.
Update: I eventually created a small Python script that I am currently using for this; it can be found here.
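For reference, a minimal sketch of what such a check could look like in Python. This is a hypothetical illustration, not the script linked above; the URL, addresses, and SMTP host are all placeholders:

```python
"""Minimal uptime check: a hypothetical sketch, not the linked script."""
import smtplib
import urllib.request
import urllib.error
from email.message import EmailMessage


def is_up(url, timeout=3):
    """Return True if the URL answers at all within the timeout.

    Note: a 4xx/5xx response raises HTTPError and counts as 'down' here;
    loosen that if an error page should still count as 'up'.
    """
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except (urllib.error.URLError, OSError):
        return False


def alert(url, sender, recipient, smtp_host="localhost"):
    """Send a plain-text 'site down' mail via a local SMTP server."""
    msg = EmailMessage()
    msg["Subject"] = "Site Down: " + url
    msg["From"] = sender
    msg["To"] = recipient
    msg.set_content("No response from " + url)
    with smtplib.SMTP(smtp_host) as smtp:
        smtp.send_message(msg)
```

Run from cron, the main body would just be `if not is_up(url): alert(url, sender, recipient)`, with your internal URL and addresses filled in.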
Solution 1:
You can use wget in a script like this:

#!/bin/sh
# Spider mode checks the URL without downloading the page.
wget --timeout=3 --tries=1 --spider --no-check-certificate http://serverfault.com
if [ $? -ne 0 ]; then
    echo "Site Down" | mail -s "Site Down" [email protected]
fi
And you will get an email whenever wget cannot reach the site within three seconds on its single attempt.
Set up a cron job to run the script every few minutes.
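For example, assuming the script is saved as /usr/local/bin/check_site.sh (a hypothetical path) and made executable, a crontab entry along these lines runs it every five minutes:

```
*/5 * * * * /usr/local/bin/check_site.sh
```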
There are many other alternatives but this is probably the simplest to set up from scratch.
Solution 2:
You have many options, I'll give you two.
Nagios is a full-blown monitoring application capable of monitoring much more than HTTP, but it handles that as well. It can also generate all kinds of reports ("Tell me the uptime percentage of our server/service X this week/month/year...").
Monit is another popular choice. It is maybe not as feature-rich as Nagios, but it is nevertheless nice.
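To give a sense of how small a Monit check is, a configuration stanza along these lines (the hostname is a placeholder) polls an HTTP port and alerts on failure:

```
check host myserver with address intranet.example.com
    if failed port 80 protocol http then alert
```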