script to automatically test if a web site is available

Well... The simplest script I can write:

/usr/bin/wget "www.example.com" --timeout 30 -O - 2>/dev/null | grep -q "Normal operation string" || echo "The site is down" | /usr/bin/mail -s "Site is down" [email protected]

Add it to cron as:

* * * * * /usr/bin/wget "www.example.com" --timeout 30 -O - 2>/dev/null | grep -q "Normal operation string" || echo "The site is down" | /usr/bin/mail -s "Site is down" [email protected]

But it is too simple to tell you what the problem is, if there is one.

UPD: Now this one-liner checks for a specific string on the page ("Normal operation string"), which should appear only during normal operation.

UPD2: A simple way to send the error page in the e-mail:

/usr/bin/wget "www.example.com" --timeout 30 -O - 2>/dev/null | grep -q "Normal operation string" || /usr/bin/wget "www.example.com" --timeout 30 -O - 2>/dev/null | /usr/bin/mail -s "Site is down" [email protected]

Its drawback is that the page is requested a second time when the first test fails. That second request may succeed, so you won't see the error page. Of course, it is possible to store the output and send it as an attachment, but that makes the script more complex.
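One way around the re-request, sketched below under the same assumptions (the URL, the marker string, and the recipient address are placeholders), is to fetch the page once into a temporary file and reuse that saved copy for both the test and the mail body, so the response you receive is the one that actually failed:

```shell
#!/bin/sh
# page_ok tests a saved copy of the page for the marker string,
# so the same single response serves as both the test input and the report.
page_ok() {
    # $1 = file holding the fetched page
    # $2 = string expected during normal operation
    grep -q "$2" "$1"
}

# Intended use (one request, reused for both the test and the report;
# URL, marker string, and address are placeholders):
#   TMP=$(mktemp)
#   /usr/bin/wget "www.example.com" --timeout 30 -O "$TMP" 2>/dev/null
#   page_ok "$TMP" "Normal operation string" \
#       || /usr/bin/mail -s "Site is down" you@example.com < "$TMP"
#   rm -f "$TMP"
```
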


Take a look at this script:

  • http://answers.google.com/answers/threadview/id/276934.html

curl is a command-line utility to fetch a URL. The script checks the exit code ($? holds the exit code of the most recent command in a shell script); if it is anything other than 0, it reports an error (an exit code of 0 generally means success). As mentioned in HUB's answer, you can also just use || on the command line to run a second command when the first one fails.
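A minimal sketch of that exit-code pattern (report_down is a hypothetical helper, and the URL and address in the usage comment are placeholders):

```shell
#!/bin/sh
# report_down prints a failure message for a non-zero curl exit code,
# and prints nothing when the code is 0 (success).
report_down() {
    # $1 = exit code from the fetch command
    if [ "$1" -ne 0 ]; then
        echo "Site check failed: curl exited with code $1"
    fi
}

# Intended use (placeholder URL and address):
#   curl --silent --max-time 30 --output /dev/null "www.example.com"
#   msg=$(report_down $?)
#   [ -n "$msg" ] && echo "$msg" | mail -s "Site is down" you@example.com
```
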

Once you have the status figured out, you just have to send yourself some mail. Here is an example that uses the mail command to send mail from a shell script, assuming the box you're testing from has SMTP set up:

  • http://www.crazysquirrel.com/computing/debian/scripts/email-via-script.jspx

BTW: if you're not good at shell scripting, don't limit yourself to a shell script. You could use a Ruby script, a PHP script, or any kind of script your server can run. Just add a #!/path/to/executable line at the beginning of the script, for instance:

#!/usr/bin/php


Check out this script. It checks a list of websites and sends email (to a list of addresses) whenever something is wrong (an HTTP response other than 200). The script creates a .temp file to "remember" which website(s) failed at the last check, so you won't get multiple emails; the .temp file is deleted when the website is working again.

#!/bin/bash
# List of websites, one per line. Leave an empty line at the end.
LISTFILE=/scripts/isOnline/websites.lst
# Addresses to mail on failure, one per line. Leave an empty line at the end.
EMAILLISTFILE=/scripts/isOnline/emails.lst

# QUIET is true when run from cron; output is shown when run manually from a shell.
# Set THIS_IS_CRON=1 at the beginning of your crontab (crontab -e),
# otherwise cron will mail you the output on every run.
if [ -n "$THIS_IS_CRON" ]; then QUIET=true; else QUIET=false; fi

check_site() {
  # $1 = website to check (renamed from `test`, which shadows the shell builtin)
  response=$(curl --write-out "%{http_code}" --silent --output /dev/null "$1")
  filename=$(echo "$1" | cut -f1 -d"/")
  if [ "$QUIET" = false ]; then echo -n "$1 "; fi

  if [ "$response" -eq 200 ]; then
    # website working
    if [ "$QUIET" = false ]; then
      echo -n "$response "; echo -e "\e[32m[ok]\e[0m"
    fi
    # remove the .temp file if it exists
    if [ -f "cache/$filename" ]; then rm -f "cache/$filename"; fi
  else
    # website down
    if [ "$QUIET" = false ]; then echo -n "$response "; echo -e "\e[31m[DOWN]\e[0m"; fi
    if [ ! -f "cache/$filename" ]; then
        while read -r e; do
            # using the mailx command
            echo "$1 WEBSITE DOWN" | mailx -s "$1 WEBSITE DOWN" "$e"
            # using the mail command instead:
            #echo "$1 WEBSITE DOWN" | mail -s "$1 WEBSITE DOWN" "$e"
        done < "$EMAILLISTFILE"
        echo > "cache/$filename"
    fi
  fi
}

# main loop
mkdir -p cache
while read -r p; do
  check_site "$p"
done < "$LISTFILE"

Add the following lines to your crontab ($ crontab -e):

THIS_IS_CRON=1
*/30 * * * * /path/to/isOnline/checker.sh

Available on GitHub


I know the scripts above are exactly what you asked for, but I would suggest looking at monit, because it will not only email you if Apache is down, it will also restart it.
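For illustration, a monit entry along these lines can do both. This is a sketch, not a drop-in config: the pidfile path, init script paths, hostname, and address are assumptions for a Debian-style Apache install, so check them against your system.

```
# Hypothetical monit entry -- paths, host, and address are placeholders
check process apache with pidfile /var/run/apache2.pid
    start program = "/etc/init.d/apache2 start"
    stop program  = "/etc/init.d/apache2 stop"
    if failed host www.example.com port 80 protocol http then restart
    alert you@example.com
```
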


I would recommend Pingdom for this. Their free service allows you to check one site, which is all you need to check one server. If you have an iPhone they send you push notifications for free, so there's no need to buy SMS credits from them, and they have several settings you can tune. Mine is set to notify me after 2 retries (10 min) and every 10 min of downtime after that. It's awesome, since it also checks for HTTP 500 responses indicating the site is down. If a check fails, it immediately checks your site again from a different server in a different location; if that one fails too, it triggers your preference for how and when you'd like to be notified.