Nagios Monitoring Text on a Website

Solution 1:

Try leaving out the -u. -u gives the path (page) to retrieve; it defaults to "/" and takes a path, not the entire URL. Here is my output when I leave it out:

$ ./check_http -H www.google.com -s "Privacy"
HTTP OK HTTP/1.0 200 OK - 0.041 second response time |time=0.040579s;;;0.000000 size=5257B;;;0
$ ./check_http -H www.google.com -s "Privacyblahdibla"
HTTP CRITICAL - string not found|time=0.048169s;;;0.000000 size=5257B;;;0

If you want to check a specific page, use -u like this:

$ ./check_http -H www.google.com -u "/ig" -s "Privacy"
HTTP OK HTTP/1.0 200 OK - 0.166 second response time |time=0.165896s;;;0.000000 size=87843B;;;0
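If you are wiring this into Nagios itself, command and service definitions along these lines should work; the command name, the generic-service template, and the webserver1 host are just placeholders for illustration:

 # Hypothetical command/service definitions -- adjust names and paths to your setup.
 define command{
         command_name    check_http_string
         command_line    $USER1$/check_http -H $HOSTADDRESS$ -u $ARG1$ -s $ARG2$
         }

 define service{
         use                     generic-service
         host_name               webserver1
         service_description     Homepage string check
         check_command           check_http_string!/!Privacy
         }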

Solution 2:

Another option is to use check_curl from monitoringexchange.org. In reality this is a touch too complicated for what you are trying to do, but I have found it extends the functionality when you need it for parsing data from websites and submitting input.

The contents of my customized, non-variable check_curl are below:

 #!/bin/bash
 # Fetch a page with curl and check it for a keyword; exit 0 (OK) or 2 (CRITICAL) for Nagios.
 PROG=/usr/local/bin/curl
 FILE=/tmp/check_curl
 HALT=Privacy

 $PROG -k -s http://www.google.com > $FILE

 # Grab any error text from the page for use in the failure message.
 STATUS=`grep Error $FILE`

 grep -w $HALT $FILE > /dev/null
 if [ $? -eq 0 ]
    then
      WORKING=`grep $HALT $FILE | awk '{ print $5 }'`
      echo "Works, returns data of $WORKING"
      rm $FILE
      exit 0
    else
      echo "Doesn't return $HALT. $STATUS"
      rm $FILE
      exit 2
 fi
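
Nagios only cares about the exit code (0 = OK, 2 = CRITICAL), which is why the script exits explicitly in both branches. Once it behaves the way you want from the shell, you can register it like any other plugin; the command name and path below are just assumptions about where you keep local checks:

 define command{
         command_name    check_curl_custom
         command_line    /usr/local/nagios/libexec/check_curl
         }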