Bash script to check if a public HTTPS site is up [closed]
Solution 1:
Here is a way to do it using wget instead of curl. Keep in mind that macOS doesn't ship with wget by default.
A successful web request returns a status code of 200; failures return codes such as 404 or 500, and 3xx codes indicate redirects (see the list of HTTP status codes).
This line prints 1 if the web request was successful, otherwise it prints 0 (--server-response writes the response headers to stderr, so after the 2>&1 redirect the grep runs against the status line rather than the page body):
wget --server-response --spider -q https://google.com 2>&1 | grep -c 'HTTP/.* 200'
1
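If it helps, that 1/0 output can drive a script directly. A minimal sketch, assuming the same wget flags as above (the URL is an example; substitute the site you actually want to check):

```shell
# Capture the pipeline's count: 1 if a 200 status line was seen, 0 otherwise.
# --spider avoids downloading the body; --server-response prints headers to stderr.
count=$(wget --server-response --spider -q https://google.com 2>&1 | grep -c 'HTTP/.* 200')

if [ "${count:-0}" -ge 1 ]; then
    echo "site is UP"
else
    echo "site is DOWN"
fi
```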
Solution 2:
One of many:
if curl -s --head https://mysite.com | grep -q "^HTTP/.* 200"; then
    echo "mysite.com is UP"
else
    echo "mysite.com is DOWN"
fi
(The pattern matches both "HTTP/1.1 200 OK" and the HTTP/2 status line "HTTP/2 200", which omits the "OK" reason phrase, so don't grep for "200 OK".)
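Another variant, if it's useful: a sketch that reads the status code directly via curl's --write-out, avoiding header parsing altogether (the URL and the 10-second timeout are examples):

```shell
# Print UP/DOWN for a URL based on the numeric HTTP status code.
# -o /dev/null discards the body; -w '%{http_code}' prints just the status.
check_site() {
    status=$(curl -s -o /dev/null -w '%{http_code}' --max-time 10 "$1")
    if [ "$status" = "200" ]; then
        echo "$1 is UP"
    else
        echo "$1 is DOWN (status: $status)"
    fi
}

check_site https://example.com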
Solution 3:
Nagios' check_http plugin can do this and much more, including checking for specific text in the response. You can run it from a shell script independently of Nagios itself:
$ check_http --ssl -H www.google.com -r 'Feeling Lucky'
HTTP OK: HTTP/1.1 200 OK - 11900 bytes in 0.086 second response time |time=0.085943s;;;0.000000 size=11900B;;;0
$ echo $?
0
Solution 4:
Why not use a full monitoring solution? I've found monit to be pretty good for this: http://mmonit.com/monit/
(This comes after years of using home-brewed bash scripts; I've found monit more portable across different boxes and more robust than some messy scripts. I can't speak to their paid version.)
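For reference, a minimal monit control-file fragment for an HTTPS check might look like the following sketch (the host name and 60-second poll interval are placeholders; check your monit version's manual for the exact syntax):

```
# Hypothetical /etc/monit/monitrc fragment -- names and interval are examples
set daemon 60
check host mysite with address mysite.com
    if failed port 443 protocol https then alert
```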