Piping wget output to /dev/null in cron
Solution 1:
You could do it like this:
*/5 * * * * wget -O /dev/null -o /dev/null example.com
Here -O sends the downloaded file to /dev/null, and -o logs to /dev/null instead of stderr. That way no redirection is needed at all.
Solution 2:
Do you need to actually download the contents, or just receive the 200 OK? If you only need the server to process the request, why not simply use the --spider argument?
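For example, the question's cron job could be changed to probe the URL without downloading it (a sketch reusing the five-minute schedule; note that --spider by itself still logs to stderr, so it is combined here with the -o /dev/null from Solution 1):

```shell
*/5 * * * * wget --spider -o /dev/null example.com
```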
Solution 3:
I would use the following:
*/5 * * * * wget -O - mysite.com > /dev/null 2>&1
The -O - option makes sure that the fetched content is sent to stdout, which the redirection then discards along with stderr.
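The redirection at the end can be demonstrated without touching the network (a sketch in which the echo commands stand in for wget's document and log output):

```shell
# '> /dev/null' discards stdout; '2>&1' then points stderr at the same place,
# so both the fetched content (stdout) and the log (stderr) vanish.
out=$(sh -c 'echo content; echo log >&2' > /dev/null 2>&1)
[ -z "$out" ] && echo "both streams discarded"   # prints "both streams discarded"
```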
Solution 4:
You say in a comment that you only need the "200 OK" response.
That allows for a solution with some additional advantages over wget -O /dev/null -o /dev/null example.com. The idea is not to discard the output in some way, but to not create any output at all.
Since you only need the response, the data that gets downloaded into the local file index.html does not need to be downloaded in the first place.
In the HTTP protocol, the command 'GET' is used to download a document. To access a document in a way that does everything except actually downloading it, there is a special command 'HEAD'.
When using 'GET' for this task, the document is downloaded and discarded locally. Using 'HEAD' does just what you need: it does not transfer the document in the first place. By definition, it returns the same result code as 'GET' would.
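The difference can be sketched at the wire level; these printf commands only print what the two request types look like (example.com is a placeholder host, and wget sends the HEAD variant for you):

```shell
# A GET request asks the server for the status line, headers, and body:
printf 'GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n'
# A HEAD request asks for the same status line and headers, but no body:
printf 'HEAD / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n'
```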
The syntax for using the HEAD method with wget is a little odd: we need to use the option --spider. In this context, it does just what we want: it accesses the URL with 'HEAD' instead of 'GET'.
We can use the option -q (quiet) to stop wget from outputting details about what it does. Combining the two, wget will neither output anything to stderr nor save a document.
wget -q --spider 'http://example.com/'
The exit code tells us whether the request was successful or not:
$ wget -q --spider 'http://example.com/'
$ echo $?
0
$ wget -q --spider 'http://example.com/nonexisting'
$ echo $?
8
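If you want to act on that exit code in a script of your own, a minimal sketch (check_url is an illustrative helper name, not part of wget):

```shell
# wget -q --spider is silent either way, so the exit code is the only signal.
check_url() {
    if wget -q --spider "$1"; then
        echo "OK: $1"
    else
        echo "FAILED: $1"
    fi
}
# Usage: check_url 'http://example.com/'
```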
For a command in crontab, the fact that there is no output in either case means you can once again treat any output as an indication of errors: cron mails whatever a job prints, and since this command is normally silent, any mail you do receive points at a problem.
Your example command would be changed to this:
*/5 * * * * wget -q --spider mysite.com
This has the same advantages as wget -O /dev/null -o /dev/null example.com. The additional advantage is that the log output and the document output are never generated, rather than generated and then discarded locally. Of course, the big difference is that the document, index.html, is not downloaded and then thrown away at all.
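If you do want to be notified of failures, the silent command combines naturally with the exit code: cron mails the message only when wget fails (a sketch; the message text is arbitrary):

```shell
*/5 * * * * wget -q --spider mysite.com || echo "mysite.com check failed"
```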
Solution 5:
You mention that you run this to keep Phusion Passenger alive. Maybe your question should really be about that; its webpage says:
A fast and robust web server and application server for
This shouldn't require any keepalive scripts.
Otherwise kasperd's solution is perfect.