tail -f equivalent for a URL
There may be a specific tool for this, but you can also do it using wget. Open a terminal and run this command:
while :; do
    sleep 2
    wget -c -O log.txt -o /dev/null http://yoursite.com/log
done
This will download the logfile every two seconds and save it into log.txt, appending the new output to what is already there (-c means continue downloading, so wget requests only the bytes that are not in the local file yet). The -o option sends wget's log messages to /dev/null instead of the terminal.
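Under the hood, -c makes wget send an HTTP Range request starting at the current size of log.txt. As a rough sketch of the same idea with curl (assuming GNU stat and that log.txt already exists):

# Fetch only the bytes past the current local size (GNU stat assumed;
# on BSD/macOS you would use stat -f %z instead):
curl -s -r "$(stat -c %s log.txt)"- http://yoursite.com/log >> log.txt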
So, now you have a local copy of log.txt and can run tail -f on it:
tail -f log.txt
I answered the same question over here with a complete shell script that takes the URL as its argument and tail -f's it. Here's a copy of that answer:
This will do it:
#!/bin/bash
# Mirror the remote file into a temp file, then tail -f the local copy.
file=$(mktemp)
trap 'rm "$file"' EXIT
(while true; do
    # Ask the server only for the bytes past the end of our local copy.
    # shellcheck disable=SC2094
    curl --fail -r "$(stat -c %s "$file")"- "$1" >> "$file"
done) &
pid=$!
trap 'kill $pid; rm "$file"' EXIT
tail -f "$file"
It's not very friendly to the web server. You could replace the true with sleep 1 to be less resource intensive.
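With that change, the background loop would look like this (sleep exits with success, so it can double as the loop condition):

(while sleep 1; do
    curl --fail -r "$(stat -c %s "$file")"- "$1" >> "$file"
done) &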
Like tail -f, you need to ^C when you are done watching the output, even when the output is done.
curl with the range option, in combination with watch, can be used to achieve this:
RANGES
HTTP 1.1 introduced byte-ranges. Using this, a client can request to get only one or more subparts of a specified document. Curl supports this with the -r flag.
watch -n <interval> 'curl -s -r -<bytes> <url>'
For example:
watch -n 30 'curl -s -r -2000 http://yoursite.com/log'
This will retrieve the last 2000 bytes of the log every 30 seconds.
Note: for self-signed HTTPS certificates, use curl's --insecure option.
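For example, using -k (the short form of --insecure) with the same placeholder URL as above:

watch -n 30 'curl -sk -r -2000 https://yoursite.com/log'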