tail -f equivalent for a URL

There may be a specific tool for this, but you can also do it using wget. Open a terminal and run this command:

while :; do
    sleep 2
    wget -c -O log.txt -o /dev/null http://yoursite.com/log
done

This will download the logfile every two seconds and append the new data to log.txt: -c means continue a partial download, so wget requests only the bytes past the end of the local file and appends them, and -O names the local file. The -o redirects wget's status messages to /dev/null so they don't clutter the terminal.
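If you prefer curl, a minimal sketch of the same loop (assuming the server supports HTTP range requests, which is also what wget -c relies on):

while :; do
    sleep 2
    # -C - resumes from the current size of log.txt and appends the new bytes;
    # -s hides the progress meter, --fail keeps HTTP error pages out of the file
    curl -s --fail -C - -o log.txt http://yoursite.com/log
done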

So, now you have a local copy of log.txt and can run tail -f on it from another terminal:

tail -f log.txt 
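If you would rather not juggle two terminals, a minimal sketch along the same lines (same URL and file name as above) backgrounds the polling loop and stops it on exit; this is essentially what the script further down does with curl:

#!/bin/bash
# One-terminal variant: poll in the background, tail in the foreground;
# the EXIT trap stops the poller when the script is interrupted with ^C
while :; do sleep 2; wget -c -O log.txt -o /dev/null http://yoursite.com/log; done &
poller=$!
trap 'kill "$poller" 2>/dev/null' EXIT
tail -f log.txt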

I answered the same question over here with a complete shell script that takes the URL as its argument and tail -f's it. Here's a copy of that answer verbatim:


This will do it:

#!/bin/bash

# Accumulate the remote file in a local temporary file
# and remove it when the script exits
file=$(mktemp)
trap 'rm -f "$file"' EXIT

# In the background, repeatedly fetch only the bytes past the current end
# of the local copy: -r sends an HTTP range request starting at the file's
# size (stat -c %s, GNU stat), and --fail keeps error pages out of the file
(while true; do
    # shellcheck disable=SC2094
    curl --fail -r "$(stat -c %s "$file")"- "$1" >> "$file"
done) &
pid=$!
trap 'kill $pid; rm -f "$file"' EXIT

tail -f "$file"

It's not very friendly to the web server, since the loop re-requests the URL as fast as it can. You could replace the true with sleep 1 to be less resource intensive, as in the sketch below.
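That change only affects the background loop; a minimal sketch of it, with the rest of the script unchanged:

# Poll once per second instead of hammering the server;
# the loop ends if sleep is killed
(while sleep 1; do
    curl --fail -r "$(stat -c %s "$file")"- "$1" >> "$file"
done) &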

Like tail -f, you need to ^C when you are done watching the output, even when the remote file has stopped growing.
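To try it, save the script under a name of your choice, say urltail.sh (the name is just for illustration), make it executable and pass the URL as the only argument:

chmod +x urltail.sh
./urltail.sh http://yoursite.com/log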