Force flushing of output to a file while bash script is still running
I have a small script, which is called daily by crontab using the following command:
/homedir/MyScript &> some_log.log
The problem with this method is that some_log.log is only created after MyScript finishes. I would like to flush the output of the program into the file while it's running so I could do things like
tail -f some_log.log
and keep track of the progress, etc.
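For context, the crontab entry presumably looks something like this (the schedule is an assumption, and the redirection is written as "> file 2>&1" because cron typically runs the command with /bin/sh, which doesn't understand &>):

# hypothetical crontab line: run MyScript daily at 03:00, capturing stdout and stderr
0 3 * * * /homedir/MyScript > /homedir/some_log.log 2>&1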
Solution 1:
I found a solution to this here. Using the OP's example, you basically run
stdbuf -oL /homedir/MyScript &> some_log.log
and then the buffer gets flushed after each line of output. I often combine this with nohup
to run long jobs on a remote machine.
stdbuf -oL nohup /homedir/MyScript &> some_log.log
This way your process doesn't get killed when you log out.
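Putting it together with the paths from the question, a typical session might look like this (the trailing & backgrounds the job so you can tail the log from the same shell):

stdbuf -oL nohup /homedir/MyScript &> some_log.log &
tail -f some_log.log

Note that stdbuf only affects programs that use C stdio for their output; tools that manage their own buffering (or are statically linked) will ignore it.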
Solution 2:
script -c <PROGRAM> -f OUTPUT.txt
The key is -f. Quoting from man script:
-f, --flush
Flush output after each write. This is nice for telecooperation: one person
does 'mkfifo foo; script -f foo', and another can supervise real-time what is
being done using 'cat foo'.
Run in background:
nohup script -c <PROGRAM> -f OUTPUT.txt &
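With the script from the question, that might look like this (quoting the -c argument keeps the program and its own options together; the trailing & backgrounds the job):

nohup script -f -c '/homedir/MyScript' some_log.log &
tail -f some_log.log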
Solution 3:
bash itself will never actually write any output to your log file. Instead, the commands it invokes as part of the script will each individually write output and flush whenever they feel like it. So your question is really how to force the commands within the bash script to flush, and that depends on what they are.
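For instance, if MyScript runs a few external programs, each one has to be told to flush in its own way; a rough sketch (the command names and files here are placeholders):

#!/bin/bash
stdbuf -oL some_filter < input.txt     # line-buffer a tool that uses C stdio
python3 -u generate_report.py          # Python's own flag for unbuffered output
awk '{ print; fflush() }' data.log     # awk/gawk can flush explicitly after each line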
Solution 4:
You can use tee to write to the file without the need for flushing.
/homedir/MyScript 2>&1 | tee some_log.log > /dev/null
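Then, in another terminal, tail -f some_log.log works as usual. If the commands inside MyScript switch to block buffering because their stdout is now a pipe, combining this with stdbuf from Solution 1 may help (a sketch):

stdbuf -oL /homedir/MyScript 2>&1 | tee some_log.log > /dev/null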
Solution 5:
This isn't a function of bash, as all the shell does is open the file in question and then pass the file descriptor as the standard output of the script. What you need to do is make sure output is flushed from your script more frequently than it currently is.
In Perl for example, this could be accomplished by setting:
$| = 1;
See perlvar for more information on this.
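A quick way to see the effect is a throwaway one-liner you can tail while it runs (the loop is purely illustrative):

perl -e '$| = 1; for (1..10) { print "step $_\n"; sleep 1 }' &> some_log.log &
tail -f some_log.log

With $| set, each line shows up in the log roughly once per second; without it, everything appears only when the loop exits.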