Is there a way to redirect output to a file without buffering on unix/linux?

I have a long-running batch process that outputs some debug and process information to stdout. If I just run it from a terminal I can keep track of 'where it is', but then there is too much data and it scrolls off the screen.

If I redirect the output to a file ('> out.txt') I get the whole output eventually, but it is buffered, so I can no longer see what it is doing right now.

Is there a way to redirect the output but make it not buffer its writes?


You can explicitly set the buffering options of the standard streams with a setvbuf call in C, but if you're trying to modify the behaviour of an existing program, try stdbuf (part of coreutils since version 7.5).
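
If you can change the program itself, a minimal C sketch of the setvbuf approach might look like this (the loop and the progress messages are just placeholders for your own work):

#include <stdio.h>

int main(void)
{
    /* Switch stdout to unbuffered; use _IOLBF instead of _IONBF for line buffering.
       This must be called before anything is written to stdout. */
    setvbuf(stdout, NULL, _IONBF, 0);

    for (int i = 0; i < 10; i++) {
        printf("progress: step %d\n", i);  /* shows up immediately, even when redirected */
        /* ... long-running work here ... */
    }
    return 0;
}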

To make stdout line-buffered with stdbuf:

stdbuf -oL command > output

This disables stdout buffering altogether:

stdbuf -o0 command > output
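
Either way you can then watch the output live while it is being written, for example (assuming your program is called batch_process):

stdbuf -oL batch_process > out.txt &
tail -f out.txt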

You can achieve line-buffered output to a file by using the script command, like so:

stty -echo -onlcr   # avoid added \r in output
script -q /dev/null batch_process | tee output.log        # Mac OS X, FreeBSD
script -q -c "batch_process" /dev/null | tee output.log   # Linux
stty echo onlcr

On Ubuntu, the unbuffer program (from the expect-dev package) did the trick for me. Just run:

unbuffer your_command

and its output won't be buffered. unbuffer runs the command under a pseudo-terminal, so the program behaves as if it were writing to an interactive terminal and keeps line-buffering its output instead of block-buffering it.