Force line-buffering of stdout in a pipeline
When it is attached to a terminal, stdout is usually line-buffered. In other words, as long as your printf argument ends with a newline, you can expect the line to be printed instantly. This does not appear to hold when a pipe is used to redirect the output to tee.
I have a C++ program, a, that outputs strings, always \n-terminated, to stdout.
When it is run by itself (./a), everything prints correctly and at the right time, as expected. However, if I pipe it to tee (./a | tee output.txt), it doesn't print anything until it quits, which defeats the purpose of using tee.
I know that I could fix it by adding a fflush(stdout) after each printing operation in the C++ program. But is there a cleaner, easier way? Is there a command I can run, for example, that would force stdout to be line-buffered, even when using a pipe?
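For reference, a minimal sketch of such a program (the one-second sleep is only there to make the buffering visible; the name a matches the question). Build it and compare running ./a against ./a | tee output.txt:

#include <cstdio>
#include <chrono>
#include <thread>

int main() {
    for (int i = 0; i < 10; ++i) {
        // Newline-terminated, yet only flushed per line when stdout is a terminal;
        // through a pipe it sits in the stdio buffer until the program exits.
        printf("line %d\n", i);
        std::this_thread::sleep_for(std::chrono::seconds(1));
    }
    return 0;
}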
Solution 1:
You can try stdbuf:
$ stdbuf --output=L ./a | tee output.txt
The relevant (big) part of the man page:
-i, --input=MODE adjust standard input stream buffering
-o, --output=MODE adjust standard output stream buffering
-e, --error=MODE adjust standard error stream buffering
If MODE is 'L' the corresponding stream will be line buffered.
This option is invalid with standard input.
If MODE is '0' the corresponding stream will be unbuffered.
Otherwise MODE is a number which may be followed by one of the following:
KB 1000, K 1024, MB 1000*1000, M 1024*1024, and so on for G, T, P, E, Z, Y.
In this case the corresponding stream will be fully buffered with the buffer
size set to MODE bytes.
Keep this in mind, though:
NOTE: If COMMAND adjusts the buffering of its standard streams ('tee' does
for example) then that will override corresponding settings changed by 'stdbuf'.
Also some filters (like 'dd' and 'cat' etc.) don't use streams for I/O,
and are thus unaffected by 'stdbuf' settings.
You are not running stdbuf on tee, you're running it on a, so this shouldn't affect you, unless you set the buffering of a's streams in a's source (see the sketch below).
Also, stdbuf is not POSIX, but part of GNU coreutils.
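For context: stdbuf works by preloading a helper library that adjusts the stdio buffering mode before main runs, which is why a program that sets its own buffering wins. A minimal sketch of what such an override in a's source could look like (the 1 MiB buffer size is just an illustrative choice):

#include <cstdio>

int main() {
    // Explicitly request full buffering with a large private buffer.
    // This overrides whatever mode stdbuf configured at program startup.
    static char buf[1 << 20];
    setvbuf(stdout, buf, _IOFBF, sizeof buf);

    for (int i = 0; i < 10; ++i)
        printf("line %d\n", i);  // stays in the buffer until it fills or the stream is flushed/closed

    return 0;
}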
Solution 2:
Try unbuffer, which is part of the expect package. You may already have it on your system.
In your case you would use it like this:
./a | unbuffer -p tee output.txt
(-p is for pipeline mode where unbuffer reads from stdin and passes it to the command in the rest of the arguments)
Solution 3:
You may also try to execute your command in a pseudo-terminal using the script command (which should enforce line-buffered output to the pipe)!
script -q /dev/null ./a | tee output.txt # Mac OS X, FreeBSD
script -c "./a" /dev/null | tee output.txt # Linux
Be aware that the script command does not propagate back the exit status of the wrapped command.
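Both unbuffer and script work for the same underlying reason: they attach the command's stdout to a pseudo-terminal, and the C library defaults to line buffering when stdout is a terminal but to full buffering when it is a pipe or a regular file. A minimal sketch showing how a program can observe the difference, assuming a POSIX system:

#include <stdio.h>
#include <unistd.h>  // isatty, fileno

int main() {
    // stderr is unbuffered, so this diagnostic always shows up immediately.
    if (isatty(fileno(stdout)))
        fprintf(stderr, "stdout is a terminal: stdio will line-buffer it\n");
    else
        fprintf(stderr, "stdout is a pipe or file: stdio will fully buffer it\n");
    return 0;
}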
Solution 4:
You can use setlinebuf from stdio.h.
setlinebuf(stdout);
This should change the buffering to "line buffered".
If you need more flexibility, you can use setvbuf.
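A minimal sketch combining the two (note that, per the C standard, setvbuf has to be called before any other operation on the stream; setlinebuf is a BSD/glibc extension rather than standard C):

#include <stdio.h>

int main() {
    // Either call, made before any output is written:
    setlinebuf(stdout);                        // BSD/glibc convenience wrapper
    // setvbuf(stdout, NULL, _IOLBF, BUFSIZ);  // standard-C equivalent

    for (int i = 0; i < 5; ++i)
        printf("line %d\n", i);                // flushed at each newline, even into a pipe
    return 0;
}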
Solution 5:
The unbuffer command from the expect package in the answer by @Paused until further notice (Solution 2) did not work for me the way it was presented.
Instead of using:
./a | unbuffer -p tee output.txt
I had to use:
unbuffer -p ./a | tee output.txt
(-p is for pipeline mode where unbuffer reads from stdin and passes it to the command in the rest of the arguments)
The expect package can be installed on:
- MSYS2 with pacman -S expect
- Mac OS with brew install expect
Update
I recently had buffering problems with python inside a shell script (when trying to append a timestamp to its output). The fix was to pass the -u flag to python, this way:
- In run.sh, run the script with python -u script.py
unbuffer -p /bin/bash run.sh 2>&1 | tee /dev/tty | ts '[%Y-%m-%d %H:%M:%S]' >> somefile.txt
- This command will put a timestamp on the output and send it to a file and stdout at the same time.
- The ts program (timestamp) can be installed with the moreutils package.
Update 2
Recently, I also had problems with grep buffering its output; I used the --line-buffered argument on grep to stop it from buffering the output.