catching stdout in realtime from subprocess
Solution 1:
Some rules of thumb for subprocess:

- Never use shell=True. It needlessly invokes an extra shell process to call your program.
- When calling processes, arguments are passed around as lists. sys.argv in Python is a list, and so is argv in C. So you pass a list to Popen to call subprocesses, not a string.
- Don't redirect stderr to a PIPE when you're not reading it.
- Don't redirect stdin when you're not writing to it.
Example:
import subprocess, sys

cmd = ["rsync.exe", "-vaz", "-P", "source/", "dest/"]
p = subprocess.Popen(cmd,
                     stdout=subprocess.PIPE,
                     stderr=subprocess.STDOUT)  # merge stderr into stdout

# readline() returns bytes; the loop stops when the pipe closes (empty bytes)
for line in iter(p.stdout.readline, b''):
    print(">>> " + line.rstrip().decode())
That said, it is probable that rsync buffers its output when it detects that it is connected to a pipe instead of a terminal. This is the default behavior: when connected to a pipe, a program must explicitly flush stdout to produce realtime results, otherwise the standard C library buffers the output.
To test for that, try running this instead:
cmd = [sys.executable, 'test_out.py']
and create a test_out.py file with the contents:
import sys
import time

print("Hello")
sys.stdout.flush()
time.sleep(10)
print("World")
Executing that subprocess should give you "Hello" and then, 10 seconds later, "World". If that happens with the Python code above but not with rsync, it means rsync itself is buffering its output, so you are out of luck.
A solution would be to connect directly to a pty, using something like pexpect.
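As a rough illustration of that pty route (not part of the original answer: it assumes a POSIX system with the pexpect package installed, and simply reuses the rsync command line from the example above):

import pexpect

# spawn() runs the command on a pseudo-terminal, so rsync believes it is
# talking to a terminal and keeps emitting its progress output line by line
child = pexpect.spawn('rsync -vaz -P source/ dest/',
                      encoding='utf-8', timeout=None)

# readline() returns '' once the child closes the terminal
for line in iter(child.readline, ''):
    print(">>> " + line.rstrip())

child.close()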
Solution 2:
I know this is an old topic, but there is a solution now: call rsync with the --outbuf=L option. Example:
import subprocess

cmd = ['rsync', '-arzv', '--backup', '--outbuf=L', 'source/', 'dest']
p = subprocess.Popen(cmd, stdout=subprocess.PIPE)

# --outbuf=L makes rsync line-buffer its output, so lines arrive as they are produced
for line in iter(p.stdout.readline, b''):
    print('>>> {}'.format(line.rstrip().decode()))
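On Python 3 the same loop can also be written a bit more directly (a sketch, not part of the original answer): text=True makes the pipe yield str instead of bytes, and bufsize=1 requests line buffering on our side, so iterating over p.stdout gives one line at a time.

import subprocess

cmd = ['rsync', '-arzv', '--backup', '--outbuf=L', 'source/', 'dest']

# text=True decodes the output to str; bufsize=1 selects line buffering,
# which is only honoured in text mode
with subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True, bufsize=1) as p:
    for line in p.stdout:
        print('>>> {}'.format(line.rstrip()))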