Saving stdout from subprocess.Popen to file, plus writing more stuff to the file


I'm writing a Python script that uses subprocess.Popen to execute two programs (from compiled C code), each of which produces stdout. The script gets that output and saves it to a file. Because the output is sometimes large enough to fill the pipe buffer when using subprocess.PIPE, causing the script to hang, I send the stdout directly to the log file. I want to have my script write something to the beginning and end of the file, and between the two subprocess.Popen calls. However, when I look at my log file, everything I wrote from the script sits together at the top of the file, followed by all of the executables' stdout. How can I interleave my added text with the executables' output?

import subprocess

def run(cmd, logfile):
    p = subprocess.Popen(cmd, shell=True, universal_newlines=True, stdout=logfile)
    return p

def runTest(path, flags, name):
    log = open(name, "w")
    print >> log, "Calling executable A"
    a_ret = run(path + "executable_a_name" + flags, log)
    print >> log, "Calling executable B"
    b_ret = run(path + "executable_b_name" + flags, log)
    print >> log, "More stuff"

The log file ends up with:

    Calling executable A
    Calling executable B
    More stuff
    [... stdout from both executables ...]

Is there a way I can flush A's stdout to the log after calling Popen, for example? One more thing that might be relevant: executable A starts and then blocks waiting on B, and after B prints its output and finishes, A prints more output and finishes.

I'm using Python 2.4 on RHE Linux.

11/22/2012 2:01:09 PM

Accepted Answer

You could call .wait() on each Popen object in order to be sure that it's finished, and call log.flush() so that your own writes reach the file before the child's output does. Maybe something like this:

import subprocess

def run(cmd, logfile):
    logfile.flush()  # push the parent's buffered writes out before the child starts
    p = subprocess.Popen(cmd, shell=True, universal_newlines=True, stdout=logfile)
    ret_code = p.wait()  # block until the child has finished writing to the log
    return ret_code

If you need to interact with the Popen object in your outer function you could move the .wait() call to there instead.
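
For example, here is a minimal sketch of that variant, reusing the question's hypothetical executable names; run() only launches the child, and the caller decides when to wait:

import subprocess

def run(cmd, logfile):
    logfile.flush()  # parent's buffered text lands in the file before the child's output
    return subprocess.Popen(cmd, shell=True, universal_newlines=True, stdout=logfile)

def runTest(path, flags, name):
    log = open(name, "w")
    print >> log, "Calling executable A"
    pa = run(path + "executable_a_name" + flags, log)
    # ... interact with pa here if needed ...
    pa.wait()  # ensure A's output is in the file before writing more
    print >> log, "Executable A finished"
    log.close()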

7/6/2010 11:03:52 PM

As I understand it, program A waits for B to do its thing, and A exits only after B exits.

If B can start without A running then you could start the processes in the reverse order:

from os.path import join as pjoin
from subprocess import Popen

def run_async(cmd, logfile):
    print >>logfile, "calling", cmd
    p = Popen(cmd, stdout=logfile)
    print >>logfile, "started", cmd
    return p

def runTest(path, flags, name):
    log = open(name, "w", 1)  # line-buffered, so each print reaches the file immediately
    print >>log, 'calling both processes'
    pb = run_async([pjoin(path, "executable_b_name")] + flags.split(), log)
    pa = run_async([pjoin(path, "executable_a_name")] + flags.split(), log)
    print >>log, 'started both processes'
    pb.wait()  # B finishes first; A is blocked on it
    print >>log, 'process B ended'
    pa.wait()
    print >>log, 'process A ended'
    log.close()

Note: calling log.flush() in the main process has no effect on the stdio buffers inside the child processes.

If the child processes use block buffering for stdout then you could try to force them to flush sooner using pexpect, pty, or stdbuf (this assumes that the processes line-buffer their output if run interactively, i.e., that they use the C stdio library for I/O).
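
For instance, a minimal sketch building on runTest() above: prefixing the command with stdbuf from GNU coreutils (assuming it is installed) asks the child's stdio library to line-buffer stdout, so its lines reach the log as they are produced:

# stdbuf -oL forces line-buffered stdout in children that use C stdio
# and don't override their own buffering
pb = run_async(["stdbuf", "-oL", pjoin(path, "executable_b_name")] + flags.split(), log)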

Licensed under: CC-BY-SA with attribution
Not affiliated with: Stack Overflow