tags:

views: 52

answers: 1

I have the following snippet calling a Perl script which writes to STDERR and STDOUT. I've been following the recommended procedures, such as auto-flushing STDOUT and STDERR in the Perl script and using StreamGobbler threads. This helps to a degree, but when the Perl script generates large volumes of output it will still fill up its pipes and hang. The only thing that seems to stop this is adding the following to my Perl script; obviously I want the output, so this is not an option.

update >>

Another interesting occurrence: when I cat the /proc/pid/fd/pipe# in Linux, it causes the pipe to be read. This seems to dump the contents of the pipe, so my Perl process can write to it again and thus complete. It must therefore be that my Java process is not reading the process's output stream properly.

PERL :

    close STDOUT;
    close STDERR;

My Java looks like the following:

    parserProcess = run.exec(config.getCMDArray(), env);

    // consume stderr
    StreamGobbler errorGobbler = new StreamGobbler(parserProcess.getErrorStream(), "ERROR");

    // any output?
    StreamGobbler outputGobbler = new StreamGobbler(parserProcess.getInputStream(), "OUTPUT");

    // kick them off
    errorGobbler.start();
    outputGobbler.start();

where StreamGobbler is:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

class StreamGobbler extends Thread
{
    private final InputStream is;
    private final String type;

    StreamGobbler(InputStream is, String type)
    {
        this.is = is;
        this.type = type;
    }

    public void run()
    {
        try
        {
            InputStreamReader isr = new InputStreamReader(is);
            BufferedReader br = new BufferedReader(isr);
            String line;
            // read until the stream is closed by the child process
            while ((line = br.readLine()) != null)
                System.out.println(type + ">" + line);
        }
        catch (IOException ioe)
        {
            ioe.printStackTrace();
        }
    }
}



    status = parserProcess.waitFor();
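For reference, the pieces above can be tied together in one place. This is a hedged sketch rather than the original program: it uses ProcessBuilder with redirectErrorStream(true) to merge stderr into stdout, so a single drainer thread suffices, and the drainer is started before waitFor(); the command "perl myscript.pl" is a placeholder.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class RunWithGobbler {
    // Launch a command, drain its merged stdout/stderr on a background
    // thread, and wait for it to finish. Draining must be running before
    // waitFor(), or the child can block once the pipe buffer fills.
    public static int runAndDrain(String... cmd) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(cmd);
        pb.redirectErrorStream(true);            // one pipe instead of two
        Process p = pb.start();

        Thread drainer = new Thread(() -> {
            try (BufferedReader br = new BufferedReader(
                    new InputStreamReader(p.getInputStream()))) {
                String line;
                while ((line = br.readLine()) != null) {
                    System.out.println("OUTPUT>" + line);
                }
            } catch (IOException ioe) {
                ioe.printStackTrace();
            }
        });
        drainer.start();                         // start BEFORE waitFor()
        int status = p.waitFor();
        drainer.join();                          // ensure the pipe is fully drained
        return status;
    }

    public static void main(String[] args) throws Exception {
        // placeholder command, not from the question
        System.exit(runAndDrain("perl", "myscript.pl"));
    }
}
```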

Thanks in advance

+2  A: 

I think you're describing normal behavior for your Perl and Java apps. The actual size of the buffer behind a pipe is undefined and OS-dependent, but it could be as little as 1 KB or 16 KB. It is good and sensible for the output-producing process to block until its text has been consumed; what would be the alternative? Allocating ever-bigger amounts of memory for the pipe buffer?

If you're using the above code to experiment, then your processing speed is bottlenecked by Java's console output, i.e. the System.out.println() calls. I think if you commented out the printing (just for testing), you'd see Java slurping up the Perl output quite snappily.
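One way to check this claim (a hypothetical variant, not code from the question) is a gobbler that counts lines instead of printing them, which removes the console-output bottleneck entirely:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.util.concurrent.atomic.AtomicLong;

// Like StreamGobbler, but discards each line and only keeps a count,
// so the drain speed is not limited by System.out.println().
class CountingGobbler extends Thread {
    private final InputStream is;
    final AtomicLong lines = new AtomicLong();

    CountingGobbler(InputStream is) {
        this.is = is;
    }

    @Override
    public void run() {
        try (BufferedReader br = new BufferedReader(new InputStreamReader(is))) {
            while (br.readLine() != null) {
                lines.incrementAndGet();   // consume without printing
            }
        } catch (IOException ioe) {
            ioe.printStackTrace();
        }
    }
}
```

If the hang disappears with this variant, the original pipeline was console-bound rather than broken.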

Carl Smotricz
An alternative is to do your interprocess communication through files. This has the advantage of not blocking your output (at least until the disk is full), though it has disadvantages too (needing to clear the EOF condition on the input stream, performance, that disk-filling-up thing, ...)
mobrule
Well, he could let the Perl program write to a file and run to completion, then fire up the Java app and run that... we don't know whether there's really a stringent requirement that both programs run side by side. But patching together piping based on files seems like an awful idea.
Carl Smotricz
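For completeness, the file-based approach mentioned in the comments can be sketched like this. ProcessBuilder has supported file redirection since Java 7; the command and file names here are placeholders, not from the question:

```java
import java.io.File;
import java.io.IOException;

public class RunToFiles {
    // Redirect the child's stdout and stderr to files so its pipes can
    // never fill up; the OS writes directly to disk, no gobbler threads.
    public static int run(String... cmd) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(cmd);
        pb.redirectOutput(new File("parser.out"));
        pb.redirectError(new File("parser.err"));
        Process p = pb.start();
        return p.waitFor();   // safe: nothing is buffered in a pipe
    }
}
```

The trade-off, as noted above, is that output is no longer available incrementally and the files must be cleaned up afterwards.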