I want to execute a batch file (which will start another Java app) in the middle of my program. I don't want to wait for it, check whether it executed successfully, or capture errors from that batch file. Once I've started the batch file, I want to move on and do other things rather than waiting for it to finish.

Do I need to take care of stdout and stderr? Is there any way to avoid having to deal with stdout and stderr at all?

This is my second post trying to clear up my confusion on this topic, so please be specific to the question and don't just throw a link at me about how to use Process or ProcessBuilder.

Any help is appreciated.

A: 

I think you only need this:

Runtime run = Runtime.getRuntime();
Process p = null;
String cmd = "D:\\a.bat";
try {
    p = run.exec(cmd);
} catch (Exception e) {
    // do handling
}

//your code

Remember to call p.destroy() later in your code. You could also run the code above in a separate thread. Hope this helps.
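
A minimal sketch of that separate-thread idea (using the same hypothetical D:\a.bat path; note, as the comments below point out, that the batch file's output still needs to be read if it produces any):

final String cmd = "D:\\a.bat";
Thread launcher = new Thread(new Runnable() {
    public void run() {
        try {
            Process p = Runtime.getRuntime().exec(cmd);
            // NB: if the batch writes to stdout/stderr you still have to
            // drain those streams (see the answers below)
        } catch (Exception e) {
            // do handling
        }
    }
});
launcher.setDaemon(true);  // don't keep the JVM alive just for the launcher
launcher.start();
// ...continue with your other work immediately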

YoK
+1 - @user234194 stick @YoK's code into a `main` method in a new class and try it out; that's the best way to clear up your confusion IMHO
James B
-1: You're `destroy`ing the created process **immediately** after creating it (the process executes asynchronously) so at best this is a no-op and at worst it leaves junk around from the forcible termination of the script. This code doesn't read from the process' output, and hence can cause a deadlock/resource leaks too even if it were to work.
Andrzej Doyle
@Andrzej Doyle I have changed my answer.
YoK
OK, it's not catastrophic now, but it's still not correct. The question was fundamentally "do I need to take care of stdout and stderr?". Your answer implies **no**. The correct answer is **yes**.
Andrzej Doyle
A: 

If your process does not produce any output on stdout or stderr, then you probably don't need to handle them. If your subprocess does produce output, it may block forever while trying to write to stdout; it depends on how much the OS buffers in its pipes. There is an example of a stream flusher somewhere here -- which everyone uses.

Justin
+1  A: 

I have found that if you don't at least eat stdout and stderr, you will eventually run out of memory. Not doing so was also preventing me from running more than one Process at a time.

I've been using a class I called ProcessStreamEater to do this.

import java.io.IOException;
import java.io.InputStreamReader;

// Runnable that drains a Process' output so the OS pipe buffer never fills up.
public class ProcessStreamEater implements Runnable
{
   private final Process proc;

   public ProcessStreamEater(Process proc)
   {
      this.proc = proc;
   }

   @Override
   public void run()
   {
      InputStreamReader r = new InputStreamReader(proc.getInputStream());
      try
      {
         while(r.read() != -1)
         {  // put stuff here if you want to do something with output
            // otherwise, empty
         }
      }
      catch(IOException e)
      {
         // handle IO exception
      }
      finally
      {
         if(r != null)
         {
            try
            {
               r.close();
            }
            catch(IOException c)
            {}
         }
      }
   }
}

Then when I use it to eat stuff...

   ProcessBuilder pb = new ProcessBuilder(args);
   pb.redirectErrorStream(true);
   final Process proc = pb.start();
   executorService.execute(new ProcessStreamEater(proc));

where executorService was created with Executors.newCachedThreadPool()
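
(For completeness, that field would be something along the lines of the snippet below; the field name is just an assumption.)

// uses java.util.concurrent.Executors / ExecutorService
private final ExecutorService executorService = Executors.newCachedThreadPool();
// ...and remember to shut it down when you're done launching processes:
// executorService.shutdown();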

Jay R.
Don't forget `proc.getErrorStream()`! Otherwise, very similar to what I'd advise in this situation.
Andrzej Doyle
That's why I call `redirectErrorStream(true)`. They both go to stdout then.
Jay R.
+3  A: 

Short answer: no, you can't just ignore them. As per the Process javadocs:

Because some native platforms only provide limited buffer size for standard input and output streams, failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, and even deadlock.

So you have to take care of the stderr and stdout handling if you want your program to be remotely robust.

Having said that, if you really don't care about their content (which is a bad idea - what if the batch fails with a useful error message that you discard?), you can just fire off threads to read from them, as per Jay R.'s answer. This will let your logic thread continue without worrying about the state of the streams, and the stream eaters will run in the background until the streams are exhausted. You might even want to create a wrapper around Runtime.exec() that fires off threads to do this for you, if you find yourself doing this a lot.

I would, however, at least log the output from the batch process if you're not going to interpret it in your code. When something goes wrong with the batch, analyzing the problem will be much easier with the process' output to pore over.
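
For illustration, such a wrapper might look something like the sketch below. It is only a sketch: the class name FireAndForget and the use of java.util.logging are assumptions of mine, and it reuses the redirect-and-drain approach from Jay R.'s answer.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.logging.Logger;

// Hypothetical helper: starts a command, drains and logs its (merged) output
// in a background thread, and returns immediately so the caller can carry on.
public final class FireAndForget
{
   private static final Logger LOG = Logger.getLogger(FireAndForget.class.getName());

   public static Process launch(String... command) throws IOException
   {
      ProcessBuilder pb = new ProcessBuilder(command);
      pb.redirectErrorStream(true);   // merge stderr into stdout
      final Process proc = pb.start();

      Thread drainer = new Thread(new Runnable()
      {
         public void run()
         {
            BufferedReader r = new BufferedReader(
                  new InputStreamReader(proc.getInputStream()));
            try
            {
               String line;
               while((line = r.readLine()) != null)
               {
                  LOG.info(line);    // keep the batch output for later analysis
               }
            }
            catch(IOException e)
            {
               LOG.warning("Error reading process output: " + e);
            }
            finally
            {
               try { r.close(); } catch(IOException ignored) {}
            }
         }
      });
      drainer.setDaemon(true);
      drainer.start();
      return proc;   // the caller can ignore this, or waitFor() later if needed
   }
}

Usage would then be a one-liner, e.g. FireAndForget.launch("cmd", "/c", "D:\\a.bat"), after which the calling code simply carries on with its own work.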

Andrzej Doyle
I think I understand what you are trying to explain. What if the other program that I am going to execute from the batch file keeps all the logs and does its own error handling? And one more question: if I have to execute multiple batch files, do I need to wait for each one to complete before starting the next, and so on for each in turn?
@Andrzej - I like your answer, better than mine. :)
Jay R.
@user213194 - If you want to execute multiple batch files, then you could run them concurrently from Java so long as you're creating enough threads to read from all of the processes at once. (This assumes that the batch files can run without interfering with each other, of course, which is nothing to do with Java or `Process`.)
Andrzej Doyle
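
As a final illustration of that last point, here is a minimal sketch of launching several batch files side by side, reusing Jay R.'s ProcessStreamEater; the batch file paths are placeholders.

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Hypothetical driver: one stream-eater task per process, so no batch file
// can block on a full output pipe while the others run.
public class MultiBatchLauncher
{
   public static void main(String[] args) throws Exception
   {
      ExecutorService pool = Executors.newCachedThreadPool();
      String[] batches = { "D:\\a.bat", "D:\\b.bat", "D:\\c.bat" };   // placeholders

      for(String bat : batches)
      {
         ProcessBuilder pb = new ProcessBuilder("cmd", "/c", bat);
         pb.redirectErrorStream(true);
         Process proc = pb.start();
         pool.execute(new ProcessStreamEater(proc));   // from Jay R.'s answer above
      }

      // the batch files are now running concurrently; do other work here, then:
      pool.shutdown();
   }
}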