views: 193

answers: 2

I am running multiple batch files on a remote machine using a Perl script on my local machine, and I want these batch files to run for a long duration.

The problem is that the Perl program on the local machine halts: it executes the subsequent commands only after the batch files end.

I want to start the batch files on the remote machine and then execute the rest of the commands in my Perl script without halting.

Please help me out.

+2  A: 

How are you running the remote processes? The best answer will probably depend on the specific implementation. But assuming you're using something like Net::SSH or Expect or some sort of RPC mechanism, the easiest thing is probably to fork a new process to run the remote job and then continue on with your script.

my $pid = fork;
if ( ( defined $pid ) and $pid == 0 ) {       
    # child process
    do_remote_batch_jobs();
} elsif ( defined $pid ) { 
    # parent process
    do_other_stuff();
} else { 
    # fork error
    die "Unable to fork: $!";
}
friedo
I am running the remote process using rsh:

my $command = "rsh <station ip> perl <path of the perl program\perl program_name.pl>";
system($command);

I have created a Perl program on the remote machine to run the batch files, and these batch files run successfully, but the commands after this system($command) call in the local machine do not execute until the batch files end.
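That blocking is expected: `system` waits for its command to exit. Combining the `fork` approach above with your rsh call, the child can absorb the wait while the parent continues. A minimal runnable sketch follows; the `sleep` command stands in for your long-running rsh invocation, which is a placeholder you would swap back in:

```perl
use strict;
use warnings;

# Placeholder for the blocking remote call. In the real script this would be
# something like: my $command = 'rsh <station ip> perl <remote script path>';
my $command = q{perl -e "sleep 1"};

my $pid = fork;
die "Unable to fork: $!" unless defined $pid;

if ( $pid == 0 ) {
    # Child process: only this process blocks on the remote job.
    exec $command or die "exec failed: $!";
}

# Parent process: runs immediately, before the child finishes.
print "parent continues while the batch job runs\n";

# Reap the child when convenient (here: at the end of the script).
waitpid $pid, 0;
print "batch job finished\n";
```

If you never need the exit status, appending `&` inside the rsh'd command string (so the remote shell backgrounds it) is another common workaround, though fork keeps the control on the local side.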
+1  A: 

Yes, you could use fork, but I think a better solution would be a script on the remote machine which accepts a batch job and returns its id.
The current status of a submitted job could then be retrieved through the same script. That way the client (i.e. your machine) is freed from managing the jobs.
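A minimal sketch of that submit/status protocol, with hypothetical names (`submit_job`, `job_status`) and an in-memory registry; a real remote-side script would persist this state and actually fork/exec the batch file in the background:

```perl
use strict;
use warnings;

# In-memory job registry: id => { file, status }.
my %jobs;
my $next_id = 1;

# Accept a batch job and return its id.
sub submit_job {
    my ($batch_file) = @_;
    my $id = $next_id++;
    $jobs{$id} = { file => $batch_file, status => 'running' };
    # A real implementation would fork and exec the batch file here,
    # updating $jobs{$id}{status} to 'done' when it exits.
    return $id;
}

# Retrieve the current status of a submitted job.
sub job_status {
    my ($id) = @_;
    return exists $jobs{$id} ? $jobs{$id}{status} : 'unknown';
}

my $id = submit_job('nightly.bat');
print "job $id: ", job_status($id), "\n";   # job 1: running
print "job 99: ", job_status(99), "\n";     # job 99: unknown
```

The client then only ever makes two short remote calls (submit, poll), neither of which blocks for the duration of the batch job.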

Neeraj