
I have a Perl script, script.pl, which forks when run: the parent process writes its PID to a file and then exits, while the child process prints something to STDOUT and then enters a while loop.

my $pid = fork();

if ( !defined $pid )
{
    die "Failed to fork.";
}
# Parent process
elsif ($pid)
{
    if ( !open( PID, '>>', 'running_PIDs' ) )
    {
        warn "Error opening file to append PID";
    }
    print PID "$pid\n";
    close PID;
}
# Child process
else
{
    print "Output started";

    while ($loopControl)
    {
        # Do some stuff
    }
}

This works fine when I run it locally, i.e.: perl script.pl.

The script prints some output and then returns control to the shell, while the child process goes off into its loop in the background.

However, when I call it via ssh, control is never returned to the shell (nor is the "Output started" line ever printed):

$ ssh [email protected] 'perl script.pl'

The interesting thing, however, is that the child process does run (I can see it when I run ps).

Can anyone explain what's going on?

EDIT:

I ran it under the debugger and got this:

### Forked, but do not know how to create a new TTY.
Since two debuggers fight for the same TTY, input is severely entangled.
I know how to switch the output to a different window in xterms and OS/2 consoles only. For a manual switch, put the name of the created TTY in $DB::fork_TTY, or define a function DB::get_fork_TTY() returning this.
On UNIX-like systems one can get the name of a TTY for the given window by typing tty, and disconnect the shell from the TTY by sleep 1000000.

+4  A: 

ssh [email protected] 'nohup perl script.pl'

You aren't able to exit because there's still a process attached. You need to nohup it.

Brian Roach
Doesn't seem to have an effect. =[
Razor Storm
+1 Right diagnosis. -1 Wrong prescription.
mobrule
+2  A: 

What is happening is that ssh is executing 'perl script.pl' as a command directly. If you have 'screen' available, you could do:

$ ssh [email protected] 'screen -d -m perl script.pl'

to have it running on a detached screen, and reattach later with screen -r

ninjalj
This works; control is returned. What about running this remotely via ssh is different from running it locally? How come it has no problems when run locally?
Razor Storm
@Razor, when running remotely, there *is* no shell -- the ssh daemon on the other side runs and attaches itself to the script. You could probably also achieve what you want by running `bash -c "perl script.pl"` as the ssh command
friedo
However, the parent process ends, so shouldn't the attached daemon stop running as well?
Razor Storm
+5  A: 

Whenever you launch background jobs via non-interactive ssh commands, you need to close or otherwise tie off stdin, stdout, & stderr. Otherwise ssh will wait for the backgrounded process to exit. FAQ.
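This waiting behaviour can be reproduced locally, without ssh. In the sketch below, a command substitution plays the role of sshd's output capture: it reads until EOF, and a backgrounded child that inherits stdout keeps the pipe's write end open (the 2-second sleep is just an illustrative stand-in for a long-running loop):

```shell
# A $( ) substitution, like sshd, reads the command's stdout until EOF.
start=$(date +%s)
out=$(sh -c 'sleep 2 & echo parent exited')   # child inherits the pipe
blocked=$(( $(date +%s) - start ))            # ~2s: the sleep held fd 1

start=$(date +%s)
out=$(sh -c 'sleep 2 >/dev/null 2>&1 & echo parent exited')  # child detached
freed=$(( $(date +%s) - start ))              # returns at once

echo "inherited stdout: ${blocked}s, redirected stdout: ${freed}s"
```

Even though both inner shells exit immediately after their echo, the first substitution cannot finish until the sleep exits; that is exactly why the ssh session appears to hang.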

This is called disassociating or detaching from the controlling terminal and is a general best practice when writing background jobs, not just for SSH.

So the simplest change that doesn't mute your entire command is to add:

#close std fds inherited from parent
close STDIN;
close STDOUT;
close STDERR;

right after your print "Output started";. If your child process needs to print output periodically during its run, then you'll need to redirect to a log file instead.
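As a sanity check, here is that fix in miniature (assuming perl is on PATH), run through a command substitution the way sshd would capture the output; the fork-and-close skeleton mirrors the question's script, with a sleep standing in for the while loop:

```shell
start=$(date +%s)
out=$(perl -e '
    my $pid = fork();
    die "Failed to fork." unless defined $pid;
    exit 0 if $pid;        # parent exits immediately
    print "Output started\n";
    close STDIN;           # the three closes from the answer
    close STDOUT;
    close STDERR;
    sleep 2;               # stand-in for the background while loop
')
elapsed=$(( $(date +%s) - start ))
echo "captured [$out] and regained control after ${elapsed}s"
```

Without the three close calls the substitution would block for the full two seconds; with them it returns as soon as the parent exits, while the child keeps running in the background.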

pra
Awesome that worked perfectly!
Razor Storm